Sample records for multiplication techniques based

  1. Activity Detection and Retrieval for Image and Video Data with Limited Training

    DTIC Science & Technology

    2015-06-10

    applications. Here we propose two techniques for image segmentation. The first involves an automata-based multiple threshold selection scheme, where a mixture of Gaussians is fitted to the ... For our second approach to segmentation, we employ a region-based segmentation technique that is capable of handling intensity inhomogeneity...

  2. Hypergraph partitioning implementation for parallelizing matrix-vector multiplication using CUDA GPU-based parallel computing

    NASA Astrophysics Data System (ADS)

    Murni, Bustamam, A.; Ernastuti, Handhika, T.; Kerami, D.

    2017-07-01

    Calculation of matrix-vector multiplication in real-world problems often involves large matrices of arbitrary size. Therefore, parallelization is needed to speed up the calculation process, which usually takes a long time. Graph partitioning techniques discussed in previous studies cannot be used to parallelize matrix-vector multiplication with matrices of arbitrary size, because graph partitioning assumes square, symmetric matrices. Hypergraph partitioning techniques overcome this shortcoming of the graph partitioning technique. This paper addresses the efficient parallelization of matrix-vector multiplication through hypergraph partitioning techniques using CUDA GPU-based parallel computing. CUDA (compute unified device architecture) is a parallel computing platform and programming model created by NVIDIA and implemented on the GPU (graphics processing unit).
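
    The row-partitioning idea behind parallelized matrix-vector multiplication can be sketched independently of the paper's hypergraph method and CUDA kernels. The toy decomposition below (plain Python threads, illustrative data) only shows how an arbitrary-size y = A·x splits into independent row chunks.

```python
from concurrent.futures import ThreadPoolExecutor

def matvec_rows(A, x, rows):
    # Compute only the selected rows of A @ x
    return [sum(a * b for a, b in zip(A[i], x)) for i in rows]

def parallel_matvec(A, x, n_parts=2):
    n = len(A)
    chunks = [range(p, n, n_parts) for p in range(n_parts)]  # round-robin row split
    y = [0] * n
    with ThreadPoolExecutor(max_workers=n_parts) as ex:
        for rows, vals in zip(chunks, ex.map(lambda r: matvec_rows(A, x, r), chunks)):
            for i, v in zip(rows, vals):
                y[i] = v
    return y

A = [[1, 2, 3], [4, 5, 6]]    # 2x3: arbitrary (non-square) size
x = [1, 1, 1]
print(parallel_matvec(A, x))  # [6, 15]
```

    A real partitioner balances rows by nonzero count (the hypergraph model captures communication cost as well), but the decomposition-then-gather pattern is the same.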

  3. Proceedings of the Mobile Satellite System Architectures and Multiple Access Techniques Workshop

    NASA Technical Reports Server (NTRS)

    Dessouky, Khaled

    1989-01-01

    The Mobile Satellite System Architectures and Multiple Access Techniques Workshop served as a forum for the debate of system and network architecture issues. Particular emphasis was on those issues relating to the choice of multiple access technique(s) for the Mobile Satellite Service (MSS). These proceedings contain articles that expand upon the 12 presentations given in the workshop. Contrasting views on Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), and Time Division Multiple Access (TDMA)-based architectures are presented, and system issues relating to signaling, spacecraft design, and network management constraints are addressed. An overview article that summarizes the issues raised in the numerous discussion periods of the workshop is also included.

  4. Multiple Query Evaluation Based on an Enhanced Genetic Algorithm.

    ERIC Educational Resources Information Center

    Tamine, Lynda; Chrisment, Claude; Boughanem, Mohand

    2003-01-01

    Explains the use of genetic algorithms to combine results from multiple query evaluations to improve relevance in information retrieval. Discusses niching techniques, relevance feedback techniques, and evolution heuristics, and compares retrieval results obtained by both genetic multiple query evaluation and classical single query evaluation…

  5. Monitoring technique for multiple power splitter-passive optical networks using a tunable OTDR and FBGs

    NASA Astrophysics Data System (ADS)

    Hann, Swook; Kim, Dong-Hwan; Park, Chang-Soo

    2006-04-01

    A monitoring technique for multiple power splitter-passive optical networks (PS-PONs) is presented. The technique is based on remote sensing of fiber Bragg gratings (FBGs) using a tunable OTDR. To monitor the multiple PS-PON, an FBG is used as a wavelength-dependent reflective reference on each branch end of the PS. The FBG helps discern individual events in the multiple PS-PON, in combination with information from the Rayleigh backscattered power. The multiple PS-PON can be analyzed by the monitoring method at the central office under 10-Gbit/s in-service conditions.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tate, John G.; Richardson, Bradley S.; Love, Lonnie J.

    ORNL worked with the Schaeffler Group USA to explore additive manufacturing techniques that might be appropriate for prototyping bearing cages. Multiple additive manufacturing techniques were investigated, including e-beam, binder jet, and multiple laser-based processes. The binder jet process worked best for printing the thin, detailed cages.

  7. Compiler-assisted multiple instruction rollback recovery using a read buffer

    NASA Technical Reports Server (NTRS)

    Alewine, N. J.; Chen, S.-K.; Fuchs, W. K.; Hwu, W.-M.

    1993-01-01

    Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper focuses on compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes.

  8. Classification of air quality using fuzzy synthetic multiplication.

    PubMed

    Abdullah, Lazim; Khalid, Noor Dalina

    2012-11-01

    Proper identification of the environment's air quality based on limited observations is an essential task in meeting the goals of environmental management. Various classification methods have been used to estimate changes in air quality status and health. However, discrepancies frequently arise from the lack of a clear distinction between air quality classes, the uncertainty in the quality criteria employed, and the vagueness or fuzziness embedded in the decision-making output values. Owing to this inherent imprecision, difficulties always exist in some conventional methodologies when describing integrated air quality conditions with respect to various pollutants. Therefore, this paper presents two fuzzy multiplication synthetic techniques to establish a classification of air quality. The fuzzy multiplication technique employs the max-min ("or" and "and") operations in executing the fuzzy arithmetic operations. A set of air pollutant data (carbon monoxide, sulfur dioxide, nitrogen dioxide, ozone, and particulate matter (PM(10))) collected from a network of 51 stations in the Klang Valley, East Malaysia, Sabah, and Sarawak was utilized in this evaluation. The two fuzzy multiplication techniques consistently classified Malaysia's air quality as "good." The findings indicated that the techniques may have successfully harmonized inherent discrepancies and interpreted complex conditions. It was demonstrated that fuzzy synthetic multiplication techniques are quite appropriate for air quality management.
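
    The max-min ("and"/"or") composition at the heart of such fuzzy synthetic evaluation can be sketched as follows; the weights and membership values below are illustrative placeholders, not the paper's data.

```python
def max_min_compose(w, R):
    # b_j = max_i min(w_i, r_ij): fuzzy "and" (min) followed by fuzzy "or" (max)
    return [max(min(wi, row[j]) for wi, row in zip(w, R))
            for j in range(len(R[0]))]

w = [0.4, 0.35, 0.25]                 # pollutant weights (illustrative)
R = [[0.7, 0.3, 0.0],                 # rows: pollutants; columns: memberships in
     [0.5, 0.4, 0.1],                 # the classes ("good", "moderate", "poor")
     [0.6, 0.3, 0.1]]
b = max_min_compose(w, R)
labels = ["good", "moderate", "poor"]
print(b, labels[b.index(max(b))])     # [0.4, 0.35, 0.1] good
```

    The class with the largest composed membership is reported as the overall air quality label.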

  9. Combined adaptive multiple subtraction based on optimized event tracing and extended wiener filtering

    NASA Astrophysics Data System (ADS)

    Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo

    2017-06-01

    The surface-related multiple elimination (SRME) method is based on a feedback formulation and has become one of the most widely preferred multiple suppression methods. However, differences are apparent between the predicted multiples and those in the source seismic records, which may leave conventional adaptive multiple subtraction methods barely able to effectively suppress multiples in actual production. This paper introduces a combined adaptive multiple attenuation method based on an optimized event tracing technique and extended Wiener filtering. The method first uses multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and a short-time-window FK filtering method. After applying the extended Wiener filtering method, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This method combines the advantages of multiple elimination based on the optimized event tracing method and the extended Wiener filtering technique. It is well suited to suppressing typical hyperbolic and other types of multiples, with the advantage of minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.
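
    The adaptive subtraction step can be illustrated with a plain single-channel Wiener matching filter (least-squares shaping of the predicted multiple to the recorded trace); the paper's extended Wiener filtering and event tracing are not reproduced, and the data below are synthetic.

```python
import numpy as np

def adaptive_subtract(d, m, flen=3):
    # Convolution matrix: column k is the predicted multiple delayed by k samples
    N = len(d)
    M = np.zeros((N, flen))
    for k in range(flen):
        M[k:, k] = m[:N - k]
    # Wiener matching filter: least-squares fit of f so that M @ f approximates d
    f, *_ = np.linalg.lstsq(M, d, rcond=None)
    return d - M @ f   # estimated primary = recorded trace minus matched multiple

rng = np.random.default_rng(0)
m = rng.standard_normal(64)                        # predicted multiple (toy trace)
d = np.convolve(m, [0.9, -0.3], mode="full")[:64]  # recorded trace = shaped multiple
print(np.abs(adaptive_subtract(d, m, flen=2)).max() < 1e-9)  # True: multiple removed
```

    With a primary present in d, the filter length and estimation window control how much primary energy leaks into the matched multiple, which is the trade-off the combined method addresses.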

  10. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    NASA Astrophysics Data System (ADS)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
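
    The combination rule in multiple importance sampling can be sketched with the standard balance heuristic on a toy one-dimensional integral; the SSAO-specific sampling is not shown, and all distributions below are illustrative.

```python
import random

def balance_mis(n=20000, seed=1):
    # Estimate the integral of x^2 over [0, 1] (true value 1/3) by combining a
    # uniform strategy p1 and a linear strategy p2 with the balance heuristic.
    random.seed(seed)
    f = lambda x: x * x
    p1 = lambda x: 1.0                   # uniform pdf on [0, 1]
    p2 = lambda x: 2.0 * x               # linear pdf
    total = 0.0
    for _ in range(n):                   # n samples from each strategy
        x1 = random.random()             # draw from p1
        x2 = random.random() ** 0.5      # inverse-CDF draw from p2
        # balance heuristic: w_i(x) * f(x) / p_i(x) = f(x) / (p1(x) + p2(x))
        total += f(x1) / (p1(x1) + p2(x1))
        total += f(x2) / (p1(x2) + p2(x2))
    return total / n

print(abs(balance_mis() - 1 / 3) < 0.01)  # True: combined estimator stays unbiased
```

    The weighting keeps the estimator unbiased while each strategy covers the region where the other has high variance, which is why fewer samples suffice.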

  11. Multiple excitation nano-spot generation and confocal detection for far-field microscopy.

    PubMed

    Mondal, Partha Pratim

    2010-03-01

    An imaging technique is developed for the controlled generation of multiple excitation nano-spots for far-field microscopy. The system point spread function (PSF) is obtained by interfering two counter-propagating extended depth-of-focus PSFs (DoF-PSFs), resulting in highly localized multiple excitation spots along the optical axis. The technique permits (1) simultaneous excitation of multiple planes in the specimen; (2) control of the number of spots by confocal detection; and (3) overcoming point-by-point excitation. Fluorescence detection from the excitation spots can be efficiently achieved by Z-scanning the detector/pinhole assembly. The technique complements most existing bioimaging techniques and may find potential application in high-resolution fluorescence microscopy and nanoscale imaging.

  12. Multiple excitation nano-spot generation and confocal detection for far-field microscopy

    NASA Astrophysics Data System (ADS)

    Mondal, Partha Pratim

    2010-03-01

    An imaging technique is developed for the controlled generation of multiple excitation nano-spots for far-field microscopy. The system point spread function (PSF) is obtained by interfering two counter-propagating extended depth-of-focus PSFs (DoF-PSFs), resulting in highly localized multiple excitation spots along the optical axis. The technique permits (1) simultaneous excitation of multiple planes in the specimen; (2) control of the number of spots by confocal detection; and (3) overcoming point-by-point excitation. Fluorescence detection from the excitation spots can be efficiently achieved by Z-scanning the detector/pinhole assembly. The technique complements most existing bioimaging techniques and may find potential application in high-resolution fluorescence microscopy and nanoscale imaging.

  13. Zero-forcing pre-coding for MIMO WiMAX transceivers: Performance analysis and implementation issues

    NASA Astrophysics Data System (ADS)

    Cattoni, A. F.; Le Moullec, Y.; Sacchi, C.

    Next generation wireless communication networks are expected to achieve ever increasing data rates. Multi-User Multiple-Input-Multiple-Output (MU-MIMO) is a key technique for obtaining the expected performance, because it combines the high capacity achievable over a MIMO channel with the benefits of space division multiple access. In MU-MIMO systems, the base stations transmit signals to two or more users over the same channel; as a result, every user can experience inter-user interference. This paper provides a capacity analysis of an online, interference-based pre-coding algorithm able to mitigate the multi-user interference of MU-MIMO systems in the context of a realistic WiMAX application scenario. Simulation results show that pre-coding can significantly increase the channel capacity. Furthermore, the paper presents several feasibility considerations for implementing the analyzed technique in a possible FPGA-based software-defined radio.
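
    The interference-removal idea behind pre-coding can be sketched with plain zero-forcing (channel pseudo-inverse applied at the transmitter); this is a generic textbook scheme on a toy 2x2 channel, not the paper's online interference-based algorithm.

```python
import numpy as np

H = np.array([[1.0, 0.5],
              [0.3, 1.0]])                       # rows: users, cols: tx antennas
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)   # ZF pre-coder (right pseudo-inverse)
s = np.array([1.0, -1.0])                        # one symbol per user
y = H @ (W @ s)                                  # noiseless received signals
print(np.allclose(y, s))                         # True: each user sees only its symbol
```

    Because H @ W equals the identity in this noiseless model, inter-user interference is cancelled at the cost of transmit power when the channel is poorly conditioned.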

  14. Multiple Contact Dates and SARS Incubation Periods

    PubMed Central

    2004-01-01

    Many severe acute respiratory syndrome (SARS) patients have multiple possible incubation periods due to multiple contact dates. Multiple contact dates cannot be used in standard statistical analytic techniques, however. I present a simple spreadsheet-based method that uses multiple contact dates to calculate the possible incubation periods of SARS. PMID:15030684
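
    The interval logic behind the spreadsheet method can be sketched directly: with several possible contact dates, the incubation period is only known to lie between onset minus the latest contact and onset minus the earliest contact. The dates below are illustrative.

```python
from datetime import date

def incubation_bounds(contact_dates, onset):
    # Shortest possible incubation: onset minus the latest contact;
    # longest possible incubation: onset minus the earliest contact.
    return ((onset - max(contact_dates)).days,
            (onset - min(contact_dates)).days)

contacts = [date(2003, 3, 1), date(2003, 3, 4), date(2003, 3, 6)]  # illustrative
onset = date(2003, 3, 10)
print(incubation_bounds(contacts, onset))  # (4, 9)
```

    Aggregating such per-patient intervals is what lets censored-interval methods use cases that standard point-estimate techniques would have to discard.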

  15. Ambiguity Of Doppler Centroid In Synthetic-Aperture Radar

    NASA Technical Reports Server (NTRS)

    Chang, Chi-Yung; Curlander, John C.

    1991-01-01

    The paper discusses the performance of two algorithms for resolving the ambiguity in the estimated Doppler centroid frequency of echoes in synthetic-aperture radar: one based on a range-cross-correlation technique, the other on a multiple-pulse-repetition-frequency technique.

  16. Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Alewine, Neal Jon

    1993-01-01

    Multiple instruction rollback (MIR) is a technique to provide rapid recovery from transient processor failures and has been implemented in hardware in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.

  17. Robust myoelectric signal detection based on stochastic resonance using multiple-surface-electrode array made of carbon nanotube composite paper

    NASA Astrophysics Data System (ADS)

    Shirata, Kento; Inden, Yuki; Kasai, Seiya; Oya, Takahide; Hagiwara, Yosuke; Kaeriyama, Shunichi; Nakamura, Hideyuki

    2016-04-01

    We investigated the robust detection of surface electromyogram (EMG) signals based on the stochastic resonance (SR) phenomenon, in which the response to weak signals is optimized by adding noise, combined with multiple surface electrodes. Flexible carbon nanotube composite paper (CNT-cp) was applied to the surface electrode and showed performance comparable to that of conventional Ag/AgCl electrodes. The SR-based EMG detection system, integrating an 8-Schmitt-trigger network and the multiple-CNT-cp-electrode array, successfully detected weak EMG signals even when the subject's body was in motion, which is difficult to achieve using the conventional technique. The feasibility of the SR-based EMG detection technique was confirmed by demonstrating its applicability to robot hand control.
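
    The stochastic resonance effect the detection system exploits can be illustrated with a toy threshold detector: a sub-threshold sine produces no output without noise, while a moderate noise level yields output that tracks the signal better than a large one. The signal, threshold, and noise levels below are arbitrary, and the detector is a simple comparator rather than the paper's Schmitt-trigger network.

```python
import math, random

def detect_correlation(noise_std, thresh=1.0, n=5000, seed=7):
    # Covariance between a sub-threshold sine and the binary output of a
    # hard threshold detector, as a function of the added noise level.
    random.seed(seed)
    sig = [0.5 * math.sin(2 * math.pi * i / 50) for i in range(n)]
    out = [1.0 if s + random.gauss(0, noise_std) > thresh else 0.0 for s in sig]
    mo, ms = sum(out) / n, sum(sig) / n
    return sum((o - mo) * (s - ms) for o, s in zip(out, sig)) / n

print(detect_correlation(0.0) == 0.0)                     # no noise: nothing crosses
print(detect_correlation(0.5) > detect_correlation(5.0))  # moderate noise beats excess
```

    The non-monotonic dependence on noise level, with a peak at an intermediate amplitude, is the signature of stochastic resonance.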

  18. Enhancement of Seebeck coefficient in graphene superlattices by electron filtering technique

    NASA Astrophysics Data System (ADS)

    Mishra, Shakti Kumar; Kumar, Amar; Kaushik, Chetan Prakash; Dikshit, Biswaranjan

    2018-01-01

    We show theoretically that the Seebeck coefficient and the thermoelectric figure of merit can be increased by using an electron filtering technique in graphene-superlattice-based thermoelectric devices. The average Seebeck coefficient for graphene-based thermoelectric devices is proportional to the integral of the distribution of the Seebeck coefficient versus electron energy. The low-energy electrons in the distribution curve reduce the average Seebeck coefficient because their contribution is negative. We show that, with an electron energy filtering technique using multiple graphene superlattice heterostructures, the low-energy electrons can be filtered out and the Seebeck coefficient can be increased. The multiple graphene superlattice heterostructures can be formed by graphene superlattices with different periodic electric potentials applied above the superlattice. The overall electronic band gap of the multiple heterostructures depends on the individual band gaps of the graphene superlattices and can be tuned by varying the periodic electric potentials. The overall electronic band gap must be chosen such that the low-energy electrons, which cause a negative Seebeck distribution in single graphene superlattice thermoelectric devices, fall within the overall band gap formed by the multiple heterostructures. Although the electrical conductance is decreased in this technique, reducing the thermoelectric figure of merit, the overall figure of merit is increased because of the large increase in the Seebeck coefficient and the figure of merit's square dependence on it. This is an easy technique for making graphene-superlattice-based thermoelectric devices more efficient and has the potential to significantly improve energy harvesting and sensor technology.

  19. Graphical approach for multiple values logic minimization

    NASA Astrophysics Data System (ADS)

    Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.

    1999-03-01

    Multiple valued logic (MVL) is sought for designing high complexity, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system is dependent on optimization of cost, which directly affects the optical setup. We propose a minimization technique for MVL logic optimization based on graphical visualization, such as a Karnaugh map. The proposed method is utilized to solve signed-digit binary and trinary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.

  20. Compiler-assisted multiple instruction rollback recovery using a read buffer

    NASA Technical Reports Server (NTRS)

    Alewine, Neal J.; Chen, Shyh-Kwei; Fuchs, W. Kent; Hwu, Wen-Mei W.

    1995-01-01

    Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper describes compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. The compiler-assisted scheme presented consists of hardware that is less complex than shadow files, history files, history buffers, or delayed write buffers, while experimental evaluation indicates performance improvement over compiler-based schemes.

  1. Electrical characterization of a Mapham inverter using pulse testing techniques

    NASA Technical Reports Server (NTRS)

    Baumann, E. D.; Myers, I. T.; Hammoud, A. N.

    1990-01-01

    The use of a multiple-pulse testing technique to determine the electrical characteristics of large megawatt-level power systems for aerospace missions is proposed. An innovative test method based on the multiple-pulse technique is demonstrated on a 2-kW Mapham inverter. This technique shows that characterization of large power systems at electrical equilibrium and rated power can be accomplished without large, costly power supplies. The heat generation that occurs in systems tested in a continuous mode is eliminated. The results indicate good agreement between this testing technique and steady-state testing.

  2. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. The technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.
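
    The Arnold transform used for watermark scrambling is the standard cat map on a square image; a minimal sketch follows (the DWT embedding itself is not shown, and the 2x2 "image" is a toy).

```python
def arnold(img):
    # One iteration of the Arnold cat map on an n x n image:
    # (x, y) -> ((x + y) mod n, (x + 2y) mod n)
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out

img = [[1, 2], [3, 4]]
scrambled = arnold(img)               # embed this instead of the raw watermark
t, cur = 1, scrambled
while cur != img:                     # the map is periodic: iterating restores img
    cur, t = arnold(cur), t + 1
print(scrambled, t)                   # [[1, 4], [2, 3]] 3
```

    Because the map's period depends on the image size, the iteration count acts as a key: decryption applies the remaining iterations of the period.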

  3. Ultrasonic Array for Obstacle Detection Based on CDMA with Kasami Codes

    PubMed Central

    Diego, Cristina; Hernández, Álvaro; Jiménez, Ana; Álvarez, Fernando J.; Sanz, Rebeca; Aparicio, Joaquín

    2011-01-01

    This paper presents the design of an ultrasonic array for obstacle detection based on Phased Array (PA) techniques, which steers the acoustic beam through the environment electronically rather than by mechanical means. The transmission of every element in the array is encoded according to Code Division Multiple Access (CDMA), which allows multiple beams to be transmitted simultaneously. Together, these features enable a parallel scanning system which not only improves the image rate but also achieves longer inspection distances in comparison with conventional PA techniques. PMID:22247675

  4. Periodontal plastic surgery of gingival recessions at single and multiple teeth.

    PubMed

    Cairo, Francesco

    2017-10-01

    This manuscript reviews periodontal plastic surgery for root coverage at single and multiple gingival recessions. Techniques are assessed based on biological principles, surgical procedures, prognostic factors, and expected clinical and esthetic outcomes. The use of the coronally advanced flap, laterally sliding flap, free gingival graft, tunnel grafting technique, barrier membranes, enamel matrix derivative, collagen matrix, and acellular dermal matrix is evaluated. The clinical scenario and practical implications are analyzed according to a modern evidence-based approach. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. [Development of the automatic dental X-ray film processor].

    PubMed

    Bai, J; Chen, H

    1999-07-01

    This paper introduces a multiple-point technique for detecting the density of dental X-ray films. With the infrared multiple-point detection technique, a single-chip microcomputer control system analyzes the effectiveness of film developing in real time in order to achieve a good image. Based on this new technology, we designed an intelligent automatic dental X-ray film processor.

  6. A novel fiber-free technique for brain activity imaging in multiple freely behaving mice

    NASA Astrophysics Data System (ADS)

    Inagaki, Shigenori; Agetsuma, Masakazu; Nagai, Takeharu

    2018-02-01

    Brain functions and related psychiatric disorders have been investigated by recording electrophysiological field potentials. Conventional recording methods require fiber-based apparatus connected to the brain, which hampers simultaneous measurement in multiple animals (e.g., through a tangle of fibers). Here, we propose a fiber-free recording technique in conjunction with a ratiometric bioluminescent voltage indicator. Our method allows investigation of electrophysiological field potential dynamics in multiple freely behaving animals simultaneously over a long time period. This fiber-free technique therefore opens the way to investigating new mechanisms of brain function that govern social behaviors and animal-to-animal interaction.

  7. Multiple-Group Analysis Using the sem Package in the R System

    ERIC Educational Resources Information Center

    Evermann, Joerg

    2010-01-01

    Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…

  8. Faster Double-Size Bipartite Multiplication out of Montgomery Multipliers

    NASA Astrophysics Data System (ADS)

    Yoshino, Masayuki; Okeya, Katsuyuki; Vuillaume, Camille

    This paper proposes novel algorithms for computing double-size modular multiplications with few modulus-dependent precomputations. Low-end devices such as smartcards are usually equipped with hardware Montgomery multipliers. However, due to the progress of mathematical attacks, security institutions such as NIST have steadily demanded longer bit-lengths for public-key cryptography, quickly making the multipliers obsolete. In an attempt to extend the lifespan of such multipliers, double-size techniques compute modular multiplications with twice the bit-length of the multipliers. Techniques are known for extending the bit-length of classical Euclidean multipliers, of Montgomery multipliers, and of the combination thereof, namely bipartite multipliers. However, unlike classical and bipartite multiplications, Montgomery multiplications involve modulus-dependent precomputations, which amount to a large part of an RSA encryption or signature verification. The proposed double-size technique simulates double-size multiplications based on single-size Montgomery multipliers, and yet precomputations are essentially free: in a 2048-bit RSA encryption or signature verification with public exponent e = 2^16 + 1, the proposal with a 1024-bit Montgomery multiplier is at least 1.5 times faster than previous double-size Montgomery multiplications.
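
    The single-size Montgomery multiplication primitive that double-size techniques build on can be sketched as follows; the toy modulus and radix are illustrative, and the double-size simulation itself is not reproduced.

```python
def montgomery_mul(a, b, n, r_bits):
    # REDC: returns a*b*r^(-1) mod n, where r = 2^r_bits and gcd(n, r) = 1
    r = 1 << r_bits
    n_inv = pow(-n, -1, r)            # n' with n * n' = -1 (mod r)
    t = a * b
    m = (t * n_inv) % r
    u = (t + m * n) >> r_bits         # exact division by r (no trial division)
    return u - n if u >= n else u

n, r_bits = 101, 8                    # toy modulus and radix r = 2^8
r = 1 << r_bits
a, b = 5, 7
am, bm = (a * r) % n, (b * r) % n     # convert inputs to Montgomery form
cm = montgomery_mul(am, bm, n, r_bits)
print((cm * pow(r, -1, n)) % n)       # 35 == (5 * 7) % 101
```

    The value n_inv is the modulus-dependent precomputation the abstract refers to; the paper's contribution is arranging double-size multiplications so such precomputations are essentially free.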

  9. Multiple access techniques and spectrum utilization of the GLOBALSTAR mobile satellite system

    NASA Astrophysics Data System (ADS)

    Louie, Ming; Cohen, Michel; Rouffet, Denis; Gilhousen, Klein S.

    The GLOBALSTAR System is a Low Earth Orbit (LEO) satellite-based mobile communications system that is interoperable with the current and future Public Land Mobile Network (PLMN). The GLOBALSTAR System concept is based upon technological advancement in two key areas: (1) LEO satellite technology; and (2) cellular telephone technology, including the commercial applications of Code Division Multiple Access (CDMA) technologies and the most recent progress in Time Division Multiple Access (TDMA) technologies. The GLOBALSTAR System uses elements of CDMA, Frequency Division Multiple Access (FDMA), and TDMA technology, combined with satellite Multiple Beam Antenna (MBA) technology, to arrive at one of the most efficient modulation and multiple access systems ever proposed for a satellite communications system. The technology used in GLOBALSTAR exploits the following techniques to obtain high spectral efficiency and affordable cost per channel with minimum coordination among different systems: power control in open and closed loops, voice activation, spot beam satellite antennas for frequency reuse, weighted satellite antenna gain, multiple satellite coverage, and handoff between satellites. The GLOBALSTAR system design will use the following frequency bands: 1610-1626.5 MHz for the up-link and 2483.5-2500 MHz for the down-link.

  10. An inverter-based capacitive trans-impedance amplifier readout with offset cancellation and temporal noise reduction for IR focal plane array

    NASA Astrophysics Data System (ADS)

    Chen, Hsin-Han; Hsieh, Chih-Cheng

    2013-09-01

    This paper presents a readout integrated circuit (ROIC) with an inverter-based capacitive trans-impedance amplifier (CTIA) and a pseudo-multiple sampling technique for infrared focal plane arrays (IRFPA). The proposed inverter-based CTIA with a coupling capacitor [1], which executes an auto-zeroing technique to cancel the offset voltage variation caused by process variation, is used in place of the differential amplifier of a conventional CTIA. A tunable detector bias is applied from a global external bias before exposure. This scheme not only maintains a stable detector bias voltage and signal injection efficiency, but also reduces the pixel area. A pseudo-multiple sampling technique [2] is adopted to reduce the temporal noise of the readout circuit. Its noise reduction performance is comparable to that of conventional multiple sampling without the readout time growing in proportion to the number of samples. A CMOS image sensor chip with a 55×65 pixel array has been fabricated in 0.18 µm CMOS technology. It achieves a 12 µm × 12 µm pixel size, a frame rate of 72 fps, a power per pixel of 0.66 µW/pixel, and a readout temporal noise of 1.06 mVrms (with 16 pseudo-multiple samples).

  11. Slowness based CCP stacking technique in suppressing crustal multiples

    NASA Astrophysics Data System (ADS)

    Guan, Z.; Niu, F.

    2016-12-01

    Common-conversion-point (CCP) stacking of receiver functions is a widely used technique to image velocity discontinuities in the mantle, such as the lithosphere-asthenosphere boundary (LAB) in the upper mantle and the 410-km and 660-km discontinuities in the mantle transition zone. In a layered medium, a teleseismic record can be considered the summation of the direct arrival and a series of conversions and reflections at boundaries below the station. Receiver functions attempt to approximate the Green's function associated with structure beneath the receiver by deconvolving one component of a teleseismic signal from another to remove source signals from seismograms. The CCP technique assumes that receiver functions are composed solely of P-to-S conversions at velocity boundaries, whose depths can be mapped out through their arrival times. Multiple reflections at shallow boundaries with large velocity contrasts, such as the base of unconsolidated sediments and the Moho, can pose significant challenges to the accuracy of CCP imaging. In principle, P-to-S conversions and multiples originating from deep and shallow boundaries arrive at a seismic station with incidence angles that are, respectively, smaller and larger than that of the direct P wave. Therefore the corresponding slowness can be used to isolate the conversions from the multiples, allowing multiple-induced artifacts to be minimized. We developed a refined CCP stacking method that uses relative slowness as a weighting factor to suppress the multiples. We performed extensive numerical tests with synthetic data to seek the best weighting scheme and to verify the robustness of the images. We applied the refined technique to the NECESSArray data and found that the complicated low-velocity structures in the depth range of 200-400 km shown in the CCP images of previous studies are mostly artifacts resulting from crustal multiples.
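
    The slowness-based weighting idea can be sketched as follows; the Gaussian taper, its width, and the units are hypothetical stand-ins, since the abstract does not specify the paper's actual weighting scheme.

```python
import math

def slowness_weight(dp, sigma=0.01):
    # dp: phase slowness minus direct-P slowness (s/km). Conversions (dp <= 0)
    # keep full weight; multiples (dp > 0) are tapered by a hypothetical Gaussian.
    return 1.0 if dp <= 0 else math.exp(-(dp / sigma) ** 2)

def weighted_stack(amplitudes, rel_slowness):
    # Stack receiver-function amplitudes, downweighting multiple-like arrivals
    w = [slowness_weight(dp) for dp in rel_slowness]
    return sum(a * wi for a, wi in zip(amplitudes, w)) / sum(w)

# A conversion-like arrival (dp < 0) dominates the stack; a multiple-like
# arrival (dp > 0) contributes almost nothing.
print(weighted_stack([1.0, 1.0], [-0.01, 0.05]))
```

    In the refined CCP method, such a weight is applied per trace per depth bin before stacking, so energy with multiple-like slowness is suppressed rather than hard-muted.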

  12. Using cognitive task analysis to develop simulation-based training for medical tasks.

    PubMed

    Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette

    2013-10-01

    Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks-cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  13. An expert system shell for inferring vegetation characteristics: Implementation of additional techniques (task E)

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann

    1992-01-01

    The NASA VEGetation Workbench (VEG) is a knowledge based system that infers vegetation characteristics from reflectance data. The VEG subgoal PROPORTION.GROUND.COVER has been completed and a number of additional techniques that infer the proportion ground cover of a sample have been implemented. Some techniques operate on sample data at a single wavelength. The techniques previously incorporated in VEG for other subgoals operated on data at a single wavelength so implementing the additional single wavelength techniques required no changes to the structure of VEG. Two techniques which use data at multiple wavelengths to infer proportion ground cover were also implemented. This work involved modifying the structure of VEG so that multiple wavelength techniques could be incorporated. All the new techniques were tested using both the VEG 'Research Mode' and the 'Automatic Mode.'

  14. Surface-based reconstruction and diffusion MRI in the assessment of gray and white matter damage in multiple sclerosis

    NASA Astrophysics Data System (ADS)

    Caffini, Matteo; Bergsland, Niels; Laganà, Marcella; Tavazzi, Eleonora; Tortorella, Paola; Rovaris, Marco; Baselli, Giuseppe

    2014-03-01

    Despite advances in the application of nonconventional MRI techniques in furthering the understanding of multiple sclerosis pathogenic mechanisms, there are still many unanswered questions, such as the relationship between gray and white matter damage. We applied a combination of advanced surface-based reconstruction and diffusion tensor imaging techniques to address this issue. We found significant relationships between white matter tract integrity indices and corresponding cortical structures. Our results suggest a direct link between damage in white and gray matter and contribute to the notion of gray matter loss relating to clinical disability.

  15. Detection of abrupt changes in dynamic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1984-01-01

    Some of the basic ideas associated with the detection of abrupt changes in dynamic systems are presented. Multiple-filter-based techniques, residual-based methods, and the multiple model and generalized likelihood ratio methods are considered. Issues such as the effect of an unknown onset time on algorithm complexity and structure, as well as robustness to model uncertainty, are discussed.
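
    A minimal residual-based sketch of the generalized likelihood ratio idea with unknown onset time: for each candidate onset, the likelihood of a mean jump in the whitened innovations is evaluated, and the maximizing onset is reported. The signal, jump size, and statistic form are illustrative, not taken from the report.

```python
import numpy as np

# GLR sketch for an abrupt mean jump at an unknown onset time: for each
# candidate onset k, the statistic is (sum of residuals from k on)^2
# normalized by variance and segment length; the maximum over k estimates
# both the presence and the onset of the change.

def glr_jump(residuals, sigma=1.0):
    """Return (max GLR statistic, estimated onset index)."""
    r = np.asarray(residuals, dtype=float)
    best, k_hat = -np.inf, 0
    for k in range(len(r)):
        seg = r[k:]
        stat = seg.sum() ** 2 / (sigma ** 2 * len(seg))
        if stat > best:
            best, k_hat = stat, k
    return best, k_hat

rng = np.random.default_rng(0)
r = rng.normal(0.0, 1.0, 200)   # whitened innovations under no change
r[120:] += 3.0                  # abrupt 3-sigma jump at sample 120
stat, onset = glr_jump(r)
print(onset)
```

    The loop over all candidate onsets is exactly the complexity cost the abstract alludes to; practical detectors bound it with a sliding window.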

  16. Optical pulse synthesis using brillouin selective sideband amplification

    NASA Technical Reports Server (NTRS)

    Yao, X. Steve (Inventor)

    2002-01-01

    Techniques for producing optical pulses based on Brillouin selective sideband amplification by using a common modulation control signal to modulate both a signal beam to produce multiple sideband signals and a single pump beam to produce multiple pump beams.

  17. Inhomogeneity compensation for MR brain image segmentation using a multi-stage FCM-based approach.

    PubMed

    Szilágyi, László; Szilágyi, Sándor M; Dávid, László; Benyó, Zoltán

    2008-01-01

    Intensity inhomogeneity or intensity non-uniformity (INU) is an undesired phenomenon that represents the main obstacle for MR image segmentation and registration methods. Various techniques have been proposed to eliminate or compensate for the INU, most of which are embedded into clustering algorithms. This paper proposes a multiple-stage fuzzy c-means (FCM) based algorithm for the estimation and compensation of the slowly varying additive or multiplicative noise, supported by a pre-filtering technique for Gaussian and impulse noise elimination. The slowly varying behavior of the bias or gain field is assured by a smoothing filter that performs a context-dependent averaging, based on a morphological criterion. The experiments using 2-D synthetic phantoms and real MR images show that the proposed method provides accurate segmentation. The produced segmentation and fuzzy membership values can serve as excellent support for 3-D registration and segmentation techniques.
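
    The fuzzy c-means core on which such multi-stage algorithms build can be sketched compactly on 1-D intensities. The bias-field estimation and pre-filtering stages described in the abstract are not reproduced; the cluster count, fuzziness m, and synthetic intensities are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D fuzzy c-means: alternating membership and centroid updates.

def fcm(x, c=2, m=2.0, iters=50):
    """Return (sorted centroids, memberships) for intensities x."""
    v = np.linspace(x.min(), x.max(), c)            # spread initial centroids
    for _ in range(iters):
        d = np.abs(x[:, None] - v[None, :]) + 1e-12  # point-centroid distances
        # u[i,k] = 1 / sum_j (d_ik / d_ij)^(2/(m-1))  -- standard FCM update
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)),
                         axis=2)
        v = (u ** m).T @ x / np.sum(u ** m, axis=0)  # weighted centroid update
    return np.sort(v), u

# Two synthetic "tissue" intensity clusters around 30 and 100:
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(30, 2, 100), rng.normal(100, 2, 100)])
centroids, _ = fcm(x)
print(np.round(centroids, 1))
```

    In the INU setting, each FCM stage would additionally divide (or subtract) a smoothed estimate of the gain (or bias) field out of x before re-clustering.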

  18. Automated simultaneous multiple feature classification of MTI data

    NASA Astrophysics Data System (ADS)

    Harvey, Neal R.; Theiler, James P.; Balick, Lee K.; Pope, Paul A.; Szymanski, John J.; Perkins, Simon J.; Porter, Reid B.; Brumby, Steven P.; Bloch, Jeffrey J.; David, Nancy A.; Galassi, Mark C.

    2002-08-01

    Los Alamos National Laboratory has developed and demonstrated a highly capable system, GENIE, for the two-class problem of detecting a single feature against a background of non-feature. In addition to the two-class case, however, a commonly encountered remote sensing task is the segmentation of multispectral image data into a larger number of distinct feature classes or land cover types. To this end we have extended our existing system to allow the simultaneous classification of multiple features/classes from multispectral data. The technique builds on previous work and its core continues to utilize a hybrid evolutionary-algorithm-based system capable of searching for image processing pipelines optimized for specific image feature extraction tasks. We describe the improvements made to the GENIE software to allow multiple-feature classification and describe the application of this system to the automatic simultaneous classification of multiple features from MTI image data. We show the application of the multiple-feature classification technique to the problem of classifying lava flows on Mauna Loa volcano, Hawaii, using MTI image data and compare the classification results with standard supervised multiple-feature classification techniques.

  19. A near-optimal low complexity sensor fusion technique for accurate indoor localization based on ultrasound time of arrival measurements from low-quality sensors

    NASA Astrophysics Data System (ADS)

    Mitilineos, Stelios A.; Argyreas, Nick D.; Thomopoulos, Stelios C. A.

    2009-05-01

    A fusion-based localization technique for location-based services in indoor environments is introduced herein, based on ultrasound time-of-arrival measurements from multiple off-the-shelf range-estimating sensors that are used in a market-available localization system. In-situ field measurement results indicated that the respective off-the-shelf system was unable to estimate position in most cases, while the underlying sensors are of low quality and yield highly inaccurate range and position estimates. An extensive analysis is performed and a model of the sensor performance characteristics is established. A low-complexity but accurate sensor fusion and localization technique is then developed, which consists of evaluating multiple sensor measurements and selecting the one that is considered most accurate based on the underlying sensor model. Optimality, in the sense of a genie selecting the optimum sensor, is subsequently evaluated and compared to the proposed technique. The experimental results indicate that the proposed fusion method exhibits near-optimal performance and, albeit theoretically suboptimal, it largely overcomes most flaws of the underlying single-sensor system, resulting in a localization system of increased accuracy, robustness and availability.
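
    The select-the-best fusion rule can be sketched in a few lines: each reading carries a predicted error from a per-sensor accuracy model, and the reading with the smallest predicted error is kept. The readings and the constant per-sensor error figures below are illustrative assumptions, not the paper's fitted sensor model.

```python
# Select-the-best sensor fusion: keep the range whose sensor model
# predicts the smallest measurement error.

def fuse_select(readings):
    """readings: list of (range_m, predicted_std_m); return the most
    accurate range estimate."""
    return min(readings, key=lambda r: r[1])[0]

# Three sensors ranging the same target; the third is modelled as best:
readings = [(4.90, 0.30), (5.40, 0.90), (5.05, 0.12)]
print(fuse_select(readings))  # → 5.05
```

    A genie-aided selector (the optimality benchmark in the abstract) would instead pick the reading with the smallest *actual* error, which is unknowable in operation.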

  20. Damage Evaluation Based on a Wave Energy Flow Map Using Multiple PZT Sensors

    PubMed Central

    Liu, Yaolu; Hu, Ning; Xu, Hong; Yuan, Weifeng; Yan, Cheng; Li, Yuan; Goda, Riu; Alamusi; Qiu, Jinhao; Ning, Huiming; Wu, Liangke

    2014-01-01

    A new wave energy flow (WEF) map concept was proposed in this work. Based on it, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage as well as to realize visualization of wave propagation. In this technique, a simple signal processing algorithm was proposed to construct the WEF map when waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors were employed to improve inspection reliability. Various damages in aluminum and carbon fiber reinforced plastic laminated plates were experimentally and numerically evaluated to validate this technique. The results show that it can effectively evaluate the shape and size of damage from wave field variations around the damage in the WEF map. PMID:24463430

  1. Application of the multiple PRF technique to resolve Doppler centroid estimation ambiguity for spaceborne SAR

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    Estimation of the Doppler centroid ambiguity is a necessary element of the signal processing for SAR systems with large antenna pointing errors. Without proper resolution of the Doppler centroid estimation (DCE) ambiguity, the image quality will be degraded in the system impulse response function and the geometric fidelity. Two techniques for resolving DCE ambiguity for spaceborne SAR are presented: a brief review of the range cross-correlation technique, and a new technique using multiple pulse repetition frequencies (PRFs). For SAR systems where other performance factors control selection of the PRF, an algorithm is devised to resolve the ambiguity using PRFs of arbitrary numerical values. The performance of this multiple-PRF technique is analyzed based on a statistical error model. An example demonstrates that, for the Shuttle Imaging Radar-C (SIR-C) C-band SAR, the probability of correct ambiguity resolution is higher than 95 percent for antenna attitude errors as large as 3 deg.
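
    The core of the multiple-PRF idea can be sketched as a consistency search: each PRF observes the centroid only modulo that PRF, and the absolute centroid is the value most consistent with every measurement. The PRF values, search range, and brute-force step below are illustrative assumptions, not SIR-C parameters or the paper's algorithm.

```python
# Multiple-PRF Doppler centroid ambiguity resolution (brute-force sketch):
# find the absolute centroid f minimizing the summed circular distance of
# (f - measured_i) modulo PRF_i from zero.

def resolve_centroid(measured, prfs, f_max=10000, step=1):
    """Search candidate centroids in [-f_max, f_max] Hz."""
    best_f, best_err = 0.0, float("inf")
    for k in range(-f_max, f_max + 1, step):
        f = float(k)
        err = sum(min((f - m) % p, p - (f - m) % p)
                  for m, p in zip(measured, prfs))
        if err < best_err:
            best_f, best_err = f, err
    return best_f

prfs = [1500.0, 1700.0]
true_f = 3725.0
measured = [true_f % p for p in prfs]    # what each PRF actually observes
print(resolve_centroid(measured, prfs))  # → 3725.0
```

    With co-prime-like PRFs the solution is unique over a window of length lcm(PRF1, PRF2), which is what makes arbitrary PRF values workable.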

  2. Multidimensional Test Assembly Based on Lagrangian Relaxation Techniques. Research Report 98-08.

    ERIC Educational Resources Information Center

    Veldkamp, Bernard P.

    In this paper, a mathematical programming approach is presented for the assembly of ability tests measuring multiple traits. The values of the variance functions of the estimators of the traits are minimized, while test specifications are met. The approach is based on Lagrangian relaxation techniques and provides good results for the two…

  3. Application of Multiregressive Linear Models, Dynamic Kriging Models and Neural Network Models to Predictive Maintenance of Hydroelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Lucifredi, A.; Mazzieri, C.; Rossi, M.

    2000-05-01

    Since the operational conditions of a hydroelectric unit can vary within a wide range, the monitoring system must be able to distinguish between variations of the monitored variable caused by changes in the operating conditions and those due to the onset and progression of failures and misoperations. The paper aims to identify the best technique to adopt for the monitoring system. Three different methods have been implemented and compared. Two of them use statistical techniques: the first, linear multiple regression, expresses the monitored variable as a linear function of the process parameters (independent variables), while the second, the dynamic kriging technique, is a modified multiple linear regression that represents the monitored variable as a linear combination of the process variables in such a way as to minimize the variance of the estimation error. The third is based on neural networks. Tests have shown that the monitoring system based on the kriging technique is not affected by some problems common to the other two models, e.g., the requirement of a large amount of data for tuning (both for training the neural network and for defining the optimum plane for the multiple regression), not only in the system's starting phase but also after a trivial maintenance operation involving the substitution of machinery components that directly affect the observed variable, or the need for different models to describe satisfactorily the different operating ranges of the plant. The monitoring system based on the kriging statistical technique overcomes these difficulties: it does not require a large amount of data for tuning and is immediately operational (given two points, a third can be estimated immediately); in addition, the model follows the system without adapting itself to it.
The results of the experimentation seem to indicate that a model based on a neural network or on linear multiple regression is not optimal, and that a different approach is necessary to reduce the amount of work during the learning phase, using, when available, all the information stored during the initial phase of the plant to build the reference baseline and elaborating, where appropriate, the raw information available. A mixed approach using the kriging statistical technique and neural network techniques could optimise the result.
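
    The linear-multiple-regression baseline described above can be sketched as follows: fit the monitored variable against the process parameters during healthy operation, then flag readings whose residual leaves a 3-sigma band. The synthetic data, coefficients, and threshold are illustrative assumptions.

```python
import numpy as np

# Regression-based condition monitoring sketch: learn the healthy
# relationship y = f(process parameters), then test new readings against
# the residual scatter of the fit.

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 3))                  # process parameters
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 4.0 + rng.normal(0, 0.05, 200)    # healthy vibration level

A = np.hstack([X, np.ones((200, 1))])                 # add intercept column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
sigma = np.std(y - A @ beta)                          # residual scatter

def is_anomalous(x_new, y_new, k=3.0):
    """Flag a reading outside the k-sigma band around the prediction."""
    pred = np.dot(beta[:3], x_new) + beta[3]
    return abs(y_new - pred) > k * sigma

print(is_anomalous(np.array([0.5, 0.5, 0.5]), 4.75))  # healthy point
print(is_anomalous(np.array([0.5, 0.5, 0.5]), 6.00))  # developing fault
```

    The abstract's criticism applies directly: this fit needs the 200 healthy samples up front, whereas the kriging interpolator can start estimating from very few points.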

  4. Multiplexed Affinity-Based Separation of Proteins and Cells Using Inertial Microfluidics.

    PubMed

    Sarkar, Aniruddh; Hou, Han Wei; Mahan, Alison E; Han, Jongyoon; Alter, Galit

    2016-03-30

    Isolation of low-abundance proteins or rare cells from complex mixtures, such as blood, is required for many diagnostic, therapeutic and research applications. Current affinity-based protein or cell separation methods use binary 'bind-elute' separations and are inefficient when applied to the isolation of multiple low-abundance proteins or cell types. We present a method for rapid, multiplexed, yet inexpensive affinity-based isolation of both proteins and cells, using a size-coded mixture of multiple affinity-capture microbeads and an inertial microfluidic particle sorter device. In a single binding step, different targets (cells or proteins) bind to beads of different sizes, which are then sorted by flowing them through a spiral microfluidic channel. This technique performs continuous-flow, high-throughput affinity separation of milligram-scale protein samples or millions of cells in minutes after binding. We demonstrate the simultaneous isolation of multiple antibodies from serum and multiple cell types from peripheral blood mononuclear cells or whole blood. We use the technique to isolate low-abundance antibodies specific to different HIV antigens and rare HIV-specific cells from blood obtained from HIV+ patients.

  5. ACM-based automatic liver segmentation from 3-D CT images by combining multiple atlases and improved mean-shift techniques.

    PubMed

    Ji, Hongwei; He, Jiangping; Yang, Xin; Deklerck, Rudi; Cornelis, Jan

    2013-05-01

    In this paper, we present an autocontext model (ACM)-based automatic liver segmentation algorithm, which combines ACM, multiple atlases, and mean-shift techniques to segment the liver from 3-D CT images. Our algorithm is a learning-based method and can be divided into two stages. At the first stage, i.e., the training stage, ACM is performed to learn a sequence of classifiers in each atlas space (based on each atlas and other aligned atlases). With the use of multiple atlases, multiple sequences of ACM-based classifiers are obtained. At the second stage, i.e., the segmentation stage, the test image is segmented in each atlas space by applying each sequence of ACM-based classifiers. The final segmentation result is obtained by fusing the segmentation results from all atlas spaces via a multiclassifier fusion technique. Specifically, in order to speed up segmentation, given a test image, we first use an improved mean-shift algorithm to perform over-segmentation and then implement region-based image labeling instead of the original, inefficient pixel-based image labeling. The proposed method is evaluated on the datasets of the MICCAI 2007 liver segmentation challenge. The experimental results show that the average volume overlap error and the average surface distance achieved by our method are 8.3% and 1.5 mm, respectively, which are comparable to the results reported in the existing state-of-the-art work on liver segmentation.

  6. Digital micromirror device as amplitude diffuser for multiple-plane phase retrieval

    NASA Astrophysics Data System (ADS)

    Abregana, Timothy Joseph T.; Hermosa, Nathaniel P.; Almoro, Percival F.

    2017-06-01

    Previous implementations of the phase diffuser used in the multiple-plane phase retrieval method included a diffuser glass plate with fixed optical properties or a programmable yet expensive spatial light modulator. Here, a model for phase retrieval based on a digital micromirror device as an amplitude diffuser is presented. The technique offers a programmable, convenient, and low-cost amplitude diffuser for non-stagnating iterative phase retrieval. The technique is demonstrated in the reconstructions of smooth object wavefronts.

  7. A ROle-Oriented Filtering (ROOF) approach for collaborative recommendation

    NASA Astrophysics Data System (ADS)

    Ghani, Imran; Jeong, Seung Ryul

    2016-09-01

    In collaborative filtering (CF) recommender systems, existing techniques frequently focus on determining similarities among users' historical interests. This generally refers to situations in which each user normally plays a single role and his/her taste remains consistent over the long term. However, we note that existing techniques have not been significantly employed in a role-oriented context. This is especially so in situations where users may change their roles over time or play multiple roles simultaneously, while still expecting to access relevant information resources accordingly. Such systems include enterprise architecture management systems, e-commerce sites or journal management systems. In scenarios involving existing techniques, each user needs to build up very different profiles (preferences and interests) based on multiple roles which change over time. Should this not occur to a satisfactory degree, their previous information will either be lost or not utilised at all. To limit the occurrence of such issues, we propose a ROle-Oriented Filtering (ROOF) approach focusing on the manner in which multiple user profiles are obtained and maintained over time. We conducted a number of experiments using an enterprise architecture management scenario. In so doing, we observed that the ROOF approach performs better in comparison with other existing collaborative filtering-based techniques.

  8. Systematic Review of Synthetic Computed Tomography Generation Methodologies for Use in Magnetic Resonance Imaging-Only Radiation Therapy.

    PubMed

    Johnstone, Emily; Wyatt, Jonathan J; Henry, Ann M; Short, Susan C; Sebag-Montefiore, David; Murray, Louise; Kelly, Charles G; McCallum, Hazel M; Speight, Richard

    2018-01-01

    Magnetic resonance imaging (MRI) offers superior soft-tissue contrast as compared with computed tomography (CT), which is conventionally used for radiation therapy treatment planning (RTP) and patient positioning verification, resulting in improved target definition. The 2 modalities are co-registered for RTP; however, this introduces a systematic error. Implementing an MRI-only radiation therapy workflow would be advantageous because this error would be eliminated, the patient pathway simplified, and patient dose reduced. Unlike CT, in MRI there is no direct relationship between signal intensity and electron density; however, various methodologies for MRI-only RTP have been reported. A systematic review of these methods was undertaken. The PRISMA guidelines were followed. Embase and Medline databases were searched (1996 to March 2017) for studies that generated synthetic CT scans (sCTs) for MRI-only radiation therapy. Sixty-one articles met the inclusion criteria. This review showed that MRI-only RTP techniques can be grouped into 3 categories: (1) bulk density override; (2) atlas-based; and (3) voxel-based techniques, all of which produce an sCT scan from MR images. Bulk density override techniques used either a single homogeneous override or multiple tissue overrides. The former produced large dosimetric errors (>2%) in some cases, and the latter frequently required manual bone contouring. Atlas-based techniques used both single and multiple atlases and included methods incorporating pattern recognition techniques. Clinically acceptable sCTs were reported, but atypical anatomy led to erroneous results in some cases. Voxel-based techniques included methods using routine and specialized MRI sequences, namely ultra-short echo time imaging. High-quality sCTs were produced; however, use of multiple sequences led to long scanning times, increasing the chances of patient movement. Using nonroutine sequences would currently be problematic in most radiation therapy centers.
Atlas-based and voxel-based techniques were found to be the most clinically useful methods, with some studies reporting dosimetric differences of <1% between planning on the sCT and CT and <1-mm deviations when using sCTs for positional verification. Copyright © 2017 Elsevier Inc. All rights reserved.
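
    The simplest of the three categories, the bulk density override, amounts to a lookup: each voxel's tissue class (from an MRI segmentation) is mapped to a representative CT number. The class names and HU values in this sketch are illustrative assumptions, not values prescribed by any of the reviewed studies.

```python
# Bulk-density-override sketch: replace each segmented tissue label with
# a representative Hounsfield unit to form the synthetic CT (sCT).

HU_OVERRIDE = {"air": -1000, "soft_tissue": 40, "bone": 700}

def synthetic_ct(voxel_labels):
    """Map tissue labels to bulk HU values."""
    return [HU_OVERRIDE[label] for label in voxel_labels]

print(synthetic_ct(["air", "soft_tissue", "bone", "soft_tissue"]))
# → [-1000, 40, 700, 40]
```

    The review's caveat is visible here: the multiple-tissue variant needs a bone label per patient, which is where the manual contouring burden enters.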

  9. Extending the Distributed Lag Model framework to handle chemical mixtures.

    PubMed

    Bello, Ghalib A; Arora, Manish; Austin, Christine; Horton, Megan K; Wright, Robert O; Gennings, Chris

    2017-07-01

    Distributed Lag Models (DLMs) are used in environmental health studies to analyze the time-delayed effect of an exposure on an outcome of interest. Given the increasing need for analytical tools for evaluation of the effects of exposure to multi-pollutant mixtures, this study attempts to extend the classical DLM framework to accommodate and evaluate multiple longitudinally observed exposures. We introduce 2 techniques for quantifying the time-varying mixture effect of multiple exposures on an outcome of interest. Lagged WQS, the first technique, is based on Weighted Quantile Sum (WQS) regression, a penalized regression method that estimates mixture effects using a weighted index. We also introduce Tree-based DLMs, a nonparametric alternative for assessment of lagged mixture effects. This technique is based on the Random Forest (RF) algorithm, a nonparametric, tree-based estimation technique that has shown excellent performance in a wide variety of domains. In a simulation study, we tested the feasibility of these techniques and evaluated their performance in comparison to standard methodology. Both methods exhibited relatively robust performance, accurately capturing pre-defined non-linear functional relationships in different simulation settings. Further, we applied these techniques to data on perinatal exposure to environmental metal toxicants, with the goal of evaluating the effects of exposure on neurodevelopment. Our methods identified critical neurodevelopmental windows showing significant sensitivity to metal mixtures. Copyright © 2017 Elsevier Inc. All rights reserved.
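
    The weighted-index ingredient of Lagged WQS can be sketched as follows: exposures observed at several lags are quantile-scored, then combined into a single index whose weights lie on the unit simplex. The fixed weights below are an illustrative assumption; in the actual method they are estimated by penalized regression.

```python
import numpy as np

# Building blocks of a weighted-quantile-sum (WQS) index over lagged
# exposures: quantile-score each lag, then form a simplex-weighted sum.

def quantile_score(x, q=4):
    """Map each value to its quantile bin 0..q-1 (quartiles by default)."""
    edges = np.quantile(x, np.linspace(0, 1, q + 1)[1:-1])
    return np.searchsorted(edges, x, side="right")

def wqs_index(lagged_exposures, weights):
    """lagged_exposures: (n_subjects, n_lags); weights sum to 1."""
    scored = np.column_stack([quantile_score(col)
                              for col in lagged_exposures.T])
    return scored @ np.asarray(weights)

# Four subjects, two lags; early lag weighted more heavily (hypothetical):
X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0], [4.0, 40.0]])
w = [0.7, 0.3]
print(wqs_index(X, w))  # → [0. 1. 2. 3.]
```

    The Tree-based DLM alternative would instead feed the raw lagged columns of X to a random forest, letting the trees discover the lag structure nonparametrically.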

  10. Hypergraph-Based Combinatorial Optimization of Matrix-Vector Multiplication

    ERIC Educational Resources Information Center

    Wolf, Michael Maclean

    2009-01-01

    Combinatorial scientific computing plays an important enabling role in computational science, particularly in high performance scientific computing. In this thesis, we will describe our work on optimizing matrix-vector multiplication using combinatorial techniques. Our research has focused on two different problems in combinatorial scientific…
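
    The optimization target can be made concrete with a small sketch: for a given row assignment in parallel sparse matrix-vector multiplication, the communication volume is the number of x-entries each part must fetch from other parts, which is the quantity hypergraph models (unlike graph models) count exactly. The matrix and the partition below are illustrative.

```python
# Sparse matrix-vector multiplication in CSR form, plus the communication
# volume induced by a row/column partition -- the metric hypergraph
# partitioning minimizes.

def spmv_csr(indptr, indices, data, x):
    """y = A @ x for a CSR matrix."""
    y = []
    for i in range(len(indptr) - 1):
        y.append(sum(data[k] * x[indices[k]]
                     for k in range(indptr[i], indptr[i + 1])))
    return y

def comm_volume(indptr, indices, row_part, col_part):
    """Count distinct x-entries each part needs from other parts."""
    needed = {}
    for i in range(len(indptr) - 1):
        p = row_part[i]
        for k in range(indptr[i], indptr[i + 1]):
            j = indices[k]
            if col_part[j] != p:
                needed.setdefault(p, set()).add(j)
    return sum(len(s) for s in needed.values())

# 4x4 example: A = [[2,1,0,0],[0,3,0,1],[0,0,4,0],[5,0,0,6]]
indptr  = [0, 2, 4, 5, 7]
indices = [0, 1, 1, 3, 2, 0, 3]
data    = [2, 1, 3, 1, 4, 5, 6]
x = [1, 1, 1, 1]
print(spmv_csr(indptr, indices, data, x))             # → [3, 4, 4, 11]
print(comm_volume(indptr, indices, [0, 0, 1, 1], [0, 0, 1, 1]))  # → 2
```

    Here rows {0,1} and {2,3} form two parts: part 0 must fetch x[3] and part 1 must fetch x[0], for a volume of 2; a better assignment would drive this count down.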

  11. [Construction of multiple drug release system based on components of traditional Chinese medicine].

    PubMed

    Liu, Dan; Jia, Xiaobin; Yu, Danhong; Zhang, Zhenhai; Sun, E

    2012-08-01

    With the development of the modernization drive of traditional Chinese medicine (TCM) preparations, research on new TCM dosage forms has become a hot spot in the field. Because of the complexity of TCM components and the uncertainty of their material base, there is still no scientific system for modern TCM dosage forms. Modern TCM preparations must inevitably take into account the multi-component nature of TCM and its multi-link, multi-target functional characteristics. The author suggests building a multiple drug release system for TCM using diverse preparation techniques and drug release methods at multiple levels, on the basis of the nature and functional characteristics of TCM components. This essay elaborates the ideas behind building the multiple traditional Chinese medicine release system, its theoretical basis, preparation techniques and assessment system, and current problems and solutions, in order to build a multiple TCM release system with a view to enhancing the bioavailability of TCM components and providing a new form for TCM preparations.

  12. Application of multiple modelling to hyperthermia estimation: reducing the effects of model mismatch.

    PubMed

    Potocki, J K; Tharp, H S

    1993-01-01

    Multiple model estimation is a viable technique for dealing with the spatial perfusion model mismatch associated with hyperthermia dosimetry. Using multiple models, spatial discrimination can be obtained without increasing the number of unknown perfusion zones. Two multiple model estimators based on the extended Kalman filter (EKF) are designed and compared with two EKFs based on single models having greater perfusion zone segmentation. Results given here indicate that multiple modelling is advantageous when the number of thermal sensors is insufficient for convergence of single model estimators having greater perfusion zone segmentation. In situations where sufficient measured outputs exist for greater unknown perfusion parameter estimation, the multiple model estimators and the single model estimators yield equivalent results.
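
    The multiple-model idea can be sketched in scalar form: several candidate models (standing in for different perfusion assumptions) each predict the measurement sequence, and Bayes' rule updates the model probabilities from the prediction errors. The decay-model dynamics, noise level, and candidate values are illustrative assumptions, not the paper's EKF bank.

```python
import math

# Static multiple-model estimation sketch: propagate each candidate model,
# score its prediction error with a Gaussian likelihood, and renormalize
# the model probabilities after every measurement.

def model_probabilities(zs, decays, x0=100.0, meas_std=1.0):
    probs = [1.0 / len(decays)] * len(decays)
    xs = [x0] * len(decays)
    for z in zs:
        likes = []
        for i, a in enumerate(decays):
            xs[i] *= a                                  # model i's prediction
            e = z - xs[i]
            likes.append(math.exp(-0.5 * (e / meas_std) ** 2))
        probs = [p * l for p, l in zip(probs, likes)]
        s = sum(probs)
        probs = [p / s for p in probs]
    return probs

# Truth decays with a = 0.9; the candidate set is {0.8, 0.9}:
zs = [100.0 * 0.9 ** k for k in range(1, 8)]
probs = model_probabilities(zs, [0.8, 0.9])
print([round(p, 3) for p in probs])
```

    The fused state estimate is then the probability-weighted combination of the per-model estimates, which is how spatial discrimination is gained without adding unknown perfusion zones.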

  13. Method for improving accuracy in full evaporation headspace analysis.

    PubMed

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-05-01

    We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by changes in the solid content of the samples has a great impact on the measurement accuracy of conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique is effective in minimizing this problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
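
    The multiple headspace extraction (MHE) arithmetic rests on the fact that successive extraction peak areas decay geometrically, A_i = A_1 * r**(i-1), so the total analyte response is the geometric-series sum A_1 / (1 - r). The sketch below fits r by log-linear regression on synthetic areas; it illustrates the standard MHE calculation, not the paper's specific calibration.

```python
import math

# MHE total-response estimate: fit ln(A_i) = ln(A_1) + (i-1) ln(r) by
# least squares, then sum the geometric series A_1 / (1 - r).

def mhe_total(areas):
    n = len(areas)
    xs = list(range(n))
    ys = [math.log(a) for a in areas]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    r = math.exp(slope)                    # extraction ratio
    a1 = math.exp(ybar - slope * xbar)     # first-extraction area
    return a1 / (1.0 - r)

areas = [1000.0 * 0.6 ** i for i in range(4)]   # synthetic, r = 0.6
print(round(mhe_total(areas), 1))               # → 2500.0
```

    Because the total is extrapolated from the decay ratio, the result is insensitive to how much analyte any single extraction removes, which is what makes the combination with full evaporation attractive.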

  14. Multiple-choice examinations: adopting an evidence-based approach to exam technique.

    PubMed

    Hammond, E J; McIndoe, A K; Sansome, A J; Spargo, P M

    1998-11-01

    Negatively marked multiple-choice questions (MCQs) are part of the assessment process in both the Primary and Final examinations for the fellowship of the Royal College of Anaesthetists. It is said that candidates who guess will lose marks in the MCQ paper. We studied candidates attending a pre-examination revision course and have shown that an evaluation of examination technique is an important part of an individual's preparation. All candidates benefited substantially from backing their educated guesses while only 3 out of 27 lost marks from backing their wild guesses. Failure to appreciate the relationship between knowledge and technique may significantly affect a candidate's performance in the examination.
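
    The arithmetic behind "back your educated guesses" is worth making explicit. Assuming one mark gained per correct answer and one deducted per wrong answer (the negative-marking scheme described), a guess believed correct with probability p has expected value p·(+1) + (1−p)·(−1) = 2p − 1, so any belief above 0.5 is worth backing. The probabilities below are illustrative.

```python
# Expected mark of answering under symmetric negative marking (+1 / -1).

def expected_mark(p_correct):
    """Expected marks from committing to an answer believed correct
    with probability p_correct; abstaining scores 0."""
    return 2.0 * p_correct - 1.0

print(round(expected_mark(0.5), 2))  # wild guess     → 0.0
print(round(expected_mark(0.7), 2))  # educated guess → 0.4
```

    This matches the study's finding: wild guesses are roughly break-even on average, while educated guesses (p > 0.5) reliably add marks.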

  15. Laser Based Stand-Off Detection of Biological Agents (detection a distance des agents biologiques a l’aide du laser)

    DTIC Science & Technology

    2010-02-01

    overview of their respective national update. Dr. Roy presented a new technique for evaluating the bioaerosol particle size based on a multiple...Field-of-View LIDAR technique. Mr. Levesque from INO gave an overview of their expertise in LIDAR and biophotonics. Dr. Chin from Laval University gave... techniques have the potential to detect particulate aerosols remotely at distances of many kilometres [1]. They can provide spatially resolved

  16. Conservative Approach for the Esthetic Management of Multiple Interdental Spaces: A Systematic Approach.

    PubMed

    Báez Rosales, Abelardo; De Nordenflycht Carvacho, Diego; Schlieper Cacciutolo, Ramón; Gajardo Guineo, Manuel; Gandarillas Fuentes, Claudio

    2015-01-01

    To describe a conservative approach using resin-based composites following a buccolingual layering technique with a customized silicone index for the management of multiple diastemas. This clinical article describes the case of a patient whose anterior teeth were esthetically compromised by multiple diastemas, incisal wear, and dull/low-value cervical composite resin restorations, managed with nanofilled composite resin using the "buccolingual technique" with a customized silicone index made from a wax-up to build up the restorations. The first layer of composite, representing the enamel replacement, was placed lingually directly on the silicone index, so that it provides in a single step the lingual profile and the position of the incisal edge of the restoration. Then, dentine and effect composite resins can be applied in a precise three-dimensional configuration. To solve esthetic dental problems, such as anterior diastemas, in a very conservative and even reversible way, the use of direct resin composites for layering is an excellent choice, but it should be performed with simple and reproducible techniques, such as the buccolingual technique. The clinical technique described in this paper shows the advantages of a conservative approach to correcting diastemas on maxillary anterior teeth. The application of these techniques can not only help achieve optimal esthetics, but also avoid the removal of extensive dental hard tissue and achieve a predictable final result, especially in esthetically demanding cases. © 2015 Wiley Periodicals, Inc.

  17. Image Alignment for Multiple Camera High Dynamic Range Microscopy.

    PubMed

    Eastwood, Brian S; Childs, Elisabeth C

    2012-01-09

    This paper investigates the problem of image alignment for multiple camera high dynamic range (HDR) imaging. HDR imaging combines information from images taken with different exposure settings. Combining information from multiple cameras requires an alignment process that is robust to the intensity differences in the images. HDR applications that use a limited number of component images require an alignment technique that is robust to large exposure differences. We evaluate the suitability for HDR alignment of three exposure-robust techniques. We conclude that image alignment based on matching feature descriptors extracted from radiant power images from calibrated cameras yields the most accurate and robust solution. We demonstrate the use of this alignment technique in a high dynamic range video microscope that enables live specimen imaging with a greater level of detail than can be captured with a single camera.

  19. Analysis to feature-based video stabilization/registration techniques within application of traffic data collection

    NASA Astrophysics Data System (ADS)

    Sadat, Mojtaba T.; Viti, Francesco

    2015-02-01

    Machine vision is rapidly gaining popularity in the field of Intelligent Transportation Systems. In particular, advantages are foreseen from the exploitation of Aerial Vehicles (AV) in delivering a superior view of traffic phenomena. However, vibration on AVs makes it difficult to extract moving objects on the ground. To partly overcome this issue, image stabilization/registration procedures are adopted to correct and stitch multiple frames taken of the same scene but from different positions, angles, or sensors. In this study, we examine the impact of multiple feature-based techniques for stabilization and show that the SURF detector outperforms the others in terms of time efficiency and output similarity.

  20. A Phytochemical-Sensing Strategy Based on Mass Spectrometry Imaging and Metabolic Profiling for Understanding the Functionality of the Medicinal Herb Green Tea.

    PubMed

    Fujimura, Yoshinori; Miura, Daisuke; Tachibana, Hirofumi

    2017-09-27

    Low-molecular-weight phytochemicals have health benefits and reduce the risk of diseases, but the mechanisms underlying their activities have remained elusive because of the lack of a methodology that can easily visualize the exact behavior of such small molecules. Recently, we developed an in situ label-free imaging technique, called mass spectrometry imaging, for visualizing spatially-resolved biotransformations based on simultaneous mapping of the major bioactive green tea polyphenol and its phase II metabolites. In addition, we established a mass spectrometry-based metabolic profiling technique capable of evaluating the bioactivities of diverse green tea extracts, which contain multiple phytochemicals, by focusing on their compositional balances. This methodology allowed us to simultaneously evaluate the relative contributions of the multiple compounds present in a multicomponent system to its bioactivity. This review highlights small molecule-sensing techniques for visualizing the complex behaviors of herbal components and linking such information to an enhanced understanding of the functionalities of multicomponent medicinal herbs.

  1. GaInNAsSb/GaAs vertical cavity surface-emitting lasers (VCSELs): current challenges and techniques to realize multiple-wavelength laser arrays at 1.55 μm

    NASA Astrophysics Data System (ADS)

    Gobet, Mathilde; Bae, Hopil P.; Sarmiento, Tomas; Harris, James S.

    2008-02-01

    Multiple-wavelength laser arrays at 1.55 μm are key components of wavelength division multiplexing (WDM) systems for increased bandwidth. Vertical cavity surface-emitting lasers (VCSELs) grown on GaAs substrates outperform their InP counterparts in several respects. We summarize the current challenges in realizing continuous-wave (CW) GaInNAsSb VCSELs on GaAs with 1.55 μm emission wavelength and explain the work in progress toward realizing CW GaInNAsSb VCSELs. Finally, we detail two techniques for realizing GaInNAsSb multiple-wavelength VCSEL arrays at 1.55 μm. The first technique involves the incorporation of a photonic crystal into the upper mirror; simulation results for GaAs-based VCSEL arrays at 1.55 μm are shown. The second technique uses non-uniform molecular beam epitaxy (MBE), with which we have successfully demonstrated 1x6 resonant-cavity light-emitting diode arrays at 850 nm with a wavelength spacing of 0.4 nm between devices.

  2. Interleaved Practice with Multiple Representations: Analyses with Knowledge Tracing Based Techniques

    ERIC Educational Resources Information Center

    Rau, Martina A.; Pardos, Zachary A.

    2012-01-01

    The goal of this paper is to use Knowledge Tracing to augment the results obtained from an experiment that investigated the effects of practice schedules using an intelligent tutoring system for fractions. Specifically, this experiment compared different practice schedules of multiple representations of fractions: representations were presented to…

  3. Mars chronology: Assessing techniques for quantifying surficial processes

    USGS Publications Warehouse

    Doran, P.T.; Clifford, S.M.; Forman, S.L.; Nyquist, Larry; Papanastassiou, D.A.; Stewart, B.W.; Sturchio, N.C.; Swindle, T.D.; Cerling, T.; Kargel, J.; McDonald, G.; Nishiizumi, K.; Poreda, R.; Rice, J.W.; Tanaka, K.

    2004-01-01

    Currently, the absolute chronology of Martian rocks, deposits and events is based mainly on crater counting and remains highly imprecise with epoch boundary uncertainties in excess of 2 billion years. Answers to key questions concerning the comparative origin and evolution of Mars and Earth will not be forthcoming without a rigid Martian chronology, enabling the construction of a time scale comparable to Earth's. Priorities for exploration include calibration of the cratering rate, dating major volcanic and fluvial events and establishing chronology of the polar layered deposits. If extinct and/or extant life is discovered, the chronology of the biosphere will be of paramount importance. Many radiometric and cosmogenic techniques applicable on Earth and the Moon will apply to Mars after certain baselines (e.g. composition of the atmosphere, trace species, chemical and physical characteristics of Martian dust) are established. The high radiation regime may pose a problem for dosimetry-based techniques (e.g. luminescence). The unique isotopic composition of nitrogen in the Martian atmosphere may permit a Mars-specific chronometer for tracing the time-evolution of the atmosphere and of lithic phases with trapped atmospheric gases. Other Mars-specific chronometers include measurement of gas fluxes and accumulation of platinum group elements (PGE) in the regolith. Putting collected samples into geologic context is deemed essential, as is using multiple techniques on multiple samples. If in situ measurements are restricted to a single technique it must be shown to give consistent results on multiple samples, but in all cases, using two or more techniques (e.g. on the same lander) will reduce error. While there is no question that returned samples will yield the best ages, in situ techniques have the potential to be flown on multiple missions providing a larger data set and broader context in which to place the more accurate dates. © 2004 Elsevier B.V. All rights reserved.

  4. Statistical approach for selection of biologically informative genes.

    PubMed

    Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N

    2018-05-20

    Selection of informative genes from high-dimensional gene expression data has emerged as an important research area in genomics. Most gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been judged through post-selection classification accuracy computed with a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple-criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes that are more biologically relevant. The proposed technique is also quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that, under the multiple-criteria decision-making setup, the proposed technique is best for informative gene selection among the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide for selecting statistical techniques for identifying informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
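The Boot-MRMR composite score itself is defined in the paper; the sketch below shows only the generic max-relevance min-redundancy (mRMR) greedy step it builds on, using absolute Pearson correlation for both relevance and redundancy. These scoring choices and the toy data are illustrative assumptions, not the BootMRMR package's implementation.

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy max-relevance min-redundancy (difference criterion) selection.
    X: (n_samples, n_features) expression matrix, y: numeric class label."""
    n_feat = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_feat)])
    selected = [int(np.argmax(relevance))]            # most relevant gene first
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            # Redundancy = mean correlation with the genes already chosen.
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

# Toy data: gene 1 duplicates gene 0, so mRMR skips it in favor of gene 2.
y = np.array([-1.0, -1, -1, -1, 1, 1, 1, 1])
e = np.array([1.0, -1, 1, -1, 1, -1, 1, -1])          # orthogonal to y
X = np.column_stack([y + 0.5 * e,                     # gene 0: informative
                     y + 0.5 * e,                     # gene 1: exact duplicate
                     y - 0.5 * e,                     # gene 2: informative, less redundant
                     e])                              # gene 3: uninformative
result = mrmr_select(X, y, 2)
print(result)                                         # [0, 2]
```

A plain relevance ranking would pick the duplicate gene second; the redundancy penalty is what drives the selection toward gene 2.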

  5. A new cooperative MIMO scheme based on SM for energy-efficiency improvement in wireless sensor network.

    PubMed

    Peng, Yuyang; Choi, Jaeho

    2014-01-01

    Improving the energy efficiency of wireless sensor networks (WSN) has attracted considerable attention. The multiple-input multiple-output (MIMO) technique has been proven a good candidate for improving energy efficiency, but it may not be feasible in a WSN due to the size limitation of the sensor nodes. As a solution, the cooperative multiple-input multiple-output (CMIMO) technique overcomes this constraint and shows markedly good performance. In this paper, a new CMIMO scheme based on the spatial modulation (SM) technique, named CMIMO-SM, is proposed for energy-efficiency improvement. We first establish the system model of CMIMO-SM. Based on this model, the transmission approach is introduced graphically. In order to evaluate the performance of the proposed scheme, a detailed analysis of energy consumption per bit compared with conventional CMIMO is presented. Guided by this new scheme, we then extend CMIMO-SM to a multihop clustered WSN to further improve energy efficiency by finding an optimal hop length, with the traditional equidistant-hop scheme used as the baseline for comparison. Results from the simulations and numerical experiments indicate that the proposed scheme achieves significant savings in total energy consumption. Combining the proposed scheme with monitoring sensor nodes will provide good performance in arbitrarily deployed WSNs such as forest fire detection systems.

  6. CMOS imager for pointing and tracking applications

    NASA Technical Reports Server (NTRS)

    Sun, Chao (Inventor); Pain, Bedabrata (Inventor); Yang, Guang (Inventor); Heynssens, Julie B. (Inventor)

    2006-01-01

    Systems and techniques to realize pointing and tracking applications with CMOS imaging devices. In general, in one implementation, the technique includes: sampling multiple rows and multiple columns of an active pixel sensor array into a memory array (e.g., an on-chip memory array), and reading out the multiple rows and multiple columns sampled in the memory array to provide image data with reduced motion artifact. Various operation modes may be provided, including TDS, CDS, CQS, a tracking mode to read out multiple windows, and/or a mode employing a sample-first-read-later readout scheme. The tracking mode can take advantage of a diagonal switch array. The diagonal switch array, the active pixel sensor array and the memory array can be integrated onto a single imager chip with a controller. This imager device can be part of a larger imaging system for both space-based applications and terrestrial applications.
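Of the readout modes listed, correlated double sampling (CDS) is the simplest to illustrate: each pixel is sampled once just after reset and once after integration, and the difference cancels the pixel's reset offset. The arrays below are a toy numerical model of that subtraction, not the patented circuit.

```python
import numpy as np

rng = np.random.default_rng(42)

# Per-pixel reset (offset) level; it is identical in both samples of a pixel,
# which is what makes the two samples "correlated".
reset_level = rng.normal(loc=100.0, scale=5.0, size=(4, 4))
scene = np.arange(16.0).reshape(4, 4)      # true photo-signal we want to recover

sample_reset = reset_level                 # first sample: just after pixel reset
sample_signal = reset_level + scene        # second sample: after integration

cds = sample_signal - sample_reset         # correlated double sampling
print(np.allclose(cds, scene))             # True: the offset cancels exactly
```

In a real imager the second sample also carries temporal noise, so CDS removes the fixed-pattern/reset component rather than all noise; this sketch models only the offset cancellation.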

  7. Developing Scenarios: Linking Environmental Scanning and Strategic Planning.

    ERIC Educational Resources Information Center

    Whiteley, Meredith A.; And Others

    1990-01-01

    The multiple scenario analysis technique for organizational planning used by multinational corporations is adaptable for colleges and universities. Arizona State University launched a futures-based planning project using the Delphi technique and cross-impact analysis to produce three alternative scenarios (stable, turbulent, and chaotic) to expand…

  8. Is high–dose rate RapidArc-based radiosurgery dosimetrically advantageous for the treatment of intracranial tumors?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Bo; Yang, Yong, E-mail: yangy2@upmc.edu; Li, Xiang

    In linac-based stereotactic radiosurgery (SRS) and radiotherapy (SRT), circular cone(s) or conformal arc(s) are conventionally used to treat intracranial lesions. However, when the target is in close proximity to critical structures, it is frequently quite challenging to generate a quality plan using these techniques. In this study, we investigated the dosimetric characteristics of the high–dose rate RapidArc (RA) technique for radiosurgical treatment of intracranial lesions. A total of 10 intracranial SRS/SRT cases previously planned using dynamic conformal arc (DCA) or cone-based techniques were included in this study. For each case, 3 treatment plans were generated: (1) a DCA plan with multiple noncoplanar arcs, (2) a high–dose rate RA plan with arcs oriented the same as DCA (multiple-arc RA), and (3) a high–dose rate RA plan with a single coplanar arc (single-arc RA). All treatment plans were generated under the same prescription and similar critical structure dose limits. Plan quality was evaluated by comparing various dosimetric parameters such as target coverage, conformity index (CI), homogeneity index (HI), critical structure and normal brain tissue doses, as well as beam delivery time. With similar critical structure sparing, high–dose rate RA plans can achieve much better target coverage, dose conformity, and dose homogeneity than the DCA plans. The plan quality indices CI and HI for the DCA, multiple-arc RA, and single-arc RA techniques were measured as 1.67 ± 0.39, 1.32 ± 0.28, and 1.38 ± 0.30 and 1.24 ± 0.11, 1.10 ± 0.04, and 1.12 ± 0.07, respectively. Normal brain tissue dose (V12Gy) was found to be similar for the DCA and multiple-arc RA plans but much larger for the single-arc RA plans. Beam delivery time was similar for the DCA and multiple-arc RA plans but shorter for the single-arc RA plans. Multiple-arc RA SRS/SRT can provide better treatment plans than conventional DCA plans, especially for complex cases.
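The conformity and homogeneity indices quoted above can be computed directly from a dose grid and a target mask. A common pair of definitions, CI as prescription-isodose volume over target volume and HI as maximum target dose over prescription dose, is sketched below on a toy 1D dose profile; the study does not state which exact variants it used, so these definitions are an assumption.

```python
import numpy as np

def conformity_index(dose, target_mask, rx):
    """CI = (volume receiving >= prescription dose) / (target volume)."""
    return np.count_nonzero(dose >= rx) / np.count_nonzero(target_mask)

def homogeneity_index(dose, target_mask, rx):
    """HI = maximum dose inside the target / prescription dose."""
    return dose[target_mask].max() / rx

# Toy 1D "dose profile": 4 target voxels, 5 voxels at or above prescription.
dose = np.array([10.0, 18, 20, 22, 24, 21, 20, 19, 12])
target = np.zeros(dose.shape, dtype=bool)
target[2:6] = True                     # hypothetical target voxels
rx = 20.0
ci = conformity_index(dose, target, rx)
hi = homogeneity_index(dose, target, rx)
print(ci, hi)                          # 1.25 (5 voxels / 4) and 1.2 (24 Gy / 20 Gy)
```

With these definitions a CI above 1 means the prescription isodose spills beyond the target, which matches the reported values such as 1.67 for DCA.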

  9. Tag-to-Tag Interference Suppression Technique Based on Time Division for RFID.

    PubMed

    Khadka, Grishma; Hwang, Suk-Seung

    2017-01-01

    Radio-frequency identification (RFID) is a tracking technology that enables immediate automatic object identification and rapid data sharing for a wide variety of modern applications using radio waves for data transmission from a tag to a reader. RFID is already well established in technical areas, and many companies have developed corresponding standards and measurement techniques. In the construction industry, effective monitoring of materials and equipment is an important task, and RFID helps to improve monitoring and controlling capabilities, in addition to enabling automation for construction projects. However, on construction sites, there are many tagged objects and multiple RFID tags that may interfere with each other's communications. This reduces the reliability and efficiency of the RFID system. In this paper, we propose an anti-collision algorithm for communication between multiple tags and a reader. In order to suppress interference signals from multiple neighboring tags, the proposed algorithm employs the time-division (TD) technique, where tags in the interrogation zone are assigned a specific time slot so that at every instance in time, a reader communicates with tags using the specific time slot. We present representative computer simulation examples to illustrate the performance of the proposed anti-collision technique for multiple RFID tags.
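The slot-assignment idea can be illustrated with a toy reader loop: each tag is given a dedicated time slot and answers only in that slot, so no two replies overlap. The tag IDs and frame layout below are made up for illustration; this is not the paper's protocol implementation.

```python
# Toy time-division anti-collision round: one reply per slot, no collisions.
tags = ["tag-A", "tag-B", "tag-C", "tag-D"]

# The reader assigns each tag in the interrogation zone a unique slot index.
slot_of = {tag: i for i, tag in enumerate(tags)}

def run_frame(tags, slot_of):
    """Simulate one frame: collect the replies heard in each time slot."""
    heard = [[] for _ in range(len(tags))]
    for tag in tags:
        heard[slot_of[tag]].append(tag)     # a tag transmits only in its own slot
    return heard

frame = run_frame(tags, slot_of)
collisions = sum(1 for slot in frame if len(slot) > 1)
print(collisions)                           # 0 -- every slot carries one reply
```

Contrast this with unslotted access, where all four tags answering at once would yield a single collided slot; the cost of TD is that frame length grows with the number of tags.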

  10. Decomposition-Based Decision Making for Aerospace Vehicle Design

    NASA Technical Reports Server (NTRS)

    Borer, Nicholas K.; Mavris, Dimitri N.

    2005-01-01

    Most practical engineering systems design problems have multiple and conflicting objectives. Furthermore, the satisfactory attainment level for each objective (requirement) is likely uncertain early in the design process. Systems with long design cycle times will exhibit more of this uncertainty throughout the design process. This is further complicated if the system is expected to perform for a relatively long period of time, as it will then need to grow as new requirements are identified and new technologies are introduced. These points identify a need for a systems design technique that enables decision making amongst multiple objectives in the presence of uncertainty. Traditional design techniques deal with a single objective or a small number of objectives that are often aggregates of the overarching goals sought through the generation of a new system. Other requirements, although uncertain, are viewed as static constraints on this single- or multiple-objective optimization problem. With either of these formulations, enabling tradeoffs between the requirements, objectives, or combinations thereof is a slow, serial process that becomes increasingly complex as more criteria are added. This research proposal outlines a technique that attempts to address these and other idiosyncrasies associated with modern aerospace systems design. The proposed formulation first recasts systems design into a multiple-criteria decision-making problem. The multiple objectives are then decomposed to discover the critical characteristics of the objective space. Tradeoffs between the objectives are considered amongst these critical characteristics by comparison to a probabilistic ideal tradeoff solution. The proposed formulation represents a radical departure from traditional methods. A pitfall of this technique lies in the validation of the solution: in a multi-objective sense, how can a decision maker justify a choice between non-dominated alternatives? A series of examples helps the reader observe how this technique can be applied to aerospace systems design and compares the results of this so-called Decomposition-Based Decision Making to more traditional design approaches.

  11. Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.

    PubMed

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P; McDonald-Maier, Klaus D

    2015-05-08

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences.
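MEMD itself is involved; as a much simpler point of comparison, the pixel-level selection step common to multi-focus fusion can be sketched with a local-activity rule: at each pixel, keep the input whose neighborhood shows more variation. The 3x3 variance measure below is an illustrative stand-in for a focus measure, not the MEMD algorithm.

```python
import numpy as np

def local_variance(img, r=1):
    """Variance over a (2r+1) x (2r+1) window around each pixel (edge-padded)."""
    p = np.pad(img.astype(float), r, mode="edge")
    h, w = img.shape
    shifts = [p[dy:dy + h, dx:dx + w]
              for dy in range(2 * r + 1) for dx in range(2 * r + 1)]
    return np.stack(shifts).var(axis=0)

def fuse(a, b):
    """Keep, per pixel, the input whose neighborhood is more 'active' (in focus)."""
    return np.where(local_variance(a) >= local_variance(b), a, b)

# Toy multi-focus pair: each input is sharp on one half and flat on the other.
texture = np.indices((16, 16)).sum(axis=0) % 2    # checkerboard "scene"
cols = np.arange(16)
a = np.where(cols < 8, texture, 0)                # sharp left half only
b = np.where(cols >= 8, texture, 0)               # sharp right half only
fused = fuse(a, b)
# Away from the seam, `fused` recovers the sharp half of each input.
```

MEMD replaces the fixed window with data-adaptive, mutually aligned frequency scales across all inputs, which is what avoids the mode-mixing problems described above.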

  13. From a meso- to micro-scale connectome: array tomography and mGRASP

    PubMed Central

    Rah, Jong-Cheol; Feng, Linqing; Druckmann, Shaul; Lee, Hojin; Kim, Jinhyun

    2015-01-01

    Mapping mammalian synaptic connectivity has long been an important goal of neuroscience because knowing how neurons and brain areas are connected underpins an understanding of brain function. Meeting this goal requires advanced techniques with single-synapse resolution and large-scale capacity, especially at the multiple scales tethering the meso- and micro-scale connectome. Among several advanced light microscopy (LM)-based connectome technologies, Array Tomography (AT) and mammalian GFP-Reconstitution Across Synaptic Partners (mGRASP) can provide relatively high-throughput mapping of synaptic connectivity at multiple scales. AT- and mGRASP-assisted circuit mapping (ATing and mGRASPing), combined with techniques such as retrograde viruses, brain-clearing techniques, and activity indicators, will help unlock the secrets of complex neural circuits. Here, we discuss these useful new tools for mapping brain circuits at multiple scales, some functional implications of spatial synaptic distribution, and future challenges and directions of these endeavors. PMID:26089781

  14. Performance Analysis of Diversity-Controlled Multi-User Superposition Transmission for 5G Wireless Networks

    PubMed Central

    Yeom, Jeong Seon; Jung, Bang Chul; Jin, Hu

    2018-01-01

    In this paper, we propose a novel low-complexity multi-user superposition transmission (MUST) technique for 5G downlink networks, which allows multiple cell-edge users to be multiplexed with a single cell-center user. We call the proposed technique diversity-controlled MUST technique since the cell-center user enjoys the frequency diversity effect via signal repetition over multiple orthogonal frequency division multiplexing (OFDM) sub-carriers. We assume that a base station is equipped with a single antenna but users are equipped with multiple antennas. In addition, we assume that the quadrature phase shift keying (QPSK) modulation is used for users. We mathematically analyze the bit error rate (BER) of both cell-edge users and cell-center users, which is the first theoretical result in the literature to the best of our knowledge. The mathematical analysis is validated through extensive link-level simulations. PMID:29439413

  15. Performance Analysis of Diversity-Controlled Multi-User Superposition Transmission for 5G Wireless Networks.

    PubMed

    Yeom, Jeong Seon; Chu, Eunmi; Jung, Bang Chul; Jin, Hu

    2018-02-10

    In this paper, we propose a novel low-complexity multi-user superposition transmission (MUST) technique for 5G downlink networks, which allows multiple cell-edge users to be multiplexed with a single cell-center user. We call the proposed technique diversity-controlled MUST technique since the cell-center user enjoys the frequency diversity effect via signal repetition over multiple orthogonal frequency division multiplexing (OFDM) sub-carriers. We assume that a base station is equipped with a single antenna but users are equipped with multiple antennas. In addition, we assume that the quadrature phase shift keying (QPSK) modulation is used for users. We mathematically analyze the bit error rate (BER) of both cell-edge users and cell-center users, which is the first theoretical result in the literature to the best of our knowledge. The mathematical analysis is validated through extensive link-level simulations.
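The closed-form BER expressions derived in the paper are not reproduced here; for intuition, the baseline single-user QPSK-over-AWGN bit error rate, BER = Q(sqrt(2 Eb/N0)), is easy to verify by Monte Carlo. The simulation below is that generic QPSK check, not the MUST system model, and all parameter values are illustrative.

```python
import math
import numpy as np

def q_func(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

rng = np.random.default_rng(0)
ebn0_db = 4.0
ebn0 = 10.0 ** (ebn0_db / 10.0)

n_sym = 100_000
bits = rng.integers(0, 2, size=(n_sym, 2))            # 2 bits per Gray-mapped symbol
symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / math.sqrt(2)  # Es = 1

n0 = 0.5 / ebn0                       # Eb = Es/2 = 0.5, so N0 = Eb / (Eb/N0)
noise = math.sqrt(n0 / 2) * (rng.standard_normal(n_sym)
                             + 1j * rng.standard_normal(n_sym))
rx = symbols + noise

# Per-rail sign decisions recover the two bits of each symbol independently.
dec = np.column_stack([(rx.real > 0).astype(int), (rx.imag > 0).astype(int)])
ber = float(np.mean(dec != bits))
theory = q_func(math.sqrt(2 * ebn0))
print(abs(ber - theory) < 2e-3)       # True: simulation matches Q(sqrt(2 Eb/N0))
```

The paper's analysis layers superposition coding, multiple receive antennas, and frequency diversity on top of this baseline, so its BER expressions reduce to this form only in the degenerate single-user case.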

  16. Mitigation of tropospheric InSAR phase artifacts through differential multisquint processing

    NASA Technical Reports Server (NTRS)

    Chen, Curtis W.

    2004-01-01

    We propose a technique for mitigating tropospheric phase errors in repeat-pass interferometric synthetic aperture radar (InSAR). The mitigation technique is based upon the acquisition of multisquint InSAR data. On each satellite pass over a target area, the radar instrument will acquire images from multiple squint (azimuth) angles, from which multiple interferograms can be formed. The diversity of viewing angles associated with the multisquint acquisition can be used to solve for two components of the 3-D surface displacement vector as well as for the differential tropospheric phase. We describe a model for the performance of the multisquint technique, and we present an assessment of the performance expected.

  17. The Research of Multiple Attenuation Based on Feedback Iteration and Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Xu, X.; Tong, S.; Wang, L.

    2017-12-01

    How to suppress multiples is a difficult problem in seismic data processing. The traditional technology for multiple attenuation is based on the principle of minimum output energy of the seismic signal. This criterion relies on second-order statistics and cannot achieve multiple attenuation when the primaries and multiples are non-orthogonal. To solve this problem, we combine a feedback iteration method based on the wave equation with an improved independent component analysis (ICA) based on higher-order statistics to suppress the multiples. We first use the iterative feedback method to predict the free-surface multiples of each order. Then, to match the predicted multiples to the true multiples in amplitude and phase, we design an expanded pseudo-multichannel matching filtering method that yields a more accurate matching result. Finally, we apply an improved fast ICA algorithm, based on the maximum non-Gaussianity criterion of the output signal, to the matched multiples and obtain a better separation of the primaries and the multiples. The advantage of our method is that no a priori information is needed to predict the multiples, and a better separation result can be achieved. The method has been applied to several synthetic datasets generated by a finite-difference modeling technique and to the Sigsbee2B model multiple data; the primaries and multiples are non-orthogonal in these models. The experiments show that after three to four iterations we obtain good multiple predictions. Using our matching method and fast ICA adaptive multiple subtraction, we can not only effectively preserve the energy of the effective waves in the seismic records, but also effectively suppress the free-surface multiples, especially those related to the middle and deep areas.
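The expanded pseudo-multichannel matching filter is not specified in detail in this abstract; the core single-channel idea, solving for a least-squares filter that reshapes the predicted multiples to best fit the recorded data and subtracting the result, can be sketched as follows. The filter length and the spike-train signals are illustrative assumptions.

```python
import numpy as np

def matching_subtract(data, predicted, nf=7):
    """Least-squares matching-filter subtraction: find the filter f minimizing
    ||data - predicted * f||_2 (* = convolution) and return the residual,
    which estimates the primaries."""
    n = len(data)
    half = nf // 2
    # Columns of M are time-shifted copies of the predicted multiples.
    M = np.zeros((n, nf))
    for j in range(nf):
        s = j - half
        if s >= 0:
            M[s:, j] = predicted[:n - s]
        else:
            M[:n + s, j] = predicted[-s:]
    f, *_ = np.linalg.lstsq(M, data, rcond=None)
    return data - M @ f

# Synthetic record containing only mis-scaled, time-shifted multiples:
pred = np.zeros(64)
pred[[10, 30, 50]] = 1.0                             # predicted multiples
true_mult = 0.6 * np.roll(pred, 2) - 0.2 * pred      # amplitude/phase distorted
residual = matching_subtract(true_mult, pred)
print(float(np.abs(residual).max()) < 1e-8)          # True: multiples removed
```

Because this least-squares criterion minimizes output energy, it is exactly the second-order approach whose limitation with non-orthogonal primaries motivates the ICA-based subtraction described above.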

  18. Nonparametric estimation of the heterogeneity of a random medium using compound Poisson process modeling of wave multiple scattering.

    PubMed

    Le Bihan, Nicolas; Margerin, Ludovic

    2009-07-01

    In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.

  19. Using focused plenoptic cameras for rich image capture.

    PubMed

    Georgiev, T; Lumsdaine, A; Chunev, G

    2011-01-01

    This approach uses a focused plenoptic camera to capture the plenoptic function's rich "non 3D" structure. It employs two techniques. The first simultaneously captures multiple exposures (or other aspects) based on a microlens array having an interleaved set of different filters. The second places multiple filters at the main lens aperture.

  20. Identification of Multiple Nonreturner Profiles to Inform the Development of Targeted College Retention Interventions

    ERIC Educational Resources Information Center

    Mattern, Krista D.; Marini, Jessica P.; Shaw, Emily J.

    2015-01-01

    Throughout the college retention literature, there is a recurring theme that students leave college for a variety of reasons making retention a difficult phenomenon to model. In the current study, cluster analysis techniques were employed to investigate whether multiple empirically based profiles of nonreturning students existed to more fully…

  1. System for Training Aviation Regulations (STAR): Using Multiple Vantage Points To Learn Complex Information through Scenario-Based Instruction and Multimedia Techniques.

    ERIC Educational Resources Information Center

    Chandler, Terrell N.

    1996-01-01

    The System for Training of Aviation Regulations (STAR) provides comprehensive training in understanding and applying Federal aviation regulations. STAR gives multiple vantage points with multimedia presentations and storytelling within four categories of learning environments: overviews, scenarios, challenges, and resources. Discusses the…

  2. Terminating Sequential Delphi Survey Data Collection

    ERIC Educational Resources Information Center

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through well-designed and systematic multiple sequential rounds of survey administration. Each of the multiple rounds of the Delphi survey…

  3. Time-Sharing-Based Synchronization and Performance Evaluation of Color-Independent Visual-MIMO Communication.

    PubMed

    Kwon, Tae-Ho; Kim, Jai-Eun; Kim, Ki-Doo

    2018-05-14

    In the field of communication, synchronization is always an important issue. The communication between a light-emitting diode (LED) array (LEA) and a camera is known as visual multiple-input multiple-output (MIMO), for which the data transmitter and receiver must be synchronized for seamless communication. In visual-MIMO, LEDs generally have a faster data rate than the camera. Hence, we propose an effective time-sharing-based synchronization technique whose color-independent characteristics provide the key to overcoming this synchronization problem in visual-MIMO communication. We also evaluated the performance of our synchronization technique by varying the distance between the LEA and the camera. A graphical analysis is also presented to compare the symbol error rate (SER) at different distances.

  4. Distributed reservation-based code division multiple access

    NASA Astrophysics Data System (ADS)

    Wieselthier, J. E.; Ephremides, A.

    1984-11-01

    The use of spread spectrum signaling, motivated primarily by its antijamming capabilities in military applications, leads naturally to the use of Code Division Multiple Access (CDMA) techniques that permit the successful simultaneous transmission by a number of users over a wideband channel. In this paper we address some of the major issues that are associated with the design of multiple access protocols for spread spectrum networks. We then propose, analyze, and evaluate a distributed reservation-based multiple access protocol that does in fact exploit CDMA properties. Especially significant is the fact that no acknowledgment or feedback information from the destination is required (thus facilitating communication with a radio-silent mode), nor is any form of coordination among the users necessary.
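
    As a toy illustration of the CDMA property this abstract relies on (successful simultaneous transmission by several users over one wideband channel), the sketch below spreads two users' bits with orthogonal codes and recovers both streams from the superposed signal. The codes, bit patterns, and noiseless channel are illustrative assumptions, not details of the protocol described above.

```python
import numpy as np

# Orthogonal spreading codes (two rows of a 4x4 Hadamard matrix).
codes = np.array([[1, 1, 1, 1],
                  [1, -1, 1, -1]])

bits_user1 = np.array([1, -1, 1])   # data as +/-1 symbols
bits_user2 = np.array([-1, -1, 1])

# Spread: each bit becomes bit * code; both users transmit at the same time.
tx1 = np.concatenate([b * codes[0] for b in bits_user1])
tx2 = np.concatenate([b * codes[1] for b in bits_user2])
channel = tx1 + tx2  # superposition on the shared wideband channel

def despread(signal, code):
    """Correlate each chip block with the user's code and take the sign."""
    blocks = signal.reshape(-1, len(code))
    return np.sign(blocks @ code)

print(despread(channel, codes[0]))  # recovers user 1's bits
print(despread(channel, codes[1]))  # recovers user 2's bits
```

    Because the codes are orthogonal, each user's correlator nulls out the other user's contribution, which is the mechanism that lets transmissions coexist without coordination.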

  5. Single-photon sensitive fast ebCMOS camera system for multiple-target tracking of single fluorophores: application to nano-biophotonics

    NASA Astrophysics Data System (ADS)

    Cajgfinger, Thomas; Chabanat, Eric; Dominjon, Agnes; Doan, Quang T.; Guerin, Cyrille; Houles, Julien; Barbier, Remi

    2011-03-01

    Nano-biophotonics applications will benefit from new fluorescent microscopy methods based essentially on super-resolution techniques (beyond the diffraction limit) on large biological structures (membranes) with fast frame rates (1000 Hz). This trend pushes photon detectors toward the single-photon counting regime and camera acquisition systems toward real-time dynamic multiple-target tracking. The LUSIPHER prototype presented in this paper takes a different approach from Electron Multiplied CCD (EMCCD) technology and aims to answer the stringent demands of the new nano-biophotonics imaging techniques. The electron-bombarded CMOS (ebCMOS) device has the potential to respond to this challenge, thanks to the linear gain of the accelerating high voltage of the photo-cathode, the possible ultra-fast frame rate of CMOS sensors, and single-photon sensitivity. We produced a camera system based on a 640 kPixel ebCMOS with its acquisition system. The proof of concept of single-photon-based tracking of multiple single emitters is the main result of this paper.

  6. Statistical technique for analysing functional connectivity of multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

    A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity, an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of the postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (it is a binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive to estimate weak influences; it supports the simultaneous analysis of multiple influences; and it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by a neural network model of leaky integrate-and-fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. Rapid Multi-Damage Identification for Health Monitoring of Laminated Composites Using Piezoelectric Wafer Sensor Arrays

    PubMed Central

    Si, Liang; Wang, Qian

    2016-01-01

    Through the use of the wave reflection from any damage in a structure, a Hilbert spectral analysis-based rapid multi-damage identification (HSA-RMDI) technique with piezoelectric wafer sensor arrays (PWSA) is developed to monitor and identify the presence, location, and severity of damage in carbon fiber composite structures. The capability of the rapid multi-damage identification technique to extract and estimate hidden significant information from the collected data and to provide a high-resolution energy-time spectrum can be employed to successfully interpret the interactions of Lamb waves with single or multiple damage sites. Nevertheless, to accomplish the precise positioning and effective quantification of multiple damage sites in a composite structure, two functional metrics from the RMDI technique are proposed and used in damage identification: the energy density metric and the energy time-phase shift metric. In the designed damage experimental tests, damage invisible to the naked eye, especially delaminations, was detected in the leftward propagating waves as well as in the selected sensor responses, where the time-phase shift spectra could locate the multiple damage sites whereas the energy density spectra were used to quantify them. The increasing damage was shown to follow a linear trend calculated by the RMDI technique. All damage cases considered fully demonstrated the potential of the developed RMDI technique as an effective online damage inspection and assessment tool. PMID:27153070

  8. Errorless-based techniques can improve route finding in early Alzheimer's disease: a case study.

    PubMed

    Provencher, Véronique; Bier, Nathalie; Audet, Thérèse; Gagnon, Lise

    2008-01-01

    Topographical disorientation is a common and early manifestation of dementia of Alzheimer type, which threatens independence in activities of daily living. Errorless-based techniques appear to be effective in helping patients with amnesia to learn routes, but little is known about their effectiveness in early dementia of Alzheimer type. A 77-year-old woman with dementia of Alzheimer type had difficulty in finding her way around her seniors residence, which reduced her social activities. This study used an ABA design (A is the baseline and B is the intervention) with multiple baselines across routes for going to the rosary (target), laundry, and game rooms (controls). The errorless-based technique intervention was applied to 2 of the 3 routes. Analyses showed significant improvement only for the routes learned with errorless-based techniques. Following the study, the participant increased her topographical knowledge of her surroundings. Route learning interventions based on errorless-based techniques appear to be a promising approach for improving the independence in early dementia of Alzheimer type.

  9. Context-based automated defect classification system using multiple morphological masks

    DOEpatents

    Gleason, Shaun S.; Hunt, Martin A.; Sari-Sarraf, Hamed

    2002-01-01

    Automatic detection of defects during the fabrication of semiconductor wafers is largely automated, but the classification of those defects is still performed manually by technicians. This invention includes novel digital image analysis techniques that generate unique feature vector descriptions of semiconductor defects as well as classifiers that use these descriptions to automatically categorize the defects into one of a set of pre-defined classes. Feature extraction techniques based on multiple-focus images, multiple-defect mask images, and segmented semiconductor wafer images are used to create unique feature-based descriptions of the semiconductor defects. These feature-based defect descriptions are subsequently classified by a defect classifier into categories that depend on defect characteristics and defect contextual information, that is, the semiconductor process layer(s) with which the defect comes in contact. At the heart of the system is a knowledge database that stores and distributes historical semiconductor wafer and defect data to guide the feature extraction and classification processes. In summary, this invention takes as its input a set of images containing semiconductor defect information, and generates as its output a classification for the defect that describes not only the defect itself, but also the location of that defect with respect to the semiconductor process layers.

  10. Three Techniques to Help Students Teach Themselves Concepts in Environmental Geochemistry.

    ERIC Educational Resources Information Center

    Brown, I. Foster

    1984-01-01

    Describes techniques in which students learn to: (1) create elemental "fairy tales" based on the geochemical behavior of elements and on imagination to integrate concepts; (2) to visually eliminate problems of bias; and (3) to utilize multiple working hypotheses as a basis for testing concepts of classification and distinguishing…

  11. Deep convolutional neural network based antenna selection in multiple-input multiple-output system

    NASA Astrophysics Data System (ADS)

    Cai, Jiaxin; Li, Yan; Hu, Ying

    2018-03-01

    Antenna selection for wireless communication systems has attracted increasing attention due to the challenge of keeping a balance between communication performance and computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, the label of the attenuation-coefficient channel matrix is generated by minimizing the key performance indicator of the training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of the attenuation coefficients is learned on the training antenna systems. Finally, we use the adopted deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation results demonstrate that our method can achieve better performance than the state-of-the-art baselines for data-driven wireless antenna selection.
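
    The label-generation step described above can be sketched for a toy MIMO system: exhaustively score every transmit-antenna subset and record the best one as the class label a network would be trained to predict. The system sizes, SNR, and the use of channel capacity as the performance indicator are illustrative assumptions, not details from the paper.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_tx, n_rx, k = 4, 2, 2   # choose k of n_tx transmit antennas
snr = 10.0

# Rayleigh-fading channel matrix (complex Gaussian entries).
H = (rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))) / np.sqrt(2)

def capacity(Hs):
    """Shannon capacity of sub-channel Hs with equal power allocation."""
    m = Hs.shape[1]
    G = np.eye(Hs.shape[0]) + (snr / m) * Hs @ Hs.conj().T
    return float(np.log2(np.linalg.det(G).real))

subsets = list(itertools.combinations(range(n_tx), k))
scores = [capacity(H[:, list(s)]) for s in subsets]
label = int(np.argmax(scores))  # class label for this channel realization
print(subsets[label], scores[label])
```

    A classifier trained on many (H, label) pairs then replaces the exhaustive search at run time, which is where the computational saving comes from.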

  12. Quantitative NDA measurements of advanced reprocessing product materials containing uranium, neptunium, plutonium, and americium

    NASA Astrophysics Data System (ADS)

    Goddard, Braden

    The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first-principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron-induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first-principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still faces several challenges, the largest being the need for high-precision active and passive measurements to produce results with acceptably small uncertainties.

  13. Performance Comparison of Superresolution Array Processing Algorithms. Revised

    DTIC Science & Technology

    1998-06-15

    plane waves is finite is the MUSIC algorithm [16]. MUSIC, which denotes Multiple Signal Classification, is an extension of the method of Pisarenko [18]... MUSIC is but one member of a class of methods based upon the decomposition of covariance data into eigenvectors and eigenvalues. Such techniques... relative to the classical methods; however, results for MUSIC are included in this report. All of the techniques reviewed have application to...

  14. Using AVIRIS data and multiple-masking techniques to map urban forest trees species

    Treesearch

    Q. Xiao; S.L. Ustin; E.G. McPherson

    2004-01-01

    Tree type and species information are critical parameters for urban forest management, benefit cost analysis and urban planning. However, traditionally, these parameters have been derived based on limited field samples in urban forest management practice. In this study we used high-resolution Airborne Visible Infrared Imaging Spectrometer (AVIRIS) data and multiple-...

  15. DServO: A Peer-to-Peer-based Approach to Biomedical Ontology Repositories.

    PubMed

    Mambone, Zakaria; Savadogo, Mahamadi; Some, Borlli Michel Jonas; Diallo, Gayo

    2015-01-01

    We present in this poster an extension of the ServO ontology server system, which adopts a decentralized peer-to-peer approach for managing multiple heterogeneous knowledge organization systems. It relies on the JXTA protocol coupled with information retrieval techniques to provide a decentralized infrastructure for managing multiple instances of ontology repositories.

  16. Self-Tuning of Design Variables for Generalized Predictive Control

    NASA Technical Reports Server (NTRS)

    Lin, Chaung; Juang, Jer-Nan

    2000-01-01

    Three techniques are introduced to determine the order and control weighting for the design of a generalized predictive controller. These techniques are based on the application of fuzzy logic, genetic algorithms, and simulated annealing to conduct an optimal search on specific performance indexes or objective functions. Fuzzy logic is found to be feasible for real-time and on-line implementation due to its smooth and quick convergence. On the other hand, genetic algorithms and simulated annealing are applicable for initial estimation of the model order and control weighting, and for final fine-tuning within a small region of the solution space. Several numerical simulations for a multiple-input and multiple-output system are given to illustrate the techniques developed in this paper.
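
    One of the three search strategies named above, simulated annealing, can be sketched as a search over a model order n and a control weighting r. The objective function J below is an invented stand-in for the controller's true performance index, and all tuning constants (temperature, cooling rate, bounds) are illustrative assumptions.

```python
import math
import random

def J(n, r):
    # Hypothetical performance index: penalizes deviation from an assumed
    # best order (n = 4) and best weighting (r = 0.5).
    return (n - 4) ** 2 + 10 * (r - 0.5) ** 2

random.seed(1)
n, r = 2, 2.0                      # initial guess
best = (n, r, J(n, r))
temp = 1.0
for step in range(2000):
    # Propose a neighbour: jitter the order by +/-1, the weighting slightly.
    n2 = min(10, max(1, n + random.choice([-1, 0, 1])))
    r2 = min(5.0, max(0.01, r + random.uniform(-0.2, 0.2)))
    delta = J(n2, r2) - J(n, r)
    # Accept improvements always; accept worse moves with Boltzmann probability.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        n, r = n2, r2
        if J(n, r) < best[2]:
            best = (n, r, J(n, r))
    temp *= 0.995                  # geometric cooling schedule

print(best)
```

    The occasional acceptance of worse moves early on is what lets the search escape local minima before the temperature drops and it settles into fine-tuning.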

  17. Assessing the use of multiple sources in student essays.

    PubMed

    Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly

    2012-09-01

    The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
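
    The LSA comparison described above can be sketched on a toy corpus: build a term-document matrix over source sentences, reduce it with SVD, and score a student sentence by cosine similarity in the reduced space. The corpus and sentences are invented for illustration, and this ignores the weighting schemes a real LSA system would apply.

```python
import numpy as np

source_sentences = [
    "the river flooded the valley after heavy rain",
    "dams upstream failed to hold the water",
    "residents evacuated before the flood arrived",
]
student_sentence = "heavy rain made the river flood the valley"

vocab = sorted({w for s in source_sentences for w in s.split()})
index = {w: i for i, w in enumerate(vocab)}

def vectorize(sentence):
    """Raw term-count vector over the source vocabulary."""
    v = np.zeros(len(vocab))
    for w in sentence.split():
        if w in index:
            v[index[w]] += 1.0
    return v

A = np.column_stack([vectorize(s) for s in source_sentences])  # term-doc matrix
U, S, Vt = np.linalg.svd(A, full_matrices=False)
Uk = U[:, :2]  # rank-2 latent semantic space

def lsa_similarity(sentence, doc_idx):
    """Cosine similarity between sentence and source doc in the latent space."""
    a = Uk.T @ vectorize(sentence)
    b = Uk.T @ A[:, doc_idx]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = [lsa_similarity(student_sentence, i) for i in range(3)]
print(scores)  # the student sentence should match source sentence 0 best
```

    The dimensionality reduction is what lets LSA credit near-paraphrases ("flood" vs. "flooded") that exact word matching would miss.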

  18. Noninvasive spectral imaging of skin chromophores based on multiple regression analysis aided by Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Wiswadarma, Aditya; Hase, Yota; Tanaka, Noriyuki; Maeda, Takaaki; Niizeki, Kyuichi; Aizu, Yoshihisa

    2011-08-01

    In order to visualize melanin and blood concentrations and oxygen saturation in human skin tissue, a simple imaging technique based on multispectral diffuse reflectance images acquired at six wavelengths (500, 520, 540, 560, 580, and 600 nm) was developed. The technique utilizes multiple regression analysis aided by Monte Carlo simulation of diffuse reflectance spectra. Using the absorbance spectrum as a response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as predictor variables, multiple regression analysis provides regression coefficients. Concentrations of melanin and total blood are then determined from the regression coefficients using conversion vectors that are deduced numerically in advance, while oxygen saturation is obtained directly from the regression coefficients. Experiments with a tissue-like agar gel phantom validated the method. In vivo experiments with the skin of the human hand during upper limb occlusion and of the inner forearm exposed to UV irradiation demonstrated the ability of the method to evaluate physiological reactions of human skin tissue.
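
    The regression step described above amounts to expressing a measured absorbance spectrum as a linear combination of chromophore extinction spectra and recovering the weights by least squares. The sketch below uses synthetic stand-in spectra, not real extinction data, and omits the Monte Carlo correction the paper applies.

```python
import numpy as np

wavelengths = np.array([500, 520, 540, 560, 580, 600], dtype=float)

# Invented "extinction coefficient" spectra for melanin, HbO2, and Hb.
melanin = np.exp(-(wavelengths - 500) / 80.0)
hb_oxy = np.array([0.8, 1.1, 1.4, 1.2, 1.5, 0.3])
hb_deoxy = np.array([0.9, 1.0, 1.2, 1.5, 1.1, 0.6])
X = np.column_stack([melanin, hb_oxy, hb_deoxy, np.ones_like(wavelengths)])

# Synthesize an absorbance spectrum with known concentrations (weights).
true_w = np.array([0.4, 0.3, 0.1, 0.05])
absorbance = X @ true_w

# Multiple regression: least-squares fit of absorbance on the predictors.
coef, *_ = np.linalg.lstsq(X, absorbance, rcond=None)
print(coef)  # regression coefficients ~ [0.4, 0.3, 0.1, 0.05]
```

    Oxygen saturation would then follow from the ratio of the oxygenated-hemoglobin coefficient to the total hemoglobin coefficients.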

  19. Vision-based system identification technique for building structures using a motion capture system

    NASA Astrophysics Data System (ADS)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures by using a motion capture system (MCS). The MCS, with outstanding capabilities for dynamic response measurement, can provide gage-free measurements of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequency, mode shape, and damping ratio) of building structures are extracted from the dynamic displacement responses measured by the MCS, after converting the displacements to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI performed by applying the MCS-measured displacements directly to FDD yielded results identical to those of the conventional SI method.
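
    Two of the steps mentioned above can be sketched on a synthetic record: numerically differentiate a displacement signal twice to obtain acceleration, then pick the natural frequency from the dominant spectral peak. The signal parameters are invented, and a full FDD would instead SVD the cross-spectral matrix of several measurement channels at each frequency.

```python
import numpy as np

fs = 200.0                       # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
f_n = 3.0                        # assumed natural frequency, Hz
disp = np.exp(-0.2 * t) * np.sin(2 * np.pi * f_n * t)  # decaying free vibration

# Displacement -> acceleration via a numeric second derivative.
accel = np.gradient(np.gradient(disp, 1 / fs), 1 / fs)

# Peak picking on the acceleration spectrum recovers the natural frequency.
spec = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(len(accel), 1 / fs)
peak = freqs[np.argmax(spec)]
print(peak)
```

    Double differentiation amplifies high-frequency noise, which is why the paper's conversion step matters when working with real marker trajectories.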

  20. Nasal hydropulsion: a novel tumor biopsy technique.

    PubMed

    Ashbaugh, Elizabeth A; McKiernan, Brendan C; Miller, Carrie J; Powers, Barbara

    2011-01-01

    Intranasal tumors of dogs and cats pose a diagnostic and therapeutic challenge for small animal practitioners. Multiple nasal biopsy techniques have been described in the past. This report describes a simplified flushing technique to biopsy and debulk nasal tumors, which often also results in immediate clinical relief for the patient. Based on the results of this retrospective study, the authors recommend high-pressure saline hydropulsion as a minimally invasive diagnostic, and potentially therapeutic, technique for nasal tumors in dogs and cats.

  1. Improving Multiple Fault Diagnosability using Possible Conflicts

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew J.; Bregon, Anibal; Biswas, Gautam; Koutsoukos, Xenofon; Pulido, Belarmino

    2012-01-01

    Multiple fault diagnosis is a difficult problem for dynamic systems. Due to fault masking, compensation, and relative time of fault occurrence, multiple faults can manifest in many different ways as observable fault signature sequences. This decreases diagnosability of multiple faults, and therefore leads to a loss in effectiveness of the fault isolation step. We develop a qualitative, event-based, multiple fault isolation framework, and derive several notions of multiple fault diagnosability. We show that using Possible Conflicts, a model decomposition technique that decouples faults from residuals, we can significantly improve the diagnosability of multiple faults compared to an approach using a single global model. We demonstrate these concepts and provide results using a multi-tank system as a case study.

  2. Factors influencing behavior guidance: a survey of practicing pediatric dentists.

    PubMed

    Juntgen, Laura M; Sanders, Brian J; Walker, Laquia A; Jones, James E; Weddell, James A; Tomlin, Angela M; Eckert, George; Maupome, Gerardo

    2013-01-01

    The purpose of this study was to identify factors influencing behavior guidance technique utilization among practicing pediatric dentists and explore potential barriers to the incorporation of previously unused techniques. The data for this study were obtained from a web-based survey containing 15 multiple choice questions concerning the practitioners' past, current, and anticipated future behavior guidance technique utilization. Most respondents received hands-on training in 10 of the American Academy of Pediatric Dentistry behavior guidance techniques. The type of training was associated with the practitioners' level of comfort using a given technique upon graduation and with the current frequency of technique utilization. Residency type impacted hands-on behavior guidance training, with 39 percent of respondents reporting no intravenous sedation training. The type of practice was associated with the frequency of behavior guidance technique utilization, as was graduation decade. Currently practicing dentists cited legal concerns, parental acceptance to change, and limited resources as perceived obstacles in the incorporation of new techniques. Behavior guidance technique selection and utilization among practicing pediatric dentists was influenced by multiple factors, including advanced education training, residency type, graduation decade, and practice type. Obstacles to the incorporation of previously unused techniques appear to be multifactorial.

  3. Ant colony optimisation-direct cover: a hybrid ant colony direct cover technique for multi-level synthesis of multiple-valued logic functions

    NASA Astrophysics Data System (ADS)

    Abd-El-Barr, Mostafa

    2010-12-01

    The use of non-binary (multiple-valued) logic in the synthesis of digital systems can lead to savings in chip area. Advances in very large scale integration (VLSI) technology have enabled the successful implementation of multiple-valued logic (MVL) circuits. A number of heuristic algorithms for the synthesis of (near) minimal sum-of-products (two-level) realisations of MVL functions have been reported in the literature. The direct cover (DC) technique is one such algorithm. The ant colony optimisation (ACO) algorithm is a meta-heuristic that uses constructive greediness to explore a large solution space in finding (near) optimal solutions. The ACO algorithm mimics the behaviour of real-world ants in using the shortest path to reach food sources. We have previously introduced an ACO-based heuristic for the synthesis of two-level MVL functions. In this article, we introduce the ACO-DC hybrid technique for the synthesis of multi-level MVL functions. The basic idea is to use an ant to decompose a given MVL function into a number of levels and then synthesise each sub-function using a DC-based technique. The results obtained using the proposed approach are compared to those obtained using existing techniques reported in the literature. A benchmark set consisting of 50,000 randomly generated 2-variable 4-valued functions is used in the comparison. The results obtained using the proposed ACO-DC technique are shown to produce efficient realisations in terms of the average number of gates (as a measure of chip area) needed for the synthesis of a given MVL function.

  4. Large Terrain Modeling and Visualization for Planets

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan; Cameron, Jonathan; Lim, Christopher

    2011-01-01

    Physics-based simulations are actively used in the design, testing, and operations phases of surface and near-surface planetary space missions. One of the challenges in real-time simulation is the ability to handle large multi-resolution terrain data sets within models as well as for visualization. In this paper, we describe special techniques that we have developed for visualization, paging, and data storage for dealing with these large data sets. The visualization technique uses a real-time GPU-based continuous level-of-detail method that delivers performance of multiple frames per second even for planetary-scale terrain models.

  5. Non-uniform refractive index field measurement based on light field imaging technique

    NASA Astrophysics Data System (ADS)

    Du, Xiaokun; Zhang, Yumin; Zhou, Mengjie; Xu, Dong

    2018-02-01

    In this paper, a method for measuring a non-uniform refractive index field based on the light field imaging technique is proposed. First, the light field camera is used to collect four-dimensional light field data; the data are then decoded according to the light field imaging principle to obtain image sequences of the refractive index field at different acquisition angles. Subsequently, the PIV (Particle Image Velocimetry) technique is used to extract the ray offset of each image. Finally, the distribution of the non-uniform refractive index field is calculated by inverting the deflection of the light rays. Compared with traditional optical methods, which require multiple optical detectors to synchronously collect data from multiple angles, the method proposed in this paper requires only a light field camera and a single shot. The effectiveness of the method was verified by an experiment that quantitatively measured the distribution of the refractive index field above the flame of an alcohol lamp.
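
    The PIV-style offset extraction mentioned above reduces, in its simplest form, to estimating the shift between two intensity profiles by cross-correlation. The sketch below uses a synthetic 1D profile and a known pixel shift; real PIV correlates 2D interrogation windows between image pairs.

```python
import numpy as np

rng = np.random.default_rng(3)
profile = rng.random(128)
profile -= profile.mean()        # centering makes the correlation peak distinct
shift_true = 5
shifted = np.roll(profile, shift_true)  # reference row displaced by 5 pixels

# Full cross-correlation; the argmax gives the displacement in pixels.
corr = np.correlate(shifted, profile, mode="full")
lag = int(np.argmax(corr)) - (len(profile) - 1)
print(lag)  # recovered pixel offset: 5
```

    In the measurement above, these per-window offsets are the ray deflections that are subsequently inverted to reconstruct the refractive index field.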

  6. Application of decentralized cooperative problem solving in dynamic flexible scheduling

    NASA Astrophysics Data System (ADS)

    Guan, Zai-Lin; Lei, Ming; Wu, Bo; Wu, Ya; Yang, Shuzi

    1995-08-01

    The object of this study is to discuss an intelligent solution to the problem of task-allocation in shop floor scheduling. For this purpose, the technique of distributed artificial intelligence (DAI) is applied. Intelligent agents (IAs) are used to realize decentralized cooperation, and negotiation is realized by using message passing based on the contract net model. Multiple agents, such as manager agents, workcell agents, and workstation agents, make game-like decisions based on multiple criteria evaluations. This procedure of decentralized cooperative problem solving makes local scheduling possible. And by integrating such multiple local schedules, dynamic flexible scheduling for the whole shop floor production can be realized.
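
    The contract-net negotiation described above can be sketched as a single announce-bid-award round: a manager announces a task, workcell agents reply with cost bids, and the task goes to the lowest bidder. The agents, their load and speed figures, and the bid formula are all invented for illustration.

```python
class WorkcellAgent:
    def __init__(self, name, load, speed):
        self.name, self.load, self.speed = name, load, speed

    def bid(self, task_size):
        # Bid = time until the cell is free plus time to process the new task.
        return self.load + task_size / self.speed

def award(task_size, agents):
    """Manager role: collect bids and award the task to the cheapest cell."""
    bids = {a.name: a.bid(task_size) for a in agents}
    winner = min(bids, key=bids.get)
    return winner, bids

agents = [WorkcellAgent("cell_A", load=4.0, speed=2.0),
          WorkcellAgent("cell_B", load=2.0, speed=3.0),
          WorkcellAgent("cell_C", load=0.5, speed=0.5)]
winner, bids = award(task_size=6.0, agents=agents)
print(winner, bids)
```

    In the full scheme this round repeats across manager, workcell, and workstation levels, which is how the decentralized local schedules combine into a shop-floor schedule.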

  7. Monitoring of self-healing composites: a nonlinear ultrasound approach

    NASA Astrophysics Data System (ADS)

    Malfense Fierro, Gian-Piero; Pinto, Fulvio; Dello Iacono, Stefania; Martone, Alfonso; Amendola, Eugenio; Meo, Michele

    2017-11-01

    Self-healing composites using a thermally mendable polymer based on the Diels-Alder reaction were fabricated and subjected to multiple damage loads. Unlike traditional destructive methods, this work presents a nonlinear ultrasound technique to evaluate the structural recovery of the proposed self-healing laminate structures. The results were compared to computed tomography and linear ultrasound methods. The laminates were subjected to multiple loading and healing cycles, and the induced damage and recovery at each stage were evaluated. The results highlight the added advantage of using a nonlinear-based methodology to monitor the structural recovery of reversibly cross-linked epoxy with efficient recycling and multiple self-healing capability.

  8. [MLPA technique--principles and use in practice].

    PubMed

    Rusu, Cristina; Sireteanu, Adriana; Puiu, Maria; Skrypnyk, Cristina; Tomescu, E; Csep, Katalin; Creţ, Victoria; Barbarii, Ligia

    2007-01-01

    MLPA (Multiplex Ligation-dependent Probe Amplification) is a recently introduced method, based on the PCR principle, useful for the detection of different genetic abnormalities (aneuploidies, gene deletions/duplications, subtelomeric rearrangements, methylation status, etc.). The technique is simple, reliable, and cheap. We present this method to discuss its importance for a modern genetic service and to underline its multiple advantages.

  9. Innovative Visualization Techniques applied to a Flood Scenario

    NASA Astrophysics Data System (ADS)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user to collaboratively explore, present and communicate visually complex and dynamic data. Here we present these concepts in the context of a 4-hour flood scenario from Lisbon in 2010, with data that includes measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain two of these in more detail that are not currently in common use for visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and capture the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it.
The second technique, web-based linked views, uses multiple windows that respond interactively to user selections: when an object is selected and changed in one window, it automatically updates in all the other windows. These concepts can be part of a collaborative platform in which multiple people share and work together on the data via online access, which also allows remote usage from a mobile platform. Storytelling augments analysis and decision-making capabilities, allowing users to assimilate complex situations and reach informed decisions, in addition to helping the public visualize information. In our visualization scenario, developed in the context of the VA-4D project for the European Space Agency (see http://www.ca3-uninova.org/project_va4d), we make use of the GAV (GeoAnalytics Visualization) framework, a web-oriented visual analytics application based on multiple interactive views. The final visualization that we produce includes multiple interactive views, including a dynamic multi-layer map surrounded by other visualizations such as bar charts, time graphs and scatter plots. The map provides flood and building information on top of a base city map (street maps and/or satellite imagery provided by online map services such as Google Maps, Bing Maps, etc.). Damage over time for selected buildings, damage for all buildings at a chosen time period, and the correlation between damage and water depth can be analysed in the other views. This interactive web-based visualization, which incorporates storytelling, web-based linked views, and other visualization techniques for a 4-hour flood event in Lisbon in 2010, can be found online at http://www.ncomva.se/flash/projects/esa/flooding/.
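
The snapshot mechanism described above (capture the application state, share it as a hyperlink, recreate it elsewhere) can be sketched in a few lines. The GAV framework's actual snapshot format is not given in the abstract, so the state fields, URL, and encoding below are illustrative assumptions only:

```python
import base64
import json

def make_snapshot(state):
    """Encode an application state (time step, selected layers, highlighted
    regions, etc.) as a URL-safe token that can be appended to a hyperlink."""
    raw = json.dumps(state, sort_keys=True).encode("utf-8")
    return base64.urlsafe_b64encode(raw).decode("ascii")

def load_snapshot(token):
    """Recreate the application state from a shared snapshot token."""
    return json.loads(base64.urlsafe_b64decode(token.encode("ascii")))

# Hypothetical snapshot of the flood scenario at minute 90.
state = {"time_step": 90,
         "layers": ["flood_depth", "buildings"],
         "highlight": [{"lat": 38.7078, "lon": -9.1366}]}
link = "https://example.org/flood#" + make_snapshot(state)
```

A collaborator opening `link` would decode the fragment after `#` and restore the same view, which is the essence of the state-as-hyperlink idea.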

  10. An Approach to Economic Dispatch with Multiple Fuels Based on Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Sriyanyong, Pichet

    2011-06-01

    Particle Swarm Optimization (PSO), a stochastic optimization technique, shows superiority to other evolutionary computation techniques in terms of lower computation time, easy implementation, high-quality solutions, stable convergence characteristics, and independence from initialization. For this reason, this paper proposes the application of PSO to the Economic Dispatch (ED) problem, which occurs in the operational planning of power systems. In this study, the ED problem is categorized by the characteristics of its cost function: the ED problem with a smooth cost function and the ED problem with multiple fuels. Taking multiple fuels into account makes the problem more realistic. The experimental results show that the proposed PSO algorithm is more efficient than the previous approaches under consideration as well as highly promising for real-world applications.
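
As a minimal illustration of the approach, the sketch below runs a textbook PSO on a toy two-generator dispatch problem, with the demand constraint handled by a penalty term. The generator cost coefficients, swarm parameters, and penalty weight are invented for illustration and are not from the paper:

```python
import random

def pso(cost, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization: minimize `cost` over a box [lo, hi]^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal best positions
    pbest_val = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = cost(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

def dispatch_cost(p, demand=400.0):
    """Quadratic fuel costs for two generators plus a penalty for unmet demand
    (all coefficients are hypothetical)."""
    c = 0.004 * p[0] ** 2 + 2.0 * p[0] + 0.006 * p[1] ** 2 + 1.5 * p[1]
    return c + 1e3 * abs(p[0] + p[1] - demand)

best, val = pso(dispatch_cost, dim=2, bounds=(0.0, 300.0))
```

The multiple-fuels variant in the paper would replace `dispatch_cost` with a piecewise cost (one quadratic segment per fuel), which PSO handles without modification since it never needs gradients.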

  11. Multivariate spatiotemporal visualizations for mobile devices in Flyover Country

    NASA Astrophysics Data System (ADS)

    Loeffler, S.; Thorn, R.; Myrbo, A.; Roth, R.; Goring, S. J.; Williams, J.

    2017-12-01

    Visualizing and interacting with complex multivariate and spatiotemporal datasets on mobile devices is challenging due to their smaller screens, reduced processing power, and limited data connectivity. Pollen data require visualizing pollen assemblages spatially, temporally, and across multiple taxa to understand plant community dynamics through time. Drawing from cartography, information visualization, and paleoecology, we have created new mobile-first visualization techniques that represent multiple taxa across many sites and enable user interaction. Using pollen datasets from the Neotoma Paleoecology Database as a case study, the visualization techniques allow ecological patterns and trends to be quickly understood on a mobile device compared to traditional pollen diagrams and maps. This flexible visualization system can be used for datasets beyond pollen, with the only requirements being point-based localities and multiple variables changing through time or depth.

  12. SU-F-T-349: Dosimetric Comparison of Three Different Simultaneous Integrated Boost Irradiation Techniques for Multiple Brain Metastases: Intensity-Modulatedradiotherapy, Hybrid Intensity-Modulated Radiotherapy and Volumetric Modulated Arc Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, X; Sun, T; Yin, Y

    Purpose: To study the dosimetric impact of intensity-modulated radiotherapy (IMRT), hybrid intensity-modulated radiotherapy (h-IMRT) and volumetric modulated arc therapy (VMAT) for whole-brain radiotherapy (WBRT) with simultaneous integrated boost in patients with multiple brain metastases. Methods: Ten patients with multiple brain metastases were included in this analysis. The prescribed dose was 45 Gy to the whole brain (PTVWBRT) and 55 Gy to individual brain metastases (PTVboost), delivered simultaneously in 25 fractions. Three treatment techniques were designed: a 7-field equally spaced IMRT plan, a hybrid IMRT plan and a VMAT plan with two 358° arcs. In the hybrid IMRT plan, two fields (90° and 270°) were planned to the whole brain and used as a base-dose plan; a 5-field IMRT plan was then optimized on top of the two-field plan. The dose distribution in the target, the dose to the organs at risk and the total MU of the three techniques were compared. Results: For the target dose, conformity and homogeneity in the PTV, no statistically significant differences were observed among the three techniques. For the maximum dose in the bilateral lenses and the mean dose in the bilateral eyes, the IMRT and h-IMRT plans showed the highest and lowest values, respectively. No statistically significant differences were observed in the doses to the optic nerves and brainstem. For the monitor units, the IMRT and VMAT plans showed the highest and lowest values, respectively. Conclusion: For WBRT with simultaneous integrated boost in patients with multiple brain metastases, hybrid IMRT could reduce the doses to the lenses and eyes. It is feasible for patients with brain metastases.

  13. Effect of denoising on supervised lung parenchymal clusters

    NASA Astrophysics Data System (ADS)

    Jayamani, Padmapriya; Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.

    2012-03-01

    Denoising is a critical preconditioning step for quantitative analysis of medical images. Despite promises of more consistent diagnosis, denoising techniques are seldom explored in clinical settings. While this may be attributed to the esoteric nature of the parameter-sensitive algorithms, the lack of quantitative measures of their efficacy in enhancing clinical decision making is a primary cause of physician apathy. This paper addresses this issue by exploring the effect of denoising on the integrity of supervised lung parenchymal clusters. Multiple Volumes of Interest (VOIs) were selected across multiple high-resolution CT scans to represent samples of different patterns (normal, emphysema, ground glass, honeycombing and reticular). The VOIs were labeled through consensus of four radiologists. The original datasets were filtered by multiple denoising techniques (median filtering, anisotropic diffusion, bilateral filtering and non-local means) and the corresponding filtered VOIs were extracted. A plurality of cluster indices based on multiple histogram-based pair-wise similarity measures was used to assess the quality of supervised clusters in the original and filtered space. The resultant rank orders were analyzed using the Borda criteria to find the denoising-similarity measure combination that has the best cluster quality. Our exhaustive analysis reveals (a) for a number of similarity measures, the cluster quality is inferior in the filtered space; and (b) for measures that benefit from denoising, simple median filtering outperforms non-local means and bilateral filtering. Our study suggests the need to judiciously choose, if required, a denoising technique that does not deteriorate the integrity of supervised clusters.
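
Of the four denoising techniques compared above, median filtering is the simplest to illustrate. The following self-contained sketch (a toy 5x5 patch, not CT data) shows how a 3x3 median filter removes an impulse outlier while leaving a homogeneous region untouched, which is why it can preserve cluster integrity better than smoothing filters:

```python
def median_filter(img, k=3):
    """k x k median filter with edge replication; img is a list of lists."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = []
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy = min(h - 1, max(0, y + dy))  # replicate edges
                    xx = min(w - 1, max(0, x + dx))
                    vals.append(img[yy][xx])
            vals.sort()
            out[y][x] = vals[len(vals) // 2]         # window median
    return out

# A flat 5x5 patch with one impulse ("salt") pixel: the median filter
# removes the outlier without blurring the constant background.
patch = [[10.0] * 5 for _ in range(5)]
patch[2][2] = 255.0
clean = median_filter(patch)
```

A mean filter on the same patch would smear the impulse into its neighbours; the median discards it entirely, which is the property the abstract's rank-order comparison rewards.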

  14. Change detection from remotely sensed images: From pixel-based to object-based approaches

    NASA Astrophysics Data System (ADS)

    Hussain, Masroor; Chen, Dongmei; Cheng, Angela; Wei, Hui; Stanley, David

    2013-06-01

    The appetite for up-to-date information about the earth's surface is ever increasing, as such information provides a base for a large number of applications, including local, regional and global resources monitoring, land-cover and land-use change monitoring, and environmental studies. The data from remote sensing satellites provide opportunities to acquire information about land at varying resolutions and have been widely used for change detection studies. A large number of change detection methodologies and techniques utilizing remotely sensed data have been developed, and newer techniques are still emerging. This paper begins with a discussion of the traditional pixel-based and (mostly) statistics-oriented change detection techniques, which focus mainly on spectral values and mostly ignore the spatial context. This is succeeded by a review of object-based change detection techniques. Finally, there is a brief discussion of spatial data mining techniques in image processing and change detection from remote sensing data. The merits and issues of different techniques are compared. The impact of the exponential increase in image data volume and in the number of sensors, and the associated challenges for the development of change detection techniques, are highlighted. With the wide use of very-high-resolution (VHR) remotely sensed images, object-based methods and data mining techniques may have more potential in change detection.

  15. Visualization and dissemination of multidimensional proteomics data comparing protein abundance during Caenorhabditis elegans development.

    PubMed

    Riffle, Michael; Merrihew, Gennifer E; Jaschob, Daniel; Sharma, Vagisha; Davis, Trisha N; Noble, William S; MacCoss, Michael J

    2015-11-01

    Regulation of protein abundance is a critical aspect of cellular function, organism development, and aging. Alternative splicing may give rise to multiple possible proteoforms of gene products where the abundance of each proteoform is independently regulated. Understanding how the abundances of these distinct gene products change is essential to understanding the underlying mechanisms of many biological processes. Bottom-up proteomics mass spectrometry techniques may be used to estimate protein abundance indirectly by sequencing and quantifying peptides that are later mapped to proteins based on sequence. However, quantifying the abundance of distinct gene products is routinely confounded by peptides that map to multiple possible proteoforms. In this work, we describe a technique that may be used to help mitigate the effects of confounding ambiguous peptides and multiple proteoforms when quantifying proteins. We have applied this technique to visualize the distribution of distinct gene products for the whole proteome across 11 developmental stages of the model organism Caenorhabditis elegans. The result is a large multidimensional dataset for which web-based tools were developed for visualizing how translated gene products change during development and identifying possible proteoforms. The underlying instrument raw files and tandem mass spectra may also be downloaded. The data resource is freely available on the web at http://www.yeastrc.org/wormpes/.

  16. Visualization and Dissemination of Multidimensional Proteomics Data Comparing Protein Abundance During Caenorhabditis elegans Development

    NASA Astrophysics Data System (ADS)

    Riffle, Michael; Merrihew, Gennifer E.; Jaschob, Daniel; Sharma, Vagisha; Davis, Trisha N.; Noble, William S.; MacCoss, Michael J.

    2015-11-01

    Regulation of protein abundance is a critical aspect of cellular function, organism development, and aging. Alternative splicing may give rise to multiple possible proteoforms of gene products where the abundance of each proteoform is independently regulated. Understanding how the abundances of these distinct gene products change is essential to understanding the underlying mechanisms of many biological processes. Bottom-up proteomics mass spectrometry techniques may be used to estimate protein abundance indirectly by sequencing and quantifying peptides that are later mapped to proteins based on sequence. However, quantifying the abundance of distinct gene products is routinely confounded by peptides that map to multiple possible proteoforms. In this work, we describe a technique that may be used to help mitigate the effects of confounding ambiguous peptides and multiple proteoforms when quantifying proteins. We have applied this technique to visualize the distribution of distinct gene products for the whole proteome across 11 developmental stages of the model organism Caenorhabditis elegans. The result is a large multidimensional dataset for which web-based tools were developed for visualizing how translated gene products change during development and identifying possible proteoforms. The underlying instrument raw files and tandem mass spectra may also be downloaded. The data resource is freely available on the web at http://www.yeastrc.org/wormpes/.

  17. Effects of line-of-sight velocity on spaced-antenna measurements, part 3.5A

    NASA Technical Reports Server (NTRS)

    Royrvik, O.

    1984-01-01

    Horizontal wind velocities in the upper atmosphere, particularly the mesosphere, have been measured using a multitude of different techniques. Most techniques are based on stated or unstated assumptions about the wind field that may or may not be true. Some problems with the spaced antenna drifts (SAD) technique that usually appear to be overlooked are investigated. These problems are not unique to the SAD technique; very similar considerations apply to measurement of horizontal wind using multiple-beam Doppler radars as well. Simply stated, the SAD technique relies on scattering from multiple scatterers within an antenna beam of fairly large beam width. The combination of signals with random phase gives rise to an interference pattern on the ground. This pattern will drift across the ground with a velocity twice that of the ionospheric irregularities from which the radar signals are scattered. By using spaced receivers and measuring time delays of the signal fading in different antennas, it is possible to estimate the horizontal drift velocities.

  18. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm

    PubMed Central

    Kim, Mi Jeong; Maeng, Sung Joon; Cho, Yong Soo

    2015-01-01

    In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation. PMID:26225974
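
The paper's bio-inspired update rule is not reproduced in the abstract; as a loose illustration of distributed synchronization from neighbor signals only, the sketch below runs a simple consensus update in which each node nudges its clock toward the average of its neighbors' clocks (the topology, gain, and initial offsets are invented, and propagation-delay compensation is omitted):

```python
def consensus_sync(offsets, neighbors, rounds=50, gain=0.5):
    """Each node repeatedly moves its clock offset toward the mean of its
    neighbors' offsets, using only locally received values (no central
    reference and no feedback channel)."""
    off = list(offsets)
    for _ in range(rounds):
        new = []
        for i, nbrs in enumerate(neighbors):
            avg = sum(off[j] for j in nbrs) / len(nbrs)
            new.append(off[i] + gain * (avg - off[i]))
        off = new
    return off

# 4-node ring topology with initial clock offsets in microseconds.
ring = [[1, 3], [0, 2], [1, 3], [0, 2]]
final = consensus_sync([0.0, 40.0, 10.0, 30.0], ring)
spread = max(final) - min(final)
```

After a few dozen rounds the offsets collapse to a common value, mirroring the convergence property the paper evaluates by simulation; the paper's actual scheme additionally accounts for time differences of arrival between nodes.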

  19. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm.

    PubMed

    Kim, Mi Jeong; Maeng, Sung Joon; Cho, Yong Soo

    2015-07-28

    In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation.

  20. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensable technique used in the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as for biological studies. We herein describe the application of a large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  1. Multi-Step Deep Reactive Ion Etching Fabrication Process for Silicon-Based Terahertz Components

    NASA Technical Reports Server (NTRS)

    Reck, Theodore (Inventor); Perez, Jose Vicente Siles (Inventor); Lee, Choonsup (Inventor); Cooper, Ken B. (Inventor); Jung-Kubiak, Cecile (Inventor); Mehdi, Imran (Inventor); Chattopadhyay, Goutam (Inventor); Lin, Robert H. (Inventor); Peralta, Alejandro (Inventor)

    2016-01-01

    A multi-step silicon etching process has been developed to fabricate silicon-based terahertz (THz) waveguide components. This technique provides precise dimensional control across multiple etch depths with batch processing capabilities. Nonlinear and passive components such as mixers, multipliers, waveguides, hybrids, OMTs and twists have been fabricated and integrated into a small silicon package. This fabrication technique enables a wafer-stacking architecture to provide ultra-compact multi-pixel receiver front-ends in the THz range.

  2. Protein fold recognition using geometric kernel data fusion.

    PubMed

    Zakeri, Pooya; Jeuris, Ben; Vandebril, Raf; Moreau, Yves

    2014-07-01

    Various approaches based on features extracted from protein sequences and often machine learning methods have been used in the prediction of protein folds. Finding an efficient technique for integrating these different protein features has received increasing attention. In particular, kernel methods are an interesting class of techniques for integrating heterogeneous data. Various methods have been proposed to fuse multiple kernels. Most techniques for multiple kernel learning focus on learning a convex linear combination of base kernels. In addition to the limitation of linear combinations, working with such approaches could cause a loss of potentially useful information. We design several techniques to combine kernel matrices by taking more involved, geometry-inspired means of these matrices instead of convex linear combinations. We consider various sequence-based protein features including information extracted directly from position-specific scoring matrices and local sequence alignment. We evaluate our methods for classification on the SCOP PDB-40D benchmark dataset for protein fold recognition. The best overall accuracy on the protein fold recognition test set obtained by our methods is ∼86.7%. This is an improvement over the results of the best existing approach. Moreover, our computational model has been developed by incorporating the functional domain composition of proteins through a hybridization model. It is observed that by using our proposed hybridization model, the protein fold recognition accuracy is further improved to 89.30%. Furthermore, we investigate the performance of our approach on the protein remote homology detection problem by fusing multiple string kernels. The MATLAB code used for our proposed geometric kernel fusion frameworks is publicly available at http://people.cs.kuleuven.be/∼raf.vandebril/homepage/software/geomean.php?menu=5/. © The Author 2014. Published by Oxford University Press.
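
One concrete instance of a "geometry-inspired mean" of kernel matrices is the Riemannian geometric mean of two symmetric positive-definite matrices, A # B = A^{1/2}(A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}. The sketch below computes it for two synthetic Gram matrices (not the paper's SCOP features); whether this exact mean matches the authors' formulation is an assumption:

```python
import numpy as np

def spd_sqrt(M):
    """Matrix square root of a symmetric positive-definite matrix
    via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(w)) @ V.T

def geometric_mean(A, B):
    """Riemannian geometric mean of two SPD kernel matrices:
    A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}."""
    As = spd_sqrt(A)
    Ais = np.linalg.inv(As)
    return As @ spd_sqrt(Ais @ B @ Ais) @ As

# Two toy 3x3 kernel (Gram) matrices from random features; the identity
# is added to guarantee positive definiteness (data is synthetic).
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(3, 5)), rng.normal(size=(3, 5))
K1 = X1 @ X1.T + np.eye(3)
K2 = X2 @ X2.T + np.eye(3)
K = geometric_mean(K1, K2)
```

Unlike a convex linear combination, this fused matrix is still SPD (so still a valid kernel) but is symmetric in its arguments and sits "between" the inputs on the SPD manifold rather than on a straight line.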

  3. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    PubMed

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.

  4. [Difference between perinatal mortality in multiple pregnancies obtained spontaneously versus assisted reproduction].

    PubMed

    del Rayo Rivas-Ortiz, Yazmín; Hernández-Herrera, Ricardo Jorge

    2010-06-01

    Assisted reproduction techniques have recently become more common, increasing multiple pregnancies and adverse perinatal outcomes. Some authors report increased mortality among products of multiple pregnancies obtained by assisted reproduction techniques vs. those conceived spontaneously, although other authors found no significant difference. To evaluate the mortality rate of multiple pregnancies, comparing those obtained by assisted reproduction vs. spontaneous conception. Retrospective, observational and comparative study. We included pregnant women with 3 or more products who attended the Unidad Médica de Alta Especialidad No. 23, IMSS, in Monterrey, NL (Mexico), between 2002-2008. We compared the number of complicated pregnancies and dead products obtained by a technique of assisted reproduction vs. spontaneous conception. 68 multiple pregnancies were included. On average, spontaneously conceived fetuses had more weeks of gestation and higher birth weight than those achieved by assisted reproduction techniques (p = ns). 20.5% (14/68) of multiple pregnancies had one or more fatal events: 10/40 (25%) by assisted reproduction techniques vs. 4/28 (14%) of spontaneous multiple pregnancies (p = 0.22). 21/134 (16%) of the products conceived by assisted reproduction techniques and 6/88 (7%) of those conceived spontaneously (p < 0.03) died. 60% of all multiple pregnancies were obtained by a technique of assisted reproduction, and 21% of the cases had one or more fatal events (11% more in pregnancies achieved by assisted reproduction techniques). 12% of the products of multiple pregnancies died (9% more in those obtained by a technique of assisted reproduction).

  5. Patch-Based Super-Resolution of MR Spectroscopic Images: Application to Multiple Sclerosis

    PubMed Central

    Jain, Saurabh; Sima, Diana M.; Sanaei Nezhad, Faezeh; Hangel, Gilbert; Bogner, Wolfgang; Williams, Stephen; Van Huffel, Sabine; Maes, Frederik; Smeets, Dirk

    2017-01-01

    Purpose: Magnetic resonance spectroscopic imaging (MRSI) provides complementary information to conventional magnetic resonance imaging. Acquiring high resolution MRSI is time consuming and requires complex reconstruction techniques. Methods: In this paper, a patch-based super-resolution method is presented to increase the spatial resolution of metabolite maps computed from MRSI. The proposed method uses high resolution anatomical MR images (T1-weighted and Fluid-attenuated inversion recovery) to regularize the super-resolution process. The accuracy of the method is validated against conventional interpolation techniques using a phantom, as well as simulated and in vivo acquired human brain images of multiple sclerosis subjects. Results: The method preserves tissue contrast and structural information, and matches well with the trend of acquired high resolution MRSI. Conclusions: These results suggest that the method has potential for clinically relevant neuroimaging applications. PMID:28197066

  6. An Agent-Based Interface to Terrestrial Ecological Forecasting

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Nemani, Ramakrishna; Pang, Wan-Lin; Votava, Petr; Etzioni, Oren

    2004-01-01

    This paper describes a flexible agent-based ecological forecasting system that combines multiple distributed data sources and models to provide near-real-time answers to questions about the state of the Earth system. We build on novel techniques in automated constraint-based planning and natural language interfaces to automatically generate data products based on descriptions of the desired data products.

  7. Building a new predictor for multiple linear regression technique-based corrective maintenance turnaround time.

    PubMed

    Cruz, Antonio M; Barr, Cameron; Puñales-Pozo, Elsa

    2008-01-01

    This research's main goals were to build a predictor for a turnaround time (TAT) indicator for estimating its values, and to use a numerical clustering technique for finding possible causes of undesirable TAT values. The following stages were used: domain understanding, data characterisation and sample reduction, and insight characterisation. Multiple linear regression (to build the TAT indicator predictor) and clustering techniques were used to improve corrective maintenance task efficiency in a clinical engineering department (CED). The indicator being studied was turnaround time (TAT). Multiple linear regression was used for building a predictive TAT value model. The variables contributing to this model were clinical engineering department response time (CE(rt), 0.415 positive coefficient), stock service response time (Stock(rt), 0.734 positive coefficient), priority level (0.21 positive coefficient) and service time (0.06 positive coefficient). The regression process showed heavy reliance on Stock(rt), CE(rt) and priority, in that order. Clustering techniques revealed the main causes of high TAT values. This examination has provided a means for analysing current technical service quality and effectiveness. In doing so, it has demonstrated a process for identifying areas and methods of improvement, and a model against which to analyse these methods' effectiveness.
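
The reported model can be sketched with ordinary least squares on synthetic records. The coefficient values (0.415, 0.734, 0.21, 0.06) come from the abstract; the data ranges, noise level, and zero intercept are invented assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic maintenance records (invented ranges): columns are CE response
# time, stock response time, priority level, and service time.
X = rng.uniform([1.0, 1.0, 1.0, 0.5], [24.0, 48.0, 3.0, 8.0], size=(100, 4))
# Generate TAT from the abstract's coefficients plus noise (intercept assumed 0).
beta_true = np.array([0.415, 0.734, 0.21, 0.06])
y = X @ beta_true + rng.normal(0.0, 0.5, size=100)

# Ordinary least squares recovers the coefficients from the records.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Predicted TAT for one hypothetical job (2 h CE, 5 h stock, priority 2, 1 h service).
pred = float(np.array([2.0, 5.0, 2.0, 1.0]) @ beta_hat)
```

The dominance of `Stock(rt)` and `CE(rt)` noted in the abstract shows up directly in the fitted weights, since those two coefficients are the largest.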

  8. Accounting for estimated IQ in neuropsychological test performance with regression-based techniques.

    PubMed

    Testa, S Marc; Winicki, Jessica M; Pearlson, Godfrey D; Gordon, Barry; Schretlen, David J

    2009-11-01

    Regression-based normative techniques account for variability in test performance associated with multiple predictor variables and generate expected scores based on algebraic equations. Using this approach, we show that estimated IQ, based on oral word reading, accounts for 1-9% of the variability beyond that explained by individual differences in age, sex, race, and years of education for most cognitive measures. These results confirm that adding estimated "premorbid" IQ to demographic predictors in multiple regression models can incrementally improve the accuracy with which regression-based norms (RBNs) benchmark expected neuropsychological test performance in healthy adults. It remains to be seen whether the incremental variance in test performance explained by estimated "premorbid" IQ translates to improved diagnostic accuracy in patient samples. We describe these methods, and illustrate the step-by-step application of RBNs with two cases. We also discuss the rationale, assumptions, and caveats of this approach. More broadly, we note that adjusting test scores for age and other characteristics might actually decrease the accuracy with which test performance predicts absolute criteria, such as the ability to drive or live independently.
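
A regression-based norm reduces to an algebraic equation plus a residual standard deviation: predict the expected score from demographics and estimated IQ, then standardize the observed-minus-expected discrepancy. The sketch below is a hypothetical example (all coefficients, predictors, and the residual SD are invented, not from the paper):

```python
def expected_score(age, edu_years, est_iq):
    """Hypothetical normative equation for a memory test: expected score
    from age, education, and estimated premorbid IQ (coefficients invented)."""
    return 35.0 - 0.20 * age + 0.45 * edu_years + 0.15 * est_iq

def rbn_z(observed, age, edu_years, est_iq, resid_sd=4.5):
    """Standardized discrepancy between observed and demographically expected
    performance; a large negative z flags below-expectation performance."""
    return (observed - expected_score(age, edu_years, est_iq)) / resid_sd

# A 70-year-old with 12 years of education and estimated IQ 110 scoring 38
# is about one residual SD below the expectation for their profile.
z = rbn_z(observed=38.0, age=70, edu_years=12, est_iq=110)
```

Adding estimated IQ as a predictor, as the abstract argues, simply adds one term to `expected_score`; the 1-9% of extra variance it explains tightens `resid_sd` and so sharpens the resulting z-scores.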

  9. Consolidation of Surface Coatings by Friction Stir Techniques

    DTIC Science & Technology

    2010-09-01

    alloy samples were plasma sprayed with a Titanium-Nickel-Chrome coating or a Titanium coating. Single and multiple pass experiments were performed...based coatings onto the Aluminum alloy surface. Results showed that the most successful results were accomplished using a flat, pinless tool, with...properties. Aluminum alloy samples were plasma sprayed with a Titanium-Nickel-Chrome coating or a Titanium coating. Single and multiple pass experiments

  10. The Role of Proteomics in the Diagnosis and Treatment of Women's Cancers: Current Trends in Technology and Future Opportunities

    PubMed Central

    Breuer, Eun-Kyoung Yim; Murph, Mandi M.

    2011-01-01

    Technological and scientific innovations over the last decade have greatly contributed to improved diagnostics, predictive models, and prognosis among cancers affecting women. In fact, an explosion of information in these areas has almost assured future generations that outcomes in cancer will continue to improve. Herein we discuss the current status of breast, cervical, and ovarian cancers as it relates to screening, disease diagnosis, and treatment options. Among the differences in these cancers, it is striking that breast cancer has multiple predictive tests based upon tumor biomarkers and sophisticated, individualized options for prescription therapeutics while ovarian cancer lacks these tools. In addition, cervical cancer leads the way in innovative, cancer-preventative vaccines and multiple screening options to prevent disease progression. For each of these malignancies, emerging proteomic technologies based upon mass spectrometry, stable isotope labeling with amino acids, high-throughput ELISA, tissue or protein microarray techniques, and click chemistry in the pursuit of activity-based profiling can pioneer the next generation of discovery. We will discuss six of the latest techniques to understand proteomics in cancer and highlight research utilizing these techniques with the goal of improvement in the management of women's cancers. PMID:21886869

  11. Consideration of techniques to mitigate the unauthorized 3D printing production of keys

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy; Kerlin, Scott

    2016-05-01

    The illicit production of 3D printed keys based on remote-sensed imagery is problematic as it allows a would-be intruder to access a secured facility without the attack attempt being as obviously detectable as conventional techniques. This paper considers the problem from multiple perspectives. First, it looks at different attack types and considers the prospective attack from a digital information perspective. Second, based on this, techniques for securing keys are considered. Third, the design of keys is considered from the perspective of making them more difficult to duplicate using visible light sensing and 3D printing. Policy and legal considerations are discussed.

  12. One-shot synthetic aperture digital holographic microscopy with non-coplanar angular-multiplexing and coherence gating.

    PubMed

    Lin, Yu-Chih; Tu, Han-Yen; Wu, Xin-Ru; Lai, Xin-Ji; Cheng, Chau-Jern

    2018-05-14

    This paper proposes one-shot synthetic aperture digital holographic microscopy using a combination of angular-multiplexing and coherence gating. The proposed angular-multiplexing technique directs multiple noncoplanar incident beams into the synthetic aperture to create tightly packed passbands and thus extend the spatial frequency spectrum. Coherence gating is performed to prevent self-interference among the multiple beams. Based on the design guideline proposed herein, a phase-only spatial light modulator is employed as an adjustable blazed grating to split multiple noncoplanar beams and perform angular-multiplexing; then, using coherence gating based on low-coherence light, super-resolution imaging is achieved after one-shot acquisition.

  13. A first generation cytogenetic ideogram for the Florida manatee (Trichechus manatus latirostris) based on multiple chromosome banding techniques

    USGS Publications Warehouse

    Gray, B.A.; Zori, Roberto T.; McGuire, P.M.; Bonde, R.K.

    2002-01-01

    Detailed chromosome studies were conducted for the Florida manatee (Trichechus manatus latirostris) utilizing primary chromosome banding techniques (G- and Q-banding). Digital microscopic imaging methods were employed and a standard G-banded karyotype was constructed for both sexes. Based on chromosome banding patterns and measurements obtained in these studies, a standard karyotype and ideogram are proposed. Characterization of additional cytogenetic features of this species by supplemental chromosome banding techniques, C-banding (constitutive heterochromatin), Ag-NOR staining (nucleolar organizer regions), and DA/DAPI staining, was also performed. These studies provide detailed cytogenetic data for T. manatus latirostris, which could enhance future genetic mapping projects and interspecific and intraspecific genomic comparisons by techniques such as zoo-FISH.

  14. Performance Analysis of Blind Subspace-Based Signature Estimation Algorithms for DS-CDMA Systems with Unknown Correlated Noise

    NASA Astrophysics Data System (ADS)

    Zarifi, Keyvan; Gershman, Alex B.

    2006-12-01

    We analyze the performance of two popular blind subspace-based signature waveform estimation techniques proposed by Wang and Poor and Buzzi and Poor for direct-sequence code division multiple-access (DS-CDMA) systems with unknown correlated noise. Using the first-order perturbation theory, analytical expressions for the mean-square error (MSE) of these algorithms are derived. We also obtain simple high SNR approximations of the MSE expressions which explicitly clarify how the performance of these techniques depends on the environmental parameters and how it is related to that of the conventional techniques that are based on the standard white noise assumption. Numerical examples further verify the consistency of the obtained analytical results with simulation results.

  15. Investigation to realize a computationally efficient implementation of the high-order instantaneous-moments-based fringe analysis method

    NASA Astrophysics Data System (ADS)

    Gorthi, Sai Siva; Rajshekhar, Gannavarpu; Rastogi, Pramod

    2010-06-01

    Recently, a high-order instantaneous moments (HIM)-operator-based method was proposed for accurate phase estimation in digital holographic interferometry. The method relies on piece-wise polynomial approximation of the phase and subsequent evaluation of the polynomial coefficients from the HIM operator using single-tone frequency estimation. This work presents a comparative analysis of the performance of different single-tone frequency estimation techniques in HIM-operator-based phase estimation, namely the Fourier transform followed by optimization, estimation of signal parameters by rotational invariance technique (ESPRIT), multiple signal classification (MUSIC), and iterative frequency estimation by interpolation on Fourier coefficients (IFEIF). Simulation and experimental results demonstrate the potential of the IFEIF technique with respect to computational efficiency and estimation accuracy.
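
    As a toy illustration of the interpolation-on-Fourier-coefficients family of single-tone estimators compared above, the sketch below refines an FFT peak by iteratively evaluating Fourier coefficients half a bin either side of the current estimate. The function name and iteration count are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def iterative_interp_freq(x, iterations=3):
    """Single-tone frequency estimate (cycles/sample): coarse FFT peak
    search followed by iterative interpolation on Fourier coefficients
    evaluated half a bin either side of the current estimate."""
    N = len(x)
    n = np.arange(N)
    k = np.argmax(np.abs(np.fft.fft(x)))  # coarse estimate (integer bin)
    delta = 0.0                           # fractional-bin correction
    for _ in range(iterations):
        Xp = np.sum(x * np.exp(-2j * np.pi * (k + delta + 0.5) * n / N))
        Xm = np.sum(x * np.exp(-2j * np.pi * (k + delta - 0.5) * n / N))
        delta += 0.5 * np.real((Xp + Xm) / (Xp - Xm))
    return (k + delta) / N
```

    For a clean tone this converges in two or three iterations; behavior under noise is what the comparative analysis above quantifies.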

  16. Simultaneous multiple non-crossing quantile regression estimation using kernel constraints

    PubMed Central

    Liu, Yufeng; Wu, Yichao

    2011-01-01

    Quantile regression (QR) is a very useful statistical tool for learning the relationship between the response variable and covariates. For many applications, one often needs to estimate multiple conditional quantile functions of the response variable given covariates. Although one can estimate multiple quantiles separately, it is of great interest to estimate them simultaneously. One advantage of simultaneous estimation is that multiple quantiles can share strength among them to gain better estimation accuracy than individually estimated quantile functions. Another important advantage of joint estimation is the feasibility of incorporating simultaneous non-crossing constraints of QR functions. In this paper, we propose a new kernel-based multiple QR estimation technique, namely simultaneous non-crossing quantile regression (SNQR). We use kernel representations for QR functions and apply constraints on the kernel coefficients to avoid crossing. Both unregularised and regularised SNQR techniques are considered. Asymptotic properties such as asymptotic normality of linear SNQR and oracle properties of the sparse linear SNQR are developed. Our numerical results demonstrate the competitive performance of our SNQR over the original individual QR estimation. PMID:22190842

  17. Advancement of multifunctional hybrid nanogel systems: Construction and application in drug co-delivery and imaging technique.

    PubMed

    Ma, Yakun; Ge, Yanxiu; Li, Lingbing

    2017-02-01

    Nanogel-based multifunctional drug delivery systems, especially hybrid nanogels and multicompartment nanogels, have drawn increasingly extensive attention from pharmaceutical researchers because they can achieve superior functionality through the synergistic enhancement of each component's properties. The unique hybrid and compartmentalized structures offer great potential for co-delivery of multiple agents, even agents with different physicochemical properties. Furthermore, hybrid nanogels encapsulating optical and magnetic resonance imaging contrast agents can be utilized in imaging techniques for disease diagnosis. More importantly, stimuli-responsive features can be readily incorporated into nanogel-based multifunctional drug delivery systems for the design of targeted drug release. This review summarizes the construction of diverse hybrid nanogels and multicompartment nanogels. Their application in the co-delivery of multiple agents and of imaging agents for diagnosis, as well as in the design of stimuli-responsive multifunctional nanogels for drug delivery, is also reviewed and discussed, together with future prospects for the application of multifunctional nanogels. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Advanced analysis techniques for uranium assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  19. A QoS Management Technique of Urgent Information Provision in ITS Services Using DSRC for Autonomous Base Stations

    NASA Astrophysics Data System (ADS)

    Shimura, Akitoshi; Aizono, Takeiki; Hiraiwa, Masashi; Sugano, Shigeki

    A QoS management technique based on an autonomous decentralized mobility system, which is an autonomous decentralized system enhanced to provide mobile stations with information about urgent roadway situations, is proposed in this paper. This technique enables urgent messages to be transmitted flexibly and quickly to mobile stations by multiple decentralized base stations using dedicated short range communication, and it supports the easy addition of new base stations. Each station autonomously creates information-delivery communities based on the urgency of the messages it receives through the roadside network and the distances between the senders and receivers. Each station dynamically determines the urgency of messages according to the message content and the speed of the mobile stations. Evaluation of this technique, applied to the Smart Gateway system, which provides driving-assistance services to mobile stations through dedicated short-range communication, demonstrated its effectiveness and its suitability for actual systems.

  20. Traffic handling capability of a broadband indoor wireless network using CDMA multiple access

    NASA Astrophysics Data System (ADS)

    Zhang, Chang G.; Hafez, H. M.; Falconer, David D.

    1994-05-01

    CDMA (code division multiple access) may be an attractive technique for wireless access to broadband services because of its multiple access simplicity and other appealing features. In order to investigate the traffic handling capabilities of a future network providing a variety of integrated services, this paper presents a study of a broadband indoor wireless network supporting high-speed traffic using CDMA. The results are obtained through simulation of an indoor environment and of the traffic capabilities of wireless access to broadband 155.5 Mb/s ATM-SONET networks using the mm-wave band. A distributed system architecture is employed, and the system performance is measured in terms of call blocking probability and dropping probability. The impacts of the base station density, traffic load, average holding time, and variable traffic sources on the system performance are examined. The improvement of system performance achieved by implementing various techniques such as handoff, admission control, power control and sectorization is also investigated.
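
    For context on the call-blocking metric studied above, a classical analytic point of reference is the Erlang-B formula, which assumes Poisson arrivals and blocked calls cleared; these are simplifying assumptions not made by the paper's simulation.

```python
def erlang_b(traffic_erlangs, channels):
    """Blocking probability of an M/M/c/c loss system via the stable
    Erlang-B recursion: B(0) = 1, B(n) = A*B(n-1) / (n + A*B(n-1))."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic_erlangs * b / (n + traffic_erlangs * b)
    return b
```

    As expected, blocking falls as channels are added for a fixed offered load.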

  1. Tumor or abnormality identification from magnetic resonance images using statistical region fusion based segmentation.

    PubMed

    Subudhi, Badri Narayan; Thangaraj, Veerakumar; Sankaralingam, Esakkirajan; Ghosh, Ashish

    2016-11-01

    In this article, a statistical fusion based segmentation technique is proposed to identify different abnormalities in magnetic resonance images (MRI). The proposed scheme follows seed selection, region growing-merging and fusion of multiple image segments. In this process, an image is initially divided into a number of blocks, and for each block we compute the phase component of the Fourier transform. The phase component of each block reflects the gray-level variation within the block but contains a large correlation among its entries. Hence a singular value decomposition (SVD) technique is applied to generate a singular value for each block. A thresholding procedure is then applied to these singular values to identify edgy and smooth regions, and some seed points are selected for segmentation. For each seed point we perform a binary segmentation of the complete MRI, and hence with all seed points we obtain an equal number of binary images. A parcel-based statistical fusion process is used to fuse all the binary images into multiple segments. The effectiveness of the proposed scheme is tested on identifying different abnormalities: prostatic carcinoma detection, tuberculous granuloma identification and intracranial neoplasm or brain tumor detection. The proposed technique is validated by comparing its results against seven state-of-the-art techniques using six performance evaluation measures. Copyright © 2016 Elsevier Inc. All rights reserved.
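
    A minimal sketch of the block-wise Fourier-phase/SVD step described above, assuming a fixed block size and leaving out the thresholding and region-growing stages:

```python
import numpy as np

def block_singular_values(img, bsize=8):
    """For each bsize x bsize block: take the phase of the 2-D FFT and
    summarize it by its largest singular value. Smooth blocks yield
    near-zero values; textured/edgy blocks yield larger ones."""
    h, w = img.shape
    svals = []
    for i in range(0, h - bsize + 1, bsize):
        for j in range(0, w - bsize + 1, bsize):
            phase = np.angle(np.fft.fft2(img[i:i + bsize, j:j + bsize]))
            svals.append(np.linalg.svd(phase, compute_uv=False)[0])
    return np.array(svals)
```

    Thresholding these values separates edgy from smooth blocks, after which seeds for region growing would be chosen.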

  2. Investigation of laser Doppler anemometry in developing a velocity-based measurement technique

    NASA Astrophysics Data System (ADS)

    Jung, Ki Won

    2009-12-01

    Acoustic properties, such as the characteristic impedance and the complex propagation constant, of porous materials have traditionally been characterized using pressure-based measurement techniques employing microphones. Although microphone techniques have evolved since their introduction, the most general form employs two microphones to characterize the acoustic field in one continuous medium. The shortcomings of determining the acoustic field from only two microphones can be overcome by using numerous microphones; however, the use of many microphones requires a careful and intricate calibration procedure. This dissertation uses laser Doppler anemometry (LDA) to establish a new measurement technique that resolves issues inherent in microphone techniques. First, it is based on a single sensor, so calibration is unnecessary when only the overall ratio of the acoustic field is required to characterize a system; this includes measurements of the characteristic impedance and the complex propagation constant. Second, it can handle multiple positional measurements without calibrating the signal at each position. Third, it can measure the three-dimensional components of velocity even in a system with complex geometry. Fourth, it is flexible and not restricted to a particular type of apparatus, provided the apparatus is transparent. LDA is known to possess several disadvantages, such as the requirement of a transparent apparatus, high cost, and the necessity of seeding particles. The technique, based on LDA combined with a curve-fitting algorithm, is validated through measurements on three systems. First, the complex propagation constant of air is measured in a rigidly terminated cylindrical pipe, which has very low dissipation. Second, the radiation impedance of an open-ended pipe is measured. These two parameters can be characterized by the ratio of the acoustic field measured at multiple locations. Third, the power dissipated in a variable RLC load is measured. The three experiments validate the proposed LDA technique. The utility of the LDA method is then extended to the measurement of the complex propagation constant of the air inside a 100 ppi reticulated vitreous carbon (RVC) sample. Compared to measurements in the available studies, the measurement with the 100 ppi RVC sample supports the LDA technique in that it can achieve a low uncertainty in the determined quantity. This dissertation concludes by using the LDA technique for modal decomposition of the plane-wave mode and the (1,1) mode driven simultaneously. This modal decomposition suggests that the LDA technique surpasses microphone-based techniques, because they are unable to determine the acoustic field based on an acoustic model with unconfined propagation constants for each modal component.
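
    The idea of characterizing a system from the ratio of the acoustic field at multiple locations can be sketched for the lossless plane-wave case: two complex velocity samples along the axis determine the incident and reflected wave amplitudes via a 2x2 solve. This is a generic two-point decomposition under stated assumptions, not the dissertation's curve-fitting algorithm.

```python
import numpy as np

def decompose_plane_wave(v1, v2, x1, x2, k):
    """Axial particle velocity of counter-propagating plane waves:
        v(x) = A e^{-jkx} - B e^{+jkx}
    Two complex samples v1, v2 at positions x1, x2 determine the
    incident amplitude A and reflected amplitude B; R = B/A is the
    reflection coefficient. Fails when k*(x2-x1) is a multiple of pi."""
    M = np.array([[np.exp(-1j * k * x1), -np.exp(1j * k * x1)],
                  [np.exp(-1j * k * x2), -np.exp(1j * k * x2)]])
    A, B = np.linalg.solve(M, np.array([v1, v2]))
    return A, B
```

    With more than two sample positions, the same model would be fitted in a least-squares sense, which is the spirit of the multi-position curve fit described above.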

  3. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    NASA Astrophysics Data System (ADS)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

    In the present work, we propose a frequency-domain interferometric imaging (FII) technique for better characterizing the vertical distribution of the atmospheric scatterers detected by MST radars. This is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies. Its objective is to reduce the ambiguity, inherent in the FDI technique, that results from the use of only two adjacent frequencies. Different methods commonly used in antenna array processing are first described within the context of application to the FII technique. These methods are Fourier-based imaging, Capon's method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which appears very promising.
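
    Capon's method, one of the high-resolution estimators listed above, can be sketched for a single-scatterer frequency-domain range profile. All parameters below (carrier frequencies, scatterer range, noise level) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
c = 3e8
freqs = 46.5e6 + 50e3 * np.arange(8)      # 8 closely spaced carriers (Hz)
r0 = 1200.0                               # true scatterer range (m)

# Snapshots: random complex amplitude times the range steering vector,
# plus complex white noise.
a0 = np.exp(-4j * np.pi * freqs * r0 / c)
snaps = rng.standard_normal(200) + 1j * rng.standard_normal(200)
X = np.outer(a0, snaps)
X += 0.1 * (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape))
R = X @ X.conj().T / X.shape[1]           # sample covariance

# Capon (MVDR) range profile: P(r) = 1 / (a(r)^H R^{-1} a(r)),
# with light diagonal loading for numerical stability.
Rinv = np.linalg.inv(R + 1e-6 * np.eye(len(freqs)))
grid = np.arange(0.0, 3000.0, 10.0)       # unambiguous range c/(2*df) = 3 km
A = np.exp(-4j * np.pi * np.outer(freqs, grid) / c)
P = 1.0 / np.real(np.sum(A.conj() * (Rinv @ A), axis=0))
r_hat = grid[np.argmax(P)]
```

    The Capon profile peaks at the scatterer range with a narrower mainlobe than Fourier-based imaging would give for the same frequency aperture.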

  4. Plasmonic nanoparticle lithography: Fast resist-free laser technique for large-scale sub-50 nm hole array fabrication

    NASA Astrophysics Data System (ADS)

    Pan, Zhenying; Yu, Ye Feng; Valuckas, Vytautas; Yap, Sherry L. K.; Vienne, Guillaume G.; Kuznetsov, Arseniy I.

    2018-05-01

    Cheap large-scale fabrication of ordered nanostructures is important for multiple applications in photonics and biomedicine including optical filters, solar cells, plasmonic biosensors, and DNA sequencing. Existing methods are either expensive or have strict limitations on the feature size and fabrication complexity. Here, we present a laser-based technique, plasmonic nanoparticle lithography, which is capable of rapid fabrication of large-scale arrays of sub-50 nm holes on various substrates. It is based on near-field enhancement and melting induced under ordered arrays of plasmonic nanoparticles, which are brought into contact or in close proximity to a desired material and acting as optical near-field lenses. The nanoparticles are arranged in ordered patterns on a flexible substrate and can be attached and removed from the patterned sample surface. At optimized laser fluence, the nanohole patterning process does not create any observable changes to the nanoparticles and they have been applied multiple times as reusable near-field masks. This resist-free nanolithography technique provides a simple and cheap solution for large-scale nanofabrication.

  5. Feasibility of track-based multiple scattering tomography

    NASA Astrophysics Data System (ADS)

    Jansen, H.; Schütze, P.

    2018-04-01

    We present a tomographic technique making use of a gigaelectronvolt electron beam for the determination of the material budget distribution of centimeter-sized objects by means of simulations and measurements. In both cases, the trajectory of electrons traversing a sample under test is reconstructed using a pixel beam-telescope. The width of the deflection angle distribution of electrons undergoing multiple Coulomb scattering at the sample is estimated. Basing the sinogram on position-resolved estimators enables the reconstruction of the original sample using an inverse radon transform. We exemplify the feasibility of this tomographic technique via simulations of two structured cubes—made of aluminium and lead—and via an in-beam measured coaxial adapter. The simulations yield images with FWHM edge resolutions of (177 ± 13) μm and a contrast-to-noise ratio of 5.6 ± 0.2 (7.8 ± 0.3) for aluminium (lead) compared to air. The tomographic reconstruction of a coaxial adapter serves as experimental evidence of the technique and yields a contrast-to-noise ratio of 15.3 ± 1.0 and a FWHM edge resolution of (117 ± 4) μm.
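
    The width of the multiple-Coulomb-scattering deflection-angle distribution that underlies the sinogram can be estimated with the standard Highland/PDG parameterization. This is a textbook formula rather than the paper's reconstruction code; the radiation lengths below are nominal PDG values for aluminium and lead.

```python
import math

def highland_theta0(p_mev, beta, x_over_X0):
    """Highland/PDG width (radians) of the projected multiple-scattering
    angle for a charge-1 particle of momentum p (MeV/c) traversing
    x/X0 radiation lengths of material."""
    return (13.6 / (beta * p_mev)) * math.sqrt(x_over_X0) \
        * (1 + 0.038 * math.log(x_over_X0))

X0_AL = 8.897   # radiation length of aluminium, cm (nominal)
X0_PB = 0.5612  # radiation length of lead, cm (nominal)
```

    For the same thickness, lead's much shorter radiation length yields a larger scattering width than aluminium, which is exactly the material contrast the tomographic reconstruction exploits.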

  6. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on a number of STSIS procedures, we assessed various types of mapping uncertainty, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
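
    The indicator coding on which SIS-type methods (including STSIS) rest can be sketched as follows; the thresholds are illustrative:

```python
import numpy as np

def indicator_transform(values, thresholds):
    """Indicator coding used by sequential indicator simulation:
    i(u; z_k) = 1 if the datum at u is <= threshold z_k, else 0.
    Returns an (n_values, n_thresholds) 0/1 matrix whose column means
    estimate the local cumulative distribution at each threshold."""
    v = np.asarray(values, dtype=float)[:, None]
    z = np.asarray(thresholds, dtype=float)[None, :]
    return (v <= z).astype(int)
```

    In SIS proper, these indicators are kriged along a random path to build conditional probabilities at each unsampled location; STSIS extends the underlying variogram to a non-separable space-time model.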

  7. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on a number of STSIS procedures, we assessed various types of mapping uncertainty, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  8. Central venous access in children: indications, devices, and risks.

    PubMed

    Ares, Guillermo; Hunter, Catherine J

    2017-06-01

    Central venous catheters (CVCs) have a prominent role in the diagnosis and therapy of neonates and children. Herein, we describe the multiple indications for CVC use and the different devices available for central venous access. Given the prevalent use of CVCs, healthcare systems are focused on reducing complications from their use, particularly central line-associated bloodstream infections (CLABSIs). The most up-to-date information available sheds light on best practices and future areas of investigation. Large systematic reviews of randomized trials suggest that ultrasound guidance for placement of CVCs in children is safer than a blind technique, at least for internal jugular vein access. Appropriate catheter tip placement is associated with decreased complications. Furthermore, the prophylactic use of ethanol locks between cycles of parenteral nutrition administration has reduced the rates of CLABSI. A recent randomized trial in pediatric CVCs showed a benefit with antibiotic-coated CVCs. Based on the available evidence, multiple techniques for CVC placement remain valid, including the landmark technique based on practitioner experience, but ultrasound guidance has been shown to decrease complications from line placement. Adherence to CVC care protocols is essential in reducing infectious complications.

  9. Automatic cytometric device using multiple wavelength excitations

    NASA Astrophysics Data System (ADS)

    Rongeat, Nelly; Ledroit, Sylvain; Chauvet, Laurence; Cremien, Didier; Urankar, Alexandra; Couderc, Vincent; Nérin, Philippe

    2011-05-01

    Precise identification of eosinophils, basophils, and specific subpopulations of blood cells (B lymphocytes) in an unconventional automatic hematology analyzer is demonstrated. Our apparatus mixes two excitation radiations by means of an acousto-optic tunable filter to properly control the fluorescence emission of phycoerythrin-cyanin 5 (PC5) conjugated to antibodies (anti-CD20 or anti-CRTH2) and of Thiazole Orange. In this way, our analyzer, which combines techniques of hematology analysis and flow cytometry based on multiple-fluorescence detection, drastically improves the signal-to-noise ratio and decreases the impact of spectral overlap arising from multiple fluorescence emissions.

  10. Distributed Finite-Time Cooperative Control of Multiple High-Order Nonholonomic Mobile Robots.

    PubMed

    Du, Haibo; Wen, Guanghui; Cheng, Yingying; He, Yigang; Jia, Ruting

    2017-12-01

    The consensus problem of multiple nonholonomic mobile robots in the form of a high-order chained structure is considered in this paper. Based on the model features and the finite-time control technique, a finite-time cooperative controller is explicitly constructed which guarantees that consensus of the states is achieved in finite time. As an application of the proposed results, finite-time formation control of multiple wheeled mobile robots is studied and a finite-time formation control algorithm is proposed. A simulation example is given to show the effectiveness of the proposed approach.
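
    The flavor of finite-time consensus can be sketched for the much simpler first-order integrator case: a fractional-power ("sig") feedback of the local disagreement drives the states together in finite time, unlike plain linear consensus. This is a generic illustration under stated assumptions, not the paper's high-order chained-structure controller.

```python
import numpy as np

def sig(x, alpha):
    """Signed power: sig(x)^alpha = sign(x) * |x|^alpha."""
    return np.sign(x) * np.abs(x) ** alpha

def simulate_consensus(x0, adj, alpha=0.5, k=1.0, dt=1e-3, steps=20000):
    """Euler simulation of the first-order finite-time protocol
        u_i = -k * sig( sum_j a_ij (x_i - x_j) )^alpha,  0 < alpha < 1,
    over an undirected graph given by adjacency matrix adj."""
    x = np.array(x0, dtype=float)
    n = len(x)
    for _ in range(steps):
        u = np.zeros(n)
        for i in range(n):
            e = sum(adj[i][j] * (x[i] - x[j]) for j in range(n))
            u[i] = -k * sig(e, alpha)
        x += dt * u
    return x
```

    With alpha = 1 this reduces to ordinary (asymptotic) linear consensus; the fractional power is what yields settling in finite time.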

  11. Overview of Sparse Graph for Multiple Access in Future Mobile Networks

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Li, Baoguo; Li, Erbao; Gong, Zhenghui

    2017-10-01

    Multiple access via sparse graph, such as low density signature (LDS) and sparse code multiple access (SCMA), is a promising technique for future wireless communications. This survey presents an overview of the developments in this burgeoning field, including transmitter structures, extrinsic information transfer (EXIT) chart analysis and comparisons with existing multiple access techniques. This technique enables multiple access under overloaded conditions while achieving satisfactory performance. A message passing algorithm is utilized for multi-user detection at the receiver, and structures of the sparse graph are illustrated in detail. Outlooks and challenges of this technique are also presented.
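
    A minimal sketch of the sparse-graph idea: an illustrative 4 x 6 signature matrix in the spirit of LDS/SCMA, giving 150% overloading (6 users on 4 resources) while each resource sees only 3 colliding users. The particular matrix and symbol values are assumptions for illustration, not taken from the survey.

```python
import numpy as np

# Rows = orthogonal resources, columns = users. Each user spreads over
# only 2 resources (sparse columns), so message passing detection on
# the corresponding factor graph stays tractable despite overloading.
S = np.array([[1, 1, 1, 0, 0, 0],
              [1, 0, 0, 1, 1, 0],
              [0, 1, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1]])

symbols = np.array([1, -1, 1, 1, -1, -1])  # one BPSK symbol per user
received = S @ symbols                     # superposition per resource
```

    The receiver would run a message passing algorithm over this factor graph; the low row weight bounds the per-resource marginalization cost.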

  12. Direct Position Determination of Multiple Non-Circular Sources with a Moving Coprime Array.

    PubMed

    Zhang, Yankui; Ba, Bin; Wang, Daming; Geng, Wei; Xu, Haiyun

    2018-05-08

    Direct position determination (DPD) is currently a hot topic in wireless localization research as it is more accurate than traditional two-step positioning. However, current DPD algorithms are all based on uniform arrays, which have insufficient degrees of freedom and limited estimation accuracy. To improve the DPD accuracy, this paper introduces a coprime array to the position model of multiple non-circular sources with a moving array. To maximize the advantages of this coprime array, we reconstruct the covariance matrix by vectorization, apply a spatial smoothing technique, and combine the subspace data from each measuring position to establish the cost function. Finally, we obtain the position coordinates of the multiple non-circular sources. The complexity of the proposed method is computed and compared with that of other methods, and the Cramér-Rao lower bound of DPD for multiple sources with a moving coprime array is derived. Theoretical analysis and simulation results show that the proposed algorithm is not only applicable to circular sources, but can also improve the positioning accuracy of non-circular sources. Compared with existing two-step positioning algorithms and DPD algorithms based on uniform linear arrays, the proposed technique offers a significant improvement in positioning accuracy with a slight increase in complexity.
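
    The degree-of-freedom advantage of the coprime array mentioned above can be sketched by counting virtual sensors in the difference coarray that emerges when the covariance matrix is vectorized. The geometry follows the standard coprime construction; the specific M, N used in the test are illustrative.

```python
import numpy as np

def coprime_positions(M, N):
    """Coprime array sensor positions (in units of the base spacing d):
    subarray 1 at 0, N, 2N, ..., (M-1)N and
    subarray 2 at 0, M, 2M, ..., (2N-1)M, with M, N coprime."""
    p1 = N * np.arange(M)
    p2 = M * np.arange(2 * N)
    return np.unique(np.concatenate([p1, p2]))

def difference_coarray(pos):
    """Virtual sensor positions obtained from vectorizing the array
    covariance matrix: the set of pairwise differences."""
    return np.unique([a - b for a in pos for b in pos])
```

    The coarray contains far more distinct lags than there are physical sensors, which is what lets subspace methods resolve more sources than a uniform array of the same size.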

  13. Cooperative Robots to Observe Moving Targets: Review.

    PubMed

    Khan, Asif; Rinner, Bernhard; Cavallaro, Andrea

    2018-01-01

    The deployment of multiple robots for achieving a common goal helps to improve the performance, efficiency, and/or robustness in a variety of tasks. In particular, the observation of moving targets is an important multirobot application that still exhibits numerous open challenges, including the effective coordination of the robots. This paper reviews control techniques for cooperative mobile robots monitoring multiple targets. The simultaneous movement of robots and targets makes this problem particularly interesting, and our review systematically addresses this cooperative multirobot problem for the first time. We classify and critically discuss the control techniques: cooperative multirobot observation of multiple moving targets; cooperative search, acquisition, and track; cooperative tracking; and multirobot pursuit-evasion. We also identify the five major elements that characterize this problem, namely, the coordination method, the environment, the target, the robot and its sensor(s). These elements are used to systematically analyze the control techniques. The majority of the studied work is based on simulation and laboratory studies, which may not accurately reflect real-world operational conditions. Importantly, while our systematic analysis is focused on multitarget observation, our proposed classification is useful also for related multirobot applications.

  14. [Comparison between rapid detection method of enzyme substrate technique and multiple-tube fermentation technique in water coliform bacteria detection].

    PubMed

    Sun, Zong-ke; Wu, Rong; Ding, Pei; Xue, Jin-Rong

    2006-07-01

    To compare the rapid detection method of the enzyme substrate technique with the multiple-tube fermentation technique for detecting coliform bacteria in water. Inoculated and real water samples were used to compare the equivalence and false-positive rates of the two methods. The results demonstrate that the enzyme substrate technique is equivalent to the multiple-tube fermentation technique (P = 0.059), and the false-positive rates of the two methods show no statistically significant difference. It is suggested that the enzyme substrate technique can be used as a standard method for evaluating the microbiological safety of water.

  15. Molecular dynamics based enhanced sampling of collective variables with very large time steps.

    PubMed

    Chen, Pei-Yang; Tuckerman, Mark E

    2018-01-14

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
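
    The standard multiple time-step idea that these resonance-free methods build on can be sketched with a plain r-RESPA integrator: slow forces are applied on the outer step, fast forces in a velocity-Verlet inner loop. This illustrates ordinary multiple time-stepping only, not the isokinetic Nosé-Hoover schemes described above, and the force split in the test is an artificial stiff/soft harmonic example.

```python
def respa_step(x, v, m, f_fast, f_slow, dt, n_inner):
    """One r-RESPA step: slow forces kick on the outer time step dt;
    fast forces are integrated with n_inner velocity-Verlet substeps
    of size dt/n_inner."""
    v += 0.5 * dt * f_slow(x) / m       # outer half-kick (slow force)
    h = dt / n_inner
    for _ in range(n_inner):            # inner velocity Verlet (fast force)
        v += 0.5 * h * f_fast(x) / m
        x += h * v
        v += 0.5 * h * f_fast(x) / m
    v += 0.5 * dt * f_slow(x) / m       # outer half-kick (slow force)
    return x, v
```

    The resonance limit mentioned above shows up here: pushing dt past roughly a third of the fast period destabilizes this scheme, which is precisely what the isokinetic-constraint methods remove.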

  16. Molecular dynamics based enhanced sampling of collective variables with very large time steps

    NASA Astrophysics Data System (ADS)

    Chen, Pei-Yang; Tuckerman, Mark E.

    2018-01-01

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.

  17. NetCoDer: A Retransmission Mechanism for WSNs Based on Cooperative Relays and Network Coding

    PubMed Central

    Valle, Odilson T.; Montez, Carlos; Medeiros de Araujo, Gustavo; Vasques, Francisco; Moraes, Ricardo

    2016-01-01

    Some of the most difficult problems to deal with when using Wireless Sensor Networks (WSNs) are related to the unreliable nature of communication channels. In this context, the use of cooperative diversity techniques and the application of network coding concepts may be promising solutions to improve the communication reliability. In this paper, we propose the NetCoDer scheme to address this problem. Its design is based on merging cooperative diversity techniques and network coding concepts. We evaluate the effectiveness of the NetCoDer scheme through both an experimental setup with real WSN nodes and a simulation assessment, comparing NetCoDer performance against state-of-the-art Time Division Multiple Access (TDMA)-based retransmission techniques: BlockACK, Master/Slave and Redundant TDMA. The obtained results highlight that the proposed NetCoDer scheme clearly improves the network performance when compared with other retransmission techniques. PMID:27258280
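
    The network-coding half of the design can be illustrated with the textbook XOR example (a toy sketch, not the actual NetCoDer protocol): a relay that overheard two packets transmits their XOR, and each receiver recovers the one packet it missed from the one it already has.

```python
def xor_packets(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two source packets; receiver 1 lost p2 and receiver 2 lost p1.
p1 = b"\x10\x20\x30"
p2 = b"\x0f\x0e\x0d"

# A relay that overheard both sends a single coded retransmission.
coded = xor_packets(p1, p2)

# Each receiver XORs the coded packet with the packet it already holds.
recovered_p2 = xor_packets(coded, p1)   # receiver 1 repairs its loss
recovered_p1 = xor_packets(coded, p2)   # receiver 2 repairs its loss
```

    One coded retransmission thus repairs two different losses, which is what makes network coding attractive for WSN retransmission schemes.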

  18. Impression of multiple implants using photogrammetry: Description of technique and case presentation

    PubMed Central

    Peñarrocha-Oltra, David; Agustín-Panadero, Rubén; Bagán, Leticia; Giménez, Beatriz

    2014-01-01

    Aim: To describe a technique for registering the positions of multiple dental implants using a system based on photogrammetry. A case is presented in which a prosthetic treatment was performed using this technique. Study Design: Three Euroteknika® dental implants were placed to rehabilitate a 55-year-old male patient with right posterior maxillary edentulism. Three months later, the positions of the implants were registered using a photogrammetry-based stereo-camera (PICcamera®). After processing patient and implant data, special abutments (PICabutment®) were screwed onto each implant. The PICcamera® was then used to capture images of the implant positions, automatically taking 150 images in less than 60 seconds. From this information a file was obtained describing the relative positions – angles and distances – of each implant in vector form. Information regarding the soft tissues was obtained from an alginate impression that was cast in plaster and scanned. A Cr-Co structure was obtained using CAD/CAM, and its passive fit was verified in the patient’s mouth using the Sheffield test and the screw resistance test. Results and Conclusions: Twelve months after loading, peri-implant tissues were healthy and no marginal bone loss was observed. The clinical application of this new system using photogrammetry to record the position of multiple dental implants facilitated the rehabilitation of a patient with posterior maxillary edentulism by means of a prosthesis with optimal fit. The prosthetic process was accurate, fast, simple to apply and comfortable for the patient. Key words: Dental implants, photogrammetry, dental impression technique, CAD/CAM. PMID:24608216

  19. Multiple needle puncturing: balancing the varus knee.

    PubMed

    Bellemans, Johan

    2011-09-09

    The so-called "pie crusting" technique using multiple stab incisions is a well-established procedure for correcting tightness of the iliotibial band in the valgus knee. It is, however, not applicable for balancing the medial side in varus knees because of the risk of iatrogenic transection of the medial collateral ligament (MCL). This article presents our experience with a safer alternative and minimally invasive technique for medial soft tissue balancing, where we make multiple punctures in the MCL using a 19-gauge needle to progressively stretch the MCL until a correct ligament balance is achieved. Our technique requires minimal to no additional soft tissue dissection and can even be performed percutaneously when necessary. This technique, therefore, does not impact the length of the skin or soft tissue incisions. We analyzed 61 cases with varus deformity that were intraoperatively treated using this technique. In 4 other cases, the technique was used as a percutaneous procedure to correct postoperative medial tightness that caused persistent pain on the medial side. The procedure was considered successful when a 2- to 4-mm mediolateral joint line opening was obtained in extension and 2 to 6 mm in flexion. In 62 cases (95%), a progressive correction of medial tightness was achieved according to the above-described criteria. Three cases were overreleased and required compensatory release of the lateral structures and use of a thicker insert. Based on these results, we consider needle puncturing an effective and safe technique for progressive correction of MCL tightness during minimally invasive total knee arthroplasty. Copyright 2011, SLACK Incorporated.

  20. A Novel Technique to Detect Code for SAC-OCDMA System

    NASA Astrophysics Data System (ADS)

    Bharti, Manisha; Kumar, Manoj; Sharma, Ajay K.

    2018-04-01

    The main task of an optical code division multiple access (OCDMA) system is the detection of the code used by a user in the presence of multiple access interference (MAI). In this paper, a new method of detection, known as XOR subtraction detection, for spectral amplitude coding OCDMA (SAC-OCDMA) based on double weight codes is proposed and presented. As MAI is the main source of performance deterioration in OCDMA systems, the SAC technique is used in this paper to eliminate the effect of MAI to a large extent. A comparative analysis is then made between the proposed scheme and other conventional detection schemes, such as complementary subtraction detection, AND subtraction detection and NAND subtraction detection. The system performance is characterized by Q-factor, BER and received optical power (ROP) with respect to input laser power and fiber length. The theoretical and simulation investigations reveal that the proposed detection technique provides a better quality factor, security and received power in comparison to the conventional techniques. The wide opening of the eye diagram in the case of the proposed technique also proves its robustness.

  1. Simplified multiple headspace extraction gas chromatographic technique for determination of monomer solubility in water.

    PubMed

    Chai, X S; Schork, F J; DeCinque, Anthony

    2005-04-08

    This paper reports an improved headspace gas chromatographic (GC) technique for the determination of monomer solubilities in water. The method is based on a multiple headspace extraction GC technique developed previously [X.S. Chai, Q.X. Hou, F.J. Schork, J. Appl. Polym. Sci., in press], but with a major modification of the calibration technique. As a result, only a few iterations of headspace extraction and GC measurement are required, which avoids "exhaustive" headspace extraction and thus shortens the experimental time for each analysis. For highly insoluble monomers, effort must be made to minimize adsorption in the headspace sampling channel, transport conduit and capillary column by using a higher operating temperature and a short capillary column in the headspace sampler and GC system. For highly water-soluble monomers, a new calibration method is proposed. The combination of these modifications results in a method that is simple, rapid and automated. While the current focus of the authors is on the determination of monomer solubility in aqueous solutions, the method should be applicable to the determination of the solubility of any organic compound in water.
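
    The reason a few extractions can suffice is the classical multiple headspace extraction relation: successive peak areas decay geometrically, so the exhaustive total is the sum of a geometric series and can be extrapolated from the first few measured areas. A sketch with made-up numbers (the averaged-ratio fit is illustrative, not the paper's calibration procedure):

```python
def mhe_total_area(areas):
    """Extrapolate the exhaustive-extraction total A1 / (1 - q) from a few
    peak areas, where q is the per-extraction geometric decay ratio."""
    ratios = [a2 / a1 for a1, a2 in zip(areas, areas[1:])]
    q = sum(ratios) / len(ratios)      # average decay ratio over the steps
    return areas[0] / (1.0 - q)        # geometric series sum

# Simulated peak areas decaying with q = 0.4: 1000, 400, 160, ...
areas = [1000.0, 400.0, 160.0]
total = mhe_total_area(areas)          # equals 1000 / 0.6 for these data
```

    Three measured areas stand in for the dozens an exhaustive extraction would need, which is the time saving the abstract describes.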

  2. Wavelet regression model in forecasting crude oil price

    NASA Astrophysics Data System (ADS)

    Hamid, Mohd Helmie; Shabri, Ani

    2017-05-01

    This study presents the performance of the wavelet multiple linear regression (WMLR) technique in daily crude oil forecasting. The WMLR model was developed by integrating the discrete wavelet transform (DWT) and the multiple linear regression (MLR) model. The original time series was decomposed into sub-time series at different scales by wavelet theory. Correlation analysis was conducted to assist in the selection of optimal decomposed components as inputs for the WMLR model. The daily WTI crude oil price series was used in this study to test the prediction capability of the proposed model. The forecasting performance of the WMLR model was also compared with regular multiple linear regression (MLR), autoregressive integrated moving average (ARIMA) and generalized autoregressive conditional heteroscedasticity (GARCH) models using root mean square error (RMSE) and mean absolute error (MAE). Based on the experimental results, the WMLR model performs better than the other forecasting techniques tested in this study.
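
    The decomposition step of the pipeline can be sketched with a one-level Haar transform (the abstract does not name the mother wavelet; Haar is assumed here purely for illustration). The resulting approximation and detail subseries are what would feed the MLR model as inputs:

```python
import math

def haar_dwt(series):
    """One-level Haar DWT: pairwise scaled sums (approximation) and
    differences (detail); series length must be even."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(series[0::2], series[1::2])]
    detail = [(a - b) * s for a, b in zip(series[0::2], series[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse transform: exact reconstruction of the original series."""
    s = 1.0 / math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

prices = [52.1, 53.4, 51.8, 50.9, 52.6, 53.0]   # toy daily price series
approx, detail = haar_dwt(prices)
reconstructed = haar_idwt(approx, detail)        # matches the input exactly
```

    Correlation between each subseries and the forecasting target would then decide which components enter the regression, as described in the abstract.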

  3. A multiple scales approach to maximal superintegrability

    NASA Astrophysics Data System (ADS)

    Gubbiotti, G.; Latini, D.

    2018-07-01

    In this paper we present a simple, algorithmic test to establish if a Hamiltonian system is maximally superintegrable or not. This test is based on a very simple corollary of a theorem due to Nekhoroshev and on a perturbative technique called the multiple scales method. If the outcome is positive, this test can be used to suggest maximal superintegrability, whereas when the outcome is negative it can be used to disprove it. This method can be regarded as a finite dimensional analog of the multiple scales method as a way to produce soliton equations. We use this technique to show that the real counterpart of a mechanical system found by Jules Drach in 1935 is, in general, not maximally superintegrable. We give some hints on how this approach could be applied to classify maximally superintegrable systems by presenting a direct proof of the well-known Bertrand’s theorem.

  4. Modeling and Simulation of a Novel Relay Node Based Secure Routing Protocol Using Multiple Mobile Sink for Wireless Sensor Networks.

    PubMed

    Perumal, Madhumathy; Dhandapani, Sivakumar

    2015-01-01

    Data gathering and optimal path selection for wireless sensor networks (WSNs) using existing protocols result in collisions, and increased collisions raise the probability of packet drops. There is therefore a need to eliminate collisions during data aggregation while improving energy efficiency and maintaining security. This paper proposes a reliable, energy-efficient and secure WSN routing protocol with minimum delay, named the relay node based secure routing protocol for multiple mobile sink (RSRPMS). This protocol finds the rendezvous point for optimal transmission of data using a "splitting tree" technique in a tree-shaped network topology; all subsequent positions of a sink are then determined using the "Biased Random Walk" model. In case of an event, the sink gathers the data from all sources when they are in the sensing range of the rendezvous point; otherwise, a relay node is selected from its neighbors to transfer packets from the rendezvous point to the sink. Symmetric key cryptography is used for secure transmission. The proposed RSRPMS protocol is evaluated in simulation, and the results are compared with the Intelligent Agent-Based Routing (IAR) protocol to show an increase in network lifetime compared with other routing protocols.

  5. River velocities from sequential multispectral remote sensing images

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Mied, Richard P.

    2013-06-01

    We address the problem of extracting surface velocities from a pair of multispectral remote sensing images over rivers using a new nonlinear multiple-tracer form of the global optimal solution (GOS). The derived velocity field is a valid solution across the image domain to the nonlinear system of equations obtained by minimizing a cost function inferred from the conservation constraint equations for multiple tracers. This is done by deriving an iteration equation for the velocity, based on the multiple-tracer displaced frame difference equations, and a local approximation to the velocity field. The number of velocity equations is greater than the number of velocity components, so the solution is over-constrained. The iterative technique uses Gauss-Newton and Levenberg-Marquardt methods and our own algorithm of the progressive relaxation of the over-constraint. We demonstrate the nonlinear multiple-tracer GOS technique with sequential multispectral Landsat and ASTER images over a portion of the Potomac River in MD/VA, and derive a dense field of accurate velocity vectors. We compare the GOS river velocities with those from over 12 years of data at four NOAA reference stations, and find good agreement. We discuss how to find the appropriate spatial and temporal resolutions to allow optimization of the technique for specific rivers.

  6. 3D shape measurement of moving object with FFT-based spatial matching

    NASA Astrophysics Data System (ADS)

    Guo, Qinghua; Ruan, Yuxi; Xi, Jiangtao; Song, Limei; Zhu, Xinjun; Yu, Yanguang; Tong, Jun

    2018-03-01

    This work presents a new technique for 3D shape measurement of a moving object in translational motion, which finds applications in online inspection, quality control, etc. A low-complexity 1D fast Fourier transform (FFT)-based spatial matching approach is devised to obtain accurate object displacement estimates, and it is combined with single-shot fringe pattern profilometry (FPP) techniques to achieve high measurement performance with multiple captured images through coherent combining. The proposed technique overcomes some limitations of existing ones. Specifically, the placement of marks on the object surface and synchronization between projector and camera are not needed, the velocity of the moving object is not required to be constant, and there is no restriction on the movement trajectory. Both simulation and experimental results demonstrate the effectiveness of the proposed technique.
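
    The core of FFT-based spatial matching can be sketched in 1D: a translational displacement appears as the argmax of the circular cross-correlation of two intensity profiles, and the correlation is computed cheaply via FFTs. A toy, noiseless sketch (invented signals, not the paper's algorithm; this minimal FFT needs a power-of-two length):

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT (length must be a power of two)."""
    n = len(x)
    if n == 1:
        return [complex(x[0])]
    even, odd = fft(x[0::2]), fft(x[1::2])
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + tw[k] for k in range(n // 2)] + \
           [even[k] - tw[k] for k in range(n // 2)]

def ifft(x):
    """Inverse FFT via the conjugation trick."""
    y = fft([v.conjugate() for v in x])
    return [v.conjugate() / len(x) for v in y]

def estimate_shift(a, b):
    """Integer circular shift s such that b[k] == a[(k - s) % len(a)]."""
    fa, fb = fft(a), fft(b)
    corr = ifft([u * v.conjugate() for u, v in zip(fb, fa)])
    return max(range(len(corr)), key=lambda k: corr[k].real)

a = [0.0] * 16
a[3], a[4], a[7] = 5.0, 2.0, 1.0            # a distinctive intensity profile
b = [a[(k - 5) % 16] for k in range(16)]    # the same profile shifted by 5
shift = estimate_shift(a, b)                # recovers the displacement
```

    Once the displacement is known, the fringe images can be registered and coherently combined, which is the step the abstract credits for the measurement gain.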

  7. Rayleigh Scattering Diagnostic for Simultaneous Measurements of Dynamic Density and Velocity

    NASA Technical Reports Server (NTRS)

    Seasholtz, Richard G.; Panda, J.

    2000-01-01

    A flow diagnostic technique based on the molecular Rayleigh scattering of laser light is used to obtain dynamic density and velocity data in turbulent flows. The technique is based on analyzing the Rayleigh scattered light with a Fabry-Perot interferometer and recording information about the interference pattern with a multiple anode photomultiplier tube (PMT). An artificial neural network is used to process the signals from the PMT to recover the velocity time history, which is then used to calculate the velocity power spectrum. The technique is illustrated using simulated data. The results of an experiment to measure the velocity power spectrum in a low speed (100 m/sec) flow are also presented.

  8. A literature review on anesthetic practice for carotid endarterectomy surgery based on cost, hemodynamic stability, and neurologic status.

    PubMed

    Meitzner, Mark C; Skurnowicz, Julie A; Mitchell, Anne

    2007-06-01

    An extensive literature review was undertaken to evaluate the best anesthetic practice for carotid endarterectomy surgery. Two anesthetic techniques were evaluated: general anesthetic with an endotracheal tube and regional anesthetic block. Three variables were reviewed with respect to significant clinical outcomes based on anesthetic technique. Relevant literature was obtained through multiple sources that included professional journals, a professional website, and textbooks. According to the literature, there is an advantage to performing regional anesthesia with respect to cost and neurologic status. Information analyzed was inconclusive with respect to hemodynamic stability and anesthetic technique. We conclude that regional anesthesia may have some slight advantages; however, more investigation is warranted.

  9. Design and implementation of the modified signed digit multiplication routine on a ternary optical computer.

    PubMed

    Xu, Qun; Wang, Xianchao; Xu, Chao

    2017-06-01

    Multiplication on traditional electronic computers suffers from limited calculating accuracy and long computation delays. To overcome these problems, a modified signed digit (MSD) multiplication routine is established based on the MSD system and the carry-free adder, and its parallel algorithm and optimization techniques are studied in detail. Exploiting the characteristics of a ternary optical computer, a structured data processor is designed especially for the multiplication routine. Several ternary optical operators are constructed to perform M transformations and summations in parallel, which accelerates the iterative process of multiplication. In particular, the routine allocates the data bits of the ternary optical processor based on the digits of the multiplication input, so the accuracy of the calculation results can always satisfy the user's requirements. Finally, the routine is verified by simulation experiments, and the results fully meet expectations. Compared with an electronic computer, the MSD multiplication routine is not only well suited to large-value data and high-precision arithmetic, but also maintains lower power consumption and shorter calculating delays.
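
    The signed-digit arithmetic can be sketched in scalar form: one operand is recoded with digits in {-1, 0, 1} (the non-adjacent form is used here for illustration; the paper's parallel carry-free optical adders are not reproduced), and the product is accumulated from shifted, possibly negated partial products:

```python
def to_naf(n):
    """Non-adjacent signed-digit form of n >= 0, least significant first.
    Digits are in {-1, 0, 1} and no two adjacent digits are nonzero."""
    digits = []
    while n:
        if n & 1:
            d = 2 - (n % 4)   # +1 or -1, chosen so the next bit becomes 0
            n -= d
        else:
            d = 0
        digits.append(d)
        n >>= 1
    return digits

def msd_multiply(a, b):
    """Sum the shifted partial products a * d * 2**i over the digits of b."""
    return sum((a * d) << i for i, d in enumerate(to_naf(b)))
```

    On the optical processor these partial products would be combined by carry-free MSD adders in parallel; here ordinary Python integer addition stands in for them.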

  10. Classifying magnetic resonance image modalities with convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Remedios, Samuel; Pham, Dzung L.; Butman, John A.; Roy, Snehashis

    2018-02-01

    Magnetic Resonance (MR) imaging allows the acquisition of images with different contrast properties depending on the acquisition protocol and the magnetic properties of tissues. Many MR brain image processing techniques, such as tissue segmentation, require multiple MR contrasts as inputs, and each contrast is treated differently. Thus it is advantageous to automate the identification of image contrasts for various purposes, such as facilitating image processing pipelines, and managing and maintaining large databases via content-based image retrieval (CBIR). Most automated CBIR techniques focus on a two-step process: extracting features from data and classifying the image based on these features. We present a novel 3D deep convolutional neural network (CNN)-based method for MR image contrast classification. The proposed CNN automatically identifies the MR contrast of an input brain image volume. Specifically, we explored three classification problems: (1) identify T1-weighted (T1-w), T2-weighted (T2-w), and fluid-attenuated inversion recovery (FLAIR) contrasts, (2) identify pre- vs. post-contrast T1, (3) identify pre- vs. post-contrast FLAIR. A total of 3418 image volumes acquired from multiple sites and multiple scanners were used. To evaluate each task, the proposed model was trained on 2137 images and tested on the remaining 1281 images. Results showed that image volumes were correctly classified with 97.57% accuracy.

  11. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras.

    PubMed

    Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki

    2016-06-24

    Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system.

  12. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras

    PubMed Central

    Jung, Jaehoon; Yoon, Inhye; Lee, Seungwon; Paik, Joonki

    2016-01-01

    Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in a camera has a different shape in another camera, which is a critical issue of wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method by extracting the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately-generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables a successful retrieval and tracking of a human object in the multiple-camera video surveillance system. PMID:27347961

  13. Comparison of Phase-Based 3D Near-Field Source Localization Techniques for UHF RFID.

    PubMed

    Parr, Andreas; Miesen, Robert; Vossiek, Martin

    2016-06-25

    In this paper, we present multiple techniques for phase-based narrowband backscatter tag localization in three-dimensional space with planar antenna arrays or synthetic apertures. Beamformer and MUSIC localization algorithms, known from near-field source localization and direction-of-arrival estimation, are applied to the 3D backscatter scenario and their performance in terms of localization accuracy is evaluated. We discuss the impact of different transceiver modes known from the literature, which evaluate different send and receive antenna path combinations for a single localization, as in multiple input multiple output (MIMO) systems. Furthermore, we propose a new Single-dimensional MIMO (S-MIMO) transceiver mode, which is especially suited for use with mobile robot systems. Monte-Carlo simulations based on a realistic multipath error model ensure spatial correlation of the simulated signals, and serve to critically appraise the accuracies of the different localization approaches. A synthetic uniform rectangular array created by a robotic arm is used to evaluate selected localization techniques. We use an Ultra High Frequency (UHF) Radiofrequency Identification (RFID) setup to compare measurements with the theory and simulation. The results show how a mean localization accuracy of less than 30 cm can be reached in an indoor environment. Further simulations demonstrate how the distance between aperture and tag affects the localization accuracy, and how the size and grid spacing of the rectangular array need to be adapted to improve the localization accuracy down to the centimeter range and to maximize array efficiency in terms of localization accuracy per number of elements.
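
    The beamformer approach can be reduced to a small sketch: predict the round-trip backscatter phase for each antenna at every candidate tag position, and pick the position whose predictions align most coherently with the measured phases. The wavelength, array geometry and search grid below are invented for illustration (a noiseless 1D range search, not the paper's setup):

```python
import cmath, math

WAVELEN = 0.33                                    # ~915 MHz UHF carrier, metres

def backscatter_phase(dist):
    """Round-trip phase accumulated over one antenna-tag distance."""
    return -4.0 * math.pi * dist / WAVELEN

def beamformer_power(antenna_xs, measured, x, y):
    """Magnitude of the coherent sum for a candidate tag position (x, y)."""
    s = sum(cmath.exp(1j * (m - backscatter_phase(math.hypot(x - ax, y))))
            for ax, m in zip(antenna_xs, measured))
    return abs(s)

antennas = [0.0, 0.1, 0.2, 0.3]                   # linear array on the x axis
grid_y = [i * 0.01 for i in range(50, 200)]       # candidate ranges 0.5..2.0 m
true_y = grid_y[73]                               # tag broadside of the array
measured = [backscatter_phase(math.hypot(0.15 - ax, true_y)) for ax in antennas]

# Grid search in the range (y) direction at the array broadside, x = 0.15 m.
est_y = max(grid_y, key=lambda y: beamformer_power(antennas, measured, 0.15, y))
```

    With noisy or multipath-distorted phases the coherent peak broadens and near-ambiguities appear, which is why the paper evaluates accuracy with a spatially correlated multipath error model.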

  14. Speciation and distribution of copper in a mining soil using multiple synchrotron-based bulk and microscopic techniques.

    PubMed

    Yang, Jianjun; Liu, Jin; Dynes, James J; Peak, Derek; Regier, Tom; Wang, Jian; Zhu, Shenhai; Shi, Jiyan; Tse, John S

    2014-02-01

    Molecular-level understanding of soil Cu speciation and distribution assists in the management of Cu contamination in mining sites. In this study, a soil sample, collected from a mining site contaminated since the 1950s, was characterized by multiple complementary synchrotron-based bulk and spatially resolved techniques for the speciation and distribution of Cu as well as other related elements (Fe, Ca, Mn, K, Al, and Si). Bulk X-ray absorption near-edge structure (XANES) and extended X-ray absorption fine structure (EXAFS) spectroscopy revealed that soil Cu was predominantly associated with Fe oxides rather than soil organic matter. This agreed with the closest association of Cu to Fe observed by microscopic X-ray fluorescence (μ-XRF) and scanning transmission X-ray microscopy (STXM) nanoanalysis, along with the non-occurrence of photoreduction of soil Cu(II) by quick Cu L3,2-edge XANES spectroscopy (Q-XANES), which often occurs when Cu organic complexes are present. Furthermore, bulk-EXAFS and STXM-coupled Fe L3,2-edge nano-XANES analysis revealed that soil Cu adsorbed primarily to Fe(III) oxides by inner-sphere complexation. Additionally, Cu K-edge μ-XANES, L3,2-edge bulk-XANES, and successive Q-XANES results identified the presence of Cu2S, rather than radiation-damage artifacts, dominant in certain microsites of the mining soil. This study demonstrates the benefits of combining multiple synchrotron-based techniques for a comprehensive understanding of Cu speciation in a heterogeneous soil matrix, which facilitates prediction of Cu reactivity and environmental fate at the mining site.

  15. Use of Empirical Estimates of Shrinkage in Multiple Regression: A Caution.

    ERIC Educational Resources Information Center

    Kromrey, Jeffrey D.; Hines, Constance V.

    1995-01-01

    The accuracy of four empirical techniques to estimate shrinkage in multiple regression was studied through Monte Carlo simulation. None of the techniques provided unbiased estimates of the population squared multiple correlation coefficient, but the normalized jackknife and bootstrap techniques demonstrated marginally acceptable performance with…
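
    For context, the shrinkage in question is the drop from the sample squared multiple correlation to its population value. The classical formula-based adjustment (Wherry's adjusted R-squared, shown here for orientation; it is not one of the four empirical techniques the study evaluates) gives a feel for its size:

```python
def wherry_adjusted_r2(r2, n, p):
    """Wherry's adjusted R-squared for n observations and p predictors."""
    return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

# A modest sample fit with many predictors shrinks noticeably:
shrunk = wherry_adjusted_r2(0.50, n=30, p=8)   # about 0.31
```

    The empirical techniques in the study instead estimate shrinkage from the data themselves, e.g. by jackknife or bootstrap resampling, rather than from a closed-form degrees-of-freedom correction.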

  16. Multiple template-based fluoroscopic tracking of lung tumor mass without implanted fiducial markers

    NASA Astrophysics Data System (ADS)

    Cui, Ying; Dy, Jennifer G.; Sharp, Gregory C.; Alexander, Brian; Jiang, Steve B.

    2007-10-01

    Precise lung tumor localization in real time is particularly important for some motion management techniques, such as respiratory gating or beam tracking with a dynamic multi-leaf collimator, due to the reduced clinical tumor volume (CTV) to planning target volume (PTV) margin and/or the escalated dose. There might be large uncertainties in deriving tumor position from external respiratory surrogates. While tracking implanted fiducial markers has sufficient accuracy, this procedure may not be widely accepted due to the risk of pneumothorax. Previously, we have developed a technique to generate gating signals from fluoroscopic images without implanted fiducial markers using a template matching method (Berbeco et al 2005 Phys. Med. Biol. 50 4481-90, Cui et al 2007 Phys. Med. Biol. 52 741-55). In this paper, we present an extension of this method to multiple-template matching for directly tracking the lung tumor mass in fluoroscopy video. The basic idea is as follows: (i) during the patient setup session, a pair of orthogonal fluoroscopic image sequences are taken and processed off-line to generate a set of reference templates that correspond to different breathing phases and tumor positions; (ii) during treatment delivery, fluoroscopic images are continuously acquired and processed; (iii) the similarity between each reference template and the processed incoming image is calculated; (iv) the tumor position in the incoming image is then estimated by combining the tumor centroid coordinates in reference templates with proper weights based on the measured similarities. With different handling of image processing and similarity calculation, two such multiple-template tracking techniques have been developed: one based on motion-enhanced templates and Pearson's correlation score while the other based on eigen templates and mean-squared error. 
The developed techniques have been tested on six sequences of fluoroscopic images from six lung cancer patients against the reference tumor positions manually determined by a radiation oncologist. The tumor centroid coordinates automatically detected using both methods agree well with the manually marked reference locations. The eigenspace tracking method performs slightly better than the motion-enhanced method, with average localization errors less than 2 pixels (1 mm) and the error at a 95% confidence level of about 2-4 pixels (1-2 mm). This work demonstrates the feasibility of direct tracking of a lung tumor mass in fluoroscopic images without implanted fiducial markers using multiple reference templates.
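
    Steps (iii) and (iv) of the tracking loop above can be sketched with toy one-dimensional "images": score the incoming frame against each reference template with Pearson's correlation, then average the template centroids with those scores as weights (illustrative data; clipping negative scores to zero is an assumption of this sketch, not taken from the paper):

```python
import math

def pearson(u, v):
    """Pearson's correlation coefficient between two equal-length signals."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = math.sqrt(sum((a - mu) ** 2 for a in u) *
                    sum((b - mv) ** 2 for b in v))
    return num / den

def weighted_centroid(frame, templates, centroids):
    """Combine template centroids, weighted by similarity to the frame."""
    scores = [max(pearson(frame, t), 0.0) for t in templates]  # clip negatives
    total = sum(scores)
    return sum(w * c for w, c in zip(scores, centroids)) / total

# Reference templates for three breathing phases; tumor centroids 2, 4, 6.
templates = [[0, 1, 5, 1, 0, 0, 0, 0],
             [0, 0, 0, 1, 5, 1, 0, 0],
             [0, 0, 0, 0, 0, 1, 5, 1]]
centroids = [2.0, 4.0, 6.0]
frame = [0, 0, 1, 2, 5, 2, 1, 0]        # incoming frame, tumor near pixel 4
pos = weighted_centroid(frame, templates, centroids)
```

    In the paper the same weighting is applied to 2D motion-enhanced or eigen templates; the 1D vectors here only stand in for flattened image patches.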

  17. Mobile multiple access study

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Multiple access techniques (FDMA, CDMA, TDMA) for the mobile user are discussed, and an attempt is made to identify the current best technique. Traffic loading is considered, as well as voice and data modulation and spacecraft and system design. Emphasis is placed on developing mobile terminal cost estimates for the selected design. In addition, design examples are presented for the alternative multiple access techniques in order to compare them with the selected technique.

  18. Chromatographic peak deconvolution of constitutional isomers by multiple-reaction-monitoring mass spectrometry.

    PubMed

    Trapp, Oliver

    2010-02-12

    Highly efficient and sophisticated separation techniques are available to analyze complex compound mixtures with superior sensitivities and selectivities, often enhanced by a 2nd dimension, e.g. another separation technique or spectroscopic and spectrometric techniques. For enantioselective separations, numerous chiral stationary phases (CSPs) exist to cover a broad range of chiral compounds. Despite these advances, enantioselective separations can become very challenging for mixtures of stereolabile constitutional isomers, because on-column interconversion can lead to completely overlapping peak profiles. Typically, multidimensional separation techniques, e.g. multidimensional GC (MDGC), using an achiral 1st separation dimension and transferring selected analytes to a chiral 2nd separation dimension, are the method of choice for such problems. However, this procedure is very time consuming, and only predefined sections of peaks can be transferred by column switching to the second dimension. Here we demonstrate, for stereolabile 1,2-dialkylated diaziridines, a technique to experimentally deconvolute overlapping gas chromatographic elution profiles of constitutional isomers based on multiple-reaction-monitoring MS (MRM-MS). The technique presented here takes advantage of different fragmentation probabilities and pathways to isolate the elution profiles of the configurational isomers.

  19. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.

    PubMed

    Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K

    2016-07-20

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.
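    The peak separation and width (PSW) idea can be illustrated numerically. The sketch below simulates analogue single-photon-counting outputs with an assumed conversion gain (peak separation) and read noise (peak width), builds a photon counting histogram, and recovers both quantities from the peak positions and the zero-photon peak width. All parameter values are illustrative, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    GAIN = 10.0        # DN per photoelectron (sets the peak separation); assumed
    READ_NOISE = 2.0   # r.m.s. read noise in DN (sets the peak width); assumed

    # Simulate analogue-integrator outputs: quantised photon number plus read noise.
    photons = rng.poisson(1.5, size=200_000)
    samples = GAIN * photons + rng.normal(0.0, READ_NOISE, size=photons.size)

    # Build the photon counting histogram (PCH) and smooth it lightly.
    hist, edges = np.histogram(samples, bins=400, range=(-10.0, 60.0))
    centers = 0.5 * (edges[:-1] + edges[1:])
    smooth = np.convolve(hist, np.ones(9) / 9.0, mode="same")

    # Locate single-photon peaks as contiguous regions above a threshold and take
    # each region's intensity-weighted centroid as the peak position.
    idx = np.flatnonzero(smooth > 0.1 * smooth.max())
    runs = np.split(idx, np.flatnonzero(np.diff(idx) > 1) + 1)
    peaks = np.array([np.average(centers[r], weights=smooth[r]) for r in runs])

    gain_est = np.mean(np.diff(peaks))                 # peak separation -> gain
    zero = samples[np.abs(samples - peaks[0]) < gain_est / 2]
    noise_est = zero.std()                             # peak width -> read noise
    ```

    With deep sub-electron read noise the peaks are well resolved and both estimates land close to the simulated values; as the read noise grows toward the peak separation, the peaks merge and the PSW method degrades, which is exactly what makes it a useful noise diagnostic.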

  20. Rule-based statistical data mining agents for an e-commerce application

    NASA Astrophysics Data System (ADS)

    Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar

    2003-03-01

    Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application spans multiple domains such as statistical analysis, market competition, price comparison, profit improvement and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system fusing intelligent techniques, statistical data mining, and personal information to enhance the QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, was successfully implemented using Java servlets and an Oracle 8i database server. Simulation results have shown that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.

  1. A Comparison of Accuracy of Matrix Impression System with Putty Reline Technique and Multiple Mix Technique: An In Vitro Study.

    PubMed

    Kumar, M Praveen; Patil, Suneel G; Dheeraj, Bhandari; Reddy, Keshav; Goel, Dinker; Krishna, Gopi

    2015-06-01

    The difficulty in obtaining an acceptable impression increases exponentially as the number of abutments increases. Accuracy of the impression material and the use of a suitable impression technique are of utmost importance in the fabrication of a fixed partial denture. This study compared the accuracy of the matrix impression system with the conventional putty reline and multiple mix techniques for individual dies by comparing the inter-abutment distance in the casts obtained from the impressions. Three groups of 10 impressions each were made of a master model using three impression techniques (matrix impression system, putty reline technique and multiple mix technique). Typodont teeth were embedded in a maxillary frasaco model base. The left first premolar was removed to create a three-unit fixed partial denture situation, the left canine and second premolar were prepared conservatively, and hatch marks were made on the abutment teeth. The final casts obtained from the impressions were examined under a profile projector, and the inter-abutment distance was calculated for all the casts and compared. The results from this study showed that in the mesiodistal dimensions the percentage deviation from the master model in Group I was 0.1 and 0.2, in Group II was 0.9 and 0.3, and in Group III was 1.6 and 1.5, respectively. In the labio-palatal dimensions the percentage deviation from the master model in Group I was 0.01 and 0.4, in Group II was 1.9 and 1.3, and in Group III was 2.2 and 2.0, respectively. In the cervico-incisal dimensions the percentage deviation from the master model in Group I was 1.1 and 0.2, in Group II was 3.9 and 1.7, and in Group III was 1.9 and 3.0, respectively. In the inter-abutment dimension of the dies, the percentage deviation from the master model in Group I was 0.1, in Group II was 0.6, and in Group III was 1.0.
The matrix impression system showed more accuracy of reproduction for individual dies when compared with putty reline technique and multiple mix technique in all the three directions, as well as the inter-abutment distance.

  2. Bunny hops: using multiplicities of zeroes in calculus for graphing

    NASA Astrophysics Data System (ADS)

    Miller, David; Deshler, Jessica M.; Hansen, Ryan

    2016-07-01

    Students learn a lot of material in each mathematics course they take. However, they are not always able to make meaningful connections between content in successive mathematics courses. This paper reports on a technique to address a common topic in calculus I courses (intervals of increase/decrease and concave up/down) while also making use of students' pre-existing knowledge about the behaviour of functions around zeroes based on multiplicities.
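    The connection the paper exploits can be seen in a small numerical example: at a zero of even multiplicity a polynomial bounces (no sign change), at a zero of odd multiplicity it crosses, and the same rule applied to the zeros of f' determines intervals of increase and decrease. A minimal check, using a polynomial of our own choosing:

    ```python
    import numpy as np

    # f(x) = (x - 1)^2 (x + 2)^3: zero of multiplicity 2 at x = 1, multiplicity 3 at x = -2.
    f = lambda x: (x - 1.0) ** 2 * (x + 2.0) ** 3
    # f'(x) factors as (x - 1)(x + 2)^2 (5x + 1), worked out by the product rule.
    fp = lambda x: (x - 1.0) * (x + 2.0) ** 2 * (5.0 * x + 1.0)

    eps = 1e-3
    bounce = np.sign(f(1 - eps)) == np.sign(f(1 + eps))    # even multiplicity: graph bounces
    cross = np.sign(f(-2 - eps)) != np.sign(f(-2 + eps))   # odd multiplicity: graph crosses
    # Same rule on f': the even-multiplicity zero of f' at x = -2 gives no sign
    # change, so f has no local extremum there despite f'(-2) = 0.
    no_extremum = np.sign(fp(-2 - eps)) == np.sign(fp(-2 + eps))
    ```

    Reading multiplicities off the factored derivative lets students classify critical points without a full sign chart, which is the connection to prior knowledge the paper builds on.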

  3. Parallel Implementation of the Wideband DOA Algorithm on the IBM Cell BE Processor

    DTIC Science & Technology

    2010-05-01

    Abstract—The Multiple Signal Classification ( MUSIC ) algorithm is a powerful technique for determining the Direction of Arrival (DOA) of signals...Broadband Engine Processor (Cell BE). The process of adapting the serial based MUSIC algorithm to the Cell BE will be analyzed in terms of parallelism and...using Multiple Signal Classification MUSIC algorithm [4] • Computation of Focus matrix • Computation of number of sources • Separation of Signal

  4. Polarization manipulation in single refractive prism based holography lithography

    NASA Astrophysics Data System (ADS)

    Xiong, Wenjie; Xu, Yi; Xiao, Yujian; Lv, Xiaoxu; Wu, Lijun

    2015-01-01

    We propose theoretically and demonstrate experimentally a simple but effective strategy for polarization manipulation in single refractive prism based holographic lithography. By tuning the polarization of a single laser beam, we can obtain a high-contrast pill-shaped interference pattern that would otherwise require the complex optical setup and multiple polarizers of conventional holographic lithography. Fabrication of pill-shaped two-dimensional polymer photonic crystals using one-beam, one-shot holographic lithography is shown as an example to support our theoretical results. This integrated polarization manipulation technique can relax the crucial stability restrictions imposed on multiple-beam holographic lithography.

  5. Physical layer security in fiber-optic MIMO-SDM systems: An overview

    NASA Astrophysics Data System (ADS)

    Guan, Kyle; Cho, Junho; Winzer, Peter J.

    2018-02-01

    Fiber-optic transmission systems provide large capacities over enormous distances but are vulnerable to simple eavesdropping attacks at the physical layer. We classify key-based and keyless encryption and physical layer security techniques and discuss them in the context of optical multiple-input-multiple-output space-division multiplexed (MIMO-SDM) fiber-optic communication systems. We show that MIMO-SDM not only increases system capacity, but also ensures the confidentiality of information transmission. Based on recent numerical and experimental results, we review how the unique channel characteristics of MIMO-SDM can be exploited to provide various levels of physical layer security.

  6. Sensor fusion III: 3-D perception and recognition; Proceedings of the Meeting, Boston, MA, Nov. 5-8, 1990

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S. (Editor)

    1991-01-01

    The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.

  7. A subjective scheduler for subjective dedicated networks

    NASA Astrophysics Data System (ADS)

    Suherman; Fakhrizal, Said Reza; Al-Akaidi, Marwan

    2017-09-01

    Multiple access is one of the important techniques within the medium access layer of the TCP/IP protocol stack. Each network technology implements its selected access method. Priority can be implemented within those methods to differentiate services. Some internet networks are dedicated to a specific purpose. Education browsing or tutorial video access may be preferred in a library hotspot, while entertainment and sport contents could be subject to limitation. Current solutions may use IP address filters or access lists. This paper proposes using subjective properties of users or applications to determine priority in multiple access techniques. The NS-2 simulator is employed to evaluate the method. A video surveillance network using WiMAX is chosen as the object. Subjective priority is implemented in the WiMAX scheduler based on traffic properties. Three monitoring-video traffic sources (palace, park, and market) are evaluated. The proposed subjective scheduler prioritizes the palace monitoring video, which results in better quality (xx dB) than the other monitoring spots.

  8. Analysis of dense-medium light scattering with applications to corneal tissue: experiments and Monte Carlo simulations.

    PubMed

    Kim, K B; Shanyfelt, L M; Hahn, D W

    2006-01-01

    Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed to make use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations are reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantify dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering, and animal studies, are necessary.

  9. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background Inflammatory mediators can serve as biomarkers for the monitoring of the disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the level of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR (R) detection system (fluorescence based) rather than chemiluminescence and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity up to 17 pg/ml and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily to the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints in sample sizes and/or budget. PMID:25022797

  10. Link Correlation Based Transmit Sector Antenna Selection for Alamouti Coded OFDM

    NASA Astrophysics Data System (ADS)

    Ahn, Chang-Jun

    In MIMO systems, the deployment of a multiple antenna technique can enhance system performance. However, since the cost of RF transmitters is much higher than that of antennas, there is growing interest in techniques that use a larger number of antennas than RF transmitters. These methods rely on selecting the optimal transmitter antennas and connecting them to the respective RF transmitters. In this case, feedback information (FBI) is required to select the optimal transmitter antenna elements. Since FBI is control overhead, the rate of the feedback is limited. This motivates the study of limited feedback techniques, where only partial or quantized information from the receiver is conveyed back to the transmitter. However, in MIMO/OFDM systems, it is difficult to develop an effective FBI quantization method for choosing the space-time, space-frequency, or space-time-frequency processing due to the numerous subchannels. Moreover, MIMO/OFDM systems require antenna separation of 5 ∼ 10 wavelengths to keep the correlation coefficient below 0.7 to achieve a diversity gain. In this case, the base station requires a large space to set up multiple antennas. To reduce these problems, in this paper we propose link correlation based transmit sector antenna selection for Alamouti coded OFDM without FBI.
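    A toy version of correlation-based transmit antenna selection (our own simplified criterion, not the paper's FBI-free sector algorithm) picks the antenna subset whose channel columns are least mutually correlated, since low correlation is what preserves the Alamouti diversity gain:

    ```python
    import numpy as np
    from itertools import combinations

    def select_antennas(H, n_select):
        """Pick the transmit-antenna subset with the lowest worst-case pairwise
        correlation between channel columns (hypothetical selection criterion)."""
        best, best_score = None, np.inf
        for subset in combinations(range(H.shape[1]), n_select):
            cols = H[:, subset]
            G = np.abs(cols.conj().T @ cols)       # magnitude Gram matrix
            norms = np.sqrt(np.diag(G))
            C = G / np.outer(norms, norms)          # normalised correlations
            score = np.max(np.abs(C - np.eye(n_select)))  # worst pairwise value
            if score < best_score:
                best, best_score = subset, score
        return best

    # 4 receive x 4 transmit Rayleigh channel; antenna 3 nearly duplicates antenna 0.
    rng = np.random.default_rng(3)
    H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
    H[:, 3] = H[:, 0] * 1.01
    chosen = select_antennas(H, 2)
    ```

    The exhaustive search is fine for a handful of antennas; the highly correlated pair (0, 3) is never chosen because its normalised correlation is essentially 1.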

  11. Improving wave forecasting by integrating ensemble modelling and machine learning

    NASA Astrophysics Data System (ADS)

    O'Donncha, F.; Zhang, Y.; James, S. C.

    2017-12-01

    Modern smart-grid networks use technologies to instantly relay information on supply and demand to support effective decision making. Integration of renewable-energy resources with these systems demands accurate forecasting of energy production (and demand) capacities. For wave-energy converters, this requires wave-condition forecasting to enable estimates of energy production. Current operational wave forecasting systems exhibit substantial errors with wave-height RMSEs of 40 to 60 cm being typical, which limits the reliability of energy-generation predictions thereby impeding integration with the distribution grid. In this study, we integrate physics-based models with statistical learning aggregation techniques that combine forecasts from multiple, independent models into a single "best-estimate" prediction of the true state. The Simulating Waves Nearshore physics-based model is used to compute wind- and currents-augmented waves in the Monterey Bay area. Ensembles are developed based on multiple simulations perturbing input data (wave characteristics supplied at the model boundaries and winds) to the model. A learning-aggregation technique uses past observations and past model forecasts to calculate a weight for each model. The aggregated forecasts are compared to observation data to quantify the performance of the model ensemble and aggregation techniques. The appropriately weighted ensemble model outperforms an individual ensemble member with regard to forecasting wave conditions.
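    The learning-aggregation step can be sketched under the simplifying assumption that each ensemble member is weighted by its inverse mean-squared error on past observations (the paper's actual aggregation rule may differ; all data below are synthetic):

    ```python
    import numpy as np

    def aggregate(forecasts, past_forecasts, past_obs):
        """Combine member forecasts with weights learned from past skill
        (inverse-MSE weighting; a hypothetical stand-in for the paper's rule)."""
        mse = np.mean((past_forecasts - past_obs) ** 2, axis=1)  # one MSE per member
        w = 1.0 / mse
        w /= w.sum()
        return w @ forecasts

    rng = np.random.default_rng(2)
    truth = rng.uniform(1.0, 3.0, size=50)        # past observed wave heights (m)
    # Three members with increasing past error levels (0.1, 0.5, 1.0 m r.m.s.).
    past = truth + np.array([[0.1], [0.5], [1.0]]) * rng.normal(size=(3, 50))
    members = np.array([2.1, 2.6, 1.4])           # current member forecasts (m)
    best = aggregate(members, past, truth)
    ```

    The historically most skillful member dominates the weighted forecast, which is why the aggregated ensemble outperforms a plain member (or an unweighted mean) when member skill differs.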

  12. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  13. Optical multiple-image authentication based on cascaded phase filtering structure

    NASA Astrophysics Data System (ADS)

    Wang, Q.; Alfalou, A.; Brosseau, C.

    2016-10-01

    In this study, we report on recent developments in optical image authentication algorithms. Compared with conventional optical encryption, optical image authentication achieves greater security because such methods do not need to fully recover the plaintext during the decryption period. Several recently proposed authentication systems are briefly introduced. We also propose a novel multiple-image authentication system, where multiple original images are encoded into a photon-limited encoded image by using a triple-plane based phase retrieval algorithm and the photon counting imaging (PCI) technique. One can only recover a noise-like image using correct keys. To check the authenticity of multiple images, a nonlinear fractional correlation is employed to recognize the original information hidden in the decrypted results. The proposal can be implemented optically using a cascaded phase filtering configuration. Computer simulation results are presented to evaluate the performance of this proposal and its effectiveness.

  14. Identifying content-based and relational techniques to change behaviour in motivational interviewing.

    PubMed

    Hardcastle, Sarah J; Fortier, Michelle; Blake, Nicola; Hagger, Martin S

    2017-03-01

    Motivational interviewing (MI) is a complex intervention comprising multiple techniques aimed at changing health-related motivation and behaviour. However, MI techniques have not been systematically isolated and classified. This study aimed to identify the techniques unique to MI, classify them as content-related or relational, and evaluate the extent to which they overlap with techniques from the behaviour change technique taxonomy version 1 [BCTTv1; Michie, S., Richardson, M., Johnston, M., Abraham, C., Francis, J., Hardeman, W., … Wood, C. E. (2013). The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: Building an international consensus for the reporting of behavior change interventions. Annals of Behavioral Medicine, 46, 81-95]. Behaviour change experts (n = 3) content-analysed MI techniques based on Miller and Rollnick's [(2013). Motivational interviewing: Preparing people for change (3rd ed.). New York: Guildford Press] conceptualisation. Each technique was then coded for independence and uniqueness by independent experts (n = 10). The experts also compared each MI technique to those from the BCTTv1. Experts identified 38 distinct MI techniques with high agreement on clarity, uniqueness, preciseness, and distinctiveness ratings. Of the identified techniques, 16 were classified as relational techniques. The remaining 22 techniques were classified as content based. Sixteen of the MI techniques were identified as having substantial overlap with techniques from the BCTTv1. The isolation and classification of MI techniques will provide researchers with the necessary tools to clearly specify MI interventions and test the main and interactive effects of the techniques on health behaviour. 
The distinction between relational and content-based techniques within MI is also an important advance, recognising that changes in motivation and behaviour in MI are a function of both intervention content and the interpersonal style in which the content is delivered.

  15. (DCT-FY08) Target Detection Using Multiple Modality Airborne and Ground Based Sensors

    DTIC Science & Technology

    2013-03-01

    Plenoptic modeling: an image-based rendering system,” in SIGGRAPH ’95: Proceedings of the 22nd annual conference on Computer graphics and interactive...techniques. New York, NY, USA: ACM, 1995, pp. 39–46. [21] D. G. Aliaga and I. Carlbom, “ Plenoptic stitching: a scalable method for reconstructing 3D

  16. Heat Transfer Search Algorithm for Non-convex Economic Dispatch Problems

    NASA Astrophysics Data System (ADS)

    Hazra, Abhik; Das, Saborni; Basu, Mousumi

    2018-06-01

    This paper presents the Heat Transfer Search (HTS) algorithm for the non-linear economic dispatch problem. The HTS algorithm is based on the laws of thermodynamics and heat transfer. The proficiency of the suggested technique has been demonstrated on three dissimilar, complicated economic dispatch problems featuring the valve point effect; prohibited operating zones; and multiple fuels with the valve point effect. Test results acquired from the suggested technique for the economic dispatch problem have been compared with those obtained from other reported evolutionary techniques. It has been observed that the suggested HTS produces superior solutions.

  18. Multi-carrier mobile TDMA system with active array antenna

    NASA Technical Reports Server (NTRS)

    Suzuki, Ryutaro; Matsumoto, Yasushi; Hamamoto, Naokazu

    1990-01-01

    A multi-carrier time division multiple access (TDMA) is proposed for the future mobile satellite communications systems that include a multi-satellite system. This TDMA system employs the active array antenna in which the digital beam forming technique is adopted to control the antenna beam direction. The antenna beam forming is carried out at the base band frequency by using the digital signal processing technique. The time division duplex technique is applied for the TDM/TDMA burst format, in order not to overlap transmit and receive timing.
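    The baseband digital beam forming described above amounts to conjugate-matched phase weighting of the array elements. A minimal sketch for a uniform linear array (element count, spacing, and steering angle are illustrative, not the system's actual parameters):

    ```python
    import numpy as np

    def steering_vector(n_elem, spacing_wl, theta_deg):
        """Array response of an n_elem uniform linear array (spacing in wavelengths)."""
        n = np.arange(n_elem)
        return np.exp(1j * 2 * np.pi * spacing_wl * n * np.sin(np.radians(theta_deg)))

    def beam_gain(w, spacing_wl, theta_deg):
        """|w^H v(theta)|: normalised array gain towards angle theta for weights w."""
        return abs(np.vdot(w, steering_vector(w.size, spacing_wl, theta_deg)))

    # Conjugate-match (phase-steer) an 8-element, half-wavelength array to +20 degrees.
    w = steering_vector(8, 0.5, 20.0) / 8
    on_beam = beam_gain(w, 0.5, 20.0)     # unity gain in the steered direction
    off_beam = beam_gain(w, 0.5, -40.0)   # strongly attenuated elsewhere
    ```

    Because the weights are applied digitally at baseband, re-steering the beam between TDMA bursts is just a change of the complex weight vector, with no mechanical motion.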

  19. Towards Efficient Decoding of Multiple Classes of Motor Imagery Limb Movements Based on EEG Spectral and Time Domain Descriptors.

    PubMed

    Samuel, Oluwarotimi Williams; Geng, Yanjuan; Li, Xiangxin; Li, Guanglin

    2017-10-28

    To control multiple degrees of freedom (MDoF) upper limb prostheses, pattern recognition (PR) of electromyogram (EMG) signals has been successfully applied. This technique requires amputees to provide sufficient EMG signals to decode their limb movement intentions (LMIs). However, amputees with neuromuscular disorder/high level amputation often cannot provide sufficient EMG control signals, and thus the applicability of the EMG-PR technique is limited especially to this category of amputees. As an alternative approach, electroencephalograph (EEG) signals recorded non-invasively from the brain have been utilized to decode the LMIs of humans. However, most of the existing EEG based limb movement decoding methods primarily focus on identifying limited classes of upper limb movements. In addition, investigation on EEG feature extraction methods for the decoding of multiple classes of LMIs has rarely been considered. Therefore, 32 EEG feature extraction methods (including 12 spectral domain descriptors (SDDs) and 20 time domain descriptors (TDDs)) were used to decode multiple classes of motor imagery patterns associated with different upper limb movements based on 64-channel EEG recordings. From the obtained experimental results, the best individual TDD achieved an accuracy of 67.05 ± 3.12% as against 87.03 ± 2.26% for the best SDD. By applying a linear feature combination technique, an optimal set of combined TDDs recorded an average accuracy of 90.68% while that of the SDDs achieved an accuracy of 99.55% which were significantly higher than those of the individual TDD and SDD at p < 0.05. Our findings suggest that optimal feature set combination would yield a relatively high decoding accuracy that may improve the clinical robustness of MDoF neuroprosthesis. The study was approved by the ethics committee of Institutional Review Board of Shenzhen Institutes of Advanced Technology, and the reference number is SIAT-IRB-150515-H0077.
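    As a rough illustration of how time domain descriptors (TDDs) are extracted per channel and combined into one feature vector, the sketch below computes a small generic subset of classic descriptors (not the paper's exact 20) over a synthetic 64-channel epoch:

    ```python
    import numpy as np

    def time_domain_descriptors(x):
        """Four classic time-domain descriptors for one channel (an illustrative
        subset; the paper evaluates 20 TDDs and 12 spectral descriptors)."""
        return np.array([
            np.sqrt(np.mean(x ** 2)),            # root mean square
            np.mean(np.abs(x)),                  # mean absolute value
            np.sum(np.abs(np.diff(x))),          # waveform length
            np.sum(np.diff(np.sign(x)) != 0),    # zero crossings
        ])

    def feature_vector(epoch):
        # Concatenate per-channel descriptors into one vector for the classifier.
        return np.concatenate([time_domain_descriptors(ch) for ch in epoch])

    rng = np.random.default_rng(6)
    epoch = rng.normal(size=(64, 256))   # 64 EEG channels x 256 samples
    fv = feature_vector(epoch)
    ```

    The linear feature combination step reported in the paper then operates on such concatenated vectors, selecting the descriptor subset that maximises decoding accuracy.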

  20. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  1. A Synthesis of Star Calibration Techniques for Ground-Based Narrowband Electron-Multiplying Charge-Coupled Device Imagers Used in Auroral Photometry

    NASA Technical Reports Server (NTRS)

    Grubbs, Guy II; Michell, Robert; Samara, Marilia; Hampton, Don; Jahn, Jorg-Micha

    2016-01-01

    A technique is presented for the periodic and systematic calibration of ground-based optical imagers. It is important to have a common system of units (Rayleighs or photon flux) for cross comparison as well as self-comparison over time. With the advancement in technology, the sensitivity of these imagers has improved so that stars can be used for more precise calibration. Background subtraction, flat fielding, star mapping, and other common techniques are combined in deriving a calibration technique appropriate for a variety of ground-based imager installations. Spectral (4278, 5577, and 8446 Å) ground-based imager data with multiple fields of view (19, 47, and 180 deg) are processed and calibrated using the techniques developed. The calibration techniques applied result in intensity measurements in agreement between different imagers using identical spectral filtering, and the intensity at each wavelength observed is within the expected range of auroral measurements. The application of these star calibration techniques, which convert raw imager counts into units of photon flux, makes it possible to do quantitative photometry. The computed photon fluxes, in units of Rayleighs, can be used for the absolute photometry between instruments or as input parameters for auroral electron transport models.
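    The background subtraction, flat fielding, and star-derived scaling steps combine into a simple per-pixel correction. A minimal sketch with synthetic data (the factor counts_per_rayleigh stands in for the star-photometry result; all numbers are illustrative):

    ```python
    import numpy as np

    def calibrate_frame(raw, dark, flat, counts_per_rayleigh):
        """Convert raw imager counts to Rayleighs: subtract the background frame,
        divide out pixel-to-pixel gain (flat field), then apply the star-derived
        counts-per-Rayleigh scale factor. Names are illustrative."""
        corrected = (raw - dark) / flat
        return corrected / counts_per_rayleigh

    rng = np.random.default_rng(4)
    truth_R = np.full((8, 8), 150.0)           # uniform 150 R auroral scene
    flat = rng.uniform(0.8, 1.2, size=(8, 8))  # pixel-to-pixel gain variation
    dark = 40.0                                # background/dark level in counts
    CPR = 2.0                                  # counts per Rayleigh from star fits
    raw = truth_R * CPR * flat + dark          # what the imager would record
    cal = calibrate_frame(raw, dark, flat, CPR)
    ```

    With the scale factor tied to catalogued stellar fluxes rather than a lab source, the same pipeline keeps different imagers on a common Rayleigh scale over time.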

  2. A Wideband Satcom Based Avionics Network with CDMA Uplink and TDM Downlink

    NASA Technical Reports Server (NTRS)

    Agrawal, D.; Johnson, B. S.; Madhow, U.; Ramchandran, K.; Chun, K. S.

    2000-01-01

    The purpose of this paper is to describe some key technical ideas behind our vision of a future satcom based digital communication network for avionics applications The key features of our design are as follows: (a) Packetized transmission to permit efficient use of system resources for multimedia traffic; (b) A time division multiplexed (TDM) satellite downlink whose physical layer is designed to operate the satellite link at maximum power efficiency. We show how powerful turbo codes (invented originally for linear modulation) can be used with nonlinear constant envelope modulation, thus permitting the satellite amplifier to operate in a power efficient nonlinear regime; (c) A code division multiple access (CDMA) satellite uplink, which permits efficient access to the satellite from multiple asynchronous users. Closed loop power control is difficult for bursty packetized traffic, especially given the large round trip delay to the satellite. We show how adaptive interference suppression techniques can be used to deal with the ensuing near-far problem; (d) Joint source-channel coding techniques are required both at the physical and the data transport layer to optimize the end-to-end performance. We describe a novel approach to multiple description image encoding at the data transport layer in this paper.
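    Feature (c) rests on standard direct-sequence CDMA: each uplink user spreads its symbols with its own pseudo-random code, and the receiver despreads with a matched filter. A chip-synchronous toy example (real uplinks are asynchronous and need the interference suppression discussed above; the codes and sizes here are arbitrary):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    L = 32                                     # chips per symbol (spreading factor), assumed
    codes = np.sign(rng.normal(size=(2, L)))   # one pseudo-random +/-1 code per user

    bits = np.array([[1, -1,  1],              # user 0 BPSK symbols
                     [1,  1, -1]])             # user 1 BPSK symbols

    # Chip-synchronous toy uplink: each user's symbols spread by its code, then summed.
    tx = sum(np.kron(bits[u], codes[u]) for u in range(2))
    rx = tx + 0.5 * rng.normal(size=tx.size)   # additive channel noise

    # Matched-filter despreading: correlate each symbol interval with user 0's code.
    user0 = np.sign(rx.reshape(3, L) @ codes[0] / L)
    ```

    The despread decision statistic is dominated by the desired user's term (gain L) while the other user contributes only a small cross-correlation, which is what breaks down in the near-far regime and motivates the adaptive suppression techniques described in the paper.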

  3. PHEA-PLA biocompatible nanoparticles by technique of solvent evaporation from multiple emulsions.

    PubMed

    Cavallaro, Gennara; Craparo, Emanuela Fabiola; Sardo, Carla; Lamberti, Gaetano; Barba, Anna Angela; Dalmoro, Annalisa

    2015-11-30

    Nanocarriers of amphiphilic polymeric materials represent versatile delivery systems for poorly water soluble drugs. In this work the technique of solvent evaporation from multiple emulsions was applied to produce nanovectors based on a new amphiphilic copolymer, α,β-poly(N-2-hydroxyethyl)-DL-aspartamide-polylactic acid (PHEA-PLA), purposely synthesized to be used in the controlled release of active molecules poorly soluble in water. To this aim an amphiphilic derivative of PHEA, a hydrophilic polymer, was synthesized by derivatization of the polymeric backbone with hydrophobic grafts of polylactic acid (PLA). The copolymer was then used to produce nanoparticles loaded with α-tocopherol (vitamin E), adopted as a lipophilic model molecule. By applying a protocol based on solvent evaporation from multiple emulsions assisted by ultrasonic energy and optimizing the emulsification process (solvent selection/separation stages), PHEA-PLA nanostructured particles with total α-tocopherol entrapment efficiency (100%) were obtained. Drug release is expected to occur over shorter times than from PLA alone owing to the presence of the hydrophilic PHEA, so the produced nanoparticles can be used as semi-long-term release drug delivery systems. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Multiple speckle illumination for optical-resolution photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Poisson, Florian; Stasio, Nicolino; Moser, Christophe; Psaltis, Demetri; Bossy, Emmanuel

    2017-03-01

    Optical-resolution photoacoustic microscopy offers exquisite and specific contrast to optical absorption. Conventional approaches generally involve raster scanning a focused spot over the sample. Here, we demonstrate that a full-field illumination approach with multiple speckle illumination can also provide diffraction-limited optical-resolution photoacoustic images. Two different proofs of concept are demonstrated with micro-structured test samples. The first approach follows the principle of correlation/ghost imaging,1, 2 and is based on cross-correlating photoacoustic signals under multiple speckle illumination with known speckle patterns measured during a calibration step. The second approach is a speckle scanning microscopy technique, which adapts the technique proposed in fluorescence microscopy by Bertolotti et al.:3 in our work, spatially unresolved photoacoustic measurements are performed for various translations of unknown speckle patterns. A phase-retrieval algorithm is used to reconstruct the object from the knowledge of the modulus of its Fourier transform yielded by the measurements. Because speckle patterns naturally appear in many situations, including propagation through biological tissue or multi-mode fibers (for which focusing light is very demanding, if not impossible), speckle-illumination-based photoacoustic microscopy provides a powerful framework for the development of novel reconstruction approaches, well-suited to compressed sensing approaches.2
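The first (correlation/ghost imaging) approach can be sketched numerically: unresolved "bucket" signals under many known speckle patterns are cross-correlated, pattern by pattern, against those patterns to recover the absorber map. This is a 1-D toy sketch under simplifying assumptions (uniform random patterns standing in for speckle, a flattened image, no acoustic model), not the paper's experimental processing.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_patterns = 256, 3000
obj = np.zeros(n_pix)
obj[50:60] = 1.0                       # absorbers in a flattened 1-D "image"

# known speckle intensity patterns recorded during the calibration step
patterns = rng.random((n_patterns, n_pix))
# spatially unresolved photoacoustic amplitudes: one scalar per illumination
signals = patterns @ obj

# correlation (ghost-imaging) reconstruction: <(S - <S>)(I(x) - <I(x)>)>
recon = (signals - signals.mean()) @ (patterns - patterns.mean(axis=0)) / n_patterns
```

With enough patterns, `recon` is proportional to the absorber map plus zero-mean noise, so the strongest pixels coincide with the object support.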

  5. A trace ratio maximization approach to multiple kernel-based dimensionality reduction.

    PubMed

    Jiang, Wenhao; Chung, Fu-lai

    2014-01-01

    Most dimensionality reduction techniques are based on one metric or one kernel, hence it is necessary to select an appropriate kernel for kernel-based dimensionality reduction. Multiple kernel learning for dimensionality reduction (MKL-DR) has recently been proposed to learn a kernel from a set of base kernels, which are seen as different descriptions of the data. As MKL-DR does not involve regularization, it might be ill-posed under some conditions and consequently its applications are hindered. This paper proposes a multiple kernel learning framework for dimensionality reduction based on a regularized trace ratio, termed MKL-TR. Our method aims at learning a transformation into a space of lower dimension and a corresponding kernel from the given base kernels, among which some may not be suitable for the given data. The solutions for the proposed framework can be found via trace ratio maximization. The experimental results demonstrate its effectiveness on benchmark text, image, and sound datasets in supervised, unsupervised, and semi-supervised settings. Copyright © 2013 Elsevier Ltd. All rights reserved.
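The trace ratio maximization underlying such methods is commonly solved with an iterative eigen-decomposition scheme: at the current ratio λ, take the top eigenvectors of A − λB and update λ. The sketch below illustrates that generic scheme on random scatter-like matrices; it is not the paper's MKL formulation, and the matrices and dimensions are invented.

```python
import numpy as np

def trace_ratio_max(A, B, d, n_iter=100, seed=0):
    """Maximize tr(W.T A W) / tr(W.T B W) over orthonormal W (n x d).

    Classic scheme: at ratio lam, W <- top-d eigenvectors of A - lam*B;
    lam increases monotonically to the optimum.
    """
    n = A.shape[0]
    W = np.linalg.qr(np.random.default_rng(seed).standard_normal((n, d)))[0]
    lam = 0.0
    for _ in range(n_iter):
        lam = np.trace(W.T @ A @ W) / np.trace(W.T @ B @ W)
        _, vecs = np.linalg.eigh(A - lam * B)
        W = vecs[:, -d:]               # eigenvectors of the d largest eigenvalues
    return W, lam

rng = np.random.default_rng(3)
M = rng.standard_normal((10, 10)); A = M @ M.T            # "target" scatter
N = rng.standard_normal((10, 10)); B = N @ N.T + 10 * np.eye(10)  # regularized
W, lam = trace_ratio_max(A, B, d=3)
```

Any other orthonormal basis achieves a trace ratio no larger than the converged λ.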

  6. E-Nose Vapor Identification Based on Dempster-Shafer Fusion of Multiple Classifiers

    NASA Technical Reports Server (NTRS)

    Li, Winston; Leung, Henry; Kwan, Chiman; Linnell, Bruce R.

    2005-01-01

    Electronic nose (e-nose) vapor identification is an efficient approach to monitoring air contaminants in space stations and shuttles in order to ensure the health and safety of astronauts. Data preprocessing (measurement denoising and feature extraction) and pattern classification are important components of an e-nose system. In this paper, a wavelet-based denoising method is applied to filter the noisy sensor measurements. Transient-state features are then extracted from the denoised sensor measurements and used to train multiple classifiers such as multi-layer perceptrons (MLP), support vector machines (SVM), k nearest neighbor (KNN), and Parzen classifiers. The Dempster-Shafer (DS) technique is then used to fuse the results of the multiple classifiers into a final classification. Experimental analysis based on real vapor data shows that the wavelet denoising method can remove both random noise and outliers successfully, and that the classification rate can be improved by using classifier fusion.
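The fusion step rests on Dempster's rule of combination. A minimal sketch for the restricted case where each classifier assigns belief to single classes plus the whole frame "Theta" (total ignorance) is shown below; the class names and mass values are invented, and a full DS implementation would handle arbitrary subsets of the frame.

```python
def ds_combine(m1, m2):
    """Dempster's rule for two basic belief assignments whose focal elements
    are single classes plus the whole frame 'Theta' (total ignorance)."""
    combined, conflict = {}, 0.0
    for a, pa in m1.items():
        for b, pb in m2.items():
            if a == "Theta":
                inter = b
            elif b == "Theta" or a == b:
                inter = a
            else:
                inter = None                  # disjoint singletons: conflict
            if inter is None:
                conflict += pa * pb
            else:
                combined[inter] = combined.get(inter, 0.0) + pa * pb
    # normalize by the non-conflicting mass
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# two hypothetical classifiers expressing belief over vapor classes A/B/C
m_mlp = {"A": 0.6, "B": 0.1, "Theta": 0.3}
m_svm = {"A": 0.5, "C": 0.2, "Theta": 0.3}
fused = ds_combine(m_mlp, m_svm)
```

Because both classifiers lean toward class A, the fused assignment concentrates belief there more strongly than either classifier alone.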

  7. Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.

    PubMed

    Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter

    2015-12-01

    Multiple hypothesis testing collects a series of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions lower than those of other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.

  8. Multiple sclerosis lesion segmentation using an automatic multimodal graph cuts.

    PubMed

    García-Lorenzo, Daniel; Lecoeur, Jeremy; Arnold, Douglas L; Collins, D Louis; Barillot, Christian

    2009-01-01

    Graph Cuts have been shown to be a powerful interactive segmentation technique in several medical domains. We propose to automate the Graph Cuts in order to automatically segment Multiple Sclerosis (MS) lesions in MRI. We replace the manual interaction with a robust EM-based approach in order to discriminate between MS lesions and the Normal Appearing Brain Tissues (NABT). Evaluation is performed on synthetic and real images, showing good agreement between the automatic segmentation and the target segmentation. We compare our algorithm with state-of-the-art techniques and with several manual segmentations. An advantage of our algorithm over previously published ones is the possibility to semi-automatically improve the segmentation thanks to the interactive feature of Graph Cuts.

  9. Electrodynamic multiple-scattering method for the simulation of optical trapping atop periodic metamaterials

    NASA Astrophysics Data System (ADS)

    Yannopapas, Vassilios; Paspalakis, Emmanuel

    2018-07-01

    We present a new theoretical tool for simulating optical trapping of nanoparticles in the presence of an arbitrary metamaterial design. The method is based on rigorously solving Maxwell's equations for the metamaterial via a hybrid discrete-dipole approximation/multiple-scattering technique and direct calculation of the optical force exerted on the nanoparticle by means of the Maxwell stress tensor. We apply the method to the case of a spherical polystyrene probe trapped within the optical landscape created by illumination of a plasmonic metamaterial consisting of periodically arranged tapered metallic nanopyramids. The developed technique is ideally suited for general optomechanical calculations involving metamaterial designs and can compete with purely numerical methods such as finite-difference or finite-element schemes.

  10. A satellite mobile communication system based on Band-Limited Quasi-Synchronous Code Division Multiple Access (BLQS-CDMA)

    NASA Technical Reports Server (NTRS)

    Degaudenzi, R.; Elia, C.; Viola, R.

    1990-01-01

    Discussed here is a new approach to code division multiple access applied to a mobile system for voice (and data) services based on Band-Limited Quasi-Synchronous Code Division Multiple Access (BLQS-CDMA). The system requires users to be chip synchronized to reduce the contribution of self-interference and to make use of voice activation in order to increase the satellite power efficiency. In order to achieve spectral efficiency, Nyquist chip pulse shaping is used with no detection performance impairment. The synchronization problems are solved in the forward link by distributing a master code, whereas carrier forced activation and closed loop control techniques have been adopted in the return link. System performance sensitivity to nonlinear amplification and timing/frequency synchronization errors is analyzed.
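The benefit of chip synchronization can be illustrated with a toy baseband model: when users are chip-aligned and use mutually orthogonal spreading codes, despreading one user's signal removes the others exactly. This sketch uses Walsh-Hadamard codes and BPSK as stand-ins (the paper's system uses its own code design, Nyquist chip shaping, and voice activation, none of which is modeled here).

```python
import numpy as np

# 64-chip Walsh-Hadamard spreading codes via Sylvester's construction
H = np.array([[1.0]])
for _ in range(6):
    H = np.kron(np.array([[1.0, 1.0], [1.0, -1.0]]), H)
codes = H[1:4]                        # three mutually orthogonal codes

rng = np.random.default_rng(2)
bits = rng.choice([-1.0, 1.0], size=(3, 20))      # BPSK bits per user

# chip-synchronous uplink: superposition of all users' spread signals
rx = sum(np.kron(bits[u], codes[u]) for u in range(3))

# despread user 0: correlate each 64-chip interval with its own code
rx_bits = rx.reshape(20, 64) @ codes[0] / 64.0
```

With any loss of chip alignment the codes lose orthogonality and the residual cross-correlations reappear as self-interference, which is why the system enforces quasi-synchronism.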

  11. Practical Implementation of Multiple Model Adaptive Estimation Using Neyman-Pearson Based Hypothesis Testing and Spectral Estimation Tools

    DTIC Science & Technology

    1996-09-01

    Generalized Likelihood Ratio (GLR) and voting techniques. The third class consisted of multiple hypothesis filter detectors, specifically the MMAE. The...vector version, versus a tensor if we use the matrix version of the power spectral density estimate. Using this notation, we will derive an...as MATLAB , have an intrinsic sample covariance computation available, which makes this method quite easy to implement. In practice, the mean for the

  12. Discovery of hotspots on Io using disk-resolved infrared imaging

    NASA Technical Reports Server (NTRS)

    Spencer, J. R.; Shure, M. A.; Ressler, M. E.; Sinton, W. M.; Goguen, J. D.

    1990-01-01

    First results are presented using two new techniques for ground-based observation of Io's hotspots. An IR array camera was used to obtain direct IR images of Io with resolution better than 0.5 arcsec, so that more than one hotspot is seen on Io in Jupiter eclipse. The camera was also used to make the first observations of the Jupiter occultation of the hotspots. These new techniques have revealed and located at least three hotspots and will now permit routine ground-based monitoring of the locations, temperatures, and sizes of multiple hotspots on Io.

  13. A multiple-point spatially weighted k-NN method for object-based classification

    NASA Astrophysics Data System (ADS)

    Tang, Yunwei; Jing, Linhai; Li, Hui; Atkinson, Peter M.

    2016-10-01

    Object-based classification, commonly referred to as object-based image analysis (OBIA), is now widely regarded as producing more appealing classification maps, often of greater accuracy, than pixel-based classification, and its application is widespread. Improvement of OBIA using spatial techniques is, therefore, of great interest. In this paper, multiple-point statistics (MPS) is proposed for object-based classification enhancement in the form of a new multiple-point k-nearest neighbour (k-NN) classification method (MPk-NN). The proposed method first utilises a training image derived from a pre-classified map to characterise the spatial correlation between multiple points of land cover classes. The MPS borrows spatial structures from other parts of the training image, and then incorporates this spatial information, in the form of multiple-point probabilities, into the k-NN classifier. Two satellite sensor images with a fine spatial resolution were selected to evaluate the new method. One is an IKONOS image of the Beijing urban area and the other is a WorldView-2 image of the Wolong mountainous area, in China. The images were classified on an object basis using the MPk-NN method and several alternatives, including the k-NN, the geostatistically weighted k-NN, the Bayesian method, the decision tree classifier (DTC), and the support vector machine classifier (SVM). It was demonstrated that the new spatial weighting based on MPS can achieve greater classification accuracy than the alternatives and it is, thus, recommended as appropriate for object-based classification.
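The idea of weighting a k-NN vote by multiple-point probabilities can be sketched as follows: estimate P(centre class | neighbourhood pattern) from a pre-classified training image, then multiply the feature-space vote by that spatial prior. This is a heavily simplified toy (a 4-neighbour pattern on a two-class map, 1-D features); the paper's templates, feature space, and weighting scheme differ.

```python
import numpy as np
from collections import Counter, defaultdict

def mp_probabilities(train_map):
    """Multiple-point statistics from a pre-classified training image:
    P(centre class | classes of the 4-neighbourhood)."""
    counts = defaultdict(Counter)
    rows, cols = train_map.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            patt = (train_map[i-1, j], train_map[i+1, j],
                    train_map[i, j-1], train_map[i, j+1])
            counts[patt][train_map[i, j]] += 1
    return {p: {cl: n / sum(c.values()) for cl, n in c.items()}
            for p, c in counts.items()}

def mp_knn(query, X, y, k, pattern, mp):
    """k-NN vote in feature space, weighted by the multiple-point
    probability of each class given the spatial neighbourhood pattern."""
    order = np.argsort(np.linalg.norm(X - query, axis=1))[:k]
    votes = Counter(y[i] for i in order)
    prior = mp.get(pattern, {})
    return max(votes, key=lambda cl: votes[cl] * (prior.get(cl, 0.0) + 1e-6))

# training image: class 0 in the top half, class 1 in the bottom half
train_map = np.vstack([np.zeros((5, 8), int), np.ones((5, 8), int)])
mp = mp_probabilities(train_map)

# ambiguous spectral query: the 4 nearest neighbours split 2-2 between classes,
# and the spatial prior (an all-0 neighbourhood) breaks the tie
X = np.array([[0.0], [0.1], [1.0], [1.1]])
y = np.array([0, 0, 1, 1])
label = mp_knn(np.array([0.55]), X, y, k=4, pattern=(0, 0, 0, 0), mp=mp)
```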

  14. An Exploratory Study of Religion and Trust in Ghana

    ERIC Educational Resources Information Center

    Addai, Isaac; Opoku-Agyeman, Chris; Ghartey, Helen Tekyiwa

    2013-01-01

    Based on individual-level data from 2008 Afro-barometer survey, this study explores the relationship between religion (religious affiliation and religious importance) and trust (interpersonal and institutional) among Ghanaians. Employing hierarchical multiple regression technique, our analyses reveal a positive relationship between religious…

  15. Comprehensive quantification of signal-to-noise ratio and g-factor for image-based and k-space-based parallel imaging reconstructions.

    PubMed

    Robson, Philip M; Grant, Aaron K; Madhuranthakam, Ananth J; Lattanzi, Riccardo; Sodickson, Daniel K; McKenzie, Charles A

    2008-10-01

    Parallel imaging reconstructions result in spatially varying noise amplification characterized by the g-factor, precluding conventional measurements of noise from the final image. A simple Monte Carlo based method is proposed for all linear image reconstruction algorithms, which allows measurement of signal-to-noise ratio and g-factor and is demonstrated for SENSE and GRAPPA reconstructions for accelerated acquisitions that have not previously been amenable to such assessment. Only a simple "prescan" measurement of noise amplitude and correlation in the phased-array receiver, and a single accelerated image acquisition are required, allowing robust assessment of signal-to-noise ratio and g-factor. The "pseudo multiple replica" method has been rigorously validated in phantoms and in vivo, showing excellent agreement with true multiple replica and analytical methods. This method is universally applicable to the parallel imaging reconstruction techniques used in clinical applications and will allow pixel-by-pixel image noise measurements for all parallel imaging strategies, allowing quantitative comparison between arbitrary k-space trajectories, image reconstruction, or noise conditioning techniques. (c) 2008 Wiley-Liss, Inc.
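The pseudo multiple replica idea applies to any linear reconstruction: synthesize many noise-only acquisitions with the prescan coil covariance, push each through the recon, and read the pixelwise standard deviation off the stack. The sketch below uses an arbitrary random matrix as a stand-in for a SENSE/GRAPPA-type linear operator; the sizes and covariance are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_coils, n_pix = 4, 32
# arbitrary linear reconstruction operator (image <- stacked coil data),
# standing in for any SENSE/GRAPPA-type linear recon
A = rng.standard_normal((n_pix, n_coils * n_pix)) / np.sqrt(n_coils * n_pix)
Psi = 0.8 * np.eye(n_coils) + 0.2          # coil noise covariance from "prescan"
L = np.linalg.cholesky(Psi)

def pseudo_replica_noise(A, L, n_rep=4000):
    """Pixelwise noise s.d. of a linear recon from synthetic noise-only replicas."""
    n_coils = L.shape[0]
    n_pix = A.shape[1] // n_coils
    recons = np.empty((n_rep, A.shape[0]))
    for r in range(n_rep):
        eta = (L @ rng.standard_normal((n_coils, n_pix))).ravel()  # coil-major
        recons[r] = A @ eta
    return recons.std(axis=0)

mc_sd = pseudo_replica_noise(A, L)
# analytic check: Cov(eta) = Psi (x) I  =>  pixel variance = diag(A Cov A^T)
analytic_sd = np.sqrt(np.diag(A @ np.kron(Psi, np.eye(n_pix)) @ A.T))
```

Repeating the procedure for the fully sampled and accelerated reconstructions gives the g-factor as g = SNR_full / (SNR_accelerated · √R).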

  16. Computationally efficient stochastic optimization using multiple realizations

    NASA Astrophysics Data System (ADS)

    Bayer, P.; Bürger, C. M.; Finkel, M.

    2008-02-01

    The presented study is concerned with computationally efficient methods for solving stochastic optimization problems involving multiple equally probable realizations of uncertain parameters. A new and straightforward technique is introduced that is based on dynamically ordering the stack of realizations during the search procedure. The rationale is that a small number of critical realizations govern the output of a reliability-based objective function. Using a problem typical of designing a water supply well field, several variants of this "stack ordering" approach are tested. The results are statistically assessed in terms of optimality and nominal reliability. This study demonstrates that simply ordering a given number of 500 realizations while applying an evolutionary search algorithm can save about half of the model runs without compromising the optimization procedure. More advanced variants of stack ordering can, if properly configured, save more than 97% of the computational effort that would be required if the entire number of realizations were considered. The findings herein are promising for similar problems of water management and reliability-based design in general, and particularly for non-convex problems that require heuristic search techniques.
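The stack-ordering rationale can be sketched in a few lines: screen realizations in the current stack order, reject a candidate design as soon as the allowed number of failures is exceeded, and move the critical (failing) realizations to the front so the next candidate is rejected even sooner. Everything here (the `satisfies` toy model, the demand values, the 90% reliability target) is invented for illustration.

```python
model_runs = 0

def satisfies(design, demand):
    """Stand-in for one expensive model run: does the design meet this
    realization's demand? (hypothetical toy model)"""
    global model_runs
    model_runs += 1
    return design >= demand

def reliable(design, demands, stack, reliability=0.9):
    """Reliability-based objective with dynamic stack ordering: stop as soon
    as the allowed number of failures is exceeded, moving critical
    realizations to the front of the stack for the next candidate."""
    max_fail = round(len(demands) * (1.0 - reliability))
    fails = 0
    for idx in list(stack):
        if not satisfies(design, demands[idx]):
            fails += 1
            stack.remove(idx)
            stack.insert(0, idx)      # critical realization screened first next time
            if fails > max_fail:
                return False
    return True

demands = [0.2, 0.4, 0.3, 0.9, 0.95, 0.25, 0.5, 0.35, 0.45, 0.6]
stack = list(range(len(demands)))

ok_first = reliable(0.5, demands, stack)   # poor design: rejected mid-stack
runs_first = model_runs
model_runs = 0
ok_second = reliable(0.5, demands, stack)  # critical realizations now lead
runs_second = model_runs
```

The second screening of the same poor design needs fewer model runs because the two critical realizations have migrated to the front of the stack.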

  17. Deterring watermark collusion attacks using signal processing techniques

    NASA Astrophysics Data System (ADS)

    Lemma, Aweke N.; van der Veen, Michiel

    2007-02-01

    Collusion attack is a malicious watermark removal attack in which the hacker has access to multiple copies of the same content with different watermarks and tries to remove the watermark by averaging. In the literature, several solutions to collusion attacks have been reported. The mainstream solutions aim at designing watermark codes that are inherently resistant to collusion attacks. The other approaches propose signal processing based solutions that aim at modifying the watermarked signals in such a way that averaging multiple copies of the content leads to a significant degradation of the content quality. In this paper, we present a signal processing based technique that may be deployed for deterring collusion attacks. We formulate the problem in the context of electronic music distribution, where the content is generally available in the compressed domain. Thus, we first extend the collusion resistance principles to bit stream signals and then present an experiment-based analysis to estimate a bound on the maximum number of modified versions of a content that satisfy a good perceptibility requirement on the one hand and the destructive averaging property on the other.

  18. Impression of multiple implants using photogrammetry: description of technique and case presentation.

    PubMed

    Peñarrocha-Oltra, David; Agustín-Panadero, Rubén; Bagán, Leticia; Giménez, Beatriz; Peñarrocha, María

    2014-07-01

    To describe a technique for registering the positions of multiple dental implants using a system based on photogrammetry. A case is presented in which a prosthetic treatment was performed using this technique. Three Euroteknika® dental implants were placed to rehabilitate a 55-year-old male patient with right posterior maxillary edentulism. Three months later, the positions of the implants were registered using a photogrammetry-based stereo-camera (PICcamera®). After processing patient and implant data, special abutments (PICabutment®) were screwed onto each implant. The PICcamera® was then used to capture images of the implant positions, automatically taking 150 images in less than 60 seconds. From this information a file was obtained describing the relative positions - angles and distances - of each implant in vector form. Information regarding the soft tissues was obtained from an alginate impression that was cast in plaster and scanned. A Cr-Co structure was obtained using CAD/CAM, and its passive fit was verified in the patient's mouth using the Sheffield test and the screw resistance test. Twelve months after loading, peri-implant tissues were healthy and no marginal bone loss was observed. The clinical application of this new system using photogrammetry to record the position of multiple dental implants facilitated the rehabilitation of a patient with posterior maxillary edentulism by means of a prosthesis with optimal fit. The prosthetic process was accurate, fast, simple to apply and comfortable for the patient.

  19. Properties of a Formal Method to Model Emergence in Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    Future space missions will require cooperation between multiple satellites and/or rovers. Developers are proposing intelligent autonomous swarms for these missions, but swarm-based systems are difficult or impossible to test with current techniques. This viewgraph presentation examines the use of formal methods in testing swarm-based systems. The potential usefulness of formal methods in modeling the ANTS asteroid encounter mission is also examined.

  20. Development of a Multiplexed Liquid Chromatography Multiple-Reaction-Monitoring Mass Spectrometry (LC-MRM/MS) Method for Evaluation of Salivary Proteins as Oral Cancer Biomarkers*

    PubMed Central

    Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting

    2017-01-01

    Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. PMID:28235782

  1. An Automated Parallel Image Registration Technique Based on the Correlation of Wavelet Features

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline; Campbell, William J.; Cromp, Robert F.; Zukor, Dorothy (Technical Monitor)

    2001-01-01

    With the increasing importance of multiple platform/multiple remote sensing missions, fast and automatic integration of digital data from disparate sources has become critical to the success of these endeavors. Our work utilizes maxima of wavelet coefficients to form the basic features of a correlation-based automatic registration algorithm. Our wavelet-based registration algorithm is tested successfully with data from the National Oceanic and Atmospheric Administration (NOAA) Advanced Very High Resolution Radiometer (AVHRR) and the Landsat/Thematic Mapper (TM), which differ by translation and/or rotation. By the choice of high-frequency wavelet features, this method is similar to an edge-based correlation method, but by exploiting the multi-resolution nature of a wavelet decomposition, our method achieves higher computational speeds for comparable accuracies. This algorithm has been implemented on a Single Instruction Multiple Data (SIMD) massively parallel computer, the MasPar MP-2, as well as on the Cray T3D, the Cray T3E and a Beowulf cluster of Pentium workstations.
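The core loop (extract high-frequency wavelet features, keep only the strongest coefficients, correlate to recover the offset) can be sketched with a one-level Haar detail band and phase correlation. This is a toy sketch under simplifying assumptions (pure cyclic translation, no rotation, hand-rolled Haar instead of a wavelet library), not the authors' parallel implementation.

```python
import numpy as np

def haar_hh(img):
    """Level-1 Haar diagonal-detail coefficients (high-frequency features)."""
    return (img[0::2, 0::2] - img[0::2, 1::2]
            - img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def feature_maxima(coeffs, keep=0.05):
    """Keep only the strongest wavelet coefficients as registration features."""
    thresh = np.quantile(np.abs(coeffs), 1.0 - keep)
    return np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)

def phase_correlate(g, f):
    """Translation of g relative to f via the normalized cross-power spectrum."""
    R = np.fft.fft2(g) * np.conj(np.fft.fft2(f))
    R /= np.abs(R) + 1e-12
    c = np.fft.ifft2(R).real
    return np.unravel_index(np.argmax(c), c.shape)

rng = np.random.default_rng(4)
base = rng.random((64, 64))
shifted = np.roll(base, (6, 4), axis=(0, 1))       # known translation

fa = feature_maxima(haar_hh(base))
fb = feature_maxima(haar_hh(shifted))
dy, dx = phase_correlate(fb, fa)                   # shift at half resolution
```

Because the level-1 detail band is at half resolution, the recovered offset must be doubled to give the full-resolution translation, which mirrors how a multi-resolution search refines coarse estimates at finer levels.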

  2. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools for supporting drug approval, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied in head-to-head studies. This scenario precludes conclusions from being drawn about the full profile (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how a network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on statistical methods evolution, assumptions and steps for performing the analysis. PMID:28503228
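The simplest building block of indirect evidence is the adjusted indirect comparison through a common comparator (the Bucher method): if trials give A-vs-B and A-vs-C effects, the B-vs-C effect is their difference and its variance is the sum of the variances. The numbers below are hypothetical; a full network meta-analysis pools many such loops in one model.

```python
import math

def indirect_comparison(d_ab, se_ab, d_ac, se_ac):
    """Bucher adjusted indirect comparison of B vs C through comparator A.

    d_ab, d_ac are effect estimates (e.g. log odds ratios) for A-B and A-C;
    the indirect B-C estimate inherits the summed variances."""
    d_bc = d_ac - d_ab
    se_bc = math.sqrt(se_ab ** 2 + se_ac ** 2)
    return d_bc, se_bc

# hypothetical log odds ratios: drug B and drug C each vs placebo A
d_bc, se_bc = indirect_comparison(-0.50, 0.10, -0.80, 0.20)
```

The widened standard error makes explicit why indirect evidence is weaker than a head-to-head trial of B against C.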

  3. Physics-based approach to haptic display

    NASA Technical Reports Server (NTRS)

    Brown, J. Michael; Colgate, J. Edward

    1994-01-01

    This paper addresses the implementation of complex multiple degree of freedom virtual environments for haptic display. We suggest that a physics based approach to rigid body simulation is appropriate for hand tool simulation, but that currently available simulation techniques are not sufficient to guarantee successful implementation. We discuss the desirable features of a virtual environment simulation, specifically highlighting the importance of stability guarantees.

  4. Memory management and compiler support for rapid recovery from failures in computer systems

    NASA Technical Reports Server (NTRS)

    Fuchs, W. K.

    1991-01-01

    This paper describes recent developments in the use of memory management and compiler technology to support rapid recovery from failures in computer systems. The techniques described include cache coherence protocols for user transparent checkpointing in multiprocessor systems, compiler-based checkpoint placement, compiler-based code modification for multiple instruction retry, and forward recovery in distributed systems utilizing optimistic execution.

  5. Calibrating and training of neutron based NSA techniques with less SNM standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, William H; Swinhoe, Martyn T; Bracken, David S

    2010-01-01

    Accessing special nuclear material (SNM) standards for the calibration of and training on nondestructive assay (NDA) instruments has become increasingly difficult in light of enhanced safeguards and security regulations. Limited or nonexistent access to SNM has affected neutron based NDA techniques more than gamma ray techniques because the effects of multiplication require a range of masses to accurately measure the detector response. Neutron based NDA techniques can also be greatly affected by the matrix and impurity characteristics of the item. The safeguards community has been developing techniques for calibrating instrumentation and training personnel with dwindling numbers of SNM standards. Monte Carlo methods have become increasingly important for design and calibration of instrumentation. Monte Carlo techniques have the ability to accurately predict the detector response for passive techniques. The Monte Carlo results are usually benchmarked to neutron source measurements such as californium. For active techniques, the modeling becomes more difficult because of the interaction of the interrogation source with the detector and nuclear material; and the results cannot be simply benchmarked with neutron sources. A Monte Carlo calculated calibration curve for a training course in Indonesia of material test reactor (MTR) fuel elements assayed with an active well coincidence counter (AWCC) will be presented as an example. Performing training activities with reduced amounts of nuclear material makes it difficult to demonstrate how the multiplication and matrix properties of the item affect the detector response and limits the knowledge that can be obtained with hands-on training. A neutron pulse simulator (NPS) has been developed that can produce a pulse stream representative of a real pulse stream output from a detector measuring SNM.
The NPS has been used by the International Atomic Energy Agency (IAEA) for detector testing and training applications at the Agency due to the lack of appropriate SNM standards. This paper will address the effect of reduced access to SNM for calibration and training of neutron NDA applications, along with the advantages and disadvantages of some solutions that do not use standards, such as the Monte Carlo techniques and the NPS.
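A pulse stream with fission-like time correlations, in the spirit of such a simulator, can be sketched with a simple point model: Poisson-distributed fission events each emit a random burst of neutrons (the multiplicity), which are detected with some efficiency and spread in time by an exponential detector die-away. This is a toy sketch only; the multiplicity distribution and rates below are invented, and the real NPS design is not described in the record.

```python
import random

def simulate_pulse_train(fission_rate, duration, multiplicity_pmf,
                         efficiency, die_away, seed=0):
    """Toy point-model pulse-train generator: correlated neutron bursts
    from Poisson fission events, exponential detector die-away."""
    rng = random.Random(seed)
    nus = list(range(len(multiplicity_pmf)))
    pulses = []
    t = 0.0
    while True:
        t += rng.expovariate(fission_rate)           # next fission time
        if t > duration:
            break
        nu = rng.choices(nus, weights=multiplicity_pmf)[0]
        for _ in range(nu):                          # correlated burst
            if rng.random() < efficiency:
                pulses.append(t + rng.expovariate(1.0 / die_away))
    pulses.sort()
    return pulses

pmf = [0.03, 0.16, 0.32, 0.30, 0.15, 0.04]           # made-up multiplicity pmf
pulses = simulate_pulse_train(fission_rate=1000.0, duration=10.0,
                              multiplicity_pmf=pmf, efficiency=0.4,
                              die_away=50e-6)
```

Feeding such a stream to coincidence-counting electronics reproduces the singles/doubles behaviour that would otherwise require an SNM standard to demonstrate.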

  6. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors

    PubMed Central

    Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.

    2016-01-01

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643

  7. Multi-objects recognition for distributed intelligent sensor networks

    NASA Astrophysics Data System (ADS)

    He, Haibo; Chen, Sheng; Cao, Yuan; Desai, Sachi; Hohil, Myron E.

    2008-04-01

    This paper proposes an innovative approach to multi-object recognition for homeland security and defense based intelligent sensor networks. Unlike conventional information analysis, data mining in such networks is typically characterized by high information ambiguity/uncertainty, data redundancy, high dimensionality and real-time constraints. Furthermore, since a typical military network normally includes multiple mobile sensor platforms, ground forces, fortified tanks, combat flights, and other resources, it is critical to develop intelligent data mining approaches to fuse different information resources to understand dynamic environments, to support decision making processes, and finally to achieve the mission goals. This paper aims to address these issues with a focus on multi-object recognition. Instead of classifying a single object as in traditional image classification problems, the proposed method can automatically learn multiple objects simultaneously. Image segmentation techniques are used to identify the interesting regions in the field, which correspond to multiple objects such as soldiers or tanks. Since different objects will come with different feature sizes, we propose a feature scaling method to represent each object in the same number of dimensions. This is achieved by linear/nonlinear scaling and sampling techniques. Finally, support vector machine (SVM) based learning algorithms are developed to learn and build the associations for different objects, and such knowledge will be adaptively accumulated for object recognition in the testing stage. We test the effectiveness of the proposed method in different simulated military environments.
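The linear variant of the feature scaling step (mapping variable-length object features to a fixed dimensionality so all objects share one feature space) can be sketched with simple resampling. The function name and target length are invented; the paper also mentions nonlinear scaling, which is not modeled here.

```python
import numpy as np

def rescale_features(feature_seq, target_len):
    """Linearly resample a variable-length feature sequence (e.g. a segmented
    object's intensity profile) to a fixed dimensionality, so that objects of
    different sizes can feed the same classifier."""
    x_old = np.linspace(0.0, 1.0, len(feature_seq))
    x_new = np.linspace(0.0, 1.0, target_len)
    return np.interp(x_new, x_old, feature_seq)

small = rescale_features([0.0, 1.0, 2.0, 3.0], 7)    # upsample a short object
large = rescale_features(np.arange(100.0), 7)        # downsample a long one
```

Both objects now live in the same 7-dimensional space and can be passed to a single SVM-style classifier.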

  8. A nonlinear control method based on ANFIS and multiple models for a class of SISO nonlinear systems and its application.

    PubMed

    Zhang, Yajun; Chai, Tianyou; Wang, Hong

    2011-11-01

    This paper presents a novel nonlinear control strategy for a class of uncertain single-input and single-output discrete-time nonlinear systems with unstable zero-dynamics. The proposed method combines an adaptive-network-based fuzzy inference system (ANFIS) with multiple models, where a linear robust controller, an ANFIS-based nonlinear controller and a switching mechanism are integrated using the multiple models technique. It is shown that the linear controller ensures the boundedness of the input and output signals, while the nonlinear controller improves the dynamic performance of the closed-loop system. Moreover, the switching mechanism simultaneously guarantees closed-loop stability and improves performance. As a result, the controller has three outstanding features compared with existing control strategies. First, the method relaxes the commonly used assumption of uniform boundedness on the unmodeled dynamics and thus enhances its applicability. Second, since ANFIS is used to estimate and compensate for the effect caused by the unmodeled dynamics, the convergence rate of neural network learning is increased. Third, a "one-to-one mapping" technique is adopted to guarantee the universal approximation property of ANFIS. The proposed controller is applied to a numerical example and to a pulverizing process of an alumina sintering system, where its effectiveness is demonstrated.

  9. Structural damage diagnostics via wave propagation-based filtering techniques

    NASA Astrophysics Data System (ADS)

    Ayers, James T., III

    Structural health monitoring (SHM) of aerospace components is a rapidly emerging field, due in part to commercial and military transport vehicles remaining in operation beyond their designed life cycles. Damage detection strategies are sought that provide real-time information on the structure's integrity. One approach that has shown promise in accurately identifying and quantifying structural defects is based on guided ultrasonic wave (GUW) inspections, where low-amplitude attenuation properties allow for long-range, large-specimen evaluation. One drawback of GUWs is that they exhibit a complex multi-modal response, such that each frequency corresponds to at least two excited modes, and thus intelligent signal processing is required for even the simplest of structures. In addition, GUWs are dispersive, whereby the wave velocity is a function of frequency and the shape of the wave packet changes over the spatial domain, requiring sophisticated detection algorithms. Moreover, existing damage quantification measures are typically formulated as a comparison of the damaged to the undamaged response, which has proven highly sensitive to changes in environment, and therefore often unreliable. In response to these challenges inherent to GUW inspections, this research develops techniques to locate and estimate the severity of damage. Specifically, a phase-gradient-based localization algorithm is introduced to identify the defect position independent of excitation frequency and damage size. Mode separation through a filtering technique is central to isolating and extracting single-mode components, such as the reflected, converted, and transmitted modes that may arise from the incident wave impacting a damage site. Spatially integrated single- and multiple-component mode coefficients are also formulated with the intent to better characterize wave reflections and conversions and to increase signal-to-noise ratios. 
The techniques are applied to damaged isotropic finite element plate models and experimental data obtained from Scanning Laser Doppler Vibrometry tests. Numerical and experimental parametric studies are conducted, and the current strengths and weaknesses of the proposed approaches are discussed. In particular, limitations to the damage profiling characterization are shown for low ultrasonic frequency regimes, whereas the multiple component mode conversion coefficients provide excellent noise mitigation. Multiple component estimation relies on an experimental technique developed for the estimation of Lamb wave polarization using a 1D Laser Vibrometer. Lastly, suggestions are made to apply the techniques to more structurally complex geometries.

  10. The evolution of spinal instrumentation for the management of occipital cervical and cervicothoracic junctional injuries.

    PubMed

    Smucker, Joseph D; Sasso, Rick C

    2006-05-15

    Independent computer-based literature review of articles pertaining to instrumentation and fusion of junctional injuries of the cervical spine. To review and discuss the evolution of instrumentation techniques and systems used in the treatment of cervical spine junctional injuries. Instrumentation of junctional injuries of the cervical spine has been limited historically by failure to achieve rigid internal fixation in multiple planes. The evolution of these techniques has required increased insight into the morphology and unique biomechanics of the structures to be instrumented. Computer-based literature search of Ovid and PubMed databases. Extensive literature search yielded insights into the evolution of systems initially based on onlay bone graft combined with wiring techniques. Such techniques have come to include systems incorporating rigid, longitudinal struts that accommodate multiplanar screws placed in the lateral masses, pedicles, transarticular regions, and occipital bone. Despite a rapid evolution of techniques and instrumentation technologies, it remains incumbent on the physician to provide the patient with a surgical procedure that balances the likelihood of a favorable outcome with the risk inherent in the implementation of the procedure.

  11. Empirically Based Strategies for Preventing Juvenile Delinquency.

    PubMed

    Pardini, Dustin

    2016-04-01

    Juvenile crime is a serious public health problem that results in significant emotional and financial costs for victims and society. Using etiologic models as a guide, multiple interventions have been developed to target risk factors thought to perpetuate the emergence and persistence of delinquent behavior. Evidence suggests that the most effective interventions tend to have well-defined treatment protocols, focus on therapeutic approaches as opposed to external control techniques, and use multimodal cognitive-behavioral treatment strategies. Moving forward, there is a need to develop effective policies and procedures that promote the widespread adoption of evidence-based delinquency prevention practices across multiple settings. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Tropospheric wet refractivity tomography using multiplicative algebraic reconstruction technique

    NASA Astrophysics Data System (ADS)

    Xiaoying, Wang; Ziqiang, Dai; Enhong, Zhang; Fuyang, K. E.; Yunchang, Cao; Lianchun, Song

    2014-01-01

    Algebraic reconstruction techniques (ART) have been successfully used to reconstruct the total electron content (TEC) of the ionosphere and, in recent years, have been tentatively applied to tropospheric wet refractivity and water vapor tomography based on ground-based GNSS technology. Previous research on ART in tropospheric water vapor tomography focused on convergence and relaxation parameters for the various algebraic reconstruction techniques, and rarely discussed the impact of Gaussian constraints and the initial field on the iteration results. The existing accuracy evaluation parameters, calculated from slant wet delay, can only evaluate the precision of voxels penetrated by slant paths and cannot evaluate voxels not penetrated by any slant path. The paper proposes two new statistical parameters, Bias and RMS, calculated from the wet refractivity of all voxels, to remedy the deficiencies of the existing evaluation parameters, and then discusses the effect of Gaussian constraints and the initial field on convergence and tomography results when the multiplicative algebraic reconstruction technique (MART) is used to reconstruct the 4D tropospheric wet refractivity field with simulated data.
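    To make the multiplicative update at the heart of MART concrete, here is a minimal sketch on a toy linear system. It is not the paper's 4D tomography code; the relaxation value, sweep count, and toy ray geometry are assumptions for illustration:

```python
def mart(A, y, x0, relaxation=1.0, sweeps=100):
    """Multiplicative Algebraic Reconstruction Technique (MART).

    A  : rows a_i of the observation geometry (non-negative weights,
         e.g. the length of slant path i inside voxel j)
    y  : observed line integrals (e.g. slant wet delays), all > 0
    x0 : strictly positive initial field; multiplicative updates
         keep every voxel value positive
    """
    x = list(x0)
    for _ in range(sweeps):
        for a_i, y_i in zip(A, y):
            proj = sum(a * xj for a, xj in zip(a_i, x))
            if proj <= 0.0:
                continue
            ratio = y_i / proj          # measured / predicted for this ray
            for j, a in enumerate(a_i):
                if a > 0.0:
                    x[j] *= ratio ** (relaxation * a)
    return x

# Toy 2-voxel field crossed by two rays; the exact solution is x = [1.0, 2.0]
A = [[1.0, 0.0],
     [1.0, 1.0]]
y = [1.0, 3.0]
x = mart(A, y, x0=[0.5, 0.5])
```

Because every update is multiplicative, a strictly positive initial field stays positive, which is one reason MART suits non-negative quantities such as wet refractivity.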

  13. An Exact Model-Based Method for Near-Field Sources Localization with Bistatic MIMO System.

    PubMed

    Singh, Parth Raj; Wang, Yide; Chargé, Pascal

    2017-03-30

    In this paper, we propose an exact model-based method for near-field sources localization with a bistatic multiple input, multiple output (MIMO) radar system, and compare it with an approximated model-based method. The aim of this paper is to propose an efficient way to use the exact model of the received signals of near-field sources in order to eliminate the systematic error introduced by the use of approximated model in most existing near-field sources localization techniques. The proposed method uses parallel factor (PARAFAC) decomposition to deal with the exact model. Thanks to the exact model, the proposed method has better precision and resolution than the compared approximated model-based method. The simulation results show the performance of the proposed method.

  14. Evaluation of an Agricultural Meteorological Disaster Based on Multiple Criterion Decision Making and Evolutionary Algorithm

    PubMed Central

    Yu, Xiaobing; Yu, Xianrui; Lu, Yiqun

    2018-01-01

    The evaluation of a meteorological disaster can be regarded as a multiple-criteria decision making problem because it involves many indexes. Firstly, a comprehensive indexing system for an agricultural meteorological disaster is proposed, which includes the disaster rate, the inundated rate, and the complete loss rate. Following this, the relative weights of the three criteria are acquired using a novel evolutionary algorithm, which combines a differential evolution algorithm with an evolution strategy. Finally, a novel evaluation model, based on the proposed algorithm and the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), is presented to estimate the agricultural meteorological disaster of 2008 in China. The geographic information system (GIS) technique is employed to depict the disaster. The experimental results demonstrated that the agricultural meteorological disaster of 2008 was very serious, especially in Hunan and Hubei provinces. Some useful suggestions are provided to relieve agricultural meteorological disasters. PMID:29597243
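    As a sketch of the TOPSIS step of the evaluation model, the ranking can be computed as below. The fixed weights stand in for the weights the paper acquires with its evolutionary algorithm, and the index values are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS: closeness to the ideal solution.

    matrix  : rows = alternatives, columns = criteria
    weights : criterion weights (fixed here; the paper evolves them)
    benefit : per-criterion flag, True if larger values are better
    """
    ncols = len(matrix[0])
    # vector-normalise each column, then apply the weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((a - i) ** 2 for a, i in zip(row, ideal)))
        d_neg = math.sqrt(sum((a - w) ** 2 for a, w in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))   # 1 = ideal, 0 = worst
    return scores

# Hypothetical severity indexes (disaster, inundated, complete-loss rates)
# for three regions; all three are cost-type criteria (higher = worse).
m = [[0.30, 0.25, 0.10],
     [0.60, 0.50, 0.30],
     [0.10, 0.05, 0.02]]
scores = topsis(m, weights=[0.4, 0.35, 0.25], benefit=[False, False, False])
```

With cost-type criteria, a higher closeness score means the region is nearer the (least-severe) ideal, so the scores rank regions from least to most affected.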

  15. Signed-negabinary-arithmetic-based optical computing by use of a single liquid-crystal-display panel.

    PubMed

    Datta, Asit K; Munshi, Soumika

    2002-03-10

    Based on the negabinary number representation, parallel one-step arithmetic operations (addition and subtraction), logical operations, and matrix-vector multiplication on data have been optically implemented by use of a two-dimensional spatial-encoding technique. For addition and subtraction, one of the operands in decimal form is converted into the unsigned negabinary form, whereas the other decimal number is represented in the signed negabinary form. The result of the operation is obtained in the mixed negabinary form and is converted back into decimal. Matrix-vector multiplication for unsigned negabinary numbers is achieved through the convolution technique. Both operands for logical operation are converted to their signed negabinary forms. All operations are implemented with a unique optical architecture. The use of a single liquid-crystal-display panel to spatially encode the input data, operational kernels, and decoding masks has simplified the architecture as well as reduced the cost and complexity.
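    For reference, base(-2) (negabinary) conversion follows the ordinary change-of-base algorithm with the remainder forced into {0, 1}. This digit-level sketch is software only, not the paper's optical encoding, and just illustrates the representation:

```python
def to_negabinary(n):
    """Convert a decimal integer to its base(-2) digit string."""
    if n == 0:
        return "0"
    digits = []
    while n != 0:
        n, r = divmod(n, -2)
        if r < 0:            # force the remainder into {0, 1}
            n += 1
            r += 2
        digits.append(str(r))
    return "".join(reversed(digits))

def from_negabinary(s):
    """Evaluate a base(-2) digit string back to decimal."""
    return sum(int(d) * (-2) ** i for i, d in enumerate(reversed(s)))
```

For example, 6 is 11010 in negabinary (16 - 8 - 2 = 6); the alternating place values (-2)^i are what let a single unsigned digit string also cover negative numbers.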

  16. LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel

    2017-10-01

    Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of a training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines the application of a technique called Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good-quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
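    The core LSH idea, hashing so that similar patterns collide, can be sketched for binary (categorical) patterns with bit-sampling, one of the classic LSH families for Hamming distance. This is a toy sketch, not LSHSIM itself; the table count, key width and names are assumptions:

```python
import random

def lsh_buckets(patterns, n_bits, n_tables=4, seed=0):
    """Bit-sampling LSH for binary patterns.

    patterns : dict name -> tuple of 0/1 values (flattened TI patterns)
    Each table hashes on a random subset of positions, so patterns at
    small Hamming distance collide in at least one table with high odds.
    """
    rng = random.Random(seed)
    tables = []
    for _ in range(n_tables):
        idx = rng.sample(range(n_bits), k=max(1, n_bits // 4))
        buckets = {}
        for name, p in patterns.items():
            key = tuple(p[i] for i in idx)
            buckets.setdefault(key, set()).add(name)
        tables.append((idx, buckets))
    return tables

def candidates(tables, query):
    """Union of the query's bucket mates across all tables."""
    out = set()
    for idx, buckets in tables:
        key = tuple(query[i] for i in idx)
        out |= buckets.get(key, set())
    return out

# Toy patterns: "b" is within Hamming distance 1 of the query, "c" is far
pats = {"a": (0,) * 8, "b": (0, 0, 0, 1, 0, 0, 0, 0), "c": (1,) * 8}
tabs = lsh_buckets(pats, n_bits=8)
cand = candidates(tabs, (0,) * 8)
```

Only patterns sharing a bucket with the query are then compared in full, which is what accelerates the pattern search.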

  17. [Pay attention to the application of the international intraocular retinoblastoma classification and sequential multiple modality treatment].

    PubMed

    Fan, X Q

    2017-08-11

    Retinoblastoma (RB) is the most common intraocular malignancy in childhood. It may seriously affect vision and even threaten life. The early diagnosis rate of RB in China remains low, and the majority of patients present at a late phase, with high rates of enucleation and mortality. The International Intraocular Retinoblastoma Classification and the TNM staging system provide guidance for therapeutic choices and a basis for prognosis evaluation. Within the sequential multiple modality treatment framework, chemotherapy combined with local therapy is the mainstream approach to RB, which may maximize the results of eye saving and even vision retention. New therapeutic techniques, including supra-selective ophthalmic artery interventional chemotherapy and intravitreal chemotherapy, can further improve the efficacy of treatment, especially the eye salvage rate. The overall level of RB treatment should be improved by promoting the international staging, new therapeutic techniques, and the sequential multiple modality treatment. (Chin J Ophthalmol, 2017, 53: 561 - 565).

  18. Optical-wireless-optical full link for polarization multiplexing quadrature amplitude/phase modulation signal transmission.

    PubMed

    Li, Xinying; Yu, Jianjun; Chi, Nan; Zhang, Junwen

    2013-11-15

    We propose and experimentally demonstrate an optical wireless integration system at the Q-band, in which up to 40 Gb/s polarization multiplexing multilevel quadrature amplitude/phase modulation (PM-QAM) signal can be first transmitted over 20 km single-mode fiber-28 (SMF-28), then delivered over a 2 m 2 × 2 multiple-input multiple-output wireless link, and finally transmitted over another 20 km SMF-28. The PM-QAM modulated wireless millimeter-wave (mm-wave) signal at 40 GHz is generated based on the remote heterodyning technique, and demodulated by the radio-frequency transparent photonic technique based on homodyne coherent detection and baseband digital signal processing. The classic constant modulus algorithm equalization is used at the receiver to realize polarization demultiplexing of the PM-QAM signal. For the first time, to the best of our knowledge, we realize the conversion of the PM-QAM modulated wireless mm-wave signal to the optical signal as well as 20 km fiber transmission of the converted optical signal.

  19. Automatic Generation of Caricatures with Multiple Expressions Using Transformative Approach

    NASA Astrophysics Data System (ADS)

    Liao, Wen-Hung; Lai, Chien-An

    The proliferation of digital cameras has changed the way we create and share photos. Novel forms of photo composition and reproduction have surfaced in recent years. In this paper, we present an automatic caricature generation system using transformative approaches. By combining facial feature detection, image segmentation and image warping/morphing techniques, the system is able to generate a stylized caricature using only one reference image. When more than one reference sample is available, the system can either choose the best fit based on shape matching, or synthesize a composite style using the polymorph technique. The system can also produce multiple expressions by controlling a subset of MPEG-4 facial animation parameters (FAP). Finally, to enable flexible manipulation of the synthetic caricature, we also investigate issues such as color quantization and raster-to-vector conversion. A major strength of our method is that the synthesized caricature bears a higher degree of resemblance to the real person than those produced by traditional component-based approaches.

  20. A Modal Approach to Compact MIMO Antenna Design

    NASA Astrophysics Data System (ADS)

    Yang, Binbin

    MIMO (Multiple-Input Multiple-Output) technology offers new possibilities for wireless communication through transmission over multiple spatial channels, and enables linear increases in spectral efficiency as the number of the transmitting and receiving antennas increases. However, the physical implementation of such systems in compact devices encounters many physical constraints mainly from the design of multi-antennas. First, an antenna's bandwidth decreases dramatically as its electrical size reduces, a fact known as antenna Q limit; secondly, multiple antennas closely spaced tend to couple with each other, undermining MIMO performance. Though different MIMO antenna designs have been proposed in the literature, there is still a lack of a systematic design methodology and knowledge of performance limits. In this dissertation, we employ characteristic mode theory (CMT) as a powerful tool for MIMO antenna analysis and design. CMT allows us to examine each physical mode of the antenna aperture, and to access its many physical parameters without even exciting the antenna. For the first time, we propose efficient circuit models for MIMO antennas of arbitrary geometry using this modal decomposition technique. Those circuit models demonstrate the powerful physical insight of CMT for MIMO antenna modeling, and simplify MIMO antenna design problem to just the design of specific antenna structural modes and a modal feed network, making possible the separate design of antenna aperture and feeds. We therefore develop a feed-independent shape synthesis technique for optimization of broadband multi-mode apertures. Combining the shape synthesis and circuit modeling techniques for MIMO antennas, we propose a shape-first feed-next design methodology for MIMO antennas, and designed and fabricated two planar MIMO antennas, each occupying an aperture much smaller than the regular size of lambda/2 x lambda/2. 
Facilitated by the newly developed source formulation for antenna stored energy and recently reported work on antenna Q factor minimization, we extend the minimum Q limit to antennas of arbitrary geometry, and show that, given an antenna aperture, any antenna design based on its substructure will result in minimum Q factors larger than or equal to that of the complete structure. This limit is much tighter than Chu's limit based on spherical modes, and applies to antennas of arbitrary geometry. Finally, considering the almost inevitable presence of mutual coupling effects within compact multiport antennas, we develop new decoupling networks (DN) and decoupling network synthesis techniques. An information-theoretic metric, the information mismatch loss (Gamma_info), is defined for DN characterization. Based on this metric, the optimization of decoupling networks for broadband system performance is conducted, which demonstrates the limitations of single-frequency decoupling techniques and the room for improvement.

  1. Vertical decomposition with Genetic Algorithm for Multiple Sequence Alignment

    PubMed Central

    2011-01-01

    Background Many Bioinformatics studies begin with a multiple sequence alignment as the foundation for their research. This is because multiple sequence alignment can be a useful technique for studying molecular evolution and analyzing sequence-structure relationships. Results In this paper, we have proposed a Vertical Decomposition with Genetic Algorithm (VDGA) for Multiple Sequence Alignment (MSA). In VDGA, we divide the sequences vertically into two or more subsequences, and then solve them individually using a guide tree approach. Finally, we combine all the subsequences to generate a new multiple sequence alignment. This technique is applied to the solutions of the initial generation and of each child generation within VDGA. We have used two mechanisms to generate an initial population in this research: the first generates guide trees with randomly selected sequences, and the second shuffles the sequences inside such trees. Two different genetic operators have been implemented with VDGA. To test the performance of our algorithm, we have compared it with existing well-known methods, namely PRRP, CLUSTALX, DIALIGN, HMMT, SB_PIMA, ML_PIMA, MULTALIGN, and PILEUP8, and also with other methods based on Genetic Algorithms (GA), such as SAGA, MSA-GA and RBT-GA, by solving a number of benchmark datasets from BAliBase 2.0. Conclusions The experimental results showed that the VDGA with three vertical divisions was the most successful variant for most of the test cases in comparison to other divisions considered with VDGA. The experimental results also confirmed that VDGA outperformed the other methods considered in this research. PMID:21867510

  2. Carrier Estimation Using Classic Spectral Estimation Techniques for the Proposed Demand Assignment Multiple Access Service

    NASA Technical Reports Server (NTRS)

    Scaife, Bradley James

    1999-01-01

    In any satellite communication, the Doppler shift associated with the satellite's position and velocity must be calculated in order to determine the carrier frequency. If the satellite state vector is unknown then some estimate must be formed of the Doppler-shifted carrier frequency. One elementary technique is to examine the signal spectrum and base the estimate on the dominant spectral component. If, however, the carrier is spread (as in most satellite communications) this technique may fail unless the chip rate-to-data rate ratio (processing gain) associated with the carrier is small. In this case, there may be enough spectral energy to allow peak detection against a noise background. In this thesis, we present a method to estimate the frequency (without knowledge of the Doppler shift) of a spread-spectrum carrier assuming a small processing gain and binary-phase shift keying (BPSK) modulation. Our method relies on an averaged discrete Fourier transform along with peak detection on spectral match filtered data. We provide theory and simulation results indicating the accuracy of this method. In addition, we will describe an all-digital hardware design based around a Motorola DSP56303 and high-speed A/D which implements this technique in real-time. The hardware design is to be used in NMSU's implementation of NASA's demand assignment, multiple access (DAMA) service.
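    The estimator described, an averaged discrete Fourier transform followed by peak detection, can be sketched as follows. This is a simplified illustration (no despreading or match filtering, and a direct O(N²) DFT rather than the DSP56303 implementation); the segment sizes and test tone are assumptions:

```python
import cmath
import math

def averaged_dft_peak(x, fs, seg_len, n_segs):
    """Estimate a carrier frequency by averaging DFT power spectra over
    several segments and picking the dominant bin (direct O(N^2) DFT)."""
    power = [0.0] * seg_len
    for s in range(n_segs):
        seg = x[s * seg_len:(s + 1) * seg_len]
        for k in range(seg_len):
            X = sum(v * cmath.exp(-2j * cmath.pi * k * n / seg_len)
                    for n, v in enumerate(seg))
            power[k] += abs(X) ** 2 / n_segs    # averaged periodogram
    half = seg_len // 2                          # real input: search 0..fs/2
    k_hat = max(range(1, half), key=lambda k: power[k])
    return k_hat * fs / seg_len

# Test tone at 1 kHz sampled at 8 kHz (noiseless for clarity)
fs, f0 = 8000.0, 1000.0
x = [math.cos(2 * math.pi * f0 * n / fs) for n in range(256)]
f_est = averaged_dft_peak(x, fs, seg_len=64, n_segs=4)
```

Averaging the per-segment power spectra lowers the variance of the noise floor, which is what lets a weak carrier bin stand out when the processing gain is small.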

  3. Genomic Physics. Multiple Laser Beam Treatment of Alzheimer's Disease

    NASA Astrophysics Data System (ADS)

    Stefan, V. Alexander

    2014-03-01

    The synapses affected by Alzheimer's disease can be rejuvenated by multiple ultrashort-wavelength laser beams.[2] The guiding lasers scan the whole area to detect the amyloid plaques based on the laser scattering technique. The scanning lasers pinpoint the areas with plaques and eliminate them. Laser interaction is highly efficient because of the focusing capabilities and the possibility of identifying the damaging proteins by matching the protein oscillation eigenfrequency with the laser frequency.[3] Supported by Nikola Tesla Labs, La Jolla, California, USA.

  4. Electrical test prediction using hybrid metrology and machine learning

    NASA Astrophysics Data System (ADS)

    Breton, Mary; Chao, Robin; Muthinti, Gangadhara Raja; de la Peña, Abraham A.; Simon, Jacques; Cepler, Aron J.; Sendelbach, Matthew; Gaudiello, John; Emans, Susan; Shifrin, Michael; Etzioni, Yoav; Urenski, Ronen; Lee, Wei Ti

    2017-03-01

    Electrical test measurement in the back-end of line (BEOL) is crucial for wafer and die sorting as well as comparing intended process splits. Any in-line, nondestructive technique in the process flow to accurately predict these measurements can significantly improve mean-time-to-detect (MTTD) of defects and improve cycle times for yield and process learning. Measuring after BEOL metallization is commonly done for process control and learning, particularly with scatterometry (also called OCD (Optical Critical Dimension)), which can solve for multiple profile parameters such as metal line height or sidewall angle and does so within patterned regions. This gives scatterometry an advantage over inline microscopy-based techniques, which provide top-down information, since such techniques can be insensitive to sidewall variations hidden under the metal fill of the trench. But when faced with correlation to electrical test measurements that are specific to the BEOL processing, both techniques face the additional challenge of sampling. Microscopy-based techniques are sampling-limited by their small probe size, while scatterometry is traditionally limited (for microprocessors) to scribe targets that mimic device ground rules but are not necessarily designed to be electrically testable. A solution to this sampling challenge lies in a fast reference-based machine learning capability that allows for OCD measurement directly of the electrically-testable structures, even when they are not OCD-compatible. By incorporating such direct OCD measurements, correlation to, and therefore prediction of, resistance of BEOL electrical test structures is significantly improved. Improvements in prediction capability for multiple types of in-die electrically-testable device structures is demonstrated. To further improve the quality of the prediction of the electrical resistance measurements, hybrid metrology using the OCD measurements as well as X-ray metrology (XRF) is used. 
Hybrid metrology is the practice of combining information from multiple sources in order to enable or improve the measurement of one or more critical parameters. Here, the XRF measurements are used to detect subtle changes in barrier layer composition and thickness that can have second-order effects on the electrical resistance of the test structures. By accounting for such effects with the aid of the X-ray-based measurements, further improvement in the OCD correlation to electrical test measurements is achieved. Using both types of solution, fast reference-based machine learning on non-OCD-compatible test structures and hybrid metrology combining OCD with XRF, improvement in BEOL cycle-time learning can be accomplished through improved prediction capability.

  5. An ANN-Based Smart Tomographic Reconstructor in a Dynamic Environment

    PubMed Central

    de Cos Juez, Francisco J.; Lasheras, Fernando Sánchez; Roqueñí, Nieves; Osborn, James

    2012-01-01

    In astronomy, the light emitted by an object travels through the vacuum of space and then the turbulent atmosphere before arriving at a ground-based telescope. In passing through the atmosphere, a series of turbulent layers modify the light's wave-front in such a way that Adaptive Optics reconstruction techniques are needed to improve the image quality. A novel reconstruction technique based on Artificial Neural Networks (ANN) is proposed. The network is designed to use the local tilts of the wave-front measured by a Shack-Hartmann Wave-front Sensor (SHWFS) as inputs and estimate the turbulence in terms of Zernike coefficients. The ANN used is a Multi-Layer Perceptron (MLP) trained on simulated data with one turbulent layer changing in altitude. The reconstructor was tested using three different atmospheric profiles and compared with two existing reconstruction techniques: Least Squares type Matrix Vector Multiplication (LS) and Learn and Apply (L + A). PMID:23012524

  6. An islanding detection methodology combining decision trees and Sandia frequency shift for inverter-based distributed generations

    DOE PAGES

    Azim, Riyasat; Li, Fangxing; Xue, Yaosuo; ...

    2017-07-14

    Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.

  7. An islanding detection methodology combining decision trees and Sandia frequency shift for inverter-based distributed generations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azim, Riyasat; Li, Fangxing; Xue, Yaosuo

    Distributed generations (DGs) for grid-connected applications require an accurate and reliable islanding detection methodology (IDM) for secure system operation. This paper presents an IDM for grid-connected inverter-based DGs. The proposed method is a combination of passive and active islanding detection techniques for aggregation of their advantages and elimination/minimisation of the drawbacks. In the proposed IDM, the passive method utilises critical system attributes extracted from local voltage measurements at target DG locations as well as employs decision tree-based classifiers for characterisation and detection of islanding events. The active method is based on Sandia frequency shift technique and is initiated only when the passive method is unable to differentiate islanding events from other system events. Thus, the power quality degradation introduced into the system by active islanding detection techniques can be minimised. Furthermore, a combination of active and passive techniques allows detection of islanding events under low power mismatch scenarios eliminating the disadvantage associated with the use of passive techniques alone. Finally, detailed case study results demonstrate the effectiveness of the proposed method in detection of islanding events under various power mismatch scenarios, load quality factors and in the presence of single or multiple grid-connected inverter-based DG units.

  8. Evolutionary Algorithm Based Feature Optimization for Multi-Channel EEG Classification.

    PubMed

    Wang, Yubo; Veluvolu, Kalyana C

    2017-01-01

    Most BCI systems that rely on EEG signals employ Fourier-based methods for time-frequency decomposition and feature extraction. The band-limited multiple Fourier linear combiner is well suited for such band-limited signals due to its real-time applicability. Despite the improved performance of these techniques in two-channel settings, their application to multiple-channel EEG is not straightforward and remains challenging. As more channels become available, a spatial filter is required to eliminate noise and preserve the useful information. Moreover, multiple-channel EEG adds high dimensionality to the frequency feature space, so feature selection is required to stabilize the performance of the classifier. In this paper, we develop a new method based on an Evolutionary Algorithm (EA) to solve these two problems simultaneously. The real-valued EA encodes both the spatial filter estimates and the feature selection into its solution and optimizes it with respect to the classification error. Three Fourier-based designs are tested in this paper. Our results show that the combination of the Fourier-based method with the covariance matrix adaptation evolution strategy (CMA-ES) has the best overall performance.
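
    The core idea of evolving a feature subset against classification error can be sketched with a much simpler stand-in than the record's method: a (1+1) bit-flip strategy over a feature mask (the paper uses CMA-ES and also encodes spatial-filter weights, which this toy omits), scored by a nearest-centroid classifier on synthetic data.

```python
import random

def fitness(mask, X, y):
    """Classification error of a nearest-centroid rule on selected features."""
    idx = [i for i, m in enumerate(mask) if m]
    if not idx:
        return 1.0
    cents = {}
    for c in set(y):
        rows = [x for x, lab in zip(X, y) if lab == c]
        cents[c] = [sum(r[i] for r in rows) / len(rows) for i in idx]
    errs = 0
    for x, lab in zip(X, y):
        pred = min(cents, key=lambda c: sum((x[i] - cents[c][j]) ** 2
                                            for j, i in enumerate(idx)))
        errs += pred != lab
    return errs / len(y)

def evolve_mask(X, y, n_feat, gens=200, seed=0):
    """(1+1) evolutionary strategy: mutate the best mask, keep the child
    if its error is no worse (ties accepted to keep the search moving)."""
    rng = random.Random(seed)
    best = [rng.random() < 0.5 for _ in range(n_feat)]
    best_f = fitness(best, X, y)
    for _ in range(gens):
        child = [b ^ (rng.random() < 1.0 / n_feat) for b in best]
        f = fitness(child, X, y)
        if f <= best_f:
            best, best_f = child, f
    return best, best_f
```

    On data where one feature separates the classes and the other is misleading noise, the search settles on the informative feature alone.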

  9. A Comparison of Accuracy of Matrix Impression System with Putty Reline Technique and Multiple Mix Technique: An In Vitro Study

    PubMed Central

    Kumar, M Praveen; Patil, Suneel G; Dheeraj, Bhandari; Reddy, Keshav; Goel, Dinker; Krishna, Gopi

    2015-01-01

    Background: The difficulty in obtaining an acceptable impression increases exponentially as the number of abutments increases. Accuracy of the impression material and the use of a suitable impression technique are of utmost importance in the fabrication of a fixed partial denture. This study compared the accuracy of the matrix impression system with conventional putty reline and multiple mix technique for individual dies by comparing the inter-abutment distance in the casts obtained from the impressions. Materials and Methods: Three groups, 10 impressions each with three impression techniques (matrix impression system, putty reline technique and multiple mix technique) were made of a master die. Typodont teeth were embedded in a maxillary frasaco model base. The left first premolar was removed to create a three-unit fixed partial denture situation and the left canine and second premolar were prepared conservatively, and hatch marks were made on the abutment teeth. The final casts obtained from the impressions were examined under a profile projector and the inter-abutment distance was calculated for all the casts and compared. Results: The results from this study showed that in the mesiodistal dimensions the percentage deviation from master model in Group I was 0.1 and 0.2, in Group II was 0.9 and 0.3, and Group III was 1.6 and 1.5, respectively. In the labio-palatal dimensions the percentage deviation from master model in Group I was 0.01 and 0.4, Group II was 1.9 and 1.3, and Group III was 2.2 and 2.0, respectively. In the cervico-incisal dimensions the percentage deviation from the master model in Group I was 1.1 and 0.2, Group II was 3.9 and 1.7, and Group III was 1.9 and 3.0, respectively. In the inter-abutment dimension of dies, percentage deviation from master model in Group I was 0.1, Group II was 0.6, and Group III was 1.0. 
Conclusion: The matrix impression system showed more accuracy of reproduction for individual dies when compared with putty reline technique and multiple mix technique in all the three directions, as well as the inter-abutment distance. PMID:26124599

  10. Information theoretic partitioning and confidence based weight assignment for multi-classifier decision level fusion in hyperspectral target recognition applications

    NASA Astrophysics Data System (ADS)

    Prasad, S.; Bruce, L. M.

    2007-04-01

    There is a growing interest in using multiple sources for automatic target recognition (ATR) applications. One approach is to take multiple, independent observations of a phenomenon and perform a feature level or a decision level fusion for ATR. This paper proposes a method to utilize these types of multi-source fusion techniques to exploit hyperspectral data when only a small number of training pixels are available. Conventional hyperspectral image based ATR techniques project the high dimensional reflectance signature onto a lower dimensional subspace using techniques such as Principal Components Analysis (PCA), Fisher's linear discriminant analysis (LDA), subspace LDA and stepwise LDA. While some of these techniques attempt to solve the curse of dimensionality, or small sample size problem, these are not necessarily optimal projections. In this paper, we present a divide and conquer approach to address the small sample size problem. The hyperspectral space is partitioned into contiguous subspaces such that the discriminative information within each subspace is maximized, and the statistical dependence between subspaces is minimized. We then treat each subspace as a separate source in a multi-source multi-classifier setup and test various decision fusion schemes to determine their efficacy. Unlike previous approaches which use correlation between variables for band grouping, we study the efficacy of higher order statistical information (using average mutual information) for a bottom up band grouping. We also propose a confidence measure based decision fusion technique, where the weights associated with various classifiers are based on their confidence in recognizing the training data. To this end, training accuracies of all classifiers are used for weight assignment in the fusion process of test pixels. 
The proposed methods are tested using hyperspectral data with known ground truth, such that the efficacy can be quantitatively measured in terms of target recognition accuracies.
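
    The confidence-based fusion rule described in this record amounts to a weighted vote in which each classifier's weight is its training accuracy. A minimal sketch (toy labels and accuracies, not the paper's hyperspectral pipeline):

```python
def fuse_decisions(votes, train_acc):
    """votes: per-classifier predicted labels; train_acc: per-classifier
    training accuracies used as confidence weights. Returns the label
    with the highest total weight."""
    scores = {}
    for label, w in zip(votes, train_acc):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)
```

    A single high-confidence classifier can outvote several weak ones, which is the intended behaviour when some subspaces are far more discriminative than others.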

  11. Determination of the plutonium content in a spent fuel assembly by passive and active interrogation using a differential die-away instrument

    NASA Astrophysics Data System (ADS)

    Henzl, V.; Croft, S.; Richard, J.; Swinhoe, M. T.; Tobin, S. J.

    2013-06-01

    In this paper, we present a novel approach to estimating the total plutonium content in a spent fuel assembly (SFA) that is based on combining information from a passive measurement of the total neutron count rate (PN) of the assayed SFA and a measure of its multiplication. While PN can be measured essentially with any non-destructive assay (NDA) technique capable of neutron detection, the measure of multiplication is, in our approach, determined by means of active interrogation using an instrument based on the Differential Die-Away technique (DDA). The DDA is a NDA technique developed within the U.S. Department of Energy's Next Generation Safeguards Initiative (NGSI) project focused on the utilization of NDA techniques to determine the elemental plutonium content in commercial nuclear SFA's [1]. This approach was adopted since DDA also allows determination of other SFA characteristics, such as burnup, initial enrichment, and cooling time, and also allows for detection of certain types of diversion of nuclear material. The quantification of total plutonium is obtained using an analytical correlation function in terms of the observed PN and active multiplication. Although somewhat similar approaches relating Pu content with PN have been adopted in the past, we demonstrate by extensive simulation of the fuel irradiation and NDA process that our analytical method is independent of explicit knowledge of the initial enrichment, burnup, and an absolute value of the SFA's reactivity (i.e. multiplication factor). We show that when tested with MCNPX™ simulations comprising the 64 SFA NGSI Spent Fuel Library-1 we were able to determine elemental plutonium content, using just a few calibration parameters, with an average variation in the prediction of around 1-2% across the wide dynamic range of irradiation history parameters used, namely initial enrichment (IE=2-5%), burnup (BU=15-60 GWd/tU) and cooling time (CT=1-80 y). 
In this paper we describe the basic approach and the success obtained against synthetic data. We recognize that our synthetic data may not fully capture the rich behavior of actual irradiated fuel and the uncertainties of the practical measurements. However, this design study is based on a rather complete nuclide inventory and the correlations for Pu seem robust to variation of input. Thus it is concluded that the proposed method is sufficiently promising that further experimentally based work is desirable.

  12. Neural-adaptive control of single-master-multiple-slaves teleoperation for coordinated multiple mobile manipulators with time-varying communication delays and input uncertainties.

    PubMed

    Li, Zhijun; Su, Chun-Yi

    2013-09-01

    In this paper, adaptive neural network control is investigated for single-master-multiple-slaves teleoperation in consideration of time delays and input dead-zone uncertainties for multiple mobile manipulators carrying a common object in a cooperative manner. Firstly, concise dynamics of teleoperation systems consisting of a single master robot, multiple coordinated slave robots, and the object are developed in the task space. To handle asymmetric time-varying delays in communication channels and unknown asymmetric input dead zones, the nonlinear dynamics of the teleoperation system are transformed into two subsystems through feedback linearization: local master or slave dynamics including the unknown input dead zones and delayed dynamics for the purpose of synchronization. Then, a model reference neural network control strategy based on linear matrix inequalities (LMI) and adaptive techniques is proposed. The developed control approach ensures that the defined tracking errors converge to zero whereas the coordination internal force errors remain bounded and can be made arbitrarily small. Throughout this paper, stability analysis is performed via explicit Lyapunov techniques under specific LMI conditions. The proposed adaptive neural network control scheme is robust against motion disturbances, parametric uncertainties, time-varying delays, and input dead zones, which is validated by simulation studies.

  13. A VIKOR Technique with Applications Based on DEMATEL and ANP

    NASA Astrophysics Data System (ADS)

    Ou Yang, Yu-Ping; Shieh, How-Ming; Tzeng, Gwo-Hshiung

    In multiple criteria decision making (MCDM) methods, the compromise ranking method (named VIKOR) was introduced as one applicable technique to implement within MCDM. It was developed for multicriteria optimization of complex systems. However, few papers discuss conflicting (competing) criteria with dependence and feedback in the compromise solution method. Therefore, this study proposes and provides applications for a novel model using the VIKOR technique based on DEMATEL and the ANP to solve the problem of conflicting criteria with dependence and feedback. In addition, this research also uses DEMATEL to normalize the unweighted supermatrix of the ANP to suit the real world. An example is also presented to illustrate the proposed method with applications thereof. The results show the proposed method is suitable and effective in real-world applications.
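
    The compromise-ranking core of VIKOR can be sketched as below. This is the standard VIKOR computation only; in the record's model the criterion weights would come from DEMATEL/ANP, whereas here they are supplied directly, and all criteria are assumed to be benefit criteria.

```python
def vikor(F, w, v=0.5):
    """F[i][j]: performance of alternative i on benefit criterion j
    (higher is better); w[j]: criterion weights; v: strategy weight.
    Returns the Q index per alternative (lower Q = better compromise)."""
    m, n = len(F), len(F[0])
    fstar = [max(F[i][j] for i in range(m)) for j in range(n)]   # best f*
    fminus = [min(F[i][j] for i in range(m)) for j in range(n)]  # worst f-
    S, R = [], []
    for i in range(m):
        terms = [w[j] * (fstar[j] - F[i][j]) / ((fstar[j] - fminus[j]) or 1)
                 for j in range(n)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    Ss, Sm, Rs, Rm = min(S), max(S), min(R), max(R)
    return [v * (S[i] - Ss) / ((Sm - Ss) or 1)
            + (1 - v) * (R[i] - Rs) / ((Rm - Rs) or 1) for i in range(m)]
```

    An alternative that is best on every criterion receives Q = 0 and ranks first, as expected of the compromise solution.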

  14. Application of stepwise multiple regression techniques to inversion of Nimbus 'IRIS' observations.

    NASA Technical Reports Server (NTRS)

    Ohring, G.

    1972-01-01

    Exploratory studies with Nimbus-3 infrared interferometer-spectrometer (IRIS) data indicate that, in addition to temperature, such meteorological parameters as geopotential heights of pressure surfaces, tropopause pressure, and tropopause temperature can be inferred from the observed spectra with the use of simple regression equations. The technique of screening the IRIS spectral data by means of stepwise regression to obtain the best radiation predictors of meteorological parameters is validated. The simplicity of application of the technique and the simplicity of the derived linear regression equations - which contain only a few terms - suggest usefulness for this approach. Based upon the results obtained, suggestions are made for further development and exploitation of the stepwise regression analysis technique.
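
    The screening idea above — stepwise regression picking the best radiation predictors — can be illustrated with a generic forward-selection sketch: greedily add the predictor that most reduces the residual sum of squares, stopping when the improvement is negligible. The pure-Python normal-equation solver is fine for a handful of predictors but is not a production least-squares routine.

```python
def _lstsq(X, y):
    """Solve the normal equations (X^T X) b = X^T y by Gauss-Jordan
    elimination with partial pivoting (assumes a nonsingular system)."""
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)]
         + [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(n):
            if r != col and A[col][col]:
                f = A[r][col] / A[col][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][n] / A[i][i] for i in range(n)]

def _rss(X, y, cols):
    """Residual sum of squares of an intercept + selected-column fit."""
    Xs = [[1.0] + [row[c] for c in cols] for row in X]
    b = _lstsq(Xs, y)
    return sum((yi - sum(bi * xi for bi, xi in zip(b, row))) ** 2
               for row, yi in zip(Xs, y))

def forward_stepwise(X, y, tol=1e-6):
    """Greedy forward selection: add the predictor giving the largest
    RSS reduction until no candidate improves the fit by more than tol."""
    chosen, rest = [], list(range(len(X[0])))
    best = _rss(X, y, chosen)
    while rest:
        cand = min(rest, key=lambda c: _rss(X, y, chosen + [c]))
        new = _rss(X, y, chosen + [cand])
        if best - new <= tol:
            break
        chosen.append(cand)
        rest.remove(cand)
        best = new
    return chosen
```

    With a response generated exactly from one column, only that column is selected and the second (uninformative) column is rejected.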

  15. Proposed low-energy absolute calibration of nuclear recoils in a dual-phase noble element TPC using D-D neutron scattering kinematics

    NASA Astrophysics Data System (ADS)

    Verbus, J. R.; Rhyne, C. A.; Malling, D. C.; Genecov, M.; Ghosh, S.; Moskowitz, A. G.; Chan, S.; Chapman, J. J.; de Viveiros, L.; Faham, C. H.; Fiorucci, S.; Huang, D. Q.; Pangilinan, M.; Taylor, W. C.; Gaitskell, R. J.

    2017-04-01

    We propose a new technique for the calibration of nuclear recoils in large noble element dual-phase time projection chambers used to search for WIMP dark matter in the local galactic halo. This technique provides an in situ measurement of the low-energy nuclear recoil response of the target media using the measured scattering angle between multiple neutron interactions within the detector volume. The low-energy reach and reduced systematics of this calibration have particular significance for the low-mass WIMP sensitivity of several leading dark matter experiments. Multiple strategies for improving this calibration technique are discussed, including the creation of a new type of quasi-monoenergetic neutron source with a minimum possible peak energy of 272 keV. We report results from a time-of-flight-based measurement of the neutron energy spectrum produced by an Adelphi Technology, Inc. DD108 neutron generator, confirming its suitability for the proposed nuclear recoil calibration.

  16. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also addressmore » interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.« less

  17. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis by referring to bit error rate (BER), signal-to-noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.

  18. Synthetic Minority Oversampling Technique and Fractal Dimension for Identifying Multiple Sclerosis

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-Dong; Zhang, Yin; Phillips, Preetha; Dong, Zhengchao; Wang, Shuihua

    Multiple sclerosis (MS) is a severe brain disease. Early detection can provide timely treatment. Fractal dimension can provide statistical index of pattern changes with scale at a given brain image. In this study, our team used susceptibility weighted imaging technique to obtain 676 MS slices and 880 healthy slices. We used synthetic minority oversampling technique to process the unbalanced dataset. Then, we used Canny edge detector to extract distinguishing edges. The Minkowski-Bouligand dimension was a fractal dimension estimation method and used to extract features from edges. Single hidden layer neural network was used as the classifier. Finally, we proposed a three-segment representation biogeography-based optimization to train the classifier. Our method achieved a sensitivity of 97.78±1.29%, a specificity of 97.82±1.60% and an accuracy of 97.80±1.40%. The proposed method is superior to seven state-of-the-art methods in terms of sensitivity and accuracy.
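
    The Minkowski-Bouligand (box-counting) feature used in this record can be sketched for 2-D edge maps: count occupied boxes N(s) at several box sizes s and fit log N(s) against log s. This is a generic estimator on toy point sets, not the paper's SWI pipeline.

```python
import math

def box_count_dimension(points, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting dimension of a set of 2-D pixel
    coordinates: count occupied boxes N(s) at each box size s and fit
    log N(s) = -D log s + c by least squares; return D."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(px // s, py // s) for px, py in points}
        xs.append(math.log(s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```

    A filled square of pixels gives dimension 2 and a straight line of pixels gives dimension 1, the expected sanity checks.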

  19. Urban Monitoring Based on SENTINEL-1 Data Using Permanent Scatterer Interferometry and SAR Tomography

    NASA Astrophysics Data System (ADS)

    Crosetto, M.; Budillon, A.; Johnsy, A.; Schirinzi, G.; Devanthéry, N.; Monserrat, O.; Cuevas-González, M.

    2018-04-01

    A great deal of research and development has been devoted to the exploitation of satellite SAR images for deformation measurement and monitoring purposes since Differential Interferometric Synthetic Aperture Radar (DInSAR) was first described in 1989. In this work, we consider two main classes of advanced DInSAR techniques: Persistent Scatterer Interferometry and Tomographic SAR. Both techniques make use of multiple SAR images acquired over the same site and advanced procedures to separate the deformation component from the other phase components, such as the residual topographic component, the atmospheric component, the thermal expansion component and the phase noise. TomoSAR offers the advantage of detecting both single scatterers presenting stable properties over time (Persistent Scatterers) and multiple scatterers interfering within the same range-azimuth resolution cell, a significant improvement for urban area monitoring. This paper addresses a preliminary inter-comparison of the results of both techniques for a test site located in the metropolitan area of Barcelona (Spain), where interferometric Sentinel-1 data were analysed.

  20. Multiple Vehicle Detection and Segmentation in Malaysia Traffic Flow

    NASA Astrophysics Data System (ADS)

    Fariz Hasan, Ahmad; Fikri Che Husin, Mohd; Affendi Rosli, Khairul; Norhafiz Hashim, Mohd; Faiz Zainal Abidin, Amar

    2018-03-01

    Vision-based systems are widely used in the field of Intelligent Transportation Systems (ITS) to extract a large amount of information for analyzing traffic scenes. The rapid growth in the number of vehicles on the road, as well as the significant increase in cameras, has dictated the need for traffic surveillance systems. Such a system can take over the burdensome tasks performed by human operators in traffic monitoring centres. The main technique proposed by this paper is a multiple vehicle detection and segmentation approach focused on monitoring through Closed Circuit Television (CCTV) video. The system is able to automatically segment vehicles extracted from heavy traffic scenes by optical flow estimation alongside a blob analysis technique in order to detect the moving vehicles. Prior to segmentation, the blob analysis technique computes the area of the interest region corresponding to each moving vehicle, which is used to create a bounding box on that particular vehicle. Experimental validation of the proposed system was performed and the algorithm is demonstrated on various sets of traffic scenes.
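
    The blob-analysis step described above — grouping foreground pixels and boxing each group — can be sketched as connected-component labelling on a binary motion mask. The optical-flow stage that produces the mask is omitted here; the mask below is a toy input.

```python
from collections import deque

def blob_bounding_boxes(mask, min_area=2):
    """Label 4-connected foreground blobs in a binary motion mask and
    return one bounding box (min_row, min_col, max_row, max_col) per
    blob whose pixel area reaches min_area."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                q = deque([(r, c)])
                seen[r][c] = True
                cells = []
                while q:  # breadth-first flood fill of one blob
                    y, x = q.popleft()
                    cells.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
                if len(cells) >= min_area:
                    ys = [y for y, _ in cells]
                    xs = [x for _, x in cells]
                    boxes.append((min(ys), min(xs), max(ys), max(xs)))
    return boxes
```

    The min_area filter plays the role of discarding small noise blobs before drawing vehicle bounding boxes.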

  1. Space vehicle engine and heat shield environment review. Volume 1: Engineering analysis

    NASA Technical Reports Server (NTRS)

    Mcanelly, W. B.; Young, C. T. K.

    1973-01-01

    Methods for predicting the base heating characteristics of a multiple rocket engine installation are discussed. The environmental data are applied to the design of an adequate protection system for the engine components. The methods for predicting the base region thermal environment are categorized as: (1) scale model testing, (2) extrapolation of previous and related flight test results, and (3) semiempirical analytical techniques.

  2. Multiplanar visualization in 3D transthoracic echocardiography for precise delineation of mitral valve pathology.

    PubMed

    Kuppahally, Suman S; Paloma, Allan; Craig Miller, D; Schnittger, Ingela; Liang, David

    2008-01-01

    A novel multiplanar reformatting (MPR) technique in three-dimensional transthoracic echocardiography (3D TTE) was used to precisely localize the prolapsed lateral segment of posterior mitral valve leaflet in a patient symptomatic with mitral valve prolapse (MVP) and moderate mitral regurgitation (MR) before undergoing mitral valve repair surgery. Transesophageal echocardiography was avoided based on the findings of this new technique by 3D TTE. It was noninvasive, quick, reproducible and reliable. Also, it did not need the time-consuming reconstruction of multiple cardiac images. Mitral valve repair surgery was subsequently performed based on the MPR findings and corroborated the findings from the MPR examination.

  3. Amplitude and intensity spatial interferometry; Proceedings of the Meeting, Tucson, AZ, Feb. 14-16, 1990

    NASA Technical Reports Server (NTRS)

    Breckinridge, Jim B. (Editor)

    1990-01-01

    Attention is given to such topics as ground interferometers, space interferometers, speckle-based and interferometry-based astronomical observations, adaptive and atmospheric optics, speckle techniques, and instrumentation. Particular papers are presented concerning recent progress on the IR Michelson array; the IOTA interferometer project; a space interferometer concept for the detection of extrasolar earth-like planets; IR speckle imaging at Palomar; optical diameters of stars measured with the Mt. Wilson Mark III interferometer; the IR array camera for interferometry with the cophased Multiple Mirror Telescope; optimization techniques applied to the bispectrum of one-dimensional IR astronomical speckle data; and adaptive optical imaging for extended objects.

  4. Multiple Small Diameter Drillings Increase Femoral Neck Stability Compared with Single Large Diameter Femoral Head Core Decompression Technique for Avascular Necrosis of the Femoral Head.

    PubMed

    Brown, Philip J; Mannava, Sandeep; Seyler, Thorsten M; Plate, Johannes F; Van Sikes, Charles; Stitzel, Joel D; Lang, Jason E

    2016-10-26

    Femoral head core decompression is an efficacious joint-preserving procedure for treatment of early stage avascular necrosis. However, postoperative fractures have been described which may be related to the decompression technique used. Femoral head decompressions were performed on 12 matched human cadaveric femora comparing a large 8 mm single bore versus multiple 3 mm small drilling techniques. Ultimate failure strength of the femora was tested using a servo-hydraulic material testing system. Ultimate load to failure was compared between the different decompression techniques using two paired ANCOVA linear regression models. Prior to biomechanical testing and after the intervention, volumetric bone mineral density was determined using quantitative computed tomography to account for variation between cadaveric samples and to assess the amount of bone disruption by the core decompression. Core decompression using the small diameter bore and multiple drilling technique withstood significantly greater load prior to failure compared with the single large bore technique after adjustment for bone mineral density (p < 0.05). The 8 mm single bore technique removed a significantly larger volume of bone compared to the 3 mm multiple drilling technique (p < 0.001). However, total fracture energy was similar between the two core decompression techniques. When considering core decompression for the treatment of early stage avascular necrosis, the multiple small bore technique removed less bone volume, thereby potentially leading to higher load to failure.

  5. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klumpp, John

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution.
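
    The learn-then-detect loop described above can be caricatured with a simple empirical stand-in: accumulate background counts in a sorted list ("learning mode") and flag a new count whose empirical tail probability falls below a threshold ("detection mode"). The record's full Bayesian multi-channel machinery is not reproduced; this is a deliberately simplified nonparametric sketch.

```python
import bisect

class BackgroundMonitor:
    def __init__(self, alpha=0.01):
        self.alpha = alpha
        self.samples = []  # sorted past background counts

    def learn(self, count):
        """Learning mode: fold one background measurement into the model."""
        bisect.insort(self.samples, count)

    def p_value(self, count):
        """Empirical tail probability of seeing >= count under background,
        with +1 smoothing so an unprecedented count is never exactly 0."""
        i = bisect.bisect_left(self.samples, count)
        return (len(self.samples) - i + 1) / (len(self.samples) + 1)

    def is_source(self, count):
        """Detection mode: flag counts that background rarely produces."""
        return self.p_value(count) < self.alpha
```

    Because the model is just the stored sample set, it automatically tracks site-specific and time-varying background, which is the motivation given in the record.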

  6. Performance of coincidence-based PSD on LiF/ZnS Detectors for Multiplicity Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, Sean M.; Stave, Sean C.; Lintereur, Azaree

    Mass accountancy measurement is a nuclear nonproliferation application which utilizes coincidence and multiplicity counters to verify special nuclear material declarations. With a well-designed and efficient detector system, several relevant parameters of the material can be verified simultaneously. 6LiF/ZnS scintillating sheets may be used for this purpose due to a combination of high efficiency and short die-away times in systems designed with this material, but involve choices of detector geometry and exact material composition (e.g., the addition of Ni-quenching in the material) that must be optimized for the application. Multiplicity counting for verification of declared nuclear fuel mass involves neutron detection in conditions where several neutrons arrive in a short time window, with confounding gamma rays. This paper considers coincidence-based Pulse-Shape Discrimination (PSD) techniques developed to work under conditions of high pileup, and the performance of these algorithms with different detection materials. Simulated and real data from modern LiF/ZnS scintillator systems are evaluated with these techniques and the relationship between the performance under pileup and material characteristics (e.g., neutron peak width and total light collection efficiency) is determined, to allow for an optimal choice of detector and material.

  7. Preliminary study on X-ray fluorescence computed tomography imaging of gold nanoparticles: Acceleration of data acquisition by multiple pinholes scheme

    NASA Astrophysics Data System (ADS)

    Sasaya, Tenta; Sunaguchi, Naoki; Seo, Seung-Jum; Hyodo, Kazuyuki; Zeniya, Tsutomu; Kim, Jong-Ki; Yuasa, Tetsuya

    2018-04-01

    Gold nanoparticles (GNPs) have recently attracted attention in nanomedicine as novel contrast agents for cancer imaging. A decisive tomographic imaging technique has not yet been established to depict the 3-D distribution of GNPs in an object. An imaging technique known as pinhole-based X-ray fluorescence computed tomography (XFCT) is a promising method that can be used to reconstruct the distribution of GNPs from the X-ray fluorescence emitted by GNPs. We address the acceleration of data acquisition in pinhole-based XFCT for preclinical use using a multiple pinhole scheme. In this scheme, multiple projections are simultaneously acquired through a multi-pinhole collimator with a 2-D detector and full-field volumetric beam to enhance the signal-to-noise ratio of the projections; this enables fast data acquisition. To demonstrate the efficacy of this method, we performed an imaging experiment using a physical phantom with an actual multi-pinhole XFCT system that was constructed using the beamline AR-NE7A at KEK. The preliminary study showed that the multi-pinhole XFCT achieved a data acquisition time of 20 min at a theoretical detection limit of approximately 0.1 mg Au/ml and at a spatial resolution of 0.4 mm.

  8. Multiple response optimization of processing and formulation parameters of Eudragit RL/RS-based matrix tablets for sustained delivery of diclofenac.

    PubMed

    Elzayat, Ehab M; Abdel-Rahman, Ali A; Ahmed, Sayed M; Alanazi, Fars K; Habib, Walid A; Sakr, Adel

    2017-11-01

    Multiple response optimization is an efficient technique for developing a sustained release formulation while decreasing the number of experiments required by a trial-and-error approach. Diclofenac matrix tablets were optimized to achieve a release profile conforming to the USP monograph, matching Voltaren® SR and withstanding formulation variables. The percent of drug released at predetermined multiple time points were the response variables in the design. Statistical models were obtained, with relative contour diagrams being overlaid to predict process and formulation parameters expected to produce the target release profile. Tablets were prepared by wet granulation using a mixture of equivalent quantities of Eudragit RL/RS at an overall polymer concentration of 10-30% w/w and compressed at 5-15 kN. Drug release from the optimized formulation E4 (15% w/w, 15 kN) was similar to Voltaren, conformed to the USP monograph and was found to be stable. Substituting lactose with mannitol, reversing the ratio between lactose and microcrystalline cellulose, or increasing drug load showed no significant difference in drug release. Using dextromethorphan hydrobromide as a model soluble drug showed burst release due to higher solubility and the formation of micro cavities. A numerical optimization technique was employed to develop a stable, consistent, promising formulation for sustained delivery of diclofenac.

  9. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system has been created. The qualitative model describes the effects of seal failures on the system steady-state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.

  10. Rocket engine diagnostics using qualitative modeling techniques

    NASA Technical Reports Server (NTRS)

    Binder, Michael; Maul, William; Meyer, Claudia; Sovie, Amy

    1992-01-01

    Researchers at NASA Lewis Research Center are presently developing qualitative modeling techniques for automated rocket engine diagnostics. A qualitative model of a turbopump interpropellant seal system was created. The qualitative model describes the effects of seal failures on the system steady state behavior. This model is able to diagnose the failure of particular seals in the system based on anomalous temperature and pressure values. The anomalous values input to the qualitative model are generated using numerical simulations. Diagnostic test cases include both single and multiple seal failures.

  11. Plenoptic particle image velocimetry with multiple plenoptic cameras

    NASA Astrophysics Data System (ADS)

    Fahringer, Timothy W.; Thurow, Brian S.

    2018-07-01

    Plenoptic particle image velocimetry was recently introduced as a viable three-dimensional, three-component velocimetry technique based on light field cameras. One of the main benefits of this technique is its single-camera configuration, which allows the technique to be applied in facilities with limited optical access. The main drawback of this configuration is decreased accuracy in the out-of-plane dimension. This work presents a solution: the addition of a second plenoptic camera in a stereo-like configuration. A framework for reconstructing volumes with multiple plenoptic cameras is presented, comprising volumetric calibration and four reconstruction algorithms: integral refocusing, filtered refocusing, multiplicative refocusing, and MART. It is shown that the addition of a second camera improves the reconstruction quality and removes the ‘cigar’-like elongation associated with the single-camera system, while adding a third camera provides minimal further improvement. Reconstruction quality is further quantified as a function of reconstruction algorithm, particle density, number of cameras, camera separation angle, voxel size, and common image noise sources. In addition, a synthetic Gaussian ring vortex is used to compare the accuracy of the single- and two-camera configurations. It was determined that the addition of a second camera reduces the RMSE velocity error from 1.0 to 0.1 voxels in depth and from 0.2 to 0.1 voxels in the lateral spatial directions. Finally, the technique is applied experimentally on a ring vortex, and comparisons are drawn among the four presented reconstruction algorithms: MART and multiplicative refocusing produced the cleanest vortex structure and the least shot-to-shot variability; filtered refocusing produced the desired structure, albeit with more noise and variability; and integral refocusing struggled to produce a coherent vortex ring.
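
    For reference, the multiplicative update at the heart of MART can be sketched on a toy linear system; the random weight matrix stands in for the camera ray geometry, and the relaxation factor is an assumption of this sketch.

```python
import numpy as np

# Minimal MART (multiplicative algebraic reconstruction technique) sketch
# on a toy consistent system g = W f, with nonnegative weights.
rng = np.random.default_rng(0)
n_vox, n_rays = 20, 60
W = rng.random((n_rays, n_vox))          # ray/voxel weighting matrix
f_true = rng.random(n_vox)               # "intensity" field to recover
g = W @ f_true                           # simulated projections

f = np.ones(n_vox)                       # positive initial guess
mu = 0.5                                 # relaxation factor (assumed)
for _ in range(200):                     # sweeps over all rays
    for i in range(n_rays):
        ratio = g[i] / (W[i] @ f)
        # Multiplicative update keeps the reconstruction nonnegative
        f *= ratio ** (mu * W[i] / W[i].max())

residual = np.linalg.norm(W @ f - g) / np.linalg.norm(g)
print("relative projection residual:", residual)
```

    The multiplicative form is what distinguishes MART from additive ART: voxels can never go negative, which suits particle intensity fields.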

  12. Fuzzy neural network technique for system state forecasting.

    PubMed

    Li, Dezhi; Wang, Wilson; Ismail, Fathy

    2013-10-01

    In many system state forecasting applications, the prediction is performed based on multiple datasets, each corresponding to a distinct system condition. Traditional methods for dealing with multiple datasets (e.g., vector autoregressive moving average models and neural networks) have shortcomings such as limited modeling capability and opaque reasoning operations. To tackle these problems, a novel fuzzy neural network (FNN) is proposed in this paper to effectively extract information from multiple datasets and thereby improve forecasting accuracy. The proposed predictor consists of both autoregressive (AR) modeling nodes and nonlinear modeling nodes: the AR nodes capture the linear correlation of the datasets, while the nonlinear correlation is modeled with nonlinear neuron nodes. A novel particle swarm technique [the Laplace particle swarm (LPS) method] is proposed to facilitate parameter estimation for the predictor and improve modeling accuracy. The effectiveness of the developed FNN predictor and the associated LPS method is verified by a series of tests on Mackey-Glass data forecasting, exchange rate prediction, and gear system prognosis. Test results show that the developed FNN predictor and the LPS method can capture the dynamics of multiple datasets effectively and track system characteristics accurately.

  13. A Neutron Multiplicity Meter for Deep Underground Muon-Induced High Energy Neutron Measurements

    NASA Astrophysics Data System (ADS)

    Hennings-Yeomans, Raul; Akerib, Daniel

    2007-04-01

    The nature of dark matter is one of the most important outstanding issues in particle physics, cosmology, and astrophysics. A leading hypothesis is that Weakly Interacting Massive Particles, or WIMPs, were produced in the early universe and make up the dark matter. WIMP searches must be performed underground to shield from cosmic rays, which produce secondary particles that could fake a WIMP signal. Nuclear recoils from fast neutrons in underground laboratories are one of the most challenging backgrounds to WIMP detection. We present, for the first time, the design of an instrument capable of measuring the high-energy (>60 MeV) muon-induced neutron flux deep underground. The instrument is based on applying the Gd-loaded liquid-scintillator technique to measure the rate of multiple low-energy neutron events produced in a Pb target, and from this measurement to infer the rate of high-energy neutron events. This unique signature allows both efficient tagging of neutron-multiplicity events and rejection of random gamma backgrounds, so effectively that typical low-background techniques are not required. We will also discuss the benefits of using a neutron multiplicity meter as a component of active shielding.

  14. Chemical and Biological Dynamics Using Droplet-Based Microfluidics.

    PubMed

    Dressler, Oliver J; Casadevall I Solvas, Xavier; deMello, Andrew J

    2017-06-12

    Recent years have witnessed an increased use of droplet-based microfluidic techniques in a wide variety of chemical and biological assays. Nevertheless, obtaining dynamic data from these platforms has remained challenging, as this often requires reading the same droplets (possibly thousands of them) multiple times over a wide range of intervals (from milliseconds to hours). In this review, we introduce the elemental techniques for the formation and manipulation of microfluidic droplets, together with the most recent developments in these areas. We then discuss a wide range of analytical methods that have been successfully adapted for analyte detection in droplets. Finally, we highlight a diversity of studies where droplet-based microfluidic strategies have enabled the characterization of dynamic systems that would otherwise have remained unexplorable.

  15. Learner-Centered Environments: Creating Effective Strategies Based on Student Attitudes and Faculty Reflection

    ERIC Educational Resources Information Center

    Bishop, Catharine F.; Caston, Michael I.; King, Cheryl A.

    2014-01-01

    Learner-centered environments effectively implement multiple teaching techniques to enhance students' higher education experience and provide them with greater control over their academic learning. This qualitative study involves an exploration of the eight reasons for learner-centered teaching found in Terry Doyle's 2008 book, "Helping…

  16. Multiple microbial activity-based measures reflect effects of cover cropping and tillage on soils

    USDA-ARS?s Scientific Manuscript database

    Agricultural producers, conservation professionals, and policy makers are eager to learn of soil analytical techniques and data that document improvement in soil health by agricultural practices such as no-till and incorporation of cover crops. However, there is considerable uncertainty within the r...

  17. Multiple input electrode gap controller

    DOEpatents

    Hysinger, C.L.; Beaman, J.J.; Melgaard, D.K.; Williamson, R.L.

    1999-07-27

    A method and apparatus for controlling vacuum arc remelting (VAR) furnaces by estimation of electrode gap based on a plurality of secondary estimates derived from furnace outputs. The estimation is preferably performed by Kalman filter. Adaptive gain techniques may be employed, as well as detection of process anomalies such as glows. 17 figs.

  18. Multiple input electrode gap controller

    DOEpatents

    Hysinger, Christopher L.; Beaman, Joseph J.; Melgaard, David K.; Williamson, Rodney L.

    1999-01-01

    A method and apparatus for controlling vacuum arc remelting (VAR) furnaces by estimation of electrode gap based on a plurality of secondary estimates derived from furnace outputs. The estimation is preferably performed by Kalman filter. Adaptive gain techniques may be employed, as well as detection of process anomalies such as glows.
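
    The fusion idea can be sketched with a scalar Kalman filter combining several noisy secondary gap estimates; the random-walk gap model and the noise variances below are illustrative assumptions, not the patent's values.

```python
import numpy as np

# Scalar Kalman filter fusing three secondary electrode-gap estimates.
rng = np.random.default_rng(2)

true_gap = 1.0          # cm, held constant here for simplicity
q = 1e-6                # process noise variance (slow gap drift, assumed)
r = np.array([0.05, 0.10, 0.20]) ** 2  # variances of 3 secondary estimates

x, p = 0.5, 1.0         # initial state estimate and covariance
for _ in range(200):
    # Predict step for a random-walk gap model
    p += q
    # Sequentially update with each secondary estimate
    for ri in r:
        z = true_gap + rng.standard_normal() * np.sqrt(ri)
        k = p / (p + ri)          # Kalman gain
        x += k * (z - x)          # blend measurement into the estimate
        p *= (1 - k)              # shrink the covariance

print("fused gap estimate:", x)
```

    Sequential scalar updates like these are equivalent to a vector measurement update, which keeps the controller cheap enough for real-time furnace use; an adaptive-gain variant would additionally inflate `ri` when a glow or other anomaly is detected.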

  19. US EPA'S LANDSCAPE ECOLOGY RESEARCH: ASSESSING TRENDS FOR WETLANDS AND SURFACE WATERS USING REMOTE SENSING, GIS, AND FIELD-BASED TECHNIQUES

    EPA Science Inventory

    The US EPA, Environmental Sciences Division-Las Vegas is using a variety of geospatial and statistical modeling approaches to locate and assess the complex functions of wetland ecosystems. These assessments involve measuring landscape characteristics and change, at multiple s...

  20. Music Composition in the High School Curriculum: A Multiple Case Study

    ERIC Educational Resources Information Center

    Menard, Elizabeth A.

    2015-01-01

    Student and teacher perceptions regarding composition instruction were investigated using case study techniques in two high school music programs: a general music program providing accelerated instruction to gifted musicians in small classes and a typical performance-based band program. Students in both programs participated in a composition…

  1. Development of a Multiplexed Liquid Chromatography Multiple-Reaction-Monitoring Mass Spectrometry (LC-MRM/MS) Method for Evaluation of Salivary Proteins as Oral Cancer Biomarkers.

    PubMed

    Chen, Yi-Ting; Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting

    2017-05-01

    Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  2. Multimodal Neuroimaging: Basic Concepts and Classification of Neuropsychiatric Diseases.

    PubMed

    Tulay, Emine Elif; Metin, Barış; Tarhan, Nevzat; Arıkan, Mehmet Kemal

    2018-06-01

    Neuroimaging techniques are widely used in neuroscience to visualize neural activity, to improve our understanding of brain mechanisms, and to identify biomarkers, especially for psychiatric diseases; however, each neuroimaging technique has several limitations. These limitations led to the development of multimodal neuroimaging (MN), which combines data obtained from multiple neuroimaging techniques, such as electroencephalography and functional magnetic resonance imaging, and yields more detailed information about brain dynamics. There are several types of MN, including visual inspection, data integration, and data fusion. This literature review aimed to provide a brief summary and basic information about MN techniques (data fusion approaches in particular) and classification approaches. Data fusion approaches are generally categorized as asymmetric or symmetric. The present review focused exclusively on studies based on symmetric data fusion methods (data-driven methods), such as independent component analysis and principal component analysis. Machine learning techniques have recently been introduced for identifying diseases and biomarkers of disease. The machine learning technique most widely used by neuroscientists is classification, especially support vector machine classification. Several studies differentiated patients with psychiatric diseases from healthy controls using combined datasets. The common conclusion among these studies is that the prediction of disease improves when data are combined via MN techniques; however, a few challenges remain with MN, such as sample size. Perhaps in the future, N-way fusion can be used to combine multiple neuroimaging techniques or non-imaging predictors (e.g., cognitive ability) to overcome the limitations of MN.
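
    A minimal sketch of symmetric (data-driven) fusion, using synthetic stand-ins for the two modalities: z-score each modality so neither dominates, concatenate features, and extract joint components with PCA via the SVD.

```python
import numpy as np

rng = np.random.default_rng(3)

n_subjects = 40
shared = rng.standard_normal((n_subjects, 1))   # shared latent factor
# Two synthetic "modalities" (e.g., EEG-like and fMRI-like features)
eeg = shared @ rng.standard_normal((1, 10)) \
    + 0.1 * rng.standard_normal((n_subjects, 10))
fmri = shared @ rng.standard_normal((1, 15)) \
    + 0.1 * rng.standard_normal((n_subjects, 15))

def zscore(a):
    """Standardize each feature column to zero mean, unit variance."""
    return (a - a.mean(0)) / a.std(0)

joint = np.hstack([zscore(eeg), zscore(fmri)])

# PCA by SVD of the centred joint matrix
u, s, vt = np.linalg.svd(joint - joint.mean(0), full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print("variance explained by first joint component:", explained[0])
```

    When a factor is genuinely shared across modalities, it surfaces as a dominant joint component; subject scores on such components are the typical inputs to an SVM classifier in the studies reviewed.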

  3. Optical multiple access techniques for on-board routing

    NASA Technical Reports Server (NTRS)

    Mendez, Antonio J.; Park, Eugene; Gagliardi, Robert M.

    1992-01-01

    The purpose of this research contract was to design and analyze an optical multiple access system, based on Code Division Multiple Access (CDMA) techniques, for on-board routing applications on a future communication satellite. The optical multiple access system was to effect the functions of a circuit switch under the control of an autonomous network controller and to serve eight (8) concurrent users at a point-to-point (port-to-port) data rate of 180 Mb/s. (At the start of this program, the bit error rate (BER) requirement was undefined, so it was treated as a design variable during the contract effort.) CDMA was selected over other multiple access techniques because it lends itself to bursty, asynchronous, concurrent communication and potentially can be implemented with off-the-shelf, reliable optical transceivers compatible with long-term unattended operations. Temporal, temporal/spatial hybrid, and single pulse per row (SPR, sometimes termed 'sonar matrices') matrix types of CDMA designs were considered. The design, analysis, and trade-offs required by the statement of work selected a temporal/spatial CDMA scheme with SPR properties as the preferred solution. This selected design can be implemented for feasibility demonstration with off-the-shelf components (which are identified in the bill of materials of the contract Final Report). The photonic network architecture of the selected design is based on M(8,4,4) matrix codes. The network requires eight multimode laser transmitters with laser pulses of 0.93 ns operating at 180 Mb/s and 9-13 dBm peak power, and 8 PIN diode receivers with sensitivity of -27 dBm for the 0.93 ns pulses. The wavelength is not critical, but 830 nm technology readily meets the requirements. The passive optical components of the photonic network are all multimode and off-the-shelf.
Bit error rate (BER) computations, based on both electronic noise and intercode crosstalk, predict a raw BER of 10^-3 when all eight users are communicating concurrently. If better BER performance is required, then error correction codes (ECC) using near-term electronic technology can be used. For example, the M(8,4,4) optical code together with Reed-Solomon (54,38,8) encoding provides a BER of better than 10^-11. The optical transceiver must then operate at 256 Mb/s with pulses of 0.65 ns because the 'bits' are now channel symbols.
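
    The error-correction claim can be sanity-checked with a binomial tail bound: a Reed-Solomon (54,38) code corrects up to t = 8 symbol errors per codeword. The 8-bit channel-symbol size below is an assumption of this sketch, and the result is an order-of-magnitude check only (per-bit error after decoding depends further on interleaving and symbol layout).

```python
from math import comb

# Probability that a RS(54,38) codeword is uncorrectable at raw BER 1e-3.
raw_ber = 1e-3
bits_per_symbol = 8                        # assumption, not stated above
p_sym = 1 - (1 - raw_ber) ** bits_per_symbol   # symbol error probability

n, k, t = 54, 38, 8                        # code length, dimension, correction
# Codeword fails when more than t of its n symbols are in error
p_uncorrectable = sum(
    comb(n, i) * p_sym ** i * (1 - p_sym) ** (n - i)
    for i in range(t + 1, n + 1)
)
print(f"P(uncorrectable codeword) ~ {p_uncorrectable:.2e}")
```

    Even this crude bound drops the error probability by many orders of magnitude relative to the raw 10^-3 channel, consistent in spirit with the report's ECC figures.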

  4. Distributed Space Mission Design for Earth Observation Using Model-Based Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; LeMoigne-Stewart, Jacqueline; Cervantes, Ben; DeWeck, Oliver

    2015-01-01

    Distributed Space Missions (DSMs) are gaining momentum in their application to earth observation missions owing to their unique ability to increase observation sampling in multiple dimensions. DSM design is a complex problem with many design variables; multiple objectives determining performance and cost; and emergent, often unexpected, behaviors. There are very few open-access tools available to explore the tradespace of variables, minimize cost and maximize performance for pre-defined science goals, and thereby select the optimal design. This paper presents a software tool that can generate multiple DSM architectures from pre-defined design variable ranges and size those architectures in terms of pre-defined science and cost metrics. The tool will help a user select Pareto-optimal DSM designs based on design-of-experiments techniques. The tool will be applied to some earth observation examples to demonstrate its applicability in making key decisions between different performance metrics and cost metrics early in the design lifecycle.
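
    The Pareto-selection step reduces to a standard dominance filter: keep an architecture only if no other architecture is at least as good in every objective and strictly better in one. The cost/performance numbers below are hypothetical, not outputs of the tool.

```python
import numpy as np

def pareto_front(points):
    """Indices of non-dominated rows of an (n, m) array; minimize all columns."""
    keep = []
    for i, p in enumerate(points):
        dominated = any(
            np.all(q <= p) and np.any(q < p)
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical architectures: (cost in M$, negated coverage metric),
# both to be minimized.
archs = np.array([
    [10.0, -0.50],
    [12.0, -0.70],
    [12.0, -0.60],   # dominated by the architecture above it
    [20.0, -0.90],
    [25.0, -0.85],   # dominated: costs more, performs worse
])
front = pareto_front(archs)
print("Pareto-optimal architecture indices:", front)
```

    Negating a to-be-maximized metric turns the whole comparison into minimization, which keeps the dominance test uniform across objectives.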

  5. Three-Dimensional Registration for Handheld Profiling Systems Based on Multiple Shot Structured Light

    PubMed Central

    Ayaz, Shirazi Muhammad; Kim, Min Young

    2018-01-01

    In this article, a multi-view registration approach for a 3D handheld profiling system based on the multiple-shot structured light technique is proposed. The multi-view registration approach comprises coarse registration and point cloud refinement using the iterative closest point (ICP) algorithm. Coarse registration of multiple point clouds was performed using relative orientation and translation parameters estimated via homography-based visual navigation. The proposed system was evaluated using an artificial human skull and a paper box object. For the quantitative evaluation of the accuracy of a single 3D scan, a paper box was reconstructed, and the mean errors in its height and breadth were found to be 9.4 μm and 23 μm, respectively. A comprehensive quantitative evaluation and comparison of the proposed algorithm was performed against other variants of ICP. The root mean square error for the ICP algorithm registering a pair of point clouds of the skull object was also found to be less than 1 mm. PMID:29642552
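
    A minimal point-to-point ICP refinement, of the kind whose variants the paper compares, can be sketched on synthetic data: brute-force nearest-neighbour matching plus a rigid fit per iteration via the Kabsch SVD solution.

```python
import numpy as np

rng = np.random.default_rng(4)

def best_rigid(A, B):
    """Least-squares R, t such that B ~ A @ R.T + t (Kabsch algorithm)."""
    ca, cb = A.mean(0), B.mean(0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cb - R @ ca

def icp(src, dst, iters=20):
    """Point-to-point ICP with brute-force nearest-neighbour matching."""
    cur = src.copy()
    for _ in range(iters):
        idx = np.argmin(((cur[:, None] - dst[None]) ** 2).sum(-1), axis=1)
        R, t = best_rigid(cur, dst[idx])
        cur = cur @ R.T + t
    return cur

# Synthetic cloud, slightly rotated and translated
pts = rng.random((200, 3))
theta = 0.1                                  # small misalignment (rad)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
moved = pts @ Rz.T + np.array([0.05, -0.03, 0.02])

rmse_before = np.sqrt(((pts - moved) ** 2).sum(1).mean())
aligned = icp(pts, moved)
rmse_after = np.sqrt(((aligned - moved) ** 2).sum(1).mean())
print("RMSE:", rmse_before, "->", rmse_after)
```

    The coarse registration from visual navigation plays the role of the small initial misalignment here: ICP only refines reliably when started near the solution.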

  6. Neural-Fuzzy model Based Steel Pipeline Multiple Cracks Classification

    NASA Astrophysics Data System (ADS)

    Elwalwal, Hatem Mostafa; Mahzan, Shahruddin Bin Hj.; Abdalla, Ahmed N.

    2017-10-01

    While pipes are cheaper than other means of transportation, this cost saving comes with a major price: pipes are subject to cracks, corrosion, etc., which in turn can cause leakage and environmental damage. In this paper, a Neural-Fuzzy model for multiple-crack classification based on guided Lamb waves is presented. Simulation results for 42 samples were collected using ANSYS software. The research combines numerical simulation and experimental study, aiming to find an effective way to detect and localize crack and hole defects in the main body of a pipeline, considering the multiple cracks and holes that may exist in it, and to determine their respective positions in the steel pipe. The technique used in this research is a guided-Lamb-wave-based structural health monitoring method in which piezoelectric transducers serve as exciting and receiving sensors in a pitch-catch arrangement. A simple learning mechanism was developed specifically for the ANN representing the fuzzy system.

  7. Theoretical and experimental investigation of multispectral photoacoustic osteoporosis detection method

    NASA Astrophysics Data System (ADS)

    Steinberg, Idan; Hershkovich, Hadas Sara; Gannot, Israel; Eyal, Avishay

    2014-03-01

    Osteoporosis is a widespread disorder with a catastrophic impact on patients' lives and overwhelming related healthcare costs. Recently, we proposed a multispectral photoacoustic (PA) technique for early detection of osteoporosis. Such a technique has great advantages over pure ultrasonic or optical methods, as it allows deducing both bone functionality from the bone absorption spectrum and bone resistance to fracture from the characteristics of the ultrasound propagation. We previously demonstrated the propagation of multiple acoustic modes in animal bones in-vitro. To further investigate the effects of multiple-wavelength excitation and of induced osteoporosis on the PA signal, a multispectral photoacoustic system is presented. The experimental investigation is based on measuring the interference of multiple acoustic modes. The performance of the system is evaluated, and a simple two-mode theoretical model is fitted to the measured phase signals. The results show that this PA technique is accurate and repeatable. Multiple-wavelength excitation is then tested; the PA response at different excitation wavelengths reveals that absorption by the different bone constituents has a profound effect on mode generation. The PA response is also measured at a single wavelength before and after induced osteoporosis. Results show that induced osteoporosis alters the measured amplitude and phase in a consistent manner, which allows detection of the onset of osteoporosis. These results suggest that a complete characterization of the bone over a region of both acoustic and optical frequencies might be used as a powerful tool for in-vivo bone evaluation.

  8. Informative graphing of continuous safety variables relative to normal reference limits.

    PubMed

    Breder, Christopher D

    2018-05-16

    Interpreting graphs of continuous safety variables can be complicated because differences in age, gender, and testing-site methodologies may give rise to multiple reference limits. Furthermore, data below the lower limit of normal are compressed relative to points above the upper limit of normal. The objective of this study is to develop a graphing technique that addresses these issues and is visually intuitive. A mock dataset with multiple reference ranges is first used to develop the graphing technique. Formulas are developed for conditions where data are above the upper limit of normal, normal, below the lower limit of normal, and below the lower limit of normal when the data value equals zero. After the formulae are developed, an anonymized dataset from an actual set of trials for an approved drug is used to compare the technique developed in this study with standard graphical methods. Formulas are derived for the novel graphing method based on multiples of the normal limits. The formula for values scaled between the upper and lower limits of normal is a novel application of a readily available scaling formula. The formula for the lower limit of normal is novel and addresses the issue of this value potentially being indeterminate when the result to be scaled as a multiple is zero. The formulae and graphing method described in this study provide a visually intuitive way to graph continuous safety data, including laboratory values and vital-sign data.
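
    One plausible reading of the scaling idea is sketched below. The exact formulas, and the zero-value floor in particular, are assumptions of this sketch rather than the paper's published equations, and the ALT reference range is a typical, site-dependent example.

```python
# Illustrative scaling of lab values as multiples of reference limits.
def scale_to_limits(value, lln, uln, zero_floor=0.01):
    """Return a signed 'multiples of normal' coordinate.

    > 1   : above ULN, expressed as value/ULN
    -1..1 : within limits, min-max scaled onto [-1, 1]
    < -1  : below LLN, expressed as -(LLN/value)
    """
    if value > uln:
        return value / uln
    if value >= lln:
        # min-max scale the normal range onto [-1, 1]
        return 2 * (value - lln) / (uln - lln) - 1
    # Below LLN; a floor guards against the indeterminate value = 0 case
    return -(lln / max(value, zero_floor * lln))

# Example: ALT with a typical reference range of 7-56 U/L
print(scale_to_limits(112, 7, 56))   # above: twice the ULN
print(scale_to_limits(56, 7, 56))    # exactly at the ULN
print(scale_to_limits(3.5, 7, 56))   # below: LLN is twice the value
```

    This removes the visual compression below the lower limit: a value at half the LLN lands as far from the normal band as a value at twice the ULN.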

  9. Tracheal injury as a sequence of multiple attempts of endotracheal intubation in the course of a preclinical cardiopulmonary resuscitation.

    PubMed

    Jaeger, K; Ruschulte, H; Osthaus, A; Scheinichen, D; Heine, J

    2000-01-01

    Management of the difficult airway requires an appropriate approach based on personal clinical experience. For every physician involved in rescue and emergency medicine, it is important to know the difficult-airway algorithm and to be familiar with alternative techniques for managing the difficult airway. We report a case of tracheal injury caused by multiple attempts at intubating the trachea. Based on current knowledge, apart from surgical equipment for cricothyroidotomy, the laryngeal mask airway (LMA) and the Combitube (ETC) should be available on any ambulance vehicle staffed by an emergency physician. In the future, blind intubation through the intubating laryngeal mask airway (ILMA) could offer a new opportunity.

  10. Performance Enhancement of MC-CDMA System through Novel Sensitive Bit Algorithm Aided Turbo Multi User Detection

    PubMed Central

    Kumaravel, Rasadurai; Narayanaswamy, Kumaratharan

    2015-01-01

    Multi-carrier code division multiple access (MC-CDMA) is a promising multi-carrier modulation (MCM) technique for high-data-rate wireless communication over frequency-selective fading channels. MC-CDMA combines code division multiple access (CDMA) and orthogonal frequency division multiplexing (OFDM): the OFDM part reduces multipath fading and inter-symbol interference (ISI), and the CDMA part increases spectrum utilization. Advantages of this technique are its robustness to multipath propagation and improved security with minimized ISI. Nevertheless, due to the loss of orthogonality at the receiver in a mobile environment, multiple access interference (MAI) appears. MAI is one of the factors that degrade the bit error rate (BER) performance of MC-CDMA systems. Multiuser detection (MUD) and turbo coding are the two dominant techniques for enhancing the BER performance of MC-CDMA systems by overcoming MAI effects. In this paper, a low-complexity iterative soft sensitive bits algorithm (SBA) aided turbo MUD, based on the logarithmic maximum a posteriori (Log-MAP) algorithm, is proposed. Simulation results show that the proposed method provides better BER performance with low-complexity decoding by mitigating the detrimental effects of MAI. PMID:25714917
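
    Why MAI appears can be shown in a few lines: Walsh-Hadamard spreading codes are orthogonal on a flat channel, but the per-subcarrier gains of a frequency-selective channel break that orthogonality at the receiver. A toy sketch (four users, one symbol each):

```python
import numpy as np

# Four length-4 Walsh-Hadamard spreading codes (rows)
H = np.array([[1,  1,  1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1, -1, -1,  1]], dtype=float)

bits = np.array([1, -1, 1, -1], dtype=float)   # one symbol per user
tx = H.T @ bits                                # superimposed chips

# Flat channel: matched-filter despreading recovers each user exactly
flat_rx = (H @ tx) / 4
print(flat_rx)

# Frequency-selective channel: unequal subcarrier gains (illustrative)
gains = np.array([1.0, 0.6, 0.9, 0.3])
sel_rx = (H @ (gains * tx)) / 4
mai = sel_rx - bits * np.mean(gains)           # residual cross-user leakage
print("MAI terms:", mai)
```

    The nonzero `mai` terms are exactly what the MUD stage must suppress; the turbo loop feeds soft bit estimates back to refine that suppression iteratively.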

  11. Use of partial dissolution techniques in geochemical exploration

    USGS Publications Warehouse

    Chao, T.T.

    1984-01-01

    Application of partial dissolution techniques to geochemical exploration has advanced from an early empirical approach to an approach based on sound geochemical principles. This advance assures a prominent future position for the use of these techniques in geochemical exploration for concealed mineral deposits. Partial dissolution techniques are classified as single dissolution or sequential multiple dissolution depending on the number of steps taken in the procedure, or as "nonselective" extraction and as "selective" extraction in terms of the relative specificity of the extraction. The choice of dissolution techniques for use in geochemical exploration is dictated by the geology of the area, the type and degree of weathering, and the expected chemical forms of the ore and of the pathfinding elements. Case histories have illustrated many instances where partial dissolution techniques exhibit advantages over conventional methods of chemical analysis used in geochemical exploration. © 1984.

  12. Calibrating a novel multi-sensor physical activity measurement system.

    PubMed

    John, D; Liu, S; Sasaki, J E; Howe, C A; Staudenmayer, J; Gao, R X; Freedson, P S

    2011-09-01

    Advancing the field of physical activity (PA) monitoring requires the development of innovative multi-sensor measurement systems that are feasible in the free-living environment. The use of novel analytical techniques to combine and process these multiple sensor signals is equally important. This paper describes a novel multi-sensor 'integrated PA measurement system' (IMS), the lab-based methodology used to calibrate the IMS, techniques used to predict multiple variables from the sensor signals, and proposes design changes to improve the feasibility of deploying the IMS in the free-living environment. The IMS consists of hip and wrist acceleration sensors, two piezoelectric respiration sensors on the torso, and an ultraviolet radiation sensor to obtain contextual information (indoors versus outdoors) of PA. During lab-based calibration of the IMS, data were collected on participants performing a PA routine consisting of seven different ambulatory and free-living activities while wearing a portable metabolic unit (criterion measure) and the IMS. Data analyses on the first 50 adult participants are presented. These analyses were used to determine if the IMS can be used to predict the variables of interest. Finally, physical modifications for the IMS that could enhance the feasibility of free-living use are proposed and refinement of the prediction techniques is discussed.

  13. Computerized multiple image analysis on mammograms: performance improvement of nipple identification for registration of multiple views using texture convergence analyses

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Sahiner, Berkman; Hadjiiski, Lubomir M.; Paramagul, Chintana

    2004-05-01

    Automated registration of multiple mammograms for CAD depends on accurate nipple identification. We developed two new image analysis techniques based on geometric and texture convergence analyses to improve the performance of our previously developed nipple identification method. A gradient-based algorithm is used to automatically track the breast boundary. The nipple search region along the boundary is then defined by geometric convergence analysis of the breast shape. Three nipple candidates are identified by detecting the changes along the gray level profiles inside and outside the boundary and the changes in the boundary direction. A texture orientation-field analysis method is developed to estimate the fourth nipple candidate based on the convergence of the tissue texture pattern towards the nipple. The final nipple location is determined from the four nipple candidates by a confidence analysis. Our training and test data sets consisted of 419 and 368 randomly selected mammograms, respectively. The nipple location identified on each image by an experienced radiologist was used as the ground truth. For 118 of the training and 70 of the test images, the radiologist could not positively identify the nipple, but provided an estimate of its location. These were referred to as invisible nipple images. In the training data set, 89.37% (269/301) of the visible nipples and 81.36% (96/118) of the invisible nipples could be detected within 1 cm of the truth. In the test data set, 92.28% (275/298) of the visible nipples and 67.14% (47/70) of the invisible nipples were identified within 1 cm of the truth. In comparison, our previous nipple identification method without using the two convergence analysis techniques detected 82.39% (248/301), 77.12% (91/118), 89.93% (268/298) and 54.29% (38/70) of the nipples within 1 cm of the truth for the visible and invisible nipples in the training and test sets, respectively. 
The results indicate that the nipple on mammograms can be detected accurately. This will be an important step towards automatic multiple image analysis for CAD techniques.

  14. Wearable Sensor Localization Considering Mixed Distributed Sources in Health Monitoring Systems

    PubMed Central

    Wan, Liangtian; Han, Guangjie; Wang, Hao; Shu, Lei; Feng, Nanxing; Peng, Bao

    2016-01-01

    In health monitoring systems, the base station (BS) and the wearable sensors communicate with each other to construct a virtual multiple-input multiple-output (VMIMO) system. In real applications, the signal received by the BS is a distributed source because of the scattering, reflection, diffraction and refraction in the propagation path. In this paper, a 2D direction-of-arrival (DOA) estimation algorithm for incoherently-distributed (ID) and coherently-distributed (CD) sources is proposed based on multiple VMIMO systems. ID and CD sources are separated through the second-order blind identification (SOBI) algorithm. The traditional estimation of signal parameters via rotational invariance techniques (ESPRIT)-based algorithm is valid only for one-dimensional (1D) DOA estimation of ID sources. By constructing the signal subspace, two rotational invariance relationships are constructed. Then, we extend ESPRIT to estimate 2D DOAs for ID sources. For DOA estimation of CD sources, two rotational invariance relationships are constructed based on the application of generalized steering vectors (GSVs). Then, the ESPRIT-based algorithm is used for estimating the eigenvalues of the two rotational invariance matrices, which contain the angular parameters. The expressions of azimuth and elevation for ID and CD sources have closed forms, which means that spectrum peak searching is avoided. Therefore, compared to traditional 2D DOA estimation algorithms, the proposed algorithm has significantly lower computational complexity. The intersecting point of two rays, which come from two different directions measured by two uniform rectangular arrays (URAs), can be regarded as the location of the biosensor (wearable sensor). Three BSs adopting the smart antenna (SA) technique cooperate with each other to locate the wearable sensors using the angulation positioning method. Simulation results demonstrate the effectiveness of the proposed algorithm. PMID:26985896
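    The rotational-invariance principle behind ESPRIT can be illustrated in one dimension. The sketch below is a generic 1D ESPRIT on a uniform linear array, not the paper's 2D VMIMO algorithm; the array size, source angle and noise level are illustrative assumptions. It forms the signal subspace from the sample covariance and reads the DOA off the eigenvalue of the shift-invariance operator between the two overlapping subarrays:

```python
import numpy as np

def esprit_doa(X, n_sources, d=0.5):
    """Estimate DOAs (radians) from ULA snapshots X (sensors x snapshots)
    using the rotational-invariance (ESPRIT) principle."""
    R = X @ X.conj().T / X.shape[1]        # sample covariance
    eigval, eigvec = np.linalg.eigh(R)
    Es = eigvec[:, -n_sources:]            # signal subspace (largest eigenvalues)
    # Rotational invariance between the two overlapping subarrays:
    Phi = np.linalg.pinv(Es[:-1]) @ Es[1:]
    phases = np.angle(np.linalg.eigvals(Phi))
    return np.arcsin(phases / (2 * np.pi * d))

# One point source at 20 degrees, 8-element half-wavelength ULA
rng = np.random.default_rng(0)
theta = np.deg2rad(20.0)
n, snaps = 8, 200
a = np.exp(2j * np.pi * 0.5 * np.arange(n) * np.sin(theta))  # steering vector
s = rng.standard_normal(snaps) + 1j * rng.standard_normal(snaps)
noise = 0.01 * (rng.standard_normal((n, snaps)) + 1j * rng.standard_normal((n, snaps)))
X = np.outer(a, s) + noise
est = np.rad2deg(esprit_doa(X, 1))
```

    Because the angle comes out of an eigenvalue in closed form, no spectrum peak search is needed, which is the computational advantage the abstract emphasizes.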

  15. Wearable Sensor Localization Considering Mixed Distributed Sources in Health Monitoring Systems.

    PubMed

    Wan, Liangtian; Han, Guangjie; Wang, Hao; Shu, Lei; Feng, Nanxing; Peng, Bao

    2016-03-12

    In health monitoring systems, the base station (BS) and the wearable sensors communicate with each other to construct a virtual multiple-input multiple-output (VMIMO) system. In real applications, the signal received by the BS is a distributed source because of the scattering, reflection, diffraction and refraction in the propagation path. In this paper, a 2D direction-of-arrival (DOA) estimation algorithm for incoherently-distributed (ID) and coherently-distributed (CD) sources is proposed based on multiple VMIMO systems. ID and CD sources are separated through the second-order blind identification (SOBI) algorithm. The traditional estimation of signal parameters via rotational invariance techniques (ESPRIT)-based algorithm is valid only for one-dimensional (1D) DOA estimation of ID sources. By constructing the signal subspace, two rotational invariance relationships are constructed. Then, we extend ESPRIT to estimate 2D DOAs for ID sources. For DOA estimation of CD sources, two rotational invariance relationships are constructed based on the application of generalized steering vectors (GSVs). Then, the ESPRIT-based algorithm is used for estimating the eigenvalues of the two rotational invariance matrices, which contain the angular parameters. The expressions of azimuth and elevation for ID and CD sources have closed forms, which means that spectrum peak searching is avoided. Therefore, compared to traditional 2D DOA estimation algorithms, the proposed algorithm has significantly lower computational complexity. The intersecting point of two rays, which come from two different directions measured by two uniform rectangular arrays (URAs), can be regarded as the location of the biosensor (wearable sensor). Three BSs adopting the smart antenna (SA) technique cooperate with each other to locate the wearable sensors using the angulation positioning method. Simulation results demonstrate the effectiveness of the proposed algorithm.

  16. A Software Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Donald J.; Martin, Richard E.; Seebo, Jeff P.; Trinh, Long B.; Walker, James L.; Winfree, William P.

    2007-01-01

    Ultrasonic, microwave, and terahertz nondestructive evaluation imaging systems generally require the acquisition of waveforms at each scan point to form an image. For such systems, signal and image processing methods are commonly needed to extract information from the waves and to improve the resolution of, and highlight, defects in the image. Since all waveform-based NDE methods share some similarity, a common software platform containing multiple signal and image processing techniques for processing the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. This presentation describes NASA Glenn Research Center's approach in developing a common software platform for processing waveform-based NDE signals and images. This platform is currently in use at NASA Glenn and at Lockheed Martin Michoud Assembly Facility for processing of pulsed terahertz and ultrasonic data. Highlights of the software operation will be given. A case study will be shown for use with terahertz data. The authors also invite scientists and engineers who are interested in sharing customized signal and image processing algorithms to contribute to this effort by letting the authors code up and include these algorithms in future releases.

  17. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems with gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications, it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.

  18. A coherent through-wall MIMO phased array imaging radar based on time-duplexed switching

    NASA Astrophysics Data System (ADS)

    Chen, Qingchao; Chetty, Kevin; Brennan, Paul; Lok, Lai Bun; Ritchie, Matthiew; Woodbridge, Karl

    2017-05-01

    Through-the-Wall (TW) radar sensors are gaining increasing interest for security, surveillance and search and rescue applications. Additionally, the integration of Multiple-Input, Multiple-Output (MIMO) techniques with phased array radar is allowing higher performance at lower cost. In this paper we present a 4-by-4 TW MIMO phased array imaging radar operating at 2.4 GHz with 200 MHz bandwidth. To achieve high imaging resolution in a cost-effective manner, the 4 Tx and 4 Rx elements are used to synthesize a uniform linear array (ULA) of 16 virtual elements. Furthermore, the transmitter is based on a single-channel 4-element time-multiplexed switched array. In transmission, the radar utilizes frequency modulated continuous wave (FMCW) waveforms that undergo de-ramping on receive to allow digitization at relatively low sampling rates, which then simplifies the imaging process. This architecture has been designed for the short-range TW scenarios envisaged, and permits sufficient time to switch between antenna elements. The paper first outlines the system characteristics before describing the key signal processing and imaging algorithms which are based on traditional Fast Fourier Transform (FFT) processing. These techniques are implemented in LabVIEW software. Finally, we report results from an experimental campaign that investigated the imaging capabilities of the system and demonstrated the detection of personnel targets. Moreover, we show that multiple targets within a room with greater than approximately 1 meter separation can be distinguished from one another.
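    The de-ramp-on-receive step described above reduces ranging to a single FFT: after mixing, a target at range R appears as a beat tone at f_b = (B/T)·(2R/c). A minimal sketch, with illustrative waveform parameters loosely following the 200 MHz bandwidth quoted above (sweep time, sampling rate and target range are assumptions, and the radar's actual signal chain is not reproduced):

```python
import numpy as np

c = 3e8                              # speed of light (m/s)
B, T = 200e6, 1e-3                   # 200 MHz sweep over 1 ms
fs = 1e6                             # sampling rate after de-ramping
R_true = 4.0                         # target range (m)
f_beat = (B / T) * 2 * R_true / c    # beat frequency produced by the target

# Simulated de-ramped (beat) signal and its range spectrum
t = np.arange(int(fs * T)) / fs
beat = np.cos(2 * np.pi * f_beat * t)
spec = np.abs(np.fft.rfft(beat))
f_est = np.fft.rfftfreq(len(t), 1 / fs)[np.argmax(spec)]
R_est = f_est * c * T / (2 * B)      # invert the beat-frequency relation
```

    The estimate is quantized to the FFT bin spacing, consistent with a range resolution of c/(2B), which for 200 MHz is 0.75 m, in line with the roughly 1 m target separation the paper reports resolving.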

  19. Regularized Embedded Multiple Kernel Dimensionality Reduction for Mine Signal Processing.

    PubMed

    Li, Shuang; Liu, Bing; Zhang, Chen

    2016-01-01

    Traditional multiple kernel dimensionality reduction models are generally based on graph embedding and the manifold assumption. But such an assumption might be invalid for some high-dimensional or sparse data due to the curse of dimensionality, which has a negative influence on the performance of multiple kernel learning. In addition, some models might be ill-posed if the rank of the matrices in their objective functions is not high enough. To address these issues, we extend the traditional graph embedding framework and propose a novel regularized embedded multiple kernel dimensionality reduction method. Different from the conventional convex relaxation technique, the proposed algorithm directly takes advantage of a binary search and an alternative optimization scheme to obtain optimal solutions efficiently. The experimental results demonstrate the effectiveness of the proposed method for supervised, unsupervised, and semisupervised scenarios.

  20. Neutron multiplicity counting: Confidence intervals for reconstruction parameters

    DOE PAGES

    Verbeke, Jerome M.

    2016-03-09

    From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to evaluate quantitatively proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time, it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.
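    The sequential-Bayesian idea can be sketched on a much simpler counting model. The toy below uses a plain Poisson rate on a parameter grid as a stand-in for the paper's fission-chain model; the prior, grid and true rate are illustrative assumptions. The posterior is updated after each observed count, and a credible interval is read off the cumulative posterior, which is what would drive the stop-as-soon-as-confident portal-monitoring behaviour:

```python
import numpy as np

rng = np.random.default_rng(1)
lam_grid = np.linspace(0.1, 10.0, 500)                   # candidate count rates
posterior = np.full_like(lam_grid, 1.0 / len(lam_grid))  # flat prior
true_rate = 4.0

for k in rng.poisson(true_rate, size=200):               # stream of observed counts
    # Poisson likelihood up to a factor independent of lambda (k! cancels on normalization)
    posterior *= np.exp(-lam_grid) * lam_grid ** k
    posterior /= posterior.sum()                         # renormalize to avoid underflow

mean_rate = float((lam_grid * posterior).sum())          # posterior mean estimate
cdf = np.cumsum(posterior)
lo = lam_grid[np.searchsorted(cdf, 0.05)]                # 90% credible region bounds
hi = lam_grid[np.searchsorted(cdf, 0.95)]
```

    After each update the credible region narrows, so a monitoring loop could terminate as soon as the region lies entirely below (or above) a threat threshold.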

  1. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short-wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar at different PRFs in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of the PRFs. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross-correlation technique.

  2. Multiple grid arrangement improves ligand docking with unknown binding sites: Application to the inverse docking problem.

    PubMed

    Ban, Tomohiro; Ohue, Masahito; Akiyama, Yutaka

    2018-04-01

    The identification of comprehensive drug-target interactions is important in drug discovery. Although numerous computational methods have been developed over the years, a gold standard technique has not been established. Computational ligand docking and structure-based drug design allow researchers to predict the binding affinity between a compound and a target protein, and thus, they are often used to virtually screen compound libraries. In addition, docking techniques have also been applied to the virtual screening of target proteins (inverse docking) to predict target proteins of a drug candidate. Nevertheless, a more accurate docking method is currently required. In this study, we proposed a method in which a predicted ligand-binding site is covered by multiple grids, termed multiple grid arrangement. Notably, multiple grid arrangement facilitates the conformational search for a grid-based ligand docking software and can be applied to the state-of-the-art commercial docking software Glide (Schrödinger, LLC). We validated the proposed method by re-docking with the Astex diverse benchmark dataset and blind binding site situations, which improved the correct prediction rate of the top scoring docking pose from 27.1% to 34.1%; however, only a slight improvement in target prediction accuracy was observed with inverse docking scenarios. These findings highlight the limitations and challenges of current scoring functions and the need for more accurate docking methods. The proposed multiple grid arrangement method was implemented in Glide by modifying a cross-docking script for Glide, xglide.py. The script of our method is freely available online at http://www.bi.cs.titech.ac.jp/mga_glide/. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Role of 18F-FDG PET/CT in the diagnosis and management of multiple myeloma and other plasma cell disorders: a consensus statement by the International Myeloma Working Group.

    PubMed

    Cavo, Michele; Terpos, Evangelos; Nanni, Cristina; Moreau, Philippe; Lentzsch, Suzanne; Zweegman, Sonja; Hillengass, Jens; Engelhardt, Monika; Usmani, Saad Z; Vesole, David H; San-Miguel, Jesus; Kumar, Shaji K; Richardson, Paul G; Mikhael, Joseph R; da Costa, Fernando Leal; Dimopoulos, Meletios-Athanassios; Zingaretti, Chiara; Abildgaard, Niels; Goldschmidt, Hartmut; Orlowski, Robert Z; Chng, Wee Joo; Einsele, Hermann; Lonial, Sagar; Barlogie, Bart; Anderson, Kenneth C; Rajkumar, S Vincent; Durie, Brian G M; Zamagni, Elena

    2017-04-01

    The International Myeloma Working Group consensus aimed to provide recommendations for the optimal use of 18F-fluorodeoxyglucose (18F-FDG) PET/CT in patients with multiple myeloma and other plasma cell disorders, including smouldering multiple myeloma and solitary plasmacytoma. 18F-FDG PET/CT can be considered a valuable tool for the work-up of patients with both newly diagnosed and relapsed or refractory multiple myeloma because it assesses bone damage with relatively high sensitivity and specificity, and detects extramedullary sites of proliferating clonal plasma cells while providing important prognostic information. The use of 18F-FDG PET/CT is mandatory to confirm a suspected diagnosis of solitary plasmacytoma, provided that whole-body MRI cannot be performed, and to distinguish between smouldering and active multiple myeloma, if whole-body X-ray (WBXR) is negative and whole-body MRI is unavailable. Based on the ability of 18F-FDG PET/CT to distinguish between metabolically active and inactive disease, this technique is now the preferred functional imaging modality to evaluate and monitor the effect of therapy on myeloma-cell metabolism. Changes in FDG avidity can provide an earlier evaluation of response to therapy compared to MRI scans, and can predict outcomes, particularly for patients who are eligible to receive autologous stem-cell transplantation. 18F-FDG PET/CT can be coupled with sensitive bone marrow-based techniques to detect minimal residual disease (MRD) inside and outside the bone marrow, helping to identify those patients who are defined as having imaging MRD negativity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Parallel seed-based approach to multiple protein structure similarities detection

    DOE PAGES

    Chapuis, Guillaume; Le Boudic-Jamin, Mathilde; Andonov, Rumen; ...

    2015-01-01

    Finding similarities between protein structures is a crucial task in molecular biology. Most of the existing tools require proteins to be aligned in an order-preserving way and only find single alignments even when multiple similar regions exist. We propose a new seed-based approach that discovers multiple pairs of similar regions. Its computational complexity is polynomial and it comes with a quality guarantee: the returned alignments have root mean squared deviations (coordinate-based as well as internal-distances based) lower than a given threshold, if such alignments exist. We do not require the alignments to be order preserving (i.e., we consider nonsequential alignments), which makes our algorithm suitable for detecting similar domains when comparing multidomain proteins as well as for detecting structural repetitions within a single protein. Because the search space for nonsequential alignments is much larger than for sequential ones, the computational burden is addressed by extensive use of parallel computing techniques: a coarse-grain level parallelism making use of available CPU cores for computation and a fine-grain level parallelism exploiting bit-level concurrency as well as vector instructions.
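    The coordinate-based RMSD criterion in the quality guarantee can be sketched as follows. This uses the standard Kabsch algorithm for optimal superposition; the paper's alignment machinery is more involved, and the point sets here are synthetic stand-ins for aligned residue coordinates:

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between point sets P and Q (n x 3) after optimal superposition
    via the Kabsch algorithm."""
    P = P - P.mean(axis=0)                 # remove translation
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)      # cross-covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])             # guard against reflections
    R = Vt.T @ D @ U.T                     # optimal rotation mapping P onto Q
    diff = (R @ P.T).T - Q
    return float(np.sqrt((diff ** 2).sum() / len(P)))

rng = np.random.default_rng(0)
P = rng.standard_normal((10, 3))
# Rotate and translate P; the RMSD after superposition should be ~0
angle = 0.7
Rz = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0, 0.0, 1.0]])
Q = (Rz @ P.T).T + np.array([1.0, 2.0, 3.0])
rmsd = kabsch_rmsd(P, Q)
```

    A pair of candidate regions would be accepted only when this coordinate RMSD (and its internal-distances counterpart) falls below the chosen threshold.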

  5. An ensemble framework for clustering protein-protein interaction networks.

    PubMed

    Asur, Sitaram; Ucar, Duygu; Parthasarathy, Srinivasan

    2007-07-01

    Protein-Protein Interaction (PPI) networks are believed to be important sources of information related to biological processes and complex metabolic functions of the cell. The presence of biologically relevant functional modules in these networks has been theorized by many researchers. However, the application of traditional clustering algorithms for extracting these modules has not been successful, largely due to the presence of noisy false positive interactions as well as specific topological challenges in the network. In this article, we propose an ensemble clustering framework to address this problem. For base clustering, we introduce two topology-based distance metrics to counteract the effects of noise. We develop a PCA-based consensus clustering technique, designed to reduce the dimensionality of the consensus problem and yield informative clusters. We also develop a soft consensus clustering variant to assign multifaceted proteins to multiple functional groups. We conduct an empirical evaluation of different consensus techniques using topology-based, information theoretic and domain-specific validation metrics and show that our approaches can provide significant benefits over other state-of-the-art approaches. Our analysis of the consensus clusters obtained demonstrates that ensemble clustering can (a) produce improved biologically significant functional groupings; and (b) facilitate soft clustering by discovering multiple functional associations for proteins. Supplementary data are available at Bioinformatics online.
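    The consensus step above starts from a co-association matrix: the fraction of base clusterings that place two proteins in the same cluster. A minimal sketch with hand-made labels rather than real PPI clusterings; the PCA here is a plain eigendecomposition used to reduce the consensus matrix before a final clustering pass, in the spirit of (but simpler than) the paper's method:

```python
import numpy as np

# Three base clusterings of six proteins (illustrative labels)
base_clusterings = [
    np.array([0, 0, 0, 1, 1, 1]),
    np.array([0, 0, 1, 1, 1, 1]),
    np.array([0, 0, 0, 0, 1, 1]),
]

# Co-association (consensus) matrix: fraction of clusterings agreeing on each pair
n = len(base_clusterings[0])
consensus = np.zeros((n, n))
for labels in base_clusterings:
    consensus += (labels[:, None] == labels[None, :]).astype(float)
consensus /= len(base_clusterings)

# PCA on the consensus matrix to reduce dimensionality before final clustering
centered = consensus - consensus.mean(axis=0)
eigval, eigvec = np.linalg.eigh(np.cov(centered.T))
top2 = eigvec[:, -2:]                  # two leading principal directions
embedding = centered @ top2            # low-dimensional representation per protein
```

    Entries strictly between 0 and 1 (proteins grouped together by only some base clusterings) are exactly the multifaceted cases that motivate the soft consensus variant.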

  6. An isolation-enhanced quad-element antenna using suspended solid wires for LTE small-cell base stations

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Sheng; Zhou, Huang-Cheng

    2017-05-01

    This paper presents a multiple-input multiple-output (MIMO) antenna with four unit elements, enabled by an isolation technique, for long-term evolution (LTE) small-cell base stations. While earlier studies on MIMO base-station antennas cope with either the lower LTE band (698-960 MHz) or the upper LTE band (1710-2690 MHz), the proposed antenna meets the full LTE specification, yet it uses the maximum number of unit elements to increase channel capacity. The antenna configuration is optimized for good impedance matching and high radiation efficiency. In particular, as the spacing between unit elements is so small that severe mutual coupling occurs, we propose a simple, extremely low-cost structure to enhance the isolation. By using suspended solid wires to interconnect the positions of strongly coupled current on two adjacent elements, an isolation enhancement of 37 dB is achieved. Although solid wires are inherently intended for direct-current applications, this work successfully employs such a low-cost technique in microwave antenna development. Experimental results have validated the design guidelines and the proposed configuration, showing that antenna performance, including impedance matching, isolation, radiation features, signal correlation, and channel capacity gain, is highly desirable for LTE small-cell base stations.

  7. Multiple descriptions based on multirate coding for JPEG 2000 and H.264/AVC.

    PubMed

    Tillo, Tammam; Baccaglini, Enrico; Olmo, Gabriella

    2010-07-01

    Multiple description coding (MDC) makes use of redundant representations of multimedia data to achieve resiliency. Descriptions should be generated so that the quality obtained when decoding a subset of them only depends on their number and not on the particular received subset. In this paper, we propose a method based on the principle of encoding the source at several rates, and properly blending the data encoded at different rates to generate the descriptions. The aim is to achieve efficient redundancy exploitation, and easy adaptation to different network scenarios by means of fine tuning of the encoder parameters. We apply this principle to both JPEG 2000 images and H.264/AVC video data. We consider as the reference scenario the distribution of contents on application-layer overlays with multiple-tree topology. The experimental results reveal that our method favorably compares with state-of-art MDC techniques.

  8. Ultrasound image edge detection based on a novel multiplicative gradient and Canny operator.

    PubMed

    Zheng, Yinfei; Zhou, Yali; Zhou, Hao; Gong, Xiaohong

    2015-07-01

    To achieve fast and accurate segmentation of ultrasound images, a novel edge detection method for speckle-noised ultrasound images is proposed, based on the traditional Canny operator and a novel multiplicative gradient operator. The proposed technique combines a new multiplicative gradient operator of non-Newtonian type with the traditional Canny operator to generate the initial edge map, which is subsequently optimized by a following edge-tracing step. To verify the proposed method, we compared it with several other edge detection methods with good robustness to noise, in experiments on simulated and in vivo medical ultrasound images. Experimental results showed that the proposed algorithm is fast enough for real-time processing, and the edge detection accuracy reaches 75% or more. Thus, the proposed method is well suited for fast and accurate edge detection of medical ultrasound images. © The Author(s) 2014.
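    For intuition about why a multiplicative gradient suits speckle: speckle noise multiplies the signal, so the ratio of neighbouring intensities is roughly stationary under the noise while their difference is not. The sketch below uses a simple ratio-of-neighbours operator in that spirit; it is an assumption-laden stand-in, and neither the paper's non-Newtonian operator nor the Canny stages are reproduced here:

```python
import numpy as np

def ratio_gradient(img, eps=1e-6):
    """Ratio-based gradient: compare opposite neighbours by ratio rather than
    difference, so multiplicative (speckle-like) noise largely cancels."""
    up, down = img[:-2, 1:-1], img[2:, 1:-1]
    left, right = img[1:-1, :-2], img[1:-1, 2:]
    gv = np.maximum(up, down) / (np.minimum(up, down) + eps)
    gh = np.maximum(left, right) / (np.minimum(left, right) + eps)
    return np.maximum(gv, gh)       # large ratio -> likely edge

img = np.ones((8, 8))
img[:, 4:] = 5.0                    # step edge between columns 3 and 4
g = ratio_gradient(img)             # ~1 in flat regions, ~5 at the edge
```

    A Canny-style pipeline would then threshold this response and thin it with non-maximum suppression before the edge-tracing step.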

  9. Cerebrovascular pattern improved by ozone autohemotherapy: an entropy-based study on multiple sclerosis patients.

    PubMed

    Molinari, Filippo; Rimini, Daniele; Liboni, William; Acharya, U Rajendra; Franzini, Marianno; Pandolfi, Sergio; Ricevuti, Giovanni; Vaiano, Francesco; Valdenassi, Luigi; Simonetti, Vincenzo

    2017-08-01

    Ozone major autohemotherapy is effective in reducing the symptoms of multiple sclerosis (MS) patients, but its effects on the brain are still not clear. In this work, we monitored the changes in the cerebrovascular pattern of MS patients and normal subjects during major ozone autohemotherapy by using near-infrared spectroscopy (NIRS) as a functional and vascular technique. The NIRS signals are analyzed using a combination of time-domain analysis, time-frequency analysis and nonlinear analysis of the intrinsic mode function signals obtained from the empirical mode decomposition technique. Our results show an improvement in the cerebrovascular pattern of all subjects, indicated by an increase in the entropy of the NIRS signals. Hence, we can conclude that ozone therapy increases brain metabolism and helps recovery from the lower activity levels that are predominant in MS patients.

  10. High-Speed Interrogation for Large-Scale Fiber Bragg Grating Sensing

    PubMed Central

    Hu, Chenyuan; Bai, Wei

    2018-01-01

    A high-speed interrogation scheme for large-scale fiber Bragg grating (FBG) sensing arrays is presented. This technique employs parallel computing and pipeline control to modulate the incident light and demodulate the reflected sensing signal. One electro-optic modulator (EOM) and one semiconductor optical amplifier (SOA) were used to generate a phase delay to filter the reflected spectrum from multiple candidate FBGs with the same optical path difference (OPD). Experimental results showed that the fastest interrogation delay time for the proposed method was only about 27.2 µs for a single FBG interrogation, and the system scanning period was limited only by the optical transmission delay in the sensing fiber, owing to the multiple simultaneous central wavelength calculations. Furthermore, the proposed FPGA-based technique had a verified FBG wavelength demodulation stability of ±1 pm without averaging. PMID:29495263
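    The central-wavelength calculation at the heart of FBG demodulation is commonly a centroid (center-of-gravity) over the reflected peak. A generic sketch with an idealized Gaussian reflection spectrum (the grid, peak width and threshold are illustrative assumptions; the paper's FPGA pipeline computes many such centroids in parallel):

```python
import numpy as np

wavelengths = np.linspace(1549.0, 1551.0, 401)            # nm, 5 pm grid
center = 1550.123                                          # true Bragg wavelength
spectrum = np.exp(-((wavelengths - center) / 0.1) ** 2)    # idealized FBG peak

# Centroid over samples above a threshold around the peak
mask = spectrum > 0.1 * spectrum.max()
centroid = float(np.sum(wavelengths[mask] * spectrum[mask]) / np.sum(spectrum[mask]))
```

    Because the centroid interpolates between grid samples, its stability can be far finer than the wavelength grid itself, which is how picometer-level demodulation stability becomes possible.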

  11. High-Speed Interrogation for Large-Scale Fiber Bragg Grating Sensing.

    PubMed

    Hu, Chenyuan; Bai, Wei

    2018-02-24

    A high-speed interrogation scheme for large-scale fiber Bragg grating (FBG) sensing arrays is presented. This technique employs parallel computing and pipeline control to modulate the incident light and demodulate the reflected sensing signal. One electro-optic modulator (EOM) and one semiconductor optical amplifier (SOA) were used to generate a phase delay to filter the reflected spectrum from multiple candidate FBGs with the same optical path difference (OPD). Experimental results showed that the fastest interrogation delay time for the proposed method was only about 27.2 µs for a single FBG interrogation, and the system scanning period was limited only by the optical transmission delay in the sensing fiber, owing to the multiple simultaneous central wavelength calculations. Furthermore, the proposed FPGA-based technique had a verified FBG wavelength demodulation stability of ±1 pm without averaging.

  12. [Progress in industrial bioprocess engineering in China].

    PubMed

    Zhuang, Yingping; Chen, Hongzhang; Xia, Jianye; Tang, Wenjun; Zhao, Zhimin

    2015-06-01

    Advances in industrial biotechnology depend heavily on the development of industrial bioprocess research. In China, we face several challenges because of the huge national industrial fermentation capacity. Industrial bioprocess development has gone through several main stages. This work reviews the development of industrial bioprocesses in China during the past 30 to 40 years, including the early-stage kinetics model studies derived from classical chemical engineering, research methods based on control theory, multi-parameter analysis based on on-line measurement instruments and techniques, and multi-scale analysis theory, as well as solid-state fermentation techniques and fermenters. In addition, the cutting edge of bioprocess engineering is also addressed.

  13. Multi-Sectional Views Textural Based SVM for MS Lesion Segmentation in Multi-Channels MRIs

    PubMed Central

    Abdullah, Bassem A; Younis, Akmal A; John, Nigel M

    2012-01-01

    In this paper, a new technique is proposed for automatic segmentation of multiple sclerosis (MS) lesions from brain magnetic resonance imaging (MRI) data. The technique uses a trained support vector machine (SVM) to discriminate between blocks in regions of MS lesions and blocks in non-MS-lesion regions, mainly based on textural features with the aid of other features. The classification is done on each of the axial, sagittal and coronal sectional brain views independently, and the resultant segmentations are aggregated to provide a more accurate output segmentation. The main contribution of the proposed technique is the use of textural features to detect MS lesions in a fully automated approach that does not rely on manually delineating the MS lesions. In addition, the technique introduces the concept of multi-sectional view segmentation to produce verified segmentation. The proposed textural-based SVM technique was evaluated using three simulated datasets and more than fifty real MRI datasets, and the results were compared with state-of-the-art methods. The obtained results indicate that the proposed method would be viable for use in clinical practice for the detection of MS lesions in MRI. PMID:22741026

  14. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    PubMed

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices such as soils, sediments, sludge, and airborne particulate matter. In the first part of this review we presented an overview of these procedures and described the applications of chemometric uni- and bivariate techniques, and of multivariate pattern recognition techniques based on variable reduction, to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values and the calculation of element fractionation. We describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied and the key findings obtained. The following topics are critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA) and other less common techniques; modelling by multiple linear regression (MLR); investigation of the spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; and other multivariate techniques less commonly applied. Copyright © 2010 Elsevier B.V. All rights reserved.
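    The MLR modelling step mentioned above amounts to an ordinary least-squares fit of the extracted-element response against the experimental variables. A minimal sketch with synthetic predictors and coefficients for illustration (the variable names pH/extractant/time are assumptions, not data from the review):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(size=(30, 3))               # e.g. pH, extractant conc., time (scaled)
true_coef = np.array([2.0, -1.0, 0.5])
y = X @ true_coef + 0.3 + 0.01 * rng.standard_normal(30)  # response + small noise

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
# coef[0] ~ intercept, coef[1:] ~ effect of each experimental variable
```

    In a chemometric workflow the fitted coefficients are then inspected for the sign and magnitude of each experimental factor's effect on element extraction.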

  15. Reconstructed Image Spatial Resolution of Multiple Coincidences Compton Imager

    NASA Astrophysics Data System (ADS)

    Andreyev, Andriy; Sitek, Arkadiusz; Celler, Anna

    2010-02-01

    We study the multiple coincidences Compton imager (MCCI) which is based on a simultaneous acquisition of several photons emitted in cascade from a single nuclear decay. Theoretically, this technique should provide a major improvement in localization of a single radioactive source as compared to a standard Compton camera. In this work, we investigated the performance and limitations of MCCI using Monte Carlo computer simulations. Spatial resolutions of the reconstructed point source have been studied as a function of the MCCI parameters, including geometrical dimensions and detector characteristics such as materials, energy and spatial resolutions.

  16. Consistent detection and identification of individuals in a large camera network

    NASA Astrophysics Data System (ADS)

    Colombo, Alberto; Leung, Valerie; Orwell, James; Velastin, Sergio A.

    2007-10-01

    In the wake of an increasing number of terrorist attacks, counter-terrorism measures are now a main focus of many research programmes. An important issue for the police is the ability to track individuals and groups reliably through underground stations, and in the case of post-event analysis, to be able to ascertain whether specific individuals have been at the station previously. While many motion detection and tracking algorithms exist, their reliable deployment in a large network remains an open research problem. Specifically, to track individuals through multiple views, on multiple levels and between levels, consistent detection and labelling of individuals is crucial. In view of these issues, we have developed a change detection algorithm that works reliably in the presence of periodic movements, e.g. escalators and scrolling advertisements, as well as a content-based retrieval technique for identification. The change detection technique automatically extracts periodically varying elements in the scene using Fourier analysis, and constructs a Markov model for the process. Training is performed online, and no manual intervention is required, making this system suitable for deployment in large networks. Experiments on real data show significant improvement over existing techniques. The content-based retrieval technique uses MPEG-7 descriptors to identify individuals. Given the environment under which the system operates, i.e. at relatively low resolution, this approach is suitable for short timescales. For longer timescales, other forms of identification such as gait or, if the resolution allows, face recognition will be required.
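    The Fourier analysis step described above, extracting periodically varying scene elements, can be sketched in a minimal form. The power-ratio threshold below is a hypothetical choice for illustration, and the paper's Markov-model stage is not reproduced:

```python
import numpy as np

def dominant_period(intensity, fps=25.0, power_ratio=5.0):
    """Detect a dominant periodic component in a pixel's intensity
    history via the FFT; returns the period in seconds, or None."""
    x = np.asarray(intensity, dtype=float)
    x = x - x.mean()                       # remove the DC component
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    peak = np.argmax(power[1:]) + 1        # skip the zero-frequency bin
    # Declare periodicity only if the peak clearly dominates the mean power
    if power[peak] > power_ratio * power[1:].mean():
        return 1.0 / freqs[peak]
    return None

# A pixel on an escalator step: a strong 2 Hz oscillation plus noise
rng = np.random.default_rng(0)
t = np.arange(256) / 25.0
escalator = 100 + 40 * np.sin(2 * np.pi * 2.0 * t) + rng.standard_normal(256)
print(dominant_period(escalator))  # close to 0.5 s
```

    A static pixel (no periodic component) would return None and be handled by ordinary background modelling.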

  17. [Non-contrast time-resolved magnetic resonance angiography combining high resolution multiple phase echo planar imaging based signal targeting and alternating radiofrequency contrast inherent inflow enhanced multi phase angiography combining spatial resolution echo planar imaging based signal targeting and alternating radiofrequency in intracranial arteries].

    PubMed

    Nakamura, Masanobu; Yoneyama, Masami; Tabuchi, Takashi; Takemura, Atsushi; Obara, Makoto; Sawano, Seishi

    2012-01-01

    Detailed information on anatomy and hemodynamics in cerebrovascular disorders such as AVM and Moyamoya disease is mandatory for defined diagnosis and treatment planning. Arterial spin labeling technique has come to be applied to magnetic resonance angiography (MRA) and perfusion imaging in recent years. However, those non-contrast techniques are mostly limited to single frame images. Recently we have proposed a non-contrast time-resolved MRA technique termed contrast inherent inflow enhanced multi phase angiography combining spatial resolution echo planar imaging based signal targeting and alternating radiofrequency (CINEMA-STAR). CINEMA-STAR can extract the blood flow in the major intracranial arteries at an interval of 70 ms and thus permits us to observe vascular construction in full by preparing MIP images of axial acquisitions with high spatial resolution. This preliminary study demonstrates the usefulness of the CINEMA-STAR technique in evaluating the cerebral vasculature.

  18. Time-Domain Fluorescence Lifetime Imaging Techniques Suitable for Solid-State Imaging Sensor Arrays

    PubMed Central

    Li, David Day-Uei; Ameer-Beg, Simon; Arlt, Jochen; Tyndall, David; Walker, Richard; Matthews, Daniel R.; Visitkul, Viput; Richardson, Justin; Henderson, Robert K.

    2012-01-01

    We have successfully demonstrated video-rate CMOS single-photon avalanche diode (SPAD)-based cameras for fluorescence lifetime imaging microscopy (FLIM) by applying innovative FLIM algorithms. We also review and compare several time-domain techniques and solid-state FLIM systems, and adapt the proposed algorithms for massive CMOS SPAD-based arrays and hardware implementations. The theoretical error equations are derived and their performances are demonstrated on the data obtained from 0.13 μm CMOS SPAD arrays and the multiple-decay data obtained from scanning PMT systems. In vivo two photon fluorescence lifetime imaging data of FITC-albumin labeled vasculature of a P22 rat carcinosarcoma (BD9 rat window chamber) are used to test how different algorithms perform on bi-decay data. The proposed techniques are capable of producing lifetime images with enough contrast. PMID:22778606

  19. Fiber Bragg grating sensor interrogators on chip: challenges and opportunities

    NASA Astrophysics Data System (ADS)

    Marin, Yisbel; Nannipieri, Tiziano; Oton, Claudio J.; Di Pasquale, Fabrizio

    2017-04-01

    In this paper we present an overview of the current efforts towards integration of Fiber Bragg Grating (FBG) sensor interrogators. Different photonic integration platforms will be discussed, including monolithic planar lightwave circuit technology, silicon on insulator (SOI), indium phosphide (InP) and gallium arsenide (GaAs) material platforms. Various possible techniques for wavelength metering and methods for FBG multiplexing will also be discussed and compared in terms of resolution, dynamic performance, multiplexing capabilities and reliability. The use of linear filters, arrayed waveguide gratings (AWG) as multiple linear filters and AWG based centroid signal processing techniques will be addressed, as well as interrogation techniques based on tunable micro-ring resonators and Mach-Zehnder interferometers (MZI) for phase sensitive detection. The paper will also discuss the challenges and perspectives of photonic integration to address the increasing requirements of several industrial applications.

  20. The correlated k-distribution technique as applied to the AVHRR channels

    NASA Technical Reports Server (NTRS)

    Kratz, David P.

    1995-01-01

    Correlated k-distributions have been created to account for the molecular absorption found in the spectral ranges of the five Advanced Very High Resolution Radiometer (AVHRR) satellite channels. The production of the k-distributions was based upon an exponential-sum fitting of transmissions (ESFT) technique which was applied to reference line-by-line absorptance calculations. To account for the overlap of spectral features from different molecular species, the present routines made use of the multiplicative transmissivity property, which allows for considerable flexibility, especially when altering the relative mixing ratios of the various molecular species. To determine the accuracy of the correlated k-distribution technique as compared to the line-by-line procedure, atmospheric flux and heating rate calculations were run for a wide variety of atmospheric conditions. For the atmospheric conditions taken into consideration, the correlated k-distribution technique has yielded results within about 0.5% for both the cases where the satellite spectral response functions were applied and where they were not. The correlated k-distribution's principal advantage is that it can be incorporated directly into multiple scattering routines that consider scattering as well as absorption by clouds and aerosol particles.
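    The overlap treatment described above is easy to sketch: each species' band transmission is represented as an exponential sum (a k-distribution fit), and the overlap of two species is modeled as the product of their individual band transmissions. The two-term coefficients below are illustrative, not the actual AVHRR-channel ESFT fits:

```python
import numpy as np

def transmission(weights, k_coeffs, u):
    """Band-averaged transmission from an exponential-sum fit:
    T(u) = sum_i w_i * exp(-k_i * u), with the w_i summing to one."""
    w, k = np.asarray(weights, float), np.asarray(k_coeffs, float)
    return float(np.sum(w * np.exp(-k * u)))

# Hypothetical two-term ESFT fits for two absorbing species
w1, k1 = [0.7, 0.3], [0.1, 2.0]
w2, k2 = [0.5, 0.5], [0.05, 1.0]

u = 1.5  # absorber amount along the path
# Spectral overlap of the two species: multiply the band transmissions
t_overlap = transmission(w1, k1, u) * transmission(w2, k2, u)
print(round(t_overlap, 3))  # ≈ 0.355
```

    Because each species keeps its own fit, the relative mixing ratios can be changed by rescaling the absorber amounts independently, which is the flexibility the abstract refers to.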

  1. Estimation of liver T₂ in transfusion-related iron overload in patients with weighted least squares T₂ IDEAL.

    PubMed

    Vasanawala, Shreyas S; Yu, Huanzhou; Shimakawa, Ann; Jeng, Michael; Brittain, Jean H

    2012-01-01

    MRI imaging of hepatic iron overload can be achieved by estimating T(2) values using multiple-echo sequences. The purpose of this work is to develop and clinically evaluate a weighted least squares algorithm based on T(2) Iterative Decomposition of water and fat with Echo Asymmetry and Least-squares estimation (IDEAL) technique for volumetric estimation of hepatic T(2) in the setting of iron overload. The weighted least squares T(2) IDEAL technique improves T(2) estimation by automatically decreasing the impact of later, noise-dominated echoes. The technique was evaluated in 37 patients with iron overload. Each patient underwent (i) a standard 2D multiple-echo gradient echo sequence for T(2) assessment with nonlinear exponential fitting, and (ii) a 3D T(2) IDEAL technique, with and without a weighted least squares fit. Regression and Bland-Altman analysis demonstrated strong correlation between conventional 2D and T(2) IDEAL estimation. In cases of severe iron overload, T(2) IDEAL without weighted least squares reconstruction resulted in a relative overestimation of T(2) compared with weighted least squares. Copyright © 2011 Wiley-Liss, Inc.
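    The core idea of down-weighting late, noise-dominated echoes can be sketched with a generic weighted least-squares fit of a multi-echo decay. The log-linearized fit with signal-squared weights below is a standard textbook choice, not the actual T(2) IDEAL reconstruction, and the echo times are invented:

```python
import numpy as np

def t2_weighted_fit(te, signal):
    """Weighted least-squares T2 estimate from a multi-echo decay.
    Log-linearizes S = S0 * exp(-TE / T2) and weights each echo by
    S**2, de-emphasizing late, noise-dominated echoes."""
    te = np.asarray(te, float)
    s = np.asarray(signal, float)
    y = np.log(s)
    sw = s                                # sqrt of the weights w = s**2
    A = np.vstack([np.ones_like(te), -te]).T
    coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    ln_s0, inv_t2 = coef
    return np.exp(ln_s0), 1.0 / inv_t2

te = np.array([1.0, 2.3, 3.6, 4.9, 6.2])     # echo times (ms), illustrative
sig = 100.0 * np.exp(-te / 2.0)              # noiseless decay, T2 = 2 ms
s0_est, t2_est = t2_weighted_fit(te, sig)
print(round(t2_est, 3))  # → 2.0
```

    With severe iron overload (very short T2), the late echoes carry almost pure noise, and an unweighted fit biases T2 upward, which is the overestimation noted in the abstract.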

  2. Planar Laser Imaging of Sprays for Liquid Rocket Studies

    NASA Technical Reports Server (NTRS)

    Lee, W.; Pal, S.; Ryan, H. M.; Strakey, P. A.; Santoro, Robert J.

    1990-01-01

    A planar laser imaging technique which incorporates an optical polarization ratio technique for droplet size measurement was studied. A series of pressure atomized water sprays were studied with this technique and compared with measurements obtained using a Phase Doppler Particle Analyzer. In particular, the effects of assuming a logarithmic normal distribution function for the droplet size distribution within a spray were evaluated. Reasonable agreement between the instruments was obtained for the geometric mean diameter of the droplet distribution. However, comparisons based on the Sauter mean diameter show larger discrepancies, essentially because of uncertainties in the appropriate standard deviation to be applied for the polarization ratio technique. Comparisons were also made between single laser pulse (temporally resolved) measurements and multiple laser pulse visualizations of the spray.

  3. X-ray Phase Contrast Allows Three Dimensional, Quantitative Imaging of Hydrogel Implants

    PubMed Central

    Appel, Alyssa A.; Larson, Jeffery C.; Jiang, Bin; Zhong, Zhong; Anastasio, Mark A.; Brey, Eric M.

    2015-01-01

    Three dimensional imaging techniques are needed for the evaluation and assessment of biomaterials used for tissue engineering and drug delivery applications. Hydrogels are a particularly popular class of materials for medical applications but are difficult to image in tissue using most available imaging modalities. Imaging techniques based on X-ray Phase Contrast (XPC) have shown promise for tissue engineering applications due to their ability to provide image contrast based on multiple X-ray properties. In this manuscript, we investigate the use of XPC for imaging a model hydrogel and soft tissue structure. Porous fibrin loaded poly(ethylene glycol) hydrogels were synthesized and implanted in a rodent subcutaneous model. Samples were explanted and imaged with an analyzer-based XPC technique and processed and stained for histology for comparison. Both hydrogel and soft tissue structures could be identified in XPC images. Structure in adjacent skeletal muscle could be visualized and invading fibrovascular tissue could be quantified. There were no differences between invading tissue measurements from XPC and the gold-standard histology. These results provide evidence of the significant potential of techniques based on XPC for 3D imaging of hydrogel structure and local tissue response. PMID:26487123

  4. X-ray Phase Contrast Allows Three Dimensional, Quantitative Imaging of Hydrogel Implants

    DOE PAGES

    Appel, Alyssa A.; Larson, Jeffrey C.; Jiang, Bin; ...

    2015-10-20

    Three dimensional imaging techniques are needed for the evaluation and assessment of biomaterials used for tissue engineering and drug delivery applications. Hydrogels are a particularly popular class of materials for medical applications but are difficult to image in tissue using most available imaging modalities. Imaging techniques based on X-ray Phase Contrast (XPC) have shown promise for tissue engineering applications due to their ability to provide image contrast based on multiple X-ray properties. In this manuscript we describe results using XPC to image a model hydrogel and soft tissue structure. Porous fibrin loaded poly(ethylene glycol) hydrogels were synthesized and implanted in a rodent subcutaneous model. Samples were explanted and imaged with an analyzer-based XPC technique and processed and stained for histology for comparison. Both hydrogel and soft tissue structures could be identified in XPC images. Structure in adjacent skeletal muscle could be visualized and invading fibrovascular tissue could be quantified. In quantitative results, there were no differences between XPC and the gold-standard histological measurements. These results provide evidence of the significant potential of techniques based on XPC for 3D imaging of hydrogel structure and local tissue response.

  5. Neural network disturbance observer-based distributed finite-time formation tracking control for multiple unmanned helicopters.

    PubMed

    Wang, Dandan; Zong, Qun; Tian, Bailing; Shao, Shikai; Zhang, Xiuyun; Zhao, Xinyi

    2018-02-01

    The distributed finite-time formation tracking control problem for multiple unmanned helicopters is investigated in this paper. The control objective is to maintain the positions of the follower helicopters in formation despite external disturbances. The helicopter model is divided into a second order outer-loop subsystem and a second order inner-loop subsystem based on multiple-time scale features. Using the radial basis function neural network (RBFNN) technique, we first propose a novel finite-time multivariable neural network disturbance observer (FMNNDO) to estimate the external disturbance and model uncertainty, where the neural network (NN) approximation errors can be dynamically compensated by an adaptive law. Next, based on FMNNDO, a distributed finite-time formation tracking controller and a finite-time attitude tracking controller are designed using the nonsingular fast terminal sliding mode (NFTSM) method. In order to estimate the second derivative of the virtual desired attitude signal, a novel finite-time sliding mode integral filter is designed. Finally, Lyapunov analysis and the multiple-time scale principle ensure that the control goal is achieved in finite time. The effectiveness of the proposed FMNNDO and controllers is then verified by numerical simulations. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  6. The Capacity Gain of Orbital Angular Momentum Based Multiple-Input-Multiple-Output System

    PubMed Central

    Zhang, Zhuofan; Zheng, Shilie; Chen, Yiling; Jin, Xiaofeng; Chi, Hao; Zhang, Xianmin

    2016-01-01

    Wireless communication using electromagnetic waves carrying orbital angular momentum (OAM) has attracted increasing interest in recent years, and its potential to increase channel capacity has been explored widely. In this paper, we compare the technique of using a uniform linear array consisting of circular traveling-wave OAM antennas for multiplexing with the conventional multiple-input multiple-output (MIMO) communication method, and numerical results show that the OAM based MIMO system can increase channel capacity provided the communication distance is long enough. An equivalent model is proposed to illustrate that the OAM multiplexing system is equivalent to a conventional MIMO system with a larger element spacing, which means OAM waves can decrease the spatial correlation of the MIMO channel. In addition, the effects of some system parameters, such as OAM state interval and element spacing, on the capacity advantage of OAM based MIMO are also investigated. Our results reveal that OAM waves are complementary with the MIMO method. OAM wave multiplexing is suitable for long-distance line-of-sight (LoS) communications or communications in open areas where the multi-path effect is weak, and can be used in massive MIMO systems as well. PMID:27146453
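    Capacity comparisons of this kind rest on the standard equal-power MIMO capacity formula, which a short sketch can illustrate. The channel matrices below are idealized stand-ins (a perfectly decoupled channel versus a fully correlated one), not the paper's OAM propagation model:

```python
import numpy as np

def mimo_capacity(H, snr):
    """Shannon capacity (bit/s/Hz) of a MIMO channel under equal power
    allocation: C = log2 det(I + (snr / Nt) * H @ H^H)."""
    nr, nt = H.shape
    G = np.eye(nr) + (snr / nt) * (H @ H.conj().T)
    sign, logdet = np.linalg.slogdet(G)
    return logdet / np.log(2.0)

snr = 10.0
orthogonal = np.eye(4)          # perfectly decoupled 4x4 (OAM-like) modes
rank_one = np.ones((4, 4))      # fully correlated LoS channel
print(mimo_capacity(orthogonal, snr))  # 4 * log2(1 + 10/4) ≈ 7.23
print(mimo_capacity(rank_one, snr))    # log2(1 + 40) ≈ 5.36
```

    The orthogonal channel spreads power over four equal eigenmodes and wins at this SNR, which mirrors the abstract's point that reduced spatial correlation raises capacity.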

  7. A Direct Position-Determination Approach for Multiple Sources Based on Neural Network Computation.

    PubMed

    Chen, Xin; Wang, Ding; Yin, Jiexin; Wu, Ying

    2018-06-13

    The most widely used localization technology is the two-step method that localizes transmitters by measuring one or more specified positioning parameters. Direct position determination (DPD) is a promising technique that directly localizes transmitters from sensor outputs and can offer superior localization performance. However, existing DPD algorithms such as maximum likelihood (ML)-based and multiple signal classification (MUSIC)-based estimations are computationally expensive, making it difficult to satisfy real-time demands. To solve this problem, we propose the use of a modular neural network for multiple-source DPD. In this method, the area of interest is divided into multiple sub-areas. Multilayer perceptron (MLP) neural networks are employed to detect the presence of a source in a sub-area and filter sources in other sub-areas, and radial basis function (RBF) neural networks are utilized for position estimation. Simulation results show that a number of appropriately trained neural networks can be successfully used for DPD. The performance of the proposed MLP-MLP-RBF method is comparable to the performance of the conventional MUSIC-based DPD algorithm for various signal-to-noise ratios and signal power ratios. Furthermore, the MLP-MLP-RBF network is less computationally intensive than the classical DPD algorithm and is therefore an attractive choice for real-time applications.
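    The RBF position-estimation stage can be illustrated with a minimal regression sketch. The features, training grid, and gamma below are toy assumptions, and the MLP detection/filtering stages of the proposed MLP-MLP-RBF pipeline are omitted:

```python
import numpy as np

def rbf_fit(centers, targets, gamma=1.0):
    """Solve for RBF output weights by least squares: Phi @ W ~ targets."""
    d2 = ((centers[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    phi = np.exp(-gamma * d2)             # Gaussian kernel matrix
    W, *_ = np.linalg.lstsq(phi, targets, rcond=None)
    return W

def rbf_predict(x, centers, W, gamma=1.0):
    d2 = ((x[None, :] - centers) ** 2).sum(-1)
    return np.exp(-gamma * d2) @ W

# Toy setting: features are noisy observed coordinates, targets are the
# known training-grid source positions (both hypothetical).
rng = np.random.default_rng(0)
grid = np.array([[i, j] for i in range(5) for j in range(5)], dtype=float)
feats = grid + 0.05 * rng.standard_normal(grid.shape)
W = rbf_fit(feats, grid)
est = rbf_predict(np.array([2.02, 2.97]), feats, W)
print(est)  # close to (2, 3)
```

    Once the weights are fitted offline, a prediction is a single kernel evaluation plus a matrix product, which is why such networks suit real-time localization better than a grid search over a likelihood surface.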

  8. Extensive traumatic anterior skull base fractures with cerebrospinal fluid leak: classification and repair techniques using combined vascularized tissue flaps.

    PubMed

    Archer, Jacob B; Sun, Hai; Bonney, Phillip A; Zhao, Yan Daniel; Hiebert, Jared C; Sanclement, Jose A; Little, Andrew S; Sughrue, Michael E; Theodore, Nicholas; James, Jeffrey; Safavi-Abbasi, Sam

    2016-03-01

    This article introduces a classification scheme for extensive traumatic anterior skull base fracture to help stratify surgical treatment options. The authors describe their multilayer repair technique for cerebrospinal fluid (CSF) leak resulting from extensive anterior skull base fracture using a combination of laterally pediculated temporalis fascial-pericranial, nasoseptal-pericranial, and anterior pericranial flaps. Retrospective chart review identified patients treated surgically between January 2004 and May 2014 for anterior skull base fractures with CSF fistulas. All patients were treated with bifrontal craniotomy and received pedicled tissue flaps. Cases were classified according to the extent of fracture: Class I (frontal bone/sinus involvement only); Class II (extent of involvement to ethmoid cribriform plate); and Class III (extent of involvement to sphenoid bone/sinus). Surgical repair techniques were tailored to the types of fractures. Patients were assessed for CSF leak at follow-up. The Fisher exact test was applied to investigate whether the repair techniques were associated with persistent postoperative CSF leak. Forty-three patients were identified in this series. Thirty-seven (86%) were male. The patients' mean age was 33 years (range 11-79 years). The mean overall length of follow-up was 14 months (range 5-45 months). Six fractures were classified as Class I, 8 as Class II, and 29 as Class III. The anterior pericranial flap alone was used in 33 patients (77%). Multiple flaps were used in 10 patients (28%), including 3 salvage cases: 1 with a Class II and 9 with Class III fractures. Five (17%) of the 30 patients with Class II or III fractures who received only a single anterior pericranial flap had persistent CSF leak (p < 0.31). No CSF leak was found in patients who received multiple flaps. Although postoperative CSF leak occurred only in high-grade fractures with single anterior flap repair, this finding was not significant.
Extensive anterior skull base fractures often require aggressive treatment to provide the greatest long-term functional and cosmetic benefits. Several vascularized tissue flaps can be used, either alone or in combination. Vascularized flaps are an ideal substrate for cranial base repair. Dual and triple flap techniques that combine the use of various anterior, lateral, and nasoseptal flaps allow for a comprehensive arsenal in multilayered skull base repair and salvage therapy for extensive and severe fractures.

  9. RBT-GA: a novel metaheuristic for solving the Multiple Sequence Alignment problem.

    PubMed

    Taheri, Javid; Zomaya, Albert Y

    2009-07-07

    Multiple Sequence Alignment (MSA) has always been an active area of research in Bioinformatics. MSA is mainly focused on discovering biologically meaningful relationships among different sequences or proteins in order to investigate their underlying main characteristics/functions. This information is also used to generate phylogenetic trees. This paper presents a novel approach, namely RBT-GA, to solve the MSA problem using a hybrid solution methodology combining the Rubber Band Technique (RBT) and the Genetic Algorithm (GA) metaheuristic. RBT is inspired by the behavior of an elastic Rubber Band (RB) on a plate with several poles, which is analogous to locations in the input sequences that could potentially be biologically related. A GA attempts to mimic the evolutionary processes of life in order to locate optimal solutions in an often very complex landscape. RBT-GA is a population based optimization algorithm designed to find the optimal alignment for a set of input protein sequences. In this novel technique, each alignment answer is modeled as a chromosome consisting of several poles in the RBT framework. These poles resemble locations in the input sequences that are most likely to be correlated and/or biologically related. A GA-based optimization process improves these chromosomes gradually, yielding a set of mostly optimal answers for the MSA problem. RBT-GA is tested with one of the well-known benchmark suites (BAliBASE 2.0) in this area. The obtained results show the superiority of the proposed technique even in the case of formidable sequences.
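    A common fitness measure for candidate alignments in GA-based MSA is the sum-of-pairs score, which a GA can maximize over chromosome populations. The sketch below uses a toy match/mismatch/gap scheme; the values are illustrative, not the scoring actually used by RBT-GA or BAliBASE:

```python
from itertools import combinations

def sum_of_pairs(alignment, match=1, mismatch=-1, gap=-2):
    """Sum-of-pairs score of a gapped alignment (rows of equal length):
    each column contributes the pairwise score of every sequence pair."""
    assert len({len(row) for row in alignment}) == 1, "ragged alignment"
    score = 0
    for column in zip(*alignment):
        for a, b in combinations(column, 2):
            if a == '-' and b == '-':
                continue                  # gap-gap pairs are ignored
            elif a == '-' or b == '-':
                score += gap
            elif a == b:
                score += match
            else:
                score += mismatch
    return score

aln = ["AC-GT",
       "ACAGT",
       "AC-GA"]
print(sum_of_pairs(aln))  # → 4
```

    In a GA, crossover and mutation would reshuffle gap (pole) placements, and this score would rank the resulting chromosomes for selection.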

  10. Spectral-spatial hyperspectral image classification using super-pixel-based spatial pyramid representation

    NASA Astrophysics Data System (ADS)

    Fan, Jiayuan; Tan, Hui Li; Toomik, Maria; Lu, Shijian

    2016-10-01

    Spatial pyramid matching has demonstrated its power for image recognition task by pooling features from spatially increasingly fine sub-regions. Motivated by the concept of feature pooling at multiple pyramid levels, we propose a novel spectral-spatial hyperspectral image classification approach using superpixel-based spatial pyramid representation. This technique first generates multiple superpixel maps by decreasing the superpixel number gradually along with the increased spatial regions for labelled samples. By using every superpixel map, sparse representation of pixels within every spatial region is then computed through local max pooling. Finally, features learned from training samples are aggregated and trained by a support vector machine (SVM) classifier. The proposed spectral-spatial hyperspectral image classification technique has been evaluated on two public hyperspectral datasets, including the Indian Pines image containing 16 different agricultural scene categories with a 20m resolution acquired by AVIRIS and the University of Pavia image containing 9 land-use categories with a 1.3m spatial resolution acquired by the ROSIS-03 sensor. Experimental results show significantly improved performance compared with the state-of-the-art works. The major contributions of this proposed technique include (1) a new spectral-spatial classification approach to generate feature representation for hyperspectral image, (2) a complementary yet effective feature pooling approach, i.e. the superpixel-based spatial pyramid representation that is used for the spatial correlation study, (3) evaluation on two public hyperspectral image datasets with superior image classification performance.
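    The local max-pooling step over superpixel regions can be sketched as follows. The feature array and label map are toy inputs, and the sparse-coding and SVM stages of the full pipeline are omitted:

```python
import numpy as np

def superpixel_max_pool(features, labels):
    """Max-pool per-pixel features within each superpixel region.
    features: (H, W, D) array; labels: (H, W) integer superpixel map.
    Returns {superpixel_id: pooled (D,) feature vector}."""
    return {sp: features[labels == sp].max(axis=0)
            for sp in np.unique(labels)}

# Toy 2x4 "image" with 3-dimensional per-pixel features, two superpixels
feats = np.arange(24, dtype=float).reshape(2, 4, 3)
labels = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1]])
pooled = superpixel_max_pool(feats, labels)
print(pooled[0], pooled[1])  # [15. 16. 17.] [21. 22. 23.]
```

    Repeating this pooling over several superpixel maps of decreasing granularity, then concatenating the pooled vectors, yields the spatial-pyramid representation described above.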

  11. Multiple Beam Interferometry in Elementary Teaching

    ERIC Educational Resources Information Center

    Tolansky, S.

    1970-01-01

    Discusses a relatively simple technique for demonstrating multiple beam interferometry. The technique can be applied to measuring (1) radii of curvature of lenses, (2) surface finish of glass, and (3) differential phase change on reflection. Microtopographies, modulated fringe systems and opaque objects may also be observed by this technique.…

  12. A novel multiple description scalable coding scheme for mobile wireless video transmission

    NASA Astrophysics Data System (ADS)

    Zheng, Haifeng; Yu, Lun; Chen, Chang Wen

    2005-03-01

    We propose in this paper a novel multiple description scalable coding (MDSC) scheme based on the in-band motion compensated temporal filtering (IBMCTF) technique in order to achieve high video coding performance and robust video transmission. The input video sequence is first split into equal-sized groups of frames (GOFs). Within a GOF, each frame is hierarchically decomposed by the discrete wavelet transform. Since there is a direct relationship between wavelet coefficients and what they represent in the image content after wavelet decomposition, we are able to reorganize the spatial orientation trees to generate multiple bit-streams, and we employ the SPIHT algorithm to achieve high coding efficiency. We have shown that multiple bit-stream transmission is very effective in combating error propagation in both Internet video streaming and mobile wireless video. Furthermore, we adopt the IBMCTF scheme to remove the redundancy of inter-frames along the temporal direction using motion compensated temporal filtering; thus high coding performance and flexible scalability can be provided by this scheme. In order to make compressed video resilient to channel errors and to guarantee robust video transmission over mobile wireless channels, we add redundancy to each bit-stream and apply an error concealment strategy for lost motion vectors. Unlike traditional multiple description schemes, the integration of these techniques enables us to generate more than two bit-streams, which may be more appropriate for multiple-antenna transmission of compressed video. Simulation results on standard video sequences have shown that the proposed scheme provides a flexible tradeoff between coding efficiency and error resilience.

  13. Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2000-01-01

    A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
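    For a single-degree-of-freedom Duffing-type system under white-noise excitation, the equivalent linearization idea reduces to a scalar fixed point, which can be sketched as follows. The parameters are illustrative, and the paper's finite-element, multi-degree-of-freedom formulation is far more general:

```python
import math

def equiv_lin_variance(k, c, eps, S0, tol=1e-12, max_iter=200):
    """Equivalent-linearization response variance for a Duffing-type
    SDOF oscillator  m x'' + c x' + k (x + eps x^3) = f(t)  driven by
    white noise of two-sided spectral density S0.  Gaussian closure
    replaces eps*x**3 with 3*eps*var*x, giving the scalar fixed point
    var = pi*S0 / (c*k*(1 + 3*eps*var))."""
    var = math.pi * S0 / (c * k)          # start from the linear answer
    for _ in range(max_iter):
        new = math.pi * S0 / (c * k * (1.0 + 3.0 * eps * var))
        if abs(new - var) < tol:
            return new
        var = new
    return var

lin = equiv_lin_variance(k=1.0, c=0.1, eps=0.0, S0=0.01)
hard = equiv_lin_variance(k=1.0, c=0.1, eps=0.5, S0=0.01)
print(lin, hard)  # the hardening spring reduces the response variance
```

    A numerical-simulation validation of the kind described in the abstract would integrate the nonlinear equation of motion directly under sampled white noise and compare the resulting response variance against this fixed-point estimate.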

  14. A single-sampling hair trap for mesocarnivores

    Treesearch

    Jonathan N. Pauli; Matthew B. Hamilton; Edward B. Crain; Steven W. Buskirk

    2007-01-01

    Although techniques to analyze and quantify DNA-based data have progressed, methods to noninvasively collect samples lag behind. Samples are generally collected from devices that permit coincident sampling of multiple individuals. Because of cross-contamination, substantive genotyping errors can arise. We developed a cost-effective (US$4.60/trap) single-capture hair...

  15. COMPUTER TECHNIQUES FOR WEEKLY MULTIPLE-CHOICE TESTING.

    ERIC Educational Resources Information Center

    BROYLES, DAVID

    To encourage political science students to read properly and continuously, the author gives frequent short quizzes based on the assigned readings. For ease in administration and scoring, he uses mark-sense cards, on which the student marks designated areas to indicate his number and his choice of answers. To emphasize the value of continued high…

  16. Confidence-Based Assessments within an Adult Learning Environment

    ERIC Educational Resources Information Center

    Novacek, Paul

    2013-01-01

    Traditional knowledge assessments rely on multiple-choice type questions that only report a right or wrong answer. The reliance within the education system on this technique implies that a student who provides a correct answer purely through guesswork possesses knowledge equivalent to a student who actually knows the correct answer. A more complete…

  17. Robust Angle Estimation for MIMO Radar with the Coexistence of Mutual Coupling and Colored Noise.

    PubMed

    Wang, Junxiang; Wang, Xianpeng; Xu, Dingjie; Bi, Guoan

    2018-03-09

    This paper deals with joint estimation of direction-of-departure (DOD) and direction-of-arrival (DOA) in bistatic multiple-input multiple-output (MIMO) radar with the coexistence of unknown mutual coupling and spatial colored noise by developing a novel robust covariance tensor-based angle estimation method. In the proposed method, a third-order tensor is first formulated to capture the multidimensional nature of the received data. Then, taking advantage of the temporally uncorrelated characteristic of colored noise and the banded complex symmetric Toeplitz structure of the mutual coupling matrices, a novel fourth-order covariance tensor is constructed to eliminate the influence of both spatial colored noise and mutual coupling. After a robust signal subspace estimate is obtained by using the higher-order singular value decomposition (HOSVD) technique, the rotational invariance technique is applied to obtain the DODs and DOAs. Compared with the existing HOSVD-based subspace methods, the proposed method provides superior angle estimation performance and automatically pairs the DODs and DOAs. Results from numerical experiments are presented to verify the effectiveness of the proposed method.
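    The rotational invariance technique at the core of such methods can be illustrated in its basic one-dimensional ESPRIT form for a uniform linear array. This is a noiseless toy setup, not the bistatic MIMO tensor formulation of the paper:

```python
import numpy as np

def esprit_doa(X, n_sources, d=0.5):
    """Basic 1D ESPRIT for a uniform linear array with element spacing
    d (in wavelengths).  X: (n_elements, n_snapshots) complex data."""
    R = X @ X.conj().T / X.shape[1]               # sample covariance
    _, eigvecs = np.linalg.eigh(R)
    Us = eigvecs[:, -n_sources:]                  # signal subspace
    # Rotational invariance between the two staggered subarrays
    Phi = np.linalg.lstsq(Us[:-1], Us[1:], rcond=None)[0]
    phases = np.angle(np.linalg.eigvals(Phi))
    return np.degrees(np.arcsin(phases / (2 * np.pi * d)))

# Noiseless single source at 20 degrees, 8-element half-wavelength ULA
rng = np.random.default_rng(1)
n, snaps = 8, 200
theta = np.deg2rad(20.0)
steer = np.exp(2j * np.pi * 0.5 * np.arange(n) * np.sin(theta))
s = rng.standard_normal(snaps) + 1j * rng.standard_normal(snaps)
X = np.outer(steer, s)
print(esprit_doa(X, 1))  # ≈ [20.]
```

    The eigenvalues of Phi encode the phase shift between the two shifted subarrays, which is why no spectral search is needed; the tensor method in the abstract replaces the sample covariance with a coupling- and noise-robust HOSVD subspace.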

  18. Strain and Ge concentration determinations in SiGe/Si multiple quantum wells by transmission electron microscopy methods

    NASA Astrophysics Data System (ADS)

    Benedetti, A.; Norris, D. J.; Hetherington, C. J. D.; Cullis, A. G.; Robbins, D. J.; Wallis, D. J.

    2003-04-01

    SiGe/Si multiple quantum wells, nominally 4 nm thick, were grown by low pressure chemical vapor deposition and the Ge distribution within the wells was studied using a variety of transmission electron microscope-based techniques. Energy-dispersive x-ray spectroscopy and electron energy-loss imaging were used to directly measure the Ge compositional profile across the SiGe wells. In addition, the average Ge concentration was deduced indirectly from measurement of the strain-induced lattice displacements in high resolution images, obtained from the relative phase shift of the Si lattice planes on either side of a SiGe well. The results from both the direct and indirect measurement techniques were compared and found to be in good agreement with one another. The Ge profiles exhibited an asymmetric shape consistent with the occurrence of Ge segregation during growth. However, the amplitude of the asymmetry indicated that an additional factor, in particular gas dwell times within the reactor, also needed to be taken into account. Based upon this approach, a successful theoretical model of the growth process was derived.

  19. Robust Angle Estimation for MIMO Radar with the Coexistence of Mutual Coupling and Colored Noise

    PubMed Central

    Wang, Junxiang; Wang, Xianpeng; Xu, Dingjie; Bi, Guoan

    2018-01-01

    This paper deals with joint estimation of direction-of-departure (DOD) and direction-of-arrival (DOA) in bistatic multiple-input multiple-output (MIMO) radar with the coexistence of unknown mutual coupling and spatial colored noise by developing a novel robust covariance tensor-based angle estimation method. In the proposed method, a third-order tensor is first formulated to capture the multidimensional nature of the received data. Then, taking advantage of the temporally uncorrelated characteristic of colored noise and the banded complex symmetric Toeplitz structure of the mutual coupling matrices, a novel fourth-order covariance tensor is constructed to eliminate the influence of both spatial colored noise and mutual coupling. After a robust signal subspace estimate is obtained by using the higher-order singular value decomposition (HOSVD) technique, the rotational invariance technique is applied to obtain the DODs and DOAs. Compared with the existing HOSVD-based subspace methods, the proposed method provides superior angle estimation performance and automatically pairs the DODs and DOAs. Results from numerical experiments are presented to verify the effectiveness of the proposed method. PMID:29522499

  20. A Tensor-Based Subspace Approach for Bistatic MIMO Radar in Spatial Colored Noise

    PubMed Central

    Wang, Xianpeng; Wang, Wei; Li, Xin; Wang, Junxiang

    2014-01-01

    In this paper, a new tensor-based subspace approach is proposed to estimate the direction of departure (DOD) and the direction of arrival (DOA) for bistatic multiple-input multiple-output (MIMO) radar in the presence of spatial colored noise. Firstly, the received signals can be packed into a third-order measurement tensor by exploiting the inherent structure of the matched filter. Then, the measurement tensor can be divided into two sub-tensors, and a cross-covariance tensor is formulated to eliminate the spatial colored noise. Finally, the signal subspace is constructed by utilizing the higher-order singular value decomposition (HOSVD) of the cross-covariance tensor, and the DOD and DOA can be obtained through the estimation of signal parameters via rotational invariance technique (ESPRIT) algorithm, which are paired automatically. Since the multidimensional inherent structure and the cross-covariance tensor technique are used, the proposed method provides better angle estimation performance than Chen's method, the ESPRIT algorithm and the multi-SVD method. Simulation results confirm the effectiveness and the advantage of the proposed method. PMID:24573313

  1. A tensor-based subspace approach for bistatic MIMO radar in spatial colored noise.

    PubMed

    Wang, Xianpeng; Wang, Wei; Li, Xin; Wang, Junxiang

    2014-02-25

    In this paper, a new tensor-based subspace approach is proposed to estimate the direction of departure (DOD) and the direction of arrival (DOA) for bistatic multiple-input multiple-output (MIMO) radar in the presence of spatial colored noise. Firstly, the received signals can be packed into a third-order measurement tensor by exploiting the inherent structure of the matched filter. Then, the measurement tensor can be divided into two sub-tensors, and a cross-covariance tensor is formulated to eliminate the spatial colored noise. Finally, the signal subspace is constructed by utilizing the higher-order singular value decomposition (HOSVD) of the cross-covariance tensor, and the DOD and DOA can be obtained through the estimation of signal parameters via rotational invariance technique (ESPRIT) algorithm, which are paired automatically. Since the multidimensional inherent structure and the cross-covariance tensor technique are used, the proposed method provides better angle estimation performance than Chen's method, the ESPRIT algorithm and the multi-SVD method. Simulation results confirm the effectiveness and the advantage of the proposed method.
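
    The rotational-invariance (ESPRIT) step at the heart of the angle estimation can be sketched in isolation (a toy 1-D ESPRIT on a uniform linear array with half-wavelength spacing and synthetic snapshots; the paper's bistatic tensor formulation generalizes this):

```python
import numpy as np

def esprit_doa(X, d):
    """1-D ESPRIT on ULA snapshots X (sensors x snapshots), d = number of sources.
    Returns DOAs in degrees, assuming half-wavelength element spacing."""
    R = X @ X.conj().T / X.shape[1]            # sample covariance
    U, _, _ = np.linalg.svd(R)
    Us = U[:, :d]                              # signal subspace
    Phi = np.linalg.pinv(Us[:-1]) @ Us[1:]     # rotational-invariance operator
    mu = np.angle(np.linalg.eigvals(Phi))      # inter-subarray phase shifts
    return np.degrees(np.arcsin(mu / np.pi))

# two narrowband sources at -20 and 30 degrees, 8-element ULA, light noise
rng = np.random.default_rng(1)
m, snaps, doas = 8, 200, np.array([-20.0, 30.0])
A = np.exp(1j * np.pi * np.outer(np.arange(m), np.sin(np.radians(doas))))
S = rng.standard_normal((2, snaps)) + 1j * rng.standard_normal((2, snaps))
X = A @ S + 0.01 * (rng.standard_normal((m, snaps)) + 1j * rng.standard_normal((m, snaps)))
print(np.sort(esprit_doa(X, 2)))   # ≈ [-20.  30.]
```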

  2. Approximation-Based Adaptive Neural Tracking Control of Nonlinear MIMO Unknown Time-Varying Delay Systems With Full State Constraints.

    PubMed

    Li, Da-Peng; Li, Dong-Juan; Liu, Yan-Jun; Tong, Shaocheng; Chen, C L Philip

    2017-10-01

    This paper deals with the tracking control problem for a class of nonlinear multiple-input multiple-output unknown time-varying delay systems with full state constraints. To overcome the challenges caused by the simultaneous presence of unknown time-varying delays and full state constraints, an adaptive control method is presented for such systems for the first time. Appropriate Lyapunov-Krasovskii functionals and a separation technique are employed to eliminate the effect of the unknown time-varying delays. Barrier Lyapunov functions are employed to prevent violation of the full state constraints. The singularity problems are dealt with by introducing the signal function. Finally, it is proven that, with appropriately chosen design parameters, the proposed method guarantees good tracking performance of the system output, keeps all states within the constrained interval, and ensures that all closed-loop signals are bounded. The practicability of the proposed control technique is demonstrated by a simulation study in this paper.

  3. Classification of mathematics deficiency using shape and scale analysis of 3D brain structures

    NASA Astrophysics Data System (ADS)

    Kurtek, Sebastian; Klassen, Eric; Gore, John C.; Ding, Zhaohua; Srivastava, Anuj

    2011-03-01

    We investigate the use of a recent technique for shape analysis of brain substructures in identifying learning disabilities in third-grade children. This Riemannian technique provides a quantification of differences in shapes of parameterized surfaces, using a distance that is invariant to rigid motions and re-parameterizations. Additionally, it provides an optimal registration across surfaces for improved matching and comparisons. We utilize an efficient gradient based method to obtain the optimal re-parameterizations of surfaces. In this study we consider 20 different substructures in the human brain and correlate the differences in their shapes with abnormalities manifested in deficiency of mathematical skills in 106 subjects. The selection of these structures is motivated in part by the past links between their shapes and cognitive skills, albeit in broader contexts. We have studied the use of both individual substructures and multiple structures jointly for disease classification. Using a leave-one-out nearest neighbor classifier, we obtained a 62.3% classification rate based on the shape of the left hippocampus. The use of multiple structures resulted in an improved classification rate of 71.4%.
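
    The leave-one-out nearest-neighbour classification used above can be sketched directly from a pairwise shape-distance matrix (toy 2-D points stand in for the Riemannian shape distances between substructure surfaces):

```python
import numpy as np

def loo_nearest_neighbor_rate(D, labels):
    """Leave-one-out 1-NN classification rate from a pairwise distance matrix D.
    Each subject is assigned the label of its nearest other subject."""
    D = D.astype(float).copy()
    np.fill_diagonal(D, np.inf)          # exclude the held-out subject itself
    preds = labels[np.argmin(D, axis=1)]
    return np.mean(preds == labels)

# toy example: two well-separated 2-D clusters standing in for shape distances
rng = np.random.default_rng(2)
pts = np.vstack([rng.normal(0, 0.3, (10, 2)), rng.normal(3, 0.3, (10, 2))])
labels = np.array([0] * 10 + [1] * 10)
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
print(loo_nearest_neighbor_rate(D, labels))   # 1.0 for well-separated clusters
```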

  4. Laplace Transform Based Radiative Transfer Studies

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.

    2006-12-01

    Multiple scattering is the major source of uncertainty in the analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects dominated by single scattering, where photons from the laser beam scatter only once off particles in the atmosphere before reaching the receiver and a simple linear relationship exists between physical properties and the lidar signal. In reality, multiple scattering is always a factor in space-based lidar measurement, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy and phytoplankton. While multiple-scattering returns are clear signals, the lack of a sufficiently fast computational tool for lidar multiple scattering forces us to treat them as unwanted "noise" and to remove them with simple multiple-scattering correction schemes. Such treatments waste the multiple-scattering signal and can cause orders-of-magnitude errors in retrieved physical properties. The lack of fast and accurate time-dependent radiative transfer tools thus significantly limits lidar remote sensing capabilities. Analyzing lidar multiple-scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple scattering is computed with Monte Carlo simulations, which take minutes to hours: too slow for interactive satellite data analysis, and usable only for system/algorithm design and error assessment. We present an innovative physics approach to solving the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. 1. Physics solution: perform Laplace transforms on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the radiative transfer differential equation into a fast matrix inversion problem. The majority of the radiative transfer computation then consists of matrix inversion, FFT and inverse Laplace transforms. 2. Hardware solution: perform the well-defined matrix inversions, FFTs and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves data quality for current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how it may be applied to CALIPSO data analysis.
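
    The core transform idea can be illustrated on a toy linear system (not the lidar equations): a Laplace transform in time turns the time-dependent problem into one matrix inversion per transform variable.

```python
import numpy as np

# For dx/dt = A x with x(0) = x0, the Laplace transform gives the algebraic
# system (sI - A) X(s) = x0, i.e. a matrix inversion for each value of s.
A = np.array([[-1.0, 0.3], [0.2, -0.5]])     # toy coupling matrix (both modes decay)
x0 = np.array([1.0, 0.0])
s = 2.0
X = np.linalg.solve(s * np.eye(2) - A, x0)   # Laplace-domain solution at s

# cross-check against the time-domain solution via X(s) = integral of exp(-s t) x(t)
w, V = np.linalg.eig(A)
coef = np.linalg.solve(V, x0)
ts = np.linspace(0, 60, 600_001)
xt = (np.exp(np.outer(ts, w)) * coef) @ V.T  # x(t) at each time sample
integrand = np.exp(-s * ts)[:, None] * xt
dt = ts[1] - ts[0]
integral = (integrand.sum(axis=0) - 0.5 * (integrand[0] + integrand[-1])) * dt
print(np.allclose(X, integral, atol=1e-4))   # True
```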

  5. Assessing an ensemble docking-based virtual screening strategy for kinase targets by considering protein flexibility.

    PubMed

    Tian, Sheng; Sun, Huiyong; Pan, Peichen; Li, Dan; Zhen, Xuechu; Li, Youyong; Hou, Tingjun

    2014-10-27

    In this study, to accommodate receptor flexibility, based on multiple receptor conformations, a novel ensemble docking protocol was developed by using the naïve Bayesian classification technique, and it was evaluated in terms of the prediction accuracy of docking-based virtual screening (VS) of three important targets in the kinase family: ALK, CDK2, and VEGFR2. First, for each target, the representative crystal structures were selected by structural clustering, and the capability of molecular docking based on each representative structure to discriminate inhibitors from non-inhibitors was examined. Then, for each target, 50 ns molecular dynamics (MD) simulations were carried out to generate an ensemble of the conformations, and multiple representative structures/snapshots were extracted from each MD trajectory by structural clustering. On average, the representative crystal structures outperform the representative structures extracted from MD simulations in terms of the capabilities to separate inhibitors from non-inhibitors. Finally, by using the naïve Bayesian classification technique, an integrated VS strategy was developed to combine the prediction results of molecular docking based on different representative conformations chosen from crystal structures and MD trajectories. It was encouraging to observe that the integrated VS strategy yields better performance than the docking-based VS based on any single rigid conformation. This novel protocol may provide an improvement over existing strategies to search for more diverse and promising active compounds for a target of interest.
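
    The naïve Bayesian combination step can be sketched as follows (a minimal Gaussian naive Bayes treating each conformation's docking score as an independent feature, on synthetic data; the study fused predictions from crystal-structure and MD-snapshot docking):

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian naive Bayes over per-conformation docking scores."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.array([X[y == c].mean(0) for c in self.classes])
        self.var = np.array([X[y == c].var(0) + 1e-9 for c in self.classes])
        self.logprior = np.log([np.mean(y == c) for c in self.classes])
        return self
    def predict(self, X):
        # per-class log-likelihood under independent Gaussian features
        ll = -0.5 * (((X[:, None, :] - self.mu) ** 2) / self.var
                     + np.log(2 * np.pi * self.var)).sum(-1) + self.logprior
        return self.classes[np.argmax(ll, axis=1)]

# toy data: inhibitors score lower (better) than non-inhibitors across 3 conformations
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-9, 1, (50, 3)), rng.normal(-6, 1, (50, 3))])
y = np.array([1] * 50 + [0] * 50)
model = GaussianNB().fit(X, y)
print(np.mean(model.predict(X) == y) > 0.9)   # True: scores separate the classes
```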

  6. Multiple Uses of a Word Study Technique

    ERIC Educational Resources Information Center

    Joseph, Laurice M.; Orlins, Andrew

    2005-01-01

    This paper presents two case studies that illustrate the multiple uses of word sorts, a word study phonics technique. Case study children were Sara, a second grader, who had difficulty with reading basic words and John, a third grader, who had difficulty with spelling basic words. Multiple baseline designs were employed to study the effects of…

  7. Hardware Implementation of Multiple Fan Beam Projection Technique in Optical Fibre Process Tomography

    PubMed Central

    Rahim, Ruzairi Abdul; Fazalul Rahiman, Mohd Hafiz; Leong, Lai Chen; Chan, Kok San; Pang, Jon Fea

    2008-01-01

    The main objective of this project is to implement the multiple fan beam projection technique using optical fibre sensors with the aim of achieving a high data acquisition rate. The multiple fan beam projection technique here is defined as allowing more than one emitter to transmit light at the same time using the switch-mode fan beam method. For the thirty-two pairs of sensors used, the 2-projection and 4-projection techniques are investigated. Sixteen sets of projections complete one frame of light emission for the 2-projection technique, while eight sets of projections complete one frame for the 4-projection technique. To facilitate the data acquisition process, a PIC microcontroller and a sample-and-hold circuit are used. This paper summarizes the hardware configuration and design for this project. PMID:27879885

  8. A Paper-Based Device for Performing Loop-Mediated Isothermal Amplification with Real-Time Simultaneous Detection of Multiple DNA Targets.

    PubMed

    Seok, Youngung; Joung, Hyou-Arm; Byun, Ju-Young; Jeon, Hyo-Sung; Shin, Su Jeong; Kim, Sanghyo; Shin, Young-Beom; Han, Hyung Soo; Kim, Min-Gon

    2017-01-01

    Paper-based diagnostic devices have many advantages as one of several diagnostic test platforms for point-of-care (POC) testing because of their simplicity, portability, and cost-effectiveness. However, despite the high sensitivity and specificity of nucleic acid testing (NAT), the development of NAT on a paper platform has not progressed as far as other assays, because nucleic acid amplification reactions require specific conditions such as pH, buffer components, and temperature, and suffer inhibition arising from the technical particularities of paper-based devices. Here, we propose a paper-based device for performing loop-mediated isothermal amplification (LAMP) with real-time simultaneous detection of multiple DNA targets. We determined the optimal chemical components to enable dry conditions for the LAMP reaction without lyophilization or other techniques. We also devised a simple paper device structure by sequentially stacking functional layers, and employed a newly discovered property of hydroxynaphthol blue fluorescence to analyze real-time LAMP signals in the paper device. The proposed platform allowed analysis of three different meningitis DNA samples in a single device with single-step operation. This LAMP-based multiple diagnostic device has potential for real-time analysis with quantitative detection of 10^2-10^5 copies of genomic DNA. Furthermore, we propose the transformation of DNA amplification devices into a simple and affordable paper system with great potential for realizing a paper-based NAT system for POC testing.

  9. A Paper-Based Device for Performing Loop-Mediated Isothermal Amplification with Real-Time Simultaneous Detection of Multiple DNA Targets

    PubMed Central

    Seok, Youngung; Joung, Hyou-Arm; Byun, Ju-Young; Jeon, Hyo-Sung; Shin, Su Jeong; Kim, Sanghyo; Shin, Young-Beom; Han, Hyung Soo; Kim, Min-Gon

    2017-01-01

    Paper-based diagnostic devices have many advantages as one of several diagnostic test platforms for point-of-care (POC) testing because of their simplicity, portability, and cost-effectiveness. However, despite the high sensitivity and specificity of nucleic acid testing (NAT), the development of NAT on a paper platform has not progressed as far as other assays, because nucleic acid amplification reactions require specific conditions such as pH, buffer components, and temperature, and suffer inhibition arising from the technical particularities of paper-based devices. Here, we propose a paper-based device for performing loop-mediated isothermal amplification (LAMP) with real-time simultaneous detection of multiple DNA targets. We determined the optimal chemical components to enable dry conditions for the LAMP reaction without lyophilization or other techniques. We also devised a simple paper device structure by sequentially stacking functional layers, and employed a newly discovered property of hydroxynaphthol blue fluorescence to analyze real-time LAMP signals in the paper device. The proposed platform allowed analysis of three different meningitis DNA samples in a single device with single-step operation. This LAMP-based multiple diagnostic device has potential for real-time analysis with quantitative detection of 10^2-10^5 copies of genomic DNA. Furthermore, we propose the transformation of DNA amplification devices into a simple and affordable paper system with great potential for realizing a paper-based NAT system for POC testing. PMID:28740546

  10. Experimental demonstration of high spectral efficient 4 × 4 MIMO SCMA-OFDM/OQAM radio over multi-core fiber system.

    PubMed

    Liu, Chang; Deng, Lei; He, Jiale; Li, Di; Fu, Songnian; Tang, Ming; Cheng, Mengfan; Liu, Deming

    2017-07-24

    In this paper, a 4 × 4 multiple-input multiple-output (MIMO) radio over 7-core fiber system based on sparse code multiple access (SCMA) and OFDM/OQAM techniques is proposed. No cyclic prefix (CP) is required because the prototype filters in the OFDM/OQAM modulator are properly designed, and the non-orthogonally overlaid codewords of SCMA help to serve more users simultaneously with the same number of time and frequency resources as OFDMA, increasing spectral efficiency (SE) and system capacity. In our experiment, an 11.04 Gb/s 4 × 4 MIMO SCMA-OFDM/OQAM signal is successfully transmitted over 20 km of 7-core fiber and a 0.4 m air distance in both uplink and downlink. As a comparison, a 6.681 Gb/s traditional MIMO-OFDM signal with the same occupied bandwidth has been evaluated for both uplink and downlink transmission. The experimental results show that the SE can be increased by 65.2% with no bit error rate (BER) performance degradation compared with the traditional MIMO-OFDM technique.
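
    The reported 65.2% spectral-efficiency gain follows directly from the two quoted data rates at the same occupied bandwidth:

```python
# Spectral-efficiency gain implied by the measured rates at equal bandwidth:
# 11.04 Gb/s (SCMA-OFDM/OQAM) vs 6.681 Gb/s (conventional MIMO-OFDM).
scma_rate, ofdm_rate = 11.04, 6.681            # Gb/s
gain_percent = (scma_rate - ofdm_rate) / ofdm_rate * 100
print(round(gain_percent, 1))   # 65.2
```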

  11. Source-space ICA for MEG source imaging.

    PubMed

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography/magnetoencephalography (MEG) source imaging is application of an inverse technique (such as dipole modelling or sLORETA) to the components extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer high spatial resolution. However, sensor-space ICA + beamformer is not an ideal combination for obtaining both the high spatial resolution of the beamformer and the ability to handle multiple concurrent sources. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA in both simulation and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG from two healthy subjects with visual stimuli was also used to compare the performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.

  12. Contribution of multiple inert gas elimination technique to pulmonary medicine. 1. Principles and information content of the multiple inert gas elimination technique.

    PubMed Central

    Roca, J.; Wagner, P. D.

    1994-01-01

    This introductory review summarises four different aspects of the multiple inert gas elimination technique (MIGET). Firstly, the historical background that facilitated, in the mid 1970s, the development of the MIGET as a tool to obtain more information about the entire spectrum of VA/Q distribution in the lung by measuring the exchange of six gases of different solubility in trace concentrations. Its principle is based on the observation that the retention (or excretion) of any gas is dependent on the solubility (lambda) of that gas and the VA/Q distribution. A second major aspect is the analysis of the information content and limitations of the technique. During the last 15 years a substantial amount of clinical research using the MIGET has been generated by several groups around the world. The technique has been shown to be adequate in understanding the mechanisms of hypoxaemia in different forms of pulmonary disease and the effects of therapeutic interventions, but also in separately determining the quantitative role of each extrapulmonary factor on systemic arterial PO2 when they change between two conditions of MIGET measurement. This information will be extensively reviewed in the forthcoming articles of this series. Next, the different modalities of the MIGET, practical considerations involved in the measurements and the guidelines for quality control have been indicated. Finally, a section has been devoted to the analysis of available data in healthy subjects under different conditions. The lack of systematic information on the VA/Q distributions of older healthy subjects is emphasised, since it will be required to fully understand the changes brought about by diseases that affect the older population. PMID:8091330
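
    The retention relation underlying MIGET can be written out explicitly. Below is a minimal sketch with illustrative solubilities and a three-compartment VA/Q distribution; the actual technique uses six specific gases and recovers a near-continuous distribution:

```python
import numpy as np

# Single-compartment inert gas retention: R = lambda / (lambda + VA/Q).
# Whole-lung retention is the perfusion-weighted sum over compartments.
# Solubility and VA/Q values below are placeholders spanning the wide range
# used by MIGET, not the actual values for the six gases.
lam = np.array([0.005, 0.1, 0.8, 3.0, 12.0, 300.0])   # blood:gas solubilities (hypothetical)
va_q = np.array([0.1, 1.0, 10.0])                     # compartment VA/Q ratios
q_frac = np.array([0.2, 0.6, 0.2])                    # fractional perfusion (sums to 1)

retention = (lam[:, None] / (lam[:, None] + va_q[None, :])) @ q_frac
print(np.all(np.diff(retention) > 0))   # True: retention rises with solubility
```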

  13. Development and comparison of projection and image space 3D nodule insertion techniques

    NASA Astrophysics Data System (ADS)

    Robins, Marthony; Solomon, Justin; Sahbaee, Pooyan; Samei, Ehsan

    2016-04-01

    This study aimed to develop and compare two methods of inserting computerized virtual lesions into CT datasets. 24 physical (synthetic) nodules of three sizes and four morphologies were inserted into an anthropomorphic chest phantom (LUNGMAN, KYOTO KAGAKU). The phantom was scanned (Somatom Definition Flash, Siemens Healthcare) with and without nodules present, and images were reconstructed with filtered back projection and iterative reconstruction (SAFIRE) at 0.6 mm slice thickness using a standard thoracic CT protocol at multiple dose settings. Virtual 3D CAD models based on the physical nodules were virtually inserted (accounting for the system MTF) into the nodule-free CT data using two techniques: projection-based and image-based insertion. Nodule volumes were estimated using a commercial segmentation tool (iNtuition, TeraRecon, Inc.). Differences were tested using paired t-tests, and R2 goodness of fit between the virtually and physically inserted nodules was assessed. Both insertion techniques resulted in nodule volumes very similar to those of the real nodules (<3% difference), and in most cases the differences were not statistically significant. Also, R2 values were all >0.97 for both insertion techniques. These data imply that these techniques can confidently be used as a means of inserting virtual nodules in CT datasets. These techniques can be instrumental in building hybrid CT datasets composed of patient images with virtually inserted nodules.
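
    The image-space insertion idea can be sketched as follows (a Gaussian MTF model is an assumption standing in for the measured system MTF used in the study):

```python
import numpy as np

def insert_nodule(background, nodule, mtf_sigma):
    """Image-space insertion sketch: filter the ideal nodule model by a Gaussian
    stand-in for the scanner MTF, then add it to the nodule-free image."""
    ny, nx = nodule.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    mtf = np.exp(-(fx**2 + fy**2) / (2 * mtf_sigma**2))   # Gaussian MTF model
    blurred = np.real(np.fft.ifft2(np.fft.fft2(nodule) * mtf))
    return background + blurred

bg = np.zeros((64, 64))
yy, xx = np.mgrid[:64, :64]
nodule = 100.0 * ((yy - 32) ** 2 + (xx - 32) ** 2 < 36)   # ideal disc, 6-voxel radius
img = insert_nodule(bg, nodule, mtf_sigma=0.15)
print(abs(img.sum() - nodule.sum()) < 1e-6)   # True: blur preserves total contrast
```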

  14. Single, double or multiple-injection techniques for non-ultrasound guided axillary brachial plexus block in adults undergoing surgery of the lower arm.

    PubMed

    Chin, Ki Jinn; Alakkad, Husni; Cubillos, Javier E

    2013-08-08

    Regional anaesthesia comprising axillary block of the brachial plexus is a common anaesthetic technique for distal upper limb surgery. This is an update of a review first published in 2006 and updated in 2011. To compare the relative effects (benefits and harms) of three injection techniques (single, double and multiple) of axillary block of the brachial plexus for distal upper extremity surgery. We considered these effects primarily in terms of anaesthetic effectiveness; the complication rate (neurological and vascular); and pain and discomfort caused by performance of the block. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library), MEDLINE, EMBASE and reference lists of trials. We contacted trial authors. The date of the last search was March 2013 (updated from March 2011). We included randomized controlled trials that compared double with single-injection techniques, multiple with single-injection techniques, or multiple with double-injection techniques for axillary block in adults undergoing surgery of the distal upper limb. We excluded trials using ultrasound-guided techniques. Independent study selection, risk of bias assessment and data extraction were performed by at least two investigators. We undertook meta-analysis. The 21 included trials involved a total of 2148 participants who received regional anaesthesia for hand, wrist, forearm or elbow surgery. Risk of bias assessment indicated that trial design and conduct were generally adequate; the most common areas of weakness were in blinding and allocation concealment. Eight trials comparing double versus single injections showed a statistically significant decrease in primary anaesthesia failure (risk ratio (RR) 0.51, 95% confidence interval (CI) 0.30 to 0.85). Subgroup analysis by method of nerve location showed that the effect size was greater when neurostimulation was used rather than the transarterial technique. Eight trials comparing multiple with single injections showed a statistically significant decrease in primary anaesthesia failure (RR 0.25, 95% CI 0.14 to 0.44) and in incomplete motor block (RR 0.61, 95% CI 0.39 to 0.96) in the multiple injection group. Eleven trials comparing multiple with double injections showed a statistically significant decrease in primary anaesthesia failure (RR 0.28, 95% CI 0.20 to 0.40) and in incomplete motor block (RR 0.55, 95% CI 0.36 to 0.85) in the multiple injection group. Tourniquet pain was significantly reduced with multiple injections compared with double injections (RR 0.53, 95% CI 0.33 to 0.84). Otherwise there were no statistically significant differences between groups in any of the three comparisons on secondary analgesia failure, complications and patient discomfort. The time for block performance was significantly shorter for single and double injections compared with multiple injections. This review provides evidence that multiple-injection techniques using nerve stimulation for axillary plexus block produce more effective anaesthesia than either double or single-injection techniques. However, there was insufficient evidence for a significant difference in other outcomes, including safety.

  15. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H; Lehman, Sean K; Goodman, Dennis M

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra wideband radar array. The processed signal is time-gated and each section processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.
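
    The model-based estimation step can be illustrated on a toy problem: estimate a wall position by minimizing the mean-square error between a predicted echo and the time-gated data. The echo model (a Gaussian pulse at the round-trip delay) and all parameter values are illustrative assumptions, not the patented algorithm:

```python
import numpy as np

t = np.linspace(0, 50e-9, 1000)                    # time axis (s)
c = 3e8                                            # propagation speed (m/s)

def echo(d):
    """Predicted received pulse for a wall at distance d (round-trip delay 2d/c)."""
    return np.exp(-((t - 2 * d / c) / 1e-9) ** 2)

true_d = 3.0                                       # metres
data = echo(true_d) + 0.05 * np.random.default_rng(7).standard_normal(t.size)

# grid search minimizing the mean-square error between model and data
candidates = np.linspace(1.0, 6.0, 501)
errors = [np.mean((data - echo(d)) ** 2) for d in candidates]
d_hat = candidates[int(np.argmin(errors))]
print(round(d_hat, 2))   # ≈ 3.0
```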

  16. Performance analysis of fiber-based free-space optical communications with coherent detection spatial diversity.

    PubMed

    Li, Kangning; Ma, Jing; Tan, Liying; Yu, Siyuan; Zhai, Chao

    2016-06-10

    The performances of fiber-based free-space optical (FSO) communications over gamma-gamma distributed turbulence are studied for multiple aperture receiver systems. The equal gain combining (EGC) technique is considered as a practical scheme to mitigate the atmospheric turbulence. Bit error rate (BER) performances for binary-phase-shift-keying-modulated coherent detection fiber-based free-space optical communications are derived and analyzed for EGC diversity receptions through an approximation method. To show the net diversity gain of a multiple aperture receiver system, BER performances of EGC are compared with a single monolithic aperture receiver system with the same total aperture area (same average total incident optical power on the aperture surface) for fiber-based free-space optical communications. The analytical results are verified by Monte Carlo simulations. System performances are also compared for EGC diversity coherent FSO communications with or without considering fiber-coupling efficiencies.
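
    The EGC comparison at fixed total aperture area can be sketched with a Monte Carlo simulation. This is a simplified co-phased amplitude-sum model of coherent EGC with independent branches; the turbulence parameters are illustrative, not the paper's:

```python
import numpy as np

def gamma_gamma(alpha, beta, size, rng):
    """Unit-mean gamma-gamma irradiance samples (product of two gamma factors)."""
    return rng.gamma(alpha, 1 / alpha, size) * rng.gamma(beta, 1 / beta, size)

def ber_egc(n_apertures, snr_db, trials=200_000, alpha=4.0, beta=1.9, seed=0):
    """Monte Carlo BER of coherent BPSK with equal-gain combining over n apertures,
    total aperture area held fixed, so per-branch SNR scales as 1/n."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10) / n_apertures
    I = gamma_gamma(alpha, beta, (trials, n_apertures), rng)
    # co-phased amplitude sum; branch noises add, hence the 1/n in combined SNR
    gamma_c = np.sqrt(snr * I).sum(axis=1) ** 2 / n_apertures
    errors = rng.standard_normal(trials) > np.sqrt(2 * gamma_c)
    return errors.mean()

print(ber_egc(4, 10) < ber_egc(1, 10))   # True: diversity gain at fixed total area
```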

  17. A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin McCarthy; Milos Manic

    Data fusion requires the ability to combine or "fuse" data from multiple data sources. Time series analysis is a data mining technique used to predict future values of a data set based upon its past values. Unlike other data mining techniques, however, time series analysis places special emphasis on periodicity and on how seasonal and other time-based factors affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to time series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.

  18. Clustering Single-Cell Expression Data Using Random Forest Graphs.

    PubMed

    Pouyan, Maziyar Baran; Nourani, Mehrdad

    2017-07-01

    Complex tissues such as brain and bone marrow are made up of multiple cell types. As the study of biological tissue structure progresses, the role of cell-type-specific research becomes increasingly important. Novel sequencing technology such as single-cell cytometry provides researchers access to valuable biological data. Applying machine-learning techniques to these high-throughput datasets provides deep insights into the cellular landscape of the tissue those cells are part of. In this paper, we propose the use of random-forest-based single-cell profiling, a new machine-learning-based technique, to profile different cell types of intricate tissues using single-cell cytometry data. Our technique utilizes random forests to capture cell marker dependences and model the cellular populations using the cell network concept. This cellular network helps us discover what cell types are in the tissue. Our experimental results on public-domain datasets indicate promising performance and accuracy of our technique in extracting cell populations of complex tissues.
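
    The forest-based profiling idea can be sketched with Breiman-style unsupervised random-forest proximities (scikit-learn is assumed available; the two synthetic populations stand in for cytometry marker data, and this is not the authors' exact pipeline):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rf_proximity(X, n_trees=200, seed=0):
    """Breiman-style unsupervised random-forest proximity: a forest learns to
    separate the real marker matrix from a column-permuted copy (which destroys
    marker dependences); cells co-occurring in the same leaves are 'close'."""
    rng = np.random.default_rng(seed)
    synth = np.column_stack([rng.permutation(col) for col in X.T])
    rf = RandomForestClassifier(n_estimators=n_trees, random_state=seed)
    rf.fit(np.vstack([X, synth]), np.r_[np.ones(len(X)), np.zeros(len(synth))])
    leaves = rf.apply(X)                                     # (cells, trees) leaf ids
    return (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)

# two synthetic cell populations with distinct marker profiles
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0, 1, (40, 5)), rng.normal(4, 1, (40, 5))])
prox = rf_proximity(X)
within = prox[:40, :40].mean()
between = prox[:40, 40:].mean()
print(within > between)   # True: same-population cells share leaves more often
```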

  19. Sparsity-aware multiple relay selection in large multi-hop decode-and-forward relay networks

    NASA Astrophysics Data System (ADS)

    Gouissem, A.; Hamila, R.; Al-Dhahir, N.; Foufou, S.

    2016-12-01

    In this paper, we propose and investigate two novel techniques to perform multiple relay selection in large multi-hop decode-and-forward relay networks. The two proposed techniques exploit sparse signal recovery theory to select multiple relays using the orthogonal matching pursuit algorithm and outperform state-of-the-art techniques in terms of outage probability and computation complexity. To reduce the amount of collected channel state information (CSI), we propose a limited-feedback scheme where only a limited number of relays feedback their CSI. Furthermore, a detailed performance-complexity tradeoff investigation is conducted for the different studied techniques and verified by Monte Carlo simulations.
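
    The orthogonal matching pursuit step can be sketched on a generic sparse-recovery toy problem (the columns of A stand in for candidate relays' signatures; this is not the paper's exact selection criterion):

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the k columns of A most
    correlated with the residual, re-fitting coefficients by least squares."""
    residual, support = y.copy(), []
    for _ in range(k):
        corr = np.abs(A.T @ residual)
        corr[support] = 0                       # exclude already-selected columns
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    return sorted(support)

# noiseless toy problem: which 3 of 120 candidates generated the observation?
rng = np.random.default_rng(5)
A = rng.standard_normal((100, 120))
A /= np.linalg.norm(A, axis=0)
y = A[:, [2, 7, 11]] @ np.array([1.5, -2.0, 1.0])
print(omp(A, y, 3))   # [2, 7, 11]
```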

  20. Least-squares deconvolution of evoked potentials and sequence optimization for multiple stimuli under low-jitter conditions.

    PubMed

    Bardy, Fabrice; Dillon, Harvey; Van Dun, Bram

    2014-04-01

    Rapid presentation of stimuli in an evoked response paradigm can lead to overlap of multiple responses and consequently to difficulties interpreting waveform morphology. This paper presents a deconvolution method allowing overlapping multiple responses to be disentangled. The deconvolution technique uses a least-squares error approach. A methodology is proposed to optimize the stimulus sequence associated with the deconvolution technique under low-jitter conditions; it controls the condition number of the matrices involved in recovering the responses. Simulations were performed using the proposed deconvolution technique. Multiple overlapping responses can be recovered perfectly in noiseless conditions. In the presence of noise, the amount of error introduced by the technique can be controlled a priori by the condition number of the matrix associated with the stimulus sequence used. The simulation results indicate the need for a minimum amount of jitter, as well as a sufficient number of overlap combinations, to obtain optimum results. An aperiodic model is recommended to improve reconstruction. We propose a deconvolution technique allowing multiple overlapping responses to be extracted and a method of choosing the stimulus sequence optimal for response recovery. This technique may allow audiologists, psychologists, and electrophysiologists to optimize their experimental designs involving rapidly presented stimuli, and to recover evoked overlapping responses. Copyright © 2013 International Federation of Clinical Neurophysiology. All rights reserved.
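
    The least-squares deconvolution can be sketched for a single stimulus type with known jittered onsets (the paper's multi-stimulus sequence optimization generalizes this; all waveform and timing values are illustrative):

```python
import numpy as np

def deconv_matrix(onsets, resp_len, total_len):
    """0/1 overlap matrix M with recording = M @ response, for one stimulus
    type presented at the given onset samples."""
    M = np.zeros((total_len, resp_len))
    for t in onsets:
        n = min(resp_len, total_len - t)
        M[t:t + n, :n] += np.eye(n)
    return M

# jittered rapid stimulation: SOAs (20-34 samples) much shorter than the response
rng = np.random.default_rng(6)
resp = np.sin(np.linspace(0, 3 * np.pi, 60))        # true evoked response
onsets = np.cumsum(rng.integers(20, 35, size=12))
M = deconv_matrix(onsets, 60, int(onsets[-1]) + 60)
recording = M @ resp                                # heavily overlapped recording

# least-squares recovery of the single-response waveform
recovered, *_ = np.linalg.lstsq(M, recording, rcond=None)
print(np.allclose(recovered, resp))   # True: noiseless recovery is exact
```

The condition number of `M` (np.linalg.cond(M)) is exactly the quantity the paper proposes to control a priori through the choice of stimulus sequence.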

  1. A brief review of extrusion-based tissue scaffold bio-printing.

    PubMed

    Ning, Liqun; Chen, Xiongbiao

    2017-08-01

    Extrusion-based bio-printing has great potential as a technique for manipulating biomaterials and living cells to create three-dimensional (3D) scaffolds for damaged tissue repair and function restoration. Over the last two decades, advances in both engineering techniques and life sciences have evolved extrusion-based bio-printing from a simple technique to one able to create diverse tissue scaffolds from a wide range of biomaterials and cell types. However, the complexities associated with synthesis of materials for bio-printing and manipulation of multiple materials and cells in bio-printing pose many challenges for scaffold fabrication. This paper presents an overview of extrusion-based bio-printing for scaffold fabrication, focusing on the prior-printing considerations (such as scaffold design and materials/cell synthesis), working principles, comparison to other techniques, and to-date achievements. This paper also briefly reviews the recent development of strategies with regard to hydrogel synthesis, multi-materials/cells manipulation, and process-induced cell damage in extrusion-based bio-printing. The key issues and challenges for extrusion-based bio-printing are also identified and discussed along with recommendations for future research, aimed at developing novel biomaterials and bio-printing systems, creating patterned vascular networks within scaffolds, and preserving cell viability and functions in scaffold bio-printing. Addressing these challenges will significantly enhance the capability of extrusion-based bio-printing. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The purpose of this study is the development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The procedure was also modified to allow coarse parallelization of the solution algorithm. This document is a final report outlining the development and techniques used in the procedure. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Numerical dissipation is used to gain solution stability but is reduced in viscous dominated flow regions. Local time stepping and implicit residual smoothing are used to increase the rate of convergence. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes being generated by the system (TIGG3D) developed earlier under this contract. The grid generation scheme meets the average-passage requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. Pure internal flow solutions were obtained as well as solutions with flow about the cowl/nacelle and various engine core flow conditions. The efficiency of the solution procedure was shown to be the same as the original analysis.
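
    The abstract mentions a four stage Runge-Kutta time marching procedure but gives no further detail; as a generic illustration of multistage time marching (the textbook classical RK4 step, not the dissipation-augmented multistage scheme used in the flow solver):

```python
import math

def rk4_step(f, t, y, h):
    """One classical four-stage Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# march dy/dt = -y from y(0) = 1 to t = 1; the exact answer is exp(-1)
y, t, h = 1.0, 0.0, 0.05
for _ in range(20):
    y = rk4_step(lambda t, y: -y, t, y, h)
    t += h
```

    The fourth-order global accuracy is what makes the 20 coarse steps above land within about 1e-7 of the exact solution.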

  3. Introducing passive acoustic filter in acoustic based condition monitoring: Motor bike piston-bore fault identification

    NASA Astrophysics Data System (ADS)

    Jena, D. P.; Panigrahi, S. N.

    2016-03-01

    The requirement of designing a sophisticated digital band-pass filter in acoustic-based condition monitoring has been eliminated by introducing a passive acoustic filter in the present work. To the best of the authors' knowledge, no one has previously explored the possibility of implementing passive acoustic filters in acoustic-based condition monitoring as a pre-conditioner. In order to enhance acoustic-based condition monitoring, a passive acoustic band-pass filter has been designed and deployed. Towards achieving an efficient band-pass acoustic filter, a generalized design methodology has been proposed to design and optimize the desired acoustic filter using multiple filter components in series. An appropriate objective function has been identified for a genetic algorithm (GA) based optimization technique with multiple design constraints. In addition, the sturdiness of the proposed method has been demonstrated by designing a band-pass filter using an n-branch Quincke tube, a high-pass filter, and multiple Helmholtz resonators. The performance of the designed acoustic band-pass filter has been demonstrated by investigating the piston-bore defect of a motor-bike using the engine noise signature. Introducing a passive acoustic filter into acoustic-based condition monitoring significantly enhances machine-learning-based fault identification. This is also a first attempt of its kind.
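
    The genetic-algorithm optimization mentioned above is not specified further in the abstract; below is a minimal real-coded GA sketch with a hypothetical penalized objective standing in for the acoustic design problem (all parameter values are illustrative):

```python
import random

def ga_minimize(obj, bounds, pop=40, gens=80, seed=1):
    """Tiny real-coded genetic algorithm: tournament selection,
    blend crossover, Gaussian mutation, and elitism."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(pop)]
    best = min(xs, key=obj)
    for _ in range(gens):
        nxt = [best]                                   # elitism: keep best
        while len(nxt) < pop:
            a = min(rng.sample(xs, 3), key=obj)        # tournament pick
            b = min(rng.sample(xs, 3), key=obj)
            child = 0.5 * (a + b) + rng.gauss(0, 0.1 * (hi - lo))  # blend + mutate
            nxt.append(min(max(child, lo), hi))        # clamp to bounds
        xs = nxt
        best = min(xs, key=obj)
    return best

# stand-in objective: attenuation mismatch plus a penalty for a design constraint
def objective(x):
    penalty = 1e3 * max(0.0, 1.0 - x)   # hypothetical constraint x >= 1
    return (x - 2.0) ** 2 + penalty

x_star = ga_minimize(objective, (-5.0, 5.0))
```

    In the paper the decision variables would be filter-component dimensions and the objective a transmission-loss target; the penalty term is one common way to fold multiple design constraints into a GA.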

  4. Parametric study of sensor placement for vision-based relative navigation system of multiple spacecraft

    NASA Astrophysics Data System (ADS)

    Jeong, Junho; Kim, Seungkeun; Suk, Jinyoung

    2017-12-01

    In order to overcome the limited range of GPS-based techniques, vision-based relative navigation methods have recently emerged as alternative approaches for high Earth orbit (HEO) or deep space missions. Accordingly, various vision-based relative navigation systems are used for proximity operations between two spacecraft. For the implementation of these systems, a sensor placement problem can occur on the exterior of a spacecraft due to its limited space. To deal with the sensor placement, this paper proposes a novel methodology for vision-based relative navigation based on multiple position sensitive diode (PSD) sensors and multiple infrared beacon modules. For the proposed method, an iterated parametric study is used based on farthest point optimization (FPO) and a constrained extended Kalman filter (CEKF). The algorithms are applied to set the location of the sensors and to estimate relative positions and attitudes for each combination of PSDs and beacons. After that, scores for the sensor placement are calculated with respect to three parameters: the number of PSDs, the number of beacons, and the accuracy of the relative estimates. Then, the best-scoring candidate is determined for the sensor placement. Moreover, the results of the iterated estimation show that the accuracy improves dramatically as the number of PSDs increases from one to three.

  5. Light-field camera-based 3D volumetric particle image velocimetry with dense ray tracing reconstruction technique

    NASA Astrophysics Data System (ADS)

    Shi, Shengxian; Ding, Junfei; New, T. H.; Soria, Julio

    2017-07-01

    This paper presents a dense ray tracing reconstruction technique for a single light-field camera-based particle image velocimetry. The new approach pre-determines the location of a particle through inverse dense ray tracing and reconstructs the voxel value using multiplicative algebraic reconstruction technique (MART). Simulation studies were undertaken to identify the effects of iteration number, relaxation factor, particle density, voxel-pixel ratio and the effect of the velocity gradient on the performance of the proposed dense ray tracing-based MART method (DRT-MART). The results demonstrate that the DRT-MART method achieves higher reconstruction resolution at significantly better computational efficiency than the MART method (4-50 times faster). Both DRT-MART and MART approaches were applied to measure the velocity field of a low speed jet flow which revealed that for the same computational cost, the DRT-MART method accurately resolves the jet velocity field with improved precision, especially for the velocity component along the depth direction.
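
    The MART update at the core of both reconstruction methods multiplies each voxel crossed by a ray by the ratio of measured to projected intensity, raised to a relaxed weight. A toy sketch of that update rule (the ray geometry and weights are hypothetical, not a light-field camera model):

```python
import numpy as np

def mart(W, p, n_vox, iters=200, mu=0.9):
    """Multiplicative ART: each ray i rescales every voxel j it crosses by
    (measured / projected) ** (mu * w_ij), keeping the field nonnegative."""
    v = np.ones(n_vox)                      # positive initial guess
    for _ in range(iters):
        for i in range(len(p)):
            proj = W[i] @ v                 # current line-of-sight projection
            if proj > 0 and p[i] > 0:
                v *= (p[i] / proj) ** (mu * W[i])
    return v

# toy 4-voxel intensity field observed through 3 weighted "rays"
v_true = np.array([0.5, 2.0, 1.0, 0.1])
W = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
p = W @ v_true                              # simulated pixel intensities
v_hat = mart(W, p, 4)                       # consistent, nonnegative field
```

    The dense-ray-tracing step of the paper would restrict the update to voxels pre-identified as likely particle locations, which is where the reported speed-up comes from.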

  6. Coupling discrete and continuum concentration particle models for multiscale and hybrid molecular-continuum simulations

    NASA Astrophysics Data System (ADS)

    Petsev, Nikolai D.; Leal, L. Gary; Shell, M. Scott

    2017-12-01

    Hybrid molecular-continuum simulation techniques afford a number of advantages for problems in the rapidly burgeoning area of nanoscale engineering and technology, though they are typically quite complex to implement and limited to single-component fluid systems. We describe an approach for modeling multicomponent hydrodynamic problems spanning multiple length scales when using particle-based descriptions for both the finely resolved (e.g., molecular dynamics) and coarse-grained (e.g., continuum) subregions within an overall simulation domain. This technique is based on the multiscale methodology previously developed for mesoscale binary fluids [N. D. Petsev, L. G. Leal, and M. S. Shell, J. Chem. Phys. 144, 084115 (2016)], simulated using a particle-based continuum method known as smoothed dissipative particle dynamics. An important application of this approach is the ability to perform coupled molecular dynamics (MD) and continuum modeling of molecularly miscible binary mixtures. In order to validate this technique, we investigate multicomponent hybrid MD-continuum simulations at equilibrium, as well as non-equilibrium cases featuring concentration gradients.

  7. Stripe-PZT Sensor-Based Baseline-Free Crack Diagnosis in a Structure with a Welded Stiffener.

    PubMed

    An, Yun-Kyu; Shen, Zhiqi; Wu, Zhishen

    2016-09-16

    This paper proposes a stripe-PZT sensor-based baseline-free crack diagnosis technique in the heat affected zone (HAZ) of a structure with a welded stiffener. The proposed technique enables one to identify and localize a crack in the HAZ using only current data measured using a stripe-PZT sensor. The use of the stripe-PZT sensor makes it possible to significantly improve the applicability to real structures and minimize man-made errors associated with the installation process by embedding multiple piezoelectric sensors onto a printed circuit board. Moreover, a new frequency-wavenumber analysis-based baseline-free crack diagnosis algorithm minimizes false alarms caused by environmental variations by avoiding simple comparison with the baseline data accumulated from the pristine condition of a target structure. The proposed technique is numerically as well as experimentally validated using a plate-like structure with a welded stiffener, revealing that it successfully identifies and localizes a crack in the HAZ.
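
    The frequency-wavenumber analysis underlying the diagnosis algorithm is, at its simplest, a 2D FFT of the measured wavefield over time and space; a generic sketch with a synthetic plane wave (this is the f-k decomposition only, not the authors' baseline-free crack metric):

```python
import numpy as np

# synthetic wavefield u(t, x): one plane wave with 8 temporal cycles
# and 4 spatial cycles over the sampled record (hypothetical values)
nt, nx = 64, 32
t = np.arange(nt) / nt
x = np.arange(nx) / nx
u = np.sin(2 * np.pi * (8 * t[:, None] - 4 * x[None, :]))

# frequency-wavenumber spectrum via a 2D FFT over (time, space)
U = np.abs(np.fft.fft2(u))

# marginal energies locate the dominant frequency and wavenumber bins
peak_f = int(np.argmax(U.sum(axis=1)[1 : nt // 2])) + 1   # temporal bin
peak_k = int(np.argmax(U.sum(axis=0)[1 : nx // 2])) + 1   # spatial bin
```

    In the crack-diagnosis setting, energy appearing in wavenumber bands that a pristine plate cannot support is what flags mode conversion at a defect.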

  8. Stripe-PZT Sensor-Based Baseline-Free Crack Diagnosis in a Structure with a Welded Stiffener

    PubMed Central

    An, Yun-Kyu; Shen, Zhiqi; Wu, Zhishen

    2016-01-01

    This paper proposes a stripe-PZT sensor-based baseline-free crack diagnosis technique in the heat affected zone (HAZ) of a structure with a welded stiffener. The proposed technique enables one to identify and localize a crack in the HAZ using only current data measured using a stripe-PZT sensor. The use of the stripe-PZT sensor makes it possible to significantly improve the applicability to real structures and minimize man-made errors associated with the installation process by embedding multiple piezoelectric sensors onto a printed circuit board. Moreover, a new frequency-wavenumber analysis-based baseline-free crack diagnosis algorithm minimizes false alarms caused by environmental variations by avoiding simple comparison with the baseline data accumulated from the pristine condition of a target structure. The proposed technique is numerically as well as experimentally validated using a plate-like structure with a welded stiffener, revealing that it successfully identifies and localizes a crack in the HAZ. PMID:27649200

  9. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms.

    PubMed

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro

    2015-06-15

    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.

  10. Floating Node Method and Virtual Crack Closure Technique for Modeling Matrix Cracking-Delamination Interaction

    NASA Technical Reports Server (NTRS)

    DeCarvalho, N. V.; Chen, B. Y.; Pinho, S. T.; Baiz, P. M.; Ratcliffe, J. G.; Tay, T. E.

    2013-01-01

    A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.

  11. Floating Node Method and Virtual Crack Closure Technique for Modeling Matrix Cracking-Delamination Migration

    NASA Technical Reports Server (NTRS)

    DeCarvalho, Nelson V.; Chen, B. Y.; Pinho, Silvestre T.; Baiz, P. M.; Ratcliffe, James G.; Tay, T. E.

    2013-01-01

    A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.

  12. 3D polymer gel dosimetry using a 3D (DESS) and a 2D MultiEcho SE (MESE) sequence

    NASA Astrophysics Data System (ADS)

    Maris, Thomas G.; Pappas, Evangelos; Karolemeas, Kostantinos; Papadakis, Antonios E.; Zacharopoulou, Fotini; Papanikolaou, Nickolas; Gourtsoyiannis, Nicholas

    2006-12-01

    The utilization of 3D techniques in Magnetic Resonance Imaging data acquisition and post-processing analysis is a prerequisite, especially when modern radiotherapy techniques (conformal RT, IMRT, stereotactic RT) are to be used. The aim of this work is to compare a 3D Double Echo Steady State (DESS) and a 2D Multiple Echo Spin Echo (MESE) sequence in 3D MRI radiation dosimetry using two different MRI scanners and utilising N-vinylpyrrolidone (VIPAR)-based polymer gels.

  13. Evaluation of accuracy of complete-arch multiple-unit abutment-level dental implant impressions using different impression and splinting materials.

    PubMed

    Buzayan, Muaiyed; Baig, Mirza Rustum; Yunus, Norsiah

    2013-01-01

    This in vitro study evaluated the accuracy of multiple-unit dental implant casts obtained from splinted or nonsplinted direct impression techniques using various splinting materials by comparing the casts to the reference models. The effect of two different impression materials on the accuracy of the implant casts was also evaluated for abutment-level impressions. A reference model with six internal-connection implant replicas placed in the completely edentulous mandibular arch and connected to multi-base abutments was fabricated from heat-curing acrylic resin. Forty impressions of the reference model were made, 20 each with polyether (PE) and polyvinylsiloxane (PVS) impression materials using the open tray technique. The PE and PVS groups were further subdivided into four subgroups of five each on the basis of splinting type: no splinting, bite registration PE, bite registration addition silicone, or autopolymerizing acrylic resin. The positional accuracy of the implant replica heads was measured on the poured casts using a coordinate measuring machine to assess linear differences in interimplant distances in all three axes. The collected data (linear and three-dimensional [3D] displacement values) were compared with the measurements calculated on the reference resin model and analyzed with nonparametric tests (Kruskal-Wallis and Mann-Whitney). No significant differences were found between the various splinting groups for both PE and PVS impression materials in terms of linear and 3D distortions. However, small but significant differences were found between the two impression materials (PVS, 91 μm; PE, 103 μm) in terms of 3D discrepancies, irrespective of the splinting technique employed. Casts obtained from both impression materials exhibited differences from the reference model. The impression material influenced impression inaccuracy more than the splinting material for multiple-unit abutment-level impressions.
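
    The accuracy metric described above, linear and 3D deviations of inter-implant positions from a reference model, can be sketched as follows (the coordinates and the uniform-shift "error" are hypothetical, chosen so the expected result is obvious):

```python
import numpy as np

# reference and measured implant-head coordinates in mm (illustrative values)
ref = np.array([[0.0, 0.0, 0.0],
                [12.0, 3.0, 0.5],
                [24.0, 4.0, 0.7],
                [36.0, 3.0, 0.5],
                [48.0, 0.0, 0.0]])
shift = np.array([0.05, -0.03, 0.08])   # toy cast error: a uniform 3D shift
cast = ref + shift

def interimplant_distances(pts):
    """All pairwise 3D distances between implant positions."""
    diff = pts[:, None, :] - pts[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

# per-implant 3D discrepancy and distortion of inter-implant distances
disp = np.linalg.norm(cast - ref, axis=1)
dist_err = np.abs(interimplant_distances(cast) - interimplant_distances(ref))
```

    A purely rigid displacement leaves all inter-implant distances unchanged, which is why studies of this kind report the pairwise-distance distortions rather than raw coordinates.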

  14. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementations. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real-time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG-welding using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
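
    The electronic-temperature estimate at the heart of this technique follows from the Boltzmann relation between emission lines of the same species; a two-line sketch with illustrative (not NIST-verified) line constants:

```python
import math

K_B = 8.617333e-5  # Boltzmann constant in eV/K

def line_intensity(A, g, lam_nm, E_upper_eV, T):
    """Relative emission-line intensity, I proportional to (A*g/lam)*exp(-E_u/kT)."""
    return A * g / lam_nm * math.exp(-E_upper_eV / (K_B * T))

def two_line_temperature(I1, I2, A1, g1, lam1, E1, A2, g2, lam2, E2):
    """Electron temperature from the intensity ratio of two lines of one species."""
    ratio = (I1 * A2 * g2 * lam1) / (I2 * A1 * g1 * lam2)
    return (E2 - E1) / (K_B * math.log(ratio))

# hypothetical Ar-I-like line pair; synthesize intensities at 10 000 K
T_true = 10000.0
p1 = (5e7, 5, 763.5, 13.17)    # (A, g, wavelength nm, upper-level energy eV)
p2 = (2e7, 3, 811.5, 13.08)
I1 = line_intensity(*p1, T_true)
I2 = line_intensity(*p2, T_true)
T_est = two_line_temperature(I1, I2, *p1, *p2)
```

    With more than two lines the same relation becomes a Boltzmann plot, whose slope gives -1/kT; that is the multi-peak version the paper's real-time constraint bears on.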

  15. Multiradar tracking for theater missile defense

    NASA Astrophysics Data System (ADS)

    Sviestins, Egils

    1995-09-01

    A prototype system for tracking tactical ballistic missiles using multiple radars has been developed. The tracking is based on measurement-level ('true' multi-radar) fusion. Strobes from passive sensors can also be used. We describe various features of the system with some emphasis on the filtering technique, which is based on the Interacting Multiple Model framework where the states are Free Flight, Drag, Boost, and Auxiliary. Measurement error modeling includes the signal-to-noise ratio dependence; outliers and miscorrelations are handled in the same way. The launch point is calculated within one minute of the detection of the missile. The impact point, and its uncertainty region, is calculated continually by extrapolating the track state vector using the equations of planetary motion.

  16. Sparsity-Aware DOA Estimation Scheme for Noncircular Source in MIMO Radar.

    PubMed

    Wang, Xianpeng; Wang, Wei; Li, Xin; Liu, Qi; Liu, Jing

    2016-04-14

    In this paper, a novel sparsity-aware direction of arrival (DOA) estimation scheme for a noncircular source is proposed in multiple-input multiple-output (MIMO) radar. In the proposed method, the reduced-dimensional transformation technique is adopted to eliminate the redundant elements. Then, exploiting the noncircularity of signals, a joint sparsity-aware scheme based on the reweighted l1 norm penalty is formulated for DOA estimation, in which the diagonal elements of the weight matrix are the coefficients of the noncircular MUSIC-like (NC MUSIC-like) spectrum. Compared to the existing l1 norm penalty-based methods, the proposed scheme provides higher angular resolution and better DOA estimation performance. Results from numerical experiments are used to show the effectiveness of our proposed method.

  17. 2x2 MIMO-OFDM Gigabit fiber-wireless access system based on polarization division multiplexed WDM-PON.

    PubMed

    Deng, Lei; Pang, Xiaodan; Zhao, Ying; Othman, M B; Jensen, Jesper Bevensee; Zibar, Darko; Yu, Xianbin; Liu, Deming; Monroy, Idelfonso Tafur

    2012-02-13

    We propose a spectrally efficient radio over wavelength division multiplexed passive optical network (WDM-PON) system by combining optical polarization division multiplexing (PDM) and wireless multiple input multiple output (MIMO) spatial multiplexing techniques. In our experiment, a training-based zero forcing (ZF) channel estimation algorithm is designed to compensate the polarization rotation and wireless multipath fading. A 797 Mb/s net data rate QPSK-OFDM signal with error-free (BER < 1 × 10^-5) performance and a 1.59 Gb/s net data rate 16QAM-OFDM signal with a BER of 1.2 × 10^-2 are achieved after transmission over 22.8 km of single mode fiber followed by 3 m and 1 m air distances, respectively.

  18. A simple test of association for contingency tables with multiple column responses.

    PubMed

    Decady, Y J; Thomas, D R

    2000-09-01

    Loughin and Scherer (1998, Biometrics 54, 630-637) investigated tests of association in two-way tables when one of the categorical variables allows for multiple-category responses from individual respondents. Standard chi-squared tests are invalid in this case, and they developed a bootstrap test procedure that provides good control of test levels under the null hypothesis. This procedure and some others that have been proposed are computationally involved and are based on techniques that are relatively unfamiliar to many practitioners. In this paper, the methods introduced by Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) for analyzing complex survey data are used to develop a simple test based on a corrected chi-squared statistic.

  19. Study of repeater technology for advanced multifunctional communications satellites

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Investigations are presented concerning design concepts and implementation approaches for the satellite communication repeater subsystems of advanced multifunctional satellites. In such systems the important concepts are the use of multiple antenna beams, repeater switching (routing), and efficient spectrum utilization through frequency reuse. An information base on these techniques was developed and tradeoff analyses were made of repeater design concepts, with design taken in a broad sense to include modulation and beam coverage patterns. There were five major areas of study: requirements analysis and processing; study of interbeam interference in multibeam systems; characterization of multiple-beam switching repeaters; estimation of repeater weight and power for a number of alternatives; and tradeoff analyses based on these weight and power data.

  20. On-chip wavelength multiplexed detection of cancer DNA biomarkers in blood

    PubMed Central

    Cai, H.; Stott, M. A.; Ozcelik, D.; Parks, J. W.; Hawkins, A. R.; Schmidt, H.

    2016-01-01

    We have developed an optofluidic analysis system that processes biomolecular samples starting from whole blood and then analyzes and identifies multiple targets on a silicon-based molecular detection platform. We demonstrate blood filtration, sample extraction, target enrichment, and fluorescent labeling using programmable microfluidic circuits. We detect and identify multiple targets using a spectral multiplexing technique based on wavelength-dependent multi-spot excitation on an antiresonant reflecting optical waveguide chip. Specifically, we extract two types of melanoma biomarkers, the mutated cell-free nucleic acids BRAFV600E and NRAS, from whole blood. We detect and identify these two targets simultaneously using the spectral multiplexing approach with up to a 96% success rate. These results point the way toward a full front-to-back chip-based optofluidic compact system for high-performance analysis of complex biological samples. PMID:28058082

  1. Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device

    PubMed Central

    He, Xiang; Aloi, Daniel N.; Li, Jia

    2015-01-01

    Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design. PMID:26694387
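
    The probabilistic fusion step of such a system can be sketched as a single particle-filter update in which each sensor's likelihood multiplies the particle weights (a 1-D toy with hypothetical readings, not the paper's HMM/graph model):

```python
import numpy as np

rng = np.random.default_rng(42)

# 1-D toy corridor: fuse two independent position sensors in one update
particles = rng.uniform(0.0, 10.0, 5000)        # prior spread over the corridor
weights = np.ones_like(particles)
for z, sigma in [(2.9, 0.5), (3.1, 0.8)]:       # (reading, noise std) per sensor
    weights *= np.exp(-0.5 * ((z - particles) / sigma) ** 2)  # likelihood product
weights /= weights.sum()
estimate = float(np.sum(weights * particles))   # posterior-mean position

# systematic resampling rebalances the particle set for the next time step
cum = np.cumsum(weights)
cum[-1] = 1.0                                    # guard against float round-off
idx = np.searchsorted(cum, (rng.random() + np.arange(5000)) / 5000)
particles = particles[idx]
```

    Multiplying per-sensor likelihoods is what makes the fusion "probabilistic": a sensor with smaller noise (here sigma = 0.5) automatically pulls the estimate harder than a noisier one.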

  2. Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device.

    PubMed

    He, Xiang; Aloi, Daniel N; Li, Jia

    2015-12-14

    Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design.

  3. Feature and Score Fusion Based Multiple Classifier Selection for Iris Recognition

    PubMed Central

    Islam, Md. Rabiul

    2014-01-01

    The aim of this work is to propose a new feature and score fusion based iris recognition approach in which a voting method on the Multiple Classifier Selection technique has been applied. The outputs of four Discrete Hidden Markov Model classifiers, that is, a left iris based unimodal system, a right iris based unimodal system, a left-right iris feature fusion based multimodal system, and a left-right iris likelihood ratio score fusion based multimodal system, are combined using the voting method to achieve the final recognition result. The CASIA-IrisV4 database has been used to measure the performance of the proposed system with various dimensions. Experimental results show the versatility of the proposed system of four different classifiers with various dimensions. Finally, the recognition accuracy of the proposed system has been compared with the existing Hamming distance score fusion approach proposed by Ma et al., the log-likelihood ratio score fusion approach proposed by Schmid et al., and the single-level feature fusion approach proposed by Hollingsworth et al. PMID:25114676

  4. Feature and score fusion based multiple classifier selection for iris recognition.

    PubMed

    Islam, Md Rabiul

    2014-01-01

    The aim of this work is to propose a new feature and score fusion based iris recognition approach in which a voting method on the Multiple Classifier Selection technique has been applied. The outputs of four Discrete Hidden Markov Model classifiers, that is, a left iris based unimodal system, a right iris based unimodal system, a left-right iris feature fusion based multimodal system, and a left-right iris likelihood ratio score fusion based multimodal system, are combined using the voting method to achieve the final recognition result. The CASIA-IrisV4 database has been used to measure the performance of the proposed system with various dimensions. Experimental results show the versatility of the proposed system of four different classifiers with various dimensions. Finally, the recognition accuracy of the proposed system has been compared with the existing Hamming distance score fusion approach proposed by Ma et al., the log-likelihood ratio score fusion approach proposed by Schmid et al., and the single-level feature fusion approach proposed by Hollingsworth et al.

  5. Retrodiction for Bayesian multiple-hypothesis/multiple-target tracking in densely cluttered environment

    NASA Astrophysics Data System (ADS)

    Koch, Wolfgang

    1996-05-01

    Sensor data processing in a dense target/dense clutter environment is inevitably confronted with data association conflicts which correspond with the multiple hypothesis character of many modern approaches (MHT: multiple hypothesis tracking). In this paper we analyze the efficiency of retrodictive techniques that generalize standard fixed interval smoothing to MHT applications. 'Delayed estimation' based on retrodiction provides uniquely interpretable and accurate trajectories from ambiguous MHT output if a certain time delay is tolerated. In a Bayesian framework the theoretical background of retrodiction and its intimate relation to Bayesian MHT is sketched. By a simulated example with two closely-spaced targets, relatively low detection probabilities, and rather high false return densities, we demonstrate the benefits of retrodiction and quantitatively discuss the achievable track accuracies and the time delays involved for typical radar parameters.
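
    Retrodiction generalizes fixed-interval smoothing to the MHT setting; for a single linear-Gaussian track it reduces to the Rauch-Tung-Striebel smoother, sketched below on a hypothetical constant-velocity target (the multiple-hypothesis machinery itself is omitted):

```python
import numpy as np

def kalman_rts(zs, F, H, Q, R, x0, P0):
    """Forward Kalman filter followed by the Rauch-Tung-Striebel
    backward pass (fixed-interval smoothing, i.e. 'retrodiction')."""
    n = len(zs)
    xf, Pf, xp, Pp = [], [], [], []
    x, P = x0, P0
    for z in zs:
        x_pred, P_pred = F @ x, F @ P @ F.T + Q            # predict
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)                # Kalman gain
        x = x_pred + K @ (z - H @ x_pred)                  # update
        P = (np.eye(len(x0)) - K @ H) @ P_pred
        xp.append(x_pred); Pp.append(P_pred); xf.append(x); Pf.append(P)
    xs, Ps = xf[:], Pf[:]
    for k in range(n - 2, -1, -1):                         # backward sweep
        C = Pf[k] @ F.T @ np.linalg.inv(Pp[k + 1])         # smoother gain
        xs[k] = xf[k] + C @ (xs[k + 1] - xp[k + 1])
        Ps[k] = Pf[k] + C @ (Ps[k + 1] - Pp[k + 1]) @ C.T
    return np.array(xf), np.array(xs)

# constant-velocity target observed in position only (illustrative values)
rng = np.random.default_rng(7)
F = np.array([[1.0, 1.0], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q, R = 0.01 * np.eye(2), np.array([[1.0]])
zs = [np.array([0.5 * k + rng.normal(0, 1.0)]) for k in range(40)]
xf, xs = kalman_rts(zs, F, H, Q, R, np.zeros(2), 10.0 * np.eye(2))
```

    The smoothed estimate at the final scan coincides with the filtered one; the gain in accuracy, bought with the time delay the abstract mentions, appears at all earlier scans.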

  6. Distributed finite-time containment control for double-integrator multiagent systems.

    PubMed

    Wang, Xiangyu; Li, Shihua; Shi, Peng

    2014-09-01

    In this paper, the distributed finite-time containment control problem for double-integrator multiagent systems with multiple leaders and external disturbances is discussed. In the presence of multiple dynamic leaders, by utilizing the homogeneous control technique, a distributed finite-time observer is developed for the followers to estimate the weighted average of the leaders' velocities at first. Then, based on the estimates and the generalized adding a power integrator approach, distributed finite-time containment control algorithms are designed to guarantee that the states of the followers converge to the dynamic convex hull spanned by those of the leaders in finite time. Moreover, as a special case of multiple dynamic leaders with zero velocities, the proposed containment control algorithms also work for the case of multiple stationary leaders without using the distributed observer. Simulations demonstrate the effectiveness of the proposed control algorithms.

  7. Measurement and modeling of the acoustic field near an underwater vehicle and implications for acoustic source localization.

    PubMed

    Lepper, Paul A; D'Spain, Gerald L

    2007-08-01

    The performance of traditional techniques of passive localization in ocean acoustics such as time-of-arrival (phase differences) and amplitude ratios measured by multiple receivers may be degraded when the receivers are placed on an underwater vehicle due to effects of scattering. However, knowledge of the interference pattern caused by scattering provides a potential enhancement to traditional source localization techniques. Results based on a study using data from a multi-element receiving array mounted on the inner shroud of an autonomous underwater vehicle show that scattering causes the localization ambiguities (side lobes) to decrease in overall level and to move closer to the true source location, thereby improving localization performance, for signals in the frequency band 2-8 kHz. These measurements are compared with numerical modeling results from a two-dimensional time domain finite difference scheme for scattering from two fluid-loaded cylindrical shells. Measured and numerically modeled results are presented for multiple source aspect angles and frequencies. Matched field processing techniques quantify the source localization capabilities for both measurements and numerical modeling output.

  8. Fast algorithm for spectral processing with application to on-line welding quality assurance

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.

    2006-10-01

    A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate in real-time the plasma electronic temperature. The estimation of the electronic temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms will be analysed and compared, and it will be shown that the LPO (linear phase operator) sub-pixel algorithm is a better solution within the proposed system. Experimental tests during TIG-welding using a fibre optic to capture the arc light, together with a low cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
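
    As an illustration of sub-pixel line-center estimation, the sketch below uses the standard three-point log-parabolic interpolation (a generic technique, not the LPO operator studied in the paper); for a noise-free Gaussian emission line it recovers the center exactly.

```python
import numpy as np

def subpixel_peak(y):
    """Estimate the sub-pixel position of the maximum of a sampled peak.

    Fits a parabola through the log-intensities of the peak pixel and its
    two neighbours; for a noise-free Gaussian line shape this is exact.
    Returns a fractional pixel index.
    """
    i = int(np.argmax(y))
    L, C, R = np.log(y[i - 1]), np.log(y[i]), np.log(y[i + 1])
    delta = (L - R) / (2.0 * (L - 2.0 * C + R))
    return i + delta

# Gaussian emission line centred between pixels
pixels = np.arange(32)
center = 10.3
line = np.exp(-((pixels - center) ** 2) / (2 * 1.5 ** 2))
print(subpixel_peak(line))  # ≈ 10.3
```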

  9. Using stereophotogrammetric technology for obtaining intraoral digital impressions of implants.

    PubMed

    Pradíes, Guillermo; Ferreiroa, Alberto; Özcan, Mutlu; Giménez, Beatriz; Martínez-Rus, Francisco

    2014-04-01

    The procedure for making impressions of multiple implants continues to be a challenge, despite the various techniques proposed to date. The authors' objective in this case report is to describe a novel digital impression method for multiple implants involving the use of stereophotogrammetric technology. The authors present three cases of patients who had multiple implants in which the impressions were obtained with this technology. Initially, a stereo camera with an infrared flash detects the position of special flag abutments screwed into the implants. This process is based on registering the x, y and z coordinates of each implant and the distances between them. This information is converted into a stereolithographic (STL) file. To add the soft-tissue information, the user must obtain another STL file by using an intraoral or extraoral scanner. In the first case presented, this information was acquired from the plaster model with an extraoral scanner; in the second case, from a Digital Imaging and Communication in Medicine (DICOM) file of the plaster model obtained with cone-beam computed tomography; and in the third case, through an intraoral digital impression with a confocal scanner. In the three cases, the frameworks manufactured from this technique showed a correct clinical passive fit. At follow-up appointments held six, 12 and 24 months after insertion of the prosthesis, no complications were reported. Stereophotogrammetric technology is a viable, accurate and easy technique for making multiple implant impressions. Clinicians can use stereophotogrammetric technology to acquire reliable digital master models as a first step in producing frameworks with a correct passive fit.

  10. SIRE: a MIMO radar for landmine/IED detection

    NASA Astrophysics Data System (ADS)

    Ojowu, Ode; Wu, Yue; Li, Jian; Nguyen, Lam

    2013-05-01

    Multiple-input multiple-output (MIMO) radar systems have been shown to offer significant performance improvements over their single-input multiple-output (SIMO) counterparts. For collocated transmit and receive elements, the waveform diversity afforded by MIMO radar is exploited for performance gains, including but not limited to improved target detection, improved parameter identifiability, and better resolvability. In this paper, we present the Synchronous Impulse Reconstruction (SIRE) ultra-wideband (UWB) radar designed by the Army Research Laboratory (ARL) for landmine and improvised explosive device (IED) detection as a 2-by-16 MIMO radar with collocated antennas. Its improvements over its SIMO counterpart in terms of beampattern and cross-range resolution are discussed and demonstrated using simulated data. The limitations of this radar for radio-frequency interference (RFI) suppression are also discussed. A relaxation method (RELAX), combined with averaging over multiple realizations of the measured data, is presented for RFI suppression; results show no noticeable target-signature distortion after suppression. The data-independent back-projection (delay-and-sum) method is used for generating SAR images, and a side-lobe reduction technique called recursive side-lobe minimization (RSM) is discussed for this data-independent approach. We also introduce a data-dependent sparsity-based spectral estimation technique called Sparse Learning via Iterative Minimization (SLIM), as well as a data-dependent CLEAN approach, for generating SAR images for the SIRE radar. These data-adaptive techniques show improved side-lobe reduction and resolution on simulated data for the SIRE radar.
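
    The delay-and-sum back-projection imager mentioned above can be sketched in a few lines: simulate monostatic echoes from a point target at several aperture positions, then, for each image pixel, sum every echo sampled at that pixel's two-way delay. All geometry, pulse-shape and grid parameters below are illustrative assumptions, not the SIRE configuration.

```python
import numpy as np

c = 1.0                        # propagation speed (arbitrary units)
target = np.array([0.0, 5.0])  # true scatterer position (x, z)
xs = np.linspace(-2, 2, 9)     # aperture (antenna) positions along x
t = np.linspace(0, 20, 2000)   # fast-time axis

def pulse(tau):
    """Narrow Gaussian ranging pulse."""
    return np.exp(-(tau ** 2) / (2 * 0.05 ** 2))

# Simulated monostatic echoes: a pulse at the two-way delay to the target
echoes = [pulse(t - 2 * np.hypot(x - target[0], target[1]) / c) for x in xs]

# Back-projection: sum each echo at every pixel's two-way delay
gx = np.linspace(-2, 2, 41)
gz = np.linspace(3, 7, 41)
image = np.zeros((len(gz), len(gx)))
for e, x in zip(echoes, xs):
    for iz, z in enumerate(gz):
        for ix, px in enumerate(gx):
            tau = 2 * np.hypot(px - x, z) / c
            image[iz, ix] += np.interp(tau, t, e)

iz, ix = np.unravel_index(np.argmax(image), image.shape)
print(gx[ix], gz[iz])  # peak of the image falls at the true target position
```

Echoes add coherently only at the true target location, which is exactly the mechanism that also produces the side lobes that RSM and the data-adaptive methods aim to suppress.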

  11. Satellite Remote Sensing of Harmful Algal Blooms (HABs) and a Potential Synthesized Framework

    PubMed Central

    Shen, Li; Xu, Huiping; Guo, Xulin

    2012-01-01

    Harmful algal blooms (HABs) are severe ecological disasters threatening aquatic systems throughout the world, which necessitates scientific efforts to detect and monitor them. Compared with traditional in situ point observations, satellite remote sensing is considered a promising technique for studying HABs due to its advantages of large-scale, real-time, and long-term monitoring. The present review summarizes the suitability of current satellite data sources and different algorithms for detecting HABs. It also discusses the spatial-scale issue of HABs. Based on the major problems identified from previous literature, including the unsystematic understanding of HABs, the insufficient incorporation of satellite remote sensing, and a lack of multiple oceanographic explanations of the mechanisms causing HABs, this review also attempts to provide a comprehensive understanding of the complicated mechanism of HABs as impacted by multiple oceanographic factors. A potential synthesized framework can be established by combining multiple accessible satellite remote sensing approaches, including visual interpretation, spectral analysis, parameter retrieval and spatial-temporal pattern analysis. This framework aims to lead to a systematic and comprehensive monitoring of HABs based on satellite remote sensing from multiple oceanographic perspectives. PMID:22969372

  12. Combination of multiple model population analysis and mid-infrared technology for the estimation of copper content in Tegillarca granosa

    NASA Astrophysics Data System (ADS)

    Hu, Meng-Han; Chen, Xiao-Jing; Ye, Peng-Chao; Chen, Xi; Shi, Yi-Jian; Zhai, Guang-Tao; Yang, Xiao-Kang

    2016-11-01

    The aim of this study was to use mid-infrared spectroscopy coupled with multiple model population analysis based on Monte Carlo uninformative variable elimination (MC-UVE) for rapidly estimating the copper content of Tegillarca granosa. Copper-specific wavelengths were first extracted from the whole spectra, and subsequently a least-squares support vector machine was used to develop the prediction models. Compared with the prediction model based on the full wavelength range, models that used 100 multiple-MC-UVE-selected wavelengths without and with the bin operation showed comparable performance, with Rp values (root mean square error of prediction in parentheses) of 0.97 (14.60 mg/kg) and 0.94 (20.85 mg/kg) versus 0.96 (17.27 mg/kg), as well as ratios of percent deviation (number of wavelengths in parentheses) of 2.77 (407) and 1.84 (45) versus 2.32 (1762). The results demonstrate that the mid-infrared technique can be used to estimate copper content in T. granosa. In addition, the proposed multiple model population analysis effectively eliminates uninformative, weakly informative and interfering wavelengths, which substantially reduces model complexity and computation time.

  13. Systematic procedure for designing processes with multiple environmental objectives.

    PubMed

    Kim, Ki-Joo; Smith, Raymond L

    2005-04-01

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.

  14. Electrophysiological models of neural processing.

    PubMed

    Nelson, Mark E

    2011-01-01

    The brain is an amazing information processing system that allows organisms to adaptively monitor and control complex dynamic interactions with their environment across multiple spatial and temporal scales. Mathematical modeling and computer simulation techniques have become essential tools in understanding diverse aspects of neural processing ranging from sub-millisecond temporal coding in the sound localization circuitry of barn owls to long-term memory storage and retrieval in humans that can span decades. The processing capabilities of individual neurons lie at the core of these models, with the emphasis shifting upward and downward across different levels of biological organization depending on the nature of the questions being addressed. This review provides an introduction to the techniques for constructing biophysically based models of individual neurons and local networks. Topics include Hodgkin-Huxley-type models of macroscopic membrane currents, Markov models of individual ion-channel currents, compartmental models of neuronal morphology, and network models involving synaptic interactions among multiple neurons.

  15. Balancing focused combinatorial libraries based on multiple GPCR ligands

    NASA Astrophysics Data System (ADS)

    Soltanshahi, Farhad; Mansley, Tamsin E.; Choi, Sun; Clark, Robert D.

    2006-08-01

    G-Protein coupled receptors (GPCRs) are important targets for drug discovery, and combinatorial chemistry is an important tool for pharmaceutical development. The absence of detailed structural information, however, limits the kinds of combinatorial design techniques that can be applied to GPCR targets. This is particularly problematic given the current emphasis on focused combinatorial libraries. By linking an incremental construction method (OptDesign) to the very fast shape-matching capability of ChemSpace, we have created an efficient method for designing targeted sublibraries that are topomerically similar to known actives. Multi-objective scoring allows consideration of multiple queries (actives) simultaneously. This can lead to a distribution of products skewed towards one particular query structure, however, particularly when the ligands of interest are quite dissimilar to one another. A novel pivoting technique is described which makes it possible to generate promising designs even under those circumstances. The approach is illustrated by application to some serotonergic agonists and chemokine antagonists.

  16. Model-based cell number quantification using online single-oxygen sensor data for tissue engineering perfusion bioreactors.

    PubMed

    Lambrechts, T; Papantoniou, I; Sonnaert, M; Schrooten, J; Aerts, J-M

    2014-10-01

    Online and non-invasive quantification of critical tissue engineering (TE) construct quality attributes in TE bioreactors is indispensable for the cost-effective up-scaling and automation of cellular construct manufacturing. However, appropriate monitoring techniques for cellular constructs in bioreactors are still lacking. This study presents a generic and robust approach to determine cell number and metabolic activity of cell-based TE constructs in perfusion bioreactors based on single oxygen-sensor data in dynamic perfusion conditions. A data-based mechanistic modeling technique was used that is able to correlate the number of cells within the scaffold (R² = 0.80) and the metabolic activity of the cells (R² = 0.82) to the dynamics of the oxygen response to step changes in the perfusion rate. This generic non-destructive measurement technique is effective for a large range of cells, from as low as 1.0 × 10⁵ cells to potentially multiple millions of cells, and can open up new possibilities for effective bioprocess monitoring. © 2014 Wiley Periodicals, Inc.

  17. Laser direct-write for fabrication of three-dimensional paper-based devices.

    PubMed

    He, P J W; Katis, I N; Eason, R W; Sones, C L

    2016-08-16

    We report the use of a laser-based direct-write (LDW) technique that allows the design and fabrication of three-dimensional (3D) structures within a paper substrate that enables implementation of multi-step analytical assays via a 3D protocol. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depths of hydrophobic barriers that are formed within a substrate which, when carefully designed and integrated, produce 3D flow paths. So far, we have successfully used this depth-variable patterning protocol for stacking and sealing of multi-layer substrates, for assembly of backing layers for two-dimensional (2D) lateral flow devices and finally for fabrication of 3D devices. Since the 3D flow paths can also be formed via a single laser-writing process by controlling the patterning parameters, this is a distinct improvement over other methods that require multiple complicated and repetitive assembly procedures. This technique is therefore suitable for cheap, rapid and large-scale fabrication of 3D paper-based microfluidic devices.

  18. Scheduling Real-Time Mixed-Criticality Jobs

    NASA Astrophysics Data System (ADS)

    Baruah, Sanjoy K.; Bonifaci, Vincenzo; D'Angelo, Gianlorenzo; Li, Haohan; Marchetti-Spaccamela, Alberto; Megow, Nicole; Stougie, Leen

    Many safety-critical embedded systems are subject to certification requirements; some systems may be required to meet multiple sets of certification requirements, from different certification authorities. Certification requirements in such "mixed-criticality" systems give rise to interesting scheduling problems that cannot be satisfactorily addressed using techniques from conventional scheduling theory. In this paper, we study a formal model for representing such mixed-criticality workloads. We demonstrate first the intractability of determining whether a system specified in this model can be scheduled to meet all its certification requirements, even for systems subject to only two sets of certification requirements. Then we quantify, via the metric of processor speedup factor, the effectiveness of two techniques, reservation-based scheduling and priority-based scheduling, that are widely used in scheduling such mixed-criticality systems, showing that the latter is superior to the former. We also show that the speedup factors are tight for these two techniques.

  19. Magnetic induction tomography of objects for security applications

    NASA Astrophysics Data System (ADS)

    Ward, Rob; Joseph, Max; Langley, Abbi; Taylor, Stuart; Watson, Joe C.

    2017-10-01

    A coil array imaging system has been further developed from previous investigations, focusing on its application to fast screening of small bags or parcels, with a view to the production of a compact instrument for security applications. In addition to reducing image acquisition times, work was directed toward exploring potential cost-effective manufacturing routes. Based on magnetic induction tomography and eddy-current principles, the instrument captured images of conductive targets using a lock-in amplifier, individually multiplexing signals between a primary driver coil and a 20-by-21 imaging array of secondary passive coils constructed using a reproducible multiple-tile design. The design was based on additive manufacturing techniques and provided two orthogonal imaging planes with the ability to reconstruct images in less than 10 seconds. An assessment of one of the imaging planes is presented. This approach potentially provides a cost-effective threat-evaluation technique that may complement conventional radiographic approaches.

  20. Phased-mission system analysis using Boolean algebraic methods

    NASA Technical Reports Server (NTRS)

    Somani, Arun K.; Trivedi, Kishor S.

    1993-01-01

    Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state-space explosion that commonly plagues Markov chain-based analyses. A phase algebra was developed to account for the effects of variable configurations and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate the use of our technique by means of an example and present numerical results to show the effects of mission phases on the system reliability.
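
    For a small system, exact phased-mission reliability can also be obtained by brute-force enumeration, which makes the role of phase-dependent success criteria concrete. The sketch below is an illustration, not the Boolean phase-algebra method of the paper: it assumes exponential component lifetimes and conservatively treats a component that fails during a phase as down for that entire phase.

```python
import math
from itertools import product

# Toy 2-phase mission, 3 components with constant failure rates (per hour)
rates = {"A": 0.01, "B": 0.02, "C": 0.02}
# (duration in hours, success criterion as a predicate on the set of working comps)
phases = [
    (10.0, lambda up: "A" in up and ("B" in up or "C" in up)),  # phase 1
    (5.0,  lambda up: "B" in up and "C" in up),                 # phase 2
]

def mission_reliability(rates, phases):
    comps = sorted(rates)
    n = len(phases)
    # dist[c][k] = P(component c first fails during phase k); k = n means "never"
    dist = {}
    for c in comps:
        p, alive = [], 1.0
        for dur, _ in phases:
            s = math.exp(-rates[c] * dur)
            p.append(alive * (1.0 - s))
            alive *= s
        p.append(alive)
        dist[c] = p
    rel = 0.0
    for outcome in product(range(n + 1), repeat=len(comps)):
        prob = math.prod(dist[c][k] for c, k in zip(comps, outcome))
        # conservative: a component failing during phase j is down for phase j
        ok = all(crit({c for c, k in zip(comps, outcome) if k > j})
                 for j, (_, crit) in enumerate(phases))
        if ok:
            rel += prob
    return rel

print(mission_reliability(rates, phases))
```

Enumeration is exponential in the number of components, which is precisely why scalable techniques such as the phase algebra described above are needed.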

  1. An Optimized Integrator Windup Protection Technique Applied to a Turbofan Engine Control

    NASA Technical Reports Server (NTRS)

    Watts, Stephen R.; Garg, Sanjay

    1995-01-01

    This paper introduces a new technique for providing memoryless integrator windup protection which utilizes readily available optimization software tools. This integrator windup protection synthesis provides a concise methodology for creating integrator windup protection for each actuation system loop independently while assuring both controller and closed-loop system stability. The individual actuation system loops' integrator windup protection can then be combined to provide integrator windup protection for the entire system. This technique is applied to an H∞-based multivariable control designed for a linear model of an advanced afterburning turbofan engine. The resulting transient characteristics are examined for the integrated system while encountering single and multiple actuation limits.

  2. C-MOS array design techniques: SUMC multiprocessor system study

    NASA Technical Reports Server (NTRS)

    Clapp, W. A.; Helbig, W. A.; Merriam, A. S.

    1972-01-01

    The current capabilities of LSI techniques for speed and reliability, plus the possibilities of assembling large configurations of LSI logic and storage elements, have demanded the study of multiprocessors and multiprocessing techniques, problems, and potentialities. Evaluated are three previous systems studies for a space ultrareliable modular computer multiprocessing system, and a new multiprocessing system is proposed that is flexibly configured with up to four central processors, four I/O processors, and 16 main memory units, plus auxiliary memory and peripheral devices. This multiprocessor system features a multilevel interrupt, qualified S/360 compatibility for ground-based generation of programs, virtual memory management of a storage hierarchy through I/O processors, and multiport access to multiple and shared memory units.

  3. Large antenna experiments aboard the space shuttle: Application of nonuniform sampling techniques

    NASA Technical Reports Server (NTRS)

    Rahmatsamii, Y.

    1988-01-01

    Future satellite communication and scientific spacecraft will utilize antennas with dimensions as large as 20 meters. In order to commercially use these large, low sidelobe and multiple beam antennas, a high level of confidence must be established as to their performance in the 0-g and space environment. Furthermore, it will be desirable to demonstrate the applicability of surface compensation techniques for slowly varying surface distortions which could result from thermal effects. An overview of recent advances in performing RF measurements on large antennas is presented with emphasis given to the application of a space based far-field range utilizing the Space Shuttle and the concept of a newly developed nonuniform sampling technique.

  4. Compton imaging tomography technique for NDE of large nonuniform structures

    NASA Astrophysics Data System (ADS)

    Grubsky, Victor; Romanov, Volodymyr; Patton, Ned; Jannson, Tomasz

    2011-09-01

    In this paper we describe a new nondestructive evaluation (NDE) technique called Compton Imaging Tomography (CIT) for reconstructing the complete three-dimensional internal structure of an object, based on the registration of multiple two-dimensional Compton-scattered x-ray images of the object. CIT provides high resolution and sensitivity with virtually any material, including lightweight structures and organics, which normally pose problems in conventional x-ray computed tomography because of low contrast. The CIT technique requires only one-sided access to the object, has no limitation on the object's size, and can be applied to high-resolution real-time in situ NDE of large aircraft/spacecraft structures and components. Theoretical and experimental results will be presented.

  5. Negative base encoding in optical linear algebra processors

    NASA Technical Reports Server (NTRS)

    Perlee, C.; Casasent, D.

    1986-01-01

    In the digital multiplication by analog convolution algorithm, the bits of two encoded numbers are convolved to form the product of the two numbers in mixed binary representation; this output can be easily converted to binary. Attention is presently given to negative base encoding, treating base -2 initially, and then showing that the negative base system can be readily extended to any radix. In general, negative base encoding in optical linear algebra processors represents a more efficient technique than either sign magnitude or 2's complement encoding, when the additions of digitally encoded products are performed in parallel.
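
    A minimal sketch of base −2 encoding (digit by digit, least significant first), illustrating that every integer, positive or negative, has a representation with digits 0 and 1 and no sign bit:

```python
def to_negabase(n, base=-2):
    """Encode integer n as a digit list (least-significant first) in a negative base."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        n, r = divmod(n, base)
        if r < 0:        # force a non-negative digit and adjust the quotient
            r -= base    # i.e. r += |base|
            n += 1
        digits.append(r)
    return digits

def from_negabase(digits, base=-2):
    return sum(d * base**i for i, d in enumerate(digits))

# 6 encodes as digits [0, 1, 0, 1, 1], i.e. 11010 in base -2:
# 16 - 8 + 0 - 2 + 0 = 6
print(to_negabase(6))
```

Because no sign digit is needed, products of such encodings can be accumulated directly, which is the efficiency advantage over sign-magnitude and 2's-complement encoding noted above.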

  6. Anovulation and ovulation induction

    PubMed Central

    Katsikis, I; Kita, M; Karkanaki, A; Prapas, N; Panidis, D

    2006-01-01

    Conventional treatment of normogonadotropic anovulatory infertility is ovulation induction using the antiestrogen clomiphene citrate, followed by follicle-stimulating hormone. Multiple follicle development, associated with ovarian hyperstimulation, and multiple pregnancy remain the major complications. Cumulative singleton and multiple pregnancy rate data after different induction treatments are needed. Newer ovulation induction interventions, such as insulin-sensitizing drugs, aromatase inhibitors and laparoscopic ovarian electrocoagulation, should be compared with conventional treatments. Ovulation induction efficiency might improve if patient subgroups with altered chances for success or complications with new or conventional techniques could be identified, using multivariate prediction models based on initial screening characteristics. This would make ovulation induction more cost-effective, safe and convenient, enabling doctors to advise patients on the most effective and patient-tailored treatment strategy. PMID:20351807

  7. Drag reduction of a car model by linear genetic programming control

    NASA Astrophysics Data System (ADS)

    Li, Ruiying; Noack, Bernd R.; Cordier, Laurent; Borée, Jacques; Harambat, Fabien

    2017-08-01

    We investigate open- and closed-loop active control for aerodynamic drag reduction of a car model. Turbulent flow around a blunt-edged Ahmed body is examined at Re_H ≈ 3 × 10^5 based on body height. The actuation is performed with pulsed jets at all trailing edges (multiple inputs) combined with a Coanda deflection surface. The flow is monitored with 16 pressure sensors distributed at the rear side (multiple outputs). We apply a recently developed model-free control strategy building on genetic programming in Dracopoulos and Kent (Neural Comput Appl 6:214-228, 1997) and Gautier et al. (J Fluid Mech 770:424-441, 2015). The optimized control laws comprise periodic forcing, multi-frequency forcing and sensor-based feedback, including time-history information feedback and combinations thereof. A key enabler is linear genetic programming (LGP) as a powerful regression technique for optimizing the multiple-input multiple-output control laws. The proposed LGP control can select the best open- or closed-loop control in an unsupervised manner. Approximately 33% base pressure recovery, associated with 22% drag reduction, is achieved in all considered classes of control laws. Intriguingly, the feedback actuation emulates periodic high-frequency forcing. In addition, the control automatically identified the only sensor that listens to high-frequency flow components with a good signal-to-noise ratio. Our control strategy is, in principle, applicable to all experiments with multiple actuators and sensors.

  8. Monitoring and identification of spatiotemporal landscape changes in multiple remote sensing images by using a stratified conditional Latin hypercube sampling approach and geostatistical simulation.

    PubMed

    Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh

    2011-06-01

    This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple, remotely sensed, normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS-selected samples. Spatial statistical results indicate that, in terms of their statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of the multiple NDVI images more closely than do those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples and VQT samples, respectively. The proposed approach thus efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
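
    The conditioned, stratified variant used in the paper builds on plain Latin hypercube sampling, which is itself easy to sketch: split each variable's range into n equal strata, draw exactly one sample per stratum, then shuffle the columns independently. This is a generic illustration, not the scLHS algorithm.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, rng=None):
    """Plain Latin hypercube sample on [0, 1)^d: each variable's range is
    split into n_samples equal strata and each stratum is sampled once."""
    rng = np.random.default_rng(rng)
    # row i, before shuffling, lies in stratum [i/n, (i+1)/n)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):          # decouple the variables
        rng.shuffle(u[:, j])
    return u

x = latin_hypercube(10, 3, rng=0)
# each column contains exactly one sample per stratum
print(np.sort((x * 10).astype(int), axis=0)[:, 0])
```

The stratification guarantees coverage of each variable's full range with far fewer samples than simple random sampling, which is the property the scLHS approach exploits for efficient landscape monitoring.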

  9. Technique of nonvascularized toe phalangeal transfer and distraction lengthening in the treatment of multiple digit symbrachydactyly.

    PubMed

    Netscher, David T; Lewis, Eric V

    2008-06-01

    A combination of nonvascularized multiple toe phalangeal transfers, web space deepening, and distraction lengthening may provide excellent function in the child born with the oligodactylous type of symbrachydactyly. These techniques may reconstruct multiple digits, maintaining a wide and stable grip span with good prehension to the thumb. We detail the techniques of each of these 3 stages in reconstruction and describe appropriate patient selection. Potential complications are discussed. However, with strict attention to technical details, these complications can be minimized.

  10. Indirect fabrication of multiple post-and-core patterns with a vinyl polysiloxane matrix.

    PubMed

    Sabbak, Sahar Asaad

    2002-11-01

    In the described technique, a vinyl polysiloxane material is used as a matrix for the indirect fabrication of multiple custom-cast posts and cores. The matrix technique enables the clinician to fabricate multiple posts and cores in a short period of time. The form, harmony, and common axis of preparation for all cores are well controlled before the definitive crown/fixed partial denture restorations are fabricated. Oral tissues are not exposed to the heat of polymerization or the excess monomer of the resin material when this technique is used.

  11. A variance-decomposition approach to investigating multiscale habitat associations

    USGS Publications Warehouse

    Lawler, J.J.; Edwards, T.C.

    2006-01-01

    The recognition of the importance of spatial scale in ecology has led many researchers to take multiscale approaches to studying habitat associations. However, few of the studies that investigate habitat associations at multiple spatial scales have considered the potential effects of cross-scale correlations in measured habitat variables. When cross-scale correlations in such studies are strong, conclusions drawn about the relative strength of habitat associations at different spatial scales may be inaccurate. Here we adapt and demonstrate an analytical technique based on variance decomposition for quantifying the influence of cross-scale correlations on multiscale habitat associations. We used the technique to quantify the variation in nest-site locations of Red-naped Sapsuckers (Sphyrapicus nuchalis) and Northern Flickers (Colaptes auratus) associated with habitat descriptors at three spatial scales. We demonstrate how the method can be used to identify components of variation that are associated only with factors at a single spatial scale as well as shared components of variation that represent cross-scale correlations. Despite the fact that no explanatory variables in our models were highly correlated (r < 0.60), we found that shared components of variation reflecting cross-scale correlations accounted for roughly half of the deviance explained by the models. These results highlight the importance of both conducting habitat analyses at multiple spatial scales and of quantifying the effects of cross-scale correlations in such analyses. Given the limits of conventional analytical techniques, we recommend alternative methods, such as the variance-decomposition technique demonstrated here, for analyzing habitat associations at multiple spatial scales. © The Cooper Ornithological Society 2006.
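
    The decomposition itself is straightforward to sketch with ordinary least squares: fit the full model and each single-scale model, then read off unique and shared components from the R² values. The synthetic data and the two generic "scales" below are illustrative assumptions; the authors worked with deviance in their habitat models rather than plain OLS R².

```python
import numpy as np

def r2(X, y):
    """Coefficient of determination of an OLS fit (with intercept)."""
    A = np.column_stack([np.ones(len(y)), X])
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(1)
n = 500
common = rng.normal(size=n)                  # drives the cross-scale correlation
fine = common + 0.5 * rng.normal(size=n)     # fine-scale habitat variable
coarse = common + 0.5 * rng.normal(size=n)   # coarse-scale habitat variable
y = fine + coarse + rng.normal(size=n)       # response (e.g. habitat use)

r2_full = r2(np.column_stack([fine, coarse]), y)
unique_fine = r2_full - r2(coarse[:, None], y)    # variation only fine explains
unique_coarse = r2_full - r2(fine[:, None], y)    # variation only coarse explains
shared_part = r2_full - unique_fine - unique_coarse
print(unique_fine, unique_coarse, shared_part)
```

Even with only moderate correlation between the two scale variables, the shared component dominates here, mirroring the paper's finding that cross-scale correlations can account for a large share of the explained variation.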

  12. Dynamic phase-sensitive optical coherence elastography at a true kilohertz frame-rate

    NASA Astrophysics Data System (ADS)

    Singh, Manmohan; Wu, Chen; Liu, Chih-Hao; Li, Jiasong; Schill, Alexander; Nair, Achuth; Larin, Kirill V.

    2016-03-01

    Dynamic optical coherence elastography (OCE) techniques have rapidly emerged as a noninvasive way to characterize the biomechanical properties of tissue. However, clinical applications of the majority of these techniques have been unfeasible due to the extended acquisition time because of multiple temporal OCT acquisitions (M-B mode). Moreover, multiple excitations, large datasets, and prolonged laser exposure prohibit their translation to the clinic, where patient discomfort and safety are critical criteria. Here, we demonstrate the feasibility of noncontact true kilohertz frame-rate dynamic optical coherence elastography by directly imaging a focused air-pulse induced elastic wave with a home-built phase-sensitive OCE system. The OCE system was based on a 4X buffered Fourier Domain Mode Locked swept source laser with an A-scan rate of ~1.5 MHz, and imaged the elastic wave propagation at a frame rate of ~7.3 kHz. Because the elastic wave was directly imaged, only a single excitation was utilized for one line scan measurement. Rather than acquiring multiple temporal scans at successive spatial locations as with previous techniques, here, successive B-scans were acquired over the measurement region (B-M mode). Preliminary measurements were taken on tissue-mimicking agar phantoms of various concentrations, and the results showed good agreement with uniaxial mechanical compression testing. Then, the elasticity of an in situ porcine cornea in the whole eye-globe configuration at various intraocular pressures was measured. The results showed that this technique can acquire a depth-resolved elastogram in milliseconds. Furthermore, the ultra-fast acquisition ensured that the laser safety exposure limit for the cornea was not exceeded.

  13. A fast multi-resolution approach to tomographic PIV

    NASA Astrophysics Data System (ADS)

    Discetti, Stefano; Astarita, Tommaso

    2012-03-01

    Tomographic particle image velocimetry (Tomo-PIV) is a recently developed three-component, three-dimensional anemometric non-intrusive measurement technique, based on an optical tomographic reconstruction applied to simultaneously recorded images of the distribution of light intensity scattered by seeding particles immersed into the flow. Nowadays, the reconstruction process is carried out mainly by iterative algebraic reconstruction techniques, well suited to handle the problem of a limited number of views, but computationally intensive and memory demanding. The adoption of the multiplicative algebraic reconstruction technique (MART) has become increasingly accepted. In the present work, a novel multi-resolution approach is proposed, relying on the adoption of a coarser grid in the first step of the reconstruction to obtain a fast estimation of a reliable and accurate first guess. A performance assessment, carried out on three-dimensional computer-generated distributions of particles, shows a substantial acceleration of the reconstruction process for all the tested seeding densities with respect to the standard method based on 5 MART iterations; a relevant reduction in the memory storage is also achieved. Furthermore, a slight accuracy improvement is noticed. A modified version, improved by a multiplicative line of sight estimation of the first guess on the compressed configuration, is also tested, exhibiting a further remarkable decrease in both memory storage and computational effort, mostly at the lowest tested seeding densities, while retaining the same performances in terms of accuracy.
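    The MART update underlying this work can be sketched in a few lines. This is a basic single-resolution version with hypothetical names, not the authors' multi-resolution implementation:

```python
import numpy as np

def mart(W, p, n_iter=5, mu=1.0, eps=1e-12):
    """Multiplicative algebraic reconstruction technique (MART).
    W: (rays x voxels) weight matrix, p: measured projections.
    Voxel intensities stay non-negative by construction."""
    f = np.ones(W.shape[1])          # uniform positive first guess
    for _ in range(n_iter):
        for i in range(W.shape[0]):  # one multiplicative update per ray
            proj = W[i] @ f
            if proj > eps and p[i] > eps:
                f *= (p[i] / proj) ** (mu * W[i])
    return f
```

    Because every update is multiplicative, voxels initialized to zero remain zero, which is why the quality of the first guess — the target of the paper's multi-resolution strategy — matters so much.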

  14. High-Resolution Surface Reconstruction from Imagery for Close Range Cultural Heritage Applications

    NASA Astrophysics Data System (ADS)

    Wenzel, K.; Abdel-Wahab, M.; Cefalu, A.; Fritsch, D.

    2012-07-01

    The recording of high resolution point clouds with sub-mm resolution is a demanding and cost intensive task, especially with current equipment like handheld laser scanners. We present an image-based approach, in which techniques of image matching and dense surface reconstruction are combined with a compact and affordable rig of off-the-shelf industry cameras. Such cameras provide high spatial resolution with low radiometric noise, which enables a one-shot solution and thus an efficient data acquisition while satisfying high accuracy requirements. However, the largest drawback of image-based solutions is often the acquisition of surfaces with low texture where the image matching process might fail. Thus, an additional structured light projector is employed, represented here by the pseudo-random pattern projector of the Microsoft Kinect. Its strong infrared laser projects speckles of different sizes. By using dense image matching techniques on the acquired images, a 3D point can be derived for almost each pixel. The use of multiple cameras enables the acquisition of a high resolution point cloud with high accuracy for each shot. For the proposed system, up to 3.5 million 3D points with sub-mm accuracy can be derived per shot. The registration of multiple shots is performed by Structure and Motion reconstruction techniques, where feature points are used to derive the camera positions and rotations automatically without initial information.

  15. A simulation-based evaluation of methods for inferring linear barriers to gene flow

    Treesearch

    Christopher Blair; Dana E. Weigel; Matthew Balazik; Annika T. H. Keeley; Faith M. Walker; Erin Landguth; Sam Cushman; Melanie Murphy; Lisette Waits; Niko Balkenhol

    2012-01-01

    Different analytical techniques used on the same data set may lead to different conclusions about the existence and strength of genetic structure. Therefore, reliable interpretation of the results from different methods depends on the efficacy and reliability of different statistical methods. In this paper, we evaluated the performance of multiple analytical methods to...

  16. MURI: Adaptive Waveform Design for Full Spectral Dominance

    DTIC Science & Technology

    2011-03-11

    a three- dimensional urban tracking model, based on the nonlinear measurement model (that uses the urban multipath geometry with different types of ... the time evolution of the scattering function with a high dimensional dynamic system; a multiple particle filter technique is used to sequentially...integration of space -time coding with a fixed set of beams. It complements the

  17. A Multimodal Dialog System for Language Assessment: Current State and Future Directions. Research Report. ETS RR-17-21

    ERIC Educational Resources Information Center

    Suendermann-Oeft, David; Ramanarayanan, Vikram; Yu, Zhou; Qian, Yao; Evanini, Keelan; Lange, Patrick; Wang, Xinhao; Zechner, Klaus

    2017-01-01

    We present work in progress on a multimodal dialog system for English language assessment using a modular cloud-based architecture adhering to open industry standards. Among the modules being developed for the system, multiple modules heavily exploit machine learning techniques, including speech recognition, spoken language proficiency rating,…

  18. Multidomain, multirecord-type Datatrieve-11 databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horning, R.R.

    1983-01-01

    Data bases consisting of multiple domains and multirecord-type domains present special problems in design, loading, maintenance, and report generation. The logical association of records is a fundamental concern in all these problem areas. This paper describes techniques for dealing with this and other specifics using Datatrieve-11, Sort-11, FORTRAN 77, and the RSX-11M Indirect Command File Processor.

  19. New Software for Market Segmentation Analysis: A Chi-Square Interaction Detector. AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Lay, Robert S.

    The advantages and disadvantages of new software for market segmentation analysis are discussed, and the application of this new, chi-square based procedure (CHAID), is illustrated. A comparison is presented of an earlier, binary segmentation technique (THAID) and a multiple discriminant analysis. It is suggested that CHAID is superior to earlier…
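    The core of a chi-square interaction detector is choosing, at each node, the categorical predictor whose cross-tabulation with the outcome yields the largest chi-square statistic. A simplified sketch (CHAID additionally merges categories and applies significance corrections, which are omitted here; the function names are ours):

```python
import numpy as np

def chi2_stat(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    table = np.asarray(table, dtype=float)
    expected = np.outer(table.sum(1), table.sum(0)) / table.sum()
    return ((table - expected) ** 2 / expected).sum()

def best_chaid_split(segments, outcome):
    """Pick the categorical predictor most strongly associated with the
    outcome. segments: dict mapping predictor name -> category array."""
    best_name, best_chi2 = None, -1.0
    for name, column in segments.items():
        cats, rows = np.unique(column, return_inverse=True)
        outs, cols = np.unique(outcome, return_inverse=True)
        table = np.zeros((len(cats), len(outs)))
        np.add.at(table, (rows, cols), 1)   # build the cross-tabulation
        chi2 = chi2_stat(table)
        if chi2 > best_chi2:
            best_name, best_chi2 = name, chi2
    return best_name, best_chi2
```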

  20. Reducing Router Forwarding Table Size Using Aggregation and Caching

    ERIC Educational Resources Information Center

    Liu, Yaoqing

    2013-01-01

    The fast growth of global routing table size has been causing concerns that the Forwarding Information Base (FIB) will not be able to fit in existing routers' expensive line-card memory, and upgrades will lead to a higher cost for network operators and customers. FIB Aggregation, a technique that merges multiple FIB entries into one, is probably…
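    The basic FIB-aggregation move — replacing sibling prefixes that share a next hop with their covering parent — can be sketched with Python's ipaddress module. This is a simplified illustration, not the dissertation's algorithm, which must also handle non-sibling covers and incremental updates:

```python
import ipaddress

def aggregate_fib(fib):
    """Collapse sibling prefixes that share a next hop into their parent.
    fib: dict mapping prefix string -> next hop."""
    entries = {ipaddress.ip_network(p): nh for p, nh in fib.items()}
    changed = True
    while changed:
        changed = False
        # Visit most-specific prefixes first so merges can cascade upward.
        for net in sorted(entries, key=lambda n: -n.prefixlen):
            if net not in entries or net.prefixlen == 0:
                continue
            parent = net.supernet()
            siblings = list(parent.subnets())   # the two halves of the parent
            if all(s in entries and entries[s] == entries[net] for s in siblings):
                nh = entries[net]
                for s in siblings:
                    del entries[s]
                entries[parent] = nh
                changed = True
    return {str(n): nh for n, nh in entries.items()}
```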

  1. Monitoring and Management of a Sensitive Resource: A Landscape-level Approach with Amphibians

    DTIC Science & Technology

    2001-03-01

    Results show that each technique is effective for a portion of the amphibian community and that the use of multiple techniques is essential to any...combinations of species. These results show that multiple techniques are needed for a full assessment of amphibian populations and communities at...against which future assessments of amphibian populations and communities on each installation can be evaluated. The standardized techniques used in FY

  2. Multi-object model-based multi-atlas segmentation for rodent brains using dense discrete correspondences

    NASA Astrophysics Data System (ADS)

    Lee, Joohwi; Kim, Sun Hyung; Styner, Martin

    2016-03-01

    The delineation of rodent brain structures is challenging due to multiple low-contrast cortical and subcortical organs that closely interface with each other. Atlas-based segmentation has been widely employed due to its ability to delineate multiple organs at the same time via image registration. The use of multiple atlases and subsequent label fusion techniques has further improved the robustness and accuracy of atlas-based segmentation. However, the accuracy of atlas-based segmentation is still prone to registration errors; for example, the segmentation of in vivo MR images can be less accurate and robust against image artifacts than the segmentation of post mortem images. In order to improve the accuracy and robustness of atlas-based segmentation, we propose a multi-object, model-based, multi-atlas segmentation method. We first establish spatial correspondences across atlases using a set of dense pseudo-landmark particles. We build a multi-object point distribution model using those particles in order to capture inter- and intra-subject variation among brain structures. The segmentation is obtained by fitting the model to a subject image, followed by a label fusion process. Our results show that the proposed method achieved greater accuracy than comparable segmentation methods, including a widely used ANTs registration tool.
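    The simplest of the label fusion techniques mentioned, per-voxel majority voting across registered atlases, can be sketched as follows (an illustration only; the paper's fusion step operates on model-fitted segmentations):

```python
import numpy as np

def majority_vote_fusion(warped_labels):
    """Fuse candidate segmentations (one per atlas, already registered
    to the subject) by per-voxel majority vote."""
    stack = np.stack(warped_labels)          # (n_atlases, *image_shape)
    labels = np.unique(stack)
    # Count, for each label, how many atlases vote for it at each voxel.
    votes = np.stack([(stack == l).sum(axis=0) for l in labels])
    return labels[votes.argmax(axis=0)]
```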

  3. Effective in-service training design and delivery: evidence from an integrative literature review.

    PubMed

    Bluestone, Julia; Johnson, Peter; Fullerton, Judith; Carr, Catherine; Alderman, Jessica; BonTempo, James

    2013-10-01

    In-service training represents a significant financial investment for supporting continued competence of the health care workforce. An integrative review of the education and training literature was conducted to identify effective training approaches for health worker continuing professional education (CPE) and what evidence exists of outcomes derived from CPE. A literature review was conducted from multiple databases including PubMed, the Cochrane Library and Cumulative Index to Nursing and Allied Health Literature (CINAHL) between May and June 2011. The initial review of titles and abstracts produced 244 results. Articles selected for analysis after two quality reviews consisted of systematic reviews, randomized controlled trials (RCTs) and programme evaluations published in peer-reviewed journals from 2000 to 2011 in the English language. The articles analysed included 37 systematic reviews and 32 RCTs. The research questions focused on the evidence supporting educational techniques, frequency, setting and media used to deliver instruction for continuing health professional education. The evidence suggests the use of multiple techniques that allow for interaction and enable learners to process and apply information. Case-based learning, clinical simulations, practice and feedback are identified as effective educational techniques. Didactic techniques that involve passive instruction, such as reading or lecture, have been found to have little or no impact on learning outcomes. Repetitive interventions, rather than single interventions, were shown to be superior for learning outcomes. Settings similar to the workplace improved skill acquisition and performance. Computer-based learning can be equally or more effective than live instruction and more cost efficient if effective techniques are used. Effective techniques can lead to improvements in knowledge and skill outcomes and clinical practice behaviours, but there is less evidence directly linking CPE to improved clinical outcomes. Very limited quality data are available from low- to middle-income countries. Educational techniques are critical to learning outcomes. Targeted, repetitive interventions can result in better learning outcomes. Setting should be selected to support relevant and realistic practice and increase efficiency. Media should be selected based on the potential to support effective educational techniques and efficiency of instruction. CPE can lead to improved learning outcomes if effective techniques are used. Limited data indicate that there may also be an effect on improving clinical practice behaviours. The research agenda calls for well-constructed evaluations of culturally appropriate combinations of technique, setting, frequency and media, developed for and tested among all levels of health workers in low- and middle-income countries.

  4. Effective in-service training design and delivery: evidence from an integrative literature review

    PubMed Central

    2013-01-01

    Background In-service training represents a significant financial investment for supporting continued competence of the health care workforce. An integrative review of the education and training literature was conducted to identify effective training approaches for health worker continuing professional education (CPE) and what evidence exists of outcomes derived from CPE. Methods A literature review was conducted from multiple databases including PubMed, the Cochrane Library and Cumulative Index to Nursing and Allied Health Literature (CINAHL) between May and June 2011. The initial review of titles and abstracts produced 244 results. Articles selected for analysis after two quality reviews consisted of systematic reviews, randomized controlled trials (RCTs) and programme evaluations published in peer-reviewed journals from 2000 to 2011 in the English language. The articles analysed included 37 systematic reviews and 32 RCTs. The research questions focused on the evidence supporting educational techniques, frequency, setting and media used to deliver instruction for continuing health professional education. Results The evidence suggests the use of multiple techniques that allow for interaction and enable learners to process and apply information. Case-based learning, clinical simulations, practice and feedback are identified as effective educational techniques. Didactic techniques that involve passive instruction, such as reading or lecture, have been found to have little or no impact on learning outcomes. Repetitive interventions, rather than single interventions, were shown to be superior for learning outcomes. Settings similar to the workplace improved skill acquisition and performance. Computer-based learning can be equally or more effective than live instruction and more cost efficient if effective techniques are used. Effective techniques can lead to improvements in knowledge and skill outcomes and clinical practice behaviours, but there is less evidence directly linking CPE to improved clinical outcomes. Very limited quality data are available from low- to middle-income countries. Conclusions Educational techniques are critical to learning outcomes. Targeted, repetitive interventions can result in better learning outcomes. Setting should be selected to support relevant and realistic practice and increase efficiency. Media should be selected based on the potential to support effective educational techniques and efficiency of instruction. CPE can lead to improved learning outcomes if effective techniques are used. Limited data indicate that there may also be an effect on improving clinical practice behaviours. The research agenda calls for well-constructed evaluations of culturally appropriate combinations of technique, setting, frequency and media, developed for and tested among all levels of health workers in low- and middle-income countries. PMID:24083659

  5. Near-Field Source Localization by Using Focusing Technique

    NASA Astrophysics Data System (ADS)

    He, Hongyang; Wang, Yide; Saillard, Joseph

    2008-12-01

    We discuss two fast algorithms to localize multiple sources in the near field. The symmetry-based method proposed by Zhi and Chia (2007) is first improved by implementing a search-free procedure to reduce computation cost. We then present a focusing-based method which does not require a symmetric array configuration. By using a focusing technique, the near-field signal model is transformed into a model possessing the same structure as in the far-field situation, which allows bearing estimation with well-studied far-field methods. With the estimated bearing, the range estimate of each source is consequently obtained by using the 1D MUSIC method without parameter pairing. The performance of the improved symmetry-based method and the proposed focusing-based method is compared by Monte Carlo simulations and against the Cramér-Rao bound. Unlike other near-field algorithms, these two approaches require neither high computational cost nor high-order statistics.
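    After focusing, bearings are estimated with standard far-field machinery such as MUSIC. A minimal narrowband far-field MUSIC sketch for a uniform linear array (parameter names are ours; the focusing transform and range-estimation steps are omitted):

```python
import numpy as np

def music_spectrum(X, n_sources, angles_deg, d=0.5):
    """Far-field MUSIC pseudo-spectrum for a uniform linear array.
    X: (n_sensors, n_snapshots) complex data; d: spacing in wavelengths."""
    n = X.shape[0]
    R = X @ X.conj().T / X.shape[1]          # sample covariance
    _, vecs = np.linalg.eigh(R)              # eigenvalues in ascending order
    En = vecs[:, : n - n_sources]            # noise subspace
    k = np.arange(n)
    spec = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(-2j * np.pi * d * k * np.sin(th))   # steering vector
        spec.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(spec)
```

    Peaks of the pseudo-spectrum occur where the steering vector is nearly orthogonal to the noise subspace, i.e. at the source bearings.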

  6. Atmospheric turbulence compensation in orbital angular momentum communications: Advances and perspectives

    NASA Astrophysics Data System (ADS)

    Li, Shuhui; Chen, Shi; Gao, Chunqing; Willner, Alan E.; Wang, Jian

    2018-02-01

    Orbital angular momentum (OAM)-carrying beams have recently generated considerable interest due to their potential use in communication systems to increase transmission capacity and spectral efficiency. For OAM-based free-space optical (FSO) links, a critical challenge is atmospheric turbulence, which distorts the helical wavefronts of OAM beams, decreasing received power, introducing crosstalk between multiple channels, and impairing link performance. In this paper, we review recent advances in turbulence effects compensation techniques for OAM-based FSO communication links. First, basic concepts of atmospheric turbulence and its theoretical model are introduced. Second, atmospheric turbulence effects on OAM beams are theoretically and experimentally investigated and discussed. Then, several typical turbulence compensation approaches, including both adaptive optics-based (optical domain) and signal processing-based (electrical domain) techniques, are presented. Finally, key challenges and perspectives of compensation of turbulence-distorted OAM links are discussed.

  7. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors.

    PubMed

    Kawahito, Shoji; Seo, Min-Woong

    2016-11-06

    This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels based on the analysis of noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis and the effect of noise reduction as a function of the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e−rms) when compared with the CMS gain of two (2.4 e−rms) or 16 (1.1 e−rms).
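    The benefit of correlated multiple sampling comes from averaging m uncorrelated read-noise samples in both the reset and signal phases, so the read-noise contribution on the difference falls roughly as sqrt(2/m). A simplified behavioral model (not the paper's circuit-level analysis; names are ours):

```python
import numpy as np

def cms_readout(pixel_signal, read_noise, m, rng):
    """Correlated multiple sampling: average m reset samples and m signal
    samples, then take the difference. With uncorrelated read noise the
    result has standard deviation read_noise * sqrt(2 / m)."""
    reset = rng.normal(0.0, read_noise, m).mean()
    signal = rng.normal(pixel_signal, read_noise, m).mean()
    return signal - reset
```

    This idealized 1/sqrt(m) scaling is the upper bound; the paper analyzes where real readout chains depart from it at deep sub-electron levels.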

  8. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors

    PubMed Central

    Kawahito, Shoji; Seo, Min-Woong

    2016-01-01

    This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels based on the analysis of noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis and the effect of noise reduction as a function of the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e−rms) when compared with the CMS gain of two (2.4 e−rms) or 16 (1.1 e−rms). PMID:27827972

  9. Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John N.

    1997-01-01

    A multidisciplinary design optimization procedure which couples formal multiobjective techniques and complex analysis procedures (such as computational fluid dynamics (CFD) codes) has been developed. The procedure has been demonstrated on a specific high speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained multiple-objective-function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are applied to each objective function during the transformation process. This enhanced procedure provides the designer with the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a CFD code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique has been used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
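    The K-S function referenced above folds multiple objectives (or constraints) into one smooth, conservative envelope of their maximum, which an unconstrained optimizer such as BFGS can then minimize. A minimal sketch (the report's enhanced weighted formulation is more elaborate; names are ours):

```python
import numpy as np

def ks_aggregate(values, rho=50.0, weights=None):
    """Kreisselmeier-Steinhauser envelope: a smooth, conservative
    approximation of max(values). Larger rho tightens the envelope;
    optional weights let the designer emphasize individual objectives."""
    v = np.asarray(values, dtype=float)
    if weights is not None:
        v = np.asarray(weights, dtype=float) * v
    vmax = v.max()                   # shift for numerical stability
    return vmax + np.log(np.exp(rho * (v - vmax)).sum()) / rho
```

    Since the envelope is differentiable everywhere, gradient-based methods avoid the nonsmoothness of a raw max over objectives.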

  10. Control of free-flying space robot manipulator systems

    NASA Technical Reports Server (NTRS)

    Cannon, Robert H., Jr.

    1989-01-01

    Control techniques for self-contained, autonomous free-flying space robots are being tested and developed. Free-flying space robots are envisioned as a key element of any successful long-term presence in space. These robots must be capable of performing the assembly, maintenance, inspection, and repair tasks that currently require astronaut extra-vehicular activity (EVA). Use of robots will provide economic savings as well as improved astronaut safety by reducing, and in many cases eliminating, the need for human EVA. The focus of the work is to develop and carry out a set of research projects using laboratory models of satellite robots. These devices use air-cushion-vehicle (ACV) technology to simulate in two dimensions the drag-free, zero-g conditions of space. Current work is divided into six major projects or research areas. Fixed-base cooperative manipulation work represents our initial entry into multiple arm cooperation and high-level control with a sophisticated user interface. The floating-base cooperative manipulation project strives to transfer some of the technologies developed in the fixed-base work onto a floating base. The global control and navigation experiment seeks to demonstrate simultaneous control of the robot manipulators and the robot base position so that tasks can be accomplished while the base is undergoing a controlled motion. The multiple-vehicle cooperation project's goal is to demonstrate multiple free-floating robots working in teams to carry out tasks too difficult or complex for a single robot to perform. The Location Enhancement Arm Push-off (LEAP) activity's goal is to provide a viable alternative to expendable gas thrusters for vehicle propulsion wherein the robot uses its manipulators to throw itself from place to place. Because the successful execution of the LEAP technique requires an accurate model of the robot and payload mass properties, it was deemed an attractive testbed for adaptive control technology.

  11. The Economy in Autologous Tissue Transfer: Part 1. The Kiss Flap Technique.

    PubMed

    Zhang, Yi Xin; Hayakawa, Thomas J; Levin, L Scott; Hallock, Geoffrey G; Lazzeri, Davide

    2016-03-01

    All reconstructive microsurgeons realize the need to improve aesthetic and functional donor-site outcomes. A "kiss" flap design concept was developed to increase the surface area of skin flap coverage while minimizing donor-site morbidity. The main goal of the kiss flap technique is to harvest multiple skin paddles that are smaller than those raised with traditional techniques, to minimize donor-site morbidity. These smaller flap components are then sutured to each other, or said to kiss each other side-by-side, to create a large, wide flap. The skin paddles in the kiss technique can be linked to one another by a variety of native intrinsic vascular connections, by additional microanastomosis, or both. This technique can be widely applied to both free and pedicle flaps, and essentially allows for the reconstruction of a large defect while providing the easy primary closure of a smaller donor-site defect. According to their origin of blood supply, kiss flaps are classified into three styles and five types. All of the different types of kiss flaps are unique in both flap design and harvest technique. Most kiss flaps are based on common flaps already familiar to the reconstructive surgeon. The basis of the kiss flap design concept is to convert multiple narrow flaps into a single unified flap of the desired greater width. This maximizes the size of the resulting flap and minimizes donor-site morbidity, as a direct linear closure is usually possible. Therapeutic, V.

  12. Self-calibrated multiple-echo acquisition with radial trajectories using the conjugate gradient method (SMART-CG).

    PubMed

    Jung, Youngkyoo; Samsonov, Alexey A; Bydder, Mark; Block, Walter F

    2011-04-01

    To remove phase inconsistencies between multiple echoes, an algorithm using a radial acquisition to provide inherent phase and magnitude information for self-correction was developed. The information also allows simultaneous support for parallel imaging for multiple coil acquisitions. Without a separate field map acquisition, a phase estimate from each echo in a multiple-echo train was generated. When using a multiple channel coil, magnitude and phase estimates from each echo provide in vivo coil sensitivities. An algorithm based on the conjugate gradient method uses these estimates to simultaneously remove phase inconsistencies between echoes, and in the case of multiple coil acquisition, simultaneously provides parallel imaging benefits. The algorithm is demonstrated on single channel, multiple channel, and undersampled data. Substantial image quality improvements were demonstrated. Signal dropouts were completely removed and undersampling artifacts were well suppressed. The suggested algorithm is able to remove phase cancellation and undersampling artifacts simultaneously and to improve image quality of multiecho radial imaging, an important technique for fast three-dimensional MRI data acquisition. Copyright © 2011 Wiley-Liss, Inc.
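    The conjugate gradient method at the heart of the algorithm iteratively solves a symmetric positive-definite linear system. A generic sketch (the paper applies this machinery to the joint phase-correction and parallel-imaging problem):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A. The same
    machinery underlies least-squares image reconstruction via the
    normal equations."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual
    p = r.copy()             # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # conjugate update of the direction
        rs = rs_new
    return x
```

    In exact arithmetic CG converges in at most as many iterations as there are unknowns, which is why it suits the large systems arising in 3D reconstruction.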

  13. Self-calibrated Multiple-echo Acquisition with Radial Trajectories using the Conjugate Gradient Method (SMART-CG)

    PubMed Central

    Jung, Youngkyoo; Samsonov, Alexey A; Bydder, Mark; Block, Walter F.

    2011-01-01

    Purpose To remove phase inconsistencies between multiple echoes, an algorithm using a radial acquisition to provide inherent phase and magnitude information for self-correction was developed. The information also allows simultaneous support for parallel imaging for multiple coil acquisitions. Materials and Methods Without a separate field map acquisition, a phase estimate from each echo in a multiple-echo train was generated. When using a multiple channel coil, magnitude and phase estimates from each echo provide in-vivo coil sensitivities. An algorithm based on the conjugate gradient method uses these estimates to simultaneously remove phase inconsistencies between echoes, and in the case of multiple coil acquisition, simultaneously provides parallel imaging benefits. The algorithm is demonstrated on single channel, multiple channel, and undersampled data. Results Substantial image quality improvements were demonstrated. Signal dropouts were completely removed and undersampling artifacts were well suppressed. Conclusion The suggested algorithm is able to remove phase cancellation and undersampling artifacts simultaneously and to improve image quality of multiecho radial imaging, an important technique for fast 3D MRI data acquisition. PMID:21448967

  14. Integration of Network Topological and Connectivity Properties for Neuroimaging Classification

    PubMed Central

    Jie, Biao; Gao, Wei; Wang, Qian; Wee, Chong-Yaw

    2014-01-01

    Rapid advances in neuroimaging techniques have provided an efficient and noninvasive way for exploring the structural and functional connectivity of the human brain. Quantitative measurement of abnormality of brain connectivity in patients with neurodegenerative diseases, such as mild cognitive impairment (MCI) and Alzheimer’s disease (AD), have also been widely reported, especially at a group level. Recently, machine learning techniques have been applied to the study of AD and MCI, i.e., to identify the individuals with AD/MCI from the healthy controls (HCs). However, most existing methods focus on using only a single property of a connectivity network, although multiple network properties, such as local connectivity and global topological properties, can potentially be used. In this paper, by employing multikernel based approach, we propose a novel connectivity based framework to integrate multiple properties of connectivity network for improving the classification performance. Specifically, two different types of kernels (i.e., vector-based kernel and graph kernel) are used to quantify two different yet complementary properties of the network, i.e., local connectivity and global topological properties. Then, multikernel learning (MKL) technique is adopted to fuse these heterogeneous kernels for neuroimaging classification. We test the performance of our proposed method on two different data sets. First, we test it on the functional connectivity networks of 12 MCI and 25 HC subjects. The results show that our method achieves significant performance improvement over those using only one type of network property. Specifically, our method achieves a classification accuracy of 91.9%, which is 10.8% better than those by single network-property-based methods. Then, we test our method for gender classification on a large set of functional connectivity networks with 133 infants scanned at birth, 1 year, and 2 years, also demonstrating very promising results. 
PMID:24108708
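    The fusion step described above, combining a vector-based kernel on local connectivity features with a graph kernel on topology, can be sketched as a convex combination of precomputed kernel matrices. The feature dimensions, RBF kernel choice, and fixed weights below are illustrative assumptions, not the paper's actual settings:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Vector-based kernel over per-subject feature vectors (e.g. local
    connectivity measures of each node in a subject's network)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def fuse_kernels(kernels, betas):
    """Multikernel fusion: convex combination of heterogeneous kernels."""
    betas = np.asarray(betas, dtype=float)
    betas = betas / betas.sum()          # enforce sum-to-one weights
    return sum(b * K for b, K in zip(betas, kernels))

# toy data: 4 subjects, 5-dimensional local-connectivity feature vectors
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5))
K_vec = rbf_kernel(X)                                # local connectivity
K_graph = rbf_kernel(rng.standard_normal((4, 3)))    # stand-in for a graph kernel
K_fused = fuse_kernels([K_vec, K_graph], betas=[0.6, 0.4])
```

    The fused matrix can then be handed to any classifier that accepts precomputed kernels; in actual MKL the weights are learned jointly with the classifier rather than fixed as here.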

  15. Creating ensembles of decision trees through sampling

    DOEpatents

    Kamath, Chandrika; Cantu-Paz, Erick

    2005-08-30

    A system for decision tree ensembles that includes a module to read the data, a module to sort the data, a module to evaluate a potential split of the data according to some criterion using a random sample of the data, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method is based on statistical sampling techniques and includes the steps of reading the data; sorting the data; evaluating a potential split according to some criterion using a random sample of the data; splitting the data; and combining multiple decision trees in ensembles.
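    The core idea above, evaluating a candidate split on a random sample of the data rather than on all of it, can be sketched as follows. The Gini criterion and the sampling fraction are illustrative choices, not taken from the patent:

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p**2)

def best_split_sampled(x, y, sample_frac=0.5, rng=None):
    """Pick the best threshold on feature x by evaluating candidate splits
    on a random sample of the data, as in the sampling-based method above."""
    rng = rng or np.random.default_rng(0)
    n = len(x)
    idx = rng.choice(n, size=max(2, int(sample_frac * n)), replace=False)
    xs, ys = x[idx], y[idx]
    order = np.argsort(xs)               # the "sort the data" step
    xs, ys = xs[order], ys[order]
    best_t, best_score = None, np.inf
    for i in range(1, len(xs)):
        t = 0.5 * (xs[i - 1] + xs[i])
        left, right = ys[:i], ys[i:]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score
```

    An ensemble is then built by repeating this for each node of each tree, with a fresh random sample per evaluation.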

  16. Spontaneous abortion in multiple pregnancy: focus on fetal pathology.

    PubMed

    Joó, József Gábor; Csaba, Ákos; Szigeti, Zsanett; Rigó, János

    2012-08-15

    Multiple pregnancy, with its wide array of medical consequences, is an important high-risk condition in pregnancy. We performed perinatal autopsy in 49 cases of spontaneous abortion resulting from multiple pregnancies during the study period. Twenty-seven of the 44 twin pregnancies ending in miscarriage were conceived naturally, whereas 17 were conceived through assisted reproductive techniques. Each of the 5 triplet pregnancies ending in miscarriage was conceived through assisted reproductive techniques. There was a positive history of miscarriage in 22.4% of the cases. Monochorial placentation occurred more commonly in multiple pregnancies ending in miscarriage than in multiple pregnancies without miscarriage. A fetal congenital malformation was found in 8 cases. Three of these cases were conceived through assisted reproductive techniques, and 5 were conceived naturally. Miscarriage was due to intrauterine infection in 36% of the cases. Our study confirms that spontaneous abortion is more common in multiple than in singleton pregnancies. Monochorial placentation predicted higher fetal morbidity and mortality. Miscarriage was more common in pregnancies where all fetuses were male than in those where all fetuses were female. Assisted reproductive techniques do not predispose to the development of fetal malformations. Copyright © 2012 Elsevier GmbH. All rights reserved.

  17. Laser 3D micro-manufacturing

    NASA Astrophysics Data System (ADS)

    Piqué, Alberto; Auyeung, Raymond C. Y.; Kim, Heungsoo; Charipar, Nicholas A.; Mathews, Scott A.

    2016-06-01

    Laser-based materials processing techniques are gaining widespread use in micro-manufacturing applications. The use of laser microfabrication techniques enables the processing of micro- and nanostructures from a wide range of materials and geometries without the need for masking and etching steps commonly associated with photolithography. This review aims to describe the broad applications space covered by laser-based micro- and nanoprocessing techniques and the benefits offered by the use of lasers in micro-manufacturing processes. Given their non-lithographic nature, these processes are also referred to as laser direct-write and constitute some of the earliest demonstrations of 3D printing or additive manufacturing at the microscale. As this review will show, the use of lasers enables precise control of the various types of processing steps—from subtractive to additive—over a wide range of scales with an extensive materials palette. Overall, laser-based direct-write techniques offer multiple modes of operation including the removal (via ablative processes) and addition (via photopolymerization or printing) of most classes of materials using the same equipment in many cases. The versatility provided by these multi-function, multi-material and multi-scale laser micro-manufacturing processes cannot be matched by photolithography or by other direct-write microfabrication techniques, and it offers unique opportunities for current and future 3D micro-manufacturing applications.

  18. Incomplete data based parameter identification of nonlinear and time-variant oscillators with fractional derivative elements

    NASA Astrophysics Data System (ADS)

    Kougioumtzoglou, Ioannis A.; dos Santos, Ketson R. M.; Comerford, Liam

    2017-09-01

    Various system identification techniques exist in the literature that can handle non-stationary measured time-histories, or cases of incomplete data, or address systems following a fractional calculus modeling. However, there are not many (if any) techniques that can address all three aforementioned challenges simultaneously in a consistent manner. In this paper, a novel multiple-input/single-output (MISO) system identification technique is developed for parameter identification of nonlinear and time-variant oscillators with fractional derivative terms subject to incomplete non-stationary data. The technique utilizes a representation of the nonlinear restoring forces as a set of parallel linear sub-systems. In this regard, the oscillator is transformed into an equivalent MISO system in the wavelet domain. Next, a recently developed L1-norm minimization procedure based on compressive sensing theory is applied for determining the wavelet coefficients of the available incomplete non-stationary input-output (excitation-response) data. Finally, these wavelet coefficients are utilized to determine appropriately defined time- and frequency-dependent wavelet based frequency response functions and related oscillator parameters. Several linear and nonlinear time-variant systems with fractional derivative elements are used as numerical examples to demonstrate the reliability of the technique even in cases of noise corrupted and incomplete data.

  19. Fabry-Perot Based Radiometers for Precise Measurement of Greenhouse Gases

    NASA Technical Reports Server (NTRS)

    Heaps, William S.; Wilson, Emily L.; Georgieva, Elena

    2007-01-01

    Differential radiometers based upon the Fabry-Perot interferometer have been developed and demonstrated that exhibit very great sensitivity to changes in the atmospheric column of carbon dioxide, oxygen, and water vapor. These instruments employ a solid Fabry-Perot etalon that is tuned to the proper wavelength by changing the temperature. By choosing the thickness of the etalon, its multiple pass bands can be made to align with regularly spaced absorption features of the molecule under investigation. Use of multiple absorption features improves the optical throughput of the instrument and improves the stability of the instrument response with respect to environmental changes. Efforts are underway at Goddard to extend this technique to the carbon-13 isotope of carbon dioxide and to methane. These instruments are intrinsically rugged and can be made rather small and inexpensive. They therefore hold promise for widespread use in ground based networks for calibration of satellite instruments such as OCO and GOSAT. Results will be presented for ground based and airborne operations for these systems. The effects of atmospheric scattering, pointing errors, pressure broadening and temperature effects will be discussed with regard to achieving the precision better than 0.5% required for validation of the carbon dioxide column measured from space. Designs permitting the extension of the technique to an even larger number of atmospheric species will be discussed along with theoretical analysis of potential system performance.
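    The etalon-thickness choice described above follows from the free spectral range of a solid etalon, FSR = 1/(2nL) in wavenumber units, so the thickness that matches a (regularly spaced) set of absorption lines is L = 1/(2n·Δν̃). A minimal sketch, with refractive index and line spacing chosen purely for illustration, not taken from the instrument:

```python
def etalon_thickness_cm(line_spacing_wavenumber, refractive_index):
    """Thickness L (cm) of a solid etalon whose free spectral range
    FSR = 1/(2 n L) (cm^-1) equals the target absorption-line spacing."""
    return 1.0 / (2.0 * refractive_index * line_spacing_wavenumber)

# illustrative values: n = 1.5, line spacing 1.6 cm^-1
L = etalon_thickness_cm(1.6, 1.5)
fsr = 1.0 / (2.0 * 1.5 * L)   # round-trip check: recovers the line spacing
```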

  20. Classification techniques on computerized systems to predict and/or to detect Apnea: A systematic review.

    PubMed

    Pombo, Nuno; Garcia, Nuno; Bousson, Kouamana

    2017-03-01

    Sleep apnea syndrome (SAS), which can significantly decrease quality of life, is associated with major health risks such as increased cardiovascular disease, sudden death, depression, irritability, hypertension, and learning difficulties. Thus, it is relevant and timely to present a systematic review describing significant applications in the framework of computational intelligence-based SAS, including its performance, beneficial and challenging effects, and modeling for decision-making on multiple scenarios. This study aims to systematically review the literature on systems for the detection and/or prediction of apnea events using a classification model. The forty-five included studies revealed a combination of classification techniques for the diagnosis of apnea: threshold-based models (14.75%) and machine learning (ML) models (85.25%). The ML models, clustered in a mind map, include neural networks (44.26%), regression (4.91%), instance-based methods (11.47%), Bayesian algorithms (1.63%), reinforcement learning (4.91%), dimensionality reduction (8.19%), ensemble learning (6.55%), and decision trees (3.27%). A classification model should be auto-adaptive and free of dependency on external human action. In addition, the accuracy of a classification model is related to effective feature selection. New high-quality studies based on randomized controlled trials and on validation of models using large and multiple samples of data are recommended. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  1. Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Don J.

    2010-01-01

    Signal- and image-processing methods are commonly needed to extract information from waveforms, improve the resolution of an image, and highlight defects in it. Since some similarity exists for all waveform-based nondestructive evaluation (NDE) methods, it would seem that a common software platform containing multiple signal- and image-processing techniques to process the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image series data. This software offers the user hundreds of basic and advanced signal- and image-processing capabilities including esoteric 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for signal- and image-processing capability so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility has been included to allow fitting of spectroscopy data such as from Raman spectroscopy. An extensive joint time-frequency module is included for analysis of non-stationary or transient data such as that from acoustic emission, vibration, or earthquakes.
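    As a flavor of the 1D wavelet-based de-noising mentioned above, here is a minimal one-level Haar soft-threshold sketch; the product's actual algorithms are more sophisticated, and the threshold value is an illustrative assumption:

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold de-noising (minimal sketch)."""
    x = np.asarray(signal, dtype=float)
    assert len(x) % 2 == 0, "even length required for one Haar level"
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass coefficients
    # soft-threshold the detail coefficients, where broadband noise concentrates
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    y = np.empty_like(x)
    y[0::2] = (approx + detail) / np.sqrt(2)    # inverse Haar transform
    y[1::2] = (approx - detail) / np.sqrt(2)
    return y
```

    With threshold 0 the transform round-trips exactly; increasing the threshold suppresses progressively larger high-frequency fluctuations.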

  2. Current state of the art of vision based SLAM

    NASA Astrophysics Data System (ADS)

    Muhammad, Naveed; Fofi, David; Ainouz, Samia

    2009-02-01

    The ability of a robot to localise itself and simultaneously build a map of its environment (Simultaneous Localisation and Mapping or SLAM) is a fundamental characteristic required for autonomous operation of the robot. Vision sensors are very attractive for application in SLAM because of their rich sensory output and cost effectiveness. Different issues are involved in the problem of vision based SLAM and many different approaches exist in order to solve these issues. This paper gives a classification of state-of-the-art vision based SLAM techniques in terms of (i) imaging systems used for performing SLAM which include single cameras, stereo pairs, multiple camera rigs and catadioptric sensors, (ii) features extracted from the environment in order to perform SLAM which include point features and line/edge features, (iii) initialisation of landmarks which can either be delayed or undelayed, (iv) SLAM techniques used which include Extended Kalman Filtering, Particle Filtering, biologically inspired techniques like RatSLAM, and other techniques like Local Bundle Adjustment, and (v) use of wheel odometry information. The paper also presents the implementation and analysis of stereo pair based EKF SLAM for synthetic data. Results show that the technique works successfully in the presence of considerable sensor noise. We believe that the state of the art presented in this paper can serve as a basis for future research in the area of vision based SLAM, permitting further work in the area to be carried out in an efficient and application-specific way.
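    The EKF SLAM backbone evaluated in the paper reduces to the standard Kalman predict/update recursion. A minimal linear(ised) sketch with toy one-dimensional matrices, rather than a real stereo measurement model:

```python
import numpy as np

def ekf_predict(mu, Sigma, F, u, Q):
    """Prediction: propagate the state mean and covariance through the
    (linearised) motion model, inflating uncertainty by process noise Q."""
    mu_pred = F @ mu + u
    Sigma_pred = F @ Sigma @ F.T + Q
    return mu_pred, Sigma_pred

def ekf_update(mu, Sigma, z, H, R):
    """Update: correct the prediction with an observation z (e.g. a
    landmark measurement), using the linearised measurement model H."""
    S = H @ Sigma @ H.T + R               # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)    # Kalman gain
    mu_new = mu + K @ (z - H @ mu)
    Sigma_new = (np.eye(len(mu)) - K @ H) @ Sigma
    return mu_new, Sigma_new
```

    In full EKF SLAM the state vector stacks the robot pose and all landmark positions, so the same two steps run over a growing joint state.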

  3. Study of synthesis techniques for insensitive aircraft control systems

    NASA Technical Reports Server (NTRS)

    Harvey, C. A.; Pope, R. E.

    1977-01-01

    Insensitive flight control system design criteria were defined in terms of maximizing performance (handling qualities, RMS gust response, transient response, stability margins) over a defined parameter range. Wing load alleviation for the C-5A was chosen as a design problem. The C-5A model was a 79-state, two-control structure with uncertainties assumed to exist in dynamic pressure, structural damping and frequency, and the stability derivative, M sub w. Five new techniques (mismatch estimation, uncertainty weighting, finite dimensional inverse, maximum difficulty, dual Lyapunov) were developed. Six existing techniques (additive noise, minimax, multiplant, sensitivity vector augmentation, state dependent noise, residualization) and the mismatch estimation and uncertainty weighting techniques were synthesized and evaluated on the design example. Evaluation and comparison of these eight techniques indicated that the minimax and uncertainty weighting techniques were superior to the other six, and of these two, uncertainty weighting has lower computational requirements. Techniques based on the three remaining new concepts appear promising and are recommended for further research.

  4. Compromise-based Robust Prioritization of Climate Change Adaptation Strategies for Watershed Management

    NASA Astrophysics Data System (ADS)

    Kim, Y.; Chung, E. S.

    2014-12-01

    This study suggests a robust prioritization framework for climate change adaptation strategies under multiple climate change scenarios with a case study of selecting sites for reusing treated wastewater (TWW) in a Korean urban watershed. The framework utilizes various multi-criteria decision making techniques, including the VIKOR method and Shannon entropy-based weights. In this case study, the sustainability of TWW use is quantified with indicator-based approaches within the DPSIR framework, which considers both hydro-environmental and socio-economic aspects of watershed management. Under the various climate change scenarios, the hydro-environmental responses to reusing TWW in potential alternative sub-watersheds are determined using the Hydrologic Simulation Program in Fortran (HSPF). The socio-economic indicators are obtained from statistical databases. Sustainability scores for multiple scenarios are estimated individually and then integrated with the proposed approach. Finally, the suggested framework allows us to prioritize adaptation strategies in a robust manner with varying levels of compromise between utility-based and regret-based strategies.
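    The Shannon entropy-based weights mentioned above can be computed directly from the decision matrix. A minimal sketch, assuming strictly positive indicator values; the actual DPSIR indicators and the VIKOR aggregation step are omitted:

```python
import numpy as np

def entropy_weights(decision_matrix):
    """Shannon entropy-based objective weights for MCDM criteria.
    Rows = alternatives (e.g. candidate sub-watersheds), columns = criteria
    (e.g. sustainability indicators); entries must be positive."""
    X = np.asarray(decision_matrix, dtype=float)
    P = X / X.sum(axis=0)                       # normalise each criterion
    k = 1.0 / np.log(X.shape[0])
    entropy = -k * np.sum(P * np.log(P), axis=0)
    d = 1.0 - entropy                           # degree of diversification
    return d / d.sum()                          # weights sum to one
```

    A criterion on which all alternatives score identically carries no information (entropy 1) and receives zero weight, which is exactly why entropy weighting is used as an objective complement to expert-assigned weights.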

  5. Designing a Successful Bidding Strategy Using Fuzzy Sets and Agent Attitudes

    NASA Astrophysics Data System (ADS)

    Ma, Jun; Goyal, Madhu Lata

    To be successful in a multi-attribute auction, agents must be capable of adapting to continuously changing bidding prices. This chapter presents a novel fuzzy attitude-based bidding strategy (FA-Bid), which employs a dual assessment technique: assessment of multiple attributes of the goods, as well as assessment of agents' attitude (eagerness) to procure an item in an automated auction. The assessment of attributes adopts fuzzy set techniques to handle the uncertainty of the bidding process, and heuristic rules are used to determine the attitude of bidding agents in simulated auctions to procure goods. The overall assessment is used to determine a price range based on the current bid, from which the best value is finally selected as the new bid.

  6. Multiple fault separation and detection by joint subspace learning for the health assessment of wind turbine gearboxes

    NASA Astrophysics Data System (ADS)

    Du, Zhaohui; Chen, Xuefeng; Zhang, Han; Zi, Yanyang; Yan, Ruqiang

    2017-09-01

    The gearbox of a wind turbine (WT) has dominant failure rates and the highest downtime loss among all WT subsystems. Thus, gearbox health assessment for maintenance cost reduction is of paramount importance. The concurrence of multiple faults in gearbox components is a common phenomenon due to fault induction mechanisms. This problem should be considered before planning to replace the components of the WT gearbox. Therefore, the key fault patterns should be reliably identified from noisy observation data for the development of an effective maintenance strategy. However, most of the existing studies focusing on multiple fault diagnosis always suffer from inappropriate division of fault information in order to satisfy various rigorous decomposition principles or statistical assumptions, such as the smooth envelope principle of ensemble empirical mode decomposition and the mutual independence assumption of independent component analysis. Thus, this paper presents a joint subspace learning-based multiple fault detection (JSL-MFD) technique to construct different subspaces adaptively for different fault patterns. Its main advantage is its capability to learn multiple fault subspaces directly from the observation signal itself. It can also sparsely concentrate the feature information into a few dominant subspace coefficients. Furthermore, it can eliminate noise by simply performing coefficient shrinkage operations. Consequently, multiple fault patterns are reliably identified by utilizing the maximum fault information criterion. The superiority of JSL-MFD in multiple fault separation and detection is comprehensively investigated and verified by the analysis of a data set of a 750 kW WT gearbox. Results show that JSL-MFD is superior to a state-of-the-art technique in detecting hidden fault patterns and enhancing detection accuracy.

  7. Effectiveness of applying progressive muscle relaxation technique on quality of life of patients with multiple sclerosis.

    PubMed

    Ghafari, Somayeh; Ahmadi, Fazlolah; Nabavi, Masoud; Anoshirvan, Kazemnejad; Memarian, Robabe; Rafatbakhsh, Mohamad

    2009-08-01

    To identify the effects of applying the Progressive Muscle Relaxation Technique on quality of life of patients with multiple sclerosis. In view of the growing caring options in multiple sclerosis, improvement of quality of life has become increasingly relevant as a caring intervention. Complementary therapies are widely used by multiple sclerosis patients, and the Progressive Muscle Relaxation Technique (PMRT) is one form of complementary therapy. Quasi-experimental study. Multiple sclerosis patients (n = 66) were selected by non-probability sampling and then assigned to experimental and control groups (33 patients in each group). Means of data collection included: Individual Information Questionnaire, SF-8 Health Survey, and a self-reported checklist. PMRT was performed for 63 sessions by the experimental group over two months, while no intervention was given to the control group. Statistical analysis was done with SPSS software. Student's t-test showed no significant difference between the two groups in mean scores of health-related quality of life before the study, but it showed a significant difference between the two groups one and two months after the intervention (p < 0.05). An ANOVA test with repeated measurements showed a significant difference in the mean scores of overall health-related quality of life and its dimensions between the two groups across the three time points (p < 0.05). Although this study provides modest support for the effectiveness of the Progressive Muscle Relaxation Technique on quality of life of multiple sclerosis patients, further research is required to determine better methods to promote quality of life of patients suffering from multiple sclerosis and other chronic diseases. The Progressive Muscle Relaxation Technique is practically feasible and is associated with an increase in the quality of life of multiple sclerosis patients, so health professionals need to update their knowledge about complementary therapies.

  8. 4D Hyperspherical Harmonic (HyperSPHARM) Representation of Surface Anatomy: A Holistic Treatment of Multiple Disconnected Anatomical Structures

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Koay, Cheng Guan; Schaefer, Stacey M.; van Reekum, Carien M.; Schmitz, Lara Peschke; Sutterer, Matt; Alexander, Andrew L.; Davidson, Richard J.

    2015-01-01

    Image-based parcellation of the brain often leads to multiple disconnected anatomical structures, which pose significant challenges for analyses of morphological shapes. Existing shape models, such as the widely used spherical harmonic (SPHARM) representation, assume topological invariance, so they are unable to simultaneously parameterize multiple disjoint structures. In such a situation, SPHARM has to be applied separately to each individual structure. We present a novel surface parameterization technique using 4D hyperspherical harmonics to represent multiple disjoint objects as a single analytic function, terming it HyperSPHARM. The underlying idea behind HyperSPHARM is to stereographically project an entire collection of disjoint 3D objects onto the 4D hypersphere and subsequently parameterize them simultaneously with the 4D hyperspherical harmonics. Hence, HyperSPHARM allows for a holistic treatment of multiple disjoint objects, unlike SPHARM. In an imaging dataset of healthy adult human brains, we apply HyperSPHARM to the hippocampi and amygdalae. The HyperSPHARM representations are employed as a data smoothing technique, while the HyperSPHARM coefficients are utilized in a support vector machine setting for object classification. HyperSPHARM yields nearly identical results to SPHARM, as will be shown in the paper. Its key advantage over SPHARM is computational: HyperSPHARM possesses greater computational efficiency than SPHARM because it can parameterize multiple disjoint structures using far fewer basis functions, and its stereographic projection obviates SPHARM's burdensome surface flattening. In addition, HyperSPHARM can handle any type of topology, unlike SPHARM, whose analysis is confined to topologically invariant structures. PMID:25828650
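    The first step of the HyperSPHARM construction, stereographically projecting 3D coordinates onto the unit hypersphere S^3, can be sketched as below. This uses a common textbook scaling convention, which may differ from the paper's exact one:

```python
import numpy as np

def inverse_stereographic(points_3d):
    """Map 3D points onto the unit 4-sphere S^3 via inverse stereographic
    projection: x -> (2x, |x|^2 - 1) / (|x|^2 + 1). Every output row has
    unit norm, so the whole point cloud lands on S^3 and can then be
    expanded in 4D hyperspherical harmonics."""
    X = np.asarray(points_3d, dtype=float)
    r2 = np.sum(X**2, axis=1, keepdims=True)
    return np.hstack([2 * X, r2 - 1]) / (r2 + 1)
```

    Because all disjoint surfaces are projected onto the same hypersphere, a single harmonic expansion covers the entire collection, which is the holistic treatment the abstract describes.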

  9. An observational model for biomechanical assessment of sprint kayaking technique.

    PubMed

    McDonnell, Lisa K; Hume, Patria A; Nolte, Volker

    2012-11-01

    Sprint kayaking stroke phase descriptions for biomechanical analysis of technique vary across the kayaking literature, with inconsistencies that are not conducive to the advancement of applied biomechanics service or research. We aimed to provide a consistent basis for the categorisation and analysis of sprint kayak technique by proposing a clear observational model. Electronic databases were searched using the key words kayak, sprint, technique, and biomechanics, with 20 sources reviewed. Nine phase-defining positions were identified within the kayak literature and were divided into three distinct types based on how the positions were defined: water-contact-defined positions, paddle-shaft-defined positions, and body-defined positions. Videos of elite paddlers from multiple camera views were reviewed to determine the visibility of positions used to define phases. The water-contact-defined positions of catch, immersion, extraction, and release were visible from multiple camera views, and were therefore suitable for practical use by coaches and researchers. Using these positions, phases and sub-phases were created for a new observational model. We recommend that kayaking data should be reported using single strokes and described using two phases: water and aerial. For more detailed analysis without disrupting the basic two-phase model, a four-sub-phase model consisting of entry, pull, exit, and aerial sub-phases should be used.

  10. Implementing technical refinement in high-level athletics: exploring the knowledge schemas of coaches.

    PubMed

    Kearney, Philip E; Carson, Howie J; Collins, Dave

    2018-05-01

    This paper explores the approaches adopted by high-level field athletics coaches when attempting to refine an athlete's already well-established technique (long and triple jump and javelin throwing). Six coaches, who had all coached multiple athletes to multiple major championships, took part in semi-structured interviews focused upon a recent example of technique refinement. Data were analysed using a thematic content analysis. The coaching tools reported were generally consistent with those advised by the existing literature, focusing on attaining "buy-in", utilising part-practice, restoring movement automaticity and securing performance under pressure. Five of the six coaches reported using a systematic sequence of stages to implement the refinement, although the number and content of these stages varied between them. Notably, however, there were no formal sources of knowledge (e.g., coach education or training) provided to inform coaches' decision making. Instead, coaches' decisions were largely based on experience both within and outside the sporting domain. Data offer a useful stimulus for reflection amongst sport practitioners confronted by the problem of technique refinement. Certainly the limited awareness of existing guidelines on technique refinement expressed by the coaches emphasises a need for further collaborative work by researchers and coach educators to disseminate best practice.

  11. RBT-GA: a novel metaheuristic for solving the multiple sequence alignment problem

    PubMed Central

    Taheri, Javid; Zomaya, Albert Y

    2009-01-01

    Background Multiple Sequence Alignment (MSA) has always been an active area of research in Bioinformatics. MSA is mainly focused on discovering biologically meaningful relationships among different sequences or proteins in order to investigate the underlying main characteristics/functions. This information is also used to generate phylogenetic trees. Results This paper presents a novel approach, namely RBT-GA, to solve the MSA problem using a hybrid solution methodology combining the Rubber Band Technique (RBT) and the Genetic Algorithm (GA) metaheuristic. RBT is inspired by the behavior of an elastic Rubber Band (RB) on a plate with several poles, which is analogous to locations in the input sequences that could potentially be biologically related. A GA attempts to mimic the evolutionary processes of life in order to locate optimal solutions in an often very complex landscape. RBT-GA is a population based optimization algorithm designed to find the optimal alignment for a set of input protein sequences. In this novel technique, each alignment answer is modeled as a chromosome consisting of several poles in the RBT framework. These poles resemble locations in the input sequences that are most likely to be correlated and/or biologically related. A GA-based optimization process improves these chromosomes gradually yielding a set of mostly optimal answers for the MSA problem. Conclusion RBT-GA is tested with one of the well-known benchmark suites (BALiBASE 2.0) in this area. The obtained results show the superiority of the proposed technique, even in the case of formidable sequences. PMID:19594869

  12. Key-Node-Separated Graph Clustering and Layouts for Human Relationship Graph Visualization.

    PubMed

    Itoh, Takayuki; Klein, Karsten

    2015-01-01

    Many graph-drawing methods apply node-clustering techniques based on the density of edges to find tightly connected subgraphs and then hierarchically visualize the clustered graphs. However, users may want to focus on important nodes and their connections to groups of other nodes for some applications. For this purpose, it is effective to separately visualize the key nodes detected based on adjacency and attributes of the nodes. This article presents a graph visualization technique for attribute-embedded graphs that applies a graph-clustering algorithm that accounts for the combination of connections and attributes. The graph clustering step divides the nodes according to the commonality of connected nodes and similarity of feature value vectors. It then calculates the distances between arbitrary pairs of clusters according to the number of connecting edges and the similarity of feature value vectors and finally places the clusters based on the distances. Consequently, the technique separates important nodes that have connections to multiple large clusters and improves the visibility of such nodes' connections. To test this technique, this article presents examples with human relationship graph datasets, including a coauthorship and Twitter communication network dataset.

  13. Optimal space communications techniques. [using digital and phase locked systems for signal processing

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1974-01-01

    Digital multiplication of two waveforms using delta modulation (DM) is discussed. It is shown that while conventional multiplication of two N-bit words requires N^2 complexity, multiplication using DM requires complexity that increases only linearly with N. Bounds on the signal-to-quantization-noise ratio (SNR) resulting from this multiplication are determined and compared with the SNR obtained using standard multiplication techniques. The phase locked loop (PLL) system, consisting of a phase detector, a voltage controlled oscillator, and a linear loop filter, is discussed in terms of its design and system advantages. Areas requiring further research are identified.
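    The linear complexity claimed above rests on delta modulation reducing each waveform to a 1-bit-per-sample stream. A minimal sketch of the DM encode/decode step (the step size and test signal are illustrative, not from the report):

```python
import numpy as np

def dm_encode(signal, step):
    """1-bit delta modulation: each output bit says whether the tracked
    staircase estimate should step up (+1) or down (-1) to follow the input."""
    bits, estimate = [], 0.0
    for sample in signal:
        bit = 1 if sample >= estimate else -1
        estimate += bit * step
        bits.append(bit)
    return bits

def dm_decode(bits, step):
    """Reconstruct the staircase approximation by accumulating the steps."""
    return np.cumsum(np.asarray(bits) * step)
```

    Multiplying two such bit streams needs only single-bit operations per sample, which is where the linear-in-N cost comes from, at the price of the quantization noise the report bounds.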

  14. Real-time optical multiple object recognition and tracking system and method

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin (Inventor); Liu, Hua Kuang (Inventor)

    1987-01-01

    The invention relates to an apparatus and associated methods for the optical recognition and tracking of multiple objects in real time. Multiple point spatial filters are employed that pre-define the objects to be recognized at run-time. The system takes the basic technology of a Vander Lugt filter and adds a hololens. The technique replaces time-, space- and cost-intensive digital techniques. In place of multiple objects, the system can also recognize multiple orientations of a single object. This latter capability has potential for space applications where space and weight are at a premium.

  15. Progressive structure-based alignment of homologous proteins: Adopting sequence comparison strategies.

    PubMed

    Joseph, Agnel Praveen; Srinivasan, Narayanaswamy; de Brevern, Alexandre G

    2012-09-01

    Comparison of multiple protein structures has a broad range of applications in the analysis of protein structure, function and evolution. Multiple structure alignment tools (MSTAs) are necessary to obtain a simultaneous comparison of a family of related folds. In this study, we have developed a method for multiple structure comparison largely based on sequence alignment techniques. A widely used Structural Alphabet named Protein Blocks (PBs) was used to transform the information on 3D protein backbone conformation into a 1D sequence string. A progressive alignment strategy similar to CLUSTALW was adopted for multiple PB sequence alignment (mulPBA). Highly similar stretches identified by the pairwise alignments are given higher weights during the alignment. The residue equivalences from PB-based alignments are used to obtain a three-dimensional fit of the structures, followed by an iterative refinement of the structural superposition. Systematic comparisons using benchmark datasets of MSTAs underline that the alignment quality is better than MULTIPROT, MUSTANG and the alignments in HOMSTRAD in more than 85% of the cases. Comparison with other rigid-body and flexible MSTAs also indicates that mulPBA alignments are superior to most of the rigid-body MSTAs and highly comparable to the flexible alignment methods. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  16. Frequency-domain method for measuring spectral properties in multiple-scattering media: methemoglobin absorption spectrum in a tissuelike phantom

    NASA Astrophysics Data System (ADS)

    Fishkin, Joshua B.; So, Peter T. C.; Cerussi, Albert E.; Gratton, Enrico; Fantini, Sergio; Franceschini, Maria Angela

    1995-03-01

    We have measured the optical absorption and scattering coefficient spectra of a multiple-scattering medium (i.e., a biological tissue-simulating phantom comprising a lipid colloid) containing methemoglobin by using frequency-domain techniques. The methemoglobin absorption spectrum determined in the multiple-scattering medium is in excellent agreement with a corrected methemoglobin absorption spectrum obtained from a steady-state spectrophotometer measurement of the optical density of a minimally scattering medium. The determination of the corrected methemoglobin absorption spectrum takes into account the scattering from impurities in the methemoglobin solution containing no lipid colloid. Frequency-domain techniques allow for the separation of the absorbing from the scattering properties of multiple-scattering media, and these techniques thus provide an absolute

  17. Low-Temperature Growth and Doping of Mercury-Based II-Vi Multiple Quantum Well Structures by Molecular Beam Epitaxy

    NASA Astrophysics Data System (ADS)

    Lansari, Yamina

    The growth of Hg-based single layers and multiple quantum well structures by conventional molecular beam epitaxy (MBE) and photoassisted MBE was studied. The use of photoassisted MBE, an epitaxial growth technique developed at NCSU, has resulted in a substantial reduction of the film growth temperature. Indeed, substrate temperatures 50 to 100 °C lower than those customarily used by others for conventional MBE growth of Hg-based layers were successfully employed. Photoassisted MBE allowed the preparation of excellent structural quality HgTe layers (FWHM for the (400) diffraction peak ~ 40 arcsec), HgCdTe layers (FWHM for the (400) diffraction peak ~ 14 arcsec), and HgTe-CdTe superlattices (FWHM for the (400) diffraction peak ~ 28 arcsec). In addition, n-type and p-type modulation-doping of Hg-based multilayers was accomplished by photoassisted MBE. This technique has been shown to have a significant effect on the growth process kinetics as well as on the desorption rates of the film species, thereby affecting dopant incorporation mechanisms and allowing for the successful substitutional doping of the multilayer structures. Finally, surface morphology studies were completed using scanning electron microscopy (SEM) and Nomarski optical microscopy to study the effects of substrate surface preparation, growth initiation, and growth parameters on the density of pyramidal hillocks, a common growth defect plaguing the Hg-based layers grown in the (100) direction. Conditions which minimize the hillock density for (100) film growth have been determined.

  18. Robust decentralized controller for minimizing coupling effect in single inductor multiple output DC-DC converter operating in continuous conduction mode.

    PubMed

    Medeiros, Renan Landau Paiva de; Barra, Walter; Bessa, Iury Valente de; Chaves Filho, João Edgar; Ayres, Florindo Antonio de Cavalho; Neves, Cleonor Crescêncio das

    2018-02-01

    This paper describes a novel robust decentralized control design methodology for a single inductor multiple output (SIMO) DC-DC converter. Based on a nominal multiple input multiple output (MIMO) plant model and performance requirements, a pairing input-output analysis is performed to select the suitable input to control each output aiming to attenuate the loop coupling. Thus, the plant uncertainty limits are selected and expressed in interval form with parameter values of the plant model. A single inductor dual output (SIDO) DC-DC buck converter board is developed for experimental tests. The experimental results show that the proposed methodology can maintain a desirable performance even in the presence of parametric uncertainties. Furthermore, the performance indexes calculated from experimental data show that the proposed methodology outperforms classical MIMO control techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  19. A Tracking Analyst for large 3D spatiotemporal data from multiple sources (case study: Tracking volcanic eruptions in the atmosphere)

    NASA Astrophysics Data System (ADS)

    Gad, Mohamed A.; Elshehaly, Mai H.; Gračanin, Denis; Elmongui, Hicham G.

    2018-02-01

    This research presents a novel Trajectory-based Tracking Analyst (TTA) that can track and link spatiotemporally variable data from multiple sources. The proposed technique uses trajectory information to determine the positions of time-enabled and spatially variable scatter data at any given time through a combination of along trajectory adjustment and spatial interpolation. The TTA is applied in this research to track large spatiotemporal data of volcanic eruptions (acquired using multi-sensors) in the unsteady flow field of the atmosphere. The TTA enables tracking injections into the atmospheric flow field, the reconstruction of the spatiotemporally variable data at any desired time, and the spatiotemporal join of attribute data from multiple sources. In addition, we were able to create a smooth animation of the volcanic ash plume at interactive rates. The initial results indicate that the TTA can be applied to a wide range of multiple-source data.
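    The along-trajectory reconstruction described above can, in its simplest form, be reduced to interpolating a tracked parcel's position between time-stamped samples. The sketch below is a hedged illustration with invented data, not the TTA code:

```python
def position_at(track, t):
    """Linearly interpolate a position at time t.

    track: list of (time, (x, y, z)) tuples sorted by time; query times
    outside the track are clamped to the endpoints.
    """
    if t <= track[0][0]:
        return track[0][1]
    if t >= track[-1][0]:
        return track[-1][1]
    for (t0, p0), (t1, p1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return tuple(a + f * (b - a) for a, b in zip(p0, p1))

# Invented time-stamped positions of one tracked ash parcel:
track = [(0.0, (0.0, 0.0, 5.0)),
         (10.0, (4.0, 2.0, 6.0)),
         (20.0, (10.0, 2.0, 8.0))]
p = position_at(track, 15.0)
```

    Reconstructing a whole plume at one instant then amounts to applying this per-trajectory step to every parcel, followed by spatial interpolation of the scattered results.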

  20. Digital processing of array seismic recordings

    USGS Publications Warehouse

    Ryall, Alan; Birtill, John

    1962-01-01

    This technical letter contains a brief review of the operations which are involved in digital processing of array seismic recordings by the methods of velocity filtering, summation, cross-multiplication and integration, and by combinations of these operations (the "UK Method" and multiple correlation). Examples are presented of analyses by the several techniques on array recordings which were obtained by the U.S. Geological Survey during chemical and nuclear explosions in the western United States. Seismograms are synthesized using actual noise and Pn-signal recordings, such that the signal-to-noise ratio, onset time and velocity of the signal are predetermined for the synthetic record. These records are then analyzed by summation, cross-multiplication, multiple correlation and the UK technique, and the results are compared. For all of the examples presented, analysis by the non-linear techniques of multiple correlation and cross-multiplication of the traces on an array recording is preferred to analysis by the linear operations involved in summation and the UK Method.
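    The linear (summation) and non-linear (cross-multiplication) operations compared in this letter can be illustrated on traces that are already time-aligned. The Python sketch below uses synthetic numbers; it illustrates the two operations only and is not the Survey's processing code:

```python
def delay_and_sum(traces):
    """Linear beam: sample-wise mean of time-aligned traces."""
    n = len(traces)
    return [sum(col) / n for col in zip(*traces)]

def cross_multiply(traces):
    """Non-linear beam: sample-wise product of time-aligned traces.
    A coherent arrival multiplies up; small incoherent noise multiplies down."""
    out = []
    for col in zip(*traces):
        prod = 1.0
        for v in col:
            prod *= v
        out.append(prod)
    return out

# A coherent Pn-like pulse on three sensors with independent noise:
sig = [0.0, 1.0, 2.0, 1.0, 0.0]
noise = [[0.1, -0.2, 0.1, 0.0, -0.1],
         [-0.1, 0.1, -0.2, 0.1, 0.0],
         [0.0, -0.1, 0.1, -0.1, 0.1]]
traces = [[s + e for s, e in zip(sig, row)] for row in noise]
beam = delay_and_sum(traces)
prod_trace = cross_multiply(traces)
```

    Both beams peak at the signal sample, but the product trace suppresses the off-peak noise far more strongly, which is the letter's argument for the non-linear techniques.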

  1. Assessing the nonlinear response of fine particles to precursor emissions: Development and application of an extended response surface modeling technique v1.0

    DOE PAGES

    Zhao, B.; Wang, S. X.; Xing, J.; ...

    2015-01-30

    An innovative extended response surface modeling technique (ERSM v1.0) is developed to characterize the nonlinear response of fine particles (PM2.5) to large and simultaneous changes of multiple precursor emissions from multiple regions and sectors. The ERSM technique is developed based on the conventional response surface modeling (RSM) technique; it first quantifies the relationship between PM2.5 concentrations and the emissions of gaseous precursors from each single region using the conventional RSM technique, and then assesses the effects of inter-regional transport of PM2.5 and its gaseous precursors on PM2.5 concentrations in the target region. We apply this novel technique with a widely used regional chemical transport model (CTM) over the Yangtze River delta (YRD) region of China, and evaluate the response of PM2.5 and its inorganic components to the emissions of 36 pollutant–region–sector combinations. The predicted PM2.5 concentrations agree well with independent CTM simulations; the correlation coefficients are larger than 0.98 and 0.99, and the mean normalized errors (MNEs) are less than 1 and 2% for January and August, respectively. It is also demonstrated that the ERSM technique could reproduce fairly well the response of PM2.5 to continuous changes of precursor emission levels between zero and 150%. Employing this new technique, we identify the major sources contributing to PM2.5 and its inorganic components in the YRD region. The nonlinearity in the response of PM2.5 to emission changes is characterized and the underlying chemical processes are illustrated.
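    The core idea of any response surface is to replace costly CTM runs with a cheap surrogate fitted to a few sampled emission levels. The one-dimensional sketch below uses an invented stand-in for the CTM and exact quadratic interpolation; the real ERSM spans many pollutant–region–sector dimensions and uses a more elaborate fitting scheme:

```python
def ctm_response(r):
    """Invented stand-in for a costly CTM run: PM2.5 vs emission scaling r."""
    return 40.0 * r - 8.0 * r * r + 5.0   # nonlinear, illustrative only

samples = [0.0, 0.75, 1.5]                # emission ratios actually simulated
values = [ctm_response(r) for r in samples]

def surrogate(r):
    """Quadratic Lagrange interpolant through the sampled CTM runs."""
    total = 0.0
    for ri, vi in zip(samples, values):
        w = 1.0
        for rj in samples:
            if rj != ri:
                w *= (r - rj) / (ri - rj)  # Lagrange basis polynomial at r
        total += vi * w
    return total

pm_at_unity = surrogate(1.0)              # prediction at baseline emissions
```

    Once fitted, the surrogate can be evaluated at any emission level between zero and 150% of baseline at negligible cost, which is what makes the nonlinearity analysis in the abstract tractable.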

  2. Solar multi-conjugate adaptive optics based on high order ground layer adaptive optics and low order high altitude correction.

    PubMed

    Zhang, Lanqiang; Guo, Youming; Rao, Changhui

    2017-02-20

    Multi-conjugate adaptive optics (MCAO) is the most promising technique currently developed to enlarge the corrected field of view of adaptive optics for astronomy. In this paper, we propose a new configuration of solar MCAO based on high order ground layer adaptive optics and low order high altitude correction, which results in a homogeneous correction effect across the whole field of view. An individual high order multiple-direction Shack-Hartmann wavefront sensor is employed in the configuration to detect the ground layer turbulence for low altitude correction, while a second, low order multiple-direction Shack-Hartmann wavefront sensor supplies the wavefront information caused by high-layer turbulence, via atmospheric tomography, for high altitude correction. Simulation results based on the system design at the 1-meter New Vacuum Solar Telescope show that the correction uniformity of the new scheme is clearly improved compared to the conventional solar MCAO configuration.

  3. The FORE-SCE model: a practical approach for projecting land cover change using scenario-based modeling

    USGS Publications Warehouse

    Sohl, Terry L.; Sayler, Kristi L.; Drummond, Mark A.; Loveland, Thomas R.

    2007-01-01

    A wide variety of ecological applications require spatially explicit, historic, current, and projected land use and land cover data. The U.S. Land Cover Trends project is analyzing contemporary (1973–2000) land-cover change in the conterminous United States. The newly developed FORE-SCE model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land cover change through 2020 for multiple plausible scenarios. Projected proportions of future land use were initially developed, and then sited on the lands with the highest potential for supporting that land use and land cover using a statistically based stochastic allocation procedure. Three scenarios of 2020 land cover were mapped for the western Great Plains in the US. The model provided realistic, high-resolution, scenario-based land-cover products suitable for multiple applications, including studies of climate and weather variability, carbon dynamics, and regional hydrology.
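    The statistically based stochastic allocation step can be sketched as weighted random sampling of candidate cells by suitability, without replacement, until the scenario's demanded area is placed. The cell names and scores below are invented for illustration; this is not the FORE-SCE implementation:

```python
import random

def allocate(suitability, n_cells, seed=42):
    """Pick n_cells distinct cells, with probability proportional to suitability."""
    rng = random.Random(seed)
    chosen, available = [], dict(suitability)
    for _ in range(n_cells):
        cells = list(available)
        weights = [available[c] for c in cells]
        pick = rng.choices(cells, weights=weights, k=1)[0]
        chosen.append(pick)
        del available[pick]          # a cell is converted at most once
    return chosen

# Invented suitability surface for one land-use class:
suit = {"c1": 0.9, "c2": 0.1, "c3": 0.7, "c4": 0.05, "c5": 0.6}
selected = allocate(suit, 3)
```

    The stochastic draw is what lets repeated model runs produce plausible but non-identical landscapes for the same scenario, while still concentrating conversion on the most suitable land.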

  4. Evaluations and Comparisons of Treatment Effects Based on Best Combinations of Biomarkers with Applications to Biomedical Studies

    PubMed Central

    Chen, Xiwei; Yu, Jihnhee

    2014-01-01

    Many clinical and biomedical studies evaluate treatment effects based on multiple biomarkers that commonly consist of pre- and post-treatment measurements. Some biomarkers can show significant positive treatment effects, while other biomarkers can reflect no effect or even negative effects of the treatments, giving rise to a need for methodologies that correctly and efficiently evaluate the treatment effects based on multiple biomarkers as a whole. In the setting of pre- and post-treatment measurements of multiple biomarkers, we propose to apply a receiver operating characteristic (ROC) curve methodology based on the best combination of biomarkers, i.e., the linear combination maximizing the area under the ROC curve (AUC)-type criterion among all possible linear combinations. In the particular case with independent pre- and post-treatment measurements, we show that the proposed method recovers the well-known result of Su and Liu (1993). Further, proceeding from the derived best combinations of biomarkers' measurements, we propose an efficient technique via likelihood ratio tests to compare treatment effects. We present an extensive Monte Carlo study confirming the superiority of the proposed test for comparing treatment effects based on multiple biomarkers in a paired data setting. For practical applications, the proposed method is illustrated with a randomized trial of chlorhexidine gluconate on oral bacterial pathogens in mechanically ventilated patients as well as a treatment study for children with attention deficit-hyperactivity disorder and severe mood dysregulation. PMID:25019920
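    The idea of a best linear combination of biomarkers under the AUC criterion can be illustrated with a coarse grid search over combination directions (the paper derives better-founded solutions). The data below are synthetic:

```python
import math

def auc(pos, neg):
    """Empirical AUC: fraction of (case, control) pairs ranked correctly,
    counting ties as one half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def best_combo(cases, controls, steps=180):
    """Grid search over directions w = (cos a, sin a) for the linear
    combination of two biomarkers with the largest empirical AUC."""
    best_val, best_w = -1.0, (1.0, 0.0)
    for k in range(steps):
        a = math.pi * k / steps
        w = (math.cos(a), math.sin(a))
        score = lambda xy: w[0] * xy[0] + w[1] * xy[1]
        val = auc([score(c) for c in cases], [score(c) for c in controls])
        if val > best_val:
            best_val, best_w = val, w
    return best_val, best_w

# Synthetic pre/post-treatment changes of two biomarkers:
cases = [(2.1, 0.3), (1.8, -0.1), (2.5, 0.2), (1.9, 0.4)]
controls = [(1.0, 0.5), (0.8, 0.6), (1.2, 0.2), (0.7, 0.9)]
val, w = best_combo(cases, controls)
```

    For normally distributed markers the optimal weights have a closed form; the brute-force search here only serves to make the "maximize AUC over linear combinations" criterion concrete.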

  5. FALSTAFF: A new tool for fission studies

    NASA Astrophysics Data System (ADS)

    Dore, D.; Farget, F.; Lecolley, F.-R.; Lehaut, G.; Materna, T.; Pancin, J.; Panebianco, S.; Papaevangelou, Th.

    2013-12-01

    The future NFS installation will produce high-intensity neutron beams from hundreds of keV up to 40 MeV. Taking advantage of this facility, data of particular interest to the nuclear community for the development of fast reactor technology will be measured. The development of an experimental setup called FALSTAFF for a full characterization of actinide fission fragments has been undertaken. Fission fragment isotopic yields and associated neutron multiplicities will be measured as a function of the neutron energy. Based on the time-of-flight and residual-energy technique, the setup will allow the simultaneous measurement of the complementary fragments' velocity and energy. The performance of the FALSTAFF TOF detectors will be presented, and the expected resolutions for fragment masses and neutron multiplicities, based on realistic simulations, will be shown.

  6. Tuning Fluorescence Direction with Plasmonic Metal–Dielectric–Metal Substrates

    PubMed Central

    Choudhury, Sharmistha Dutta; Badugu, Ramachandram; Nowaczyk, Kazimierz; Ray, Krishanu; Lakowicz, Joseph R.

    2013-01-01

    Controlling the emission properties of fluorophores is essential for improving the performance of fluorescence-based techniques in modern biochemical research, medical diagnosis, and sensing. Fluorescence emission is isotropic in nature, which makes it difficult to capture more than a small fraction of the total emission. Metal–dielectric–metal (MDM) substrates, discussed in this Letter, convert isotropic fluorescence into beaming emission normal to the substrate. This improves fluorescence collection efficiency and also opens up new avenues for a wide range of fluorescence-based applications. We suggest that MDM substrates can be readily adapted for multiple uses, such as in microarray formats, for directional fluorescence studies of multiple probes or for molecule-specific sensing with a high degree of spatial control over the fluorescence emission. PMID:24013521

  7. Sparsity-Aware DOA Estimation Scheme for Noncircular Source in MIMO Radar

    PubMed Central

    Wang, Xianpeng; Wang, Wei; Li, Xin; Liu, Qi; Liu, Jing

    2016-01-01

    In this paper, a novel sparsity-aware direction of arrival (DOA) estimation scheme for a noncircular source is proposed in multiple-input multiple-output (MIMO) radar. In the proposed method, the reduced-dimensional transformation technique is adopted to eliminate the redundant elements. Then, exploiting the noncircularity of signals, a joint sparsity-aware scheme based on the reweighted l1 norm penalty is formulated for DOA estimation, in which the diagonal elements of the weight matrix are the coefficients of the noncircular MUSIC-like (NC MUSIC-like) spectrum. Compared to the existing l1 norm penalty-based methods, the proposed scheme provides higher angular resolution and better DOA estimation performance. Results from numerical experiments are used to show the effectiveness of our proposed method. PMID:27089345

  8. Treatment Planning and Image Guidance for Radiofrequency Ablations of Large Tumors

    PubMed Central

    Ren, Hongliang; Campos-Nanez, Enrique; Yaniv, Ziv; Banovac, Filip; Abeledo, Hernan; Hata, Nobuhiko; Cleary, Kevin

    2014-01-01

    This article addresses the two key challenges in computer-assisted percutaneous tumor ablation: planning multiple overlapping ablations for large tumors while avoiding critical structures, and executing the prescribed plan. Towards semi-automatic treatment planning for image-guided surgical interventions, we develop a systematic approach to the needle-based ablation placement task, ranging from pre-operative planning algorithms to an intra-operative execution platform. The planning system incorporates clinical constraints on ablations and trajectories using a multiple objective optimization formulation, which consists of optimal path selection and ablation coverage optimization based on integer programming. The system implementation is presented and validated in phantom studies and on an animal model. The presented system can potentially be further extended for other ablation techniques such as cryotherapy. PMID:24235279
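    The ablation-coverage subproblem, covering every tumor voxel with at least one ablation sphere, is solved in the paper by integer programming; a greedy set-cover heuristic makes the structure of that problem concrete. The geometry below is invented:

```python
def covered(voxel, center, r):
    """True if a voxel lies inside an ablation sphere of radius r."""
    return sum((a - b) ** 2 for a, b in zip(voxel, center)) <= r * r

def plan_ablations(tumor_voxels, candidate_centers, radius):
    """Greedily add the candidate sphere covering the most remaining voxels."""
    remaining = set(tumor_voxels)
    plan = []
    while remaining:
        best = max(candidate_centers,
                   key=lambda c: sum(covered(v, c, radius) for v in remaining))
        gain = {v for v in remaining if covered(v, best, radius)}
        if not gain:
            raise ValueError("candidate centers cannot cover the tumor")
        plan.append(best)
        remaining -= gain
    return plan

# Invented geometry: an 8-voxel slab and three candidate sphere centers.
tumor = [(x, y, 0) for x in range(4) for y in range(2)]
centers = [(0.5, 0.5, 0), (2.5, 0.5, 0), (1.5, 0.5, 0)]
plan = plan_ablations(tumor, centers, radius=1.6)
```

    The integer-programming formulation in the paper additionally scores candidate needle trajectories and excludes those crossing critical structures, constraints a greedy cover does not capture.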

  9. Turning challenges into design principles: Telemonitoring systems for patients with multiple chronic conditions.

    PubMed

    Sultan, Mehwish; Kuluski, Kerry; McIsaac, Warren J; Cafazzo, Joseph A; Seto, Emily

    2018-01-01

    People with multiple chronic conditions often struggle with managing their health. The purpose of this research was to identify specific challenges of patients with multiple chronic conditions and to use the findings to form design principles for a telemonitoring system tailored for these patients. Semi-structured interviews with 15 patients with multiple chronic conditions and 10 clinicians were conducted to gain an understanding of their needs and preferences for a smartphone-based telemonitoring system. The interviews were analyzed using a conventional content analysis technique, resulting in six themes. Design principles developed from the themes included that the system must be modular to accommodate various combinations of conditions, reinforce a routine, consolidate record keeping, as well as provide actionable feedback to the patients. Designing an application for multiple chronic conditions is complex due to variability in patient conditions, and therefore, design principles developed in this study can help with future innovations aimed to help manage this population.

  10. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    NASA Technical Reports Server (NTRS)

    Kautzmann, Frank N., III

    1988-01-01

    Expert systems that support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule-model resolution, and multiple uses of knowledge representation are included. A series of prototypes are being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. The study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it involves models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert systems. The broadest possible category of interacting expert systems is described, along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  11. Diagnosis of 25 genotypes of human papillomaviruses for their physical statuses in cervical precancerous/cancerous lesions: a comparison of E2/E6E7 ratio-based vs. multiple E1-L1/E6E7 ratio-based detection techniques.

    PubMed

    Zhang, Rong; He, Yi-feng; Chen, Mo; Chen, Chun-mei; Zhu, Qiu-jing; Lu, Huan; Wei, Zhen-hong; Li, Fang; Zhang, Xiao-xin; Xu, Cong-jian; Yu, Long

    2014-10-02

    Cervical lesions caused by integrated human papillomavirus (HPV) infection are highly dangerous because they can quickly develop into invasive cancers. However, clinicians are currently hampered by the lack of a quick, convenient and precise technique to detect integrated/mixed infections of various genotypes of HPVs in the cervix. This study aimed to develop a practical tool to determine the physical status of different HPVs and evaluate its clinical significance. The target population comprised 1162 women with an HPV infection history of > six months and an abnormal cervical cytological finding. The multiple E1-L1/E6E7 ratio analysis, a novel technique, was developed based on determining the ratios of E1/E6E7, E2/E6E7, E4E5/E6E7, L2/E6E7 and L1/E6E7 within the viral genome. Any imbalanced ratios indicate integration. Its diagnostic and predictive performances were compared with those of E2/E6E7 ratio analysis. The detection accuracy of both techniques was evaluated using the gold-standard technique "detection of integrated papillomavirus sequences" (DIPS). To realize a multigenotypic detection goal, a primer and probe library was established. The integration rate of a particular genotype of HPV was correlated with its tumorigenic potential and women with higher lesion grades often carried lower viral loads. The E1-L1/E6E7 ratio analysis achieved 92.7% sensitivity and 99.0% specificity in detecting HPV integration, while the E2/E6E7 ratio analysis showed a much lower sensitivity (75.6%) and a similar specificity (99.3%). Interference due to episomal copies was observed in both techniques, leading to false-negative results. However, some positive results of E1-L1/E6E7 ratio analysis were missed by DIPS due to its stochastic detection nature. 
The E1-L1/E6E7 ratio analysis is more efficient than E2/E6E7 ratio analysis and DIPS in predicting precancerous/cancerous lesions, in which both positive predictive values (36.7%-82.3%) and negative predictive values (75.9%-100%) were highest (based on the results of three rounds of biopsies). The multiple E1-L1/E6E7 ratio analysis is more sensitive and predictive than E2/E6E7 ratio analysis as a triage test for detecting HPV integration. It can effectively narrow the range of candidates for colposcopic examination and cervical biopsy, thereby lowering the expense of cervical cancer prevention.
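    The ratio test itself is arithmetically simple: each region's copy number is divided by the E6E7 copy number, and a markedly reduced ratio suggests that region is disrupted by integration. The sketch below uses an invented 0.8 cutoff and made-up copy numbers, not the study's validated thresholds:

```python
def integration_flags(copies, cutoff=0.8):
    """Flag genome regions whose copy-number ratio to E6E7 falls below cutoff."""
    e6e7 = copies["E6E7"]
    return {gene: (n / e6e7) < cutoff
            for gene, n in copies.items() if gene != "E6E7"}

# Made-up qPCR copy numbers: E2 and L1 are depleted relative to E6E7,
# the pattern expected when integration disrupts those regions.
sample = {"E1": 950.0, "E2": 120.0, "E4E5": 900.0,
          "L2": 980.0, "L1": 130.0, "E6E7": 1000.0}
flags = integration_flags(sample)
is_integrated = any(flags.values())
```

    Probing five regions rather than E2 alone is what gives the multiple-ratio analysis its sensitivity advantage: an integration breakpoint outside E2 still imbalances at least one of the other ratios.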

  12. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi were qualitatively and quantitatively analysed using X-ray diffraction, infra-red spectroscopy, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features that often go undetected owing to limitations of any single analytical procedure have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach casts some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism was also found. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed, and the multiple technique approach has been evaluated as a whole.

  13. Multiple emulsions: an overview.

    PubMed

    Khan, Azhar Yaqoob; Talegaonkar, Sushama; Iqbal, Zeenat; Ahmed, Farhan Jalees; Khar, Roop Krishan

    2006-10-01

    Multiple emulsions are complex polydispersed systems in which both oil-in-water and water-in-oil emulsions exist simultaneously, stabilized by lipophilic and hydrophilic surfactants, respectively. The ratio of these surfactants is important in achieving stable multiple emulsions. Of the water-in-oil-in-water (w/o/w) and oil-in-water-in-oil (o/w/o) types of multiple emulsions, the former has the wider area of application and has therefore been studied in greater detail. Formulation, preparation techniques and in vitro characterization methods for multiple emulsions are reviewed. Various factors affecting the stability of multiple emulsions, and the stabilization approaches with specific reference to the w/o/w type, are discussed in detail. Favorable drug release mechanisms and rates, along with the in vivo fate of multiple emulsions, make them a versatile carrier. They find a wide range of applications in controlled or sustained drug delivery, targeted delivery, taste masking, bioavailability enhancement, enzyme immobilization, etc. Multiple emulsions have also been employed as an intermediate step in the microencapsulation process and are systems of increasing interest for the oral delivery of hydrophilic drugs that are unstable in the gastrointestinal tract, such as proteins and peptides. With advances in techniques for the preparation, stabilization and rheological characterization of multiple emulsions, they should be able to provide a novel carrier system for drugs, cosmetics and pharmaceutical agents. In this review, emphasis is laid on formulation, stabilization techniques and potential applications of multiple emulsion systems.

  14. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    NASA Technical Reports Server (NTRS)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.

  15. HoloHands: games console interface for controlling holographic optical manipulation

    NASA Astrophysics Data System (ADS)

    McDonald, C.; McPherson, M.; McDougall, C.; McGloin, D.

    2013-03-01

    The increasing number of applications for holographic manipulation techniques has sparked the development of more accessible control interfaces. Here, we describe a holographic optical tweezers experiment which is controlled by gestures that are detected by a Microsoft Kinect. We demonstrate that this technique can be used to calibrate the tweezers using the Stokes drag method and compare this to automated calibrations. We also show that multiple particle manipulation can be handled. This is a promising new line of research for gesture-based control which could find applications in a wide variety of experimental situations.

  16. Multiple exposure photographic (MEP) technique: an objective assessment of sperm motility in infertility management.

    PubMed

    Adetoro, O O

    1988-06-01

    Multiple exposure photography (MEP), an objective technique, was used to determine the percentage of motile sperm in semen samples from 41 males being investigated for infertility. This technique was compared with the conventional, subjective method of assessing spermatozoal motility by ordinary microscopy. A satisfactory correlation was observed between percentage sperm motility assessed by the two methods, but the MEP estimate was more consistent and reliable. The value of this technique of sperm motility study in the developing world is discussed.

  17. An Econometric Approach to Evaluate Navy Advertising Efficiency.

    DTIC Science & Technology

    1996-03-01

    This thesis uses an econometric approach to systematically and comprehensively analyze Navy advertising and recruiting data to determine advertising cost efficiency in the Navy recruiting process. Current recruiting and advertising cost data are merged into an appropriate data base and evaluated using multiple regression techniques to assess the relationships between Navy advertising expenditures and recruit contracts attained.

  18. Identifying Students' Mathematical Skills from a Multiple-Choice Diagnostic Test Using an Iterative Technique to Minimise False Positives

    ERIC Educational Resources Information Center

    Manning, S.; Dix, A.

    2008-01-01

    There is anecdotal evidence that a significant number of students studying computing related courses at degree level have difficulty with sub-GCE mathematics. Testing of students' skills is often performed using diagnostic tests and a number of computer-based diagnostic tests exist, which work, essentially, by testing one specific diagnostic skill…

  19. Improving Systematic Constraint-driven Analysis Using Incremental and Parallel Techniques

    DTIC Science & Technology

    2012-05-01


  20. Phenotypic evaluation and genome wide association studies of two common bean (Phaseolus vulgaris) diversity panels in multiple locations highlight evaluation techniques, traits and lines useful for trait based selection

    USDA-ARS?s Scientific Manuscript database

    Common bean (Phaseolus vulgaris) productivity is constrained by abiotic soil conductions including drought and low fertility as well as by high temperature. High temperature primarily impacts pollen viability and growth. Soil water content and nutrients occur heterogeneously and often in a stratif...

  1. 3D Mapping of Language Networks in Clinical and Pre-Clinical Alzheimer's Disease

    ERIC Educational Resources Information Center

    Apostolova, Liana G.; Lu, Po; Rogers, Steve; Dutton, Rebecca A.; Hayashi, Kiralee M.; Toga, Arthur W.; Cummings, Jeffrey L.; Thompson, Paul M.

    2008-01-01

    We investigated the associations between Boston naming and the animal fluency tests and cortical atrophy in 19 probable AD and 5 multiple domain amnestic mild cognitive impairment patients who later converted to AD. We applied a surface-based computational anatomy technique to MRI scans of the brain and then used linear regression models to detect…

  2. Target Classification of Canonical Scatterers Using Classical Estimation and Dictionary Based Techniques

    DTIC Science & Technology

    2012-03-22

    shapes tested, when the objective parameter set was confined to a dictionary's defined parameter space. These physical characteristics included... The basis pursuit de-noising (BPDN) algorithm is chosen to perform extraction due to its inherent efficiency and error tolerance. Multiple shape dictionaries

  3. Transshipment site selection using the AHP and TOPSIS approaches under fuzzy environment.

    PubMed

    Onüt, Semih; Soner, Selin

    2008-01-01

    Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem. But most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker to describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on the site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.
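    The crisp core of TOPSIS, which the paper extends with triangular fuzzy numbers, ranks alternatives by relative closeness to an ideal solution under externally supplied (e.g. AHP-derived) weights. The decision matrix below is invented:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) on weighted criteria (columns).
    benefit[j] is True for criteria to maximize, False for cost criteria."""
    ncols = len(weights)
    # column-wise vector normalization, then weighting
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to ideal solution
        d_neg = math.dist(row, worst)   # distance to anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Invented candidate sites scored on capacity, cost, accessibility:
sites = [[7.0, 3.0, 9.0],
         [9.0, 6.0, 7.0],
         [5.0, 2.0, 6.0]]
scores = topsis(sites, weights=[0.5, 0.3, 0.2], benefit=[True, False, True])
best_site = scores.index(max(scores))
```

    The fuzzy variant in the paper replaces each crisp matrix entry with a triangular fuzzy number built from linguistic ratings, and the AHP step produces the weight vector from pairwise criteria comparisons.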

  4. MultiFacet: A Faceted Interface for Browsing Large Multimedia Collections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Michael J.; Hampton, Shawn D.; Endert, Alexander

    2013-10-31

Faceted browsing is a common technique for exploring collections where the data can be grouped into a number of pre-defined categories, most often generated from textual metadata. Historically, faceted browsing has been applied to a single data type such as text or image data. However, typical collections contain multiple data types, such as information from web pages that contain text, images, and video. Additionally, when browsing a collection of images and video, facets are often created based on the metadata, which may be incomplete, inaccurate, or missing altogether, instead of the actual visual content contained within those images and video. In this work we address these limitations by presenting MultiFacet, a faceted browsing interface that supports multiple data types. MultiFacet constructs facets for images and video in a collection from the visual content using computer vision techniques. These visual facets can then be browsed in conjunction with text facets within a single interface to reveal relationships and phenomena within multimedia collections. Additionally, we present a use case based on real-world data, demonstrating the utility of this approach towards browsing a large multimedia data collection.

  5. LISA Framework for Enhancing Gravitational Wave Signal Extraction Techniques

    NASA Technical Reports Server (NTRS)

    Thompson, David E.; Thirumalainambi, Rajkumar

    2006-01-01

This paper describes the development of a Framework for benchmarking and comparing signal-extraction and noise-interference-removal methods that are applicable to interferometric Gravitational Wave detector systems. The primary use is towards comparing signal and noise extraction techniques at LISA frequencies from multiple (possibly confused) gravitational wave sources. The Framework includes extensive hybrid learning/classification algorithms, as well as post-processing regularization methods, and is based on a unique plug-and-play (component) architecture. Published methods for signal extraction and interference removal at LISA frequencies are being encoded, as well as multiple source noise models, so that the stiffness of GW Sensitivity Space can be explored under each combination of methods. Furthermore, synthetic datasets and source models can be created and imported into the Framework, and specific degraded numerical experiments can be run to test the flexibility of the analysis methods. The Framework also supports use of full current LISA Testbeds, Synthetic data systems, and Simulators already in existence through plug-ins and wrappers, thus preserving those legacy codes and systems intact. Because of the component-based architecture, all selected procedures can be registered or de-registered at run-time, and are completely reusable, reconfigurable, and modular.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

Onüt, Semih; Soner, Selin

Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem. But most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker to describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on the site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.

  7. Techniques for estimating Space Station aerodynamic characteristics

    NASA Technical Reports Server (NTRS)

    Thomas, Richard E.

    1993-01-01

    A method was devised and calculations were performed to determine the effects of reflected molecules on the aerodynamic force and moment coefficients for a body in free molecule flow. A procedure was developed for determining the velocity and temperature distributions of molecules reflected from a surface of arbitrary momentum and energy accommodation. A system of equations, based on momentum and energy balances for the surface, incident, and reflected molecules, was solved by a numerical optimization technique. The minimization of a 'cost' function, developed from the set of equations, resulted in the determination of the defining properties of the flow reflected from the arbitrary surface. The properties used to define both the incident and reflected flows were: average temperature of the molecules in the flow, angle of the flow with respect to a vector normal to the surface, and the molecular speed ratio. The properties of the reflected flow were used to calculate the contribution of multiply reflected molecules to the force and moments on a test body in the flow. The test configuration consisted of two flat plates joined along one edge at a right angle to each other. When force and moment coefficients of this 90 deg concave wedge were compared to results that did not include multiple reflections, it was found that multiple reflections could nearly double lift and drag coefficients, with nearly a 50 percent increase in pitching moment for cases with specular or nearly specular accommodation. The cases of diffuse or nearly diffuse accommodation often had minor reductions in axial and normal forces when multiple reflections were included. There were several cases of intermediate accommodation where the addition of multiple reflection effects more than tripled the lift coefficient over the convex technique.

  8. A Unified Approach to Functional Principal Component Analysis and Functional Multiple-Set Canonical Correlation.

    PubMed

    Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S

    2017-06-01

    Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.
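A minimal sketch of the basis-expansion route to FPCA (ordinary PCA on basis coefficients, a standard approximation; this is not the paper's unified FPCA/FMCCA criterion or its alternating regularized least squares algorithm, and the synthetic curves below are invented):

```python
import numpy as np

def fpca_scores(curves, n_basis=8, n_comp=2):
    """FPCA via basis expansion: fit each sampled curve with a truncated
    Fourier basis by least squares, then run ordinary PCA (SVD) on the
    centred basis coefficients. Returns the component scores."""
    n, m = curves.shape
    t = np.linspace(0.0, 1.0, m)
    funcs = [np.ones(m)]
    for k in range(1, n_basis // 2 + 1):
        funcs += [np.cos(2 * np.pi * k * t), np.sin(2 * np.pi * k * t)]
    B = np.array(funcs[:n_basis]).T                     # (m, n_basis) design matrix
    coef = curves @ B @ np.linalg.inv(B.T @ B)          # per-curve LS coefficients
    coef = coef - coef.mean(axis=0)                     # centre across curves
    U, s, _ = np.linalg.svd(coef, full_matrices=False)
    return U[:, :n_comp] * s[:n_comp]                   # principal component scores

# 50 synthetic curves driven by two random functional modes plus light noise
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 200)
a, c = rng.standard_normal((2, 50, 1))
curves = a * np.sin(2 * np.pi * t) + c * np.cos(4 * np.pi * t) \
         + 0.01 * rng.standard_normal((50, 200))
scores = fpca_scores(curves)
```

Working in coefficient space is what makes the functional problem finite-dimensional; the paper's compromise between FPCA and FMCCA operates on the same kind of basis-approximated representation.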

  9. Non-Orthogonal Multiple Access for Ubiquitous Wireless Sensor Networks.

    PubMed

    Anwar, Asim; Seet, Boon-Chong; Ding, Zhiguo

    2018-02-08

Ubiquitous wireless sensor networks (UWSNs) have become a critical technology for enabling smart cities and other ubiquitous monitoring applications. Their deployment, however, can be seriously hampered by the limited spectrum available for the sheer number of sensors to communicate. To support the communication needs of UWSNs without requiring more spectrum resources, the power-domain non-orthogonal multiple access (NOMA) technique originally proposed for 5th Generation (5G) cellular networks is investigated for UWSNs for the first time in this paper. However, unlike 5G networks that operate in the licensed spectrum, UWSNs mostly operate in unlicensed spectrum where sensors also experience cross-technology interference from other devices sharing the same spectrum. In this paper, we model the interference from various sources at the sensors using a stochastic geometry framework. To evaluate the performance, we derive a theorem and present a new closed-form expression for the outage probability of the sensors in a downlink scenario under an interference-limited environment. In addition, diversity analysis for the ordered NOMA users is performed. Based on the derived outage probability, we evaluate the average link throughput and energy consumption efficiency of NOMA against the conventional orthogonal multiple access (OMA) technique in UWSNs. Further, the required computational complexity for the NOMA users is presented.
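The power-domain NOMA idea can be illustrated with a toy two-user downlink under Rayleigh fading and ideal successive interference cancellation (SIC). This is a Monte Carlo sketch with invented parameters, not the paper's stochastic-geometry analysis or its closed-form outage expression:

```python
import numpy as np

def noma_rates(p_total, a_far, g_near, g_far, noise=1.0):
    """Achievable downlink rates (bits/s/Hz) for a two-user power-domain
    NOMA pair. The far (weak) user gets the larger power fraction a_far and
    treats the near user's signal as interference; the near user cancels the
    far user's signal first (ideal SIC) and decodes its own interference-free."""
    a_near = 1.0 - a_far
    sinr_far = a_far * p_total * g_far / (a_near * p_total * g_far + noise)
    sinr_near = a_near * p_total * g_near / noise
    return np.log2(1.0 + sinr_near), np.log2(1.0 + sinr_far)

def outage_prob(rates, target):
    """Empirical outage: fraction of fading draws below the target rate."""
    return np.mean(rates < target)

# Rayleigh fading -> exponential channel power gains; near user is stronger
rng = np.random.default_rng(0)
g_near = rng.exponential(2.0, 100_000)
g_far = rng.exponential(0.5, 100_000)
r_near, r_far = noma_rates(p_total=10.0, a_far=0.8, g_near=g_near, g_far=g_far)
p_out_far = outage_prob(r_far, target=1.0)   # analytic value here is 1 - exp(-1/3)
```

Both users share the same time-frequency resource and are separated only in the power domain, which is what lets NOMA serve more sensors without extra spectrum.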

  10. Noncontrast-enhanced renal angiography using multiple inversion recovery and alternating TR balanced steady-state free precession.

    PubMed

    Dong, Hattie Z; Worters, Pauline W; Wu, Holden H; Ingle, R Reeve; Vasanawala, Shreyas S; Nishimura, Dwight G

    2013-08-01

Noncontrast-enhanced renal angiography techniques based on balanced steady-state free precession avoid external contrast agents, take advantage of high inherent blood signal from the T2/T1 contrast mechanism, and have short steady-state free precession acquisition times. However, background suppression is limited; inflow times are inflexible; the labeling region is difficult to define when tagging arterial flow; and scan times are long. To overcome these limitations, we propose the use of multiple inversion recovery preparatory pulses combined with alternating pulse repetition time balanced steady-state free precession to produce renal angiograms. Multiple inversion recovery uses selective spatial saturation followed by four nonselective inversion recovery pulses to concurrently null a wide range of background T1 species while allowing for adjustable inflow times; alternating pulse repetition time steady-state free precession maintains vessel contrast and provides added fat suppression. The high level of suppression enables imaging in three-dimensional as well as projective two-dimensional formats, the latter of which has a scan time as short as one heartbeat. In vivo studies at 1.5 T demonstrate the superior vessel contrast of this technique. © 2012 Wiley Periodicals, Inc.

  11. Integration between terrestrial-based and satellite-based land mobile communications systems

    NASA Technical Reports Server (NTRS)

Arcidiacono, Antonio

    1990-01-01

    A survey is given of several approaches to improving the performance and marketability of mobile satellite systems (MSS). The provision of voice/data services in the future regional European Land Mobile Satellite System (LMSS), network integration between the Digital Cellular Mobile System (GSM) and LMSS, the identification of critical areas for the implementation of integrated GSM/LMSS areas, space segment scenarios, LMSS for digital trunked private mobile radio (PMR) services, and code division multiple access (CDMA) techniques for a terrestrial/satellite system are covered.

  12. A Targeted "Capture" and "Removal" Scavenger toward Multiple Pollutants for Water Remediation based on Molecular Recognition.

    PubMed

    Wang, Jie; Shen, Haijing; Hu, Xiaoxia; Li, Yan; Li, Zhihao; Xu, Jinfan; Song, Xiufeng; Zeng, Haibo; Yuan, Quan

    2016-03-01

For water remediation techniques based on adsorption, the long-standing contradictions between selectivity and multiple adsorbability, as well as between affinity and recyclability, have left them on weak footing amid an increasingly severe environmental crisis. Here, a pollutant-targeting hydrogel scavenger is reported for water remediation with both high selectivity and multiple adsorbability for several pollutants, and with strong affinity and good recyclability, achieved by rationally integrating the advantages of multiple functional materials. In the scavenger, aptamers fold into binding pockets to accommodate the molecular structure of pollutants to afford perfect selectivity, and Janus nanoparticles with antibacterial function as well as anisotropic surfaces to immobilize multiple aptamers allow for simultaneously handling different kinds of pollutants. The scavenger exhibits high efficiencies in removing pollutants from water and it can be easily recycled many times without significant loss of loading capacity. Moreover, the residual concentrations of each contaminant are well below the drinking water standards. Thermodynamic behavior of the adsorption process is investigated and the rate-controlling process is determined. Furthermore, a point-of-use device is constructed and it displays high efficiency in removing pollutants from environmental water. The scavenger exhibits great promise for application in the next generation of water purification systems.

  13. An orientation soil survey at the Pebble Cu-Au-Mo porphyry deposit, Alaska

    USGS Publications Warehouse

    Smith, Steven M.; Eppinger, Robert G.; Fey, David L.; Kelley, Karen D.; Giles, S.A.

    2009-01-01

    Soil samples were collected in 2007 and 2008 along three traverses across the giant Pebble Cu-Au-Mo porphyry deposit. Within each soil pit, four subsamples were collected following recommended protocols for each of ten commonly-used and proprietary leach/digestion techniques. The significance of geochemical patterns generated by these techniques was classified by visual inspection of plots showing individual element concentration by each analytical method along the 2007 traverse. A simple matrix by element versus method, populated with a value based on the significance classification, provides a method for ranking the utility of methods and elements at this deposit. The interpretation of a complex multi-element dataset derived from multiple analytical techniques is challenging. An example of vanadium results from a single leach technique is used to illustrate the several possible interpretations of the data.
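The element-versus-method ranking matrix the abstract describes could be tabulated along these lines. The method names, significance classes, and scores below are entirely hypothetical, not the survey's actual leaches or classifications:

```python
# hypothetical significance classes from visual inspection of the plots:
# 2 = clear anomaly over the deposit, 1 = weak/ambiguous pattern, 0 = none
methods = ["aqua_regia", "four_acid", "enzyme_leach"]
elements = ["Cu", "Au", "Mo", "V"]
scores = {
    "aqua_regia":   [2, 2, 1, 0],
    "four_acid":    [2, 1, 2, 1],
    "enzyme_leach": [1, 2, 0, 0],
}

# rank methods by total significance across elements, and elements across methods
method_rank = sorted(methods, key=lambda m: -sum(scores[m]))
element_rank = sorted(elements,
                      key=lambda e: -sum(scores[m][elements.index(e)]
                                         for m in methods))
```

Summing a simple ordinal score over rows and columns is one way to turn the visual classification into a ranking of which leach/element pairs are most useful at the deposit.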

  14. Smart Sampling and HPC-based Probabilistic Look-ahead Contingency Analysis Implementation and its Evaluation with Real-world Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

This paper describes a probabilistic look-ahead contingency analysis application that incorporates smart sampling and high-performance computing (HPC) techniques. Smart sampling techniques are implemented to effectively represent the structure and statistical characteristics of uncertainty introduced by different sources in the power system. They can significantly reduce the data set size required for multiple look-ahead contingency analyses, and therefore reduce the time required to compute them. HPC techniques are used to further reduce computational time. These two techniques enable a predictive capability that forecasts the impact of various uncertainties on potential transmission limit violations. The developed package has been tested with real-world data from the Bonneville Power Administration. Case study results are presented to demonstrate the performance of the applications developed.
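One widely used smart-sampling technique is Latin hypercube sampling, which covers the uncertainty space with far fewer draws than plain Monte Carlo. The paper does not specify its sampling scheme, so the sketch below is illustrative only:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Latin hypercube sample on [0, 1)^n_dims: exactly one point per
    equal-probability stratum in every dimension, so each marginal
    distribution is evenly covered by construction."""
    strata = np.tile(np.arange(n_samples), (n_dims, 1))
    perms = rng.permuted(strata, axis=1).T            # independent column permutations
    return (perms + rng.random((n_samples, n_dims))) / n_samples

rng = np.random.default_rng(0)
u = latin_hypercube(16, 3, rng)   # 16 stratified draws of 3 uncertain inputs
```

Each uniform column can then be pushed through an inverse CDF to sample load, wind, or other uncertainty distributions before the contingency runs.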

  15. Monitoring damage growth in titanium matrix composites using acoustic emission

    NASA Technical Reports Server (NTRS)

    Bakuckas, J. G., Jr.; Prosser, W. H.; Johnson, W. S.

    1993-01-01

The application of the acoustic emission (AE) technique to locate and monitor damage growth in titanium matrix composites (TMC) was investigated. Damage growth was studied using several optical techniques, including a long-focal-length, high-magnification microscope system with image acquisition capabilities. Fracture surface examinations were conducted using a scanning electron microscope (SEM). The AE technique was used to locate damage based on the arrival times of AE events between two sensors. Using model specimens exhibiting a dominant failure mechanism, correlations were established between the observed damage growth mechanisms and the AE results in terms of event amplitude. These correlations were used to monitor the damage growth process in laminates exhibiting multiple modes of damage. Results revealed that the AE technique is a viable and effective tool to monitor damage growth in TMC.
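The two-sensor arrival-time location mentioned above reduces to a one-line formula; a sketch with illustrative units and numbers:

```python
def locate_ae_source(dt, sensor_spacing, wave_speed):
    """One-dimensional AE source location from two-sensor arrival times.
    With the source at distance x from sensor 1, arrivals are x/v and
    (L - x)/v, so dt = t2 - t1 = (L - 2x)/v and x = (L - v*dt)/2."""
    return (sensor_spacing - wave_speed * dt) / 2.0

# event hits sensor 1 first by 10 us; sensors 100 mm apart; 5 mm/us wave speed
x = locate_ae_source(dt=10.0, sensor_spacing=100.0, wave_speed=5.0)  # 25.0 mm
```

A zero time difference places the source midway between the sensors; the sign of dt tells which sensor the source is nearer.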

  16. Bronchoscopic modalities to diagnose sarcoidosis.

    PubMed

    Benzaquen, Sadia; Aragaki-Nakahodo, Alejandro Adolfo

    2017-09-01

    Several studies have investigated different bronchoscopic techniques to obtain tissue diagnosis in patients with suspected sarcoidosis when the diagnosis cannot be based on clinicoradiographic findings alone. In this review, we will describe the most recent and relevant evidence from different bronchoscopic modalities to diagnose sarcoidosis. Despite multiple available bronchoscopic modalities to procure tissue samples to diagnose sarcoidosis, the vast majority of evidence favors endobronchial ultrasound transbronchial needle aspiration to diagnose Scadding stages 1 and 2 sarcoidosis. Transbronchial lung cryobiopsy is a new technique that is mainly used to aid in the diagnosis of undifferentiated interstitial lung disease; however, we will discuss its potential use in sarcoidosis. This review illustrates the limited information about the different bronchoscopic techniques to aid in the diagnosis of pulmonary sarcoidosis. However, it demonstrates that the combination of available bronchoscopic techniques increases the diagnostic yield for suspected sarcoidosis.

  17. Anterior clinoidectomy using an extradural and intradural 2-step hybrid technique.

    PubMed

    Tayebi Meybodi, Ali; Lawton, Michael T; Yousef, Sonia; Guo, Xiaoming; González Sánchez, Jose Juan; Tabani, Halima; García, Sergio; Burkhardt, Jan-Karl; Benet, Arnau

    2018-02-23

Anterior clinoidectomy is a difficult yet essential technique in skull base surgery. Two main techniques (extradural and intradural) with multiple modifications have been proposed to increase efficiency and avoid complications. In this study, the authors sought to develop a hybrid technique based on localization of the optic strut (OS) to combine the advantages and avoid the disadvantages of both techniques. Ten cadaveric specimens were prepared for surgical simulation. After a standard pterional craniotomy, the anterior clinoid process (ACP) was resected in 2 steps. The segment anterior to the OS was resected extradurally, while the segment posterior to the OS was resected intradurally. The proposed technique was performed in 6 clinical cases to evaluate its safety and efficiency. Anterior clinoidectomy was successfully performed in all cadaveric specimens and all 6 patients by using the proposed technique. The extradural phase enabled early decompression of the optic nerve while avoiding the adjacent internal carotid artery. The OS was drilled intradurally under direct visualization of the adjacent neurovascular structures. The described landmarks were easily identifiable and applicable in the surgically treated patients. No operative complication was encountered. The proposed 2-step hybrid technique combines the advantages of the extradural and intradural techniques while avoiding their disadvantages. This technique allows reduced intradural drilling and subarachnoid bone dust deposition. Moreover, the most critical part of the clinoidectomy, that is, drilling of the OS and removal of the body of the ACP, is left for the intradural phase, when critical neurovascular structures can be directly viewed.

  18. A new formation control of multiple underactuated surface vessels

    NASA Astrophysics Data System (ADS)

    Xie, Wenjing; Ma, Baoli; Fernando, Tyrone; Iu, Herbert Ho-Ching

    2018-05-01

This work investigates a new formation control problem for multiple underactuated surface vessels. The controller design is based on the input-output linearisation technique, graph theory, consensus ideas and some nonlinear tools. The proposed smooth time-varying distributed control law guarantees that the multiple underactuated surface vessels globally exponentially converge to a desired geometric shape centred at the initial average position of the vessels. Furthermore, the stability analysis of the zero dynamics proves that the orientations of the vessels tend to constants that depend on the initial values of the vessels, and that the velocities and control inputs of the vessels decay to zero. All the results are obtained under the communication scenario of a static directed balanced graph with a spanning tree. Effectiveness of the proposed distributed control scheme is demonstrated using a simulation example.
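The consensus idea behind such formation laws can be sketched with first-order (single-integrator) agents; this is not the paper's underactuated-vessel dynamics or its smooth time-varying law, only an illustration of why the formation settles centred at the initial average position:

```python
import numpy as np

def consensus_formation(x0, offsets, adj, steps=200, dt=0.05):
    """First-order consensus on the formation errors e_i = x_i - d_i:
    each agent moves its error toward its neighbours' errors. On a connected
    balanced graph the errors agree on their initial average, so the agents
    settle into the offset pattern centred at the initial average position."""
    x = np.array(x0, dtype=float)
    d = np.asarray(offsets, dtype=float)
    deg = adj.sum(axis=1, keepdims=True)
    for _ in range(steps):
        e = x - d
        x = x + dt * (adj @ e - deg * e)   # x_i += dt * sum_j a_ij (e_j - e_i)
    return x

adj = np.array([[0, 1, 0, 1], [1, 0, 1, 0],
                [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)  # ring graph (balanced)
offsets = np.array([[1, 1], [-1, 1], [-1, -1], [1, -1]], dtype=float)  # square
x0 = np.array([[3, 0], [0, 2], [-2, 1], [1, -3]], dtype=float)
xf = consensus_formation(x0, offsets, adj)
```

Because the graph is balanced, the protocol preserves the average position exactly at every step, which is what anchors the final square at the vessels' initial centroid.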

  19. A Parallel Spectroscopic Method for Examining Dynamic Phenomena on the Millisecond Time Scale

    PubMed Central

    Snively, Christopher M.; Chase, D. Bruce; Rabolt, John F.

    2009-01-01

An infrared spectroscopic technique based on planar array infrared (PAIR) spectroscopy has been developed that allows the acquisition of spectra from multiple samples simultaneously. Using this technique, it is possible to acquire spectra over a spectral range of 950–1900 cm⁻¹ with a temporal resolution of 2.2 ms. The performance of this system was demonstrated by determining the shear-induced orientational response of several low molecular weight liquid crystals. Five different liquid crystals were examined in combination with five different alignment layers, and both primary and secondary screens were demonstrated. Implementation of this high throughput PAIR technique resulted in a reduction in acquisition time as compared to both step-scan and ultra-rapid-scanning FTIR spectroscopy. PMID:19239197

  20. Preparing Colorful Astronomical Images II

    NASA Astrophysics Data System (ADS)

    Levay, Z. G.; Frattare, L. M.

    2002-12-01

    We present additional techniques for using mainstream graphics software (Adobe Photoshop and Illustrator) to produce composite color images and illustrations from astronomical data. These techniques have been used on numerous images from the Hubble Space Telescope to produce photographic, print and web-based products for news, education and public presentation as well as illustrations for technical publication. We expand on a previous paper to present more detail and additional techniques, taking advantage of new or improved features available in the latest software versions. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to work with scaled images, masks, text and graphics in multiple semi-transparent layers and channels.
