Science.gov

Sample records for base extension technique

  1. A single base extension technique for the analysis of known mutations utilizing capillary gel electrophoresis with electrochemical detection.

    PubMed

    Brazill, Sara A; Kuhr, Werner G

    2002-07-15

    A novel single nucleotide polymorphism (SNP) detection system is described in which the accuracy of DNA polymerase and advantages of electrochemical detection are demonstrated. A model SNP system is presented to illustrate the potential advantages in coupling the single base extension (SBE) technique to capillary gel electrophoresis (CGE) with electrochemical detection. An electrochemically labeled primer, with a ferrocene acetate covalently attached to its 5' end, is used in the extension reaction. When the Watson-Crick complementary ddNTP is added to the SBE reaction, the primer is extended by a single nucleotide. The reaction mixture is subsequently separated by CGE, and the ferrocene-tagged fragments are detected at the separation anode with sinusoidal voltammetry. This work demonstrates the first single base resolution separation of DNA coupled with electrochemical detection. The unextended primer (20-mer) and the 21-mer extension product are separated with a resolution of 0.8. PMID:12139049

  2. DNA Microarray Based on Arrayed-Primer Extension Technique for Identification of Pathogenic Fungi Responsible for Invasive and Superficial Mycoses

    PubMed Central

    Campa, Daniele; Tavanti, Arianna; Gemignani, Federica; Mogavero, Crocifissa S.; Bellini, Ilaria; Bottari, Fabio; Barale, Roberto; Landi, Stefano; Senesi, Sonia

    2008-01-01

    An oligonucleotide microarray based on the arrayed-primer extension (APEX) technique has been developed to simultaneously identify pathogenic fungi frequently isolated from invasive and superficial infections. Species-specific oligonucleotide probes complementary to the internal transcribed spacer 1 and 2 (ITS1 and ITS2) region were designed for 24 species belonging to 10 genera, including Candida species (Candida albicans, Candida dubliniensis, Candida famata, Candida glabrata, Candida tropicalis, Candida kefyr, Candida krusei, Candida guilliermondii, Candida lusitaniae, Candida metapsilosis, Candida orthopsilosis, Candida parapsilosis, and Candida pulcherrima), Cryptococcus neoformans, Aspergillus species (Aspergillus fumigatus and Aspergillus terreus), Trichophyton species (Trichophyton rubrum and Trichophyton tonsurans), Trichosporon cutaneum, Epidermophyton floccosum, Fusarium solani, Microsporum canis, Penicillium marneffei, and Saccharomyces cerevisiae. The microarray was tested for its specificity with a panel of reference and blinded clinical isolates. The APEX technique was proven to be highly discriminative, leading to unequivocal identification of each species, including the highly related ones C. parapsilosis, C. orthopsilosis, and C. metapsilosis. Because of the satisfactory basic performance traits obtained, such as reproducibility, specificity, and unambiguous interpretation of the results, this new system represents a reliable method of potential use in clinical laboratories for parallel one-shot detection and identification of the most common pathogenic fungi. PMID:18160452

  3. A study of FM threshold extension techniques

    NASA Technical Reports Server (NTRS)

    Arndt, G. D.; Loch, F. J.

    1972-01-01

    The characteristics of three postdetection threshold extension techniques are evaluated with respect to the ability of such techniques to improve the performance of a phase lock loop demodulator. These techniques include impulse-noise elimination, signal correlation for the detection of impulse noise, and delta modulation signal processing. Experimental results from signal to noise ratio data and bit error rate data indicate that a 2- to 3-decibel threshold extension is readily achievable by using the various techniques. This threshold improvement is in addition to the threshold extension that is usually achieved through the use of a phase lock loop demodulator.

  4. Extension of an Itô-based general approximation technique for random vibration of a BBW general hysteresis model part II: Non-Gaussian analysis

    NASA Astrophysics Data System (ADS)

    Davoodi, H.; Noori, M.

    1990-07-01

    The work presented in this paper constitutes the second phase of ongoing research aimed at developing mathematical models for representing the general hysteretic behavior of structures, together with approximation techniques for computing and analyzing the response of hysteretic systems to random excitations. In this second part, the technique previously developed by the authors for the Gaussian response analysis of non-linear systems with general hysteretic behavior is extended to the non-Gaussian analysis of these systems. The approximation technique is based on the approach proposed independently by Ibrahim and Wu-Lin. In this work, up to fourth-order moments of the response co-ordinates are obtained for the Bouc-Baber-Wen smooth hysteresis model. Such higher-order statistics have not previously been available for general hysteresis models using existing approximation methods. Second-order moments obtained for the model by this non-Gaussian closure scheme are compared with equivalent linearization and Gaussian closure results via Monte Carlo simulation (MCS); higher-order moments are compared with the simulation results. The study, performed over a wide range of degradation parameters and input power spectral density (PSD) levels, shows that the non-Gaussian responses obtained by this approach agree better with the MCS results than the linearized and Gaussian ones. The technique can thus provide information on higher-order moments for general hysteretic systems, information that is valuable in the random vibration and reliability analysis of hysteretically yielding structures.
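As a rough illustration of the Monte Carlo baseline that the closure results above are compared against, the sketch below integrates a generic (non-degrading) Bouc-Wen oscillator under Gaussian white noise and estimates second- and fourth-order response moments. All parameter values are illustrative assumptions, and the degradation terms of the full BBW model are omitted.

```python
import numpy as np

def bouc_wen_mc(zeta=0.05, omega=1.0, alpha=0.5, A=1.0, beta=0.5,
                gamma=0.5, n=1, S0=0.05, dt=0.005, T=30.0, paths=200,
                seed=0):
    """Monte Carlo estimate of second- and fourth-order response
    moments for a SDOF Bouc-Wen hysteretic oscillator driven by
    Gaussian white noise (Euler-Maruyama integration)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(paths)   # displacement
    v = np.zeros(paths)   # velocity
    z = np.zeros(paths)   # hysteretic auxiliary variable
    for _ in range(int(T / dt)):
        # discretized white noise with two-sided PSD level S0
        f = rng.normal(0.0, np.sqrt(2.0 * np.pi * S0 / dt), paths)
        dz = A * v - beta * np.abs(v) * np.abs(z) ** (n - 1) * z \
             - gamma * v * np.abs(z) ** n
        dv = (-2.0 * zeta * omega * v - alpha * omega ** 2 * x
              - (1.0 - alpha) * omega ** 2 * z + f)
        x, v, z = x + v * dt, v + dv * dt, z + dz * dt
    return np.mean(x ** 2), np.mean(x ** 4)

m2, m4 = bouc_wen_mc()   # sample E[x^2] and E[x^4] at the end of the run
```

The ratio m4 / (3 * m2**2) deviating from 1 is one simple indicator of the non-Gaussianity that the closure scheme in the abstract is designed to capture.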

  5. A comparison of four streamflow record extension techniques.

    USGS Publications Warehouse

    Hirsch, R.M.

    1982-01-01

    One approach to developing time series of streamflow, which may be used for simulation and optimization studies of water resources development activities, is to extend an existing gage record in time by exploiting the interstation correlation between the station of interest and some nearby (long-term) base station. Four methods of extension are described, and their properties are explored. -from Author
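Record extension by interstation correlation can be illustrated with the MOVE.1 (maintenance-of-variance extension) line, one member of the family of techniques such comparisons cover. The sketch below is a minimal Python illustration of the idea, not the paper's exact formulation.

```python
import numpy as np

def move1_extend(short_y, concurrent_x, long_x):
    """Extend a short streamflow record from a long base-station record
    using the MOVE.1 (maintenance-of-variance) line.

    short_y      : flows at the short-record station (concurrent period)
    concurrent_x : flows at the base station over the same period
    long_x       : base-station flows for the period to be filled in
    """
    ybar, xbar = np.mean(short_y), np.mean(concurrent_x)
    sy = np.std(short_y, ddof=1)
    sx = np.std(concurrent_x, ddof=1)
    sign = np.sign(np.corrcoef(short_y, concurrent_x)[0, 1])
    # The MOVE.1 line preserves the mean and variance of the short
    # record, unlike OLS regression, which shrinks estimate variance.
    return ybar + sign * (sy / sx) * (np.asarray(long_x, dtype=float) - xbar)
```

For example, with a concurrent period where the base station carries exactly twice the short station's flow, a base-station value of 8 maps to an estimated 4 at the short-record station.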

  6. Aerodynamic measurement techniques. [laser based diagnostic techniques]

    NASA Technical Reports Server (NTRS)

    Hunter, W. W., Jr.

    1976-01-01

    The laser characteristics of intensity, monochromaticity, spatial coherence, and temporal coherence were exploited to advance laser-based diagnostic techniques for aerodynamics-related research. Two broad categories, visualization and optical measurements, were considered, and three techniques received significant attention: holography, laser velocimetry, and Raman scattering. Examples of quantitative laser velocimeter and Raman scattering measurements of velocity, temperature, and density indicated the potential of these nonintrusive techniques.

  7. Arena Roof Technique for Complex Reconstruction After Extensive Chest Wall Resection.

    PubMed

    Rocco, Gaetano; La Rocca, Antonello; La Manna, Carmine; Martucci, Nicola; De Luca, Giuseppe; Accardo, Rosanna

    2015-10-01

    Extensive primary resections or redos may produce significant chest wall defects requiring creative reconstructions in order to avoid reduction of the intrathoracic volume. We describe the successful use of an innovative technique for chest wall reconstruction based on the concept of roof coverage of sport arenas. In fact, titanium plates are anchored to the residual rib stumps along the parasternal and paravertebral lines. The acellular collagen matrix prosthesis was sutured to the free edges of the same titanium plates to create a roof, reproducing the chest wall dome geometric configuration. A 36-year-old female patient was diagnosed with an extensive desmoid tumor involving the lateral segments of second to fifth ribs on the right side. The arena roof technique allowed for adequate expansion of the uninvolved lung and optimal chest wall functional recovery. PMID:26434458

  8. Percutaneous nephrostomy with extensions of the technique: step by step.

    PubMed

    Dyer, Raymond B; Regan, John D; Kavanagh, Peter V; Khatod, Elaine G; Chen, Michael Y; Zagoria, Ronald J

    2002-01-01

    Minimally invasive therapy in the urinary tract begins with renal access by means of percutaneous nephrostomy. Indications for percutaneous nephrostomy include urinary diversion, treatment of nephrolithiasis and complex urinary tract infections, ureteral intervention, and nephroscopy and ureteroscopy. Bleeding complications can be minimized by entering the kidney in a relatively avascular zone created by branching of the renal artery. The specific site of renal entry is dictated by the indication for access with consideration of the anatomic constraints. Successful percutaneous nephrostomy requires visualization of the collecting system for selection of an appropriate entry site. The definitive entry site is then selected; ideally, the entry site should be subcostal and lateral to the paraspinous musculature. Small-bore nephrostomy tracks can be created over a guide wire coiled in the renal pelvis. A large-diameter track may be necessary for percutaneous stone therapy, nephroscopy, or antegrade ureteroscopy. The most common extension of percutaneous nephrostomy is placement of a ureteral stent for treatment of obstruction. Transient hematuria occurs in virtually every patient after percutaneous nephrostomy, but severe bleeding that requires transfusion or intervention is uncommon. In patients with an obstructed urinary tract complicated by infection, extensive manipulations pose a risk of septic complications. PMID:12006684

  9. Scan-Based Implementation of JPEG 2000 Extensions

    NASA Technical Reports Server (NTRS)

    Rountree, Janet C.; Webb, Brian N.; Flohr, Thomas J.; Marcellin, Michael W.

    2001-01-01

    JPEG 2000 Part 2 (Extensions) contains a number of technologies that are of potential interest in remote sensing applications. These include arbitrary wavelet transforms, techniques to limit boundary artifacts in tiles, multiple component transforms, and trellis-coded quantization (TCQ). We are investigating the addition of these features to the low-memory (scan-based) implementation of JPEG 2000 Part 1. A scan-based implementation of TCQ has been realized and tested, with a very small performance loss as compared with the full image (frame-based) version. A proposed amendment to JPEG 2000 Part 2 will effect the syntax changes required to make scan-based TCQ compatible with the standard.

  10. Flexible use and technique extension of logistics management

    NASA Astrophysics Data System (ADS)

    Xiong, Furong

    2011-10-01

    As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and is now expanding in China; this is the widely recognized track of modern logistics' historical development. Driven by China's economic and technological development, and by the construction of the Shanghai International Shipping Center and the Yangshan international deep-water port, China's modern logistics industry will develop at a strong, leap-forward pace and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of modern logistics management techniques in China, a topic of practical and guiding significance.

  11. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint.

    PubMed

    Du, Hong-Gen; Liao, Sheng-Hui; Jiang, Zhong; Huang, Huan-Ming; Ning, Xi-Tao; Jiang, Neng-Yi; Pei, Jian-Wei; Huang, Qin; Wei, Hui

    2016-05-01

    This study investigates the effect of a new Chinese massage technique named "press-extension" on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of the technique. Self-developed biomechanical software was used to establish a normal L1-S1 lumbar 3D FE model that integrated the anatomical structure from spine CT and MRI data. Graphic techniques were then used to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. Mechanical parameters collected from actual press-extension experiments were used to set the boundary conditions for the FE analysis. The results demonstrate that the press-extension technique exerts a clear induction effect on the annulus fibrosus, pushing the central nucleus pulposus forward and increasing the pressure in the anterior part. The study concludes that finite element modelling of the lumbar spine is suitable for analyzing the impact of the press-extension technique on lumbar intervertebral disc biomechanics, providing a basis for understanding the mechanism by which the technique acts on intervertebral disc herniation. PMID:27275119

  12. Extension and Home-Based Businesses.

    ERIC Educational Resources Information Center

    Loker, Suzanne; And Others

    1990-01-01

    Includes "Building Home Businesses in Rural Communities" (Loker et al.); "Home-Based Business...A Means to Economic Growth in Rural Areas" (Bastow-Shoop et al.); "Business Not As Usual" (Millar, Mallilo); and "Economic Options for Farm Families" (Williams). (SK)

  13. Space-based observation of the extensive airshowers

    NASA Astrophysics Data System (ADS)

    Ebisuzaki, T.

    2013-06-01

    Space-based observations of extensive air showers constitute the next experimental challenge for the study of the universe at extreme energies. Space observation will allow a "quantum jump" in the observational area available to detect the UV light tracks produced by particles with energies higher than 10^20 eV. These particles are thought to reach the Earth almost undeflected by cosmic magnetic fields. This new technique will contribute to establishing a new field of astronomy and astrophysics performed with charged particles and neutrinos at the highest energies. The idea was created by the incredible efforts of three outstanding cosmic ray physicists: John Linsley, Livio Scarsi, and Yoshiyuki Takahashi. This challenging technique has four significant merits in comparison with ground-based observation: 1) a very large observational area; 2) well-constrained shower distances; 3) clear and stable atmospheric transmission in the upper half of the troposphere; 4) uniform exposure across both the northern and southern skies. Four proposed and planned missions constitute the community's roadmap: TUS, JEM-EUSO, KLPVE, and Super-EUSO will contribute step by step to establishing this challenging field of research.

  14. Research on Customer Value Based on Extension Data Mining

    NASA Astrophysics Data System (ADS)

    Chun-Yan, Yang; Wei-Hua, Li

    Extenics is a new discipline for dealing with contradiction problems using formalized models. Extension data mining (EDM) is a product combining Extenics with data mining. It seeks to acquire knowledge based on extension transformations, called extension knowledge (EK), by taking advantage of extension methods and data mining technology. EK includes extensible classification knowledge, conductive knowledge, and so on. Extension data mining technology (EDMT) is a new data mining technology that mines EK in databases or data warehouses. Customer value (CV) weighs the importance of a customer relationship for an enterprise, taking the enterprise as the subject that assesses value and customers as the objects whose value is assessed. CV varies continually. Mining the changing knowledge of CV in databases using EDMT, including quantitative-change and qualitative-change knowledge, can provide a foundation for an enterprise's customer relationship management (CRM) strategy decisions. It can also provide a new approach to studying CV.

  15. Primer Extension Reactions for the PCR-Based α-Complementation Assay

    PubMed Central

    Achuthan, Vasudevan; DeStefano, Jeffrey J.

    2016-01-01

    The PCR-based α-complementation assay is an effective technique for measuring the fidelity of polymerases, especially RNA-dependent RNA polymerases (RDRPs) and reverse transcriptases (RTs). It has been successfully employed to determine the fidelity of the poliovirus polymerase 3D-pol (DeStefano, 2010) as well as the human immunodeficiency virus reverse transcriptase (HIV RT) (Achuthan et al., 2014). A major advantage of the assay is that, because it includes a PCR step, even the low yield of products obtained after two rounds of RNA synthesis (for RDRPs) or reverse transcription (for RTs) can be measured. The assay also mimics the reverse transcription process, since both RNA- and DNA-directed RT synthesis steps are performed. We recently used this assay to show that HIV RT, at a physiologically relevant magnesium concentration, has accuracy in the same range as other reverse transcriptases (Achuthan et al., 2014). Here, we describe in detail how to prepare the inserts using primer extension reactions. The prepared inserts are then processed further in the PCR-based α-complementation assay.

  16. Availability analysis and design of storage extension based on CWDM

    NASA Astrophysics Data System (ADS)

    Qin, Leihua; Yu, Yan

    2007-11-01

    As Fibre Channel becomes the key storage protocol for SANs (Storage Area Networks), enterprises are increasingly deploying FC SANs in their data centers. Meanwhile, organizations face an enormous influx of data that must be stored, protected, backed up, and replicated to mitigate the risk of data loss. One of the best ways to achieve this is to deploy SAN extension based on CWDM (Coarse Wavelength Division Multiplexing). Availability is a key performance metric for business continuity and disaster recovery and must be well understood by IT departments when deploying CWDM-based SAN extension, for it determines the accessibility of remotely located data sites. In this paper, several architectures for storage extension over CWDM are analyzed and their availabilities are calculated. Furthermore, two high-availability storage extension architectures with 1:1 or 1:N protection are designed, and the availability of these protected CWDM-based storage extension schemes is calculated as well.
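The availability arithmetic behind such an analysis is standard: component availability from MTBF and MTTR, series composition by multiplication, and 1:1 protection by parallel composition. The sketch below illustrates it with hypothetical component figures, not values from the paper.

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability of a single component."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series(*avails):
    """All components needed in the path: availabilities multiply."""
    a = 1.0
    for x in avails:
        a *= x
    return a

def parallel(*avails):
    """Redundant paths (e.g. 1:1 protection): fail only if all fail."""
    u = 1.0
    for x in avails:
        u *= (1.0 - x)
    return 1.0 - u

# Hypothetical figures: transceiver, CWDM mux/demux, fibre span.
link = series(availability(200_000, 4),
              availability(500_000, 4),
              availability(100_000, 24))
protected = parallel(link, link)   # 1:1 protection over two such links
```

A 1:N protection scheme would be modeled the same way, with one standby path shared among N working paths, which is why its availability falls between the unprotected and 1:1 cases.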

  17. Epidural volume extension: A novel technique and its efficacy in high risk cases.

    PubMed

    Tiwari, Akhilesh Kumar; Singh, Rajeev Ratan; Anupam, Rudra Pratap; Ganguly, S; Tomar, Gaurav Singh

    2012-01-01

    We present a unique case series restricting ourselves only to the high-risk case of different specialities who underwent successful surgery in our Institute by using epidural volume extension's technique using 1 mL of 0.5% ropivacaine and 25 μg of fentanyl. PMID:25885627

  18. Designing and application of SAN extension interface based on CWDM

    NASA Astrophysics Data System (ADS)

    Qin, Leihua; Yu, Shengsheng; Zhou, Jingli

    2005-11-01

    As Fibre Channel (FC) becomes the protocol of choice within corporate data centers, enterprises are increasingly deploying SANs. To mitigate the risk of losing data and improve data availability, more and more enterprises are adopting storage extension technologies to replicate their business-critical data to a secondary site. Transmitting this information over distance requires a carrier-grade environment with zero data loss, scalable throughput, low jitter, high security, and the ability to travel long distances. To address these business requirements, there are three basic architectures for storage extension: storage over Internet Protocol, storage over Synchronous Optical Network/Synchronous Digital Hierarchy (SONET/SDH), and storage over Dense Wavelength Division Multiplexing (DWDM). Each approach varies in functionality, complexity, cost, scalability, security, availability, predictable behavior (bandwidth, jitter, latency), and multiple-carrier limitations. Compared with these connectivity technologies, Coarse Wavelength Division Multiplexing (CWDM) offers a simplified, low-cost, high-performance connectivity solution for enterprises deploying storage extension. In this paper, we design a storage extension connection over CWDM and test its electrical characteristics and the random read and write performance of a disk array across the CWDM link; the test results show that the performance of the CWDM connection is acceptable. Furthermore, we propose three network architectures for CWDM-based SAN extension. Finally, the credit-based flow control mechanism of FC and the relationship between credits and extension distance are analyzed.

  19. Extension and Home-Based Business: A Collaborative Approach.

    ERIC Educational Resources Information Center

    Burns, Marilyn; Biers, Karen

    1991-01-01

    The Center for Home-Based Entrepreneurship at Oklahoma State University developed from collaborative efforts of extension, government agencies, business associations, and the vo-tech system. It provides education, directories, information services, and other assistance to people interested in establishing businesses in their homes. (SK)

  20. An Extension Dynamic Model Based on BDI Agent

    NASA Astrophysics Data System (ADS)

    Yu, Wang; Feng, Zhu; Hua, Geng; WangJing, Zhu

    This paper's research is based on the BDI Agent model. It first analyzes the deficiencies of the traditional BDI Agent model and then proposes an extension dynamic model of the BDI Agent built on the traditional one. The new model can quickly achieve the internal interactions of the traditional BDI Agent model, deal with complex issues in dynamic and open environments, and react quickly. The model is shown to be natural and reasonable by using it to model the origin of civilization with the example of monkeys eating sweet potatoes, based on the design of the extension dynamic model. Its feasibility is verified by comparing the extended dynamic BDI Agent model with the traditional BDI Agent model in SWARM; the work has important theoretical significance.

  1. Attitudes of Purdue Extension Field-Based Professionals and County Extension Board Members towards the Internationalization of Extension

    ERIC Educational Resources Information Center

    Rice, William Charles

    2009-01-01

    Communities and counties in Indiana are continuing to become more diverse. An influx of immigrants over the past years is very evident in some locations. They come for good employment or because they have been displaced from their homelands. In any case, they soon become a part of the fabric of each community in which they settle. Extension has…

  2. Anterior Ridge Extension Using Modified Kazanjian Technique in Mandible- A Clinical Study

    PubMed Central

    Kumar, Jagannadham Vijay; Chakravarthi, Pandi Srinivas; Sridhar, Meka; Devi, Kolli Naga Neelima; Lingamaneni, Krishna Prasad

    2016-01-01

    Introduction A good alveolar ridge is a prerequisite for a successful conventional or implant-supported partial/complete denture. Extensively resorbed ridges with a shallow vestibule and high insertion of muscles into the ridge crest lead to failure of the prosthesis. Success of the prosthesis depends on surgical repositioning of the mucosa and muscle insertions, which increases the depth of the vestibule and the denture flange area for retention. The study was therefore planned to provide good attached gingiva with adequate vestibular depth using Modified Kazanjian Vestibuloplasty (MKV). Aim To evaluate the efficacy of the MKV technique for increasing vestibular depth in the anterior mandible so that a successful prosthesis can be delivered. Efficacy was evaluated through the operating time required, the vestibular depth achieved, scarring or relapse, and any postoperative complications associated with healing. Materials and Methods A total of 10 patients with a minimum of 20 mm of bone height and less than 5 mm of vestibular depth were included in the study for the MKV procedure. The results were tabulated and statistical analysis was carried out to assess the vestibular depth achieved, i.e., from the crest of the ridge to the junction of the attached mucosa, both pre- and postoperatively. The study results were compared with the existing literature. Results Healing of the raw surface was uneventful, with satisfactory vestibular depth achieved. The average gain in vestibular depth was 11 mm. Patients reported good satisfaction with the prosthesis. Conclusion Even in the era of implant prostheses, the Modified Kazanjian technique is worth practicing to achieve good results, and overcorrection is not required as it is with the standard Kazanjian technique. It provides adequate attached gingiva for a successful prosthesis. Extension of the vestibular depth enables fabrication of a better denture flange with improved oral hygiene. The technique does not require hospitalization or additional surgery for grafts. PMID:27042579

  3. The Search for Extension: 7 Steps to Help People Find Research-Based Information on the Internet

    ERIC Educational Resources Information Center

    Hill, Paul; Rader, Heidi B.; Hino, Jeff

    2012-01-01

    For Extension's unbiased, research-based content to be found by people searching the Internet, it needs to be organized in a way conducive to the ranking criteria of a search engine. With proper web design and search engine optimization techniques, Extension's content can be found, recognized, and properly indexed by search engines and…

  4. A useful technique for adjusting nasal tip projection in Asian rhinoplasty: Trapezoidal caudal extension cartilage grafting.

    PubMed

    Liu, Shao-Cheng; Lin, Deng-Shan; Wang, Hsing-Won; Kao, Chuan-Hsiang

    2016-01-01

    The purpose of this article is to present our experience with Asian patients in (1) using a trapezoidal caudal extension cartilage graft to adjust the tip projection in tip refinement for augmentation rhinoplasty, especially for the correction of short nose, and (2) avoiding complications of augmentation rhinoplasty with alloplastic implants. We conducted a retrospective chart review of 358 rhinoplasties that were performed by the corresponding author from January 2004 through July 2009. Patients were included in this study if they had undergone open rhinoplasty with a trapezoidal caudal extension cartilage graft as the only tip-modifying procedure. Patients in whom any additional grafting was performed that might have altered the nasal tip position were excluded. The surgical results were analyzed in terms of the degree of satisfaction judged separately by investigators and by patients. A total of 84 patients (46 males and 38 females, all Asians, aged 13 to 61 years; mean: 29.3) met our eligibility criteria. Postoperative follow-up for 24 months was achieved in 62 patients. At the 24-month follow-up, the surgeons judged the results to be good or very good in 57 of the 62 patients (91.9%); at the same time, 56 patients (90.3%) said they were satisfied or very satisfied with their aesthetic outcome. Good nasal tip projection, a natural columellar appearance, and improvement in the nasolabial angle were achieved for most patients. Two patients required revision rhinoplasty to correct an insufficient augmentation and migration of the onlay graft. No severe complications were observed during the 2-year follow-up. We have found that trapezoidal caudal extension cartilage grafting in nasal tip refinement is an easy technique to learn and execute, its results are predictable, and it has been associated with no major complications. We recommend trapezoidal caudal extension cartilage grafting for Asian patients as a good and reliable alternative for managing tip projection.

  5. Project Milestone. Analysis of Range Extension Techniques for Battery Electric Vehicles

    SciTech Connect

    Neubauer, Jeremy; Wood, Eric; Pesaran, Ahmad

    2013-07-01

    This report documents completion of the July 2013 milestone as part of NREL’s Vehicle Technologies Annual Operating Plan with the U.S. Department of Energy. The objective was to perform analysis on range extension techniques for battery electric vehicles (BEVs). This work represents a significant advancement over previous thru-life BEV analyses using NREL’s Battery Ownership Model, FastSim,* and DRIVE.* Herein, the ability of different charging infrastructure to increase achievable travel of BEVs in response to real-world, year-long travel histories is assessed. Effects of battery and cabin thermal response to local climate, battery degradation, and vehicle auxiliary loads are captured. The results reveal the conditions under which different public infrastructure options are most effective, and encourage continued study of fast charging and electric roadway scenarios.

  6. Automatic assessment of mitral regurgitation severity based on extensive textural features on 2D echocardiography videos.

    PubMed

    Moghaddasi, Hanie; Nourian, Saeed

    2016-06-01

    Heart disease is the major cause of death as well as a leading cause of disability in the developed countries. Mitral Regurgitation (MR) is a common heart disease which does not cause symptoms until its end stage. Therefore, early diagnosis of the disease is of crucial importance in the treatment process. Echocardiography is a common method of diagnosing the severity of MR. Hence, a method based on echocardiography videos, image processing techniques, and artificial intelligence could be helpful for clinicians, especially in borderline cases. In this paper, we introduce novel features to detect micro-patterns of echocardiography images in order to determine the severity of MR. Extensive Local Binary Pattern (ELBP) and Extensive Volume Local Binary Pattern (EVLBP) are presented as image descriptors which include details from different viewpoints of the heart in feature vectors. Support Vector Machine (SVM), Linear Discriminant Analysis (LDA) and Template Matching techniques are used as classifiers to determine the severity of MR based on textural descriptors. The SVM classifier with Extensive Uniform Local Binary Pattern (ELBPU) and Extensive Volume Local Binary Pattern (EVLBP) achieves the best accuracy, with 99.52%, 99.38%, 99.31%, and 99.59%, respectively, for the detection of Normal, Mild MR, Moderate MR, and Severe MR subjects among echocardiography videos. The proposed method achieves 99.38% sensitivity and 99.63% specificity for the detection of the severity of MR and normal subjects. PMID:27082766
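The "extensive" descriptors in the abstract build on the basic Local Binary Pattern. A minimal sketch of the plain 8-neighbour LBP code and its histogram feature follows; the ELBP/EVLBP variants extend this idea across heart viewpoints and video volumes, which the sketch does not attempt.

```python
import numpy as np

def lbp_8neighbour(img):
    """Basic 8-neighbour Local Binary Pattern codes for the interior
    pixels of a 2-D image: bit i is set when the i-th neighbour is
    greater than or equal to the centre pixel."""
    img = np.asarray(img, dtype=float)
    c = img[1:-1, 1:-1]                       # centre pixels
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]  # clockwise neighbours
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dr, dc) in enumerate(offs):
        nb = img[1 + dr:img.shape[0] - 1 + dr,
                 1 + dc:img.shape[1] - 1 + dc]
        code |= ((nb >= c).astype(np.uint8) << bit)
    return code

# 256-bin texture histogram: the feature vector fed to a classifier.
hist = np.bincount(lbp_8neighbour(np.arange(25).reshape(5, 5)).ravel(),
                   minlength=256)
```

In a pipeline like the one described, such histograms (concatenated across views or frames) would form the feature vectors passed to the SVM or LDA classifier.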

  7. Extension of the broadband single-mode integrated optical waveguide technique to the ultraviolet spectral region and its applications.

    PubMed

    Wiederkehr, Rodrigo S; Mendes, Sergio B

    2014-03-21

    We report here the fabrication, characterization, and application of a single-mode integrated optical waveguide (IOW) spectrometer capable of acquiring optical absorbance spectra of surface-immobilized molecules in the visible and ultraviolet spectral region down to 315 nm. The UV-extension of the single-mode IOW technique to shorter wavelengths was made possible by our development of a low-loss single-mode dielectric waveguide in the UV region based on an alumina film grown by atomic layer deposition (ALD) over a high quality fused silica substrate, and by our design/fabrication of a broadband waveguide coupler formed by an integrated diffraction grating combined with a highly anamorphic optical beam of large numerical aperture. As an application of the developed technology, we report here the surface adsorption process of bacteriochlorophyll a on different interfaces using its Soret absorption band centred at 370 nm. The effects of different chemical compositions at the solid-liquid interface on the adsorption and spectral properties of bacteriochlorophyll a were determined from the polarized UV-Vis IOW spectra acquired with the developed instrumentation. The spectral extension of the single-mode IOW technique into the ultraviolet region is an important advance as it enables extremely sensitive studies in key characteristics of surface molecular processes (e.g., protein unfolding and solvation of aromatic amino-acid groups under surface binding) whose spectral features are mainly located at wavelengths below the visible spectrum. PMID:24466569

  8. Extension of the broadband single-mode integrated optical waveguide technique to the ultraviolet spectral region and its applications

    PubMed Central

    Wiederkehr, Rodrigo S.; Mendes, Sergio B.

    2014-01-01

    We report here the fabrication, characterization, and application of a single-mode integrated optical waveguide (IOW) spectrometer capable of acquiring optical absorbance spectra of surface-immobilized molecules in the visible and ultraviolet spectral region down to 315 nm. The UV-extension of the single-mode IOW technique to shorter wavelengths was made possible by our development of a low-loss single-mode dielectric waveguide in the UV region based on an alumina film grown by atomic layer deposition (ALD) over a high quality fused silica substrate, and by our design/fabrication of a broadband waveguide coupler formed by an integrated diffraction grating combined with a highly anamorphic optical beam of large numerical aperture. As an application of the developed technology, we report here the surface adsorption process of bacteriochlorophyll a on different interfaces using its Soret absorption band centred at 370 nm. The effects of different chemical compositions at the solid/liquid interface on the adsorption and spectral properties of bacteriochlorophyll a were determined from the polarized UV-Vis IOW spectra acquired with the developed instrumentation. The spectral extension of the single-mode IOW technique into the ultraviolet region is an important advance as it enables extremely sensitive studies in key characteristics of surface molecular processes (e.g., protein unfolding and solvation of aromatic amino-acid groups under surface binding) whose spectral features are mainly located at wavelengths below the visible spectrum. PMID:24466569

  9. Feature-Based Registration Techniques

    NASA Astrophysics Data System (ADS)

    Lorenz, Cristian; Klinder, Tobias; von Berg, Jens

    In contrast to intensity-based image registration, where a similarity measure is typically evaluated at each voxel location, feature-based registration works on a sparse set of image locations. It therefore needs an explicit interpolation step to supply a dense deformation field. In this chapter, the application of feature-based registration to pulmonary image registration is discussed, as well as hybrid methods that combine feature-based with intensity-based registration. In contrast to pure feature-based registration methods, hybrid methods are increasingly proposed in the pulmonary context and have the potential to outperform purely intensity-based registration methods. Available approaches are classified along the categories of feature type, correspondence definition, and interpolation type used to finally achieve a dense deformation field.
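
    The explicit interpolation step described above, turning sparse feature correspondences into a dense deformation field, can be illustrated with a simple inverse-distance-weighting scheme (a stand-in sketch; practical registration pipelines typically use thin-plate splines or B-splines):

```python
import numpy as np

def dense_field_idw(points, disps, grid_shape, power=2.0, eps=1e-12):
    """Interpolate sparse feature displacements to a dense deformation
    field by inverse-distance weighting.

    points : (N, 2) feature locations (row, col)
    disps  : (N, 2) displacement vectors found at those locations
    Returns an array of shape (rows, cols, 2) with a displacement at
    every grid node.
    """
    points = np.asarray(points, dtype=float)
    disps = np.asarray(disps, dtype=float)
    rows, cols = np.mgrid[0:grid_shape[0], 0:grid_shape[1]]
    grid = np.stack([rows.ravel(), cols.ravel()], axis=1).astype(float)
    # Pairwise distances between every grid node and every feature point.
    d = np.linalg.norm(grid[:, None, :] - points[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)
    w /= w.sum(axis=1, keepdims=True)   # normalize weights per grid node
    field = w @ disps
    return field.reshape(grid_shape[0], grid_shape[1], 2)
```

    Spline-based interpolants give smoother fields; the weighting above is chosen only because it makes the sparse-to-dense step explicit in a few lines.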

  10. Extensibility in Model-Based Business Process Engines

    NASA Astrophysics Data System (ADS)

    Sánchez, Mario; Jiménez, Camilo; Villalobos, Jorge; Deridder, Dirk

    An organization's ability to embrace change greatly depends on the systems that support its operation. Specifically, process engines might facilitate or hinder changes, depending on their flexibility, their extensibility, and the changes required: current workflow engine characteristics create difficulties in organizations that need to incorporate some types of modifications. In this paper we present Cumbia, an extensible MDE platform to support the development of flexible and extensible process engines. In a Cumbia process, models represent participating concerns (control, resources, etc.), which are described with concern-specific languages. Cumbia models are executed in a coordinated way, using extensible engines specialized for each concern.

  11. Service-based extensions to the JDL fusion model

    NASA Astrophysics Data System (ADS)

    Antony, Richard T.; Karakowski, Joseph A.

    2008-04-01

    Extensions to a previously developed service-based fusion process model are presented. The model accommodates (1) traditional sensor data and human-generated input, (2) streaming and non-streaming data feeds, and (3) the fusion of both physical and non-physical entities. More than a dozen base-level fusion services are identified. These services provide the foundational functional decomposition of levels 0-2 in the JDL fusion model. Concepts, such as clustering, link analysis and database mining, that have traditionally been only loosely associated with the fusion process, are shown to play key roles within this fusion framework. Additionally, the proposed formulation extends the concepts of tracking and cross-entity association to non-physical entities, as well as supports effective exploitation of a priori and derived context knowledge. Finally, the proposed framework is shown to support set theoretic properties, such as equivalence and transitivity, as well as the development of a pedigree summary metric that characterizes the informational distance between individual fused products and source data.
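
    The set-theoretic properties mentioned (equivalence, transitivity) are what make pairwise association results composable into entity groups. A union-find sketch of transitive cross-entity association (illustrative only; the paper's services are not specified at this level of detail):

```python
class EntityAssociator:
    """Group pairwise entity associations into equivalence classes with
    union-find, so that association is reflexive, symmetric, and
    transitive, as the fusion framework's set-theoretic properties
    require. Entity names here are hypothetical."""

    def __init__(self):
        self.parent = {}

    def find(self, x):
        """Return the representative of x's equivalence class."""
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def associate(self, a, b):
        """Record that a and b refer to the same underlying entity."""
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

    def same_entity(self, a, b):
        return self.find(a) == self.find(b)
```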

  12. Extension Clientele Preferences: Accessing Research-Based Information Online

    ERIC Educational Resources Information Center

    Davis, Jamie M.

    2014-01-01

    Research has indicated there are a number of benefits to Extension educators in delivering educational program and content through distance technology methods. However, Extension educators are commonly apprehensive about this transition due to assumptions made about their clientele, because little research has been conducted to examine…

  13. An extension to the dynamic plane source technique for measuring thermal conductivity, thermal diffusivity, and specific heat of dielectric solids

    NASA Astrophysics Data System (ADS)

    Karawacki, Ernest; Suleiman, Bashir M.; ul-Haq, Izhar; Nhi, Bui-Thi

    1992-10-01

    The recently developed dynamic plane source (DPS) technique for simultaneous determination of the thermal properties of fast thermally conducting materials with thermal conductivities between 200 and 2 W/mK has now been extended for studying relatively slow conducting materials with thermal conductivities equal to or below 2 W/mK. The method is self-checking since the thermal conductivity, thermal diffusivity, specific heat, and effusivity of the material are obtained independently of each other. The theory of the technique and the experimental arrangement are given in detail. The data evaluation procedure is simple and makes it possible to reveal the distortions due to nonideal experimental conditions. The extension to the DPS technique has been implemented at room temperature to study samples of the cordierite-based ceramic Cecorite 130P (thermal conductivity equal to 1.48 W/mK), rubber (0.403 W/mK), and polycarbonate (0.245 W/mK). The accuracy of the method is within ±5%.
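
    The self-checking property comes from the identities a = k/(ρc) and e = √(k·ρc), which link the independently measured quantities. A minimal cross-check, using the paper's Cecorite 130P conductivity and an otherwise hypothetical volumetric heat capacity:

```python
import math

def consistency_check(k, a, rho_c, e, tol=0.05):
    """Check independently measured thermal properties for mutual consistency.

    k     : thermal conductivity [W/(m*K)]
    a     : thermal diffusivity [m^2/s]
    rho_c : volumetric specific heat rho*c [J/(m^3*K)]
    e     : effusivity [W*s^0.5/(m^2*K)]
    The quantities are linked by a = k/(rho*c) and e = sqrt(k*rho*c),
    hence also e = k/sqrt(a); agreement within `tol` (here the paper's
    +/-5% accuracy) indicates a sound measurement.
    """
    checks = [
        abs(a - k / rho_c) / a,
        abs(e - math.sqrt(k * rho_c)) / e,
        abs(e - k / math.sqrt(a)) / e,
    ]
    return max(checks) <= tol

# Cecorite 130P conductivity from the paper; rho_c is hypothetical,
# chosen only to generate an internally consistent example:
k, rho_c = 1.48, 2.0e6
a = k / rho_c               # m^2/s
e = math.sqrt(k * rho_c)    # W*s^0.5/(m^2*K)
```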

  14. Comparison of Three Techniques to Monitor Bathymetric Evolution in a Spatially Extensive, Rapidly Changing Environment

    NASA Astrophysics Data System (ADS)

    Rutten, J.; Ruessink, G.

    2014-12-01

    The wide variety in spatial and temporal scales inherent to nearshore morphodynamics, together with site-specific environmental characteristics, complicate our current understanding and predictive capability of large (~ km)-scale, long-term (seasons-years) sediment transport patterns and morphologic evolution. The monitoring of this evolution at all relevant scales demands a smart combination of multiple techniques. Here, we compare depth estimates derived from operational optical (Argus video) and microwave (X-band radar) remote sensing with those from jet-ski echo-sounding in an approximately 2.5 km² region at the Sand Engine, a 20 Mm³ mega-nourishment at the Dutch coast. Using depth inversion techniques based on linear wave theory, frequent (hourly-daily) bathymetric maps were derived from instantaneous Argus video and X-band radar imagery. Jet-ski surveys were available every 2 to 3 months. Depth inversion on Argus imagery overestimates surveyed depths by up to 0.5 m in shallow water (< 2 m), but underestimates larger water depths (> 5 m) by up to 1 m. Averaged over the entire subtidal study area, the errors canceled in volumetric budget computations. Additionally, estimates of shoreline and subtidal sandbar positions were derived from Argus imagery and jet-ski surveys. Sandbar crest positions extracted from daily low-tide time-exposure Argus images reveal a persistent onshore offset of some 20 m, but do show the smaller temporal variability not visible from jet-ski surveys. Potential improvements to the applied depth-inversion technique will be discussed.
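
    Depth inversion "based on linear wave theory" rests on the dispersion relation ω² = gk·tanh(kh): with the wave period and wavelength observed in imagery, the relation can be solved for the depth h. A bare-bones sketch of the principle (operational video/radar inversion schemes add substantial signal processing on top):

```python
import math

G = 9.81  # gravitational acceleration [m/s^2]

def invert_depth(period, wavelength):
    """Invert water depth h from the linear dispersion relation
        omega^2 = G * k * tanh(k * h),
    given a wave period [s] and wavelength [m] estimated from imagery.
    """
    omega = 2.0 * math.pi / period
    k = 2.0 * math.pi / wavelength
    x = omega ** 2 / (G * k)  # must equal tanh(k*h), so 0 < x < 1
    if not 0.0 < x < 1.0:
        raise ValueError("wave is effectively in deep water; depth unresolved")
    return math.atanh(x) / k

# Round trip under known conditions: k = 0.1 rad/m in 5 m of water.
h_true, k = 5.0, 0.1
omega = math.sqrt(G * k * math.tanh(k * h_true))
h_est = invert_depth(2.0 * math.pi / omega, 2.0 * math.pi / k)  # ~5.0 m
```

    The guard clause reflects a real limitation of the technique: in deep water the waves no longer feel the bottom, so the depth cannot be resolved.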

  15. Technique for Extension of Small Antenna Array Mutual-Coupling Data to Larger Antenna Arrays

    NASA Technical Reports Server (NTRS)

    Bailey, M. C.

    1996-01-01

    A technique is presented whereby the mutual interaction between a small number of elements in a planar array can be interpolated and extrapolated to accurately predict the combined interactions in a much larger array of many elements. An approximate series expression is developed, based upon knowledge of the analytical characteristic behavior of the mutual admittance between small aperture antenna elements in a conducting ground plane. This expression is utilized to analytically extend known values for a few spacings and orientations to other element configurations, thus eliminating the need to numerically integrate a large number of highly oscillating and slowly converging functions. This paper shows that the technique can predict very accurately the mutual coupling between elements in a very large planar array with knowledge of the self-admittance of an isolated element and the coupling between only two elements arranged in eight different pair combinations. These eight pair combinations do not necessarily have to correspond to pairs in the large array, although all of the individual elements must be identical.

  16. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  17. Two dimensional restoration of seismic reflection profiles from Mozambique: technique for assessing rift extension histories

    SciTech Connect

    Iliffe, J.E.; Debuyl, M.; Kendall, C.G.St.C.; Lerche, I.

    1986-05-01

    Seismic reflection data from offshore Mozambique between longitudes 25° and 26° and latitudes 34° and 35° reveals a V-shaped rift, the apex of which points northward, toward the coast. This study retraces the rift's extensional history by geometric reconstruction of seismic profiles, selected perpendicular to tectonic strike. Depth conversions are performed, followed by bed length and volume balancing to test the interpretations and calculate a total value for the extension factor. The sediments are then backstripped in sedimentary sequences, restoring the increments of throw on faults accordingly. After each sequence is removed, the sediments are decompacted in an attempt to recover the original volume prior to the sequence deposition. The extension factor is again calculated. This process is repeated down the sequences until the result is the pre-rift state of the basin. This analysis results in an extension estimate for each sequence-time increment, as a percentage of the total extension. From this method, a detailed crustal extension history is deduced, which, when coupled to the thermal history from subsidence backstripping and paleoheatflow studies, could be used in the basin analysis assessment of the oil potential of this and other rifts.
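
    The per-sequence bookkeeping described above reduces to comparing restored bed lengths. A sketch with hypothetical lengths (the paper derives these from balanced, decompacted seismic profiles):

```python
def extension_history(lengths):
    """Given restored cross-section bed lengths from oldest (pre-rift)
    to present, return the total extension factor (beta) and each
    sequence increment's share of the total extension.
    """
    beta_total = lengths[-1] / lengths[0]
    total_ext = lengths[-1] - lengths[0]
    shares = [
        (lengths[i + 1] - lengths[i]) / total_ext
        for i in range(len(lengths) - 1)
    ]
    return beta_total, shares

# Hypothetical example: a pre-rift section of 100 km stretched to
# 130 km over three sequences (values are illustrative only).
beta, shares = extension_history([100.0, 115.0, 125.0, 130.0])
# beta = 1.3; the three sequences contribute 15, 10 and 5 km of the
# total 30 km of extension, i.e. shares of 1/2, 1/3 and 1/6.
```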

  18. Group-Based Learning in an Authoritarian Setting? Novel Extension Approaches in Vietnam's Northern Uplands

    ERIC Educational Resources Information Center

    Schad, Iven; Roessler, Regina; Neef, Andreas; Zarate, Anne Valle; Hoffmann, Volker

    2011-01-01

    This study aims to analyze the potential and constraints of group-based extension approaches as an institutional innovation in the Vietnamese agricultural extension system. Our analysis therefore unfolds around the challenges of how to foster this kind of approach within the hierarchical extension policy setting and how to effectively shape and…

  19. The Impact of Tour-Based Diversity Programming on County Extension Personnel and Programs

    ERIC Educational Resources Information Center

    Shaklee, Harriet; Luckey, Brian; Tifft, Kathee

    2014-01-01

    This article explores the effect that planning and conducting an intensive multi-day, tour-based diversity workshop can have on the professional development and Extension work of the county Extension educators involved. Survey data was collected from the county Extension educators who planned workshops throughout Idaho. Educators reported that the…

  20. Optical connection management in ASON based on LDP extensions

    NASA Astrophysics Data System (ADS)

    Qi, Tianlong; Zheng, Xiaoping; Zhang, Hanyi; Guo, Yili

    2002-09-01

    We extend the Label Distribution Protocol (LDP) in order to realize optical connection management in automatically switched optical networks. Its characteristics, such as dynamic lightpath creation and deletion, and fast restoration, are investigated and tested experimentally. How to handle failures that occur during these two processes is also explained in detail. The experimental results are discussed and compared with others' previous work. The experimental setup, consisting of four optical nodes in a mesh-type network, is also described in detail in the paper.

  1. Using the Delphi Technique to Assess Educational Needs Related to Extension's 4-H Beef Program.

    ERIC Educational Resources Information Center

    Shih, Ching-Chun; Gamon, Julia A.

    1997-01-01

    Delphi panels completing questionnaires included 32 parents of 4-H students, 16 extension beef specialists, 21 4-H field specialists, and 21 industry representatives. They identified 31 subject-matter and 30 life-skill topics useful for 4-H manuals. Emerging topics included consumer and environmental concerns. (SK)

  2. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    SciTech Connect

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based, accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in the directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.

  3. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    USGS Publications Warehouse

    Gaydos, Leonard

    1978-01-01

    The cost of classifying 5,607 square kilometers (2,165 sq. mi.) in the Portland area was less than 8 cents per square kilometer ($0.0788, or $0.2041 per square mile). Besides the cost savings, this and other signature extension techniques may be useful in completing land use and land cover mapping in other large areas where multispectral and multitemporal Landsat data are available in digital form but other source materials are generally lacking.

  4. Autofluorescence based diagnostic techniques for oral cancer

    PubMed Central

    Balasubramaniam, A. Murali; Sriraman, Rajkumari; Sindhuja, P.; Mohideen, Khadijah; Parameswar, R. Arjun; Muhamed Haris, K. T.

    2015-01-01

    Oral cancer is one of the most common cancers worldwide. Despite various advancements in treatment modalities, oral cancer mortality remains high, particularly in developing countries like India. This is mainly due to the delay in diagnosis of oral cancer. Delay in diagnosis greatly worsens the treatment prognosis and also increases morbidity and mortality rates. Early diagnosis plays a key role in the effective management of oral cancer, and a rapid diagnostic technique can greatly aid it. Nowadays, many adjunctive oral cancer screening techniques are available for the early diagnosis of cancer. Among these, autofluorescence-based diagnostic techniques are rapidly emerging as a powerful tool. These techniques are broadly discussed in this review. PMID:26538880

  5. Extension of the Viscous Collision Limiting Direct Simulation Monte Carlo Technique to Multiple Species

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Burt, Jonathan M.

    2016-01-01

    There are many flow fields that span a wide range of length scales, where regions of both rarefied and continuum flow exist and neither direct simulation Monte Carlo (DSMC) nor computational fluid dynamics (CFD) provides the appropriate solution everywhere. Recently, a new viscous collision limited (VCL) DSMC technique was proposed to incorporate effects of physical diffusion into collision limiter calculations, to make the low Knudsen number regime normally limited to CFD more tractable for an all-particle technique. This original work had been derived for a single-species gas. The current work extends the VCL-DSMC technique to gases with multiple species. Similar derivations were performed to equate numerical and physical transport coefficients, but a more rigorous treatment of determining the mixture viscosity is applied. In the original work, consideration was given to internal energy non-equilibrium; this is also extended in the current work to chemical non-equilibrium.
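
    The abstract does not specify the mixture-viscosity treatment; a common choice for gas mixtures, shown here purely as an assumed illustration, is Wilke's semi-empirical mixing rule:

```python
import math

def wilke_viscosity(x, mu, M):
    """Mixture viscosity by Wilke's semi-empirical mixing rule.

    x  : mole fractions of the species
    mu : pure-species viscosities [Pa*s]
    M  : molar masses [kg/mol]
    NOTE: Wilke's rule is one standard approach for gas mixtures; the
    paper's exact ("more rigorous") treatment is not specified in the
    abstract, so this is an assumption, not the authors' method.
    """
    n = len(x)
    mu_mix = 0.0
    for i in range(n):
        denom = 0.0
        for j in range(n):
            phi = ((1.0 + math.sqrt(mu[i] / mu[j]) * (M[j] / M[i]) ** 0.25) ** 2
                   / math.sqrt(8.0 * (1.0 + M[i] / M[j])))
            denom += x[j] * phi
        mu_mix += x[i] * mu[i] / denom
    return mu_mix
```

    For a single species the rule collapses to the pure-species viscosity, which is a convenient sanity check.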

  6. Some Novel Solidification Processing Techniques Being Investigated at MSFC: Their Extension for Study Aboard the ISS

    NASA Technical Reports Server (NTRS)

    Grugel, R. N.; Anilkumar, A. V.; Fedoseyev, A. I.; Mazuruk, K.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The float-zone and Bridgman techniques are two classical directional solidification processing methods that are used to improve materials properties. Unfortunately, buoyancy effects and gravity-driven convection due to unstable temperature and/or composition gradients still produce solidified products that exhibit segregation and, consequently, degraded properties. This presentation will briefly introduce how some novel processing applications can minimize detrimental gravitational effects and enhance microstructural uniformity. It is then discussed how fully understanding and modeling these procedures requires utilizing, in conjunction with a novel mixing technique, the facilities and quiescent microgravity environment available on the ISS.

  7. Application of a substructuring technique to the problem of crack extension and closure

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.

    1974-01-01

    A substructuring technique, originally developed for the efficient reanalysis of structures, is incorporated into the methodology associated with the plastic analysis of structures. An existing finite-element computer program that accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing kinematic constraint conditions: crack growth and intermittent contact of crack surfaces in two-dimensional regions. Application of the analysis is presented for a center-crack panel problem to demonstrate the efficiency and accuracy of the technique.

  8. Hierarchic plate and shell models based on p-extension

    NASA Technical Reports Server (NTRS)

    Szabo, Barna A.; Sahrmann, Glenn J.

    1988-01-01

    Formulations of finite element models for beams, arches, plates and shells based on the principle of virtual work were studied. The focus is on computer implementation of hierarchic sequences of finite element models suitable for numerical solution of a large variety of practical problems which may concurrently contain thin and thick plates and shells, stiffeners, and regions where three-dimensional representation is required. The approximate solutions corresponding to the hierarchic sequence of models converge to the exact solution of the fully three-dimensional model. The stopping criterion is based on: (1) estimation of the relative error in energy norm; (2) equilibrium tests; and (3) observation of the convergence of quantities of interest.
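
    Stopping criterion (1) needs an estimate of the exact strain energy, which a hierarchic sequence makes available from successive solutions. One standard way to build such an estimate, shown here as an assumption since the abstract does not give the authors' estimator, is Aitken extrapolation of three successive energies:

```python
import math

def estimate_relative_error(U1, U2, U3):
    """Estimate the relative error in energy norm of the finest model in
    a hierarchic sequence, from three successive strain energies
    U1, U2, U3 (coarsest to finest). Aitken extrapolation approximates
    the exact energy, assuming a roughly geometric error decrease.
    """
    denom = U1 + U3 - 2.0 * U2
    if denom == 0.0:
        raise ValueError("no detectable convergence between the models")
    U_exact = (U1 * U3 - U2 * U2) / denom
    # Relative error in energy norm of the finest solution:
    return math.sqrt(abs(U_exact - U3) / abs(U_exact))

# Synthetic energies converging to 10.0 with the error shrinking by a
# factor of 4 per model (illustrative values only):
err = estimate_relative_error(10.0 - 0.16, 10.0 - 0.04, 10.0 - 0.01)
```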

  9. Development of CDMS-II Surface Event Rejection Techniques and Their Extensions to Lower Energy Thresholds

    NASA Astrophysics Data System (ADS)

    Hofer, Thomas James

    2014-10-01

    The CDMS-II phase of the Cryogenic Dark Matter Search, a dark matter direct-detection experiment, was operated at the Soudan Underground Laboratory from 2003 to 2008. The full payload consisted of 30 ZIP detectors, totaling approximately 1.1 kg of Si and 4.8 kg of Ge, operated at temperatures of 50 mK. The ZIP detectors read out both ionization and phonon pulses from scatters within the crystals; channel segmentation and analysis of pulse timing parameters allowed effective fiducialization of the crystal volumes and background rejection sufficient to set world-leading limits at the times of their publications. A full re-analysis of the CDMS-II data was motivated by an improvement in the event reconstruction algorithms which improved the resolution of ionization energy and timing information. The Ge data were re-analyzed using three distinct background-rejection techniques; the Si data from runs 125--128 were analyzed for the first time using the most successful of the techniques from the Ge re-analysis. The results of these analyses prompted a novel "mid-threshold" analysis, wherein energy thresholds were lowered but background rejection using phonon timing information was still maintained. This technique proved to have significant discrimination power, maintaining adequate signal acceptance and minimizing background leakage. The primary background for CDMS-II analyses comes from surface events, whose poor ionization collection make them difficult to distinguish from true nuclear recoil events. The novel detector technology of SuperCDMS, the successor to CDMS-II, uses interleaved electrodes to achieve full ionization collection for events occurring at the top and bottom detector surfaces. This, along with dual-sided ionization and phonon instrumentation, allows for excellent fiducialization and relegates the surface-event rejection techniques of CDMS-II to a secondary level of background discrimination. Current and future SuperCDMS results hold great promise for

  10. Extension of vibrational power flow techniques to two-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or Finite Element Analysis (FEA) are generally used. However, an alternative method is using vibrational power flow techniques which can be especially useful in the mid- frequencies between the optimum frequency regimes for FEA and SEA. Power flow analysis has in general been used on one-dimensional beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to two-dimensional plate like structures joined along a common edge without frequency or spatial averaging the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA at low frequencies and SEA at high frequencies. The agreement with FEA results is good but the power flow technique has an improved computational efficiency. Compared to the SEA results the power flow results show a closer representation of the actual response of the structure.
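
    Power flow analysis is built on quantities such as the time-averaged power injected or transmitted at a point, P = 0.5·Re(F·v*). A minimal helper for that background relation (not the paper's plate-coupling formulation, which is far more involved):

```python
def time_avg_power(force, velocity):
    """Time-averaged power injected by a harmonic point force.

    force, velocity : complex amplitudes F and v at the drive point
    (e^{j*omega*t} convention); P = 0.5 * Re(F * conj(v)).
    This is the basic quantity that power-flow analysis tracks between
    coupled structural elements.
    """
    return 0.5 * (force * velocity.conjugate()).real
```

    A force and velocity in phase deliver maximum power; in quadrature (purely reactive response) the time-averaged power is zero.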

  11. Extension of vibrational power flow techniques to two-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cuschieri, Joseph M.

    1988-01-01

    In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or finite element analysis (FEA) are generally used. However, an alternative method is using vibrational power flow techniques which can be especially useful in the mid frequencies between the optimum frequency regimes for SEA and FEA. Power flow analysis has in general been used on 1-D beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to 2-D plate-like structures joined along a common edge without frequency or spatial averaging the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA results at low frequencies and SEA at high frequencies. The agreement with FEA results is good but the power flow technique has an improved computational efficiency. Compared to the SEA results the power flow results show a closer representation of the actual response of the structure.

  12. CAD techniques applied to diesel engine design. Extension of the RK range. [Ruston diesels

    SciTech Connect

    Sinha, S.K.; Buckthorpe, D.E.

    1980-01-01

    Ruston Diesels Ltd. produce three ranges of engines: the AP range covering engine powers from 500 to 1400 bhp (350 to 1000 kW electrical), the RK range covering 1410 to 4200 bhp (1 to 3 MW electrical), and the AT range covering 1650 to 4950 bhp (1.2 to 3.5 MW electrical). The AT engine range is available at speeds up to 600 rev/min, whereas the AP and RK ranges cover engine speeds from 600 to 1000 rev/min. The design philosophy and the extension of the RK range of engines are investigated. This is a 251 mm (ten inch) bore by 305 mm (twelve inch) stroke engine and is available in 6-cylinder in-line form and 8-, 12-, and 16-cylinder vee form. The RK engine features a cast-iron crankcase and bedplate design with a forged alloy-steel crankshaft. Combustion-chamber components consist of a cast-iron cylinder head and liner, steel exhaust and inlet valves, and a single-piece aluminium piston. The durability and reliability of RK engines have been fully proven in service, with over 30 years' experience in numerous applications for power generation, traction, and marine propulsion.

  13. A Novel SNPs Detection Method Based on Gold Magnetic Nanoparticles Array and Single Base Extension

    PubMed Central

    Li, Song; Liu, Hongna; Jia, Yingying; Deng, Yan; Zhang, Liming; Lu, Zhuoxuan; He, Nongyue

    2012-01-01

    To fulfill the increasing need for large-scale genetic research, a high-throughput and automated SNP genotyping method based on a gold magnetic nanoparticle (GMNP) array and dual-color single base extension has been designed. After amplification of DNA templates, biotinylated extension primers were captured by streptavidin-coated gold magnetic nanoparticles (SA-GMNPs). Next, a solid-phase, dual-color single base extension (SBE) reaction with the specific biotinylated primer was performed directly on the surface of the GMNPs. Finally, a "bead array" was fabricated by spotting GMNPs with fluorophore on a clean glass slide, and the genotype of each sample was discriminated by scanning the "bead array". The MTHFR gene C677T polymorphism of 320 individual samples was interrogated using this method; the signal/noise ratio for homozygous samples was over 12.33, while the signal/noise ratio for heterozygous samples was near 1. Compared with other dual-color hybridization based genotyping methods, the method described here gives a higher signal/noise ratio, and SNP loci can be identified with a high level of confidence. This assay has the advantage of eliminating the need for background subtraction, allowing direct analysis of the fluorescence values of the GMNPs to determine genotypes without purification or complex reduction of PCR products. The application of this strategy to large-scale SNP studies simplifies the process and reduces the labor required to produce highly sensitive results while improving the potential for automation. PMID:23139724
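
    The genotype call from a dual-color SBE scan reduces to comparing the two dye channels. A sketch of that decision rule (the threshold and the reading of the reported ratios as a two-channel intensity ratio are interpretations of the abstract, not the authors' stated algorithm):

```python
def call_genotype(sig_a, sig_b, hom_ratio=3.0):
    """Call a biallelic genotype from dual-color SBE fluorescence.

    sig_a, sig_b : fluorescence intensities of the two dye channels
    hom_ratio    : channel ratio above which the call is homozygous.
                   Illustrative threshold; the paper reports ratios over
                   12.33 for homozygotes and near 1 for heterozygotes.
    """
    if sig_a <= 0.0 and sig_b <= 0.0:
        return "no call"
    ratio = max(sig_a, sig_b) / max(min(sig_a, sig_b), 1e-9)
    if ratio >= hom_ratio:
        return "AA" if sig_a > sig_b else "BB"
    return "AB"
```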

  14. Development of CDMS-II Surface Event Rejection Techniques and Their Extensions to Lower Energy Thresholds

    SciTech Connect

    Hofer, Thomas James

    2014-12-01

    The CDMS-II phase of the Cryogenic Dark Matter Search, a dark matter direct-detection experiment, was operated at the Soudan Underground Laboratory from 2003 to 2008. The full payload consisted of 30 ZIP detectors, totaling approximately 1.1 kg of Si and 4.8 kg of Ge, operated at temperatures of 50 mK. The ZIP detectors read out both ionization and phonon pulses from scatters within the crystals; channel segmentation and analysis of pulse timing parameters allowed effective fiducialization of the crystal volumes and background rejection sufficient to set world-leading limits at the times of their publications. A full re-analysis of the CDMS-II data was motivated by an improvement in the event reconstruction algorithms which improved the resolution of ionization energy and timing information. The Ge data were re-analyzed using three distinct background-rejection techniques; the Si data from runs 125-128 were analyzed for the first time using the most successful of the techniques from the Ge re-analysis. The results of these analyses prompted a novel "mid-threshold" analysis, wherein energy thresholds were lowered but background rejection using phonon timing information was still maintained. This technique proved to have significant discrimination power, maintaining adequate signal acceptance and minimizing background leakage. The primary background for CDMS-II analyses comes from surface events, whose poor ionization collection make them difficult to distinguish from true nuclear recoil events. The novel detector technology of SuperCDMS, the successor to CDMS-II, uses interleaved electrodes to achieve full ionization collection for events occurring at the top and bottom detector surfaces. This, along with dual-sided ionization and phonon instrumentation, allows for excellent fiducialization and relegates the surface-event rejection techniques of CDMS-II to a secondary level of background discrimination. Current and future SuperCDMS results hold great promise for mid- to low

  15. A Smalltalk-based extension to traditional Geographic Information Systems

    SciTech Connect

    Korp, P.A.; Lurie, G.R.; Christiansen, J.H.

    1995-11-01

    The Dynamic Environmental Effects Model© (DEEM), under development at Argonne National Laboratory, is a fully object-based modeling software system that supports distributed, dynamic representation of the interlinked processes and behavior of the earth's surface and near-surface environment, at variable scales of resolution and aggregation. Many of these real world objects are not stored in a format conducive to efficient GIS usage. Their dynamic nature, complexity and number of possible DEEM entity classes precluded efficient integration with traditional GIS technologies due to the loosely coupled nature of their data representations. To address these shortcomings, an intelligent object-oriented GIS engine (OOGIS) was developed. This engine provides not only a spatially optimized object representation, but also direct linkages to the underlying object, its data and behaviors.

  16. Extension of POA based on Fiber Element to Girder Bridge

    SciTech Connect

    Li Zhenxin; Qiang Shizhong

    2010-05-21

    Because of its simplicity, practicality, low computational cost and relatively good results, pushover analysis (POA) has become an effective analytical tool during the last decade for the seismic assessment of buildings. However, such work on bridges has been very limited. Hence, the aim of this study is to adapt POA for nonlinear seismic analysis of girder bridges and to investigate its applicability in the case of an existing river-spanning approach bridge. Nonlinear POA, which adopts a fiber-model nonlinear beam-column element based on the flexibility approach, is carried out for three different types of bridge models at a return period of about 2500 years. It can be concluded that POA is applicable to bridges, with some of the shortcomings associated with the method in general, even when it is applied to buildings. Finally, an appropriate selection of the monitoring point and lateral load pattern is suggested according to the dynamic characteristics of girder bridges.

  17. Design of extensible meteorological data acquisition system based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Yin-hua; Zhang, Hui-jun; Li, Xiao-hui

    2015-02-01

    In order to compensate for the tropospheric refraction error generated in the process of satellite navigation and positioning, temperature, humidity and air pressure must be used in the relevant models to calculate the value of this error. The FPGA XC6SLX16 was used as the core processor, while the integrated silicon pressure sensor MPX4115A and the digital temperature-humidity sensor SHT75 were used as the basic meteorological parameter detection devices. The core processor controlled the real-time sampling of the ADC AD7608 and acquired the serial output data of the SHT75. The data were stored in the BRAM of the XC6SLX16 and used to generate standard meteorological parameters in NMEA format. The whole design was based on the Altium hardware platform and the ISE software platform. The system was described in VHDL and schematic diagrams to realize correct detection of temperature, humidity and air pressure. The 8-channel synchronous sampling of the AD7608 and the programmable external resources of the FPGA lay the foundation for adding further analog or digital meteorological element signals. The designed meteorological data acquisition system features low cost, high performance, and easy extensibility.
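    As a sketch of the NMEA-format output mentioned above, the following Python snippet builds a meteorological sentence with the standard NMEA checksum (XOR of the characters between "$" and "*"). The $WIMDA sentence type and its simplified field layout are assumptions for illustration; the record does not state which sentences the system emits.

```python
def nmea_checksum(body: str) -> str:
    """XOR of all characters between '$' and '*', as two uppercase hex digits."""
    cs = 0
    for ch in body:
        cs ^= ord(ch)
    return f"{cs:02X}"

def mda_sentence(pressure_bar: float, temp_c: float, humidity_pct: float) -> str:
    """Build a simplified $WIMDA meteorological composite sentence
    (pressure in bars, air temperature in Celsius, relative humidity in %).
    Field layout is abbreviated for illustration."""
    body = (f"WIMDA,,,{pressure_bar:.4f},B,{temp_c:.1f},C,,,"
            f"{humidity_pct:.1f},,,,,,,,,,")
    return f"${body}*{nmea_checksum(body)}"

sentence = mda_sentence(1.0132, 21.5, 48.0)  # e.g. "$WIMDA,...*hh"
```

The checksum routine is the part fixed by the NMEA convention; the surrounding field layout would follow whatever sentence set the deployed system actually generates.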

  18. Multiview video codec based on KTA techniques

    NASA Astrophysics Data System (ADS)

    Seo, Jungdong; Kim, Donghyun; Ryu, Seungchul; Sohn, Kwanghoon

    2011-03-01

    Multi-view video coding (MVC) is a video coding standard developed by MPEG and VCEG for multi-view video. It showed an average PSNR gain of 1.5 dB compared with view-independent coding by H.264/AVC. However, because the resolutions of multi-view video are increasing for more realistic 3D effects, a higher-performance video codec is needed. MVC adopted the hierarchical B-picture structure and inter-view prediction as core techniques. The hierarchical B-picture structure removes temporal redundancy, and inter-view prediction reduces inter-view redundancy by compensated prediction from the reconstructed neighboring views. Nevertheless, MVC has an inherent limitation in coding efficiency because it is based on H.264/AVC. To overcome this limit, an enhanced video codec for multi-view video based on the Key Technology Area (KTA) is proposed. KTA is a high-efficiency video codec by the Video Coding Experts Group (VCEG), developed to push coding efficiency beyond H.264/AVC. The KTA software showed better coding gain than H.264/AVC by using additional coding techniques. These techniques and inter-view prediction are implemented in the proposed codec, which showed high coding gain compared with the view-independent coding result by KTA. The results show that inter-view prediction can achieve higher efficiency in a multi-view video codec based on a high-performance video codec such as HEVC.

  19. Performance Evaluation of Extension Education Centers in Universities Based on the Balanced Scorecard

    ERIC Educational Resources Information Center

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-01-01

    This study aims at developing a set of appropriate performance evaluation indices mainly based on balanced scorecard (BSC) for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts who have real practical experiences in extension education, adequate performance…

  20. Laser Remote Sensing: Velocimetry Based Techniques

    NASA Astrophysics Data System (ADS)

    Molebny, Vasyl; Steinvall, Ove

    Laser-based velocity measurement is an area of the field of remote sensing where the coherent properties of laser radiation are the most exposed. Much of the published literature deals with the theory and techniques of remote sensing. We restrict our discussion to current trends in this area, gathered from recent conferences and professional journals. Remote wind sensing and vibrometry are promising in their new scientific, industrial, military, and biomedical applications, including improving flight safety, precise weapon correction, non-contact mine detection, optimization of wind farm operation, object identification based on its vibration signature, fluid flow studies, and vibrometry-associated diagnosis.

  1. Genealogical-based method for multiple ontology self-extension in MeSH.

    PubMed

    Guo, Yu-Wen; Tang, Yi-Tsung; Kao, Hung-Yu

    2014-06-01

    During the last decade, the advent of ontologies used for biomedical annotation has had a deep impact on the life sciences. MeSH is a well-known ontology used for indexing journal articles in PubMed, improving literature searching on multi-domain topics. With the explosion of data growth in recent years, new terms and concepts weed through the old and bring forth the new. Automatically extending sets of existing terms will enable bio-curators to systematically improve text-based ontologies level by level. However, most related techniques, which apply symbolic patterns based on a literature corpus, tend to focus on the more general rather than the specific parts of the ontology. Therefore, in this work, we present a novel method that utilizes genealogical information from the ontology itself to find suitable siblings for ontology extension. Based on the breadth and depth dimensions, a sibling generation stage and a pruning strategy are proposed in our approach. As a result, the precision of the genealogical-based method achieved 0.5 on average, with the best performance of 0.83 in the category "Organisms." We also achieve an average precision of 0.69 on 229 new terms in the MeSH 2013 version. PMID:24893362

  2. Neutron-based nonintrusive inspection techniques

    NASA Astrophysics Data System (ADS)

    Gozani, Tsahi

    1997-02-01

    Non-intrusive inspection of large objects such as trucks, sea-going shipping containers, air cargo containers and pallets is gaining attention as a vital tool in combating terrorism, drug smuggling and other violations of international and national transportation and Customs laws. Neutrons are the preferred probing radiation when material specificity is required, which is most often the case. Great strides have been made in neutron-based inspection techniques. Fast and thermal neutrons, whether in steady state or in microsecond or even nanosecond pulses, are being employed to interrogate, at high speed, for explosives, drugs, chemical agents, and nuclear and many other smuggled materials. Existing neutron techniques are compared and their current status reported.

  3. Novel optical password security technique based on optical fractal synthesizer

    NASA Astrophysics Data System (ADS)

    Wu, Kenan; Hu, Jiasheng; Wu, Xu

    2009-06-01

    A novel optical security technique for safeguarding user passwords based on an optical fractal synthesizer is proposed, and a validating experiment has been carried out. In the proposed technique, a user password is protected by being converted to a fractal image. When a user sets up a new password, the password is transformed into a fractal pattern, and the fractal pattern is stored by the authority. When the user is validated online, his or her password is converted to a fractal pattern again and compared with the previously stored fractal pattern. The converting process, called the fractal encoding procedure, consists of two steps. First, the password is nonlinearly transformed to obtain the parameters for the optical fractal synthesizer. Then the optical fractal synthesizer is operated to generate the output fractal image. The experimental result proves the validity of the method. The proposed technique bridges the gap between digital security systems and optical security systems and has many advantages, such as a high security level, convenience, flexibility, and hyper-extensibility. This provides an interesting optical security technique for the protection of digital passwords.
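    The two-step encoding procedure (password → parameters → fractal pattern, with only the pattern stored) can be sketched in software. The record does not specify the nonlinear transform or the fractal generator, so the sketch below substitutes a hash-based parameter derivation and an escape-time Julia-set rendering as stand-ins:

```python
import hashlib

def password_to_params(password: str) -> complex:
    """Hypothetical nonlinear transform: hash the password and map the
    digest to a Julia-set parameter c with real/imag parts in [-1, 1)."""
    h = hashlib.sha256(password.encode()).digest()
    re = int.from_bytes(h[:4], "big") / 2**32 * 2 - 1
    im = int.from_bytes(h[4:8], "big") / 2**32 * 2 - 1
    return complex(re, im)

def julia_bitmap(c: complex, size: int = 32, max_iter: int = 40) -> tuple:
    """Escape-time rendering of the Julia set for z -> z^2 + c,
    packed as one integer bitmask per row."""
    rows = []
    for j in range(size):
        row = 0
        for i in range(size):
            z = complex(-1.5 + 3.0 * i / size, -1.5 + 3.0 * j / size)
            n = 0
            while abs(z) <= 2 and n < max_iter:
                z = z * z + c
                n += 1
            row = (row << 1) | (1 if n == max_iter else 0)
        rows.append(row)
    return tuple(rows)

def enroll(password):
    """Store the fractal pattern, never the password itself."""
    return julia_bitmap(password_to_params(password))

def validate(password, stored):
    """Re-derive the pattern from the offered password and compare."""
    return julia_bitmap(password_to_params(password)) == stored
```

The design point this illustrates is that enrollment and validation both run the same deterministic encoding, so the authority can check a password without ever holding it in recoverable form.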

  4. An Extensible Space-Based Coordination Approach for Modeling Complex Patterns in Large Systems

    NASA Astrophysics Data System (ADS)

    Kühn, Eva; Mordinyi, Richard; Schreiber, Christian

    Coordination is frequently associated with shared data spaces employing Linda coordination. But in practice, communication between parallel and distributed processes is carried out with message exchange patterns. What, actually, do shared data spaces contribute beyond these? In this paper we present a formal representation for a definition of shared spaces by introducing an "extensible tuple model", based on existing research on Linda coordination, some Linda extensions, and virtual shared memory. The main enhancements of the extensible tuple model comprise: means for structuring of spaces, Internet-compatible addressing of resources, more powerful coordination capabilities, a clear separation of user data and coordination information, support of symmetric peer application architectures, and extensibility through programmable aspects. The advantages of the extensible tuple model (XTM) are that it allows for a specification of complex coordination patterns.
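    The Linda coordination primitives that the extensible tuple model builds on can be illustrated with a minimal, non-blocking, single-process sketch (real Linda `in`/`rd` block until a match arrives; `take` stands in for `in`, which is a Python keyword):

```python
class TupleSpace:
    """Minimal Linda-style shared data space: out() writes a tuple,
    rd() reads a matching tuple, take() reads and removes it.
    None in a template acts as a wildcard field."""

    def __init__(self):
        self._tuples = []

    def out(self, *fields):
        self._tuples.append(tuple(fields))

    def _match(self, template, candidate):
        return (len(template) == len(candidate) and
                all(t is None or t == c for t, c in zip(template, candidate)))

    def rd(self, *template):
        for t in self._tuples:
            if self._match(template, t):
                return t
        return None

    def take(self, *template):
        t = self.rd(*template)
        if t is not None:
            self._tuples.remove(t)
        return t
```

What the space contributes beyond message exchange is visible even at this scale: producers and consumers are decoupled in time and identity, since a tuple written with `out` can be matched later by any process that knows only its shape.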

  5. Extensions of the Johnson-Neyman Technique to Linear Models With Curvilinear Effects: Derivations and Analytical Tools.

    PubMed

    Miller, Jason W; Stromeyer, William R; Schwieterman, Matthew A

    2013-03-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way interactions in several types of linear models, this method has not been extended to include quadratic terms or more complicated models involving quadratic terms and interactions. Curvilinear relations of this type are incorporated in several theories in the social sciences. This article extends the J-N method to such linear models along with presenting freely available online tools that implement this technique as well as the traditional pick-a-point approach. Algebraic and graphical representations of the proposed J-N extension are provided. An example is presented to illustrate the use of these tools and the interpretation of findings. Issues of reliability as well as "spurious moderator" effects are discussed along with recommendations for future research. PMID:26741727
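    For the classic two-way interaction model y = b0 + b1·x + b2·z + b3·x·z, the J-N boundaries are the values of z at which the simple slope b1 + b3·z sits exactly at the significance threshold, found by solving a quadratic. A minimal sketch (the coefficient and covariance inputs are illustrative, not from the article, and the quadratic/curvilinear extension the article develops is not reproduced here):

```python
import math

def jn_boundaries(b1, b3, v11, v33, v13, t_crit):
    """Johnson-Neyman region boundaries for the simple slope b1 + b3*z in
    y = b0 + b1*x + b2*z + b3*x*z.  Solves
    (b1 + b3*z)^2 = t_crit^2 * (v11 + 2*z*v13 + z^2*v33) for z,
    where v11, v33, v13 are Var(b1), Var(b3), Cov(b1, b3)."""
    a = b3**2 - t_crit**2 * v33
    b = 2 * (b1 * b3 - t_crit**2 * v13)
    c = b1**2 - t_crit**2 * v11
    disc = b**2 - 4 * a * c
    if disc < 0:
        return None  # slope significant everywhere or nowhere in range
    r = math.sqrt(disc)
    return tuple(sorted(((-b - r) / (2 * a), (-b + r) / (2 * a))))
```

At each returned z the t-ratio of the simple slope equals t_crit exactly; which side of the boundaries is significant depends on the sign of the leading coefficient a.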

  6. "YFlag"--a single-base extension primer based method for gender determination.

    PubMed

    Allwood, Julia S; Harbison, Sally Ann

    2015-01-01

    Assigning the gender of a DNA contributor in forensic analysis is typically achieved using the amelogenin test. Occasionally, this test produces false-positive results due to deletions occurring on the Y chromosome. Here, a four-marker "YFlag" method is presented to infer gender using single-base extension primers to flag the presence (or absence) of Y-chromosome DNA within a sample to supplement forensic STR profiling. This method offers built-in redundancy, with a single marker being sufficient to detect the presence of male DNA. In a study using 30 male and 30 female individuals, detection of male DNA was achieved with c. 0.03 ng of male DNA. All four markers were present in male/female mixture samples despite the presence of excessive female DNA. In summary, the YFlag system offers a method that is reproducible, specific, and sensitive, making it suitable for forensic use to detect male DNA. PMID:25354446

  7. Rapid Disaster Analysis based on SAR Techniques

    NASA Astrophysics Data System (ADS)

    Yang, C. H.; Soergel, U.

    2015-03-01

    Due to its all-day and all-weather capability, spaceborne SAR is a valuable means for rapid mapping during and after disasters. In this paper, three change detection techniques based on SAR data are discussed: (1) initial coarse change detection, (2) flooded-area detection, and (3) linear-feature change detection. The 2011 Tohoku Earthquake and Tsunami is used as a case study, where the earthquake and tsunami events together provide a complex case. In (1), pre- and post-event TerraSAR-X images are coregistered accurately to produce a false-color image. Such an image provides a quick, rough overview of potential changes, which is useful for initial decision making and identifies areas worth analysing in more depth. In (2), the post-event TerraSAR-X image is used to extract the flooded area by morphological approaches. In (3), we are interested in detecting changes of linear shape as indicators of modified man-made objects. Simple thresholding extracts pixel-based changes in the difference image, but in this manner many irrelevant changes are highlighted too (e.g., farming activity, speckle). In this study, Curvelet filtering is applied to the difference image not only to suppress false alarms but also to enhance the change signals of linear form (e.g., buildings) in settlements. Afterwards, thresholding is conducted to extract linear-shaped changed areas. These three techniques are designed to be simple and applicable in timely disaster analysis. They are all validated by comparison with the change map produced by the Center for Satellite Based Crisis Information, DLR.
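    The difference-image-plus-thresholding step described in (3) can be sketched as follows. The log-ratio operator used here is a common choice for multiplicative SAR speckle and is an assumption, since the record does not name the exact difference operator; the Curvelet filtering stage is omitted:

```python
import math

def log_ratio(pre, post, eps=1e-6):
    """Per-pixel change indicator |log(post/pre)| for two 2-D intensity
    images given as nested lists; eps guards against zero pixels."""
    return [[abs(math.log((q + eps) / (p + eps)))
             for p, q in zip(prow, qrow)]
            for prow, qrow in zip(pre, post)]

def threshold(img, t):
    """Binary change map: 1 where the indicator exceeds the threshold."""
    return [[1 if v > t else 0 for v in row] for row in img]

# Tiny example: only the pixel whose backscatter jumps 1 -> 8 is flagged.
pre  = [[1.0, 1.0], [1.0, 4.0]]
post = [[1.0, 8.0], [1.0, 4.0]]
change = threshold(log_ratio(pre, post), 1.0)
```

The log-ratio is symmetric in increases and decreases of backscatter, which is why a single threshold on its magnitude catches both new structures and demolished ones.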

  8. Some Techniques for Computer-Based Assessment in Medical Education.

    ERIC Educational Resources Information Center

    Mooney, G. A.; Bligh, J. G.; Leinster, S. J.

    1998-01-01

    Presents a system of classification for describing computer-based assessment techniques based on the level of action and educational activity they offer. Illustrates 10 computer-based assessment techniques and discusses their educational value. Contains 14 references. (Author)

  9. Active-contour-based image segmentation using machine learning techniques.

    PubMed

    Etyngier, Patrick; Ségonne, Florent; Keriven, Renaud

    2007-01-01

    We introduce a non-linear shape prior for the deformable model framework that we learn from a set of shape samples using recent manifold learning techniques. We model a category of shapes as a finite dimensional manifold which we approximate using Diffusion maps. Our method computes a Delaunay triangulation of the reduced space, considered as Euclidean, and uses the resulting space partition to identify the closest neighbors of any given shape based on its Nyström extension. We derive a non-linear shape prior term designed to attract a shape towards the shape prior manifold at given constant embedding. Results on shapes of ventricle nuclei demonstrate the potential of our method for segmentation tasks. PMID:18051143

  10. The Knowledge Base as an Extension of Distance Learning Reference Service

    ERIC Educational Resources Information Center

    Casey, Anne Marie

    2012-01-01

    This study explores knowledge bases as extension of reference services for distance learners. Through a survey and follow-up interviews with distance learning librarians, this paper discusses their interest in creating and maintaining a knowledge base as a resource for reference services to distance learners. It also investigates their perceptions…

  11. Graduate Followup: B.S. Occupational Education Extension Program at Naval Submarine Base, Bangor, Washington.

    ERIC Educational Resources Information Center

    Fellows, George; And Others

    Graduates of a military-base extension program at the Naval Submarine Base, Bangor, Washington, leading to a Bachelor of Science degree in occupational education were studied. Graduates are prepared to teach their occupational specialty at colleges as well as for occupational education work in government, private enterprise, and health care…

  12. Towards Semantically Sensitive Text Clustering: A Feature Space Modeling Technology Based on Dimension Extension

    PubMed Central

    Liu, Yuanchao; Liu, Ming; Wang, Xin

    2015-01-01

    The objective of text clustering is to divide document collections into clusters based on the similarity between documents. In this paper, an extension-based feature modeling approach towards semantically sensitive text clustering is proposed along with the corresponding feature space construction and similarity computation method. By combining the similarity in traditional feature space and that in extension space, the adverse effects of the complexity and diversity of natural language can be addressed and clustering semantic sensitivity can be improved correspondingly. The generated clusters can be organized using different granularities. The experimental evaluations on well-known clustering algorithms and datasets have verified the effectiveness of our approach. PMID:25794172
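    The core idea, blending similarity in the traditional feature space with similarity in the extension space, can be sketched as a weighted combination of cosine similarities. The blending weight alpha and the sparse term-weight dictionaries are illustrative assumptions, not the paper's actual construction:

```python
import math

def cosine(u, v):
    """Cosine similarity between sparse term-weight dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def combined_similarity(d1, d2, e1, e2, alpha=0.5):
    """Blend similarity in the traditional term space (d1, d2) with
    similarity in the extension space (e1, e2)."""
    return alpha * cosine(d1, d2) + (1 - alpha) * cosine(e1, e2)
```

The benefit shows up when documents share no surface terms: "car" and "automobile" score zero in the term space, but if both extend to a shared concept such as "vehicle", the combined similarity is nonzero and the clusterer can group them.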

  13. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2015-04-01

    Despite the extreme complexity that characterizes earthquake generation process, simple phenomenology seems to apply in the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder, how can the collective properties of a set formed by all earthquakes in a given region, be derived and how does the structure of seismicity depend on its elementary constituents - the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake making the use of statistical physics necessary to understand the collective properties of earthquakes. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the microscale and crack opening level to the level of large earthquakes? An answer to the previous question could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi) fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon introducing the new topic of non-extensive statistical seismology. 
This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project
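    The non-extensive entropy at the core of this approach can be stated compactly. The following are the standard Tsallis forms (not quoted from the abstract): the entropy, its Boltzmann-Gibbs limit, and the q-exponential that arises when the entropy is maximized under suitable constraints and is commonly fitted to earthquake magnitude and energy distributions:

```latex
% Tsallis entropy for a system with W microstates of probability p_i
S_q \;=\; k \,\frac{1 - \sum_{i=1}^{W} p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q \;=\; -\,k \sum_{i=1}^{W} p_i \ln p_i
\quad \text{(Boltzmann--Gibbs limit)}.

% Maximizing S_q under appropriate constraints yields q-exponential
% distributions with power-law tails for q > 1:
\exp_q(x) \;=\; \bigl[\,1 + (1-q)\,x\,\bigr]^{\tfrac{1}{1-q}},
\qquad \exp_1(x) = e^{x}.
```

The entropic index q measures the departure from additivity; q > 1 produces the long-range-correlated, power-law behaviour that motivates applying this framework to seismicity.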

  14. On the development of NURBS-based isogeometric solid shell elements: 2D problems and preliminary extension to 3D

    NASA Astrophysics Data System (ADS)

    Bouclier, R.; Elguedj, T.; Combescure, A.

    2013-11-01

    This work deals with the development of 2D solid shell non-uniform rational B-spline elements. We address a static problem, solvable with a 2D model, involving a thin slender structure under small perturbations; the plane stress, plane strain and axisymmetric assumptions can be made. Projection and reduced integration techniques are considered to deal with the locking phenomenon. The projection approach leads to the implementation of two strategies insensitive to locking: the first is based on a 1D projection of the mean strain across the thickness; the second projects all the strains onto a suitably chosen 2D space. Conversely, the reduced integration approach based on Gauss points is less expensive, but only alleviates locking and is limited to quadratic approximations. The performance of the various 2D elements developed is assessed through several numerical examples. Simple extensions of these techniques to 3D are finally performed.

  15. Analysis and amendment of flow control credit-based in SAN extension

    NASA Astrophysics Data System (ADS)

    Qin, Leihua; Yu, Shengsheng; Zhou, Jingli

    2005-11-01

    Organizations increasingly face an enormous influx of data that must be stored, protected, backed up and replicated. One of the best ways to achieve this is to interconnect geographically dispersed SANs through reliable, high-speed links. In this storage extension application, flow control deals with the problem of a device receiving frames faster than it can process them; when this happens, the device is forced to drop some of the frames. The FC flow control protocol is a credit-based mechanism usually used for SAN extension over WDM and over SONET/SDH. With FC flow control, when a source storage device intends to send data to a target storage device, the initiating device must receive credits from the target device. For every credit the initiating device obtains, it is permitted to transmit one FC frame, so congestion is always avoided in the network. This paper analyses the mechanism of FC flow control and its limitations in SAN extension as the extension distance increases. Computed results indicate that the maximum link efficiency and throughput in SAN extension depend on the credits, frame size and extension distance. In order to achieve the maximum link efficiency and throughput, an extended FC flow control mechanism is proposed.
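    The credit/frame-size/distance relationship the paper computes can be sketched from first principles: each buffer-to-buffer credit permits one outstanding frame, so keeping the link full requires enough credits to cover the round-trip time. The 5 µs/km fiber propagation delay and 2148-byte maximum frame below are typical assumed values, and the raw line rate is used without accounting for 8b/10b coding overhead:

```python
import math

def required_credits(distance_km, line_rate_bps, frame_bytes=2148,
                     prop_us_per_km=5.0):
    """Buffer-to-buffer credits needed to keep an FC link full:
    credits >= round-trip time / frame serialization time."""
    rtt_s = 2 * distance_km * prop_us_per_km * 1e-6
    frame_time_s = frame_bytes * 8 / line_rate_bps
    return math.ceil(rtt_s / frame_time_s)

def link_efficiency(credits, distance_km, line_rate_bps, frame_bytes=2148,
                    prop_us_per_km=5.0):
    """Fraction of the line rate achievable with a given credit count."""
    rtt_s = 2 * distance_km * prop_us_per_km * 1e-6
    frame_time_s = frame_bytes * 8 / line_rate_bps
    return min(1.0, credits * frame_time_s / rtt_s)

# A 100 km 1GFC (1.0625 Gb/s) link needs on the order of 60+ credits;
# with only half that many, throughput collapses to roughly half the rate.
credits_needed = required_credits(100, 1.0625e9)
```

This is exactly why efficiency degrades with distance at a fixed credit count: the round-trip time grows linearly with distance while the credit pool stays fixed.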

  16. Nuclear based techniques for detection of contraband

    SciTech Connect

    Gozani, T.

    1993-12-31

    The detection of contraband such as explosives and drugs concealed in luggage or other containers can be quite difficult. Nuclear techniques offer capabilities which are essential to effective detection devices. This report describes the features of various nuclear techniques and instrumentation.

  17. 40 CFR 66.31 - Exemptions based on an order, extension or suspension.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 16 2013-07-01 2013-07-01 false Exemptions based on an order, extension or suspension. 66.31 Section 66.31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) ASSESSMENT AND COLLECTION OF NONCOMPLIANCE PENALTIES BY EPA Exemption Requests; Revocation of Exemptions §...

  18. 40 CFR 66.31 - Exemptions based on an order, extension or suspension.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 15 2011-07-01 2011-07-01 false Exemptions based on an order, extension or suspension. 66.31 Section 66.31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) ASSESSMENT AND COLLECTION OF NONCOMPLIANCE PENALTIES BY EPA Exemption Requests; Revocation of Exemptions §...

  19. Strategic Partnerships that Strengthen Extension's Community-Based Entrepreneurship Programs: An Example from Maine

    ERIC Educational Resources Information Center

    Bassano, Louis V.; McConnon, James C., Jr.

    2011-01-01

    This article explains how Extension can enhance and expand its nationwide community-based entrepreneurship programs by developing strategic partnerships with other organizations to create highly effective educational programs for rural entrepreneurs. The activities and impacts of the Down East Micro-Enterprise Network (DEMN), an alliance of three…

  20. School-Based Peer Mediation Programs: A Natural Extension of Developmental Guidance Programs.

    ERIC Educational Resources Information Center

    Robertson, Gwendolyn

    School-based peer mediation programs are natural extensions of the kindergarten-grade 12 developmental guidance programs. Peer mediation programs not only provide schools with alternatives to traditional discipline practices, but also teach students important life skills. Existing research on peer mediation is very limited, yet promising. This…

  1. Raising Awareness of Assistive Technology in Older Adults through a Community-Based, Cooperative Extension Program

    ERIC Educational Resources Information Center

    Sellers, Debra M.; Markham, Melinda Stafford

    2012-01-01

    The Fashion an Easier Lifestyle with Assistive Technology (FELAT) curriculum was developed as a needs-based, community educational program provided through a state Cooperative Extension Service. The overall goal for participants was to raise awareness of assistive technology. Program evaluation included a postassessment and subsequent interview to…

  2. Extension of 193 nm dry lithography to 45-nm half-pitch node: double exposure and double processing technique

    NASA Astrophysics Data System (ADS)

    Biswas, Abani M.; Li, Jianliang; Hiserote, Jay A.; Melvin, Lawrence S., III

    2006-10-01

    Immersion lithography and multiple exposure techniques are the most promising methods to extend lithographic manufacturing to the 45-nm node. Although immersion lithography has attracted much attention recently as a promising optical lithography extension, it will not solve all the problems at the 45-nm node. The 'dry' option (i.e., double exposure/etch), which can be realized with standard processing practice, will extend 193-nm lithography to the end of the current industry roadmap. Double exposure/etch lithography is expensive in terms of cost, throughput time, and overlay registration accuracy. However, it is less challenging than other possible alternatives and has the ability to break through the k1 barrier (0.25). This process, in combination with an attenuated PSM (att-PSM) mask, is a good imaging solution that can reach, and most likely go beyond, the 45-nm node. Mask-making requirements in a double exposure scheme will be reduced significantly, because the separation of a tightly pitched mask into two less demanding pitch patterns relaxes the stringent specifications for each mask. In this study, modeling of double exposure lithography (DEL) with att-PSM masks targeting the 45-nm node is described. In addition, mask separation and implementation issues of optical proximity correction (OPC) to improve the process window are studied. To understand the impact of OPC on the process window, Fourier analysis of the masks has been carried out as well.
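    The k1 barrier can be made concrete with the Rayleigh resolution relation, half-pitch = k1·λ/NA. The NA of 0.93 below is an assumed dry-scanner value, not taken from the study:

```python
def k1_factor(half_pitch_nm, wavelength_nm, numerical_aperture):
    """Rayleigh k1 factor: half-pitch = k1 * lambda / NA."""
    return half_pitch_nm * numerical_aperture / wavelength_nm

# Single exposure at the 45-nm node with a dry 193-nm scanner:
single = k1_factor(45, 193, 0.93)   # below the 0.25 single-exposure limit
# Pitch splitting relaxes each of the two masks to 90-nm half-pitch:
double = k1_factor(90, 193, 0.93)   # comfortably printable
```

This is the quantitative case for double exposure: the target half-pitch yields k1 < 0.25 in a single dry exposure, while splitting the pattern into two coarser-pitch masks roughly doubles the effective k1 per exposure.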

  3. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.

    2014-12-01

    Despite the extreme complexity that characterizes earthquake generation process, simple phenomenology seems to apply in the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder, how can the collective properties of a set formed by all earthquakes in a given region, be derived and how does the structure of seismicity depend on its elementary constituents - the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake making the use of statistical physics necessary to understand the collective properties of earthquakes. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the microscale and crack opening level to the level of large earthquakes? An answer to the previous question could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi) fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon introducing the new topic of non-extensive statistical seismology. 
This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project.

  4. DCT-based cyber defense techniques

    NASA Astrophysics Data System (ADS)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content in order to attack the end user. Most attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
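    The PSNR metric used above to quantify the perceptual cost of the defenses is straightforward to compute; a minimal sketch over flat pixel lists, assuming an 8-bit peak value of 255:

```python
import math

def psnr(original, distorted, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-size images
    given as flat lists of pixel values: 10*log10(peak^2 / MSE)."""
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak * peak / mse)
```

As a calibration point, a uniform 1-level error on 8-bit pixels gives about 48 dB, so the reported 25.74 dB corresponds to quite visible degradation traded for attack protection.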

  5. Performance evaluation of extension education centers in universities based on the balanced scorecard.

    PubMed

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-02-01

    This study aims at developing a set of appropriate performance evaluation indices, mainly based on the balanced scorecard (BSC), for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts with practical experience in extension education, adequate performance evaluation indices were selected; the decision making trial and evaluation laboratory (DEMATEL) and the analytic network process (ANP) were then used to establish the causality between the four BSC perspectives and the relative weights of the evaluation indices. Based on this result, an empirical analysis of the performance evaluation of the extension education centers of three universities in Taoyuan County, Taiwan, is illustrated by applying VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). The analysis indicates that "Learning and growth" is the most influential factor, affecting the other three perspectives. In addition, the "Internal process" and "Financial" perspectives play important roles in the performance evaluation of extension education centers. The top three key performance indices are "After-sales service", "Turnover volume", and "Net income". The proposed evaluation model can serve as a reference for extension education centers in universities to prioritize improvements on the key performance indices after performing VIKOR analyses. PMID:20619892
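    The final VIKOR ranking step can be sketched as follows. This is a textbook VIKOR formulation for benefit criteria with illustrative data, not the study's actual indices or weights; lower Q means a better-ranked alternative:

```python
def vikor(matrix, weights, v=0.5):
    """VIKOR compromise ranking for benefit criteria.  matrix[i][j] is the
    score of alternative i on criterion j; weights[j] are criterion weights;
    v balances group utility (S) against individual regret (R).
    Returns the Q value per alternative (lower is better)."""
    ncrit = len(weights)
    best  = [max(row[j] for row in matrix) for j in range(ncrit)]
    worst = [min(row[j] for row in matrix) for j in range(ncrit)]
    S, R = [], []
    for row in matrix:
        terms = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
                 if best[j] != worst[j] else 0.0
                 for j in range(ncrit)]
        S.append(sum(terms))   # weighted sum: group utility
        R.append(max(terms))   # worst single criterion: individual regret
    s_star, s_minus = min(S), max(S)
    r_star, r_minus = min(R), max(R)

    def q(s, r):
        qs = (s - s_star) / (s_minus - s_star) if s_minus != s_star else 0.0
        qr = (r - r_star) / (r_minus - r_star) if r_minus != r_star else 0.0
        return v * qs + (1 - v) * qr

    return [q(s, r) for s, r in zip(S, R)]
```

In the study's setting, the rows would be the three extension education centers and the columns the DEMATEL/ANP-weighted performance indices.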

  6. Flood alert system based on bayesian techniques

    NASA Astrophysics Data System (ADS)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium-sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects on the city of Malaga (600,000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have emerged in the last decade as a very useful and reliable computational tool for water resources and for the decision-making process. The joint use of Artificial Neural Networks (ANN) and BN has allowed us to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit while also attaching an uncertainty estimate to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also
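The recurrent ANN itself is not specified in the abstract. As a hedged illustration of the stated lag structure (daily precipitation plus the two previous values of reservoir discharge and water level), a linear least-squares surrogate on synthetic data can be sketched as below; all data and coefficients are invented, and the paper's actual model is a recurrent neural network, not this linear fit.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
precip = rng.gamma(2.0, 1.0, T)                      # daily precipitation proxy
discharge = 0.5 * precip + 0.3 * np.roll(precip, 1)  # upstream reservoir discharge
level = (0.4 * discharge + 0.2 * np.roll(discharge, 1)
         + 0.05 * rng.standard_normal(T))            # water level at the gauge

# Same lag structure as reported: current precipitation plus the two
# previous values of reservoir discharge and water level.
X = np.column_stack([precip[2:], discharge[1:-1], discharge[:-2],
                     level[1:-1], level[:-2], np.ones(T - 2)])
y = level[2:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
r2 = 1 - ((y - X @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```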

  7. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  8. Liquid refractometer based on fringe projection technique

    NASA Astrophysics Data System (ADS)

    de Angelis, Marco; De Nicola, Sergio; Ferraro, Pietro; Finizio, Andrea; Pierattini, Giovanni

    1999-08-01

    Measurement of the refractive index of liquids is of great importance in applications such as the characterization and adulteration control of commonly used liquids, and in pollution monitoring. We present and discuss a fringe projection technique for measuring the index of refraction of transparent liquid materials.

  9. Association of Anterior and Lateral Extraprostatic Extensions with Base-Positive Resection Margins in Prostate Cancer

    PubMed Central

    Abalajon, Mark Joseph; Jang, Won Sik; Kwon, Jong Kyou; Yoon, Cheol Yong; Lee, Joo Yong; Cho, Kang Su; Ham, Won Sik

    2016-01-01

    Introduction Positive surgical margins (PSM) detected in the radical prostatectomy specimen increase the risk of biochemical recurrence (BCR). Still, with a formidable number of patients never experiencing BCR in their lifetime, this inconsistency has been attributed to artifacts and to spontaneous regression of micrometastatic sites. To investigate the origin of margin-positive cancers, we have looked into the influence of extraprostatic extension location on the resection-margin-positive site and its implications for BCR risk. Materials & Methods The clinical information and follow-up data of 612 patients who had extraprostatic extension and a positive surgical margin at the time of robot-assisted radical prostatectomy (RARP) at a single center between 2005 and 2014 were modeled using Fine and Gray's competing risk regression analysis for BCR. Extraprostatic extensions were divided into categories according to location as apex, base, anterior, posterior, lateral, and posterolateral. Extraprostatic extension was defined as the presence of tumor beyond the borders of the gland in the posterior and posterolateral regions. Tumor admixed with periprostatic fat was additionally considered as extraprostatic extension if the capsule was vague in the anterior, apex, and base regions. Positive surgical margins were defined as the presence of tumor cells at the inked margin on inspection under microscopy. Association of these classifications with the site of PSM was evaluated by Cohen's Kappa analysis for concordance and logistic regression for the odds of apical and base PSMs. Results Median follow-up duration was 36.5 months (interquartile range [IQR] 20.1–36.5). Apex involvement was found in 158 (25.8%) patients and base involvement in 110 (18.0%) patients. PSMs were generally found to be associated with increased risk of BCR regardless of location, with BCR risk highest for base PSM (HR 1.94, 95% CI 1.40–2.68, p<0.001) after adjusting for age, initial

  10. Physics based modeling of a series parallel battery pack for asymmetry analysis, predictive control and life extension

    NASA Astrophysics Data System (ADS)

    Ganesan, Nandhini; Basu, Suman; Hariharan, Krishnan S.; Kolake, Subramanya Mayya; Song, Taewon; Yeo, Taejung; Sohn, Dong Kee; Doo, Seokgwang

    2016-08-01

    Lithium-Ion batteries used for electric vehicle applications are subject to large currents and various operation conditions, making battery pack design and life extension a challenging problem. With increase in complexity, modeling and simulation can lead to insights that ensure optimal performance and life extension. In this manuscript, an electrochemical-thermal (ECT) coupled model for a 6 series × 5 parallel pack is developed for Li ion cells with NCA/C electrodes and validated against experimental data. Contribution of the cathode to overall degradation at various operating conditions is assessed. Pack asymmetry is analyzed from a design and an operational perspective. Design based asymmetry leads to a new approach of obtaining the individual cell responses of the pack from an average ECT output. Operational asymmetry is demonstrated in terms of effects of thermal gradients on cycle life, and an efficient model predictive control technique is developed. Concept of reconfigurable battery pack is studied using detailed simulations that can be used for effective monitoring and extension of battery pack life.

  11. Service-Based Extensions to an OAIS Archive for Science Data Management

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released their first version of a Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  12. The cast aluminum denture base. Part II: Technique.

    PubMed

    Halperin, A R; Halperin, G C

    1980-07-01

    A technique to wax up and cast an aluminum base, and a method to incorporate the base into the final denture, have been described. This technique does not use induction casting; rather, it uses two casting ovens and a centrifugal casting machine. PMID:6991680

  13. Liquid Tunable Microlenses based on MEMS techniques

    PubMed Central

    Zeng, Xuefeng; Jiang, Hongrui

    2013-01-01

    The recent rapid development in microlens technology has provided many opportunities for miniaturized optical systems, and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest as their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlens generally relies on the microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses are first discussed, followed by description of three main categories of tunable microlenses involving MEMS techniques, mechanically driven, electrically driven, and those integrated within microfluidic systems. PMID:24163480

  14. Liquid tunable microlenses based on MEMS techniques

    NASA Astrophysics Data System (ADS)

    Zeng, Xuefeng; Jiang, Hongrui

    2013-08-01

    The recent rapid development in microlens technology has provided many opportunities for miniaturized optical systems, and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest as their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlens generally relies on the microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses are first discussed, followed by description of three main categories of tunable microlenses involving MEMS techniques, mechanically driven, electrically driven and those integrated within microfluidic systems.

  15. Speech recognition based on pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Rabiner, Lawrence R.

    1990-05-01

    Algorithms for speech recognition can be characterized broadly as pattern recognition approaches and acoustic-phonetic approaches. To date, the greatest degree of success in speech recognition has been obtained using pattern recognition paradigms. Pattern recognition techniques have been applied to the problems of isolated word (or discrete utterance) recognition, connected word recognition, and continuous speech recognition. It is shown that understanding (and consequently the resulting recognizer performance) is best for the simplest recognition tasks and is considerably less well developed for large-scale recognition systems.
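A classic instance of the pattern-recognition paradigm for isolated-word recognition is dynamic time warping (DTW) against stored templates. The sketch below uses toy one-dimensional "feature" sequences; recognizers of this era matched frames of spectral features such as LPC or cepstral coefficients, but the matching logic is the same.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two feature sequences,
    the template-matching core of pattern-recognition ASR."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best alignment ending at (i, j): insertion, deletion, or match.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Isolated-word recognition: pick the reference template closest
# to the test utterance under DTW.
templates = {"yes": [1, 3, 5, 3, 1], "no": [5, 4, 3, 2, 1]}
test = [1, 1, 3, 5, 5, 3, 1]            # a time-warped "yes"
best = min(templates, key=lambda word: dtw_distance(templates[word], test))
```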

  16. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    NASA Astrophysics Data System (ADS)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
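The paper's lexical approximation step is not reproduced in the abstract, but the general idea (map an OOV token to a spelling or inflectional variant seen in training) can be sketched with a string-similarity lookup. The vocabulary, cutoff value, and helper name below are illustrative assumptions, not the authors' method.

```python
from difflib import get_close_matches

# Hypothetical training-corpus vocabulary.
vocab = {"translate", "translation", "house", "houses", "quickly"}

def replace_oov(tokens, vocab, cutoff=0.8):
    """Replace each OOV token with its closest in-vocabulary spelling or
    inflectional variant, if one is similar enough; otherwise keep it."""
    out = []
    for tok in tokens:
        if tok in vocab:
            out.append(tok)
        else:
            match = get_close_matches(tok, vocab, n=1, cutoff=cutoff)
            out.append(match[0] if match else tok)
    return out

# "translatin" and "houss" map to seen variants; "zzz" has no close match.
fixed = replace_oov(["translatin", "houss", "zzz"], vocab)
```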

  17. Feature-specific imaging: Extensions to adaptive object recognition and active illumination based scene reconstruction

    NASA Astrophysics Data System (ADS)

    Baheti, Pawan K.

    Computational imaging (CI) systems are hybrid imagers in which the optical and post-processing sub-systems are jointly optimized to maximize the task-specific performance. In this dissertation we consider a form of CI system that measures the linear projections (i.e., features) of the scene optically, and it is commonly referred to as feature-specific imaging (FSI). Most of the previous work on FSI has been concerned with image reconstruction. Previous FSI techniques have also been non-adaptive and restricted to the use of ambient illumination. We consider two novel extensions of the FSI system in this work. We first present an adaptive feature-specific imaging (AFSI) system and consider its application to a face-recognition task. The proposed system makes use of previous measurements to adapt the projection basis at each step. We present both statistical and information-theoretic adaptation mechanisms for the AFSI system. The sequential hypothesis testing framework is used to determine the number of measurements required for achieving a specified misclassification probability. We demonstrate that the AFSI system requires significantly fewer measurements than static FSI (SFSI) and conventional imaging at low signal-to-noise ratio (SNR). We also show a trade-off, in terms of average detection time, between measurement SNR and adaptation advantage. Experimental results validating the AFSI system are presented. Next we present a FSI system based on the use of structured light. Feature measurements are obtained by projecting spatially structured illumination onto an object and collecting all of the reflected light onto a single photodetector. We refer to this system as feature-specific structured imaging (FSSI). Principal component features are used to define the illumination patterns. The optimal LMMSE operator is used to generate object estimates from the measurements. We demonstrate that this new imaging approach reduces imager complexity and provides improved image
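The LMMSE reconstruction step mentioned above can be sketched for a generic feature-measurement model y = Fx + n, where F holds the projection patterns. The scene covariance, dimensions, and noise level below are invented for illustration; the dissertation's actual patterns are principal component features.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, sigma = 16, 4, 0.01                  # scene size, #features, noise std
Rx = np.diag(1.0 / np.arange(1, n + 1))    # assumed scene covariance prior
F = rng.standard_normal((m, n))            # feature (projection) matrix

x = rng.multivariate_normal(np.zeros(n), Rx)  # random scene
y = F @ x + sigma * rng.standard_normal(m)    # optical feature measurements

# LMMSE operator: x_hat = Rx F' (F Rx F' + sigma^2 I)^(-1) y
W = Rx @ F.T @ np.linalg.inv(F @ Rx @ F.T + sigma**2 * np.eye(m))
x_hat = W @ y
```

The posterior error covariance Rx - W F Rx quantifies how much the m feature measurements reduce uncertainty relative to the prior.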

  18. Trends and Techniques for Space Base Electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1979-01-01

    Simulations of various phosphorus and boron diffusions in SOS were completed, and a sputtering system, furnaces, and photolithography-related equipment were set up. Double-layer metal experiments initially utilized wet chemistry techniques. By incorporating ultrasonic etching of the vias, premetal cleaning with a modified buffered HF, phosphorus-doped vapox, and extended sintering, yields of 98% were obtained using the standard test pattern. A two-dimensional modeling program was written for simulating short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. Although the program is incomplete, a solution of the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2-D MOSFET simulation programs is summarized.

  19. Techniques for detumbling a disabled space base

    NASA Technical Reports Server (NTRS)

    Kaplan, M. H.

    1973-01-01

    Techniques and conceptual devices for carrying out detumbling operations are examined, and progress in the development of these concepts is discussed. Devices which reduce tumble to simple spin through active linear motion of a small mass are described, together with a Module for Automatic Dock and Detumble (MADD) that could perform an orbital transfer from the shuttle in order to track and dock at a preselected point on the distressed craft. Once docked, MADD could apply torques by firing thrusters to detumble the passive vehicle. Optimum combinations of mass-motion and external devices for various situations should be developed. The need to completely formulate the automatic control logic of MADD is also emphasized.

  20. A repository based on a dynamically extensible data model supporting multidisciplinary research in neuroscience

    PubMed Central

    2012-01-01

    Background Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is increasing demand from the neuroscience community for software tools addressing technical challenges such as: (i) supporting researchers in the medical field in carrying out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. Methods A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility issues from two different points of view. First, the improvement of data flexibility has been taken into account. This has been done through the development of a methodology for the dynamic creation and use of data types and related metadata, based on the definition of a "meta" data model. This way, users are not constrained to a set of predefined data, and the model is easily extensible and applicable to different contexts. Second, users have been enabled to easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed of two generic main objects: events and processes. Then, a repository has been built based on such a data model and structure, and deployed on distributed resources thanks to a Grid-based approach. Finally, data

  1. Accelerator based techniques for contraband detection

    NASA Astrophysics Data System (ADS)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays have the ability to penetrate various materials to large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and have the ability to image the object with an appreciable degree of reliability. Neutron induced reactions such as (n, γ), (n, n'), and (n, p), or proton induced γ-resonance absorption, are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed and their advantages, characteristics, and current progress are shown. Areas where use of these methods is currently under evaluation are detection of hidden explosives, illicit drug interdiction, chemical war agents identification, nuclear waste assay, nuclear weapons destruction and others.

  2. Field of view extension and truncation correction for MR-based human attenuation correction in simultaneous MR/PET imaging

    SciTech Connect

    Blumhagen, Jan O. Ladebeck, Ralf; Fenchel, Matthias; Braun, Harald; Quick, Harald H.; Faul, David; Scheffler, Klaus

    2014-02-15

    Purpose: In quantitative PET imaging, it is critical to accurately measure and compensate for the attenuation of the photons absorbed in the tissue. While in PET/CT the linear attenuation coefficients can be easily determined from a low-dose CT-based transmission scan, in whole-body MR/PET the computation of the linear attenuation coefficients is based on the MR data. However, a constraint of the MR-based attenuation correction (AC) is the MR-inherent field-of-view (FoV) limitation due to static magnetic field (B₀) inhomogeneities and gradient nonlinearities. Therefore, the MR-based human AC map may be truncated or geometrically distorted toward the edges of the FoV and, consequently, the PET reconstruction with MR-based AC may be biased. This is especially impactful laterally, where the patient's arms rest beside the body and are not fully captured. Methods: A method is proposed to extend the MR FoV by determining an optimal readout gradient field which locally compensates B₀ inhomogeneities and gradient nonlinearities. This technique was used to reduce truncation in AC maps of 12 patients, and the impact on the PET quantification was analyzed and compared to truncated data without applying the FoV extension and additionally to an established approach of PET-based FoV extension. Results: The truncation artifacts in the MR-based AC maps were successfully reduced in all patients, and the mean body volume was thereby increased by 5.4%. In some cases large patient-dependent changes in SUV of up to 30% were observed in individual lesions when compared to the standard truncated attenuation map. Conclusions: The proposed technique successfully extends the MR FoV in MR-based attenuation correction and shows an improvement of PET quantification in whole-body MR/PET hybrid imaging. In comparison to the PET-based completion of the truncated body contour, the proposed method is also applicable to specialized PET tracers with little uptake in the arms and might

  3. Development of a finite element based delamination analysis for laminates subject to extension, bending, and torsion

    NASA Technical Reports Server (NTRS)

    Hooper, Steven J.

    1989-01-01

    Delamination is a common failure mode of laminated composite materials. This type of failure frequently occurs at the free edges of laminates, where singular interlaminar stresses develop due to the difference in Poisson's ratios between adjacent plies. Typically the delaminations develop between 90 degree plies and adjacent angle plies. Edge delamination has been studied by several investigators using a variety of techniques. Recently, Chan and Ochoa applied the quasi-three-dimensional finite element model to the analysis of a laminate subject to bending, extension, and torsion. This problem is of particular significance to the structural integrity of composite helicopter rotors. The task undertaken was to incorporate Chan and Ochoa's formulation into the Raju Q3DG program. The resulting program is capable of modeling extension, bending, and torsional mechanical loadings as well as thermal and hygroscopic loadings. The addition of the torsional and bending loading capability will make it possible to perform a delamination analysis of a general unsymmetric laminate containing four cracks, each of a different length. The solutions obtained using this program are evaluated by comparing them with solutions from a full three-dimensional finite element solution. This comparison facilitates the assessment of three-dimensional effects such as the warping constraint imposed by the load frame grips. It also facilitates the evaluation of the external load representation employed in the Q3D formulation. Finally, strain energy release rates computed from the three-dimensional results are compared with those predicted using the quasi-three-dimensional formulation.

  4. FDI and Accommodation Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by means of a trained backpropagation NN to be applied to residual generation. Process supervision is then applied to discriminate faults in process sensors and process plant parameters. A rule-based expert system is used to implement the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system which could be used as the core of a SCADA system, or alternatively as a complementary supervision tool operating in parallel with the SCADA, when applied to a heat exchanger.

  5. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159

  6. Risk evaluation of bogie system based on extension theory and entropy weight method.

    PubMed

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159
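The entropy weight step named in the title follows a standard recipe: criteria whose values are more dispersed across samples carry more information and thus receive larger weights. A minimal sketch with a hypothetical inspection matrix (the real index system and data come from inspection records and expert evaluation):

```python
import numpy as np

def entropy_weights(X):
    """Objective criterion weights from a (samples x criteria) decision
    matrix of positive scores: more dispersed columns get more weight."""
    P = X / X.sum(axis=0)                     # column-wise proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(n)   # normalized entropy per criterion
    d = 1.0 - E                               # degree of divergence
    return d / d.sum()

# Hypothetical inspection scores of four bogie samples on three indices.
X = np.array([[0.9, 0.5, 0.8],
              [0.9, 0.6, 0.2],
              [0.9, 0.9, 0.5],
              [0.9, 0.4, 0.6]])
w = entropy_weights(X)  # first index is constant, so its weight is ~0
```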

  7. A novel protein complex identification algorithm based on Connected Affinity Clique Extension (CACE).

    PubMed

    Li, Peng; He, Tingting; Hu, Xiaohua; Zhao, Junmin; Shen, Xianjun; Zhang, Ming; Wang, Yan

    2014-06-01

    A novel algorithm based on Connected Affinity Clique Extension (CACE) for mining overlapping functional modules in protein interaction networks is proposed in this paper. In this approach, the value of protein connected affinity, which is inferred from protein complexes, is interpreted as the reliability and possibility of interaction. The protein interaction network is constructed as a weighted graph, and the weight depends on the connected affinity coefficient. The experimental results of CACE on two test data sets show that CACE can detect functional modules much more effectively and accurately than the state-of-the-art algorithms CPM and IPC-MCE. PMID:24803142
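The published CACE algorithm is not detailed in the abstract. A simplified greedy sketch of the general idea (extend a seed clique with the neighbor of highest mean affinity to the current module until no candidate exceeds a threshold) is shown below; the toy network, threshold, and function name are invented and are not the authors' implementation.

```python
def extend_clique(seed, weights, threshold=0.5):
    """Greedily grow a seed module in a weighted interaction graph:
    repeatedly add the outside node with the highest mean affinity to
    the current module while that affinity exceeds `threshold`."""
    module = set(seed)
    nodes = set(weights)
    while True:
        best, best_aff = None, threshold
        for v in nodes - module:
            affs = [weights[v].get(u, 0.0) for u in module]
            aff = sum(affs) / len(affs)
            if aff > best_aff:
                best, best_aff = v, aff
        if best is None:
            return module
        module.add(best)

# Toy weighted interaction network (weight = interaction reliability).
w = {
    "a": {"b": 0.9, "c": 0.8, "d": 0.7},
    "b": {"a": 0.9, "c": 0.85, "d": 0.6},
    "c": {"a": 0.8, "b": 0.85, "d": 0.1},
    "d": {"a": 0.7, "b": 0.6, "c": 0.1},
    "e": {"a": 0.2},
}
module = extend_clique({"a", "b"}, w, threshold=0.5)
```

Because modules are grown from different seeds independently, a node may end up in several modules, which is how clique-extension methods capture overlapping complexes.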

  8. Raising awareness of assistive technology in older adults through a community-based, cooperative extension program.

    PubMed

    Sellers, Debra M; Markham, Melinda Stafford

    2012-01-01

    The Fashion an Easier Lifestyle with Assistive Technology (FELAT) curriculum was developed as a needs-based, community educational program provided through a state Cooperative Extension Service. The overall goal for participants was to raise awareness of assistive technology. Program evaluation included a postassessment and subsequent interview to determine short-term knowledge gain and longer term behavior change. The sample consisted of mainly older, married females. The FELAT program was effective at raising awareness and increasing knowledge of assistive technology, and for many participants, the program acted as a catalyst for planning to or taking action related to assistive technology. PMID:22816976

  9. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  10. Non-Destructive Techniques Based on Eddy Current Testing

    PubMed Central

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future. PMID:22163754

  11. Non-destructive techniques based on eddy current testing.

    PubMed

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future. PMID:22163754

  12. Stress-dilatancy based modelling of granular materials and extensions to soils with crushable grains

    NASA Astrophysics Data System (ADS)

    Desimone, Antonio; Tamagnini, Claudio

    2005-01-01

    Stress-dilatancy relations have played a crucial role in the understanding of the mechanical behaviour of soils and in the development of realistic constitutive models for their response. Recent investigations on the mechanical behaviour of materials with crushable grains have called into question the validity of classical relations such as those used in critical state soil mechanics. In this paper, a method to construct thermodynamically consistent (isotropic, three-invariant) elasto-plastic models based on a given stress-dilatancy relation is discussed. Extensions to cover the case of granular materials with crushable grains are also presented, based on the interpretation of some classical model parameters (e.g. the stress ratio at critical state) as internal variables that evolve according to suitable hardening laws.

  13. In the Field: Increasing Undergraduate Students' Awareness of Extension through a Blended Project-Based Multimedia Production Course

    ERIC Educational Resources Information Center

    Loizzo, Jamie; Lillard, Patrick

    2015-01-01

    Undergraduate students at land-grant institutions across the country are often unaware of the depth and breadth of Extension services and careers. Agricultural communication students collaborated with an Extension programmatic team in a blended and project-based course at Purdue University to develop online videos about small farm agricultural…

  14. Evidence-Based Programming within Cooperative Extension: How Can We Maintain Program Fidelity While Adapting to Meet Local Needs?

    ERIC Educational Resources Information Center

    Olson, Jonathan R.; Welsh, Janet A.; Perkins, Daniel F.

    2015-01-01

    In this article, we describe how the recent movement towards evidence-based programming has impacted Extension. We review how the emphasis on implementing such programs with strict fidelity to an underlying program model may be at odds with Extension's strong history of adapting programming to meet the unique needs of children, youth, families,…

  15. Knowledge based systems: A critical survey of major concepts, issues and techniques. Visuals

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-9. The objectives of the report are to: examine various techniques used to build KBSs; examine at least one KBS in detail, i.e., a case study; list and identify limitations and problems with KBSs; suggest future areas of research; and provide extensive reference materials.

  16. A Word-Based Compression Technique for Text Files.

    ERIC Educational Resources Information Center

    Vernor, Russel L., III; Weiss, Stephen F.

    1978-01-01

    Presents a word-based technique for storing natural language text in compact form. The compressed text consists of a dictionary and a text that is a combination of actual running text and pointers to the dictionary. This technique has shown itself to be effective for both text storage and retrieval. (VT)
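The dictionary-plus-pointers scheme the abstract describes can be sketched in a few lines of Python. This is a minimal illustration assuming whitespace tokenization; the original technique's exact tokenizer and pointer encoding are not specified here:

```python
def compress(text: str):
    """Word-based compression: store each distinct word once in a
    dictionary and replace the running text with integer pointers."""
    dictionary, index, pointers = [], {}, []
    for word in text.split():
        if word not in index:
            index[word] = len(dictionary)
            dictionary.append(word)
        pointers.append(index[word])
    return dictionary, pointers

def decompress(dictionary, pointers) -> str:
    """Rebuild the text by following each pointer into the dictionary."""
    return " ".join(dictionary[p] for p in pointers)

dic, ptrs = compress("to be or not to be")
# repeated words ("to", "be") are stored once and referenced twice
```

Savings grow with repetition: each repeated word costs one pointer instead of its full spelling.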

  17. Principals Use Research-Based Techniques for Facilitating School Effectiveness.

    ERIC Educational Resources Information Center

    Hord, Shirley M.; Hall, Gene E.

    Research shows that principals with strong leadership qualities are a critical factor in effective schools. This paper describes three research based techniques that principals can use when making decisions about how to help teachers develop their skills. The Concerns Based Adoption Model (CBAM) is an empirically based conceptual framework that…

  18. Extensive aqueous deposits at the base of the dichotomy boundary in Nilosyrtis Mensae, Mars

    NASA Astrophysics Data System (ADS)

    Bandfield, Joshua L.; Amador, Elena S.

    2016-09-01

    Thermal emission imaging system (THEMIS) and Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) spectral datasets were used to identify high bulk SiO2 and hydrated compositions throughout the Nilosyrtis Mensae region. Four isolated locations were identified across the region showing short wavelength silicate absorptions within the 8-12 μm spectral region, indicating surfaces dominated by high Si phases. Much more extensive exposures of hydrated compositions are present throughout the region, indicated by a spectral absorption near 1.9 μm in CRISM data. Although limited in spatial coverage, detailed spectral observations indicate that the hydrated materials contain Fe/Mg-smectites and hydrated silica along with minor exposures of Mg-carbonates and an unidentified hydrated phase. The high SiO2 and hydrated materials are present in layered sediments near the base of topographic scarps at the hemispheric dichotomy boundary, typically near or within low albedo sand deposits. The source of the high SiO2 and hydrated materials appears to be from groundwater discharge from Nili Fossae and Syrtis Major to the south, where there is evidence for extensive aqueous alteration of the subsurface. Although discontinuous, the exposures of high SiO2 and hydrated materials span a wide area and are present in a similar geomorphological context to previously identified deposits in western Hellas Basin. These regional deposits may reflect aqueous conditions and alteration within the adjacent crust of the martian highlands.

  19. Block Copolymer-Based Supramolecular Elastomers with High Extensibility and Large Stress Generation Capability

    NASA Astrophysics Data System (ADS)

    Noro, Atsushi; Hayashi, Mikihiro

We prepared block copolymer-based supramolecular elastomers with high extensibility and large stress generation capability. Reversible addition-fragmentation chain transfer polymerizations were conducted under normal pressure and high pressure to synthesize several large-molecular-weight polystyrene-b-[poly(butyl acrylate)-co-polyacrylamide]-b-polystyrene (S-Ba-S) block copolymers. Tensile tests revealed that the largest S-Ba-S, with a middle block molecular weight of 3140k, achieved a breaking elongation of over 2000% with a maximum tensile stress of 3.6 MPa and a toughness of 28 MJ/m3, while the reference sample without any middle block hydrogen bonds, polystyrene-b-poly(butyl acrylate)-b-polystyrene with almost the same molecular weight, was merely viscous and not self-standing. Hence, incorporation of hydrogen bonds into a long soft middle block was found to be beneficial for attaining high extensibility and large stress generation capability, probably due to a concerted combination of entropic changes and internal potential energy changes originating from the dissociation of multiple hydrogen bonds by elongation. This work was supported by JSPS KAKENHI Grant Numbers 13J02357, 24685035, 15K13785, and 23655213 for M.H. and A.N. A.N. also expresses his gratitude for the Tanaka Rubber Science & Technology Award by the Enokagaku-Shinko Foundation, Japan.

  20. Informational Theory of Aging: The Life Extension Method Based on the Bone Marrow Transplantation

    PubMed Central

    Karnaukhov, Alexey V.; Karnaukhova, Elena V.; Sergievich, Larisa A.; Karnaukhova, Natalia A.; Bogdanenko, Elena V.; Manokhina, Irina A.; Karnaukhov, Valery N.

    2015-01-01

The method of lifespan extension that is a practical application of the informational theory of aging is proposed. In this theory, the degradation (error accumulation) of the genetic information in cells is considered the main cause of aging. Accordingly, our method is based on the transplantation of genetically identical (or similar) stem cells with a lower number of genomic errors into old recipients. For humans and large mammals, this method can be realized by cryopreservation of their own stem cells, taken at a young age, for later autologous transplantation in old age. To test this method experimentally, we chose laboratory animals with a relatively short lifespan (mouse). Because it is difficult to isolate the required amount of stem cells (e.g., bone marrow) without significant damage to the animals, we used bone marrow transplantation from sacrificed inbred young donors. It is shown that the lifespan extension of recipients depends on the level of their genetic similarity (syngeneity) with donors. We achieved a lifespan increase of the experimental mice by 34% when transplantation of bone marrow with a high level of genetic similarity was used. PMID:26491435

  1. The detection of bulk explosives using nuclear-based techniques

    SciTech Connect

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  2. Application of glyph-based techniques for multivariate engineering visualization

    NASA Astrophysics Data System (ADS)

    Glazar, Vladimir; Marunic, Gordana; Percic, Marko; Butkovic, Zlatko

    2016-01-01

    This article presents a review of glyph-based techniques for engineering visualization as well as practical application for the multivariate visualization process. Two glyph techniques, Chernoff faces and star glyphs, uncommonly used in engineering practice, are described, applied to the selected data set, run through the chosen optimization methods and user evaluated. As an example of how these techniques function, a set of data for the optimization of a heat exchanger with a microchannel coil is adopted for visualization. The results acquired by the chosen visualization techniques are related to the results of optimization carried out by the response surface method and compared with the results of user evaluation. Based on the data set from engineering research and practice, the advantages and disadvantages of these techniques for engineering visualization are identified and discussed.
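A star glyph encodes one multivariate observation as rays at fixed angles whose lengths are proportional to the (normalized) variable values. A minimal sketch of the vertex computation only (the `star_glyph` helper and the four-variable example are hypothetical, not taken from the article; rendering is left to any plotting library):

```python
import math

def star_glyph(values, max_values):
    """Map one multivariate observation to star-glyph vertices: variable
    i becomes a ray at angle 2*pi*i/n with length value/max_value."""
    n = len(values)
    vertices = []
    for i, (v, vmax) in enumerate(zip(values, max_values)):
        angle = 2.0 * math.pi * i / n
        r = v / vmax  # normalize to [0, 1] so variables are comparable
        vertices.append((r * math.cos(angle), r * math.sin(angle)))
    return vertices

# one hypothetical heat-exchanger design described by 4 variables
verts = star_glyph([3.0, 1.0, 2.0, 4.0], [4.0, 4.0, 4.0, 4.0])
```

Connecting consecutive vertices yields the glyph outline; a grid of such outlines lets designs be compared at a glance.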

  3. Adaptive Thresholding Technique for Retinal Vessel Segmentation Based on GLCM-Energy Information

    PubMed Central

    Mapayi, Temitope; Viriri, Serestina; Tapamo, Jules-Raymond

    2015-01-01

Although retinal vessel segmentation has been extensively researched, a robust and time-efficient segmentation method is highly needed. This paper presents a local adaptive thresholding technique based on gray level co-occurrence matrix (GLCM) energy information for retinal vessel segmentation. Different thresholds were computed using GLCM energy information. An experimental evaluation on the DRIVE database using the grayscale intensity and green channel of the retinal image demonstrates the high performance of the proposed local adaptive thresholding technique. Maximum average accuracy rates of 0.9511 and 0.9510, with maximum average sensitivity rates of 0.7650 and 0.7641, were achieved on the DRIVE and STARE databases, respectively. Compared to the widely used previous techniques on these databases, the proposed adaptive thresholding technique is time efficient, with higher average sensitivity and accuracy rates while maintaining specificity in the same very good range. PMID:25802550
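GLCM energy (angular second moment) is the sum of squared normalized co-occurrence probabilities; local thresholds in such a technique are derived from this statistic per window. A minimal NumPy sketch for a single horizontal offset (a simplified illustration, not the paper's implementation; the quantization scheme is an assumption):

```python
import numpy as np

def glcm_energy(img: np.ndarray, levels: int = 8) -> float:
    """Energy (angular second moment) of a gray level co-occurrence
    matrix for the horizontal offset (0, 1): the sum of squared
    normalized co-occurrence probabilities. A perfectly uniform patch
    gives the maximum energy of 1.0; mixed textures give less."""
    # quantize intensities down to `levels` gray levels
    q = (img.astype(float) * levels / (img.max() + 1)).astype(int)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1  # count horizontally adjacent gray-level pairs
    p = glcm / glcm.sum()
    return float((p ** 2).sum())
```

In an adaptive scheme this statistic would be computed per local window, and the window's threshold chosen as a function of it.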

  4. A technique to identify solvable dynamical systems, and another solvable extension of the goldfish many-body problem

    NASA Astrophysics Data System (ADS)

    Calogero, Francesco

    2004-12-01

We take advantage of the simple approach, recently discussed, which associates to (solvable) matrix equations (solvable) dynamical systems interpretable as (interesting) many-body problems, possibly involving auxiliary dependent variables in addition to those identifying the positions of the moving particles. Starting from a solvable matrix evolution equation, we obtain the corresponding many-body model and note that in one case the auxiliary variables can be altogether eliminated, obtaining thereby an (also Hamiltonian) extension of the "goldfish" model. The solvability of this novel model, and of its isochronous variant, is exhibited. A related, also solvable, model is introduced as well, together with its isochronous variant. Finally, the small oscillations of the isochronous models around their equilibrium configurations are investigated, and from their isochronicity certain diophantine relations are evinced.
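For reference, the classical goldfish model that such extensions build on is the Newtonian many-body system (positions $z_n$ in the complex plane), in its standard form:

```latex
\ddot{z}_n \;=\; 2 \sum_{\substack{m=1 \\ m \neq n}}^{N}
\frac{\dot{z}_n \, \dot{z}_m}{z_n - z_m},
\qquad n = 1, \dots, N .
```

Its solvability rests on the fact that the motion can be obtained from the roots of a time-dependent polynomial equation, the mechanism the matrix-equation approach generalizes.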

  5. Rule Induction with Extension Matrices.

    ERIC Educational Resources Information Center

    Wu, Xindong

    1998-01-01

    Presents a heuristic, attribute-based, noise-tolerant data mining program, HCV (Version 2.0) based on the newly-developed extension matrix approach. Outlines some techniques implemented in the HCV program for noise handling and discretization of continuous domains; an empirical comparison shows that rules generated by HCV are more compact than the…

  6. Community-based Ontology Development, Annotation and Discussion with MediaWiki extension Ontokiwi and Ontokiwi-based Ontobedia

    PubMed Central

    Ong, Edison; He, Yongqun

    2016-01-01

Hundreds of biological and biomedical ontologies have been developed to support data standardization, integration and analysis. Although ontologies are typically developed for community usage, community efforts in ontology development are limited. To support ontology visualization, distribution, and community-based annotation and development, we have developed Ontokiwi, an ontology extension to the MediaWiki software. Ontokiwi displays hierarchical classes and ontological axioms. Ontology classes and axioms can be edited and added using the Ontokiwi form or the MediaWiki source editor. Ontokiwi also inherits MediaWiki features such as Wikitext editing and version control. Based on the Ontokiwi/MediaWiki software package, we have developed Ontobedia, which aims to support community-based development and annotation of biological and biomedical ontologies. As demonstrations, we have loaded the Ontology of Adverse Events (OAE) and the Cell Line Ontology (CLO) into Ontobedia. Our studies showed that Ontobedia achieved the expected Ontokiwi features. PMID:27570653

  7. Community-based Ontology Development, Annotation and Discussion with MediaWiki extension Ontokiwi and Ontokiwi-based Ontobedia.

    PubMed

    Ong, Edison; He, Yongqun

    2016-01-01

Hundreds of biological and biomedical ontologies have been developed to support data standardization, integration and analysis. Although ontologies are typically developed for community usage, community efforts in ontology development are limited. To support ontology visualization, distribution, and community-based annotation and development, we have developed Ontokiwi, an ontology extension to the MediaWiki software. Ontokiwi displays hierarchical classes and ontological axioms. Ontology classes and axioms can be edited and added using the Ontokiwi form or the MediaWiki source editor. Ontokiwi also inherits MediaWiki features such as Wikitext editing and version control. Based on the Ontokiwi/MediaWiki software package, we have developed Ontobedia, which aims to support community-based development and annotation of biological and biomedical ontologies. As demonstrations, we have loaded the Ontology of Adverse Events (OAE) and the Cell Line Ontology (CLO) into Ontobedia. Our studies showed that Ontobedia achieved the expected Ontokiwi features. PMID:27570653

  8. Protein-protein interactions prediction based on iterative clique extension with gene ontology filtering.

    PubMed

    Yang, Lei; Tang, Xianglong

    2014-01-01

Cliques (maximal complete subnets) in a protein-protein interaction (PPI) network are an important resource used to analyze protein complexes and functional modules. Clique-based methods of predicting PPIs compensate for data deficiencies in biological experiments. However, clique-based prediction methods depend only on the topology of the network, and the false-positive and false-negative interactions in a network usually interfere with prediction. Therefore, we propose a method combining clique-based prediction with gene ontology (GO) annotations to overcome this shortcoming and improve the accuracy of predictions. According to different GO correcting rules, we generate two predicted interaction sets which guarantee the quality and quantity of predicted protein interactions. The proposed method is applied to the PPI network from the Database of Interacting Proteins (DIP), and most of the predicted interactions are verified by another biological database, BioGRID. The predicted protein interactions are appended to the original protein network, which leads to clique extension and shows the significance of biological meaning. PMID:24578640
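The clique-extension idea can be sketched as: enumerate maximal cliques, then predict an edge whenever an outside protein is connected to all but one member of a clique, keeping only candidates whose proteins share an annotation term. The following toy Python sketch is illustrative only; the `predict_interactions` helper, its single-term GO filter, and the annotation format are assumptions, not the paper's actual correcting rules:

```python
def bron_kerbosch(r, p, x, adj, cliques):
    """Enumerate maximal cliques (Bron-Kerbosch without pivoting)."""
    if not p and not x:
        cliques.append(sorted(r))
        return
    for v in list(p):
        bron_kerbosch(r | {v}, p & adj[v], x & adj[v], adj, cliques)
        p.remove(v)
        x.add(v)

def predict_interactions(edges, annotations):
    """Toy clique-extension prediction: if a protein w is linked to all
    but one member u of a maximal clique, predict the missing pair
    (w, u), provided the two proteins share an annotation term."""
    nodes = {u for e in edges for u in e}
    adj = {u: set() for u in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    cliques = []
    bron_kerbosch(set(), set(nodes), set(), adj, cliques)
    predicted = set()
    for clique in cliques:
        for w in nodes - set(clique):
            missing = [u for u in clique if u not in adj[w]]
            if len(missing) == 1 and annotations[w] & annotations[missing[0]]:
                predicted.add(tuple(sorted((w, missing[0]))))
    return predicted
```

On a triangle A-B-C plus a node D linked to A and B, the topology alone suggests the pair (C, D); the annotation filter then accepts or rejects it depending on shared terms.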

  9. Extensibility in local sensor based planning for hyper-redundant manipulators (robot snakes)

    NASA Technical Reports Server (NTRS)

    Choset, Howie; Burdick, Joel

    1994-01-01

Partial Shape Modification (PSM) is a local sensor feedback method used for hyper-redundant robot manipulators, in which the redundancy is very large or infinite, as in a robot snake. This redundancy enables local obstacle avoidance and end-effector placement in real time. Due to the large number of joints or actuators in a hyper-redundant manipulator, small displacement errors easily accumulate into large errors in the position of the tip relative to the base. Accuracy can be improved by a local sensor-based planning method in which sensors are distributed along the length of the hyper-redundant robot. This paper extends the local sensor-based planning strategy beyond the limitations of the fixed length of such a manipulator when its joint limits are met. This is achieved with an algorithm in which the length of the deforming part of the robot is variable. Thus, the robot's local avoidance of obstacles is improved through the enhancement of its extensibility.

  10. Efficient Plant Supervision Strategy Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; Rolle, Jose Luis Calvo; Castelo, Francisco Javier Perez

Most non-linear type-one and type-two control systems suffer from a lack of detectability when model-based techniques are applied to FDI (fault detection and isolation) tasks. In general, all types of processes suffer from a lack of detectability, also due to the ambiguity in discriminating between the process, sensors, and actuators in order to isolate any given fault. This work deals with a strategy to detect and isolate faults which combines massive neural-network-based functional approximation procedures with recursive rule-based techniques applied in a parity space approach.

  11. EMAAS: An extensible grid-based Rich Internet Application for microarray data analysis and management

    PubMed Central

    Barton, G; Abbott, J; Chiba, N; Huang, DW; Huang, Y; Krznaric, M; Mack-Smith, J; Saleem, A; Sherman, BT; Tiwari, B; Tomlinson, C; Aitman, T; Darlington, J; Game, L; Sternberg, MJE; Butcher, SA

    2008-01-01

    Background Microarray experimentation requires the application of complex analysis methods as well as the use of non-trivial computer technologies to manage the resultant large data sets. This, together with the proliferation of tools and techniques for microarray data analysis, makes it very challenging for a laboratory scientist to keep up-to-date with the latest developments in this field. Our aim was to develop a distributed e-support system for microarray data analysis and management. Results EMAAS (Extensible MicroArray Analysis System) is a multi-user rich internet application (RIA) providing simple, robust access to up-to-date resources for microarray data storage and analysis, combined with integrated tools to optimise real time user support and training. The system leverages the power of distributed computing to perform microarray analyses, and provides seamless access to resources located at various remote facilities. The EMAAS framework allows users to import microarray data from several sources to an underlying database, to pre-process, quality assess and analyse the data, to perform functional analyses, and to track data analysis steps, all through a single easy to use web portal. This interface offers distance support to users both in the form of video tutorials and via live screen feeds using the web conferencing tool EVO. A number of analysis packages, including R-Bioconductor and Affymetrix Power Tools have been integrated on the server side and are available programmatically through the Postgres-PLR library or on grid compute clusters. Integrated distributed resources include the functional annotation tool DAVID, GeneCards and the microarray data repositories GEO, CELSIUS and MiMiR. EMAAS currently supports analysis of Affymetrix 3' and Exon expression arrays, and the system is extensible to cater for other microarray and transcriptomic platforms. Conclusion EMAAS enables users to track and perform microarray data management and analysis tasks

  12. Comparison of ITS, RAPD and ISSR from DNA-based genetic diversity techniques.

    PubMed

    Poyraz, Ismail

    2016-01-01

ITS, RAPD-PCR and ISSR-PCR are among the most popular DNA-based techniques and are extensively applied to determine the genetic diversity of a species among populations. However, especially for organisms with high genetic polymorphism, the phylogenetic trees drawn from the results of these techniques may differ. To find a meaningful phylogenetic tree, the trees obtained from these different techniques should be compared with the geographic locations of the populations. Lichens have high genetic polymorphism and tolerance to different environmental conditions. In this study, these three DNA-based genetic diversity techniques were compared using different populations of a lichen species (Xanthoria parietina). X. parietina was chosen specifically because of its high genetic diversity in narrow zones. Lichen samples were collected from ten different locations in a narrow transitional climate zone, Bilecik (Turkey). Statistical analyses of all results were calculated using UPGMA analysis. Phylogenetic trees for each technique were drawn and transferred to the Bilecik map for comparative analysis. The results of the three techniques allowed us to verify that populations of X. parietina have high genetic variety in a narrow zone, but the phylogenetic trees obtained from these results were found to be very different. Our comparative analysis demonstrated that the results of these techniques are not similar and have critical differences. We observed that the ITS method provides clearer data and is more successful in genetic diversity analyses of more widely separated populations, in contrast to the ISSR-PCR and RAPD-PCR methods. PMID:27156497
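The UPGMA analysis mentioned above repeatedly merges the two closest clusters and recomputes distances to the merged cluster as size-weighted averages. A minimal sketch, assuming a precomputed distance matrix and returning only the tree topology as nested tuples (the paper's actual statistical software is not specified):

```python
def upgma(dist, names):
    """Minimal UPGMA: merge the two closest clusters until one remains,
    recomputing distances as size-weighted averages (no branch lengths)."""
    clusters = {i: (names[i], 1) for i in range(len(names))}  # id -> (tree, size)
    d = {(i, j): dist[i][j] for i in clusters for j in clusters if i < j}
    next_id = len(names)
    while len(clusters) > 1:
        a, b = min(d, key=d.get)          # closest pair of clusters
        ta, na = clusters.pop(a)
        tb, nb = clusters.pop(b)
        for k in clusters:                # average distance to the merge
            dak = d.pop((min(a, k), max(a, k)))
            dbk = d.pop((min(b, k), max(b, k)))
            d[(min(next_id, k), max(next_id, k))] = (na * dak + nb * dbk) / (na + nb)
        del d[(a, b)]
        clusters[next_id] = ((ta, tb), na + nb)
        next_id += 1
    (tree, _), = clusters.values()
    return tree
```

On three taxa with d(A,B)=2 and d(A,C)=d(B,C)=8, A and B merge first, so the topology groups them before attaching C.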

  13. A Lyapunov-Based Extension to Particle Swarm Dynamics for Continuous Function Optimization

    PubMed Central

    Bhattacharya, Sayantani; Konar, Amit; Das, Swagatam; Han, Sang Yong

    2009-01-01

The paper proposes three alternative extensions to the classical global-best particle swarm optimization dynamics and compares their relative performance with the standard particle swarm algorithm. The first extension, which readily follows from the well-known Lyapunov stability theorem, provides a mathematical basis for the particle dynamics with guaranteed convergence at an optimum. The inclusion of local and global attractors in this dynamics leads to faster convergence speed and better accuracy than the classical one. The second extension augments the velocity adaptation equation with a negative randomly weighted positional term of the individual particle, while the third extension considers the negative positional term in place of the inertial term. Computer simulations further reveal that the last two extensions outperform both the classical algorithm and the first extension in terms of convergence speed and accuracy. PMID:22303158
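The classical global-best dynamics that these extensions modify can be sketched as follows: each particle's velocity is pulled toward its personal best and the swarm's global best. A minimal illustration with conventional parameter values (the sphere test function and all names are assumptions, not the paper's benchmarks):

```python
import random

def gbest_pso(f, dim=2, n_particles=20, iters=300, seed=1,
              w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Classical global-best PSO: velocity = inertia + cognitive pull
    toward the personal best + social pull toward the global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

# minimize the sphere function f(x) = sum(x_i^2), optimum at the origin
best, val = gbest_pso(lambda x: sum(t * t for t in x))
```

The paper's second and third extensions would alter the velocity update line (adding a negative randomly weighted positional term, or substituting it for the inertia term).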

  14. PseudoBase++: an extension of PseudoBase for easy searching, formatting and visualization of pseudoknots

    PubMed Central

    Taufer, Michela; Licon, Abel; Araiza, Roberto; Mireles, David; van Batenburg, F. H. D.; Gultyaev, Alexander P.; Leung, Ming-Ying

    2009-01-01

    Pseudoknots have been recognized to be an important type of RNA secondary structures responsible for many biological functions. PseudoBase, a widely used database of pseudoknot secondary structures developed at Leiden University, contains over 250 records of pseudoknots obtained in the past 25 years through crystallography, NMR, mutational experiments and sequence comparisons. To promptly address the growing analysis requests of the researchers on RNA structures and bring together information from multiple sources across the Internet to a single platform, we designed and implemented PseudoBase++, an extension of PseudoBase for easy searching, formatting and visualization of pseudoknots. PseudoBase++ (http://pseudobaseplusplus.utep.edu) maps the PseudoBase dataset into a searchable relational database including additional functionalities such as pseudoknot type. PseudoBase++ links each pseudoknot in PseudoBase to the GenBank record of the corresponding nucleotide sequence and allows scientists to automatically visualize RNA secondary structures with PseudoViewer. It also includes the capabilities of fine-grained reference searching and collecting new pseudoknot information. PMID:18988624

  15. An ionospheric occultation inversion technique based on epoch difference

    NASA Astrophysics Data System (ADS)

    Lin, Jian; Xiong, Jing; Zhu, Fuying; Yang, Jian; Qiao, Xuejun

    2013-09-01

Of the ionospheric radio occultation (IRO) electron density profile (EDP) retrieval methods, the Abel-based calibrated TEC inversion (CTI) is the most widely used. To eliminate the contribution from altitudes above the RO satellite, it is necessary to use the calibrated TEC to retrieve the EDP, which introduces error due to the coplanar assumption. In this paper, a new technique based on epoch difference inversion (EDI) is proposed to eliminate this error. Comparisons between CTI and EDI have been made using simulated and real COSMIC data. The following conclusions can be drawn: the EDI technique can successfully retrieve EDPs without non-occultation-side measurements and shows better performance than the CTI method, especially for lower-orbit missions; no matter which technique is used, the inversion results at higher altitudes are better than those at lower altitudes, which can be explained theoretically.

  16. Diode laser based water vapor DIAL using modulated pulse technique

    NASA Astrophysics Data System (ADS)

    Pham, Phong Le Hoai; Abo, Makoto

    2014-11-01

In this paper, we propose a diode laser based differential absorption lidar (DIAL) for measuring the lower-tropospheric water vapor profile using a modulated pulse technique. The transmitter is based on a single-mode diode laser and a tapered semiconductor optical amplifier with a peak power of 10 W in the 800 nm absorption band, and the receiver telescope diameter is 35 cm. The selected wavelengths are compared to reference wavelengths in terms of random and systematic errors. The key component of the modulated pulse technique, a macropulse, is generated at a repetition rate of 10 kHz, and the modulation within the macropulse is coded according to a pseudorandom sequence with a 100 ns chip width. We evaluate both the single pulse modulation and the pseudorandom coded pulse modulation techniques. The water vapor profiles derived from these modulation techniques are compared to real observation data from summer in Japan.
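A pseudorandom code such as the one described (100 ns chips inside the macropulse) is conventionally generated with a linear-feedback shift register; the abstract does not say how the authors generated theirs, so this PRBS7-style sketch (taps at stages 7 and 6, period 2^7 − 1 = 127 chips) is illustrative only:

```python
def prbs(taps, nbits, length):
    """Pseudorandom binary sequence from a Fibonacci LFSR seeded with
    all ones. `taps` are 1-indexed stage positions XORed for feedback;
    with primitive taps the sequence is maximal-length (period 2^n - 1)."""
    state = [1] * nbits
    out = []
    for _ in range(length):
        bit = 0
        for t in taps:
            bit ^= state[t - 1]
        out.append(state[-1])          # output the oldest stage
        state = [bit] + state[:-1]     # shift the feedback bit in
    return out

# two full periods of a PRBS7-style code: 127 chips per period,
# so at a 100 ns chip width one period spans 12.7 microseconds
seq = prbs([7, 6], 7, 254)
```

Correlating the received signal against the known code recovers range information while spreading the transmitted energy over the macropulse.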

  17. An improved method of gene synthesis based on DNA works software and overlap extension PCR.

    PubMed

    Dong, Bingxue; Mao, Runqian; Li, Baojian; Liu, Qiuyun; Xu, Peilin; Li, Gang

    2007-11-01

A bottleneck in recent gene synthesis technologies is the high cost of oligonucleotide synthesis and post-synthesis sequencing. In this article, a simple and rapid method for low-cost gene synthesis was developed based on the DNAWorks program and an improved single-step overlap extension PCR (OE-PCR). This method enables any DNA sequence to be synthesized with few errors; any mutated sites can then be corrected by site-specific mutagenesis or by a PCR amplification-assembly method, which amplifies different fragments of the target gene and assembles them into an entire gene through their overlapping regions. Eventually, a full-length, error-free DNA sequence was obtained via this novel method. Our method is simple, rapid and low-cost, and also easily amenable to automation, based on the DNAWorks design program and a defined set of OE-PCR reaction conditions suitable for different genes. Using this method, several genes, including the manganese peroxidase gene (Mnp) of Phanerochaete chrysosporium (P. chrysosporium), the laccase gene (Lac) of Trametes versicolor (T. versicolor) and the Cip1 peroxidase gene (cip 1) of Coprinus cinereus (C. cinereus), with sizes ranging from 1.0 kb to 1.5 kb, have been synthesized successfully. PMID:17952664
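The assembly-through-overlap step can be modeled in a few lines: two fragments sharing a terminal overlap are joined into one product. A toy sketch only (the sequences and the `min_overlap` threshold are hypothetical; real OE-PCR outcomes depend on primer design, annealing temperatures and polymerase fidelity):

```python
def overlap_extend(frag_a: str, frag_b: str, min_overlap: int = 15) -> str:
    """Toy model of one overlap-extension step: if the 3' end of frag_a
    matches the 5' end of frag_b, the two anneal there and are
    extended into a single product."""
    max_k = min(len(frag_a), len(frag_b))
    for k in range(max_k, min_overlap - 1, -1):  # prefer the longest overlap
        if frag_a[-k:] == frag_b[:k]:
            return frag_a + frag_b[k:]
    raise ValueError("no overlap of at least %d bases" % min_overlap)

# hypothetical fragments sharing a 15-base overlap (GACCTGGTTCAAGGA)
a = "ATGGCTAGCTAGGACCTGGTTCAAGGA"
b = "GACCTGGTTCAAGGATTACGCGTTAA"
product = overlap_extend(a, b)
```

Repeating the step pairwise over a ladder of overlapping oligos builds up the full-length gene.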

  18. Spatial representativeness of ground-based solar radiation measurements - Extension to the full Meteosat disk

    NASA Astrophysics Data System (ADS)

    Zyta Hakuba, Maria; Folini, Doris; Sanchez-Lorenzo, Arturo; Wild, Martin

    2015-04-01

The spatial representativeness of a point measurement of surface solar radiation (SSR) of its larger-scale surrounding, e.g. a collocated grid cell, is a potential source of uncertainty in the validation of climate models and satellite products. Here, we expand our previous study over Europe to the entire Meteosat disk, covering additional climate zones in Africa, the Middle East, and South America between 70°W and 70°E and between 70°S and 70°N. Using a high-resolution (0.03°) satellite-based SSR dataset (2001-2005), we quantify the spatial subgrid variability in grids of 1° and 3° resolution and the spatial representativeness of 887 surface sites with respect to site-centered surroundings of variable size. In the multi-annual mean, the subgrid variability is largest in some mountainous and coastal regions, but varies seasonally due to changes in the ITCZ location. The absolute mean representation errors at the surface sites with respect to surroundings of 1° and 3° are on average 1-2. Reference: Hakuba, M. Z., D. Folini, A. Sanchez-Lorenzo, and M. Wild, Spatial representativeness of ground-based solar radiation measurements - Extension to the full Meteosat disk, J. Geophys. Res. Atmos., 119, doi:10.1002/2014JD021946, 2014.

  19. Advanced airfoil design empirically based transonic aircraft drag buildup technique

    NASA Technical Reports Server (NTRS)

    Morrison, W. D., Jr.

    1976-01-01

To systematically investigate the potential of advanced airfoils in advanced preliminary design studies, empirical relationships were derived, based on available wind tunnel test data, through which total drag is determined, recognizing all major aircraft geometric variables. This technique recognizes a single design lift coefficient and Mach number for each aircraft. Using this technique, drag polars are derived for all Mach numbers up to M_design + 0.05 and for lift coefficients from C_L,design - 0.40 to C_L,design + 0.20.

  20. Intramuscular injection technique: an evidence-based approach.

    PubMed

    Ogston-Tuck, Sherri

    2014-09-30

    Intramuscular injections require a thorough and meticulous approach to patient assessment and injection technique. This article, the second in a series of two, reviews the evidence base to inform safer practice and to consider the evidence for nursing practice in this area. A framework for safe practice is included, identifying important points for safe technique, patient care and clinical decision making. It also highlights the ongoing debate in selection of intramuscular injection sites, predominately the ventrogluteal and dorsogluteal muscles. PMID:25249123

  1. Image analysis techniques associated with automatic data base generation.

    NASA Technical Reports Server (NTRS)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  2. Efficiency of Integrated Geophysical techniques in delineating the extension of Bauxites ore in north Riyadh, Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Almutairi, Yasir; Alanazi, Abdulrahman; Almutairi, Muteb; Alsama, Ali; Alhenaki, Bander; Almalki, Awadh

    2014-05-01

    We exploit the integration of ground-penetrating radar (GPR), magnetic gradiometry, resistivity measurements, and seismic tomography in a high-resolution, non-invasive study to delineate the subsurface bauxite layer in the Zabira locality, north of Riyadh. Integrated GPR, magnetic gradiometry, resistivity, and seismic refraction are used in the case of high-contrast targets and provide an accurate subsurface reconstruction of foundations in sediments. Resistivity pseudo-sections are particularly useful for the areal identification of contacts between soils and foundations, while GPR and magnetic gradiometry provide detailed information about the location and depth of the structures. Results obtained by GPR, magnetics, and resistivity show very good agreement in mapping the bauxite layer at depths of 5 m to 10 m, whereas the depth obtained by seismic refraction was 10 m to 15 m due to a lack of velocity information.

  3. Real-time neural network based camera localization and its extension to mobile robot control.

    PubMed

    Choi, D H; Oh, S Y

    1997-06-01

    The feasibility of using neural networks for camera localization and mobile robot control is investigated here. This approach has the advantage of eliminating the laborious and error-prone process of imaging-system modeling and calibration. Two different approaches to using neural networks are introduced: one is a hybrid approach combining neural networks with the pinhole-based analytic solution, while the other is purely neural network based. These techniques have been tested and compared through both simulation and real-time experiments and are shown to yield more precise localization than analytic approaches. Furthermore, this neural localization method is shown to be directly applicable to the navigation control of an experimental mobile robot along a hallway, guided purely by a dark wall strip. It also facilitates multi-sensor fusion for control through the use of multiple sensors of different types, owing to the network's capability of learning without models. PMID:9427102

  4. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications, including tissue engineering, stem cell, and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins, and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell-patterning techniques. Most work to date has focused not on applications of the technique but on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell, and cancer research are highlighted. Particular attention is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  5. Contraction-based classification of supersymmetric extensions of kinematical lie algebras

    SciTech Connect

    Campoamor-Stursberg, R.; Rausch de Traubenberg, M.

    2010-02-15

    We study supersymmetric extensions of classical kinematical algebras from the point of view of contraction theory. It is shown that contracting the supersymmetric extension of the anti-de Sitter algebra leads to a hierarchy similar in structure to the classical Bacry-Levy-Leblond classification.
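
    The contraction mentioned above follows the İnönü-Wigner pattern. As a generic illustration (not the authors' superalgebra computation, and up to sign conventions), rescaling the AdS translation generators as P_μ = ε J_{μ4} yields the Poincaré algebra in the ε → 0 limit:

```latex
% Generic Inonu-Wigner contraction sketch (illustration only):
% split g = h \oplus p and rescale the p generators by \epsilon.
\begin{align*}
  [J_{\mu\nu}, J_{\rho\sigma}] &= \eta_{\nu\rho}J_{\mu\sigma}
    - \eta_{\mu\rho}J_{\nu\sigma} + \eta_{\mu\sigma}J_{\nu\rho}
    - \eta_{\nu\sigma}J_{\mu\rho}, \\
  [J_{\mu\nu}, P_\rho] &= \eta_{\nu\rho}P_\mu - \eta_{\mu\rho}P_\nu, \\
  [P_\mu, P_\nu] &= \epsilon^2\, J_{\mu\nu}
    \;\xrightarrow{\;\epsilon \to 0\;}\; 0 .
\end{align*}
```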

  6. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    ERIC Educational Resources Information Center

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  7. The Role of Extension Specialists in Helping Entrepreneurs Develop Successful Food-Based Businesses.

    ERIC Educational Resources Information Center

    Holcomb, Rodney; Muske, Glenn

    2000-01-01

    Three areas in which extension specialists can assist food industry entrepreneurs include (1) awareness of the components of a business plan, (2) pro forma financial analysis, and (3) legal issues affecting the food industry. In addition to specialized expertise, extension professionals can help with making contacts, objectively review business…

  8. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  9. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework

    PubMed Central

    Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu

    2016-01-01

    Sign language recognition (SLR) can provide a helpful tool for communication between the deaf and the external world. This paper proposes a component-based, vocabulary-extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word is considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification is implemented based on the recognition of the five components. Specifically, the proposed SLR framework consists of two major parts. The first part obtains the component-based form of sign gestures and establishes the code table of the target sign gesture set using data from a reference subject. In the second part, which is designed for new users, component classifiers are trained using a training set suggested by the reference subject, and the classification of unknown gestures is performed with a code-matching method. Five subjects participated in this study, and recognition experiments with different sizes of training sets were implemented on a target gesture set consisting of 110 frequently used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third of the gestures of the target gesture set) suggested by two reference subjects, average recognition accuracies of (82.6 ± 13.2)% and (79.7 ± 13.4)% were obtained for the 110 words, respectively, and the average recognition accuracy climbed to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system. PMID:27104534
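
    The code-matching step of the second part can be sketched as follows; the code table, component codes, and words below are invented for illustration and are not from the paper:

```python
# Sketch of nearest-neighbour code matching over five sign components.
# Code table: sign word -> (hand shape, axis, orientation, rotation, trajectory)
code_table = {
    "hello":  (1, 0, 2, 0, 3),
    "thanks": (2, 1, 2, 0, 1),
    "friend": (1, 1, 0, 2, 3),
}

def match_sign(predicted_codes):
    """Return the word whose stored code agrees with the predicted
    component codes on the most of the five components."""
    def agreement(word):
        return sum(p == c for p, c in zip(predicted_codes, code_table[word]))
    return max(code_table, key=agreement)

# One misclassified component still resolves to the right word.
print(match_sign((1, 0, 2, 0, 1)))  # expect "hello"
```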

  10. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.

    PubMed

    Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu

    2016-01-01

    Sign language recognition (SLR) can provide a helpful tool for communication between the deaf and the external world. This paper proposes a component-based, vocabulary-extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word is considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification is implemented based on the recognition of the five components. Specifically, the proposed SLR framework consists of two major parts. The first part obtains the component-based form of sign gestures and establishes the code table of the target sign gesture set using data from a reference subject. In the second part, which is designed for new users, component classifiers are trained using a training set suggested by the reference subject, and the classification of unknown gestures is performed with a code-matching method. Five subjects participated in this study, and recognition experiments with different sizes of training sets were implemented on a target gesture set consisting of 110 frequently used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third of the gestures of the target gesture set) suggested by two reference subjects, average recognition accuracies of (82.6 ± 13.2)% and (79.7 ± 13.4)% were obtained for the 110 words, respectively, and the average recognition accuracy climbed to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system. PMID:27104534

  11. A Randomized Controlled Trial Assessing Growth of Infants Fed a 100% Whey Extensively Hydrolyzed Formula Compared With a Casein-Based Extensively Hydrolyzed Formula.

    PubMed

    Fields, David; Czerkies, Laura; Sun, Shumei; Storm, Heidi; Saavedra, José; Sorensen, Ricardo

    2016-01-01

    This study compared the growth of healthy infants fed a hypoallergenic 100% whey-based extensively hydrolyzed formula (EHF) with Bifidobacterium lactis (test) with that of infants fed an extensively hydrolyzed casein formula (control). Formula-fed infants (14 ± 3 days) were randomized to test or control groups until 112 days of age. Anthropometrics were assessed at 14, 28, 56, 84, and 112 days, and daily records were kept for 2 days prior to study visits. Serum albumin and plasma amino acids at 84 days were assessed in a subset. A total of 282 infants were randomized (124 test, 158 control). Significantly more infants dropped out of the control (56%) as compared with the test (41%) group. Mean daily weight gain was significantly higher in the test group compared with the control group (27.95 ± 5.91 vs 25.93 ± 6.12 g/d; P = .027) with the test group reporting significantly fewer stools (2.2 vs 3.6 stools/d; P < .0001). The control group reported significantly more days with >3 loose stools/d and a higher incidence of vomiting as compared with the test group. There were no differences in gas, mood, sleep, or serum albumin. Plasma arginine and valine were significantly lower in the test group, whereas leucine and lysine were higher; all values were within normal limits. Significantly more adverse events attributed to the study formula were reported in the control group. The 100% whey-based hypoallergenic EHF containing Bifidobacterium lactis and medium chain triglycerides supported growth of healthy infants. Future studies on the application of this formula in clinically indicated populations are warranted. PMID:27336009

  12. A Randomized Controlled Trial Assessing Growth of Infants Fed a 100% Whey Extensively Hydrolyzed Formula Compared With a Casein-Based Extensively Hydrolyzed Formula

    PubMed Central

    Fields, David; Czerkies, Laura; Sun, Shumei; Storm, Heidi; Saavedra, José; Sorensen, Ricardo

    2016-01-01

    This study compared the growth of healthy infants fed a hypoallergenic 100% whey-based extensively hydrolyzed formula (EHF) with Bifidobacterium lactis (test) with that of infants fed an extensively hydrolyzed casein formula (control). Formula-fed infants (14 ± 3 days) were randomized to test or control groups until 112 days of age. Anthropometrics were assessed at 14, 28, 56, 84, and 112 days, and daily records were kept for 2 days prior to study visits. Serum albumin and plasma amino acids at 84 days were assessed in a subset. A total of 282 infants were randomized (124 test, 158 control). Significantly more infants dropped out of the control (56%) as compared with the test (41%) group. Mean daily weight gain was significantly higher in the test group compared with the control group (27.95 ± 5.91 vs 25.93 ± 6.12 g/d; P = .027) with the test group reporting significantly fewer stools (2.2 vs 3.6 stools/d; P < .0001). The control group reported significantly more days with >3 loose stools/d and a higher incidence of vomiting as compared with the test group. There were no differences in gas, mood, sleep, or serum albumin. Plasma arginine and valine were significantly lower in the test group, whereas leucine and lysine were higher; all values were within normal limits. Significantly more adverse events attributed to the study formula were reported in the control group. The 100% whey-based hypoallergenic EHF containing Bifidobacterium lactis and medium chain triglycerides supported growth of healthy infants. Future studies on the application of this formula in clinically indicated populations are warranted. PMID:27336009

  13. CDAPubMed: a browser extension to retrieve EHR-based biomedical literature

    PubMed Central

    2012-01-01

    Background: Over the last few decades, the ever-increasing output of scientific publications has led to new challenges to keep up to date with the literature. In the biomedical area, this growth has introduced new requirements for professionals, e.g., physicians, who have to locate the exact papers that they need for their clinical and research work amongst a huge number of publications. Against this backdrop, novel information retrieval methods are even more necessary. While web search engines are widespread in many areas, facilitating access to all kinds of information, additional tools are required to automatically link information retrieved from these engines to specific biomedical applications. In the case of clinical environments, this also means considering aspects such as patient data security and confidentiality or structured contents, e.g., electronic health records (EHRs). In this scenario, we have developed a new tool to facilitate query building to retrieve scientific literature related to EHRs. Results: We have developed CDAPubMed, an open-source web browser extension to integrate EHR features in biomedical literature retrieval approaches. Clinical users can use CDAPubMed to: (i) load patient clinical documents, i.e., EHRs based on the Health Level 7-Clinical Document Architecture Standard (HL7-CDA), (ii) identify relevant terms for scientific literature search in these documents, i.e., Medical Subject Headings (MeSH), automatically driven by the CDAPubMed configuration, which advanced users can optimize to adapt to each specific situation, and (iii) generate and launch literature search queries to a major search engine, i.e., PubMed, to retrieve citations related to the EHR under examination. Conclusions: CDAPubMed is a platform-independent tool designed to facilitate literature searching using keywords contained in specific EHRs. CDAPubMed is visually integrated, as an extension of a widespread web browser, within the standard PubMed interface. It has
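
    The query-building step described above (MeSH terms combined into a PubMed search) can be illustrated with a minimal sketch. The terms are hypothetical and this is not CDAPubMed's actual code; only the public NCBI E-utilities query format is assumed:

```python
from urllib.parse import urlencode

# Illustrative sketch: build a PubMed search query from MeSH terms
# extracted from an EHR document (terms here are made up).
mesh_terms = ["Hypertension", "Diabetes Mellitus, Type 2"]

# Join terms with AND and tag them as MeSH headings in PubMed syntax.
term = " AND ".join(f'"{t}"[MeSH Terms]' for t in mesh_terms)

# NCBI E-utilities esearch endpoint (public API); no request is sent here.
url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
       + urlencode({"db": "pubmed", "term": term, "retmax": 20}))

print(term)
```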

  14. GIS-based assessment of groundwater level on extensive karst areas

    NASA Astrophysics Data System (ADS)

    Kopecskó, Zsanett; Józsa, Edina

    2016-04-01

    Karst topographies represent unique geographical regions containing caves and extensive underground water systems developed especially on soluble rocks such as limestone, marble, and gypsum. The significance of these areas is evident considering that 12% of the ice-free continental area consists of landscapes developed on carbonate rocks and 20-25% of the global population depends mostly on groundwater obtained from these systems. Karst water reservoirs already provide 25% of global freshwater resources. Comprehensive studies of these regions are the key to exploring the prospects for exploitation and to analyzing the consequences of contamination, anthropogenic effects, and natural processes within these specific hydro-geological settings. For the proposed work we chose several of the largest karst regions over the ice-free part of the continents, representing diverse climatic and topographic characteristics. An important aspect of the study is that there are no available in situ hydrologic measurements over the entire research area that would provide discrete sampling of soil, ground, and surface water. As a replacement for detailed surveys, multiple remote sensing datasets (Gravity Recovery and Climate Experiment (GRACE) satellite derivative products, Moderate Resolution Imaging Spectroradiometer (MODIS) satellite products, and Tropical Rainfall Measuring Mission (TRMM) monthly rainfall satellite datasets) are used along with model reanalysis data (Global Precipitation Climatology Centre (GPCC) data and Global Land Data Assimilation System (GLDAS)) to study the variation on extensive karst areas in response to the changing climate and anthropogenic effects. The analyses are carried out within an open-source software environment to enable sharing of the proposed algorithm. The GRASS GIS geoinformatics software and the R statistical program proved to be adequate choices for collecting and analyzing the above-mentioned datasets by taking advantage of their interoperability

  15. An extensive survey of dayside diffuse aurora based on optical observations at Yellow River Station

    NASA Astrophysics Data System (ADS)

    Han, De-Sheng; Chen, Xiang-Cai; Liu, Jian-Jun; Qiu, Qi; Keika, K.; Hu, Ze-Jun; Liu, Jun-Ming; Hu, Hong-Qiao; Yang, Hui-Gen

    2015-09-01

    Using 7 years of optical auroral observations obtained at Yellow River Station (magnetic latitude 76.24°N) at Ny-Alesund, Svalbard, we performed the first extensive survey of dayside diffuse auroras (DDAs) and obtained the following observational results. (1) The DDAs can be classified into two broad categories, i.e., unstructured and structured DDAs. The unstructured DDAs are mainly distributed in the morning and afternoon, but the structured DDAs predominantly occur around magnetic local noon (MLN). (2) The unstructured DDAs observed in the morning and afternoon present clearly different properties. The afternoon ones are much more stable and seldom show pulsating behavior. (3) The DDAs are more easily observed during geomagnetically quiet times. (4) The structured DDAs mainly show patchy, stripy, and irregular forms and are often pulsating and drifting. The drift directions are mostly westward (with speeds of ~5 km/s), but there are cases showing eastward or poleward drifting. (5) The stripy DDAs are exclusively observed near the MLN and, most importantly, their alignments are confirmed to be consistent with the direction of ionospheric convection near the MLN. (6) A new auroral form, called throat aurora, is found to develop from the stripy DDAs. Based on these observational results and previous studies, we propose explanations for the DDAs. We suggest that the unstructured DDAs observed in the morning are extensions of the nightside diffuse aurora to the dayside, whereas those observed in the afternoon are predominantly caused by proton precipitation. The structured DDAs occurring near the MLN are caused by interactions of cold plasma structures, which are thought to originate from ionospheric outflows or plasmaspheric drainage plumes, with hot electrons from the plasma sheet. We suppose that the cold plasma structures producing the patchy DDAs are lumpy and are more likely from the plasmaspheric drainage plumes. The cold plasma structure for

  16. Wavelet transformation based watermarking technique for human electrocardiogram (ECG).

    PubMed

    Engin, Mehmet; Cidam, Oğuz; Engin, Erkan Zeki

    2005-12-01

    Nowadays, watermarking has become a technology of choice for a broad range of multimedia copyright protection applications. Watermarks have also been used to embed prespecified data in biomedical signals, so that watermarked biomedical signals transmitted through communication channels are resistant to some attacks. This paper investigates a discrete wavelet transform based watermarking technique for signal integrity verification in electrocardiograms (ECGs) from four ECG classes, for monitoring applications in cardiovascular disease. The proposed technique is evaluated under different noisy conditions for different wavelet functions. The technique based on the Daubechies (db2) wavelet function performs better than that based on the Biorthogonal (bior5.5) wavelet function. For beat-to-beat applications, the performance results for all four ECG classes are moderate. PMID:16235811
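
    A minimal non-blind sketch of DWT-domain watermark embedding and recovery, using a hand-rolled one-level Haar transform to stay self-contained (the paper uses db2 and bior5.5 wavelets; the signal, payload, and strength alpha below are illustrative):

```python
import numpy as np

# One-level Haar DWT and its exact inverse.
def haar_fwd(x):
    x = np.asarray(x, dtype=float)
    cA = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    cD = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return cA, cD

def haar_inv(cA, cD):
    y = np.empty(2 * len(cA))
    y[0::2] = (cA + cD) / np.sqrt(2.0)
    y[1::2] = (cA - cD) / np.sqrt(2.0)
    return y

rng = np.random.default_rng(1)
ecg = rng.standard_normal(64)                 # stand-in for an ECG segment
bits = np.array([1, 0, 1, 1])                 # watermark payload
alpha = 0.05                                  # embedding strength

cA, cD = haar_fwd(ecg)
cD_w = cD.copy()
cD_w[:len(bits)] += alpha * (2 * bits - 1)    # additive +/- alpha embedding
ecg_w = haar_inv(cA, cD_w)

# Non-blind extraction: compare received detail coeffs with originals.
_, cD_rx = haar_fwd(ecg_w)
bits_rx = (cD_rx[:len(bits)] - cD[:len(bits)] > 0).astype(int)
print(bits_rx.tolist())  # → [1, 0, 1, 1]
```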

  17. "Ayeli": Centering Technique Based on Cherokee Spiritual Traditions.

    ERIC Educational Resources Information Center

    Garrett, Michael Tlanusta; Garrett, J. T.

    2002-01-01

    Presents a centering technique called "Ayeli," based on Cherokee spiritual traditions as a way of incorporating spirituality into counseling by helping clients identify where they are in their journey, where they want to be, and how they can get there. Relevant Native cultural traditions and meanings are explored. (Contains 25 references.) (GCP)

  18. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple-watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570
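
    The Arnold-transform scrambling used here for watermark encryption can be sketched as follows; the 4 × 4 watermark and iteration count are illustrative, not the paper's parameters:

```python
import numpy as np

def arnold(img, iterations=1):
    """Scramble an N x N array with the Arnold cat map
    (x, y) -> (x + y, x + 2y) mod N, applied `iterations` times."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = nxt
    return out

def arnold_inverse(img, iterations=1):
    """Invert the map: (x, y) -> (2x - y, y - x) mod N."""
    n = img.shape[0]
    out = img
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                nxt[(2 * x - y) % n, (y - x) % n] = out[x, y]
        out = nxt
    return out

mark = np.arange(16).reshape(4, 4)            # toy 4x4 watermark
scrambled = arnold(mark, iterations=2)        # "encrypted" watermark
restored = arnold_inverse(scrambled, iterations=2)
print(np.array_equal(restored, mark))  # → True
```

Note the map is periodic in N, so the iteration count acts as a key only modulo that period (for N = 4 the period is 3).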

  19. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    PubMed Central

    Ibrahim, Mohamed M.; Abdel Kader, Neamat S.; Zorkany, M.

    2014-01-01

    Digital watermarking is one of the important techniques for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file in the watermark recovery operation imposes an overhead on system resources, doubling the required memory capacity and communications bandwidth. In this paper, a robust video multiple-watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570

  20. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    SciTech Connect

    DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.; THOMAS, EDWARD V.; WUNSCH, DONALD

    2001-09-01

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.
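
    As a hedged illustration of the "generalized signature" idea (a normal profile broader than exact signatures but narrower than full anomaly detection), one can score traces by the fraction of event n-grams unseen in training. The event names are made up, and this is not the paper's ACD or Elman-network method:

```python
# Sketch of sequence-based host ID: build an n-gram "normal profile"
# over audit events and score new traces against it.
def ngrams(seq, n=3):
    return {tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)}

normal_trace = ["open", "read", "read", "write", "close",
                "open", "read", "write", "close"]
profile = ngrams(normal_trace)               # the generalized signature

def anomaly_score(trace, n=3):
    """Fraction of the trace's n-grams never seen during training."""
    grams = ngrams(trace, n)
    return len(grams - profile) / max(len(grams), 1)

clean = ["open", "read", "write", "close"]
exploit = ["open", "exec", "chmod", "exec", "close"]
print(anomaly_score(clean), anomaly_score(exploit))  # → 0.0 1.0
```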

  1. Graphene-based terahertz photodetector by noise thermometry technique

    SciTech Connect

    Wang, Ming-Jye; Wang, Ji-Wun; Wang, Chun-Lun; Chiang, Yen-Yu; Chang, Hsian-Hong

    2014-01-20

    We report the characteristics of graphene-based terahertz (THz) photodetector based on noise thermometry technique by measuring its noise power at frequency from 4 to 6 GHz. Hot electron system in graphene microbridge is generated after THz photon pumping and creates extra noise power. The equivalent noise temperature and electron temperature increase rapidly in low THz pumping regime and saturate gradually in high THz power regime which is attributed to a faster energy relaxation process involved by stronger electron-phonon interaction. Based on this detector, a conversion efficiency around 0.15 from THz power to noise power in 4–6 GHz span has been achieved.

  2. Development and Implementation of an Extensible Interface-Based Spatiotemporal Geoprocessing and Modeling Toolbox

    NASA Astrophysics Data System (ADS)

    Cao, Y.; Ames, D. P.

    2011-12-01

    This poster presents an object-oriented and interface-based spatiotemporal data processing and modeling toolbox that can be extended by third parties to include complete suites of new tools through the implementation of simple interfaces. The resulting software implementation includes both a toolbox and a workflow designer or "model builder" constructed using the underlying open source DotSpatial library and MapWindow desktop GIS. The unique contribution of this research and software development activity is in the creation and use of an extensibility architecture for both specific tools (through a so-called "ITool" interface) and batches of tools (through a so-called "IToolProvider" interface). This concept is introduced to allow for seamless integration of geoprocessing tools from various sources (e.g. distinct libraries of spatiotemporal processing code) - including online sources - within a single user environment. In this way, the IToolProvider interface allows developers to wrap large existing collections of data analysis code without having to re-write it for interoperability. Additionally, developers do not need to design the user interfaces for loading, displaying or interacting with their specific tools, but rather can simply implement the provided interfaces and have their tools and tool collections appear in the toolbox alongside other tools. The demonstration software presented here is based on an implementation of the interfaces and sample tool libraries using the C# .NET programming language. This poster will include a summary of the interfaces as well as a demonstration of the system using the Whitebox Geospatial Analysis Tools (GAT) as an example case of a large number of existing tools that can be exposed to users through this new system. Vector analysis tools which are native to DotSpatial are linked to the Whitebox raster analysis tools in the model builder environment for ease of execution and consistent/repeatable use. We expect that this
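
    A Python analogue of the ITool/IToolProvider pattern described above (the actual interfaces are C#/.NET; the class and tool names below are hypothetical):

```python
from abc import ABC, abstractmethod

class ITool(ABC):
    """A single geoprocessing tool the host toolbox can discover and run."""
    @property
    @abstractmethod
    def name(self): ...
    @abstractmethod
    def execute(self, **params): ...

class IToolProvider(ABC):
    """A batch of tools wrapped from an existing library."""
    @abstractmethod
    def tools(self): ...

class BufferTool(ITool):                      # hypothetical example tool
    name = "Buffer"
    def execute(self, geometry=None, distance=0.0):
        return f"buffered {geometry} by {distance}"

class WhiteboxProvider(IToolProvider):        # wraps a tool collection
    def tools(self):
        return [BufferTool()]

# The host toolbox only sees the interfaces, never the implementations.
toolbox = [t for p in [WhiteboxProvider()] for t in p.tools()]
print([t.name for t in toolbox])  # → ['Buffer']
```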

  3. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    NASA Astrophysics Data System (ADS)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate the formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian motion gives P^2 = a^3, where P is the orbital period in units of years and a is the semi-major axis in units of Astronomical Units (AU). Keplerian motion works on small scales, such as the size of the Solar System, but not on large scales, such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze the Keplerian motion of systems as a function of stellar age, to determine whether Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods, such as probabilistic, linear, and proximity-based models. In probabilistic and statistical models of outliers, the parameters of a closed-form probability distribution are learned in order to detect the outliers. Linear models use regression-analysis-based techniques for detecting outliers. Proximity-based models use distance-based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density-based algorithms such as kernel density estimation. In this work, we use unsupervised learning algorithms with only the proximity-based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is whether the ratio of planetary mass to stellar mass is less than 0.001. We present our statistical analysis of the outliers thus detected.
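
    A minimal sketch of residual-based flagging in this spirit: for P in years and a in AU around a one-solar-mass star, P² ≈ a³, so standardized log residuals expose non-Keplerian entries. The values and threshold are illustrative, not the study's data:

```python
import numpy as np

# Toy sample: four (P, a) pairs close to Kepler's third law plus one
# deliberately inconsistent entry.
P = np.array([1.00, 11.86, 0.24, 1.88, 5.0])   # orbital periods [yr]
a = np.array([1.00, 5.20, 0.39, 1.52, 1.0])    # semi-major axes [AU]

resid = np.log10(P**2 / a**3)                  # ~0 for Keplerian orbits
z = (resid - resid.mean()) / resid.std()       # standardised residuals

outliers = np.abs(z) > 1.5                     # simple z-score flag
print(outliers.tolist())
```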

  4. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    PubMed Central

    Lueke, Jonathan; Moussa, Walied A.

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical systems (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving the functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient. PMID:22319362

  5. Novel techniques and the future of skull base reconstruction.

    PubMed

    Meier, Joshua C; Bleier, Benjamin S

    2013-01-01

    The field of endoscopic skull base surgery has evolved considerably in recent years fueled largely by advances in both imaging and instrumentation. While the indications for these approaches continue to be extended, the ability to reconstruct the resultant defects has emerged as a rate-limiting obstacle. Postoperative failures with current multilayer grafting techniques remain significant and may increase as the indications for endoscopic resections continue to expand. Laser tissue welding represents a novel method of wound repair in which laser energy is applied to a chromophore doped biologic solder at the wound edge to create a laser weld (fig. 1). These repairs are capable of withstanding forces far exceeding those exerted by intracranial pressure with negligible collateral thermal tissue injury. Recent clinical trials have demonstrated the safety and feasibility of endoscopic laser welding while exposing the limitations of first generation hyaluronic acid based solders. Novel supersaturated gel based solders are currently being tested in clinical trials and appear to possess significantly improved viscoelastic properties. While laser tissue welding remains an experimental technique, continued success with these novel solder formulations may catalyze the widespread adoption of this technique for skull base repair in the near future. PMID:23257563

  6. DEVA: An extensible ontology-based annotation model for visual document collections

    NASA Astrophysics Data System (ADS)

    Jelmini, Carlo; Marchand-Maillet, Stephane

    2003-01-01

    The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, an early prototype that validates the DEVA features. Magritte allows users to manually annotate image collections. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.
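    The idea of a base vocabulary that specialized ontologies extend can be illustrated with a minimal sketch; the Python stand-ins below are not the DAML+OIL model itself, and the medical-imaging terms are hypothetical:

```python
# Base vocabulary: a few Dublin Core elements (DEVA extends Dublin Core).
DUBLIN_CORE = {"title", "creator", "subject", "description", "date", "format"}

def make_annotator(*extensions):
    """Return an annotation validator whose vocabulary is Dublin Core merged
    with any number of specialized domain vocabularies."""
    vocabulary = DUBLIN_CORE.union(*extensions)

    def annotate(document_id, **fields):
        # Reject terms that belong to neither the base nor an extension.
        unknown = set(fields) - vocabulary
        if unknown:
            raise ValueError(f"terms not in vocabulary: {unknown}")
        return {"document": document_id, **fields}

    return annotate

# Hypothetical medical-imaging extension merged into the base model.
medical = {"modality", "anatomical_region"}
annotate = make_annotator(medical)
record = annotate("img-042", title="Chest scan", modality="X-ray")
```

    The extension mechanism mirrors the paper's design choice: the base schema stays fixed while domain vocabularies widen what an annotation may say.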

  7. The Intelligent System of Cardiovascular Disease Diagnosis Based on Extension Data Mining

    NASA Astrophysics Data System (ADS)

    Sun, Baiqing; Li, Yange; Zhang, Lin

    This thesis gives general definitions of the concepts of extension knowledge, extension data mining and the extension data mining theorem in high-dimensional space. It also builds an integrated intelligent diagnosis support system (IDSS) from rough sets, an expert system and a neural network, and develops the accompanying computer software. In diagnostic tests on the common diseases of myocardial infarction, angina pectoris and hypertension, the results were compared with physicians' diagnoses: the sensitivity, specificity and accuracy of diagnosis by the IDSS were all higher than those of the physicians. With the auxiliary help of this system, physicians' diagnostic accuracy can be improved, which is clearly meaningful for lowering mortality and disability rates and raising survival rates; the system therefore has strong practical value and broad social benefits.

  8. New Flutter Analysis Technique for CFD-based Unsteady Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Jutte, Christine V.

    2009-01-01

    This paper presents a flutter analysis technique for the transonic flight regime. The technique uses an iterative approach to determine the critical dynamic pressure for a given Mach number. Unlike other CFD-based flutter analysis methods, each iteration solves for the critical dynamic pressure and uses this value in subsequent iterations until the value converges. This process reduces the iterations required to determine the critical dynamic pressure. To improve the accuracy of the analysis, the technique employs a known structural model, leaving only the aerodynamic model as the unknown. The aerodynamic model is estimated using unsteady aeroelastic CFD analysis combined with a parameter estimation routine. The technique executes as follows. The known structural model is represented as a finite element model. Modal analysis determines the frequencies and mode shapes for the structural model. At a given Mach number and dynamic pressure, the unsteady CFD analysis is performed. The output time history of the surface pressure is converted to a nodal aerodynamic force vector. The forces are then normalized by the given dynamic pressure. Multi-input multi-output parameter estimation software, ERA, estimates the aerodynamic model through the use of time histories of nodal aerodynamic forces and structural deformations. The critical dynamic pressure is then calculated using the known structural model and the estimated aerodynamic model. This output is used as the dynamic pressure in subsequent iterations until the critical dynamic pressure is determined. This technique is demonstrated on the Aerostructures Test Wing-2 model at NASA's Dryden Flight Research Center.
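    The iteration described above can be sketched as a fixed-point loop; the lambda below is a hypothetical stand-in for the expensive CFD-plus-parameter-estimation step, chosen only so the loop has a known answer:

```python
def converge_dynamic_pressure(solve_qcrit, q0, tol=1e-6, max_iter=100):
    """Fixed-point iteration over dynamic pressure: evaluate the aeroelastic
    model at the current q, solve for the critical q, and repeat until the
    value stops changing (the scheme described in the abstract)."""
    q = q0
    for _ in range(max_iter):
        q_new = solve_qcrit(q)
        if abs(q_new - q) < tol:
            return q_new
        q = q_new
    raise RuntimeError("did not converge")

# Hypothetical stand-in for the CFD + estimation step: a contraction
# whose fixed point is q* = 50 (since 0.5 * 50 + 25 = 50).
q_star = converge_dynamic_pressure(lambda q: 0.5 * q + 25.0, q0=10.0)
```

    The loop converges whenever the map from assumed to computed critical pressure is contractive near the solution, which is the implicit assumption behind the paper's iteration.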

  9. A unified neural-network-based speaker localization technique.

    PubMed

    Arslan, G; Sakarya, F A

    2000-01-01

    Locating and tracking a speaker in real time using microphone arrays is important in many applications such as hands-free video conferencing, speech processing in large rooms, and acoustic echo cancellation. A speaker can be moving from the far field to the near field of the array, or vice versa. Many neural-network-based localization techniques exist, but they are applicable to either far-field or near-field sources, and are computationally intensive for real-time speaker localization applications because of the wide-band nature of the speech. We propose a unified neural-network-based source localization technique, which is simultaneously applicable to wide-band and narrow-band signal sources that are in the far field or near field of a microphone array. The technique exploits a multilayer perceptron feedforward neural network structure and forms the feature vectors by computing the normalized instantaneous cross-power spectrum samples between adjacent pairs of sensors. Simulation results indicate that our technique is able to locate a source with an absolute error of less than 3.5 degrees at a signal-to-noise ratio of 20 dB and a sampling rate of 8000 Hz at each sensor. PMID:18249826
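    Forming the feature vector from normalized cross-power spectrum samples of a sensor pair might look like the sketch below; the signal parameters are illustrative, and only the 8000 Hz rate comes from the abstract:

```python
import numpy as np

def cross_power_features(x1, x2, n_fft=64):
    """Normalized instantaneous cross-power spectrum between two sensor
    signals, flattened into a real-valued feature vector for the network."""
    X1 = np.fft.rfft(x1, n_fft)
    X2 = np.fft.rfft(x2, n_fft)
    cross = X1 * np.conj(X2)          # cross-power spectrum
    cross /= np.abs(cross) + 1e-12    # normalize each bin's magnitude
    return np.concatenate([cross.real, cross.imag])

# Two hypothetical sensor signals: the same tone with a small time delay,
# sampled at 8000 Hz as in the paper.
t = np.arange(256) / 8000.0
x1 = np.sin(2 * np.pi * 440 * t)
x2 = np.roll(x1, 3)                   # delayed copy at the adjacent sensor
features = cross_power_features(x1, x2)
```

    After normalization, each bin keeps only its phase, which encodes the inter-sensor delay that the perceptron learns to map to source direction.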

  10. Eat, Grow, Lead 4-H: An Innovative Approach to Deliver Campus- Based Field Experiences to Pre-Entry Extension Educators

    ERIC Educational Resources Information Center

    Weeks, Penny Pennington; Weeks, William G.

    2012-01-01

    Eat, Grow, Lead 4-H Club was created as a pilot program for college students seeking to gain experience as non-formal youth educators, specifically serving pre-entry level Extension educators through a university-based 4-H club. Seventeen student volunteers contributed an estimated 630 hours of service to the club during spring 2011. The club…

  11. 78 FR 16275 - Extension of the Duration of Programmatic Agreements Based on the Department of Energy Prototype...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-14

    ... 22, 2013, for a one-week comment period (78 FR 12336-12337). In accordance with 36 CFR 800.14(e), the... HISTORIC PRESERVATION Extension of the Duration of Programmatic Agreements Based on the Department of... on Historic Preservation. ACTION: The Advisory Council on Historic Preservation has issued a...

  12. An In-House Prototype for the Implementation of Computer-Based Extensive Reading in a Limited-Resource School

    ERIC Educational Resources Information Center

    Mayora, Carlos A.; Nieves, Idami; Ojeda, Victor

    2014-01-01

    A variety of computer-based models of Extensive Reading have emerged in the last decade. Different Information and Communication Technologies online usually support these models. However, such innovations are not feasible in contexts where the digital breach limits the access to Internet. The purpose of this paper is to report a project in which…

  13. Extensions of algebraic image operators: An approach to model-based vision

    NASA Technical Reports Server (NTRS)

    Lerner, Bao-Ting; Morelli, Michael V.

    1990-01-01

    Researchers extend their previous research on a highly structured and compact algebraic representation of grey-level images which can be viewed as fuzzy sets. Addition and multiplication are defined for the set of all grey-level images, which can then be described as polynomials of two variables. Utilizing this new algebraic structure, researchers devised an innovative, efficient edge detection scheme. An accurate method for deriving gradient component information from this edge detector is presented. Based upon this new edge detection system, researchers developed a robust method for linear feature extraction by combining the techniques of a Hough transform and a line follower. The major advantage of this feature extractor is its general, object-independent nature. Target attributes, such as line segment lengths, intersections, angles of intersection, and endpoints are derived by the feature extraction algorithm and employed during model matching. The algebraic operators are global operations which are easily reconfigured to operate on any size or shape region. This provides a natural platform from which to pursue dynamic scene analysis. A method for optimizing the linear feature extractor which capitalizes on the spatially reconfigurable nature of the edge detector/gradient component operator is discussed.
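    The Hough-transform stage of such a line extractor can be sketched as a minimal accumulator-voting routine on synthetic edge points; this is a generic textbook version, not the authors' implementation:

```python
import numpy as np

def hough_lines(edge_points, shape, n_theta=180):
    """Minimal Hough transform: each edge point (y, x) votes for every
    (rho, theta) line through it; the best-supported cell is returned.
    Note that one physical line can tie across equivalent (rho, theta)
    representations."""
    h, w = shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    for y, x in edge_points:
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1   # one vote per angle
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return acc, r - diag, thetas[t]

# Synthetic edge map: 20 points on the vertical line x = 7.
points = [(y, 7) for y in range(20)]
acc, rho, theta = hough_lines(points, shape=(20, 20))
```

    A line follower would then walk along the detected (rho, theta) line to recover endpoints and segment lengths, the attributes the abstract feeds into model matching.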

  14. Designing a Competency-Based New County Extension Personnel Training Program: A Novel Approach

    ERIC Educational Resources Information Center

    Brodeur, Cheri Winton; Higgins, Cynthia; Galindo-Gonzalez, Sebastian; Craig, Diane D.; Haile, Tyann

    2011-01-01

    Voluntary county personnel turnover occurs for a multitude of reasons, including the lack of job satisfaction, organizational commitment, and job embeddedness and lack of proper training. Loss of personnel can be costly both economically and in terms of human capital. Retention of Extension professionals can be improved through proper training or…

  15. 76 FR 12073 - Extension of Web-Based TRICARE Assistance Program Demonstration Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... Register Notice, 74 FR 3667, July 24, 2009. The demonstration was extended to March 31, 2011, as referenced... in 74 FR 3667 July 24, 2009 launched August 1, 2009, to provide the capability for short-term... original Federal Register Notice, 74 FR 3667 July 24, 2009, and the extension Federal Register...

  16. Quantum state tomography of orbital angular momentum photonic qubits via a projection-based technique

    NASA Astrophysics Data System (ADS)

    Nicolas, Adrien; Veissier, Lucile; Giacobino, Elisabeth; Maxein, Dominik; Laurat, Julien

    2015-03-01

    While measuring the orbital angular momentum state of bright light beams can be performed using imaging techniques, a full characterization at the single-photon level is challenging. For applications to quantum optics and quantum information science, such characterization is an essential capability. Here, we present a setup to perform the quantum state tomography of photonic qubits encoded in this degree of freedom. The method is based on a projective technique using spatial mode projection via fork holograms and single-mode fibers inserted into an interferometer. The alignment and calibration of the device is detailed as well as the measurement sequence to reconstruct the associated density matrix. Possible extensions to higher-dimensional spaces are discussed.
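    For a single qubit, the reconstruction step reduces to linear inversion from three measured Pauli expectation values; the sketch below assumes ideal, noise-free measurement outcomes rather than the paper's experimental counts:

```python
import numpy as np

# Identity and the three Pauli matrices.
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct_density_matrix(ex, ey, ez):
    """Linear-inversion tomography for one qubit: rebuild the density matrix
    from the three Pauli expectation values obtained by projective
    measurements, rho = (I + <sx> sx + <sy> sy + <sz> sz) / 2."""
    return 0.5 * (I2 + ex * sx + ey * sy + ez * sz)

# Hypothetical measurement record for the superposition state (|0> + |1>)/sqrt(2):
# <sx> = 1, <sy> = 0, <sz> = 0.
rho = reconstruct_density_matrix(1.0, 0.0, 0.0)
```

    Real data would replace the ideal expectation values with frequencies estimated from the projective counts, and a maximum-likelihood step is usually added to keep the reconstructed matrix physical.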

  17. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search for an address only by matching the descriptive data. In addition, there are some limitations in displaying the search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, search for places based on their location, represent results as non-point features, and display search results ranked by priority.
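    The nearness-membership and fuzzy-overlay steps can be sketched as below; the membership cutoffs and the two distance maps are invented for illustration, not taken from the paper:

```python
import numpy as np

def nearness(distance, d_near=200.0, d_far=1000.0):
    """Fuzzy 'near' membership: 1 within d_near metres, 0 beyond d_far,
    linear in between (a simple decreasing membership function; the
    cutoffs are illustrative)."""
    return np.clip((d_far - distance) / (d_far - d_near), 0.0, 1.0)

# Two hypothetical distance maps on a 3x3 grid of candidate locations:
# distance (m) to a school and to a park.
d_school = np.array([[100, 400, 900], [300, 600, 1200], [800, 500, 200]])
d_park = np.array([[900, 300, 100], [400, 400, 700], [100, 900, 600]])

# Fuzzy overlay with the AND (minimum) operator: near the school AND the park.
near_both = np.minimum(nearness(d_school), nearness(d_park))
best = np.unravel_index(near_both.argmax(), near_both.shape)
```

    Ranking cells by their overlay membership gives exactly the priority-ordered, non-point result display the abstract describes.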

  18. Flow management techniques for base and afterbody drag reduction

    NASA Astrophysics Data System (ADS)

    Viswanath, P. R.

    The problem of turbulent base flows and the drag associated with it have been of significant interest in missile as well as fighter aircraft design. Numerous studies in the literature have been devoted to aspects of reducing base drag on two-dimensional as well as on axisymmetric bodies. This paper presents a review of the developments that have taken place on the use of passive techniques or devices for axisymmetric base and net afterbody drag reduction in the absence of jet flow at the base. In particular, the paper discusses the effectiveness of base cavities, ventilated cavities, locked vortex afterbodies, multi-step afterbodies and afterbodies employing a non-axisymmetric boat-tailing concept for base and net drag reduction in different speed regimes. The broad features of the flow and the likely fluid-dynamical mechanisms associated with the device leading to base drag reduction are highlighted. Flight-test results assessing the effectiveness of some of the devices are compared with data from wind tunnels. The present survey indicates that base and net afterbody drag reduction of considerable engineering significance in aerospace applications can be achieved by various passive devices even when the (unmanipulated) base flow is not characterised by vortex shedding.

  19. Hydrocarbon microseepage mapping using signature based target detection techniques

    NASA Astrophysics Data System (ADS)

    Soydan, Hilal; Koz, Alper; Şebnem Düzgün, H.; Aydin Alatan, A.

    2015-10-01

    In this paper, we compare the conventional methods in hydrocarbon seepage anomalies with the signature based detection algorithms. The Crosta technique [1] is selected as a baseline in the experimental comparisons for the conventional approach. The Crosta technique utilizes the characteristic bands of the searched target for principal component transformation in order to determine the components characterizing the target of interest. Desired Target Detection and Classification Algorithm (DTDCA), Spectral Matched Filter (SMF), and Normalized Correlation (NC) are employed for signature based target detection. Signature based target detection algorithms are applied to the whole spectrum, benefiting from the information stored in all spectral bands. The selected methods are applied to a multispectral Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image of the study region, with an atmospheric correction prior to the realization of the algorithms. ASTER provides multispectral bands covering the visible, short wave, and thermal infrared regions, which serves as a useful tool for the interpretation of areas with hydrocarbon anomalies. The exploration area is selected as the Gemrik Anticline, which is located in South East Anatolia, Adıyaman, Bozova Oil Field, where microseeps can be observed with almost no vegetation cover. The spectral signatures collected with an Analytical Spectral Devices Inc. (ASD) spectrometer from the reference valley [2] have been utilized as an input to the signature based detection algorithms. The experiments have indicated that DTDCA and SMF outperform the Crosta technique by locating the microseepage patterns along the migration pathways with a better contrast. On the other hand, NC has not been able to map the searched target with a visible distinction. It is concluded that the signature based algorithms can be more effective than the conventional methods for the detection of microseepage induced anomalies.
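    A standard Spectral Matched Filter of the kind compared in the paper can be sketched as follows, on synthetic 4-band spectra rather than ASTER data:

```python
import numpy as np

def spectral_matched_filter(pixels, target):
    """Spectral Matched Filter: score each pixel spectrum against a target
    signature, whitening by the background covariance. Scores are scaled so
    a pixel equal to the signature scores near 1 and background near 0."""
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(cov.shape[0]))  # regularized
    d = target - mu
    w = cov_inv @ d / (d @ cov_inv @ d)     # matched-filter vector
    return (pixels - mu) @ w

rng = np.random.default_rng(0)
background = rng.normal(0.3, 0.02, size=(500, 4))  # hypothetical 4-band spectra
target = np.array([0.8, 0.2, 0.6, 0.4])            # reference signature
pixels = np.vstack([background, target])           # one target pixel at the end
scores = spectral_matched_filter(pixels, target)
print(scores.argmax())  # the target pixel scores highest
```

    In practice the reference signature would come from the field spectrometer measurements, and the covariance from the atmospherically corrected scene itself.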

  20. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    PubMed

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system. PMID:23202029
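    The extension-theory correlation (dependent) function evaluated for each mobile mode can be sketched with the textbook elementary form; the distance intervals below are illustrative, not the authors' calibration:

```python
def rho(x, a, b):
    """Extension distance from a point x to the interval [a, b]."""
    return abs(x - (a + b) / 2) - (b - a) / 2

def dependent_K(x, classical, neighborhood):
    """Elementary dependent (correlation) function of extension theory:
    K > 0 means x lies in the classical domain, -1 < K < 0 means it can be
    'extended' into it, and lower values mean it cannot. The degenerate
    case follows one common textbook convention."""
    a, b = classical
    c, d = neighborhood
    num = rho(x, a, b)
    den = rho(x, c, d) - num
    if den == 0:
        return -num - 1
    return num / den

# Hypothetical 'safe distance' model for one avoidance mode: classical
# domain 40-100 cm inside a neighborhood domain 20-150 cm of readings.
print(dependent_K(70, (40, 100), (20, 150)))  # inside the safe range: K > 0
print(dependent_K(30, (40, 100), (20, 150)))  # extendable zone: -1 < K < 0
```

    A controller of this style would evaluate K for every mode's interval pair and pick the mode with the largest value, which matches the "most appropriate mode" selection in the abstract.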

  2. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

    With the upsurge in financial accounting fraud in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research and industry. The failure of an organization's internal auditing system to identify accounting fraud has led to the use of specialized procedures to detect financial accounting fraud, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since dealing with the large data volumes and complexities of financial data is a big challenge for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques for the detection of financial accounting fraud and proposes a framework for data mining techniques based accounting fraud detection. The systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of this review show that data mining techniques like logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.
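    Of the techniques the review highlights, a logistic model is the simplest to sketch; the transaction features and fraud labels below are invented toy data, not drawn from any study the review covers:

```python
import numpy as np

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Plain gradient-descent logistic regression, one of the techniques the
    review finds most widely applied to fraud classification."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted fraud probability
        w -= lr * X.T @ (p - y) / len(y)          # gradient of the log-loss
        b -= lr * (p - y).mean()
    return w, b

# Hypothetical transaction features: [scaled amount, round-number flag].
X = np.array([[0.1, 0], [0.2, 0], [0.15, 1], [0.9, 1], [0.8, 1], [0.95, 0]])
y = np.array([0, 0, 0, 1, 1, 1])                  # 1 = known fraudulent
w, b = fit_logistic(X, y)
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
print((p.round() == y).all())  # the toy data is linearly separable
```

    Real FAFD pipelines would add class rebalancing and out-of-sample evaluation, since fraudulent records are rare relative to legitimate ones.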

  3. The Roland Maze Project school-based extensive air shower network

    NASA Astrophysics Data System (ADS)

    Feder, J.; Jȩdrzejczak, K.; Karczmarczyk, J.; Lewandowski, R.; Swarzyński, J.; Szabelska, B.; Szabelski, J.; Wibig, T.

    2006-01-01

    We plan to construct a large area network of extensive air shower detectors placed on the roofs of high school buildings in the city of Łódź. Detection points will be connected via the Internet to the central server and their work will be synchronized by GPS. The main scientific goal of the project is the study of ultra high energy cosmic rays. Using existing town infrastructure (Internet, power supply, etc.) will significantly reduce the cost of the experiment. Engaging high school students in the research program should significantly increase their knowledge of science and modern technologies, and can be a very efficient way of science popularisation. We performed simulations of the projected network's capabilities for registering Extensive Air Showers and reconstructing the energies of primary particles. Results of the simulations and the current status of project realisation will be presented.

  4. Noninvasive in vivo glucose sensing using an iris based technique

    NASA Astrophysics Data System (ADS)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect in the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate an iris-based imaging technique developed for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, an ocular fixation device developed for this purpose was used. During the experimental time frame, near infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations are within 20% of the reference values.
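    The PLS calibration idea can be sketched with a single-component PLS1 fit; the study uses a richer image-based, multi-component model, and the features below are synthetic:

```python
import numpy as np

def pls1_fit(X, y):
    """Single-component PLS1 regression: project X onto the direction of
    maximum covariance with y, then regress y on that score."""
    Xc, yc = X - X.mean(axis=0), y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # weight vector (covariance direction)
    t = Xc @ w                      # scores
    q = (t @ yc) / (t @ t)          # regression of y on the scores
    return w, q, X.mean(axis=0), y.mean()

def pls1_predict(X, model):
    w, q, x_mean, y_mean = model
    return (X - x_mean) @ w * q + y_mean

# Hypothetical calibration set: 3 'image features' linearly tied to glucose.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
glucose = 100 + 25 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 0.5, 40)
model = pls1_fit(X, glucose)
pred = pls1_predict(X, model)
```

    In the actual workflow, predictions on held-out samples would be scored against the glucometer readings with Clarke Error Grid Analysis rather than simple correlation.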

  5. Surveying converter lining erosion state based on laser measurement technique

    NASA Astrophysics Data System (ADS)

    Li, Hongsheng; Shi, Tielin; Yang, Shuzi

    1998-08-01

    It is very important to survey the erosion state of a steelmaking converter lining in real time so as to optimize the technological process, extend converter durability and reduce steelmaking production costs. This paper gives one practical method based on laser measurement techniques. It presents the basic principle of the measurement method, the composition of the measurement system and the research on key technological problems. The method is based on laser range finding to net points on the surface of the surveyed converter lining, together with angle finding for the laser beams. The angle signals are also used to help realize an automatic scanning function. The laser signals are modulated and encoded. In the meantime, wavelet analysis and other filter algorithms are adopted to denoise noisy data and extract useful information. The main ideas of some algorithms, such as net-point measuring path planning and optimal positioning of the measurement device, are also given in order to improve the measurement precision and the real-time performance of the system.

  6. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    NASA Astrophysics Data System (ADS)

    Singh Duksh, Yograj; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-05-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. This method is used for the estimation of crosstalk induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE.
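    A minimal FDTD update for a single lossless line illustrates the leapfrog scheme and the Courant limit; the line constants are illustrative, not the paper's extracted SWCNT bundle parameters, and coupling and loss terms are omitted:

```python
import numpy as np

# Per-unit-length line constants (illustrative values).
L, C = 2.5e-7, 1.0e-10           # inductance (H/m) and capacitance (F/m)
dz = 1e-3                        # spatial step (m)
dt = dz * np.sqrt(L * C)         # largest stable step: the Courant condition
n = 200

V = np.zeros(n)                  # node voltages
I = np.zeros(n - 1)              # branch currents on the staggered half-grid
V[0] = 1.0                       # unit step source held at the near end

# Leapfrog update of the telegrapher equations: currents from voltage
# differences, then interior voltages from current differences.
for _ in range(150):
    I -= dt / (L * dz) * (V[1:] - V[:-1])
    V[1:-1] -= dt / (C * dz) * (I[1:] - I[:-1])
```

    At exactly the Courant limit the 1D scheme transports the wavefront one cell per step without numerical dispersion; choosing dt any larger makes the update unstable, which is the restriction the abstract refers to.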

  7. Effects of borate-based bioactive glass on neuron viability and neurite extension.

    PubMed

    Marquardt, Laura M; Day, Delbert; Sakiyama-Elbert, Shelly E; Harkins, Amy B

    2014-08-01

    Bioactive glasses have recently been shown to promote regeneration of soft tissues by positively influencing tissue remodeling during wound healing. We were interested to determine whether bioactive glasses have the potential for use in the treatment of peripheral nerve injury. In these experiments, degradable bioactive borate glass was fabricated into rods and microfibers. To study the compatibility with neurons, embryonic chick dorsal root ganglia (DRG) were cultured with different forms of bioactive borate glass. Cell viability was measured with no media exchange (static condition) or routine media exchange (transient condition). Neurite extension was measured within fibrin scaffolds with embedded glass microfibers or aligned rod sheets. Mixed cultures of neurons, glia, and fibroblasts growing in static conditions with glass rods and microfibers resulted in decreased cell viability. However, the percentage of neurons compared with all cell types increased by the end of the culture protocol compared with culture without glass. Furthermore, bioactive glass and fibrin composite scaffolds promoted neurite extension similar to that of control fibrin scaffolds, suggesting that glass does not have a significant detrimental effect on neuronal health. Aligned glass scaffolds guided neurite extension in an oriented manner. Together these findings suggest that bioactive glass can provide alignment to support directed axon growth. PMID:24027222

  8. An osmolyte-based micro-volume ultrafiltration technique.

    PubMed

    Ghosh, Raja

    2014-12-01

    This paper discusses a novel, simple, and inexpensive micro-volume ultrafiltration technique for protein concentration, desalting, buffer exchange, and size-based protein purification. The technique is suitable for processing protein samples in a high-throughput mode. It utilizes a combination of capillary action, and osmosis for drawing water and other permeable species from a micro-volume sample droplet applied on the surface of an ultrafiltration membrane. A macromolecule coated on the permeate side of the membrane functions as the osmolyte. The action of the osmolyte could, if required, be augmented by adding a supersorbent polymer layer over the osmolyte. The mildly hydrophobic surface of the polymeric ultrafiltration membrane used in this study minimized sample droplet spreading, thus making it easy to recover the retained material after separation, without sample interference and cross-contamination. High protein recoveries were observed in the micro-volume ultrafiltration experiments described in the paper. PMID:25284741

  9. New modulation-based watermarking technique for video

    NASA Astrophysics Data System (ADS)

    Lemma, Aweke; van der Veen, Michiel; Celik, Mehmet

    2006-02-01

    Successful watermarking algorithms have already been developed for various applications ranging from meta-data tagging to forensic tracking. Nevertheless, it is worthwhile to develop alternative watermarking techniques that provide a broader basis for meeting emerging services, usage models and security threats. To this end, we propose a new multiplicative watermarking technique for video, which is based on the principles of our successful MASK audio watermark. Audio-MASK embeds the watermark by modulating the short-time envelope of the audio signal and performs detection using a simple envelope detector followed by a SPOMF (symmetrical phase-only matched filter). Video-MASK takes a similar approach and modulates the image luminance envelope. In addition, it incorporates a simple model to account for the luminance sensitivity of the HVS (human visual system). Preliminary tests show the algorithm's transparency and robustness to lossy compression.
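    The multiplicative envelope-modulation idea can be sketched as below, with a plain correlation score standing in for the SPOMF detector and invented luminance data:

```python
import numpy as np

def embed(signal, wm, alpha=0.05):
    """Multiplicative watermark in the spirit of MASK: modulate the signal's
    (here: luminance) envelope by a small bipolar sequence."""
    return signal * (1.0 + alpha * wm)

def score(received, wm):
    """Simple normalized correlation against the known sequence (a stand-in
    for the envelope detector + SPOMF stage)."""
    return np.mean(received * wm) / np.mean(np.abs(received))

rng = np.random.default_rng(7)
wm = rng.choice([-1.0, 1.0], size=4096)       # bipolar watermark sequence
luminance = rng.uniform(50, 200, size=4096)   # hypothetical pixel luminances
marked = embed(luminance, wm)

print(score(marked, wm) > score(luminance, wm))  # marked content correlates higher
```

    The multiplicative form keeps the relative modulation depth constant, which is what lets a perceptual model like the HVS luminance-sensitivity term bound visibility.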

  10. Vision based techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except at regions close to the FOE. Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.
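
    The range-from-motion geometry described above can be sketched for the idealized case of pure camera translation (a textbook simplification; function and parameter names below are illustrative, not from the NASA Ames system). Range is proportional to a feature's distance from the FOE divided by its radial disparity, which is exactly why the estimate degrades as the disparity shrinks near the FOE:

```python
import math

def passive_range(p0, p1, foe, speed, dt):
    """Passive range estimate for one image feature tracked across two
    frames, assuming pure camera translation along the optical axis
    (an idealized sketch, not the NASA Ames implementation).
    Range ~ V * dt * r1 / (r1 - r0), where r0 and r1 are the feature's
    distances from the focus of expansion (FOE) in the two frames.
    The radial disparity (r1 - r0) tends to 0 near the FOE, so the
    estimate becomes noise-dominated there, as noted above."""
    r0 = math.dist(p0, foe)
    r1 = math.dist(p1, foe)
    disparity = r1 - r0
    if disparity <= 0:
        return float("inf")  # no measurable expansion: range unresolved
    return speed * dt * r1 / disparity
```

    Under a pinhole-camera model this recovers the true depth exactly: a point at 100 m viewed from a camera closing at 10 m/s yields a 100 m estimate from its two projected positions.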

  11. Extensive analysis of potentialities and limitations of a maximum cross-correlation technique for surface circulation by using realistic ocean model simulations

    NASA Astrophysics Data System (ADS)

    Doronzo, Bartolomeo; Taddei, Stefano; Brandini, Carlo; Fattorini, Maria

    2015-08-01

    As shown in the literature, ocean surface circulation can be estimated from sequential satellite imagery by using the maximum cross-correlation (MCC) technique. This approach is very promising since it offers the potential to acquire synoptic-scale coverage of the surface currents on a quasi-continuous temporal basis. However, MCC also has many limitations due, for example, to cloud cover or to the assumption that Sea Surface Temperature (SST) or other surface parameters from satellite imagery behave as conservative passive tracers. Also, since MCC can detect only advective flows, it might not work properly in shallow water, where local heating and cooling, upwelling and other small-scale processes have a strong influence. Another limitation of the MCC technique is the impossibility of detecting currents moving along surface temperature fronts. The accuracy and reliability of MCC can be analysed by comparing the estimated velocities with those measured by in situ instrumentation, but the low number of experimental measurements does not allow a systematic statistical study of the potential and limitations of the method. Instead, an extensive analysis of these features can be done by applying the MCC to synthetic imagery obtained from a realistic numerical ocean model that takes into account most physical phenomena. In this paper a multi-window (MW-) MCC technique is proposed, and its application to synthetic imagery obtained by a regional high-resolution implementation of the Regional Ocean Modeling System (ROMS) is discussed. An application of the MW-MCC algorithm to a real case and a comparison with experimental measurements are then shown.
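
    The core MCC matching step can be sketched with plain NumPy (a minimal sketch; function and parameter names are illustrative, not from the paper): a template window from the first tracer image is compared against shifted windows in the second image, and the displacement that maximizes the normalized cross-correlation gives the advective velocity estimate:

```python
import numpy as np

def mcc_vector(img0, img1, y, x, win=8, search=4, dt=3600.0, dx=1000.0):
    """Maximum cross-correlation (MCC) sketch for one sub-window.
    img0, img1: sequential tracer images (e.g. SST) separated by dt
    seconds, on a grid with dx metres per pixel. Returns the velocity
    (u, v) in m/s and the peak correlation value."""
    tpl = img0[y:y + win, x:x + win].astype(float)
    tpl -= tpl.mean()
    best_r, best_uv = -np.inf, (0.0, 0.0)
    for dyy in range(-search, search + 1):
        for dxx in range(-search, search + 1):
            cand = img1[y + dyy:y + dyy + win, x + dxx:x + dxx + win].astype(float)
            cand -= cand.mean()
            denom = np.sqrt((tpl ** 2).sum() * (cand ** 2).sum())
            if denom == 0:
                continue
            r = (tpl * cand).sum() / denom
            if r > best_r:  # keep the displacement with highest correlation
                best_r, best_uv = r, (dxx * dx / dt, dyy * dx / dt)
    return best_uv, best_r
```

    The multi-window (MW-) variant proposed in the paper would, presumably, repeat such a search over a hierarchy of window sizes; only the single-window kernel is shown here.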

  12. Antimisting kerosene: Base fuel effects, blending and quality control techniques

    NASA Technical Reports Server (NTRS)

    Yavrouian, A. H.; Ernest, J.; Sarohia, V.

    1984-01-01

    The problems associated with blending of the AMK additive with Jet A, and the base fuel effects on AMK properties, are addressed. The results from the evaluation of some of the quality control techniques for AMK are presented. The principal conclusions of this investigation are: significant compositional differences exist for base fuel (Jet A) within the ASTM specification D1655; higher aromatic content of the base fuel was found to be beneficial for polymer dissolution at ambient (20 C) temperature; using static mixer technology, the antimisting additive (FM-9) is in-line blended with Jet A, producing AMK which has adequate fire-protection properties 15 to 20 minutes after blending; degradability of freshly blended and equilibrated AMK indicated that maximum degradability is reached after adequate fire protection is obtained; the results of AMK degradability as measured by filter ratio confirmed previous RAE data that power requirements to degrade freshly blended AMK are significantly higher than for equilibrated AMK; blending of the additive by using FM-9 concentrate in Jet A produces equilibrated AMK almost instantly; nephelometry offers a simple continuous monitoring capability and is used as a real-time quality control device for AMK; and trajectory (jet thrust) and pressure drop tests are useful laboratory techniques for evaluating AMK quality.

  13. Pseudorandom Noise Code-Based Technique for Cloud and Aerosol Discrimination Applications

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Prasad, Narasimha S.; Flood, Michael A.; Harrison, Fenton Wallace

    2011-01-01

    NASA Langley Research Center is working on a continuous wave (CW) laser based remote sensing scheme for the detection of CO2 and O2 from space based platforms suitable for the ACTIVE SENSING OF CO2 EMISSIONS OVER NIGHTS, DAYS, AND SEASONS (ASCENDS) mission. ASCENDS is a future space-based mission to determine the global distribution of sources and sinks of atmospheric carbon dioxide (CO2). A unique, multi-frequency, intensity modulated CW (IMCW) laser absorption spectrometer (LAS) operating at 1.57 micron for CO2 sensing has been developed. Effective aerosol and cloud discrimination techniques are being investigated in order to determine concentration values with accuracies less than 0.3%. In this paper, we discuss the demonstration of a PN code based technique for cloud and aerosol discrimination applications. The possibility of using maximum length (ML) sequences for range and absorption measurements is investigated. A simple model for accomplishing this objective is formulated. Proof-of-concept experiments, carried out using a SONAR-based LIDAR simulator built from simple audio hardware, provided promising results for extension into optical wavelengths. Keywords: ASCENDS, CO2 sensing, O2 sensing, PN codes, CW lidar

  14. Pseudorandom noise code-based technique for cloud and aerosol discrimination applications

    NASA Astrophysics Data System (ADS)

    Campbell, Joel; Prasad, Narasimha S.; Flood, Michael; Harrison, Wallace

    2011-06-01

    NASA Langley Research Center is working on a continuous wave (CW) laser based remote sensing scheme for the detection of CO2 and O2 from space based platforms suitable for the ACTIVE SENSING OF CO2 EMISSIONS OVER NIGHTS, DAYS, AND SEASONS (ASCENDS) mission. ASCENDS is a future space-based mission to determine the global distribution of sources and sinks of atmospheric carbon dioxide (CO2). A unique, multi-frequency, intensity modulated CW (IMCW) laser absorption spectrometer (LAS) operating at 1.57 micron for CO2 sensing has been developed. Effective aerosol and cloud discrimination techniques are being investigated in order to determine concentration values with accuracies less than 0.3%. In this paper, we discuss the demonstration of a PN code based technique for cloud and aerosol discrimination applications. The possibility of using maximum length (ML) sequences for range and absorption measurements is investigated. A simple model for accomplishing this objective is formulated. Proof-of-concept experiments, carried out using a SONAR-based LIDAR simulator built from simple audio hardware, provided promising results for extension into optical wavelengths.

  15. Herd-scale measurements of methane emissions from cattle grazing extensive sub-tropical grasslands using the open-path laser technique.

    PubMed

    Tomkins, N W; Charmley, E

    2015-12-01

    Methane (CH4) emissions associated with beef production systems in northern Australia are yet to be quantified. Methodologies are available to measure emissions, but application in extensive grazing environments is challenging. A micrometeorological methodology for estimating herd-scale emissions using an indirect open-path spectroscopic technique and an atmospheric dispersion model is described. The methodology was deployed on five cattle properties across Queensland and Northern Territory, with measurements conducted on two occasions at one site. On each deployment, data were collected every 10 min for up to 7 h a day over 4 to 16 days. To increase the atmospheric concentration of CH4 to measurable levels, cattle were confined to a known area around water points from ~0800 to 1600 h, during which time measurements of wind statistics and line-averaged CH4 concentration were taken. Filtering to remove erroneous data accounted for 35% of total observations. For five of the six deployments CH4 emissions were within the expected range of 0.4 to 0.6 g/kg BW. At one site, emissions were ~2 times expected values. There was small but consistent variation with time of day, although for some deployments measurements taken early in the day tended to be higher than at other times. There was a weak linear relationship (R² = 0.47) between animal BW and CH4 emission per kg BW. Where it was possible to compare emissions in the early and late dry season at one site, it was speculated that higher emissions in the late dry season may be attributable to poorer diet quality. It is concluded that the micrometeorological methodology using open-path lasers can be successfully deployed in extensive grazing conditions to directly measure CH4 emissions from cattle at a herd scale. PMID:26290115

  16. Rapid prototyping of extrusion dies using layer-based techniques

    SciTech Connect

    Misiolek, W.Z.; Winther, K.T.; Prats, A.E.; Rock, S.J.

    1999-02-01

    Extrusion die design and development often requires significant craftsman skill and iterative improvement to arrive at a production-ready die geometry. Constructing the dies used during this iterative process from layers, rather than from one solid block of material, offers unique opportunities to improve die development efficiency when coupled with concepts drawn from the rapid prototyping field. This article presents a proof-of-concept illustrating the potential utility of layer-based extrusion dies for the die design and fabrication process. The major benefits include greater flexibility in the design process, a more efficient, automated fabrication technique, and a means for performing localized die modifications and repairs.

  17. Simultaneous algebraic reconstruction technique based on guided image filtering.

    PubMed

    Ji, Dongjiang; Qu, Gangrong; Liu, Baodong

    2016-07-11

    The challenge of computed tomography is to reconstruct high-quality images from few-view projections. Using a prior guidance image, guided image filtering smooths images while preserving edge features. The prior guidance image can be incorporated into the image reconstruction process to improve image quality. We propose a new simultaneous algebraic reconstruction technique based on guided image filtering. Specifically, the prior guidance image is updated in the image reconstruction process, merging information iteratively. To validate the algorithm's practicality and efficiency, experiments were performed with numerical phantom projection data and real projection data. The results demonstrate that the proposed method is effective and efficient for nondestructive testing and rock mechanics. PMID:27410859

  18. Foreign fiber detecting system based on multispectral technique

    NASA Astrophysics Data System (ADS)

    Li, Qi; Han, Shaokun; Wang, Ping; Wang, Liang; Xia, Wenze

    2015-08-01

    This paper presents a foreign fiber detection system based on a multispectral technique. The absorption rate and reflectivity of foreign fibers differ under different wavelengths of light, so the image characteristics differ under different illumination. An improved contrast pyramid image fusion algorithm with adaptive enhancement is used to extract the foreign fibers from the cotton background. The experimental results show that a single light source can detect six kinds of foreign fiber in cotton, while multispectral detection can detect eight kinds.

  19. NIOS II processor-based acceleration of motion compensation techniques

    NASA Astrophysics Data System (ADS)

    González, Diego; Botella, Guillermo; Mookherjee, Soumak; Meyer-Bäse, Uwe; Meyer-Bäse, Anke

    2011-06-01

    This paper focuses on the hardware acceleration of motion compensation techniques suitable for MPEG video compression. A number of representative motion estimation search algorithms and new perspectives are introduced. The methods and designs described here are suited to the medical imaging domain, where larger images are involved. The structure of the processing systems considered is a good fit for reconfigurable acceleration. The system is based on an FPGA platform running the Nios II microprocessor with C2H acceleration applied. The paper presents the results in terms of performance and resources needed.

  20. Laser jamming technique research based on combined fiber laser

    NASA Astrophysics Data System (ADS)

    Jie, Xu; Shanghong, Zhao; Rui, Hou; Shengbao, Zhan; Lei, Shi; Jili, Wu; Shaoqiang, Fang; Yongjun, Li

    2009-06-01

    A compact and light laser jamming source is needed to increase the flexibility of laser jamming technique. A novel laser jamming source based on combined fiber lasers is proposed. Preliminary experimental results show that power levels in excess of 10 kW could be achieved. An example of laser jamming used for an air-to-air missile is given. It shows that the tracking system could complete tracking in only 4 s and came into a steady state with its new tracking target being the laser jamming source.

  1. Signature extension studies

    NASA Technical Reports Server (NTRS)

    Vincent, R. K.; Thomas, G. S.; Nalepka, R. F.

    1974-01-01

    The importance of specific spectral regions to signature extension is explored. In the recent past, the signature extension task was focused on the development of new techniques. Tested techniques are now used to investigate this spectral aspect of the large area survey. Sets of channels were sought which, for a given technique, were the least affected by several sources of variation over four data sets and yet provided good object class separation on each individual data set. Using sets of channels determined as part of this study, signature extension was accomplished between data sets collected over a six-day period and over a range of about 400 kilometers.

  2. Evolutionary Based Techniques for Fault Tolerant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Lohn, Jason D.

    2006-01-01

    The use of SRAM-based Field Programmable Gate Arrays (FPGAs) is becoming more and more prevalent in space applications. Commercial-grade FPGAs are potentially susceptible to permanently debilitating Single-Event Latchups (SELs). Repair methods based on Evolutionary Algorithms may be applied to FPGA circuits to enable successful fault recovery. This paper presents the experimental results of applying such methods to repair four commonly used circuits (quadrature decoder, 3-by-3-bit multiplier, 3-by-3-bit adder, 440-7 decoder) into which a number of simulated faults have been introduced. The results suggest that evolutionary repair techniques can improve the process of fault recovery when used instead of or as a supplement to Triple Modular Redundancy (TMR), which is currently the predominant method for mitigating FPGA faults.
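
    The repair loop can be illustrated with a toy (1+1) evolutionary algorithm on a lookup-table abstraction of a faulty gate (entirely hypothetical; the paper evolves actual FPGA configurations): bit-flip mutation plus fitness-based selection against the circuit's truth table recovers a working configuration:

```python
import random

def evolve_repair(faulty_lut, truth_table, generations=20000, seed=1):
    """(1+1) evolutionary repair sketch. The 'circuit' is a 4-entry
    lookup table for a 2-input gate; a simulated fault has corrupted
    its bits. Each generation mutates the current best configuration
    and keeps the child if its truth-table fitness does not decrease."""
    rng = random.Random(seed)

    def fitness(lut):
        # number of truth-table rows the configuration gets right
        return sum(lut[2 * a + b] == out for (a, b), out in truth_table.items())

    best = list(faulty_lut)
    target = len(truth_table)
    for _ in range(generations):
        if fitness(best) == target:
            break  # fully repaired
        child = [bit ^ (rng.random() < 0.25) for bit in best]  # bit-flip mutation
        if fitness(child) >= fitness(best):
            best = child
    return best
```

    For an XOR gate whose LUT a simulated fault has stuck at all-ones, the loop recovers a configuration matching the XOR truth table; real FPGA repair searches a vastly larger configuration space but follows the same mutate-evaluate-select cycle.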

  3. On combining Laplacian and optimization-based mesh smoothing techniques

    SciTech Connect

    Freitag, L.A.

    1997-07-01

    Local mesh smoothing algorithms have been shown to be effective in repairing distorted elements in automatically generated meshes. The simplest such algorithm is Laplacian smoothing, which moves grid points to the geometric center of incident vertices. Unfortunately, this method operates heuristically and can create invalid meshes or elements of worse quality than those contained in the original mesh. In contrast, optimization-based methods are designed to maximize some measure of mesh quality and are very effective at eliminating extremal angles in the mesh. These improvements come at a higher computational cost, however. In this article the author proposes three smoothing techniques that combine a smart variant of Laplacian smoothing with an optimization-based approach. Several numerical experiments are performed that compare the mesh quality and computational cost for each of the methods in two and three dimensions. The author finds that the combined approaches are very cost effective and yield high-quality meshes.
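
    The combined approach can be sketched in 2-D (names and the quality callable are illustrative; the article's measures are element-based angle metrics): each free vertex is moved to the centroid of its neighbors only if a local quality measure does not get worse, combining Laplacian speed with an optimization-style acceptance test:

```python
import numpy as np

def smart_laplacian(verts, adjacency, free, quality, iters=5):
    """Smart Laplacian smoothing sketch. verts: (n, 2) coordinates;
    adjacency: dict vertex -> list of neighbor indices; free: vertices
    allowed to move; quality(i, pos, verts): local quality at candidate
    position pos (larger is better). The centroid proposal is accepted
    only when it does not reduce the local quality."""
    verts = verts.copy()
    for _ in range(iters):
        for i in free:
            proposal = verts[adjacency[i]].mean(axis=0)  # Laplacian move
            if quality(i, proposal, verts) >= quality(i, verts[i], verts):
                verts[i] = proposal                      # accept
    return verts
```

    With a max-distance-to-neighbors quality measure, an off-center vertex surrounded by a square relaxes to the square's center; plugging in a minimum-element-angle metric instead would reproduce the invalid-element rejection behavior described above.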

  4. RBF-based technique for statistical demodulation of pathological tremor.

    PubMed

    Gianfelici, Francesco

    2013-10-01

    This paper presents an innovative technique based on the joint approximation capabilities of radial basis function (RBF) networks and the estimation capability of the multivariate iterated Hilbert transform (IHT) for the statistical demodulation of pathological tremor from electromyography (EMG) signals in patients with Parkinson's disease. We define a stochastic model of the multichannel high-density surface EMG by means of the RBF networks applied to the reconstruction of the stochastic process (characterizing the disease) modeled by the multivariate relationships generated by the Karhunen-Loève transform in Hilbert spaces. Next, we perform a demodulation of the entire random field by means of the estimation capability of the multivariate IHT in a statistical setting. The proposed method is applied to both simulated signals and data recorded from three Parkinsonian patients, and the results show that the amplitude modulation components of the tremor oscillation can be estimated with a signal-to-noise ratio close to 30 dB and a correspondingly low root-mean-square error for the estimates of the tremor instantaneous frequency. Additionally, comparisons with a large number of techniques based on all the combinations of the RBF, extreme learning machine, backpropagation and support vector machine used in the first step of the algorithm, and the IHT, empirical mode decomposition, multiband energy separation algorithm, periodic algebraic separation and energy demodulation used in the second step, clearly show the effectiveness of our technique. These results show that the proposed approach is a potentially useful tool for advanced neurorehabilitation technologies that aim at tremor characterization and suppression. PMID:24808594
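
    The first ingredient, RBF-network approximation, can be sketched in isolation (a generic Gaussian-RBF least-squares fit under assumed parameters; the paper's full pipeline with the multivariate IHT is far more involved and is not reproduced here):

```python
import numpy as np

def rbf_design(x, centers, gamma):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-gamma * (x_i - c_j)^2)."""
    return np.exp(-gamma * (x[:, None] - centers[None, :]) ** 2)

def rbf_fit(x, y, centers, gamma=2.0):
    """Least-squares weights for an RBF network approximating y(x)."""
    w, *_ = np.linalg.lstsq(rbf_design(x, centers, gamma), y, rcond=None)
    return w

def rbf_eval(xq, centers, w, gamma=2.0):
    """Evaluate the fitted RBF network at query points xq."""
    return rbf_design(xq, centers, gamma) @ w
```

    A network of 15 Gaussian units fitted this way reproduces a smooth target such as a sine wave to within a few percent, which is the kind of functional reconstruction the stochastic EMG model builds on.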

  5. Modern Micro and Nanoparticle-Based Imaging Techniques

    PubMed Central

    Ryvolova, Marketa; Chomoucka, Jana; Drbohlavova, Jana; Kopel, Pavel; Babula, Petr; Hynek, David; Adam, Vojtech; Eckschlager, Tomas; Hubalek, Jaromir; Stiborova, Marie; Kaiser, Jozef; Kizek, Rene

    2012-01-01

    The requirements for early diagnostics as well as effective treatment of insidious diseases such as cancer constantly increase the pressure on development of efficient and reliable methods for targeted drug/gene delivery as well as imaging of the treatment success/failure. One of the most recent approaches covering both the drug delivery as well as the imaging aspects is benefitting from the unique properties of nanomaterials. Therefore a new field called nanomedicine is attracting continuously growing attention. Nanoparticles, including fluorescent semiconductor nanocrystals (quantum dots) and magnetic nanoparticles, have proven their excellent properties for in vivo imaging techniques in a number of modalities such as magnetic resonance and fluorescence imaging, respectively. In this article, we review the main properties and applications of nanoparticles in various in vitro imaging techniques, including microscopy and/or laser breakdown spectroscopy and in vivo methods such as magnetic resonance imaging and/or fluorescence-based imaging. Moreover the advantages of the drug delivery performed by nanocarriers such as iron oxides, gold, biodegradable polymers, dendrimers, lipid based carriers such as liposomes or micelles are also highlighted. PMID:23202187

  6. Techniques for region coding in object-based image compression

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.

    2004-01-01

    Object-based compression (OBC) is an emerging technology that combines region segmentation and coding to produce a compact representation of a digital image or video sequence. Previous research has focused on a variety of segmentation and representation techniques for regions that comprise an image. The author has previously suggested [1] partitioning of the OBC problem into three steps: (1) region segmentation, (2) region boundary extraction and compression, and (3) region contents compression. A companion paper [2] surveys implementationally feasible techniques for boundary compression. In this paper, we analyze several strategies for region contents compression, including lossless compression, lossy VPIC, EPIC, and EBLAST compression, wavelet-based coding (e.g., JPEG-2000), as well as texture matching approaches. This paper is part of a larger study that seeks to develop highly efficient compression algorithms for still and video imagery, which would eventually support automated object recognition (AOR) and semantic lookup of images in large databases or high-volume OBC-format datastreams. Example applications include querying journalistic archives, scientific or medical imaging, surveillance image processing and target tracking, as well as compression of video for transmission over the Internet. Analysis emphasizes time and space complexity, as well as sources of reconstruction error in decompressed imagery.

  7. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    SciTech Connect

    Taverner, Thomas; Karpievitch, Yuliya; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-09-15

    Motivation: The size and complex nature of LC-MS proteomics data sets motivates development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing functionality by including their own algorithms under the Add-On tab. Availability: DanteR and its associated user guide are available for download at http://omics.pnl.gov/software/. For Windows, a single click automatically installs DanteR along with the R programming environment. For Linux and Mac OS X, users must first install R and then follow instructions on the DanteR web site for package installation.

  8. Surfer: An Extensible Pull-Based Framework for Resource Selection and Ranking

    NASA Technical Reports Server (NTRS)

    Zolano, Paul Z.

    2004-01-01

    Grid computing aims to connect large numbers of geographically and organizationally distributed resources to increase computational power, resource utilization, and resource accessibility. In order to effectively utilize grids, users need to be connected to the best available resources at any given time. As grids are in constant flux, users cannot be expected to keep up with the configuration and status of the grid, thus they must be provided with automatic resource brokering for selecting and ranking resources meeting constraints and preferences they specify. This paper presents a new OGSI-compliant resource selection and ranking framework called Surfer that has been implemented as part of NASA's Information Power Grid (IPG) project. Surfer is highly extensible and may be integrated into any grid environment by adding information providers knowledgeable about that environment.
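
    The brokering step can be sketched generically (all names hypothetical; Surfer's actual OGSI interfaces are not shown): resources are first filtered by hard constraints, then ranked by weighted preferences:

```python
def select_and_rank(resources, constraints, preferences):
    """Generic resource-brokering sketch: keep the resources satisfying
    every hard constraint, then sort by a weighted preference score
    (higher is better). resources: list of attribute dicts;
    constraints: list of predicates; preferences: {attribute: weight}."""
    feasible = [r for r in resources
                if all(check(r) for check in constraints)]
    score = lambda r: sum(w * r.get(attr, 0.0) for attr, w in preferences.items())
    return sorted(feasible, key=score, reverse=True)
```

    For example, requiring at least 8 CPUs and preferring free memory ranks a large-memory host above a many-core host while excluding machines that fail the constraint outright.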

  9. A restrained-torque-based motion instructor: forearm flexion/extension-driving exoskeleton

    NASA Astrophysics Data System (ADS)

    Nishimura, Takuya; Nomura, Yoshihiko; Sakamoto, Ryota

    2013-01-01

    When learning complicated movements by ourselves, we encounter problems such as self-rightness. Self-rightness results in a lack of detail and objectivity, and it may cause us to miss essences or even distort them. Thus, we sometimes fall into the habit of performing inappropriate motions. To solve these problems, or to alleviate them as much as possible, we have developed mechanical man-machine interfaces to support the learning of such motions as cultural gestures and sports forms. One of the promising interfaces is a wearable exoskeleton mechanical system. As a first attempt, we have made a prototype of a 2-link, 1-DOF rotational elbow joint interface for teaching extension-flexion operations of the forearm, and have found it has potential for teaching the initiation and continuation of elbow flexion motions.

  10. Testing of Large Diameter Fresnel Optics for Space Based Observations of Extensive Air Showers

    NASA Technical Reports Server (NTRS)

    Adams, James H.; Christl, Mark J.; Young, Roy M.

    2011-01-01

    The JEM-EUSO mission will detect extensive air showers produced by extreme energy cosmic rays. It operates from the ISS looking down on Earth's nighttime atmosphere to detect the nitrogen fluorescence and Cherenkov light produced by the charged particles in the EAS. The JEM-EUSO science objectives require a large field of view, sensitivity to energies below 50 EeV, and must fit within available ISS resources. The JEM-EUSO optic module uses three large diameter, thin plastic lenses with Fresnel surfaces to meet the instrument requirements. A bread-board model of the optic has been manufactured and has undergone preliminary tests. We report the results of optical performance tests and evaluate the present capability to manufacture these optical elements.

  11. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  12. A New Rerouting Technique for the Extensor Pollicis Longus in Palliative Treatment for Wrist and Finger Extension Paralysis Resulting From Radial Nerve and C5C6C7 Root Injury.

    PubMed

    Laravine, Jennifer; Cambon-Binder, Adeline; Belkheyar, Zoubir

    2016-03-01

    Wrist and finger extension paralysis is a consequence of an injury to the radial nerve or the C5C6C7 roots. Despite these 2 different levels of lesions, palliative treatment for this type of paralysis relies on the same tendon transfers. A large majority of patients are able to compensate for a deficiency of wrist and finger extension. However, a deficiency in the opening of the first web space frequently persists, which may call for transfers to the abductor pollicis longus, the extensor pollicis brevis, and the extensor pollicis longus (EPL). The aim of this work was to evaluate the feasibility of a new EPL rerouting technique outside of Lister's tubercle. Another aim was to verify whether this technique allows a better opening of the thumb-index pinch in this type of paralysis. In the first part, we performed an anatomic study comparing the EPL rerouting technique with the technique frequently used for wrist and finger extension paralyses. In the second part, we present 2 clinical cases in which this new technique was applied. Preliminary results from this study favor the EPL rerouting technique. This is a simple and reproducible technique that allows for good opening of the first web space in the treatment of wrist and finger extension paralysis. PMID:26709570

  13. Evaluations of mosquito age grading techniques based on morphological changes.

    PubMed

    Hugo, L E; Quick-Miles, S; Kay, B H; Ryan, P A

    2008-05-01

    Evaluations were made of the accuracy and practicality of mosquito age grading methods based on changes to mosquito morphology; including the Detinova ovarian tracheation, midgut meconium, Polovodova ovariole dilatation, ovarian injection, and daily growth line methods. Laboratory maintained Aedes vigilax (Skuse) and Culex annulirostris (Skuse) females of known chronological and physiological ages were used for these assessments. Application of the Detinova technique to laboratory reared Ae. vigilax females in a blinded trial enabled the successful identification of nulliparous and parous females in 83.7-89.8% of specimens. The success rate for identifying nulliparous females increased to 87.8-98.0% when observations of ovarian tracheation were combined with observations of the presence of midgut meconium. However, application of the Polovodova method only enabled 57.5% of nulliparous, 1-parous, 2-parous, and 3-parous Ae. vigilax females to be correctly classified, and ovarian injections were found to be unfeasible. Poor correlation was observed between the number of growth lines per phragma and the calendar age of laboratory reared Ae. vigilax females. In summary, morphological age grading methods that offer simple two-category predictions (ovarian tracheation and midgut meconium methods) were found to provide high-accuracy classifications, whereas methods that offer the separation of multiple age categories (ovariolar dilatation and growth line methods) were found to be extremely difficult and of low accuracy. The usefulness of the morphology-based methods is discussed in view of the availability of new mosquito age grading techniques based on cuticular hydrocarbon and gene transcription changes. PMID:18533427

  14. Knee extension isometric torque production differences based on verbal motivation given to introverted and extroverted female children.

    PubMed

    McWhorter, J Wesley; Landers, Merrill; Young, Daniel; Puentedura, E Louie; Hickman, Robbin A; Brooksby, Candi; Liveratti, Marc; Taylor, Lisa

    2011-08-01

    To date, little research has been conducted to test the efficacy of different forms of motivation based on a female child's personality type. The purpose of this study was to evaluate the ability of female children to perform a maximal knee extension isometric torque test with varying forms of motivation, based on the child's personality type (introvert vs. extrovert). The subjects were asked to perform a maximal isometric knee extension test under three different conditions: 1) with no verbal motivation, 2) with verbal motivation from the evaluator only, and 3) with verbal motivation from a group of their peers and the evaluator combined. A 2×3 mixed ANOVA was significant for an interaction (F(2,62) = 17.530; p < 0.0005). Post hoc testing for the introverted group showed that scores without verbal motivation were significantly higher than with verbal motivation from the evaluator or the evaluator plus the peers. The extroverted group revealed that scores with verbal motivation from the evaluator or the evaluator plus the peers were significantly higher than without verbal motivation. Results suggest that verbal motivation has a varying effect on isometric knee extension torque production in female children with different personality types. Extroverted girls perform better with motivation, whereas introverted girls perform better without motivation from others. PMID:20812856

  15. Detecting Molecular Properties by Various Laser-Based Techniques

    SciTech Connect

    Hsin, Tse-Ming

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of the dimer geometry. Direct observation of vibrational peaks and evolution of coumarin 153 in the electronic excited state was demonstrated by using the fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibrational peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with biomimetic containers (liposomes), allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interferences from glass surfaces. The result showed a wide distribution of the enzyme reactivity. Protein structural variation is one of the major factors responsible for this highly heterogeneous behavior.

  16. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.; Mahmood, Q.

    1978-01-01

    A sputtering system was developed to deposit aluminum and aluminum alloys by the dc sputtering technique. This system is designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation. It is now ready for studying the effects of deposition and annealing parameters upon double-level metal preparation. A technique recently applied to semiconductor analysis, the finite element method, was studied for use in the computer modeling of two-dimensional MOS transistor structures. It was concluded that the method has not been sufficiently well developed for confident use at this time. An algorithm was therefore developed for implementing a computer study based upon the finite difference method. The program which was developed was modified and used to calculate redistribution data for boron and phosphorus which had been predeposited by ion implantation with given range and straggle conditions. Data were generated for (111)-oriented SOS films with redistribution in N2, dry O2 and steam ambients.

  17. Model-checking techniques based on cumulative residuals.

    PubMed

    Lin, D Y; Wei, L J; Ying, Z

    2002-03-01

    Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided. PMID:11890304
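The cumulative-residual idea above can be sketched for a simple linear model: order the residuals by a covariate, take their cumulative sum, and compare the observed process with null realizations generated by perturbing the residuals with zero-mean Gaussian multipliers. This is a rough sketch under simplifying assumptions; the paper's actual procedure also accounts for the variability of the estimated regression parameters when generating the null realizations:

```python
# Toy cumulative-residual model check for a correctly specified linear model.
# Observed process: cumulative sum of residuals ordered by the covariate.
# Null realizations: residuals multiplied by independent N(0, 1) variables.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)     # data from a correct linear model

# Least-squares fit of y = a + b*x and its residuals.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Observed process over the ordered covariate.
order = np.argsort(x)
W_obs = np.cumsum(resid[order]) / np.sqrt(n)

# Null realizations and a supremum-type p-value.
sup_null = np.array([
    np.max(np.abs(np.cumsum(resid[order] * rng.normal(size=n)) / np.sqrt(n)))
    for _ in range(500)
])
p_value = np.mean(sup_null >= np.max(np.abs(W_obs)))
print(f"supremum-test p-value: {p_value:.3f}")
```

A large p-value here suggests the trend in the residual plot is natural variation rather than misspecification.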

  18. Human resource development for a community-based health extension program: a case study from Ethiopia

    PubMed Central

    2013-01-01

    Introduction Ethiopia is one of the sub-Saharan countries most affected by high disease burden, aggravated by a shortage and imbalance of human resources, geographical distance, and socioeconomic factors. In 2004, the government introduced the Health Extension Program (HEP), a primary care delivery strategy, to address the challenges and achieve the World Health Organization Millennium Development Goals (MDGs) within a context of limited resources. Case description The health system was reformed to create a platform for integration and institutionalization of the HEP with appropriate human capacity, infrastructure, and management structures. Human resources were developed through training of female health workers recruited from their prospective villages, designed to limit the high staff turnover and address gender, social and cultural factors in order to provide services acceptable to each community. The service delivery modalities include household, community and health facility care. Thus, the most basic health post infrastructure, designed to rapidly and cost-effectively scale up HEP, was built in each village. In line with the country’s decentralized management system, the HEP service delivery is under the jurisdiction of the district authorities. Discussion and evaluation The nationwide implementation of HEP progressed in line with its target goals. In all, 40 training institutions were established, and over 30,000 Health Extension Workers have been trained and deployed to approximately 15,000 villages. The potential health service coverage reached 92.1% in 2011, up from 64% in 2004. While most health indicators have improved, performance in skilled delivery and postnatal care has not been satisfactory. 
While HEP is considered the most important institutional framework for achieving the health MDGs in Ethiopia, quality of service, utilization rate, access and referral linkage to emergency obstetric care, management, and evaluation of the program are the key

  19. Astronomical Image Compression Techniques Based on ACC and KLT Coder

    NASA Astrophysics Data System (ADS)

    Schindler, J.; Páta, P.; Klíma, M.; Fliegel, K.

    This paper deals with the compression of image data in astronomy applications. Astronomical images have characteristic properties -- high grayscale bit depth, large size, noise occurrence and special processing algorithms. They belong to the class of scientific images. Their processing and compression is quite different from the classical approach of multimedia image processing. The database of images from BOOTES (Burst Observer and Optical Transient Exploring System) has been chosen as a source of the testing signal. BOOTES is a Czech-Spanish robotic telescope for observing AGN (active galactic nuclei) and searching for the optical transients of GRBs (gamma-ray bursts). This paper discusses an approach based on an analysis of statistical properties of image data. A comparison of two irrelevancy reduction methods is presented from a scientific (astrometric and photometric) point of view. The first method is based on a statistical approach, using the Karhunen-Loève transform (KLT) with uniform quantization in the spectral domain. The second technique is derived from wavelet decomposition with adaptive selection of used prediction coefficients. Finally, the comparison of three redundancy reduction methods is discussed. The multimedia format JPEG2000 and HCOMPRESS, designed especially for astronomical images, are compared with the new Astronomical Context Coder (ACC) based on adaptive median regression.
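The first irrelevancy reduction method, KLT with uniform quantization in the spectral domain, can be sketched on a toy image: derive the KLT basis from the covariance of 8×8 blocks, quantize the transform coefficients uniformly, and invert. The random test image and quantization step below are illustrative stand-ins for the BOOTES data:

```python
# KLT (Karhunen-Loève transform) block compression sketch with a uniform
# quantizer in the spectral domain. Toy 64x64 image, 8x8 blocks.
import numpy as np

rng = np.random.default_rng(1)
img = rng.normal(size=(64, 64)).cumsum(axis=0).cumsum(axis=1)  # smooth-ish toy image

B = 8
blocks = img.reshape(8, B, 8, B).swapaxes(1, 2).reshape(-1, B * B)

# KLT basis: eigenvectors of the block covariance, ordered by descending variance.
cov = np.cov(blocks, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
klt = eigvecs[:, ::-1]

coeffs = blocks @ klt                        # forward transform
step = 0.5
quantized = np.round(coeffs / step) * step   # uniform quantizer

# Inverse transform and block reassembly; the orthonormal basis makes the
# reconstruction error equal to the quantization error in the spectral domain.
recon = (quantized @ klt.T).reshape(8, 8, B, B).swapaxes(1, 2).reshape(64, 64)
mse = np.mean((img - recon) ** 2)
print(f"reconstruction MSE: {mse:.4f}")
```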

  20. Vegetation change detection based on image fusion technique

    NASA Astrophysics Data System (ADS)

    Jia, Yonghong; Liu, Yueyan; Yu, Hui; Li, Deren

    2005-10-01

    The change detection of land use and land cover has always been a focus of remote sensing research and application. Based on techniques of image fusion, a new approach is proposed for detecting vegetation change from the vector of the brightness index (BI) and perpendicular vegetation index (PVI) extracted from multi-temporal remotely sensed imagery. The procedure is as follows. Firstly, the Landsat ETM+ imagery is geometrically corrected and registered. Secondly, bands 2, 3, and 4 and the panchromatic image of Landsat ETM+ are fused by an à trous wavelet fusion, and bands 1, 2, and 3 of SPOT are registered to the fused images. Thirdly, the brightness index and perpendicular vegetation index are respectively extracted from the SPOT images and the fused images. Finally, change vectors are obtained and used to detect vegetation change. The testing results show that this approach to detecting vegetation change is very efficient.
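The final change-vector step can be sketched as follows: for each pixel, a change vector is formed from the (BI, PVI) values at two dates, and its magnitude is thresholded to flag vegetation change. The random index maps and the empirical threshold below are illustrative, not part of the paper:

```python
# Change-vector analysis sketch in (BI, PVI) feature space for two dates.
import numpy as np

rng = np.random.default_rng(2)
shape = (100, 100)
# Stand-in index maps; in practice these come from the fused imagery.
bi_t1, pvi_t1 = rng.normal(100, 10, shape), rng.normal(20, 5, shape)
bi_t2, pvi_t2 = bi_t1 + rng.normal(0, 2, shape), pvi_t1 + rng.normal(0, 2, shape)

# Change-vector magnitude per pixel.
magnitude = np.hypot(bi_t2 - bi_t1, pvi_t2 - pvi_t1)

# A simple empirical threshold (illustrative); pixels above it are flagged.
threshold = magnitude.mean() + 2 * magnitude.std()
change_mask = magnitude > threshold
print(f"flagged pixels: {change_mask.sum()} of {change_mask.size}")
```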

  1. Protein elasticity probed with two synchrotron-based techniques.

    SciTech Connect

    Leu, B. M.; Alatas, A.; Sinn, H.; Alp, E. E.; Said, A.; Yavas, H.; Zhao, J.; Sage, J. T.; Sturhahn, W.; X-Ray Science Division; Hasylab; Northeastern Univ.

    2010-02-25

    Compressibility characterizes three interconnecting properties of a protein: dynamics, structure, and function. The compressibility values for the electron-carrying protein cytochrome c and for other proteins, as well, available in the literature vary considerably. Here, we apply two synchrotron-based techniques - nuclear resonance vibrational spectroscopy and inelastic x-ray scattering - to measure the adiabatic compressibility of this protein. This is the first report of the compressibility of any material measured with this method. Unlike the methods previously used, this novel approach probes the protein globally, at ambient pressure, does not require the separation of protein and solvent contributions to the total compressibility, and uses samples that contain the heme iron, as in the native state. We show, by comparing our results with molecular dynamics predictions, that the compressibility is almost independent of temperature. We discuss potential applications of this method to other materials beyond proteins.

  2. Validation techniques for fault emulation of SRAM-based FPGAs

    SciTech Connect

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation system does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  3. Mars laser altimeter based on a single photon ranging technique

    NASA Technical Reports Server (NTRS)

    Prochazka, Ivan; Hamal, Karel; Sopko, B.; Pershin, S.

    1993-01-01

    The Mars 94/96 Mission will carry, among other things, the balloon probe experiment. The balloon, with the scientific cargo in the gondola underneath, will drift in the Mars atmosphere; its altitude will range from zero at night up to 5 km at noon. The gondola's altitude will be accurately determined by an altimeter. As the balloon gondola mass is strictly limited, the altimeter's total mass and power consumption are critical; the maximum allowed is a few hundred grams and a few tens of milliwatts of average power consumption. We proposed, designed, and constructed a laser altimeter based on the single photon ranging technique. Topics covered include the following: principle of operation, altimeter construction, and ground tests.

  4. Validation techniques for fault emulation of SRAM-based FPGAs

    DOE PAGESBeta

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness of mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation system does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of the behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  5. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques.

    PubMed

    Parkash, Om; Shueb, Rafidah Hanim

    2015-10-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from those of other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, a prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phases and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  6. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    PubMed Central

    Parkash, Om; Hanim Shueb, Rafidah

    2015-01-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from those of other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, a prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phases and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  7. Applications of imaged capillary isoelectric focussing technique in development of biopharmaceutical glycoprotein-based products.

    PubMed

    Anderson, Carrie L; Wang, Yang; Rustandi, Richard R

    2012-06-01

    CE-based methods have increasingly been applied to the analysis of a variety of different types of proteins. One of these techniques is imaged capillary isoelectric focusing (icIEF), a method that has been used extensively in the field of protein-based drug development as a tool for product identification, stability monitoring, and characterization. It offers many advantages over the traditional labor-intensive IEF slab gel method, and even over standard cIEF with on-line detection technologies, with regard to method development, reproducibility, robustness, and speed. Here, specific examples are provided for biopharmaceutical glycoprotein products such as mAbs, erythropoietin (EPO), and recombinant Fc-fusion proteins, though the technique can be adapted for many other therapeutic proteins. Applications of icIEF using a Convergent Bioscience instrument (Toronto, Canada) with whole-field imaging technology are presented and discussed. These include a quick method to establish an identity test for many protein-based products, product release, and stability evaluation of glycoproteins with respect to charge heterogeneity under accelerated temperature stress, different pH conditions, and in different formulations. Finally, characterization of glycoproteins using this icIEF technology is discussed with respect to biosimilar development, clone selection, and antigen binding. The data presented provide a "taste" of what the icIEF method can do to support the development of biopharmaceutical glycoprotein products, from early clone screening for better product candidates to characterization of the final commercial products. PMID:22736354

  8. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    NASA Astrophysics Data System (ADS)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One prominent approach to drift measurement relies on instrumentation, e.g. an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even over Europe, is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to ionosondes, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed by using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically, with a period as short as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
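The Doppler relation underlying the estimation can be stated as f_D = -(f0/c)·dL/dt, with the path-length derivative approximated by a finite difference between two consecutive ray-traced snapshots. All numerical values in this sketch are hypothetical:

```python
# Doppler shift from the rate of change of the propagation path length L(t),
# approximated by a finite difference over one snapshot period.
c = 299_792_458.0                # speed of light, m/s
f0 = 5e6                         # sounding frequency, Hz (illustrative)

L1, L2 = 600_000.0, 600_030.0    # ray-traced path lengths, m (hypothetical)
dt = 30.0                        # snapshot period, s (as in the abstract)

doppler = -(f0 / c) * (L2 - L1) / dt
print(f"Doppler shift: {doppler:.4f} Hz")
```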

  9. Use of extension-deformation-based crystallisation of silk fibres to differentiate their functions in nature.

    PubMed

    Numata, Keiji; Masunaga, Hiroyasu; Hikima, Takaaki; Sasaki, Sono; Sekiyama, Kazuhide; Takata, Masaki

    2015-08-21

    β-Sheet crystals play an important role in determining the stiffness, strength, and optical properties of silk and in the exhibition of silk-type-specific functions. It is important to elucidate the structural changes that occur during the stretching of silk fibres to understand the functions of different types of fibres. Herein, we elucidate the initial crystallisation behaviour of silk molecules during the stretching of three types of silk fibres using synchrotron radiation X-ray analysis. When spider dragline silk was stretched, it underwent crystallisation and the alignment of the β-sheet crystals became disordered initially but was later recovered. On the other hand, silkworm cocoon silk did not exhibit further crystallisation, whereas capture spiral silk was predominantly amorphous. Structural analyses showed that the crystallisation of silks following extension deformation has a critical effect on their mechanical and optical properties. These findings should aid the production of artificial silk fibres and facilitate the development of silk-inspired functional materials. PMID:26166211

  10. Movement Analysis of Flexion and Extension of Honeybee Abdomen Based on an Adaptive Segmented Structure

    PubMed Central

    Zhao, Jieliang; Wu, Jianing; Yan, Shaoze

    2015-01-01

    Honeybees (Apis mellifera) curl their abdomens for daily rhythmic activities. Before this was established, it had been assumed that honeybees could curl their abdomens freely. However, an intriguing but less studied feature is the possible unidirectional abdominal deformation in free-flying honeybees. A high-speed video camera was used to capture the curling and to analyze the changes in the arc length of the honeybee abdomen, not only in free-flying mode but also in fixed samples. Frozen sections and environmental scanning electron microscopy were used to investigate the microstructure and motion principle of the honeybee abdomen and to explore the physical structure restricting its curling. An adaptive segmented structure, especially the folded intersegmental membrane (FIM), plays a dominant role in the flexion and extension of the abdomen. The structural features of the FIM were utilized to mimic and exhibit the movement restriction on the honeybee abdomen. Combining experimental analysis and theoretical demonstration, a unidirectional bending mechanism of the honeybee abdomen was revealed. This finding offers a new biomimetic perspective for aerospace vehicle design. PMID:26223946

  11. Shellac and Aloe vera gel based surface coating for shelf life extension of tomatoes.

    PubMed

    Chauhan, O P; Nanjappa, C; Ashok, N; Ravi, N; Roopa, N; Raju, P S

    2015-02-01

    Shellac (S) and Aloe vera gel (AG) were used to develop edible surface coatings for the shelf-life extension of tomato fruits. The coating was prepared by dissolving de-waxed and bleached shellac in an alkaline aqueous medium, both on its own and in combination with AG. Incorporation of AG in the shellac coating improved the permeability characteristics of the coating film towards oxygen, carbon dioxide and water vapour. The coatings, when applied to tomatoes, delayed senescence, which was characterized by restricted changes in respiration and ethylene synthesis rates during storage. The texture of the fruits, measured in terms of firmness, showed restricted changes as compared to the untreated control. Similar observations were also recorded for instrumental colour (L*, a* and b* values). The developed coatings extended the shelf-life of tomatoes by 10, 8 and 12 days for the shellac (S), AG and composite (S + AG) coated fruits, respectively, when kept at ambient storage conditions (28 ± 2 °C). PMID:25694740

  12. 78 FR 7654 - Extension of Exemptions for Security-Based Swaps

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-04

    ... (Jul. 18, 2012), 77 FR 48208 (Aug. 13, 2012). Title VII amended the Securities Act and the Exchange Act..., Release No. 34-63825 (Feb. 2, 2011), 76 FR 10948 (Feb. 28, 2011) (``Security-Based SEF Proposing Release... Security-Based Swaps Issued By Certain Clearing Agencies, Release No. 33-9308 (Mar. 30, 2012), 77 FR...

  13. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. 
The FC calandria tube (the outer shell of the FC) is

  14. An extension of the 'Malkus hypothesis' to the turbulent base flow of blunt sections

    NASA Astrophysics Data System (ADS)

    Vorus, William S.; Chen, Liyong

    1987-11-01

    An approximate theory for the mean turbulent near-wake of cylindrical bodies with blunt after edges is developed and implemented in terms of a linearized closed free-streamline theory of thin blunt-based symmetric sections. In the present application, the Malkus hypothesis leads to maximization of the rate of change of mean kinetic energy along the separation-cavity streamline. The results compare well with experimental measurements of mean base pressures and section drag, although the linearizing assumptions of section-cavity slenderness and base-pressure magnitude are not so well preserved in the calculated results.

  15. Evidence-Based Programming: What Is a Process an Extension Agent Can Use to Evaluate a Program's Effectiveness?

    ERIC Educational Resources Information Center

    Fetsch, Robert J.; MacPhee, David; Boyer, Luann K.

    2012-01-01

    Extension agents and specialists have experienced increased pressure for greater program effectiveness and accountability and especially for evidence-based programs. This article builds on previously published evidence-based programming articles. It provides ideas that address three problems that Extension staff face with EBPs and that Extension…

  16. Physical, Chemical and Biochemical Modifications of Protein-Based Films and Coatings: An Extensive Review.

    PubMed

    Zink, Joël; Wyrobnik, Tom; Prinz, Tobias; Schmid, Markus

    2016-01-01

    Protein-based films and coatings are an interesting alternative to traditional petroleum-based materials. However, their mechanical and barrier properties need to be enhanced in order to match those of the latter. Physical, chemical, and biochemical methods can be used for this purpose. The aim of this article is to provide an overview of the effects of various treatments on whey, soy, and wheat gluten protein-based films and coatings. These three protein sources have been chosen since they are among the most abundantly used and are well described in the literature. Similar behavior might be expected for other protein sources. Most of the modifications are still not fully understood at a fundamental level, but all the methods discussed change the properties of the proteins and resulting products. Mastering these modifications is an important step towards the industrial implementation of protein-based films. PMID:27563881

  17. Extensive quantities in thermodynamics

    NASA Astrophysics Data System (ADS)

    Mannaerts, Sebastiaan H.

    2014-05-01

    A literature survey shows little consistency in the definitions of the term ‘extensive quantity’ (a.k.a. extensive property) as used in thermodynamics. A majority assumes that extensive quantities are those that are proportional to mass. Taking the mathematical meaning of proportional and taking the ‘mass’ to be that of the system or subsystem, it is shown that the proportionality assumption is only correct for a few extensive quantities under condition of constant composition. A large subset of extensive quantities are completely independent of mass; for most systems extensive quantities are not proportional to mass, but mass is the (extensive) constant of proportionality. The definition by IUPAC, based on the additivity of extensive quantities, is the preferred baseline for discussing this subject. It is noted however, that two types of additivity need to be distinguished and that a few intensive quantities are also additive. This paper leaves several interesting questions open to further scrutiny.
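The distinction drawn above between additivity and proportionality can be stated compactly; the notation here is a plausible formalization, not necessarily the paper's own:

```latex
% Additivity versus proportionality to mass for a quantity Q
% (notation illustrative; A and B are disjoint subsystems).
\begin{align}
  \text{additivity:} \quad & Q(A \cup B) = Q(A) + Q(B), \\
  \text{proportionality:} \quad & Q = q\,m, \quad q \text{ independent of } m.
\end{align}
```

The paper's point is that the first property holds far more generally than the second, which requires constant composition.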

  18. A New Femtosecond Laser-Based Three-Dimensional Tomography Technique

    NASA Astrophysics Data System (ADS)

    Echlin, McLean P.

    2011-12-01

    Tomographic imaging has dramatically changed science, most notably in the fields of medicine and biology, by producing 3D views of structures which are too complex to understand in any other way. Current tomographic techniques require extensive time both for post-processing and data collection. Femtosecond laser based tomographic techniques have been developed in both standard atmosphere (femtosecond laser-based serial sectioning technique - FSLSS) and in vacuum (Tri-Beam System) for the fast collection (10^5 μm³/s) of mm³-sized 3D datasets. Both techniques use femtosecond laser pulses to selectively remove layer-by-layer areas of material with low collateral damage and a negligible heat affected zone. To the author's knowledge, femtosecond lasers have never before been used for serial sectioning, and these techniques have been entirely and uniquely developed by the author and his collaborators at the University of Michigan and University of California Santa Barbara. The FSLSS was applied to measure the 3D distribution of TiN particles in a 4330 steel. Single pulse ablation morphologies and rates were measured and collected from the literature. Simultaneous two-phase ablation of TiN and the steel matrix was shown to occur at fluences of 0.9-2 J/cm². Laser scanning protocols were developed minimizing surface roughness to 0.1-0.4 μm for laser-based sectioning. The FSLSS technique was used to section and 3D reconstruct titanium nitride (TiN) containing 4330 steel. Statistical analysis of 3D TiN particle sizes, distribution parameters, and particle density was performed. A methodology was developed to use the 3D datasets to produce statistical volume elements (SVEs) for toughness modeling. Six FSLSS TiN datasets were sub-sampled into 48 SVEs for statistical analysis and toughness modeling using the Rice-Tracey and Garrison-Moody models. A two-parameter Weibull analysis was performed and the variability in the toughness data agreed well with Ruggieri et al.'s bulk toughness measurements. 
The Tri
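A two-parameter Weibull fit of the kind used for the toughness variability above can be sketched in a few lines; the toughness values below and the use of `scipy.stats.weibull_min` are illustrative assumptions, not the author's actual data or tooling.

```python
# Two-parameter Weibull fit for fracture-toughness variability.
# The toughness values are hypothetical, for illustration only.
from scipy.stats import weibull_min

toughness = [92.0, 105.0, 88.0, 120.0, 99.0, 111.0, 95.0, 130.0]  # MPa*sqrt(m), invented

# Fixing the location parameter at zero gives a two-parameter fit.
shape, loc, scale = weibull_min.fit(toughness, floc=0)
print(f"Weibull modulus (shape) = {shape:.2f}, scale = {scale:.1f}")
```

The shape parameter (Weibull modulus) quantifies the scatter in toughness; a low modulus indicates high specimen-to-specimen variability.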

  19. Damage detection technique by measuring laser-based mechanical impedance

    SciTech Connect

    Lee, Hyeonseok; Sohn, Hoon

    2014-02-18

    This study proposes a method for measuring mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, whose performance deteriorates due to (a) Curie temperature limitations, (b) electromagnetic interference, and (c) bonding-layer effects. This study aims to overcome the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by firing a pulsed laser beam at the target structure and is acquired by measuring the out-of-plane velocity with a laser vibrometer. The formation of the LMI response is examined through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique to damage detection is experimentally verified using a pipe specimen in a high-temperature environment.

  20. Perceptually based techniques for semantic image classification and retrieval

    NASA Astrophysics Data System (ADS)

    Depalov, Dejan; Pappas, Thrasyvoulos; Li, Dongge; Gandhi, Bhavan

    2006-02-01

    The accumulation of large collections of digital images has created the need for efficient and intelligent schemes for content-based image retrieval. Our goal is to organize the contents semantically, according to meaningful categories. We present a new approach for semantic classification that utilizes a recently proposed color-texture segmentation algorithm (by Chen et al.), which combines knowledge of human perception and signal characteristics to segment natural scenes into perceptually uniform regions. The color and texture features of these regions are used as medium level descriptors, based on which we extract semantic labels, first at the segment and then at the scene level. The segment features consist of spatial texture orientation information and color composition in terms of a limited number of locally adapted dominant colors. The focus of this paper is on region classification. We use a hierarchical vocabulary of segment labels that is consistent with those used in the NIST TRECVID 2003 development set. We test the approach on a database of 9000 segments obtained from 2500 photographs of natural scenes. For training and classification we use the Linear Discriminant Analysis (LDA) technique. We examine the performance of the algorithm (precision and recall rates) when different sets of features (e.g., one or two most dominant colors versus four quantized dominant colors) are used. Our results indicate that the proposed approach offers significant performance improvements over existing approaches.
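The segment-classification step described above can be sketched with scikit-learn's LDA; the two classes and the four-dimensional features below are synthetic stand-ins for the paper's dominant-color / texture-orientation descriptors, not its actual data.

```python
# Minimal sketch of segment-level classification with Linear Discriminant
# Analysis (LDA). Features are fabricated stand-ins for quantized
# dominant-color descriptors of two hypothetical region classes.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Two invented classes (say, "sky" vs "vegetation"), 4 features each.
X_sky = rng.normal(loc=[0.2, 0.4, 0.8, 0.1], scale=0.05, size=(50, 4))
X_veg = rng.normal(loc=[0.1, 0.6, 0.2, 0.7], scale=0.05, size=(50, 4))
X = np.vstack([X_sky, X_veg])
y = np.array([0] * 50 + [1] * 50)

clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```

With well-separated class means, LDA finds the discriminating linear projection directly from the class statistics, which is why it suits medium-level descriptors like these.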

  1. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). The approach is based on an integrated hardware and software architecture able to digitally capture and handle spectra, as an image sequence, along a pre-defined alignment on a suitably illuminated sample surface. The study investigated the possibility of applying HSI techniques for classifying different types of wheat kernels: vitreous, yellow berry, and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near-infrared range (1000-1700 nm). The hypercubes were analyzed by applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only when considering the entire investigated wavelength range, but also when selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed HSI-based procedures can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.

  2. Damage detection technique by measuring laser-based mechanical impedance

    NASA Astrophysics Data System (ADS)

    Lee, Hyeonseok; Sohn, Hoon

    2014-02-01

    This study proposes a method for measuring mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, whose performance deteriorates due to (a) Curie temperature limitations, (b) electromagnetic interference, and (c) bonding-layer effects. This study aims to overcome the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by firing a pulsed laser beam at the target structure and is acquired by measuring the out-of-plane velocity with a laser vibrometer. The formation of the LMI response is examined through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique to damage detection is experimentally verified using a pipe specimen in a high-temperature environment.

  3. Automatic tumor segmentation using knowledge-based techniques.

    PubMed

    Clark, M C; Hall, L O; Goldgof, D B; Velthuizen, R; Murtagh, F R; Silbiger, M S

    1998-04-01

    A system that automatically segments and labels glioblastoma multiforme tumors in magnetic resonance images (MRIs) of the human brain is presented. The MRIs consist of T1-weighted, proton density, and T2-weighted feature images and are processed by a system that integrates knowledge-based (KB) techniques with multispectral analysis. Initial segmentation is performed by an unsupervised clustering algorithm. The segmented image, along with the cluster centers for each class, is provided to a rule-based expert system which extracts the intracranial region. Multispectral histogram analysis separates suspected tumor from the rest of the intracranial region, with region analysis used in performing the final tumor labeling. This system has been trained on three volume data sets and tested on thirteen unseen volume data sets acquired from a single MRI system. The KB tumor segmentation was compared with supervised, radiologist-labeled "ground truth" tumor volumes and supervised k-nearest-neighbors tumor segmentations. The results of this system generally correspond well to ground truth, both on a per-slice basis and, more importantly, in tracking total tumor volume during treatment over time. PMID:9688151
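The initial unsupervised clustering step can be sketched with k-means over multispectral voxel vectors; the three tissue-like clusters and their intensity values below are synthetic, and the abstract does not specify which clustering algorithm the authors actually used.

```python
# Sketch of unsupervised clustering of multispectral voxels
# (T1, PD, T2 intensity triples). All values are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
voxels = np.vstack([
    rng.normal([0.2, 0.3, 0.8], 0.03, size=(100, 3)),  # tissue class A (invented)
    rng.normal([0.6, 0.5, 0.4], 0.03, size=(100, 3)),  # tissue class B (invented)
    rng.normal([0.9, 0.7, 0.6], 0.03, size=(100, 3)),  # tissue class C (invented)
])
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(voxels)
# In the paper's pipeline, the labeled image and these cluster centers
# would next be handed to the rule-based expert system.
print("cluster centers:\n", km.cluster_centers_.round(2))
```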

  4. Parameter tuning of PVD process based on artificial intelligence technique

    NASA Astrophysics Data System (ADS)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Owing to its previous adaptation to similar optimization problems, a genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from the GA is expected to produce the desired zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time, and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations. The results of the computational experiment were then compared with the actual results from the laboratory experiment. Based on this comparison, GA proved reliable for optimizing the parameter combination before tuning the RF magnetron sputtering machine. To verify the GA result, the algorithm was also compared to other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that GA was reliable in solving this RF magnetron sputtering parameter tuning problem, and it showed better accuracy in the optimization based on the fitness evaluation.
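A toy genetic algorithm over the three parameters named in the abstract (RF power, deposition time, substrate temperature) can illustrate the tuning loop; the bounds, the quadratic surrogate fitness, and the "ideal" target combination are all invented for illustration and bear no relation to the study's real deposition model.

```python
# Toy elitist GA over three sputtering parameters. The fitness function
# is a made-up surrogate (negative squared distance to an assumed target).
import random

random.seed(0)
BOUNDS = {"rf_power": (100, 400), "dep_time": (10, 120), "substrate_T": (25, 400)}
TARGET = {"rf_power": 250, "dep_time": 60, "substrate_T": 200}  # hypothetical optimum

def fitness(ind):
    return -sum((ind[k] - TARGET[k]) ** 2 for k in BOUNDS)

def random_ind():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def mutate(ind):
    k = random.choice(list(BOUNDS))
    lo, hi = BOUNDS[k]
    child = dict(ind)
    child[k] = min(hi, max(lo, child[k] + random.gauss(0, (hi - lo) * 0.1)))
    return child

pop = [random_ind() for _ in range(20)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)          # keep the fittest half,
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]  # refill by mutation

best = max(pop, key=fitness)
print({k: round(v, 1) for k, v in best.items()})
```

A real implementation would also include crossover and would evaluate fitness against measured film properties rather than a surrogate.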

  5. An extensible simulation environment and movement metrics for testing walking behavior in agent-based models

    SciTech Connect

    Paul M. Torrens; Atsushi Nara; Xun Li; Haojie Zhu; William A. Griffin; Scott B. Brown

    2012-01-01

    Human movement is a significant ingredient of many social, environmental, and technical systems, yet the importance of movement is often discounted in considering systems complexity. Movement is commonly abstracted in agent-based modeling (which is perhaps the methodological vehicle for modeling complex systems), despite the influence of movement upon information exchange and adaptation in a system. In particular, agent-based models of urban pedestrians often treat movement in proxy form at the expense of faithfully treating movement behavior with realistic agency. Little consensus exists about which method is appropriate for representing movement in agent-based schemes. In this paper, we examine popularly used methods to drive movement in agent-based models, first by introducing a methodology that can flexibly handle many representations of movement at many different scales and, second, by introducing a suite of tools to benchmark agent movement between models and against real-world trajectory data. We find that most popular movement schemes do a relatively poor job of representing movement, but that some schemes may well be "good enough" for some applications. We also discuss potential avenues for improving the representation of movement in agent-based frameworks.
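One of the simplest benchmark metrics for comparing a simulated agent trajectory against recorded trajectory data is the mean pointwise displacement error; the abstract does not name its specific metrics, so this is a generic sketch with illustrative coordinates.

```python
# Mean pointwise displacement error between two time-aligned trajectories.
import math

def mean_displacement_error(traj_a, traj_b):
    """Average Euclidean distance between time-aligned (x, y) samples."""
    assert len(traj_a) == len(traj_b)
    return sum(math.dist(p, q) for p, q in zip(traj_a, traj_b)) / len(traj_a)

observed  = [(0, 0), (1, 0), (2, 1), (3, 1)]   # illustrative recorded path
simulated = [(0, 0), (1, 1), (2, 1), (3, 2)]   # illustrative model output
print(mean_displacement_error(observed, simulated))  # → 0.5
```

Richer comparisons (path curvature, speed distributions, dynamic time warping for unaligned samples) follow the same pattern of scoring a simulated track against an empirical one.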

  6. Semantic Extension of Agent-Based Control: The Packing Cell Case Study

    NASA Astrophysics Data System (ADS)

    Vrba, Pavel; Radakovič, Miloslav; Obitko, Marek; Mařík, Vladimír

    The paper reports on the latest R&D activities in the field of agent-based manufacturing control systems. This area has become strongly influenced by advances in semantic technologies such as the Web Ontology Language. The application of ontologies provides agents with much more effective means of handling, exchanging, and reasoning about knowledge. An ontology dedicated to the semantic description of orders, production processes, and material handling tasks in the discrete manufacturing domain has been developed. In addition, a framework for integrating this ontology into distributed, agent-based control solutions is presented. The Manufacturing Agent Simulation Tool (MAST) is used as the base for a pilot implementation of the ontology-powered multi-agent control system; a packing cell environment is selected as the case study.

  7. Whole Genome Sequencing Based Characterization of Extensively Drug-Resistant Mycobacterium tuberculosis Isolates from Pakistan

    PubMed Central

    Ali, Asho; Hasan, Zahra; McNerney, Ruth; Mallard, Kim; Hill-Cawthorne, Grant; Coll, Francesc; Nair, Mridul; Pain, Arnab; Clark, Taane G.; Hasan, Rumina

    2015-01-01

    Improved molecular diagnostic methods for detecting drug resistance in Mycobacterium tuberculosis (MTB) strains are required. Resistance to first- and second-line anti-tuberculous drugs has been associated with single nucleotide polymorphisms (SNPs) in particular genes. However, these SNPs can vary between MTB lineages; therefore, local data are required to describe different strain populations. We used whole genome sequencing (WGS) to characterize 37 extensively drug-resistant (XDR) MTB isolates from Pakistan and investigated 40 genes associated with drug resistance. Rifampicin resistance was attributable to SNPs in the rpoB hot-spot region. Isoniazid resistance was most commonly associated with the katG codon 315 mutation (92%), followed by inhA S94A (8%); however, one strain did not have SNPs in katG, inhA, or oxyR-ahpC. All strains were pyrazinamide resistant, but only 43% had pncA SNPs. Ethambutol-resistant strains predominantly had embB codon 306 mutations (62%), but additional SNPs at embB codons 406, 378, and 328 were also present. Fluoroquinolone resistance was associated with gyrA codons 91–94 in 81% of strains; four strains had only gyrB mutations, while others did not have SNPs in either gyrA or gyrB. Streptomycin-resistant strains had mutations in ribosomal RNA genes: rpsL codon 43 (42%), the rrs 500 region (16%), and gidB (34%), while six strains did not have mutations in any of these genes. Amikacin/kanamycin/capreomycin resistance was associated with SNPs in rrs at nt1401 (78%) and nt1484 (3%), except in seven (19%) strains. We estimate that if only the common hot-spot region targets of current commercial assays were used, the concordance between phenotypic and genotypic testing for these XDR strains would vary between rifampicin (100%), isoniazid (92%), fluoroquinolones (81%), aminoglycosides (78%), and ethambutol (62%), while pncA sequencing would provide genotypic resistance in less than half the isolates. This work highlights the importance of expanded
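The concordance figures quoted above reduce to a simple fraction: phenotypically resistant isolates that carry a mutation in the assayed target, divided by all resistant isolates. A sketch follows; the per-isolate flags are invented, with only the 30-of-37 proportion chosen to echo the abstract's 81% fluoroquinolone figure.

```python
# Phenotypic/genotypic concordance as a fraction of resistant isolates
# with a detected mutation in the assayed hot-spot target.

def concordance(has_target_mutation):
    """has_target_mutation: booleans, one per phenotypically resistant isolate."""
    return sum(has_target_mutation) / len(has_target_mutation)

# e.g. 30 of 37 fluoroquinolone-resistant isolates carrying gyrA 91-94 SNPs
flags = [True] * 30 + [False] * 7
print(f"{concordance(flags):.0%}")  # → 81%
```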

  8. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that occur at large scales and exhibit dynamic and complex behaviours. Studying such phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model, in order to simulate the real phenomena, is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science, and computer science. The emergence of ABM toolkits in GIS software libraries (e.g., ESRI's ArcGIS, OpenMap, GeoTools) for geospatial modelling is an indication of users' growing interest in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and are applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system, and the complex nature of ABMS, it is hard to validate and verify such models with conventional validation methods. Attempts to find appropriate validation techniques for ABM therefore seem necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  9. Dynamic digital watermark technique based on neural network

    NASA Astrophysics Data System (ADS)

    Gu, Tao; Li, Xu

    2008-04-01

    An algorithm for dynamic watermarking based on a neural network is presented, which is more robust against false-authentication attacks and watermark-tampering operations than single-watermark embedding methods. (1) Five binary images used as watermarks are coded into a binary array. The total number of 0s and 1s is 5*N, every 0 or 1 being enlarged fivefold by an information-enlarging technique; N is the original total number of the watermarks' binary bits. (2) A seed pixel p(x,y) and its 3×3 neighborhood pixels p(x-1,y-1), p(x-1,y), p(x-1,y+1), p(x,y-1), p(x,y+1), p(x+1,y-1), p(x+1,y), p(x+1,y+1) are chosen as one sample; p(x,y) is used as the neural network target and the other eight pixel values are used as the network inputs. (3) To train the network on this sample space, 5*N pixel values and their closely related neighboring pixel values are randomly chosen, using a password, from a color BMP image. (4) A four-layer neural network is constructed to describe the nonlinear mapping between inputs and outputs. (5) One bit from the array is embedded by adjusting the polarity between a chosen pixel value and the output value of the model. (6) A randomizer generates a number that determines how many watermarks to retrieve. The randomly selected watermarks can be retrieved using the restored neural network output values, the corresponding image pixel values, and the restore function, without knowledge of the original image or watermarks (the restored coded-watermark bit is 1 if o(x,y) (restored) > p(x,y) (reconstructed), else 0). The retrieved watermarks differ on each extraction. The proposed technique offers more watermarking proofs than single-watermark embedding algorithms. Experimental results show that the proposed technique is very robust against some image processing operations and JPEG lossy compression. 
Therefore, the algorithm can be used to protect the copyright of one important image.
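The polarity embedding of step (5) and the retrieval rule can be illustrated in miniature; here a fixed `predicted` value stands in for the trained network's output o(x,y), and `delta` is an assumed embedding strength, neither taken from the paper.

```python
# Polarity-based bit embedding: a watermark bit lives in the sign of
# (pixel - predicted) and is recovered by comparing the two again.

def embed_bit(pixel, predicted, bit, delta=4):
    """Force the pixel above the prediction for bit 1, below for bit 0."""
    if bit == 1:
        return max(pixel, predicted + delta)
    return min(pixel, predicted - delta)

def extract_bit(pixel, predicted):
    return 1 if pixel > predicted else 0

predicted = 128  # stand-in for the trained model's output o(x,y)
for bit in (0, 1):
    marked = embed_bit(130, predicted, bit)
    assert extract_bit(marked, predicted) == bit
print("round-trip OK")
```

The `delta` margin is what buys robustness: small pixel perturbations from compression or filtering do not flip the polarity.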

  10. Clientele Differences of a Cooperative Extension Program as Related to Base of Organization.

    ERIC Educational Resources Information Center

    Gross, John G.

    Conducted in Nebraska and Missouri, this study compared the clientele of an area specialist dairy testing program with the clientele of generalized county based programs to determine significant differences and their implications. Comparisons were made by age, educational level, size of farm business, farm ownership, participation in short courses…

  11. Depth-based coding of MVD data for 3D video extension of H.264/AVC

    NASA Astrophysics Data System (ADS)

    Rusanovskyy, Dmytro; Hannuksela, Miska M.; Su, Wenyi

    2013-06-01

    This paper describes a novel approach to using depth information for advanced coding of the associated video data in Multiview Video plus Depth (MVD)-based 3D video systems. As a possible implementation of this concept, we describe two coding tools developed for an H.264/AVC-based 3D video codec in response to the Moving Picture Experts Group (MPEG) Call for Proposals (CfP): Depth-based Motion Vector Prediction (DMVP) and Backward View Synthesis Prediction (BVSP). Simulation results conducted under the JCT-3V/MPEG 3DV Common Test Conditions show that the proposed tools reduce the bit rate of the coded video data by 15% in average delta bit rate, which corresponds to 13% total bit rate savings for the MVD data over state-of-the-art MVC+D coding. Moreover, the concept of depth-based video coding presented in this paper has been further developed by MPEG 3DV and JCT-3V, resulting in even higher compression efficiency: about 20% total delta bit rate reduction for coded MVD data over the MVC+D reference. Considering these significant gains, the proposed coding approach can be beneficial for the development of new 3D video coding standards.

  12. Object Similarity Bootstraps Young Children to Action-Based Verb Extension

    ERIC Educational Resources Information Center

    Haryu, Etsuko; Imai, Mutsumi; Okada, Hiroyuki

    2011-01-01

    Young children often fail to generalize a novel verb based on sameness of action since they have difficulty focusing on the relational similarity across events while at the same time ignoring the objects that are involved. Study 1, with Japanese-speaking 3- and 4-year-olds (N = 28 in each group), found that similarity of objects involved in action…

  13. Application of rule-based data mining techniques to real time ATLAS Grid job monitoring data

    NASA Astrophysics Data System (ADS)

    Ahrens, R.; Harenberg, T.; Kalinin, S.; Mättig, P.; Sandhoff, M.; dos Santos, T.; Volkmer, F.

    2012-12-01

    The Job Execution Monitor (JEM) is a job-centric grid-job monitoring software developed at the University of Wuppertal and integrated into the pilot-based PanDA job brokerage system, which handles physics analysis and Monte Carlo event production for the ATLAS experiment on the Worldwide LHC Computing Grid (WLCG). With JEM, job progress and grid worker-node health can be supervised in real time by users, site admins, and shift personnel. Imminent error conditions can be detected early, and countermeasures can be initiated by the job's owner immediately. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job and grid worker-node misbehavior. Shifters can use the same aggregated data to react quickly to site error conditions and broken production tasks. In this work, the application of novel data-centric rule-based methods and data-mining techniques to the real-time monitoring data is discussed. The use of such automatic inference techniques on monitoring data to provide job and site health summaries to users and admins is presented. Finally, the provision of a secure real-time control and steering channel to the job, as an extension of the presented monitoring software, is considered, and a possible model of such a control method is presented.
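Rule-based inference over monitoring records can be sketched as a list of predicates mapped to diagnoses; the metric names and rules below are invented for illustration and are not JEM's actual rule set.

```python
# Minimal rule engine over per-job monitoring metrics (all names invented).
RULES = [
    (lambda m: m["cpu_load"] > 0.95 and m["progress_rate"] == 0, "job stalled"),
    (lambda m: m["disk_free_gb"] < 1.0, "worker node low on scratch space"),
    (lambda m: m["stderr_lines"] > 1000, "excessive error output"),
]

def diagnose(metrics):
    """Return the labels of all rules that fire on this metrics record."""
    return [label for predicate, label in RULES if predicate(metrics)]

sample = {"cpu_load": 0.99, "progress_rate": 0, "disk_free_gb": 12.0, "stderr_lines": 5}
print(diagnose(sample))  # → ['job stalled']
```

Aggregating such per-job diagnoses across all jobs on a site is then enough to produce the site-health summaries the abstract describes.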

  14. [Evidence-based medicine: reality and illusions. Extension of epistemological reflexions].

    PubMed

    Timio, M; Antiseri, D

    2000-03-01

    Evidence-based medicine (EBM) is a cultural and methodological approach to clinical practice that helps in making decisions based on clinical expertise and on intimate knowledge of the individual patient's situation, beliefs, and priorities, useful for the analysis of clinical research. As such, it can be considered the scientifically grounded art of medicine, and it appears to be an emerging paradigm of scientifically based clinical care. It de-emphasizes intuition and unsystematic clinical experience as grounds for medical decision-making and stresses the rigorous and formal analysis of evidence from clinical research. EBM converts the abstract exercise of reading and appraising the literature into the pragmatic process of using the literature to benefit individual patients, while simultaneously expanding the clinician's knowledge base. On EBM grounds, clinical practice guidelines, pathways, and algorithms or instructions can be developed with the aim of solving a problem or accomplishing a task. Nonetheless, in these processes the theory of EBM shows internal and external biases. Among the internal biases, economic interests may influence the development and diffusion of research and its results. In addition, a systematic review may be incorrectly guided, the quality filters applied to the literature can be inappropriate, and the selection criteria can rest only on positive results of evidence; according to modern epistemology, it is helpful for clinicians to know when their uncertainty stems from gaps between positive and negative evidence. Another bias is the difficulty of converting EBM into clinical practice recommendations: the EBM movement has shown that it is nearly impossible to make recommendations that are appropriate in every situation. The epistemological approach identifies the external biases of EBM. It is consistent with the theory of the "fact" as a human construction: every human fact can historically fade and then be restored according to new paradigms. 
EBM is a

  15. Extension of a Kolmogorov Atmospheric Turbulence Model for Time-Based Simulation Implementation

    NASA Technical Reports Server (NTRS)

    McMinn, John D.

    1997-01-01

    The development of any super/hypersonic aircraft requires the interaction of a wide variety of technical disciplines to maximize vehicle performance. For flight and engine control system design and development on this class of vehicle, realistic mathematical simulation models of atmospheric turbulence, including winds and the varying thermodynamic properties of the atmosphere, are needed. A model which has been tentatively selected by a government/industry group of flight and engine/inlet controls representatives working on the High Speed Civil Transport is one based on the Kolmogorov spectrum function. This report compares the Dryden and Kolmogorov turbulence forms, and describes enhancements that add functionality to the selected Kolmogorov model. These added features are: an altitude variation of the eddy dissipation rate based on Dryden data, the mapping of the eddy dissipation rate database onto a regular latitude and longitude grid, a method to account for flight at large vehicle attitude angles, and a procedure for transitioning smoothly across turbulence segments.
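The contrast between the two spectral forms compared above can be made concrete: the Dryden longitudinal spectrum, Phi_u(Omega) = sigma^2 * (2L/pi) / (1 + (L*Omega)^2), rolls off as Omega^-2 at high spatial frequency, steeper than the Kolmogorov Omega^(-5/3) inertial-range law. A short numerical check follows; the sigma, L, and frequency values are illustrative, not values from the report.

```python
# Compare the high-frequency slope of the Dryden longitudinal spectrum
# against the Kolmogorov -5/3 power law. Parameters are illustrative.
import math

def dryden_u(omega, sigma=1.0, L=500.0):
    """Dryden longitudinal PSD: sigma^2 * (2L/pi) / (1 + (L*omega)^2)."""
    return sigma**2 * (2 * L / math.pi) / (1 + (L * omega) ** 2)

# Log-log slope between two frequencies well inside the roll-off region.
w1, w2 = 0.1, 1.0
dryden_slope = math.log(dryden_u(w2) / dryden_u(w1)) / math.log(w2 / w1)
print(f"Dryden high-frequency slope ≈ {dryden_slope:.2f} (Kolmogorov: -5/3 ≈ -1.67)")
```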

  16. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    NASA Astrophysics Data System (ADS)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy must change, because the current imitational, gravitational, frontal, isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water (a global deficit of up to 4-15 times) and adverse effects on soils and landscapes. Current irrigation methods do not control the spread of water throughout the soil continuum: preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost to the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon, non-imitating methods for sustainable natural resources management. The new paradigm of irrigation is based on an intra-soil, pulse-discrete method of water supply into the soil continuum by injection in small discrete portions. Each individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder is positioned at depths from 10 to 30 cm, with a diameter of 1-2 cm. Within 5-10 min after injection, the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film, and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The resulting soil watering cylinder occupies depths of 5-50 cm in the soil profile, with a diameter of 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding each cylinder remains relatively dry and mechanically stable. After water injection, the soil structure in the cylinder restores quickly, because there is no compression from the stable adjoining volume of soil and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. 
At this potential

  17. Novel technique: a pupillometer-based objective chromatic perimetry

    NASA Astrophysics Data System (ADS)

    Rotenstreich, Ygal; Skaat, Alon; Sher, Ifat; Kolker, Andru; Rosenfeld, Elkana; Melamed, Shlomo; Belkin, Michael

    2014-02-01

    Evaluation of the visual field (VF) is important for clinical diagnosis and patient monitoring. Current VF methods are subjective and require patient cooperation. Here we developed a novel objective perimetry technique based on the pupil response (PR) to multifocal chromatic stimuli in normal subjects and in patients with glaucoma and retinitis pigmentosa (RP). A computerized infrared video pupillometer was used to record the PR to short- and long-wavelength stimuli (peaks at 485 nm and 620 nm, respectively) at light intensities of 15-100 cd-s/m2 at thirteen different points of the VF. The RP study included 30 eyes of 16 patients and 20 eyes of 12 healthy participants. The glaucoma study included 22 eyes of 11 patients and 38 eyes of 19 healthy participants. Significantly reduced PR was observed in RP patients in response to short-wavelength stimuli at 40 cd-s/m2 in nearly all perimetric locations (P < 0.05). By contrast, RP patients demonstrated nearly normal PR to long-wavelength stimuli in the majority of perimetric locations. The glaucoma group showed significantly reduced PR to long- and short-wavelength stimuli at high intensity in all perimetric locations (P < 0.05). The PR of glaucoma patients was significantly lower than normal in response to short-wavelength stimuli at low intensity, mostly in the central and 20° locations (P < 0.05). This study demonstrates the feasibility of using pupillometer-based chromatic perimetry for objectively assessing VF defects, retinal function, and optic nerve damage in patients with retinal dystrophies and glaucoma. Furthermore, this method may be used to distinguish between the damaged cell types underlying a VF defect.

  18. Age estimation based on Kvaal's technique using digital panoramic radiographs

    PubMed Central

    Mittal, Samta; Nagendrareddy, Suma Gundareddy; Sharma, Manisha Lakhanpal; Agnihotri, Poornapragna; Chaudhary, Sunil; Dhillon, Manu

    2016-01-01

    Introduction: Age estimation is important for administrative and ethical reasons and also because of legal consequences. Dental pulp undergoes regression in size with increasing age due to secondary dentin deposition and can be used as a parameter of age estimation even beyond 25 years of age. Kvaal et al. developed a method for chronological age estimation based on pulp size using periapical dental radiographs. There is a need to test this method of age estimation in the Indian population using simple tools, such as digital imaging, on living individuals, without requiring extraction of teeth. Aims and Objectives: Estimation of the chronological age of subjects by Kvaal's method using digital panoramic radiographs, and testing the validity of the regression equations given by Kvaal et al. Materials and Methods: The study sample included a total of 152 subjects in the age group of 14-60 years. Measurements were performed on standardized digital panoramic radiographs based on Kvaal's method. Different regression formulae were derived and the age was assessed. The assessed age was then compared with the actual age of the patient using Student's t-test. Results: No significant difference between the mean chronological age and the estimated age was observed. However, the mean ages estimated using the regression equations given previously by Kvaal et al. significantly underestimated the chronological age in the present study sample. Conclusion: The results of the study support the feasibility of this technique through the calculation of regression equations on digital panoramic radiographs. However, they negate the applicability of the same regression equations as given by Kvaal et al. to the present study population.
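The regression step in Kvaal-style age estimation can be sketched as an ordinary least-squares fit of age against a pulp/tooth size ratio; the ratios, ages, and the 0.26 query ratio below are fabricated for illustration, not measurements from the study.

```python
# Sketch of Kvaal-style age regression: age as a linear function of a
# pulp/tooth size ratio, which shrinks with age as secondary dentin forms.
import numpy as np

# Hypothetical per-subject mean pulp/tooth width ratios and known ages.
ratio = np.array([0.38, 0.35, 0.31, 0.28, 0.24, 0.21, 0.18])
age   = np.array([16,   22,   30,   36,   44,   52,   58  ])

slope, intercept = np.polyfit(ratio, age, 1)
estimate = slope * 0.26 + intercept  # estimated age for a ratio of 0.26
print(f"age = {slope:.1f} * ratio + {intercept:.1f}; ratio 0.26 -> {estimate:.0f} y")
```

The study's point is precisely that such coefficients are population-specific: equations fitted on one population (Kvaal's) under- or over-estimate ages in another.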

  19. Estimations of One Repetition Maximum and Isometric Peak Torque in Knee Extension Based on the Relationship Between Force and Velocity.

    PubMed

    Sugiura, Yoshito; Hatanaka, Yasuhiko; Arai, Tomoaki; Sakurai, Hiroaki; Kanada, Yoshikiyo

    2016-04-01

    Sugiura, Y, Hatanaka, Y, Arai, T, Sakurai, H, and Kanada, Y. Estimations of one repetition maximum and isometric peak torque in knee extension based on the relationship between force and velocity. J Strength Cond Res 30(4): 980-988, 2016-We aimed to investigate whether a linear regression formula based on the relationship between joint torque and angular velocity measured using a high-speed video camera and image measurement software is effective for estimating 1 repetition maximum (1RM) and isometric peak torque in knee extension. Subjects comprised 20 healthy men (mean ± SD; age, 27.4 ± 4.9 years; height, 170.3 ± 4.4 cm; and body weight, 66.1 ± 10.9 kg). The exercise load ranged from 40% to 150% 1RM. Peak angular velocity (PAV) and peak torque were used to estimate 1RM and isometric peak torque. To elucidate the relationship between force and velocity in knee extension, the relationship between the relative proportion of 1RM (% 1RM) and PAV was examined using simple regression analysis. The concordance rate between the estimated value and actual measurement of 1RM and isometric peak torque was examined using intraclass correlation coefficients (ICCs). Reliability of the regression line of PAV and % 1RM was 0.95. The concordance rate between the actual measurement and estimated value of 1RM resulted in an ICC(2,1) of 0.93 and that of isometric peak torque had an ICC(2,1) of 0.87 and 0.86 for 6 and 3 levels of load, respectively. Our method for estimating 1RM was effective for decreasing the measurement time and reducing patients' burden. Additionally, isometric peak torque can be estimated using 3 levels of load, as we obtained the same results as those reported previously. We plan to expand the range of subjects and examine the generalizability of our results. PMID:26382131
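The load-velocity extrapolation idea behind the study can be sketched with a simplified variant: fit a line to load versus peak angular velocity from submaximal trials and extrapolate to a near-zero velocity. The loads, velocities, and the 60 deg/s minimal-velocity threshold below are invented numbers, not the study's data or exact procedure (the paper regresses %1RM on PAV).

```python
# Simplified load-velocity 1RM estimate: linear fit of load vs. peak
# angular velocity, extrapolated to an assumed minimal-velocity threshold.
import numpy as np

pav  = np.array([420.0, 330.0, 240.0, 150.0])  # peak angular velocity, deg/s (invented)
load = np.array([28.6, 39.3, 50.0, 60.7])      # corresponding loads, kg (invented)

slope, intercept = np.polyfit(pav, load, 1)
v_min = 60.0                                   # assumed minimal-velocity threshold, deg/s
est_1rm = slope * v_min + intercept
print(f"estimated 1RM ≈ {est_1rm:.1f} kg")
```

Because the fit needs only a few submaximal trials, the estimate spares the patient a true maximal attempt, which is the clinical appeal the abstract describes.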

  20. Carbon Storage in an Extensive Karst-distributed Region of Southwestern China based on Multiple Methods

    NASA Astrophysics Data System (ADS)

    Guo, C.; Wu, Y.; Yang, H.; Ni, J.

    2015-12-01

    Accurate estimation of carbon storage is crucial to better understand the processes of global and regional carbon cycles and to more precisely project ecological and economic scenarios for the future. Southwestern China has a broad and continuous distribution of karst landscapes with harsh, fragile habitats that are prone to rocky desertification, an ecological disaster that has significantly hindered vegetation succession and economic development in the karst regions of southwestern China. In this study we evaluated the carbon storage in eight political divisions of southwestern China based on four methods: forest inventory, carbon density from field investigations, the CASA model driven by remote sensing data, and the BIOME4/LPJ global vegetation models driven by climate data. The results show that: (1) The total vegetation carbon storage (including agricultural ecosystems) is 6763.97 Tg C based on the carbon density, and the soil organic carbon (SOC) storage (above 20 cm depth) is 12475.72 Tg C. Sichuan Province (including Chongqing) possesses the highest carbon storage in both vegetation and soil (1736.47 Tg C and 4056.56 Tg C, respectively) among the eight political divisions because of its higher carbon density and larger distribution area. The vegetation carbon storage in Hunan Province is the smallest (565.30 Tg C), and the smallest SOC storage (1127.40 Tg C) is in Guangdong Province; (2) Based on forest inventory data, the total aboveground carbon storage in woody vegetation is 2103.29 Tg C. The carbon storage in Yunnan Province (819.01 Tg C) is significantly higher than in other areas, while tropical rainforests and seasonal forests in Yunnan contribute the largest share of the woody vegetation carbon storage (accounting for 62.40% of the total). (3) The net primary production (NPP) simulated by the CASA model is 68.57 Tg C/yr, while the forest NPP in the non-karst region (accounting for 72.50% of the total) is higher than that in the karst region. (4) BIOME4 and LPJ

  1. Development of a Flexible and Extensible Computer-based Simulation Platform for Healthcare Students.

    PubMed

    Bindoff, Ivan; Cummings, Elizabeth; Ling, Tristan; Chalmers, Leanne; Bereznicki, Luke

    2015-01-01

    Accessing appropriate clinical placement positions for all health profession students can be expensive and challenging. Increasingly, simulation in a range of modes is being used to enhance student learning and prepare students for clinical placement. Commonly these simulations focus on simulated patient mannequins, which are typically presented as single-event scenarios, are difficult to organise, and usually involve only a single healthcare profession. Computer-based simulation is relatively under-researched and under-utilised but is beginning to demonstrate potential benefits. This paper describes the development and trialling of an entirely virtual 3D simulated environment for inter-professional student education. PMID:25676952

  2. Two new defective distributions based on the Marshall-Olkin extension.

    PubMed

    Rocha, Ricardo; Nadarajah, Saralees; Tomazella, Vera; Louzada, Francisco

    2016-04-01

    The presence of immune elements (generating a fraction of cure) in survival data is common. These cases are usually modeled by the standard mixture model. Here, we use an alternative approach based on defective distributions. Defective distributions are characterized by having density functions that integrate to values less than 1, when the domain of their parameters is different from the usual one. We use the Marshall-Olkin class of distributions to generalize two existing defective distributions, therefore generating two new defective distributions. We illustrate the distributions using three real data sets. PMID:25951911
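    The construction can be sketched with the standard Marshall-Olkin form (a textbook statement of the family, not the paper's notation): given a baseline survival function and a tilt parameter, the extension preserves defectiveness, which is what produces the cure fraction.

```latex
% Marshall-Olkin extension of a baseline survival function \bar{F}(t):
\bar{G}(t) \;=\; \frac{\alpha\,\bar{F}(t)}{1-(1-\alpha)\,\bar{F}(t)},
\qquad \alpha > 0 .
% If the baseline is defective, i.e. \lim_{t\to\infty} F(t) = q < 1, then
% \bar{F}(\infty) = 1-q > 0 and the extended model inherits a cure fraction:
p_0 \;=\; \lim_{t\to\infty}\bar{G}(t)
    \;=\; \frac{\alpha\,(1-q)}{1-(1-\alpha)(1-q)} \;>\; 0 .
```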

  3. A new membrane-based crystallization technique: tests on lysozyme

    NASA Astrophysics Data System (ADS)

    Curcio, Efrem; Profio, Gianluca Di; Drioli, Enrico

    2003-01-01

    The great importance of protein science in both industrial and scientific fields, in conjunction with the intrinsic difficulty of growing macromolecular crystals, stimulates the development of new observations and ideas that can be useful in initiating more systematic studies using novel approaches. In this regard, an innovative technique, based on the employment of microporous hydrophobic membranes to promote the formation of lysozyme crystals from supersaturated solutions, is introduced in this work. Operational principles and possible advantages, both in terms of controlled extraction of solvent by acting on the concentration of the stripping solution and of reduced induction times, are outlined. Theoretical developments and experimental results concerning the mass transfer, in the vapour phase, through the membrane are presented, as well as the results from X-ray diffraction to 1.7 Å resolution of the lysozyme crystals obtained using NaCl as the crystallizing agent and sodium acetate as the buffer. Crystals were found to be tetragonal with unit cell dimensions of a = b = 79.1 Å and c = 37.9 Å; the overall Rmerge on intensities in the resolution range from 25 to 1.7 Å was, in the best case, 4.4%.

  4. Research on technique of wavefront retrieval based on Foucault test

    NASA Astrophysics Data System (ADS)

    Yuan, Lvjun; Wu, Zhonghua

    2010-05-01

    During fine grinding of the best-fit sphere and the initial stage of polishing, the surface error of large-aperture aspheric mirrors is too large to test with a common interferometer. The Foucault test is widely used in fabricating large-aperture mirrors. However, the optical path is seriously disturbed by air turbulence, and changes of light and dark zones cannot be identified, which often impairs judgment and leads to mistakes in diagnosing the surface error of the whole mirror. To solve this problem, this research presents wavefront retrieval based on the Foucault test through digital image processing and quantitative calculation. Firstly, a real Foucault image is obtained by collecting a series of images with a CCD and then averaging them to suppress air turbulence. Secondly, gray values are converted into surface error values through principle derivation, mathematical modeling, and software programming. Thirdly, the linear deviation introduced by defocus is removed by the least-squares method to obtain the real surface error. Finally, from the real surface error, a wavefront map, a gray contour map, and a corresponding pseudo-color contour map are plotted. The experimental results indicate that the three-dimensional wavefront map and two-dimensional contour map accurately and intuitively show the surface error over the whole mirror under test, and they help to grasp the surface error as a whole. The technique can be used to guide the fabrication of large-aperture, long-focal-length mirrors during grinding and the initial stage of polishing the aspheric surface, greatly improving fabrication efficiency and precision.
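    The first step, averaging a stack of CCD frames to suppress turbulence, can be sketched as follows (synthetic data; the frame count, image size, and noise level are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stack of N Foucault images of the same mirror: a fixed
# underlying intensity pattern plus turbulence-induced fluctuations.
N, H, W = 64, 32, 32
truth = rng.uniform(50, 200, size=(H, W))              # static light/dark zones
frames = truth + rng.normal(0, 10.0, size=(N, H, W))   # per-frame turbulence

# Averaging the stack suppresses zero-mean fluctuations by ~1/sqrt(N).
mean_frame = frames.mean(axis=0)

residual_single = np.abs(frames[0] - truth).mean()
residual_mean = np.abs(mean_frame - truth).mean()
assert residual_mean < residual_single / 4   # ~8x reduction expected for N=64
```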

  5. Initial planetary base construction techniques and machine implementation

    NASA Technical Reports Server (NTRS)

    Crockford, William W.

    1987-01-01

    Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.

  6. Study of coherent synchrotron radiation effects by means of a new simulation code based on the non-linear extension of the operator splitting method

    NASA Astrophysics Data System (ADS)

    Dattoli, G.; Migliorati, M.; Schiavi, A.

    2007-05-01

    Coherent synchrotron radiation (CSR) is one of the main problems limiting the performance of high-intensity electron accelerators. The complexity of the physical mechanisms underlying the onset of instabilities due to CSR demands accurate descriptions, capable of including the large number of features of an actual accelerating device. A code devoted to the analysis of these types of problems should be fast and reliable, conditions that are rarely achieved at the same time. In the past, codes based on Lie algebraic techniques have been very efficient at treating transport problems in accelerators. The extension of these methods to the non-linear case is ideally suited to treating CSR instability problems. We report on the development of a numerical code, based on the solution of the Vlasov equation, with the inclusion of non-linear contributions due to wake field effects. The proposed solution method exploits an algebraic technique that uses exponential operators. We show that the integration procedure is capable of reproducing the onset of instability and the effects associated with bunching mechanisms leading to the growth of the instability itself. In addition, considerations on the threshold of the instability are also developed.
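    The exponential-operator idea can be illustrated on a toy scalar problem (a generic operator-splitting sketch, not the paper's Vlasov solver): evolve dy/dt = (a + b)y by composing the exact sub-flows exp(ha) and exp(hb).

```python
import math

# Toy illustration of exponential-operator splitting: for commuting
# operators the composition of sub-flows is exact; for non-commuting
# ones, Strang splitting retains second-order accuracy in the step h.
a, b = -0.7, -0.3
y0, T, steps = 1.0, 2.0, 1000
h = T / steps

y = y0
for _ in range(steps):
    y = math.exp(h * a) * y   # flow of the first operator
    y = math.exp(h * b) * y   # flow of the second operator

exact = y0 * math.exp((a + b) * T)
assert abs(y - exact) < 1e-12  # exact up to roundoff since a and b commute
```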

  7. A Spatial Division Clustering Method and Low Dimensional Feature Extraction Technique Based Indoor Positioning System

    PubMed Central

    Mo, Yun; Zhang, Zhongzhao; Meng, Weixiao; Ma, Lin; Wang, Yao

    2014-01-01

    Indoor positioning systems based on the fingerprint method are widely used due to the large number of existing devices with a wide range of coverage. However, extensive positioning regions with a massive fingerprint database may cause high computational complexity and large error margins, so clustering methods are widely applied as a solution. Traditional clustering methods in positioning systems, however, can only measure the similarity of the Received Signal Strength without being concerned with the continuity of physical coordinates. Besides, outages of access points can result in asymmetric matching problems which severely affect the fine positioning procedure. To solve these issues, in this paper we propose a positioning system based on the Spatial Division Clustering (SDC) method for clustering the fingerprint dataset subject to physical distance constraints. With the Genetic Algorithm and Support Vector Machine techniques, SDC can achieve higher coarse positioning accuracy than traditional clustering algorithms. In terms of fine localization, based on the Kernel Principal Component Analysis method, the proposed positioning system outperforms its counterparts based on other feature extraction methods in low dimensionality. Apart from balancing the online matching computational burden, the new positioning system exhibits advantageous performance on radio map clustering, and also shows better robustness and adaptability with respect to the asymmetric matching problem. PMID:24451470

  9. Weighted graph based ordering techniques for preconditioned conjugate gradient methods

    NASA Technical Reports Server (NTRS)

    Clift, Simon S.; Tang, Wei-Pai

    1994-01-01

    We describe the basis of a matrix ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDEs. Several new matrix ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDEs, and compared with other matrix ordering techniques. A variation of reverse Cuthill-McKee (RCM) ordering is shown to generally improve the quality of incomplete factorization preconditioners.
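    As background, the RCM ordering mentioned in the abstract can be sketched in a few lines (a textbook breadth-first variant applied to a toy graph, not the authors' weighted-graph heuristics):

```python
from collections import deque

def reverse_cuthill_mckee(adj):
    """Reverse Cuthill-McKee ordering of an undirected graph given as
    {node: set(neighbors)}. Returns a permutation (list of nodes) that
    tends to reduce the bandwidth of the reordered adjacency matrix."""
    visited, order = set(), []
    for start in sorted(adj, key=lambda n: len(adj[n])):  # low degree first
        if start in visited:
            continue
        visited.add(start)
        queue = deque([start])
        while queue:
            u = queue.popleft()
            order.append(u)
            for v in sorted(adj[u] - visited, key=lambda n: len(adj[n])):
                visited.add(v)
                queue.append(v)
    return order[::-1]                                    # the "reverse" in RCM

def bandwidth(adj, perm):
    pos = {n: i for i, n in enumerate(perm)}
    return max((abs(pos[u] - pos[v]) for u in adj for v in adj[u]), default=0)

# Path graph 0-5-2-7-1-4-6-3 presented with scrambled labels: RCM recovers
# a low-bandwidth (here, tridiagonal) ordering.
adj = {i: set() for i in range(8)}
for u, v in [(0, 5), (5, 2), (2, 7), (7, 1), (1, 4), (4, 6), (6, 3)]:
    adj[u].add(v); adj[v].add(u)

perm = reverse_cuthill_mckee(adj)
assert bandwidth(adj, perm) == 1      # chain relabelled consecutively
```

    For production use, `scipy.sparse.csgraph.reverse_cuthill_mckee` provides an equivalent ordering for sparse matrices.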

  10. Extension of transverse relaxation-optimized spectroscopy techniques to allosteric proteins: CO- and paramagnetic fluoromet-hemoglobin [beta (15N-valine)].

    PubMed

    Nocek, J M; Huang, K; Hoffman, B M

    2000-03-14

    We present the first steps in applying transverse relaxation-optimized spectroscopy (TROSY) techniques to the study of allosterism. Each beta-chain of the hemoglobin (Hb) tetramer has 17 valine residues. We have (15)N-labeled the beta-chain Val residues and detected 16 of the 17 (1)H-(15)N correlation peaks for beta-chain Val of the R state CO-Hb structure by using the TROSY technique. Sequence-specific assignments are suggested, based mainly on analysis of the (1)H pseudocontact-shift increments produced by oxidizing the diamagnetic R state HbCO to the paramagnetic R state fluoromet form. When possible, we support these assignments with sequential nuclear Overhauser effect (NOE) information obtained from a two-dimensional [(1)H,(1)H]-NOESY-TROSY experiment (NOESY, NOE spectroscopy). We have induced further the R-T conformational change by adding the allosteric effector, inositol hexaphosphate, to the fluoromet-Hb sample. This change induces substantial increments in the (1)H and (15)N chemical shifts, and we discuss the implication of these findings in the context of the tentative sequence assignments. These preliminary results suggest that amide nitrogen and amide proton chemical shifts in a selectively labeled sample are site-specific probes for monitoring the allosteric response of the ensemble-averaged solution structure of Hb. More important, the chemical-shift dispersion obtained is adequate to permit a complete assignment of the backbone (15)N/(13)C resonances upon nonselective labeling. PMID:10716987

  11. A fast and accurate PCA based radiative transfer model: Extension to the broadband shortwave region

    NASA Astrophysics Data System (ADS)

    Kopparla, Pushkar; Natraj, Vijay; Spurr, Robert; Shia, Run-Lie; Crisp, David; Yung, Yuk L.

    2016-04-01

    Accurate radiative transfer (RT) calculations are necessary for many earth-atmosphere applications, from remote sensing retrieval to climate modeling. A Principal Component Analysis (PCA)-based spectral binning method has been shown to provide an order of magnitude increase in computational speed while maintaining an overall accuracy of 0.01% (compared to line-by-line calculations) over narrow spectral bands. In this paper, we have extended the PCA method for RT calculations over the entire shortwave region of the spectrum from 0.3 to 3 microns. The region is divided into 33 spectral fields covering all major gas absorption regimes. We find that the RT performance runtimes are shorter by factors between 10 and 100, while root mean square errors are of order 0.01%.
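    The core idea, representing the spectral variability of optical properties with a few principal components so that expensive line-by-line RT need only be run for the mean profile plus a handful of EOF perturbations, can be sketched on synthetic data (all sizes and the rank-3 structure below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: optical-property profiles at many spectral points,
# generated from a few smooth underlying modes (as in real absorption bands).
n_spectral, n_layers, n_modes = 500, 20, 3
modes = rng.normal(size=(n_modes, n_layers))
weights = rng.normal(size=(n_spectral, n_modes))
profiles = weights @ modes                       # (n_spectral, n_layers)

# PCA via SVD of the mean-centred data matrix.
mean = profiles.mean(axis=0)
u, s, vt = np.linalg.svd(profiles - mean, full_matrices=False)

# Keep the leading k components: expensive line-by-line RT would then be
# run only for the mean profile plus perturbations along these few EOFs.
k = 3
scores = u[:, :k] * s[:k]
reconstructed = mean + scores @ vt[:k]

rel_err = np.abs(reconstructed - profiles).max() / np.abs(profiles).max()
assert rel_err < 1e-10     # 3 components capture data built from 3 modes
```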

  12. Research Extension and Education Programs on Bio-based Energy Technologies and Products

    SciTech Connect

    Jackson, Sam; Harper, David; Womac, Al

    2010-03-02

    The overall objectives of this project were to provide enhanced educational resources for the general public, educational and development opportunities for University faculty in the Southeast region, and enhanced research knowledge concerning biomass preprocessing and deconstruction. All of these efforts combine to create a research and education program that enhances the biomass-based industries of the United States. This work was broken into five primary objective areas:
    • Task A - Technical research in the area of biomass preprocessing, analysis, and evaluation.
    • Tasks B&C - Technical research in the areas of Fluidized Beds for the Chemical Modification of Lignocellulosic Biomass and Biomass Deconstruction and Evaluation.
    • Task D - Analyses for the non-scientific community that provide a comprehensive analysis of the current state of biomass supply, demand, technologies, markets, and policies; identify a set of feasible alternative paths for biomass industry development; and quantify the impacts associated with each alternative path.
    • Task E - Efforts to build research capacity and develop partnerships through faculty fellowships with DOE national labs.
    The research and education programs conducted through this grant have led to three primary results:
    • A better knowledge base related to, and understanding of, biomass deconstruction, through both mechanical size reduction and chemical processing.
    • A better source of information related to biomass, bioenergy, and bioproducts for researchers and general public users through the BioWeb system.
    • Stronger research ties between land-grant universities and DOE National Labs through the faculty fellowship program.
    In addition to the scientific knowledge and resources developed, funding through this program produced a minimum of eleven (11) scientific publications and contributed to the research behind at least one patent.

  13. Evaluation of a school-based diabetes education intervention, an extension of Program ENERGY

    NASA Astrophysics Data System (ADS)

    Conner, Matthew David

    Background: The prevalence of both obesity and type 2 diabetes in the United States has increased over the past two decades, and rates remain high. The latest data from the National Center for Health Statistics estimate that 36% of adults and 17% of children and adolescents in the US are obese (CDC Adult Obesity, CDC Childhood Obesity). Being overweight or obese greatly increases one's risk of developing several chronic diseases, such as type 2 diabetes. Approximately 8% of adults in the US have diabetes; type 2 diabetes accounts for 90-95% of these cases. Type 2 diabetes in children and adolescents is still rare; however, clinical reports suggest an increase in the frequency of diagnosis (CDC Diabetes Fact Sheet, 2011). Results from the Diabetes Prevention Program show that the incidence of type 2 diabetes can be reduced through the adoption of a healthier lifestyle among high-risk individuals (DPP, 2002). Objectives: This classroom-based intervention included scientific coverage of energy balance, diabetes, diabetes prevention strategies, and diabetes management. Coverage of diabetes management topics was included in lesson content to further the students' understanding of the disease. Measurable short-term goals of the intervention included increases in: general diabetes knowledge, diabetes management knowledge, and awareness of type 2 diabetes prevention strategies. Methods: A total of 66 sixth grade students at Tavelli Elementary School in Fort Collins, CO completed the intervention. The program consisted of nine classroom-based lessons; students participated in one lesson every two weeks. The lessons were delivered from November of 2005 to May of 2006. Each bi-weekly lesson included a presentation and interactive group activities. Participants completed two diabetes knowledge questionnaires at baseline and post intervention.
A diabetes survey developed by Program ENERGY measured general diabetes knowledge and awareness of type 2 diabetes prevention strategies

  14. Comparison Of Four FFT-Based Frequency-Acquisition Techniques

    NASA Technical Reports Server (NTRS)

    Shah, Biren N.; Hinedi, Sami M.; Holmes, Jack K.

    1993-01-01

    This report presents a comparative theoretical analysis of four conceptual techniques for initial estimation of the carrier frequency of a suppressed-carrier, binary-phase-shift-keyed radio signal. Each technique is implemented by an open-loop analog/digital signal-processing subsystem that forms part of a Costas-loop phase-error detector functioning in a closed-loop manner overall.
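    A classic open-loop FFT estimator of this kind squares the BPSK signal to strip the ±1 modulation, then locates the spectral peak at twice the carrier offset. The sketch below is a generic illustration under assumed parameters, not one of the report's four techniques verbatim:

```python
import numpy as np

rng = np.random.default_rng(2)

# Complex-baseband BPSK signal with an unknown residual carrier offset.
fs, n = 8000.0, 4096
f_carrier = 917.0                                  # unknown to the estimator
t = np.arange(n) / fs
bits = rng.choice([-1.0, 1.0], size=n // 64).repeat(64)
signal = bits * np.exp(2j * np.pi * f_carrier * t)

# Squaring removes the +/-1 modulation, leaving a tone at 2*f_carrier
# that an FFT peak search can locate.
squared = signal ** 2
spectrum = np.abs(np.fft.fft(squared))
freqs = np.fft.fftfreq(n, d=1 / fs)
f_hat = freqs[np.argmax(spectrum)] / 2.0           # undo the doubling

assert abs(f_hat - f_carrier) < fs / n             # within one FFT bin
```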

  15. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings.

    PubMed

    Ghayvat, Hemant; Mukhopadhyay, Subhas; Gui, Xiang; Suryadevara, Nagender

    2015-01-01

    Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home environment and performs his or her daily routine in order to live a long and healthy life. We initiate the research with the development of the smart home approach and implement it in different home conditions (different houses) to monitor the activity of an inhabitant for wellness detection. Additionally, our research extends the smart home system to smart buildings and models the design issues related to the smart building environment; these design issues are linked with system performance and reliability. This research paper also discusses and illustrates the possible mitigation to handle the ISM band interference and attenuation losses without compromising optimum system performance. PMID:25946630

  17. Creative Conceptual Design Based on Evolutionary DNA Computing Technique

    NASA Astrophysics Data System (ADS)

    Liu, Xiyu; Liu, Hong; Zheng, Yangyang

    Creative conceptual design is an important area in computer-aided innovation. Typical design methodology includes exploration and optimization by evolutionary techniques such as EC and swarm intelligence. Although many algorithms and applications have been proposed for creative design with these techniques, the computing models are implemented mostly on the traditional von Neumann architecture. On the other hand, the possibility of using DNA as a computing medium has aroused wide interest in recent years, owing to its massive built-in parallelism and its ability to solve NP-complete problems. This new computing technique is performed by biological operations on DNA molecules rather than on chips. The purpose of this paper is to propose a simulated evolutionary DNA computing model and to integrate DNA computing with creative conceptual design. The proposed technique can potentially be applied to large-scale, highly parallel design problems.

  18. Improved mesh based photon sampling techniques for neutron activation analysis

    SciTech Connect

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-07-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log n) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
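    For reference, the alias method's two-table construction (Vose's variant) and its O(1) draw can be sketched as follows; the voxel weights are hypothetical:

```python
import random

def build_alias(probs):
    """Vose's alias method: O(n) preprocessing of a discrete distribution
    into two tables that allow O(1) sampling."""
    n = len(probs)
    scaled = [p * n for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] -= 1.0 - scaled[s]
        (small if scaled[l] < 1.0 else large).append(l)
    for i in large + small:          # leftovers are 1.0 up to roundoff
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng=random):
    """Draw one index in O(1): pick a column, then flip a biased coin."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]

# Example: sample source voxels with unequal emission probabilities.
weights = [0.1, 0.2, 0.3, 0.4]
prob, alias = build_alias(weights)
random.seed(0)
counts = [0] * 4
for _ in range(100_000):
    counts[sample(prob, alias)] += 1
freqs = [c / 100_000 for c in counts]
assert all(abs(f - w) < 0.01 for f, w in zip(freqs, weights))
```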

  19. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    SciTech Connect

    Almansouri, Hani; Clayton, Dwight A; Kisner, Roger A; Polsky, Yarom; Bouman, Charlie; Santos-Villalobos, Hector J

    2015-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.
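    The MBIR principle, inverting a forward model by iteratively minimizing a data-fidelity-plus-prior cost, can be sketched generically (a toy linear model with a quadratic prior, not the paper's acoustic forward model):

```python
import numpy as np

rng = np.random.default_rng(3)

# Reconstruct x from noisy linear measurements y = A x + noise by
# iteratively minimizing ||y - A x||^2 + lam * ||x||^2, i.e. a forward
# model combined with a (here, quadratic) prior.
m, n = 120, 40
A = rng.normal(size=(m, n))                 # stand-in forward model
x_true = rng.normal(size=n)
y = A @ x_true + 0.01 * rng.normal(size=m)

lam = 1e-3
x = np.zeros(n)
step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam)   # safe gradient step size
for _ in range(2000):
    grad = A.T @ (A @ x - y) + lam * x
    x -= step * grad

assert np.linalg.norm(x - x_true) / np.linalg.norm(x_true) < 0.05
```

    Real MBIR replaces the toy matrix with a physics-based acoustic forward model and typically uses an edge-preserving, non-quadratic prior.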

  1. Development of acoustic model-based iterative reconstruction technique for thick-concrete imaging

    NASA Astrophysics Data System (ADS)

    Almansouri, Hani; Clayton, Dwight; Kisner, Roger; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2016-02-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An example application space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.

  2. [Eco-value level classification model of forest ecosystem based on modified projection pursuit technique].

    PubMed

    Wu, Chengzhen; Hong, Wei; Hong, Tao

    2006-03-01

    To optimize the projection function and direction of the projection pursuit technique, simplify its implementation, and overcome its shortcomings of long calculation times and the difficulty of optimizing the projection direction and of computer programming, this paper presents a modified simplex method (MSM) and, based on it, brings forward the eco-value level classification model (EVLCM) of forest ecosystems, which integrates the multidimensional classification index into a one-dimensional projection value, with a high projection value denoting high ecosystem services value. Example forest ecosystems could be reasonably classified by the new model according to their projection values, suggesting that EVLCM, driven directly by sample data of forest ecosystems, is simple, feasible, and practical. The calculation time and the value of the projection function were 34% and 143%, respectively, of those obtained with the traditional projection pursuit technique. This model could be applied extensively to classify and estimate all kinds of non-linear and multidimensional data in ecology, biology, and regional sustainable development. PMID:16724723
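
The core mechanism above is projecting a multidimensional index onto one optimized direction and classifying by the resulting scalar. A minimal sketch (the variance index and the crude random search stand in for the paper's projection index and its modified simplex method; all data are synthetic):

```python
import numpy as np

# Projection pursuit sketch: samples with several normalized indicators
# are projected onto a unit direction a, giving a one-dimensional
# "projection value"; a is chosen to maximize a projection index. Here
# the index is simply the spread of projected values, and random search
# stands in for the authors' modified simplex method.

rng = np.random.default_rng(1)
X = rng.random((30, 5))                # 30 samples, 5 indicators in [0, 1]

def projection_index(a, X):
    z = X @ a
    return z.std()                     # toy index: spread of projections

best_a, best_q = None, -np.inf
for _ in range(2000):                  # random search over unit directions
    a = rng.normal(size=5)
    a /= np.linalg.norm(a)
    q = projection_index(a, X)
    if q > best_q:
        best_a, best_q = a, q

z = X @ best_a                         # projection values for classification
ranking = np.argsort(z)[::-1]          # high projection value = high level
```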

  3. Application of Condition-Based Monitoring Techniques for Remote Monitoring of a Simulated Gas Centrifuge Enrichment Plant

    SciTech Connect

    Hooper, David A; Henkel, James J; Whitaker, Michael

    2012-01-01

    This paper presents research into the adaptation of monitoring techniques from maintainability and reliability (M&R) engineering for remote unattended monitoring of gas centrifuge enrichment plants (GCEPs) for international safeguards. Two categories of techniques are discussed: the sequential probability ratio test (SPRT) for diagnostic monitoring, and sequential Monte Carlo (SMC), more commonly known as particle filtering, for prognostic monitoring. Development and testing of the application of condition-based monitoring (CBM) techniques was performed on the Oak Ridge Mock Feed and Withdrawal (F&W) facility as a proof of principle. CBM techniques have been extensively developed for M&R assessment of physical processes, such as manufacturing and power plants. These techniques are normally used to locate and diagnose the effects of mechanical degradation of equipment to aid in planning of maintenance and repair cycles. In a safeguards environment, however, the goal is not to identify mechanical deterioration, but to detect and diagnose (and potentially predict) attempts to circumvent normal, declared facility operations, such as through protracted diversion of enriched material. The CBM techniques are first explained from the traditional perspective of maintenance and reliability engineering. The adaptation of CBM techniques to inspector monitoring is then discussed, focusing on the unique challenges of decision-based effects rather than equipment degradation effects. These techniques are then applied to the Oak Ridge Mock F&W facility, a water-based physical simulation of a material feed and withdrawal process used at enrichment plants, which is used to develop and test online monitoring techniques for fully information-driven safeguards of GCEPs. Advantages and limitations of the CBM approach to online monitoring are discussed, as well as the potential challenges of adapting CBM concepts to safeguards applications.
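
The SPRT named above has a compact standard form. A minimal sketch for a Gaussian mean shift (the means, sigma, and error rates are illustrative choices, not values from the paper):

```python
import math

# Wald sequential probability ratio test (SPRT): observations arrive one
# at a time, and the cumulative log-likelihood ratio between a "normal"
# model (mean mu0) and a "faulted" model (mean mu1) is compared against
# two thresholds set by the target false-alarm (alpha) and
# missed-detection (beta) rates.

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = math.log((1 - beta) / alpha)   # decide H1 (fault) above this
    lower = math.log(beta / (1 - alpha))   # decide H0 (normal) below this
    llr = 0.0
    for i, x in enumerate(samples, 1):
        # log-likelihood ratio increment for a Gaussian mean shift
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma ** 2
        if llr >= upper:
            return "H1", i
        if llr <= lower:
            return "H0", i
    return "undecided", len(samples)

decision_fault, n_fault = sprt([1.0] * 50)   # sustained shift toward mu1
decision_norm, n_norm = sprt([0.0] * 50)     # data consistent with mu0
```

The early-decision property, reaching a decision after only a handful of observations while controlling both error rates, is what makes SPRT attractive for unattended monitoring.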

  4. C. elegans lifespan extension by osmotic stress requires FUdR, base excision repair, FOXO, and sirtuins.

    PubMed

    Anderson, Edward N; Corkins, Mark E; Li, Jia-Cheng; Singh, Komudi; Parsons, Sadé; Tucey, Tim M; Sorkaç, Altar; Huang, Huiyan; Dimitriadi, Maria; Sinclair, David A; Hart, Anne C

    2016-03-01

    Moderate stress can increase lifespan by hormesis, a beneficial low-level induction of stress response pathways. 5-fluoro-2'-deoxyuridine (FUdR) is commonly used to sterilize Caenorhabditis elegans in aging experiments. However, FUdR alters lifespan in some genotypes and induces resistance to thermal and proteotoxic stress. We report that hypertonic stress in combination with FUdR treatment or inhibition of the FUdR target thymidylate synthase, TYMS-1, extends C. elegans lifespan by up to 30%. By contrast, in the absence of FUdR, hypertonic stress decreases lifespan. Adaptation to hypertonic stress requires diminished Notch signaling and loss of Notch co-ligands leads to lifespan extension only in combination with FUdR. Either FUdR treatment or TYMS-1 loss induced resistance to acute hypertonic stress, anoxia, and thermal stress. FUdR treatment increased expression of DAF-16 FOXO and the osmolyte biosynthesis enzyme GPDH-1. FUdR-induced hypertonic stress resistance was partially dependent on sirtuins and base excision repair (BER) pathways, while FUdR-induced lifespan extension under hypertonic stress conditions requires DAF-16, BER, and sirtuin function. Combined, these results demonstrate that FUdR, through inhibition of TYMS-1, activates stress response pathways in somatic tissues to confer hormetic resistance to acute and chronic stress. C. elegans lifespan studies using FUdR may need re-interpretation in light of this work. PMID:26854551

  5. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    NASA Astrophysics Data System (ADS)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy must change, because the current imitational gravitational frontal isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water, a global deficit of up to 4-15 times, and adverse effects on soils and landscapes. Current methods of irrigation do not control the spread of water throughout the soil continuum. Preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost into the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem Technique, a transcendental, uncommon, and non-imitating method for Sustainable Natural Resources Management. The new paradigm of irrigation is based on an intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. An individual volume of water is supplied as a vertical cylinder of preliminary soil watering. The cylinder is positioned in the soil at a depth of 10 to 30 cm, with a diameter of 1-2 cm. Within 5-10 min after injection, the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film, and vapor transfer. A small amount of water is transferred gravitationally to a depth of 35-40 cm. The soil watering cylinder is positioned in the soil profile at a depth of 5-50 cm, with a cylinder diameter of 2-4 cm. The lateral distance between adjacent cylinders along the plant row is 10-15 cm. The carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection the soil structure in the cylinder restores quickly because there is no compression from the stable adjoining volume of soil, and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential

  6. Characterization of soil behavior using electromagnetic wave-based technique

    NASA Astrophysics Data System (ADS)

    Dong, Xiaobo

    samples so that the beta value, i.e., the ratio between the conductivities of the sediment and the fluid, is smaller than 1. The beta value is greater than 1 in the Group B samples owing to an overcompensation of surface conduction. Sedimentation behavior of two kaolinite samples with distinct fabric associations is characterized using mechanical and electromagnetic wave-based techniques. The two different fabric formations, the edge-to-face (EF) flocculated structure (i.e., sample A) and the dispersed and deflocculated structure (i.e., sample B), were produced by adjusting the pH of the pore fluid. The anisotropy of shear wave velocity and DC conductivity was not observed in the sediment of sample A, because of its EF isotropic fabric associations, but it was detected in sample B as a result of face-to-face (FF) aggregation. An open card-house structure of the sample A sediment results in a higher relaxation strength of the bulk water, Δκw, owing to a higher water content; the smaller Δκw measured in the sample B sediment indicates denser packing. In both samples, sediment consolidation gives rise to a decrease in the bulk-water relaxation strength but an increase in the bound-water relaxation strength owing to increasing particle content. In response to sediment consolidation, the sediment conductivity of sample A continuously decreases because of the reduced contribution from the fluid conductivity. In sample B, the surface conduction via the overlapped double layer overcompensates such a decreased contribution, so that the sediment conductivity increases with increasing particle content. The slim-form open-ended coaxial probe is also used to conduct a local dielectric measurement. The measured results, i.e., the dielectric relaxation strength of bulk water, Δκw, and the DC conductivity of the saturated sample, σmix, are jointly used to characterize the spatial variability of different specimens including glass beads, sand and mica

  7. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1978-01-01

    Furnaces and photolithography-related equipment were applied to experiments on double-layer metal. The double-layer metal activity emphasized wet chemistry techniques. By incorporating the following techniques: (1) ultrasonic etching of the vias; (2) a premetal clean using a modified buffered hydrogen fluoride; (3) phosphorus-doped vapor; and (4) extended sintering, yields of 98 percent were obtained using the standard test pattern. The two-dimensional modeling problems stemmed from, alternately, instability and excessive computation time to achieve convergence.

  8. Measurement based simulation of microscope deviations for evaluation of stitching algorithms for the extension of Fourier-based alignment

    NASA Astrophysics Data System (ADS)

    Engelke, Florian; Kästner, Markus; Reithmeier, Eduard

    2013-05-01

    Image stitching is a technique used to measure large surface areas with high resolution while maintaining a large field of view. We work on improving data fusion by stitching in the field of microscopic analysis of technical surfaces for structures and roughness. Guidance errors and imaging errors such as noise cause problems for seamless image fusion of technical surfaces. The optical imaging errors of 3D microscopes, such as confocal microscopes and white light interferometers, as well as the guidance errors of their automated positioning systems, have been measured to create software that simulates automated measurements of known surfaces with specific deviations, in order to test new stitching algorithms. We measured and incorporated radial image distortion, interferometer reference mirror shape deviations, statistical noise, drift of the positional axis, on-axis accuracy and repeatability of the positioning stages used, and misalignment of the CCD chip with respect to the axes of motion. We used the resulting simulation of the measurement process to test a new image registration technique that allows for the use of correlation of images by fast Fourier transform for small overlaps between single measurements.
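
FFT-based correlation for registering overlapping tiles is commonly done by phase correlation. A minimal sketch on a synthetic "surface" image (the image and the shift are made up; real microscope tiles share only a partial overlap rather than a full circular shift):

```python
import numpy as np

# Phase-correlation sketch for image registration: the translation
# between two tiles is recovered from the peak of the inverse FFT of the
# normalized cross-power spectrum.

rng = np.random.default_rng(2)
surface = rng.random((64, 64))

shift = (5, 12)                                  # true offset between tiles
tile_a = surface
tile_b = np.roll(surface, shift, axis=(0, 1))    # circularly shifted copy

F_a = np.fft.fft2(tile_a)
F_b = np.fft.fft2(tile_b)
cross_power = F_b * np.conj(F_a)
cross_power /= np.abs(cross_power)               # keep phase only
corr = np.fft.ifft2(cross_power).real

peak = np.unravel_index(np.argmax(corr), corr.shape)  # recovered shift
```

Normalizing away the magnitude makes the peak sharp and robust to illumination differences, which is what makes this approach attractive for small-overlap stitching.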

  9. MR-based field-of-view extension in MR/PET: B0 homogenization using gradient enhancement (HUGE).

    PubMed

    Blumhagen, Jan O; Ladebeck, Ralf; Fenchel, Matthias; Scheffler, Klaus

    2013-10-01

    In whole-body MR/PET, the human attenuation correction can be based on the MR data. However, an MR-based field-of-view (FoV) is limited due to physical restrictions such as B0 inhomogeneities and gradient nonlinearities. Therefore, for large patients, the MR image and the attenuation map might be truncated and the attenuation correction might be biased. The aim of this work is to explore extending the MR FoV through B0 homogenization using gradient enhancement in which an optimal readout gradient field is determined to locally compensate B0 inhomogeneities and gradient nonlinearities. A spin-echo-based sequence was developed that computes an optimal gradient for certain regions of interest, for example, the patient's arms. A significant distortion reduction was achieved outside the normal MR-based FoV. This FoV extension was achieved without any hardware modifications. In-plane distortions in a transaxially extended FoV of up to 600 mm were analyzed in phantom studies. In vivo measurements of the patient's arms lying outside the normal specified FoV were compared with and without the use of B0 homogenization using gradient enhancement. In summary, we designed a sequence that provides data for reducing the image distortions due to B0 inhomogeneities and gradient nonlinearities and used the data to extend the MR FoV. PMID:23203976

  10. A Monte-Carlo based extension of the Meteor Orbit and Trajectory Software (MOTS) for computations of orbital elements

    NASA Astrophysics Data System (ADS)

    Albin, T.; Koschny, D.; Soja, R.; Srama, R.; Poppe, B.

    2016-01-01

    The Canary Islands Long-Baseline Observatory (CILBO) is a double station meteor camera system (Koschny et al., 2013; Koschny et al., 2014) that consists of 5 cameras. The two cameras considered in this report are ICC7 and ICC9, and are installed on Tenerife and La Palma. They point to the same atmospheric volume between both islands allowing stereoscopic observation of meteors. Since its installation in 2011 and the start of operation in 2012 CILBO has detected over 15000 simultaneously observed meteors. Koschny and Diaz (2002) developed the Meteor Orbit and Trajectory Software (MOTS) to compute the trajectory of such meteors. The software uses the astrometric data from the detection software MetRec (Molau, 1998) and determines the trajectory in geodetic coordinates. This work presents a Monte-Carlo based extension of the MOTS code to compute the orbital elements of simultaneously detected meteors by CILBO.
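
The Monte-Carlo extension propagates astrometric measurement uncertainty into the derived quantities by resampling. A minimal sketch of that idea (a meteor speed from two triangulated points stands in for the full orbital element computation; the coordinates, timing, and error level are invented for illustration):

```python
import numpy as np

# Monte-Carlo uncertainty propagation sketch: each measurement is
# resampled within its assumed error, the derived quantity is recomputed
# per draw, and the spread of the results gives its uncertainty.

rng = np.random.default_rng(3)

p1 = np.array([0.0, 0.0, 100.0])       # km, first triangulated point
p2 = np.array([12.0, 5.0, 95.0])       # km, second triangulated point
dt = 0.5                               # s between the two detections
sigma = 0.05                           # km, assumed astrometric error

speeds = []
for _ in range(5000):                  # Monte-Carlo draws
    q1 = p1 + rng.normal(0, sigma, 3)  # resample both points in their error
    q2 = p2 + rng.normal(0, sigma, 3)
    speeds.append(np.linalg.norm(q2 - q1) / dt)
speeds = np.asarray(speeds)

speed_mean, speed_std = speeds.mean(), speeds.std()
```

The same resampling loop, wrapped around the full trajectory-to-orbit computation, yields distributions for each orbital element instead of a single point estimate.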

  11. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    PubMed Central

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the retention of denture bases fabricated by conventional, anchorized, and injection molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to the finished denture bases to estimate the force required to dislodge them with a retention apparatus. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g). A significant difference was seen between these techniques. Conclusions: Denture bases obtained by the injection molding polymerization technique exhibited maximum retention, followed by the anchorized technique; the least retention was seen with the conventional molding technique. PMID:27382542
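
The Friedman test used above ranks each patient's retention forces across the three techniques and tests whether the rank sums differ. A minimal sketch (the force values are synthetic, loosely echoing the reported group ordering; only the test logic follows the abstract):

```python
import numpy as np

# Friedman two-way ANOVA by ranks: forces for the same n subjects under
# k treatments are ranked within each subject, and the chi-square-like
# statistic tests whether the treatments differ.

rng = np.random.default_rng(4)
n, k = 10, 3
base = rng.normal(2500, 100, size=(n, 1))
effects = np.array([0.0, 450.0, 1250.0])   # conventional, anchorized, injection
forces = base + effects + rng.normal(0, 50, size=(n, k))

# within-subject ranks (1 = smallest force); ties are negligible here
ranks = forces.argsort(axis=1).argsort(axis=1) + 1
R = ranks.sum(axis=0)                      # rank sums per technique
chi2 = 12.0 / (n * k * (k + 1)) * (R ** 2).sum() - 3 * n * (k + 1)

# compare against the chi-square distribution with k-1 = 2 dof;
# the 0.05 critical value is about 5.99
significant = chi2 > 5.99
```

When the statistic is significant, pairwise Wilcoxon signed-rank tests with a Bonferroni correction (as in the study) identify which technique pairs differ.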

  12. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    NASA Astrophysics Data System (ADS)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd)= p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey, for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogenous by construction with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP and has attributes of universality, as it holds for a broad
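
The NESP power laws referred to above are built on the q-exponential function, which generalizes the ordinary exponential and yields asymptotic power-law tails for q > 1. A minimal illustration (the t0 and q values are arbitrary examples, not fitted parameters from the study):

```python
import math

# Tsallis q-exponential, the building block of NESP distributions.

def q_exp(x, q):
    """Returns exp_q(x); the ordinary exp(x) is recovered as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

# a cumulative interevent-time distribution of NESP form,
# P(>t) = exp_q(-t / t0): for q > 1 the tail decays as a power law
tail = [q_exp(-t / 10.0, 1.5) for t in (0, 10, 100, 1000)]
```

For q = 1.5 the tail falls off as (1 + t/20)^-2, i.e. a power law rather than the exponential decay a Poissonian model would predict.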

  13. Fiber probes based optical techniques for biomedical diagnosis

    NASA Astrophysics Data System (ADS)

    Arce-Diego, José L.; Fanjul-Vélez, Félix

    2007-06-01

    Although fiber optics have been applied in optical communication and sensor systems for several years in a very successful way, their first application was developed in medicine in the early 1920s. Manufacturing and developing optical fibers for biomedical purposes have required considerable research effort to achieve non-invasive, in-vivo, real-time diagnosis of different diseases in human or animal tissues. In general, optical fiber probes are designed as a function of the optical measurement technique. In this work, a brief description of the main optical techniques for optical characterization of biological tissues is presented. Recent advances in optical fiber probes for biomedical diagnosis in clinical analysis and optical biopsy, in relation to the different spectroscopic and tomographic optical techniques, are described.

  14. AGRICULTURAL EXTENSION.

    ERIC Educational Resources Information Center

    FARQUHAR, R.N.

    Australian agricultural extension has long emphasized technical advisory service at the expense of the socioeconomic aspects of farm production and farm life. Only in Tasmania has farm management been stressed. Demands for the whole-farm approach have produced a trend toward generalism for district officers in most states. The federal government,…

  15. Intraoperative Vagus Nerve Monitoring: A Transnasal Technique during Skull Base Surgery

    PubMed Central

    Schutt, Christopher A.; Paskhover, Boris; Judson, Benjamin L.

    2014-01-01

    Objectives Intraoperative vagus nerve monitoring during skull base surgery has been reported with the use of an oral nerve monitoring endotracheal tube. However, the intraoral presence of an endotracheal tube can limit exposure by its location in the operative field during transfacial approaches and by limiting superior mobilization of the mandible during transcervical approaches. We describe a transnasal vagus nerve monitoring technique. Design and Participants Ten patients underwent open skull base surgery. Surgical approaches included transcervical (five), transfacial/maxillary swing (three), and double mandibular osteotomy (two). The vagus nerve was identified, stimulated, and monitored in all cases. Main Outcome Measures Intraoperative nerve stimulation, pre- and postoperative vagus nerve function through the use of flexible laryngoscopy in conjunction with assessment of subjective symptoms of hoarseness, voice change, and swallowing difficulty. Results Three patients had extensive involvement of the nerve by tumor with complete postoperative nerve deficit, one patient had a transient deficit following dissection of tumor off of nerve with resolution, and the remaining patients had nerve preservation. One patient experienced minor epistaxis during monitor tube placement that was managed conservatively. Conclusions Transnasal vagal nerve monitoring is a simple method that allows for intraoperative monitoring during nerve preservation surgery without limiting surgical exposure. PMID:25844292

  16. The Influence of an Extensive Inquiry-Based Field Experience on Pre-Service Elementary Student Teachers' Science Teaching Beliefs

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Sumita; Volk, Trudi; Lumpe, Andrew

    2009-06-01

    This study examined the effects of an extensive inquiry-based field experience on pre service elementary teachers’ personal agency beliefs, a composite measure of context beliefs and capability beliefs related to teaching science. The research combined quantitative and qualitative approaches and included an experimental group that utilized the inquiry method and a control group that used traditional teaching methods. Pre- and post-test scores for the experimental and control groups were compared. The context beliefs of both groups showed no significant change as a result of the experience. However, the control group’s capability belief scores, lower than those of the experimental group to start with, declined significantly; the experimental group’s scores remained unchanged. Thus, the inquiry-based field experience led to an increase in personal agency beliefs. The qualitative data suggested a new hypothesis that there is a spiral relationship among teachers’ ability to establish communicative relationships with students, desire for personal growth and improvement, ability to implement multiple instructional strategies, and possession of substantive content knowledge. The study concludes that inquiry-based student teaching should be encouraged in the training of elementary school science teachers. However, the meaning and practice of the inquiry method should be clearly delineated to ensure its correct implementation in the classroom.

  17. Kernel-Based Discriminant Techniques for Educational Placement

    ERIC Educational Resources Information Center

    Lin, Miao-hsiang; Huang, Su-yun; Chang, Yuan-chin

    2004-01-01

    This article considers the problem of educational placement. Several discriminant techniques are applied to a data set from a survey project of science ability. A profile vector for each student consists of five science-educational indicators. The students are intended to be placed into three reference groups: advanced, regular, and remedial.…
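
One simple kernel-based discriminant of the kind the article surveys is a kernel nearest-mean classifier: each five-indicator profile is compared with the three group means in an RBF-kernel feature space. A sketch on synthetic profiles (the group structure, gamma, and all data are invented for illustration; this is a generic stand-in, not the article's specific methods):

```python
import numpy as np

# Kernel nearest-mean discriminant: classify a profile by its squared
# feature-space distance to each group's mean under an RBF kernel.

rng = np.random.default_rng(5)

def rbf(a, b, gamma=0.5):
    d = a[:, None, :] - b[None, :, :]
    return np.exp(-gamma * (d ** 2).sum(axis=-1))

# synthetic profiles for three groups: remedial, regular, advanced
groups = [rng.normal(m, 0.3, size=(20, 5)) for m in (1.0, 2.0, 3.0)]

def classify(x, groups):
    # squared feature-space distance to a group mean:
    # k(x,x) - 2*mean_i k(x, g_i) + mean_ij k(g_i, g_j)
    scores = []
    for g in groups:
        d2 = (rbf(x, x).item()
              - 2 * rbf(x, g).mean()
              + rbf(g, g).mean())
        scores.append(d2)
    return int(np.argmin(scores))      # index of the nearest group

student = np.full((1, 5), 2.1)         # profile near the "regular" group
placement = classify(student, groups)
```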

  18. Bit-depth extension using spatiotemporal microdither based on models of the equivalent input noise of the visual system

    NASA Astrophysics Data System (ADS)

    Daly, Scott J.; Feng, Xiaofan

    2003-01-01

    Continuous tone, or "contone", imagery usually has 24 bits/pixel as a minimum, with eight bits each for the three primaries in typical displays. However, lower-cost displays constrain this number because of various system limitations. Conversely, high quality displays seek to achieve 9-10 bits/pixel/color, though there may be system bottlenecks limited at 8. The two main artifacts from reduced bit-depth are contouring and loss of amplitude detail; these can be prevented by dithering the image prior to these bit-depth losses. Early work in this area includes Roberts' noise modulation technique, Mitsa's blue noise mask, Tyler's technique of bit-stealing, and Mulligan's use of the visual system's spatiotemporal properties for spatiotemporal dithering. However, most halftoning/dithering work was primarily directed to displays at the lower end of bits/pixel (e.g., 1 bit as in halftoning) and higher ppi. Like Tyler, we approach the problem from the higher end of bits/pixel/color, say 6-8, and use available high frequency color content to generate even higher luminance amplitude resolution. Bit-depth extension with a high starting bit-depth (and often lower spatial resolution) changes the game substantially from halftoning experience. For example, complex algorithms like error diffusion and annealing are not needed, just the simple addition of noise. Instead of a spatial dither, it is better to use an amplitude dither, termed microdither by Pappas. We have looked at methods of generating the highest invisible opponent color spatiotemporal noise and other patterns, and have used Ahumada's concept of equivalent input noise to guide our work. This paper will report on techniques and observations made in achieving contone quality on ~100 ppi 6 bits/pixel/color LCD displays with no visible dither patterns, noise, contours, or loss of amplitude detail at viewing distances as close as the near focus limit (~120 mm). These include the interaction of display nonlinearities and
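
The "simple addition of noise" idea above is easy to demonstrate: adding small-amplitude noise before quantization trades contouring for noise, so the local mean of a smooth ramp survives a bit-depth reduction. A toy 8-bit to 6-bit sketch (the signal, noise distribution, and averaging window are illustrative choices):

```python
import numpy as np

# Microdither sketch: quantize a smooth ramp with and without a small
# rectangular noise dither and compare how well the local mean of the
# signal is preserved.

rng = np.random.default_rng(6)
ramp = np.linspace(0.0, 255.0, 4096)         # smooth 8-bit-range signal
step = 4.0                                   # 8-bit -> 6-bit quantizer step

hard = np.round(ramp / step) * step          # plain quantization: contours
noise = rng.uniform(-step / 2, step / 2, ramp.size)
dithered = np.round((ramp + noise) / step) * step

# dithering keeps the local mean close to the original signal
err_hard = np.abs(hard - ramp).mean()
err_mean = np.abs(np.convolve(dithered - ramp,
                              np.ones(64) / 64, "same")).mean()
```

The eye (or a local average) integrates the dither noise away, whereas the staircase error of plain quantization is correlated with the signal and shows up as visible contours.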

  19. Response Time Comparisons among Four Base Running Starting Techniques in Slow Pitch Softball.

    ERIC Educational Resources Information Center

    Israel, Richard G.; Brown, Rodney L.

    1981-01-01

    Response times among four starting techniques (cross-over step, jab step, standing sprinter's start, and momentum start) were compared. The results suggest that the momentum start was the fastest starting technique for optimum speed in running bases. (FG)

  20. Assessing the Utility of the Nominal Group Technique as a Consensus-Building Tool in Extension-Led Avian Influenza Response Planning

    ERIC Educational Resources Information Center

    Kline, Terence R.

    2013-01-01

    The intent of the project described was to apply the Nominal Group Technique (NGT) to achieve a consensus on Avian Influenza (AI) planning in Northeastern Ohio. Nominal Group Technique is a process first developed by Delbecq, Van de Ven, and Gustafson (1975) to allow all participants to have an equal say in an open forum setting. A very diverse…

  1. Knowledge-based GIS techniques applied to geological engineering

    USGS Publications Warehouse

    Usery, E. Lynn; Altheide, Phyllis; Deister, Robin R.P.; Barr, David J.

    1988-01-01

    A knowledge-based geographic information system (KBGIS) approach which requires development of a rule base for both GIS processing and for the geological engineering application has been implemented. The rule bases are implemented in the Goldworks expert system development shell interfaced to the Earth Resources Data Analysis System (ERDAS) raster-based GIS for input and output. GIS analysis procedures including recoding, intersection, and union are controlled by the rule base, and the geological engineering map product is generated by the expert system. The KBGIS has been used to generate a geological engineering map of Creve Coeur, Missouri.
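
The rule-driven recode/intersection workflow above can be sketched in miniature: raster layers go in, an ordered rule base fires predicates over them, and cells are recoded into a derived suitability map. The layer names, thresholds, and rules below are invented for illustration, not taken from the Creve Coeur application:

```python
import numpy as np

# Toy KBGIS sketch: a small rule base drives raster recoding and
# intersection (boolean combination of layer predicates); the first
# matching rule classifies each cell.

slope = np.array([[2, 8], [15, 3]])                 # percent slope raster
depth_to_rock = np.array([[5.0, 1.2], [0.8, 4.0]])  # metres

# rule base: predicates over layers, fired in order; first match wins
rules = [
    ("unsuitable", lambda s, d: (s > 10) | (d < 1.0)),
    ("marginal",   lambda s, d: (s > 5) | (d < 2.0)),
    ("suitable",   lambda s, d: np.ones_like(s, dtype=bool)),
]

suitability = np.full(slope.shape, "", dtype=object)
for label, predicate in rules:
    mask = predicate(slope, depth_to_rock) & (suitability == "")
    suitability[mask] = label                       # recode cells by rule
```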

  2. Wavelet-based techniques for the gamma-ray sky

    NASA Astrophysics Data System (ADS)

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias; Lee, Samuel K.

    2016-07-01

    We demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
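
Scale separation by wavelet decomposition can be sketched with one level of a 2-D Haar transform, which splits a map into a coarse approximation (large angular scales) and detail bands (small scales). A toy "sky map" of a smooth gradient plus a point-like source stands in for real gamma-ray data:

```python
import numpy as np

# One-level 2-D Haar decomposition: ll holds the smooth large-scale
# content, while lh/hl/hh hold small-scale detail.

sky = np.add.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
sky[32, 32] += 5.0                    # point-like source on the gradient

def haar2d_level(img):
    a = (img[0::2] + img[1::2]) / 2   # average adjacent rows
    d = (img[0::2] - img[1::2]) / 2   # row-wise detail
    ll = (a[:, 0::2] + a[:, 1::2]) / 2
    lh = (a[:, 0::2] - a[:, 1::2]) / 2
    hl = (d[:, 0::2] + d[:, 1::2]) / 2
    hh = (d[:, 0::2] - d[:, 1::2]) / 2
    return ll, lh, hl, hh

ll, lh, hl, hh = haar2d_level(sky)

# the smooth gradient lives almost entirely in ll; the point source
# dominates the finest detail band at its (downsampled) location
detail_peak = np.unravel_index(np.argmax(np.abs(hh)), hh.shape)
```

Repeating the split on ll gives a full multi-scale pyramid, which is how structures on scales different from the foreground/background templates can be isolated.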

  3. The future of magnetic resonance-based techniques in neurology.

    PubMed

    2001-01-01

    Magnetic resonance techniques have become increasingly important in neurology for defining: 1. brain, spinal cord and peripheral nerve or muscle structure; 2. pathological changes in tissue structures and properties; and 3. dynamic patterns of functional activation of the brain. New applications have been driven in part by advances in hardware, particularly improvements in magnet and gradient coil design. New imaging strategies allow novel approaches to contrast with, for example, diffusion imaging, magnetization transfer imaging, perfusion imaging and functional magnetic resonance imaging. In parallel with developments in hardware and image acquisition have been new approaches to image analysis. These have allowed quantitative descriptions of the image changes to be used for a precise, non-invasive definition of pathology. With the increasing capabilities and specificity of magnetic resonance techniques it is becoming more important that the neurologist is intimately involved in both the selection of magnetic resonance studies for patients and their interpretation. There is a need for considerably improved access to magnetic resonance technology, particularly in the acute or intensive care ward and in the neurosurgical theatre. This report illustrates several key developments. The task force concludes that magnetic resonance imaging is a major clinical tool of growing significance and offers recommendations for maximizing the potential future for magnetic resonance techniques in neurology. PMID:11509077

  4. Research and development of LANDSAT-based crop inventory techniques

    NASA Technical Reports Server (NTRS)

    Horvath, R.; Cicone, R. C.; Malila, W. A. (Principal Investigator)

    1982-01-01

    A wide spectrum of technology pertaining to the inventory of crops using LANDSAT without in situ training data is addressed. Methods considered include Bayesian based through-the-season methods, estimation technology based on analytical profile fitting methods, and expert-based computer aided methods. Although the research was conducted using U.S. data, the adaptation of the technology to the Southern Hemisphere, especially Argentina was considered.

  5. Applying Knowledge-Based Techniques to Software Development.

    ERIC Educational Resources Information Center

    Harandi, Mehdi T.

    1986-01-01

    Reviews overall structure and design principles of a knowledge-based programming support tool, the Knowledge-Based Programming Assistant, which is being developed at University of Illinois Urbana-Champaign. The system's major units (program design program coding, and intelligent debugging) and additional functions are described. (MBR)

  6. Fast Multigrid Techniques in Total Variation-Based Image Reconstruction

    NASA Technical Reports Server (NTRS)

    Oman, Mary Ellen

    1996-01-01

    Existing multigrid techniques are used to effect an efficient method for reconstructing an image from noisy, blurred data. Total Variation minimization yields a nonlinear integro-differential equation which, when discretized using cell-centered finite differences, yields a full matrix equation. A fixed point iteration is applied with the intermediate matrix equations solved via a preconditioned conjugate gradient method which utilizes multi-level quadrature (due to Brandt and Lubrecht) to apply the integral operator and a multigrid scheme (due to Ewing and Shen) to invert the differential operator. With effective preconditioning, the method presented seems to require O(n) operations. Numerical results are given for a two-dimensional example.
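
The fixed-point iteration at the heart of such TV-based reconstruction ("lagged diffusivity") can be sketched in 1-D: each outer iteration freezes the nonlinear diffusion coefficients at the previous iterate and solves a linear system, which is the step the multigrid machinery accelerates. Here the small tridiagonal system is solved directly, and the signal, alpha, and eps are illustrative choices:

```python
import numpy as np

# Lagged-diffusivity fixed-point iteration for 1-D TV denoising:
# solve (I + alpha * D^T W(u_prev) D) u = noisy repeatedly, where W
# holds the TV diffusion coefficients frozen at the previous iterate.

n = 128
x = np.linspace(0, 1, n)
clean = (x > 0.5).astype(float)            # piecewise-constant signal
rng = np.random.default_rng(7)
noisy = clean + 0.1 * rng.normal(size=n)

alpha, eps = 0.05, 1e-3                    # TV weight, smoothing parameter
u = noisy.copy()
for _ in range(30):                        # fixed-point (outer) iterations
    du = np.diff(u)
    w = 1.0 / np.sqrt(du ** 2 + eps)       # lagged diffusion coefficients
    # assemble the tridiagonal matrix I + alpha * D^T W D
    main = np.ones(n)
    main[:-1] += alpha * w
    main[1:] += alpha * w
    A = np.diag(main) + np.diag(-alpha * w, 1) + np.diag(-alpha * w, -1)
    u = np.linalg.solve(A, noisy)          # multigrid/PCG target in 2-D

rmse = np.sqrt(np.mean((u - clean) ** 2))
noisy_rmse = np.sqrt(np.mean((noisy - clean) ** 2))
```

In 2-D with a blur operator the inner solve becomes the large structured system the paper attacks with preconditioned conjugate gradients and multigrid.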

  7. Radiation synthesized protein-based nanoparticles: A technique overview

    NASA Astrophysics Data System (ADS)

    Varca, Gustavo H. C.; Perossi, Gabriela G.; Grasselli, Mariano; Lugão, Ademar B.

    2014-12-01

    Seeking for alternative routes for protein engineering a novel technique - radiation induced synthesis of protein nanoparticles - to achieve size controlled particles with preserved bioactivity has been recently reported. This work aimed to evaluate different process conditions to optimize and provide an overview of the technique using γ-irradiation. Papain was used as model protease and the samples were irradiated in a gamma cell irradiator in phosphate buffer (pH=7.0) containing ethanol (0-35%). The dose effect was evaluated by exposure to distinct γ-irradiation doses (2.5, 5, 7.5 and 10 kGy) and scale up experiments involving distinct protein concentrations (12.5-50 mg mL-1) were also performed. Characterization involved size monitoring using dynamic light scattering. Bityrosine detection was performed using fluorescence measurements in order to provide experimental evidence of the mechanism involved. Best dose effects were achieved at 10 kGy with regard to size and no relevant changes were observed as a function of papain concentration, highlighting very broad operational concentration range. Bityrosine changes were identified for the samples as a function of the process confirming that such linkages play an important role in the nanoparticle formation.

  8. Planning/scheduling techniques for VQ-based image compression

    NASA Technical Reports Server (NTRS)

    Short, Nicholas M., Jr.; Manohar, Mareboyana; Tilton, James C.

    1994-01-01

    The enormous size of the data holdings and the complexity of the information system resulting from the EOS system pose several challenges to computer scientists, one of which is data archival and dissemination. More than ninety percent of the data holdings of NASA are in the form of images which will be accessed by users across the computer networks. Accessing the image data in its full resolution creates data traffic problems. Image browsing using a lossy compression reduces this data traffic, as well as storage, by a factor of 30-40. Of the several image compression techniques, VQ is the most appropriate for this application since the decompression of VQ compressed images is a table lookup process which makes minimal additional demands on the user's computational resources. Lossy compression of image data needs expert-level knowledge in general and is not straightforward to use. This is especially true in the case of VQ. It involves the selection of appropriate codebooks for a given data set and vector dimensions for each compression ratio, etc. A planning and scheduling system is described for using the VQ compression technique in the data access and ingest of raw satellite data.
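    The table-lookup decompression the abstract refers to can be sketched in a few lines. The example below is a generic vector-quantization illustration (not NASA's system): a small codebook is trained with k-means, encoding maps each vector to its nearest codeword's index, and decoding is pure indexing.

    ```python
    import numpy as np

    def train_codebook(vectors, k=4, iters=10, seed=0):
        """Naive k-means codebook training (illustrative parameters)."""
        rng = np.random.default_rng(seed)
        codebook = vectors[rng.choice(len(vectors), size=k, replace=False)]
        for _ in range(iters):
            d = ((vectors[:, None, :] - codebook[None]) ** 2).sum(-1)
            idx = d.argmin(1)
            for j in range(k):
                if np.any(idx == j):           # skip empty clusters
                    codebook[j] = vectors[idx == j].mean(0)
        return codebook

    def vq_encode(vectors, codebook):
        """Each vector -> index of its nearest codeword."""
        d = ((vectors[:, None, :] - codebook[None]) ** 2).sum(-1)
        return d.argmin(1)

    def vq_decode(indices, codebook):
        """Decompression is a table lookup, as the abstract notes."""
        return codebook[indices]
    ```

    The compressed representation is just the index stream plus the (shared) codebook, which is why the decoder needs essentially no computation.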

  9. Bioluminescence-based imaging technique for pressure measurement in water

    NASA Astrophysics Data System (ADS)

    Watanabe, Yasunori; Tanaka, Yasufumi

    2011-07-01

    The dinoflagellate Pyrocystis lunula emits light in response to water motion. We developed a new imaging technique for measuring pressure using plankton that emits light in response to mechanical stimulation. The bioluminescence emitted by P. lunula was used to measure impact water pressure produced using weight-drop tests. The maximum mean luminescence intensity correlated with the maximum impact pressure that the cells receive when the circadian and diurnal biological rhythms are appropriately controlled. Thus, with appropriate calibration of experimentally determined parameters, the dynamic impact pressure can be estimated by measuring the cell-flash distribution. Statistical features of the evolution of flash intensity and the probability distribution during the impacting event, which are described by both biological and mechanical response parameters, are also discussed in this paper. The practical applicability of this bioluminescence imaging technique is examined through a water drop test. The maximum dynamic pressure, occurring at the impact of a water jet against a wall, was estimated from the flash intensity of the dinoflagellate.

  10. Computer-vision-based registration techniques for augmented reality

    NASA Astrophysics Data System (ADS)

    Hoff, William A.; Nguyen, Khoi; Lyon, Torsten

    1996-10-01

    Augmented reality is a term used to describe systems in which computer-generated information is superimposed on top of the real world; for example, through the use of a see-through head-mounted display. A human user of such a system could still see and interact with the real world, but have valuable additional information, such as descriptions of important features or instructions for performing physical tasks, superimposed on the world. For example, the computer could identify real-world objects and overlay them with graphic outlines, labels, and schematics. The graphics are registered to the real-world objects and appear to be 'painted' onto those objects. Augmented reality systems can be used to make productivity aids for tasks such as inspection, manufacturing, and navigation. One of the most critical requirements for augmented reality is to recognize and locate real-world objects with respect to the person's head. Accurate registration is necessary in order to overlay graphics accurately on top of the real-world objects. At the Colorado School of Mines, we have developed a prototype augmented reality system that uses head-mounted cameras and computer vision techniques to accurately register the head to the scene. The current system locates and tracks a set of pre-placed passive fiducial targets placed on the real-world objects. The system computes the pose of the objects and displays graphics overlays using a see-through head-mounted display. This paper describes the architecture of the system and outlines the computer vision techniques used.

  11. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low-cost computer systems for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  12. Tornado wind-loading requirements based on risk assessment techniques

    SciTech Connect

    Deobald, T.L.; Coles, G.A.; Smith, G.L.

    1991-06-01

    Regulations require that nuclear power plants be protected from tornado winds. If struck by a tornado, a plant must be capable of safely shutting down and removing decay heat. Probabilistic techniques are used to show that risk to the public from the US Department of Energy (DOE) SP-100 reactor is acceptable without tornado hardening parts of the secondary system. Relaxed requirements for design wind loadings will result in significant cost savings. To demonstrate an acceptable level of risk, this document examines tornado-initiated accidents. The two tornado-initiated accidents examined in detail are loss of cooling resulting in core damage and loss of secondary system boundary integrity leading to sodium release. Loss of core cooling is analyzed using fault/event tree models. Loss of secondary system boundary integrity is analyzed by comparing the consequences to acceptance criteria for the release of radioactive material or alkali metal aerosol. 4 refs., 4 figs.

  13. Feature based sliding window technique for face recognition

    NASA Astrophysics Data System (ADS)

    Javed, Muhammad Younus; Mohsin, Syed Maajid; Anjum, Muhammad Almas

    2010-02-01

    Human beings are commonly identified by biometric schemes, which identify individuals by their unique physical characteristics. Passwords and personal identification numbers have been used for this purpose for years; their disadvantages are that someone else may use them and that they are easily forgotten. Keeping these problems in view, biometric approaches such as face recognition, fingerprint, iris/retina and voice recognition have been developed, which provide a far better solution for identifying individuals. A number of methods have been developed for face recognition. This paper illustrates the employment of Gabor filters for extracting facial features by constructing a sliding window frame. Classification is done by assigning the class label of the database image with which the unknown image shares the most similar features. The proposed system gives a recognition rate of 96%, which is better than many of the similar techniques used for face recognition.
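    A Gabor-filter feature extractor of the general kind described can be sketched as follows. This is a generic illustration (kernel size, wavelength and orientations are made up, not the paper's settings): each window of the sliding frame would be passed through a small bank of oriented Gabor filters, and the mean response magnitudes form the feature vector.

    ```python
    import numpy as np

    def gabor_kernel(ksize=15, sigma=3.0, theta=0.0, lam=6.0):
        """Real Gabor kernel: Gaussian envelope times an oriented cosine wave."""
        half = ksize // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        return np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / lam)

    def gabor_features(window, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
        """Mean Gabor response magnitude per orientation for one window."""
        feats = []
        for th in thetas:
            k = gabor_kernel(theta=th)
            # circular convolution via FFT; only response magnitudes are kept
            resp = np.abs(np.fft.ifft2(np.fft.fft2(window) * np.fft.fft2(k, window.shape)).real)
            feats.append(resp.mean())
        return np.array(feats)
    ```

    Sliding the window over the face image and concatenating these vectors gives the per-window features that a nearest-match classifier can compare against the database.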

  14. A fast Stokes inversion technique based on quadratic regression

    NASA Astrophysics Data System (ADS)

    Teng, Fei; Deng, Yuan-Yong

    2016-05-01

    Stokes inversion calculation is a key process in resolving polarization information on radiation from the Sun and obtaining the associated vector magnetic fields. Even in the cases of simple local thermodynamic equilibrium (LTE) and where the Milne-Eddington approximation is valid, the inversion problem may not be easy to solve. The initial values for the iterations are important in handling the case with multiple minima. In this paper, we develop a fast inversion technique without iterations. The time taken for computation is only 1/100 the time that the iterative algorithm takes. In addition, it can provide available initial values even in cases with lower spectral resolutions. This strategy is useful for a filter-type Stokes spectrograph, such as SDO/HMI and the developed two-dimensional real-time spectrograph (2DS).
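    The core idea, locating an extremum by quadratic regression instead of iterating, can be shown in miniature. This is a generic sketch, not the authors' Stokes-specific formulation: sample a merit function at a few parameter values, fit a parabola, and read the vertex off in closed form, with no iteration.

    ```python
    import numpy as np

    def quadratic_minimum(x, y):
        """Fit y ~ a*x^2 + b*x + c and return the closed-form vertex -b/(2a)."""
        a, b, c = np.polyfit(x, y, 2)
        return -b / (2 * a)
    ```

    The fitted vertex can also serve as the initial value for a conventional iterative inversion, which is the fallback role the abstract mentions for low spectral resolutions.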

  15. Conductivity-Based Detection Techniques in Nanofluidic Devices

    PubMed Central

    Harms, Zachary D.; Haywood, Daniel G.; Kneller, Andrew R.

    2016-01-01

    This review covers conductivity detection in fabricated nanochannels and nanopores. Improvements in nanoscale sensing are a direct result of advances in fabrication techniques, which produce devices with channels and pores with reproducible dimensions and in a variety of materials. Analytes of interest are detected by measuring changes in conductance as the analyte accumulates in the channel or passes transiently through the pore. These detection methods take advantage of phenomena enhanced at the nanoscale, such as ion current rectification, surface conductance, and dimensions comparable to the analytes of interest. The end result is the development of sensing technologies for a broad range of analytes, e.g., ions, small molecules, proteins, nucleic acids, and particles. PMID:25988434

  16. YREE determination in seawater. Standardization and validation of a new method based on preconcentration techniques

    NASA Astrophysics Data System (ADS)

    Raso, Maria; Saiano, Filippo; Montalbano, Maria; Censi, Paolo

    2010-05-01

    The most interesting attraction of using rare-earth elements and yttrium (YREE) to address geochemical and marine chemical problems is their chemical coherence as a group of trace elements. These characters allow YREE compositions of rocks and minerals to be extensively used in studies of provenance, petrogenesis and chemical evolution of geological materials (1). Similarly, YREE compositions in the hydrosphere were used in studies of coagulation, particle-solution reactions and oceanic circulation of water masses (2-4). Unfortunately, the very low concentrations of YREE (ng l-1 or sub-ng l-1), combined with the high ionic strength of seawater, have always been the main difficulty in analysing dissolved YREE in the marine environment. The first geochemical investigations of YREE contents in seawater were carried out using neutron activation and isotope dilution mass spectrometry, which have been almost entirely replaced by inductively coupled plasma mass spectrometry (ICP-MS) in recent years. This technique offers many advantages, including simultaneous analysis of all the elements of the series and their quantitative determination with detection limits of the order of ng l-1 when associated with preconcentration techniques (5). To perform ultra-trace YREE analyses in seawater, we developed a preconcentration method based on CHELEX-100 iminodiacetate resin followed by ICP-MS determination (Ref). In this study the YREE behaviour was quantitatively investigated during interactions with the ion-chelating resin, and the combined measurement uncertainty was evaluated with a rigorous metrological approach based on method validation and quality control of YREE data. These goals were achieved using synthetic seawater in which YREE concentrations matched those occurring in natural seawater samples. Under these conditions good recoveries were obtained along the YREE series, ranging from 75%-85% for heavy REE to 90%-100% for Y and light REE.

  17. An Extensive Survey of Dayside Diffuse Aurora (DDA) Based on Optical Observations at Yellow River Station (YRS)

    NASA Astrophysics Data System (ADS)

    Desheng, H.

    2015-12-01

    By using 7 years of optical auroral observations obtained at Yellow River Station at Ny-Alesund, Svalbard, we performed the first extensive survey of the dayside diffuse auroras (DDAs) and acquired observational results as follows. (1) The DDAs can be classified into 2 broad categories, i.e., unstructured and structured DDAs. The unstructured DDAs are mainly distributed in the morning and afternoon, but the structured DDAs predominantly occurred around the magnetic local noon (MLN). (2) The unstructured DDAs observed in the morning and afternoon present obviously different properties. The afternoon ones are much more stable and seldom show pulsating properties. (3) The DDAs are more easily observed under geomagnetically quiet times. (4) The structured DDAs mainly show patchy, stripy, and irregular forms, and are often pulsating and drifting. The drifting directions are mostly westward (with speed ~5 km/s), but there are cases showing eastward or poleward drifting. (5) The stripy DDAs are exclusively observed near the MLN and, most importantly, their alignments are confirmed to be consistent with the direction of ionospheric convection near the MLN. (6) A new auroral form, called throat aurora, is found to develop from the stripy DDAs. Based on the observational results and previous studies, we propose our explanations for the DDAs. We suggest that the unstructured DDAs observed in the morning are extensions of the nightside diffuse aurora to the dayside, while those observed in the afternoon are predominantly caused by proton precipitation. The structured DDAs occurring near the MLN are caused by interactions of cold plasma structures, which are supposed to originate from ionospheric outflows or plasmaspheric drainage plumes, with hot electrons from the plasma sheet. We suppose that the cold plasma structures producing the patchy DDAs are lumpy and are more likely from the plasmaspheric drainage plumes. The cold plasma structure for producing the stripy DDAs should

  18. Folder: a MATLAB-based tool for modelling deformation in layered media subject to layer parallel shortening or extension

    NASA Astrophysics Data System (ADS)

    Adamuszek, Marta; Dabrowski, Marcin; Schmid, Daniel W.

    2016-04-01

    We present Folder, a numerical tool to simulate and analyse structure development in mechanically layered media during layer-parallel shortening or extension. Folder includes a graphical user interface that allows easy design of complex geometrical models, definition of material parameters (including linear and non-linear rheology), and specification of the type and amount of deformation. It also includes a range of features that facilitate the visualization and examination of various relevant quantities, e.g. velocities, stress, rate of deformation, pressure, and finite strain. Folder contains a separate application, which illustrates analytical solutions of growth rate spectra for layer-parallel shortening and extension of a single viscous layer. In the study, we also demonstrate a Folder application in which we examine the role of confinement on the growth rate spectrum and the fold shape evolution during the deformation of a single layer subject to layer-parallel shortening. In the case of linear viscous materials used for the layer and matrix, close wall proximity leads to a decrease of the growth rate values. The decrease is more pronounced for the larger wavelengths than for the smaller wavelengths. The growth rate reduction is greater when the walls are set closer to the layer. The presence of close confinement can also affect the wavelength selection process and significantly shift the position of the dominant wavelength. The influence of wall proximity on the growth rate spectrum for the case of non-linear viscous materials used for the layer and/or matrix is very different compared to the linear viscous case. We observe multiple maxima in the growth rate spectrum. The number of growth rate maxima, their values and their positions strongly depend on the closeness of the confinement. The maximum growth rate value for a selected range of layer-wall distances is much larger than in the case when the confinement effect is not taken into account.

  19. Large area photodetector based on microwave cavity perturbation techniques

    SciTech Connect

    Braggio, C.; Carugno, G.; Sirugudu, R. K.; Lombardi, A.; Ruoso, G.

    2014-07-28

    We present a preliminary study to develop a large area photodetector, based on a semiconductor crystal placed inside a superconducting resonant cavity. Laser pulses are detected through a variation of the cavity impedance, as a consequence of the conductivity change in the semiconductor. A novel method, whereby the designed photodetector is simulated by finite element analysis, makes it possible to perform pulse-height spectroscopy on the reflected microwave signals. We measure an energy sensitivity of 100 fJ in the average mode without the employment of low noise electronics and suggest possible ways to further reduce the single-shot detection threshold, based on the results of the described method.

  20. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture analysis based method for diagnosing the Basal Cell Carcinoma (BCC) skin cancer using optical images taken from the suspicious skin regions. We first extracted the Run Length Matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a Multi-Layer Perceptron (MLP) classifier to classify the images to BCC or normal cases. Experiments showed that detecting BCC cancer based on optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.
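    The Haralick features mentioned above are statistics of a gray-level co-occurrence matrix (GLCM). A minimal sketch of two such features follows; the displacement, quantization level and feature choice here are illustrative, not the paper's configuration.

    ```python
    import numpy as np

    def glcm(image, dx=1, dy=0, levels=8):
        """Symmetric, normalized gray-level co-occurrence matrix.
        `image` must hold integer gray levels in [0, levels)."""
        m = np.zeros((levels, levels))
        h, w = image.shape
        for y in range(h - dy):
            for x in range(w - dx):
                m[image[y, x], image[y + dy, x + dx]] += 1
        m += m.T                      # make the matrix symmetric
        return m / m.sum()

    def haralick_contrast(p):
        """Sum of (i-j)^2 * p(i,j): large for rapidly varying texture."""
        i, j = np.indices(p.shape)
        return ((i - j) ** 2 * p).sum()

    def haralick_energy(p):
        """Sum of p(i,j)^2: large for uniform texture."""
        return (p ** 2).sum()
    ```

    A feature-selection step over many such statistics (plus run-length features), followed by an MLP, is the pipeline the abstract describes.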

  1. Estimating monthly temperature using point based interpolation techniques

    NASA Astrophysics Data System (ADS)

    Saaban, Azizan; Mah Hashim, Noridayu; Murat, Rusdi Indra Zuhdi

    2013-04-01

    This paper discusses the use of point-based interpolation to estimate the temperature at unallocated meteorology stations in Peninsular Malaysia using data for the year 2010 collected from the Malaysian Meteorology Department. Two point-based interpolation methods, Inverse Distance Weighted (IDW) and Radial Basis Function (RBF), are considered. The accuracy of the methods is evaluated using Root Mean Square Error (RMSE). The results show that RBF with the thin plate spline model is suitable as a temperature estimator for the months of January and December, while RBF with the multiquadric model is suitable for estimating the temperature for the rest of the months.
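    Inverse Distance Weighted interpolation itself is simple to state: a query point's value is a distance-weighted average of the station values. A minimal sketch (the coordinates and temperatures below are made up, not the Malaysian station data):

    ```python
    import numpy as np

    def idw(stations, values, query, power=2.0):
        """Inverse Distance Weighted estimate at `query`.
        stations: (n, 2) coordinates; values: (n,) observations."""
        d = np.sqrt(((stations - query) ** 2).sum(axis=1))
        if np.any(d == 0):
            return values[np.argmin(d)]        # exact at a station location
        w = 1.0 / d ** power
        return (w * values).sum() / w.sum()
    ```

    Because the weights are non-negative and normalized, the estimate is a convex combination of the station values, so it can never leave their range; RBF interpolants, by contrast, can overshoot.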

  2. Visual cryptography based on optical interference encryption technique

    NASA Astrophysics Data System (ADS)

    Seo, Dong-Hoan; Kim, Jong-Yun; Lee, Sang-Su; Park, Se-Joon; Cho, Woong H.; Kim, Soo-Joong

    2001-07-01

    In this paper, we propose a new visual cryptography scheme based on optical interference that can improve the contrast and signal-to-noise ratio of reconstructed images compared to conventional visual cryptography methods. The binary image to be encrypted is divided into any number n of slides. For encryption, independent random keys are generated, along with another random key obtained by an XOR of those keys. An XOR between each divided image and each random key produces the n encrypted images. These encrypted images are then used to make encrypted binary phase masks. For decryption, the phase masks are placed on the paths of a Mach-Zehnder interferometer.
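    The XOR splitting at the heart of the scheme can be sketched independently of the optics. Below is a generic n-share XOR secret-sharing sketch (a digital analogue of the slide generation; it does not model the interferometric decryption): all n shares are needed, and any n-1 of them reveal nothing about the secret.

    ```python
    import secrets

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def make_shares(secret, n):
        """Split `secret` into n shares whose XOR reconstructs it."""
        shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
        last = secret
        for s in shares:               # last share = secret XOR all random shares
            last = xor_bytes(last, s)
        return shares + [last]

    def combine(shares):
        """XOR all shares together to recover the secret."""
        out = shares[0]
        for s in shares[1:]:
            out = xor_bytes(out, s)
        return out
    ```

    In the optical scheme the same cancellation is realized physically: the interferometer superposes the phase masks so that the key terms cancel and only the image remains.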

  3. An RSA-Based Leakage-Resilient Authenticated Key Exchange Protocol Secure against Replacement Attacks, and Its Extensions

    NASA Astrophysics Data System (ADS)

    Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki

    Secure channels can be realized by an authenticated key exchange (AKE) protocol that generates authenticated session keys between the involved parties. In [32], Shin et al. proposed a new kind of AKE (RSA-AKE) protocol whose goal is to provide high efficiency and security against leakage of stored secrets as much as possible. Let us consider more powerful attacks where an adversary completely controls the communications and the stored secrets (the latter are denoted by “replacement” attacks). In this paper, we first show that the RSA-AKE protocol [32] is no longer secure against such an adversary. The main contributions of this paper are as follows: (1) we propose an RSA-based leakage-resilient AKE (RSA-AKE2) protocol that is secure against active attacks as well as replacement attacks; (2) we prove that the RSA-AKE2 protocol is secure against replacement attacks based on number theory results; (3) we show that it is provably secure in the random oracle model, by showing the reduction to RSA one-wayness, under an extended model that covers active attacks and replacement attacks; (4) in terms of efficiency, the RSA-AKE2 protocol is comparable to [32] in the sense that the client needs to compute only one modular multiplication with pre-computation; and (5) we also discuss extensions of the RSA-AKE2 protocol for several security properties (i.e., synchronization of stored secrets, privacy of client and solution to server compromise-impersonation attacks).

  4. Evaluation of SGML-based Information through Fuzzy Techniques.

    ERIC Educational Resources Information Center

    Fontana, Francesca Arcelli

    2001-01-01

    Discussion of knowledge management, information retrieval, information filtering, and information evaluation focuses on knowledge evaluation and proposes some evaluation methods based on L-grammars which are fuzzy grammars. Applies these methods to the evaluation of documents in SGML and to the evaluation of pages in HTML in the World Wide Web.…

  5. EXPERIMENTAL AND THEORETICAL EVALUATIONS OF OBSERVATIONAL-BASED TECHNIQUES

    EPA Science Inventory

    Observational Based Methods (OBMs) can be used by EPA and the States to develop reliable ozone controls approaches. OBMs use actual measured concentrations of ozone, its precursors, and other indicators to determine the most appropriate strategy for ozone control. The usual app...

  6. Problem-Based Learning Supported by Semantic Techniques

    ERIC Educational Resources Information Center

    Lozano, Esther; Gracia, Jorge; Corcho, Oscar; Noble, Richard A.; Gómez-Pérez, Asunción

    2015-01-01

    Problem-based learning has been applied over the last three decades to a diverse range of learning environments. In this educational approach, different problems are posed to the learners so that they can develop different solutions while learning about the problem domain. When applied to conceptual modelling, and particularly to Qualitative…

  7. Key techniques for space-based solar pumped semiconductor lasers

    NASA Astrophysics Data System (ADS)

    He, Yang; Xiong, Sheng-jun; Liu, Xiao-long; Han, Wei-hua

    2014-12-01

    In space, laser transmission is free of atmospheric turbulence, absorption, dispersion and aerosol effects. Therefore, space-based lasers have important value in satellite communication, satellite attitude control, space debris clearing, and long-distance energy transmission, etc. On the other hand, solar energy is a clean and renewable resource; the average intensity of solar irradiation on the Earth is 1353 W/m2, and it is even higher in space. Therefore, space-based solar pumped lasers have attracted much research in recent years; most research focuses on solar pumped solid-state lasers and solar pumped fiber lasers. Both lasing principles are based on stimulated emission of rare earth ions such as Nd, Yb, and Cr. The rare earth ions absorb light only in narrow bands. This leads to inefficient absorption of the broad-band solar spectrum and increases the system heating load, which makes the solar-to-laser power conversion efficiency very low. A solar pumped semiconductor laser, by contrast, can absorb all photons with energy greater than the bandgap. Thus, solar pumped semiconductor lasers could have considerably higher efficiencies than other solar pumped lasers. Besides, solar pumped semiconductor lasers have a smaller chip volume, simpler structure and better heat dissipation; they can be mounted on a small satellite platform and composed into satellite arrays, which can greatly improve the output power of the system, and they offer flexibility. This paper summarizes the research progress of space-based solar pumped semiconductor lasers and analyses the key technologies for several application areas, including the processing of the semiconductor chip, the design of small and efficient solar condensers, and the cooling system of the lasers. We conclude that solar pumped vertical-cavity surface-emitting semiconductor lasers will have wide application prospects in space.

  8. Kernel-based machine learning techniques for infrasound signal classification

    NASA Astrophysics Data System (ADS)

    Tuma, Matthias; Igel, Christian; Mialle, Pierrick

    2014-05-01

    Infrasound monitoring is one of four remote sensing technologies continuously employed by the CTBTO Preparatory Commission. The CTBTO's infrasound network is designed to monitor the Earth for potential evidence of atmospheric or shallow underground nuclear explosions. Upon completion, it will comprise 60 infrasound array stations distributed around the globe, of which 47 were certified in January 2014. Three stages can be identified in CTBTO infrasound data processing: automated processing at the level of single array stations, automated processing at the level of the overall global network, and interactive review by human analysts. At station level, the cross correlation-based PMCC algorithm is used for initial detection of coherent wavefronts. It produces estimates for trace velocity and azimuth of incoming wavefronts, as well as other descriptive features characterizing a signal. Detected arrivals are then categorized into potentially treaty-relevant versus noise-type signals by a rule-based expert system. This corresponds to a binary classification task at the level of station processing. In addition, incoming signals may be grouped according to their travel path in the atmosphere. The present work investigates automatic classification of infrasound arrivals by kernel-based pattern recognition methods. It aims to explore the potential of state-of-the-art machine learning methods vis-a-vis the current rule-based and task-tailored expert system. To this purpose, we first address the compilation of a representative, labeled reference benchmark dataset as a prerequisite for both classifier training and evaluation. Data representation is based on features extracted by the CTBTO's PMCC algorithm. As classifiers, we employ support vector machines (SVMs) in a supervised learning setting. Different SVM kernel functions are used and adapted through different hyperparameter optimization routines. The resulting performance is compared to several baseline classifiers. 
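    As a small illustration of kernel-based classification of the sort discussed, consider a toy kernel perceptron with an RBF kernel (an illustration only, not the CTBTO pipeline and not a margin-maximizing SVM): the decision function is a kernel expansion over the training points, updated whenever a point is misclassified.

    ```python
    import numpy as np

    def rbf(x, y, gamma=1.0):
        """Gaussian (RBF) kernel between two feature vectors."""
        return np.exp(-gamma * ((x - y) ** 2).sum())

    class KernelPerceptron:
        def __init__(self, gamma=1.0, epochs=10):
            self.gamma, self.epochs = gamma, epochs

        def fit(self, X, y):                 # labels y in {-1, +1}
            self.X, self.alpha = X, np.zeros(len(X))
            for _ in range(self.epochs):
                for i, (xi, yi) in enumerate(zip(X, y)):
                    if yi * self._score(xi) <= 0:    # misclassified: update
                        self.alpha[i] += yi
            return self

        def _score(self, x):
            return sum(a * rbf(xi, x, self.gamma)
                       for a, xi in zip(self.alpha, self.X) if a != 0.0)

        def predict(self, x):
            return 1 if self._score(x) > 0 else -1
    ```

    Even an XOR-style pattern, which no linear classifier separates, is fit by the RBF kernel expansion; an SVM adds margin maximization on top of the same kernel trick.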

  9. Prediction Method of Speech Recognition Performance Based on HMM-based Speech Synthesis Technique

    NASA Astrophysics Data System (ADS)

    Terashima, Ryuta; Yoshimura, Takayoshi; Wakita, Toshihiro; Tokuda, Keiichi; Kitamura, Tadashi

    We describe an efficient method that uses an HMM-based speech synthesis technique as a test pattern generator for evaluating the word recognition rate. With this method, the recognition rate of each word and speaker can be evaluated using synthesized speech. The parameter generation technique can be formulated as an algorithm that determines the speech parameter vector sequence O by maximizing P(O|Q,λ) given the model parameters λ and the state sequence Q, under a dynamic acoustic feature constraint. We conducted recognition experiments to illustrate the validity of the method. Approximately 100 speakers were used to train the speaker-dependent models for the speech synthesis used in these experiments, and the synthetic speech was generated as test patterns for the target speech recognizer. As a result, the recognition rate of the HMM-based synthesized speech shows a good correlation with the recognition rate of the actual speech. Furthermore, we find that our method can predict the speaker recognition rate with approximately 2% error on average. Therefore the evaluation of the speaker recognition rate can be performed automatically by using the proposed method.

  10. Calculation of free fall trajectories based on numerical optimization techniques

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The development of a means of computing free-fall (nonthrusting) trajectories from one specified point in the solar system to another specified point in the solar system in a given amount of time was studied. The problem is that of solving a two-point boundary value problem for which the initial slope is unknown. Two standard methods of attack exist for solving two-point boundary value problems. The first method is known as the initial value or shooting method. The second method of attack for two-point boundary value problems is to approximate the nonlinear differential equations by an appropriate linearized set. Parts of both boundary value problem solution techniques described above are used. A complete velocity history is guessed such that the corresponding position history satisfies the given boundary conditions at the appropriate times. An iterative procedure is then followed until the last guessed velocity history and the velocity history obtained from integrating the acceleration history agree to some specified tolerance everywhere along the trajectory.
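    The initial-value ("shooting") method described above can be sketched for a toy problem: guess the unknown initial velocity, integrate the equations of motion forward, and correct the guess by secant iteration until the endpoint constraint is met. This is purely illustrative (1-D motion, Euler integration, constant gravity), not the report's solar-system trajectory code.

    ```python
    def integrate(v0, T, accel, n=1000):
        """Euler integration of x'' = accel(x, v) from x(0)=0; returns x(T)."""
        dt = T / n
        x, v = 0.0, v0
        for _ in range(n):
            x += v * dt
            v += accel(x, v) * dt
        return x

    def shoot(target, T, accel, v_lo=-100.0, v_hi=100.0, tol=1e-9):
        """Secant iteration on the unknown initial slope (velocity) so that
        the integrated endpoint x(T) hits `target`."""
        f_lo = integrate(v_lo, T, accel) - target
        f_hi = integrate(v_hi, T, accel) - target
        for _ in range(100):
            v_new = v_hi - f_hi * (v_hi - v_lo) / (f_hi - f_lo)
            f_new = integrate(v_new, T, accel) - target
            if abs(f_new) < tol:
                return v_new
            v_lo, f_lo, v_hi, f_hi = v_hi, f_hi, v_new, f_new
        return v_hi
    ```

    For a ball thrown upward under gravity that must return to x=0 at T=2 s, the recovered initial velocity is close to the analytic g*T/2 = 9.8 m/s; for this linear problem the secant step is exact after one iteration, while nonlinear trajectories need the full loop.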

  11. Fatigue loading history reconstruction based on the rainflow technique

    NASA Technical Reports Server (NTRS)

    Khosrovaneh, A. K.; Dowling, N. E.

    1990-01-01

    Methods are considered for reducing a non-random fatigue loading history to a concise description and then reconstructing a time history similar to the original. In particular, three methods of reconstruction based on a rainflow cycle counting matrix are presented. A rainflow matrix consists of the numbers of cycles at various peak and valley combinations. Two methods are based on a two-dimensional rainflow matrix, and the third on a three-dimensional rainflow matrix. Histories reconstructed by any of these methods produce a rainflow matrix identical to that of the original history, and the resulting time history is expected to produce a fatigue life similar to that of the original. The procedures described allow lengthy loading histories to be stored in compact form.
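    The rainflow counting that produces such a matrix can be sketched with the stack-based three-point rule in the spirit of ASTM E1049. This is a generic simplified counter, not the paper's matrix-reconstruction code; it returns (range, count) pairs with count 0.5 for half cycles, which a second pass could bin by peak/valley into the rainflow matrix.

    ```python
    def reversals(series):
        """Keep only the peaks and valleys (slope sign changes)."""
        out = [series[0]]
        for x in series[1:]:
            if len(out) >= 2 and (out[-1] - out[-2]) * (x - out[-1]) > 0:
                out[-1] = x          # same direction: extend the excursion
            elif x != out[-1]:
                out.append(x)
        return out

    def rainflow(series):
        """Simplified three-point rainflow count: list of (range, count)."""
        stack, cycles = [], []
        for r in reversals(series):
            stack.append(r)
            while len(stack) >= 3:
                X = abs(stack[-1] - stack[-2])
                Y = abs(stack[-2] - stack[-3])
                if X < Y:
                    break
                if len(stack) == 3:
                    cycles.append((Y, 0.5))      # range contains the start point
                    stack.pop(0)
                else:
                    cycles.append((Y, 1.0))      # full cycle: drop its two points
                    del stack[-3:-1]
        for a, b in zip(stack, stack[1:]):       # leftovers count as half cycles
            cycles.append((abs(b - a), 0.5))
        return cycles
    ```

    On the classic nine-point example history [-2, 1, -3, 5, -1, 3, -4, 4, -2] this yields one full cycle of range 4 plus six half cycles, four cycles' worth of damage in total.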

  12. Interactive classification: A technique for acquiring and maintaining knowledge bases

    SciTech Connect

    Finin, T.W.

    1986-10-01

    The practical application of knowledge-based systems, such as in expert systems, often requires the maintenance of large amounts of declarative knowledge. As a knowledge base (KB) grows in size and complexity, it becomes more difficult to maintain and extend. Even someone who is familiar with the knowledge domain, how it is represented in the KB, and the actual contents of the current KB may have severe difficulties in updating it. Even if the difficulties can be tolerated, there is a very real danger that inconsistencies and errors may be introduced into the KB through the modification. This paper describes an approach to this problem based on a tool called an interactive classifier. An interactive classifier uses the contents of the existing KB and knowledge about its representation to help the maintainer describe new KB objects. The interactive classifier will identify the appropriate taxonomic location for the newly described object and add it to the KB. The new object is allowed to be a generalization of existing KB objects, enabling the system to learn more about existing objects.
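    The classifier's core operation, finding the most specific taxonomic location for a newly described object, can be sketched with feature-set inclusion standing in for the KB's real representation language (a toy model, not the paper's system):

    ```python
    class KB:
        """Toy knowledge base: each concept is a set of defining features,
        and concept A subsumes B when A's features are a subset of B's."""
        def __init__(self):
            self.concepts = {}

        def add(self, name, features):
            self.concepts[name] = frozenset(features)

        def classify(self, features):
            """Return the most specific existing concepts that subsume a
            newly described object -- its taxonomic location."""
            feats = frozenset(features)
            subsumers = [n for n, f in self.concepts.items() if f <= feats]
            # keep only subsumers that no other subsumer strictly refines
            return [n for n in subsumers
                    if not any(self.concepts[n] < self.concepts[m]
                               for m in subsumers)]
    ```

    A new object described by {alive, barks, small} would be placed under "dog" in a taxonomy thing > animal > dog.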

  13. [A Terahertz Spectral Database Based on Browser/Server Technique].

    PubMed

    Zhang, Zhuo-yong; Song, Yue

    2015-09-01

    With the solution of key scientific and technical problems and the development of instrumentation, terahertz technology has attracted increasing attention in a variety of fields. Owing to its unique characteristics, terahertz technology shows a broad future in fast, non-damaging detection, as well as in many other fields. Combined with complementary methods, terahertz technology can be used to address difficult practical problems that could not be solved before. One of the critical points for further development of practical terahertz detection methods is a good and reliable terahertz spectral database. We recently developed a browser/server (B/S)-based terahertz spectral database, designing its main structure and main functions to fulfill practical requirements. The database now includes more than 240 items, with spectral information collected from three sources: (1) collection and citation from other terahertz spectral databases abroad; (2) published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the terahertz spectral database developed in our laboratory. One of the key functions of this THz database is the calculation of optical parameters: absorption coefficient, refractive index, and other quantities can be calculated from input THz time-domain spectra. The other main functions and searching methods of the browser/server-based terahertz spectral database are also discussed. The database search system provides registered users with convenient functions including user registration, inquiry, display of spectral figures and molecular structures, and spectral matching, and offers an on-line searching function.
    Registered users can compare the input THz spectrum with the spectra of the database, according to
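    The optical-parameter calculation offered by the database can be sketched for the standard thick-sample, no-echo transmission model (an assumption; the abstract does not give the formulas the database actually uses):

    ```python
    import numpy as np

    C = 299792458.0  # speed of light, m/s

    def thz_params(freq, T, d):
        """Refractive index and absorption coefficient from a complex
        transmission spectrum T = E_sample/E_reference of a slab of
        thickness d (thick-sample, no-echo approximation)."""
        phase = -np.unwrap(np.angle(T))                 # phase delay vs. reference
        n = 1.0 + C * phase / (2 * np.pi * freq * d)    # index from phase delay
        # amplitude model: |T| = 4n/(n+1)^2 * exp(-alpha*d/2)
        alpha = -(2.0 / d) * np.log(np.abs(T) * (n + 1) ** 2 / (4 * n))
        return n, alpha
    ```

    Feeding in a synthetic spectrum generated with known n and alpha recovers those values, which is a convenient self-check for this kind of extraction code.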

  14. Hybrid OPC technique using model based and rule-based flows

    NASA Astrophysics Data System (ADS)

    Harb, Mohammed; Abdelghany, Hesham

    2013-04-01

    Transferring an electronic circuit from design to silicon involves many stages. As technology evolves, design shapes move closer to each other, and because the lithography wavelength has remained at 193 nm, optical interference is a problem that must be accounted for with Optical Proximity Correction (OPC) algorithms. In earlier technology nodes, simple OPC was applied to the design based on spatial rules. This is no longer sufficient in recent nodes, where aggressive scaling of the designs introduces stronger optical interference. Model-based OPC produces accurate results, but at the cost of increased run time. Electronic Design Automation (EDA) companies compete to offer tools that provide both accuracy and run-time efficiency. In this paper, we show that optimal use of some of these tools can ensure OPC accuracy with better run time. The hybrid OPC technique uses classic rule-based OPC in a modern fashion that considers optical parameters instead of spatial metrics only. Combined with conventional model-based OPC, the whole flow shows better results in terms of accuracy and run time.
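    The rule-based half of such a hybrid flow amounts to a table lookup of edge bias by geometric bins; a toy sketch follows, with all bin boundaries and bias values invented for illustration (real rule decks are calibrated against the process model):

    ```python
    # Hypothetical rule deck: edge bias (nm) looked up by line-width and
    # spacing bins; the first matching rule wins.
    RULES = [  # (max_width_nm, max_space_nm, bias_nm)
        (60.0, 80.0, 4.0),                  # narrow line, tight space
        (60.0, float("inf"), 3.0),          # narrow line, loose space
        (float("inf"), 80.0, 2.0),          # wide line, tight space
        (float("inf"), float("inf"), 1.0),  # everything else
    ]

    def rule_bias(width, space):
        """Return the edge bias for a feature of the given width/space."""
        for max_w, max_s, bias in RULES:
            if width <= max_w and space <= max_s:
                return bias
        return 0.0
    ```

    A hybrid flow would apply these cheap biases first, then hand the pre-corrected layout to model-based OPC for fine correction.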

  15. Antenna pointing compensation based on precision optical measurement techniques

    NASA Technical Reports Server (NTRS)

    Schumacher, L. L.; Vivian, H. C.

    1988-01-01

    The pointing control loops of the Deep Space Network 70 meter antennas extend only to the Intermediate Reference Structure (IRS). Thus, distortion of the structure forward of the IRS due to unpredictable environmental loads can result in uncompensated boresight shifts which degrade blind pointing accuracy. A system is described which can provide real time bias commands to the pointing control system to compensate for environmental effects on blind pointing performance. The bias commands are computed in real time based on optical ranging measurements of the structure from the IRS to a number of selected points on the primary and secondary reflectors.

  16. Wideband electromagnetic scattering program. Fourier-based radar imaging techniques

    NASA Astrophysics Data System (ADS)

    Chan, B. L.; Young, J. D.; Rudduck, R. C.

    1993-09-01

    This report describes the implementation of Fourier based radar imaging algorithms in a computer program. In particular, the algorithms are derived for wide bandwidth and for specific geometries. These geometries are often measured by radar cross section measurement systems such as compact ranges and near field linear synthetic aperture radar systems. The limitations of different implementations of the algorithms are presented. Imaging results from radar measurements are also presented for an F-4 fighter aircraft, an M35 truck (1/16 scale model), and a forest.
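    A one-dimensional slice of these Fourier imaging algorithms -- range profiling by inverse FFT of a stepped-frequency sweep -- can be sketched as follows (start frequency, step size, and scatterer position are illustrative assumptions):

    ```python
    import numpy as np

    C = 3e8                                  # propagation speed, m/s
    N = 256
    df = 10e6                                # frequency step, Hz
    freqs = 2e9 + np.arange(N) * df          # stepped-frequency sweep
    r0 = 5.0                                 # scatterer down-range, m (assumed)

    S = np.exp(-1j * 4 * np.pi * freqs * r0 / C)   # round-trip phase history
    profile = np.abs(np.fft.ifft(S))               # range profile via inverse FFT
    dr = C / (2 * N * df)                          # range-bin spacing = c/(2B)
    peak = np.argmax(profile) * dr                 # estimated scatterer range
    ```

    The peak of the profile lands within one range bin of the true 5 m scatterer; the 2-D imaging case in the report extends this with an aspect-angle dimension.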

  17. Immobilization, stabilization and patterning techniques for enzyme based sensor systems.

    SciTech Connect

    Flounders, A.W.; Carichner, S.C.; Singh, A.K.; Volponi, J.V.; Schoeniger, J.S.; Wally, K.

    1997-01-01

    Sandia National Laboratories has recently opened the Chemical and Radiation Detection Laboratory (CRDL) in Livermore, CA, to address the detection needs of a variety of government agencies (e.g., Department of Energy, Environmental Protection Agency, Department of Agriculture) as well as to provide a fertile environment for the cooperative development of new industrial technologies. This laboratory consolidates a variety of existing chemical and radiation detection efforts and enables Sandia to expand into the novel area of biochemically based sensors. One aspect of this biosensor effort is further development and optimization of enzyme-modified field effect transistors (EnFETs). Recent work has focused upon covalent attachment of enzymes to silicon dioxide and silicon nitride surfaces for EnFET fabrication. Methods to pattern immobilized proteins, a critical component for development of array-based sensor systems, are also under investigation. Novel enzyme stabilization procedures are key to patterning immobilized enzyme layers while maintaining enzyme activity. Results related to maximized enzyme loading, optimized enzyme activity, and fluorescent imaging of patterned surfaces will be presented.

  18. A survey of GPU-based medical image computing techniques.

    PubMed

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming; Wang, Defeng

    2012-09-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly enhancing performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  19. A survey of GPU-based medical image computing techniques

    PubMed Central

    Shi, Lin; Liu, Wen; Zhang, Heye; Xie, Yongming

    2012-01-01

    Medical imaging currently plays a crucial role throughout clinical applications, from medical scientific research to diagnostics and treatment planning. However, medical imaging procedures are often computationally demanding due to the large three-dimensional (3D) medical datasets to process in practical clinical applications. With the rapidly enhancing performance of graphics processors, improved programming support, and an excellent price-to-performance ratio, the graphics processing unit (GPU) has emerged as a competitive parallel computing platform for computationally expensive and demanding tasks in a wide range of medical image applications. The major purpose of this survey is to provide a comprehensive reference source for newcomers and researchers involved in GPU-based medical image processing. Within this survey, the continuous advancement of GPU computing is reviewed and the existing traditional applications in three areas of medical image processing, namely, segmentation, registration and visualization, are surveyed. The potential advantages and associated challenges of current GPU-based medical imaging are also discussed to inspire future applications in medicine. PMID:23256080

  20. Image processing technique based on image understanding architecture

    NASA Astrophysics Data System (ADS)

    Kuvychko, Igor

    2000-12-01

    The effectiveness of image applications depends directly on their ability to resolve ambiguity and uncertainty in real images. That requires tight integration of low-level image processing with high-level knowledge-based reasoning, which is the core of the image understanding problem. This article presents a generic computational framework for the solution of the image understanding problem -- the Spatial Turing Machine. Instead of a tape of symbols, it works with hierarchical networks dually represented as discrete and continuous structures. Dual representation provides natural transformation of continuous image information into discrete structures, making it available for analysis. Such structures are data and algorithms at the same time, and are able to perform the graph and diagrammatic operations that are the basis of intelligence. They can create derivative structures that play the role of context, or 'measurement device,' giving the ability to analyze and to run top-down algorithms. Symbols naturally emerge there, and symbolic operations work in combination with new simplified methods of computational intelligence. That makes images and scenes self-describing and provides flexible ways of resolving uncertainty. Classification of images truly invariant to any transformation could be done by matching their derivative structures. The proposed architecture does not require supercomputers, opening the way to new image technologies.

  1. The efficacy and toxicity of individualized intensity-modulated radiotherapy based on the tumor extension patterns of nasopharyngeal carcinoma

    PubMed Central

    Zhou, Guan-Qun; Guo, Rui; Zhang, Fan; Zhang, Yuan; Xu, Lin; Zhang, Lu-Lu; Lin, Ai-Hua; Ma, Jun; Sun, Ying

    2016-01-01

    Background To evaluate the efficacy and toxicity of intensity-modulated radiotherapy (IMRT) using individualized clinical target volumes (CTVs) based on the loco-regional extension patterns of nasopharyngeal carcinoma (NPC). Methods From December 2009 to February 2012, 220 patients with histologically proven, non-disseminated NPC were prospectively treated with IMRT according to an individualized delineation protocol. CTV1 encompassed the gross tumor volume, the entire nasopharyngeal mucosa, and structures within the pharyngobasilar fascia with a margin. CTV2 encompassed bilateral high-risk anatomic sites and downstream anatomic sites adjacent to the primary tumor, the bilateral retropharyngeal regions, and levels II, III and Va; prophylactic irradiation was given to one or two levels beyond clinically involved lymph nodes. Clinical outcomes and toxicities were evaluated. Results Median follow-up was 50.8 (range, 1.3–68.0) months; four-year local relapse-free, regional relapse-free, distant metastasis-free, disease-free and overall survival rates were 94.7%, 97.0%, 91.7%, 87.2% and 91.9%, respectively. Acute severe (≥ grade 3) mucositis, dermatitis and xerostomia were observed in 27.6%, 3.6% and zero patients, respectively. At 1 year, xerostomia was mild, with frequencies of Grade 0, 1, 2 and 3 xerostomia of 27.9%, 63.3%, 8.3% and 0.5%, respectively. Conclusions IMRT using individualized CTVs provided high rates of local and regional control and a favorable toxicity profile in NPC. The individualized CTV delineation strategy is promising in that it may effectively avoid unnecessary or missed irradiation, and it deserves optimization to define more precise individualized CTVs. PMID:26980744

  2. Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.

    PubMed

    Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José

    2016-04-01

    The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators (e.g. health appliances, routers) in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators (e.g. smartphones, tablets), the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), IHE profiles related with these applications and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value. PMID:26883877

  3. Pseudorandom noise code-based technique for thin-cloud discrimination with CO2 and O2 absorption measurements

    NASA Astrophysics Data System (ADS)

    Campbell, Joel F.; Prasad, Narasimha S.; Flood, Michael A.

    2011-12-01

    NASA Langley Research Center is working on a continuous wave (cw) laser-based remote sensing scheme for the detection of CO2 and O2 from space-based platforms, suitable for the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission. ASCENDS is a future space-based mission to determine the global distribution of sources and sinks of atmospheric carbon dioxide (CO2). A unique, multifrequency, intensity modulated cw laser absorption spectrometer operating at 1.57 μm for CO2 sensing has been developed. Effective aerosol and cloud discrimination techniques are being investigated in order to determine concentration values with accuracies less than 0.3%. In this paper, we discuss the demonstration of a pseudonoise code-based technique for cloud and aerosol discrimination applications. The possibility of using maximum length sequences for range and absorption measurements is investigated. A simple model for accomplishing this objective is formulated. Proof-of-concept experiments carried out using a sonar-based LIDAR simulator that was built using simple audio hardware provided promising results for extension into optical wavelengths.
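    The maximum-length-sequence ranging idea can be sketched end to end: generate the code with a linear-feedback shift register, delay and corrupt it, and recover the delay by circular correlation. The 7-bit register with taps (7, 6), the 37-chip delay, and the noise level are all illustrative assumptions, not the paper's parameters:

    ```python
    import numpy as np

    def mls(taps, nbits):
        """Maximum-length sequence from a Fibonacci LFSR with the given
        feedback taps (1-indexed)."""
        reg = [1] * nbits
        out = []
        for _ in range(2 ** nbits - 1):
            out.append(reg[-1])
            fb = 0
            for t in taps:
                fb ^= reg[t - 1]
            reg = [fb] + reg[:-1]
        return np.array(out)

    code = 2 * mls((7, 6), 7) - 1           # 127-chip sequence, mapped to +/-1
    delay = 37                              # simulated round-trip delay, chips
    rng = np.random.default_rng(0)
    echo = np.roll(code, delay) + 0.5 * rng.normal(size=code.size)
    # circular cross-correlation against the reference code finds the delay
    corr = np.array([np.dot(echo, np.roll(code, k)) for k in range(code.size)])
    est = int(np.argmax(corr))
    ```

    The two-valued autocorrelation of an m-sequence (peak 127, off-peak -1) is what makes the delay peak stand out even with additive noise.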

  4. Pseudorandom Noise Code-Based Technique for Thin Cloud Discrimination with CO2 and O2 Absorption Measurements

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Prasad, Narasimha S.; Flood, Michael A.

    2011-01-01

    NASA Langley Research Center is working on a continuous wave (CW) laser based remote sensing scheme for the detection of CO2 and O2 from space based platforms suitable for ACTIVE SENSING OF CO2 EMISSIONS OVER NIGHTS, DAYS, AND SEASONS (ASCENDS) mission. ASCENDS is a future space-based mission to determine the global distribution of sources and sinks of atmospheric carbon dioxide (CO2). A unique, multi-frequency, intensity modulated CW (IMCW) laser absorption spectrometer (LAS) operating at 1.57 micron for CO2 sensing has been developed. Effective aerosol and cloud discrimination techniques are being investigated in order to determine concentration values with accuracies less than 0.3%. In this paper, we discuss the demonstration of a pseudo noise (PN) code based technique for cloud and aerosol discrimination applications. The possibility of using maximum length (ML)-sequences for range and absorption measurements is investigated. A simple model for accomplishing this objective is formulated, Proof-of-concept experiments carried out using SONAR based LIDAR simulator that was built using simple audio hardware provided promising results for extension into optical wavelengths.

  5. Region Duplication Forgery Detection Technique Based on SURF and HAC

    PubMed Central

    Mishra, Parul; Sharma, Sanjeev; Patel, Ravindra

    2013-01-01

    Region duplication forgery detection is a special type of forgery detection approach and a widely studied research topic in digital image forensics. In copy-move forgery, a specific area is copied and then pasted into another region of the image. Due to the availability of sophisticated image processing tools, it becomes very hard to detect forgery with the naked eye; the forged region often presents no visual clues. To make the tampering more robust, various transformations like scaling, rotation, illumination changes, JPEG compression, noise addition, gamma correction, and blurring are applied, so a method is needed that performs efficiently in the presence of all such attacks. This paper presents a detection method based on speeded up robust features (SURF) and hierarchical agglomerative clustering (HAC). SURF detects the keypoints and their corresponding features. From these sets of keypoints, HAC groups the matched keypoints, revealing the copied and pasted regions. PMID:24311972
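    The HAC grouping stage can be sketched on synthetic matched-keypoint coordinates (SURF detection itself is omitted; single linkage and the 10-pixel distance cutoff are assumptions made for illustration):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    def group_matches(points, cutoff):
        """Single-linkage hierarchical agglomerative clustering of matched
        keypoint locations; each cluster hints at one duplicated region."""
        Z = linkage(points, method="single")
        return fcluster(Z, t=cutoff, criterion="distance")

    pts = np.array([[10.0, 10.0], [12.0, 11.0], [11.0, 13.0],   # region 1
                    [80.0, 90.0], [82.0, 92.0], [81.0, 88.0]])  # region 2
    labels = group_matches(pts, cutoff=10.0)
    ```

    Two well-separated clusters of matched keypoints come back with two distinct labels, marking the copied and pasted regions.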

  6. Symbolic document image compression based on pattern matching techniques

    NASA Astrophysics Data System (ADS)

    Shiah, Chwan-Yi; Yen, Yun-Sheng

    2011-10-01

    In this paper, a novel compression algorithm for Chinese document images is proposed. Initially, documents are segmented into readable components such as characters and punctuation marks. Similar patterns within the text are found by shape context matching and grouped to form a set of prototype symbols. Text redundancies can be removed by replacing repeated symbols by their corresponding prototype symbols. To keep the compression visually lossless, we use a multi-stage symbol clustering procedure to group similar symbols and to ensure that there is no visible error in the decompressed image. In the encoding phase, the resulting data streams are encoded by adaptive arithmetic coding. Our results show that the average compression ratio is better than the international standard JBIG2 and the compressed form of a document image is suitable for a content-based keyword searching operation.
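    The prototype-substitution core of such a codec can be sketched with a Hamming-distance matcher standing in for the paper's shape-context matching and multi-stage clustering:

    ```python
    import numpy as np

    def compress_glyphs(glyphs, tol):
        """Replace each glyph bitmap with the index of a matching prototype
        (match = Hamming distance <= tol), creating prototypes as needed.
        Returns (prototype list, per-glyph prototype indices)."""
        prototypes, indices = [], []
        for g in glyphs:
            for i, p in enumerate(prototypes):
                if np.count_nonzero(g != p) <= tol:
                    indices.append(i)      # reuse an existing prototype
                    break
            else:
                prototypes.append(g)       # first of its kind
                indices.append(len(prototypes) - 1)
        return prototypes, indices
    ```

    Repeated symbols collapse to one stored bitmap plus a stream of small indices, which is then entropy-coded (adaptive arithmetic coding in the paper).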

  7. Developing Visualization Techniques for Semantics-based Information Networks

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Hall, David R.

    2003-01-01

    Information systems incorporating complex network-structured information spaces with a semantic underpinning - such as hypermedia networks, semantic networks, topic maps, and concept maps - are being deployed to solve some of NASA's critical information management problems. This paper describes some of the human interaction and navigation problems associated with complex semantic information spaces and describes a set of new visual interface approaches to address these problems. A key strategy is to leverage the semantic knowledge represented within these information spaces to construct abstractions and views that will be meaningful to the human user. Human-computer interaction methodologies will guide the development and evaluation of these approaches, which will benefit deployed NASA systems and also apply to information systems based on the emerging Semantic Web.

  8. Techniques and Issues in Agent-Based Modeling Validation

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABMs) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABMs differ substantially in model development, usage and validation. ABMs are inherently easier to build than classical simulations, but more difficult to describe formally, since they are closer to human cognition. Using multi-agent models to study complex systems has attracted criticism because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

  9. Constellation choosing based on multi-dimensional sphere packing technique

    NASA Astrophysics Data System (ADS)

    Jinghe, Li; Guijun, Hu; Kashero, Enock; Zhaoxi, Li

    2016-09-01

    In this paper we address the selection of sphere-packing lattice points for use as constellation points in high-dimensional modulation. We propose a new point-selection method based on threshold theory, which in theory improves the transmission performance of high-dimensional modulation systems. We find that the BER of a 4D modulation signal is reduced when the threshold-based point-selection method is used: compared with random and distant point-selection methods at a BER of 10^-3, the required SNR is reduced by about 2 dB. At a BER of 10^-3, an 8D modulation signal with threshold-selected points gains about 3 dB in SNR, and a 16D modulation signal gains about 3.5 dB.
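    A minimal sketch of threshold-based point selection, using the integer lattice Z^4 as a stand-in for the paper's sphere-packing lattices (the energy threshold and the lowest-energy-first ordering are assumptions):

    ```python
    import numpy as np
    from itertools import product

    def pick_constellation(dim, M, radius):
        """Keep lattice points whose energy is under a threshold (radius),
        lowest energy first, until M constellation points are selected."""
        cand = [np.array(p) for p in product(range(-2, 3), repeat=dim)
                if sum(c * c for c in p) <= radius ** 2]
        cand.sort(key=lambda p: int(p @ p))    # lowest transmit energy first
        return np.stack(cand[:M])

    pts = pick_constellation(4, 16, 2.0)       # 16-point 4D constellation
    dmin = min(float(np.linalg.norm(a - b))
               for i, a in enumerate(pts) for b in pts[i + 1:])
    ```

    The minimum distance of the selected set (here 1.0, the lattice spacing) is what governs the asymptotic BER; a denser packing such as D4 would improve the energy-per-distance trade-off.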

  10. Techniques to derive geometries for image-based Eulerian computations

    PubMed Central

    Dillard, Seth; Buchholz, James; Vigmostad, Sarah; Kim, Hyunggun; Udaykumar, H.S.

    2014-01-01

    Purpose The performance of three frequently used level set-based segmentation methods is examined for the purpose of defining features and boundary conditions for image-based Eulerian fluid and solid mechanics models. The focus of the evaluation is to identify an approach that produces the best geometric representation from a computational fluid/solid modeling point of view. In particular, extraction of geometries from a wide variety of imaging modalities and noise intensities, to supply to an immersed boundary approach, is targeted. Design/methodology/approach Two- and three-dimensional images, acquired from optical, X-ray CT, and ultrasound imaging modalities, are segmented with active contours, k-means, and adaptive clustering methods. Segmentation contours are converted to level sets and smoothed as necessary for use in fluid/solid simulations. Results produced by the three approaches are compared visually and with contrast ratio, signal-to-noise ratio, and contrast-to-noise ratio measures. Findings While the active contours method possesses built-in smoothing and regularization and produces continuous contours, the clustering methods (k-means and adaptive clustering) produce discrete (pixelated) contours that require smoothing using speckle-reducing anisotropic diffusion (SRAD). Thus, for images with high contrast and low to moderate noise, active contours are generally preferable. However, adaptive clustering is found to be far superior to the other two methods for images possessing high levels of noise and global intensity variations, due to its more sophisticated use of local pixel/voxel intensity statistics. Originality/value It is often difficult to know a priori which segmentation will perform best for a given image type, particularly when geometric modeling is the ultimate goal. This work offers insight to the algorithm selection process, as well as outlining a practical framework for generating useful geometric surfaces in an Eulerian setting. PMID
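    Of the segmentation families compared above, k-means is the simplest to sketch. A one-dimensional, intensity-only version follows (the SRAD smoothing stage and real image data are omitted; the two-class synthetic intensities are assumptions):

    ```python
    import numpy as np

    def kmeans_segment(img, k, iters=50):
        """Label pixels by k-means clustering on intensity alone."""
        centers = np.linspace(img.min(), img.max(), k)   # spread initial centers
        for _ in range(iters):
            labels = np.argmin(np.abs(img[..., None] - centers), axis=-1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = img[labels == j].mean() # recenter each cluster
        return labels

    rng = np.random.default_rng(1)
    img = np.concatenate([rng.normal(0.2, 0.02, 500),    # background class
                          rng.normal(0.8, 0.02, 500)])   # object class
    labels = kmeans_segment(img, 2)
    ```

    The resulting discrete label map is exactly the kind of pixelated contour the paper then smooths before handing it to the immersed-boundary solver.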

  11. Ultrasound-based technique for intrathoracic surgical guidance

    NASA Astrophysics Data System (ADS)

    Huang, Xishi; Hill, Nicholas A.; Peters, Terry M.

    2005-04-01

    Image-guided procedures within the thoracic cavity require accurate registration of a pre-operative virtual model to the patient. Currently, surface landmarks are used for thoracic cavity registration; however, this approach is unreliable due to skin movement relative to the ribs. An alternative method for providing surgeons with image feedback in the operating room is to integrate images acquired during surgery with images acquired pre-operatively. This integration process is required to be automatic, fast, accurate and robust; however inter-modal image registration is difficult due to the lack of a direct relationship between the intensities of the two image sets. To address this problem, Computed Tomography (CT) was used to acquire pre-operative images and Ultrasound (US) was used to acquire peri-operative images. Since bone has a high electron density and is highly echogenic, the rib cage is visualized as a bright white boundary in both datasets. The proposed approach utilizes the ribs as the basis for an intensity-based registration method -- mutual information. We validated this approach using a thorax phantom. Validation results demonstrate that this approach is accurate and shows little variation between operators. The fiducial registration error, the registration error between the US and CT images, was < 1.5mm. We propose this registration method as a basis for precise tracking of minimally invasive thoracic procedures. This method will permit the planning and guidance of image-guided minimally invasive procedures for the lungs, as well as for both catheter-based and direct trans-mural interventions within the beating heart.
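    The mutual-information metric at the heart of the US/CT registration can be sketched from a joint intensity histogram (the 32-bin choice and the test images are illustrative):

    ```python
    import numpy as np

    def mutual_information(a, b, bins=32):
        """Mutual information (nats) between two images, estimated from
        their joint intensity histogram."""
        h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        p = h / h.sum()                       # joint distribution
        px = p.sum(axis=1, keepdims=True)     # marginal of image a
        py = p.sum(axis=0, keepdims=True)     # marginal of image b
        nz = p > 0
        return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())
    ```

    A registration loop would transform one image and maximize this score; it peaks when corresponding structures (here, the echogenic ribs) overlap.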

  12. Office-based rapid prototyping in orthopedic surgery: a novel planning technique and review of the literature.

    PubMed

    Schwartz, Adam; Money, Kyle; Spangehl, Mark; Hattrup, Steven; Claridge, Richard J; Beauchamp, Christopher

    2015-01-01

    Three-dimensional (3-D) prototyping, based on high-quality axial images, may allow for more accurate and extensive preoperative planning and may even allow surgeons to perform procedures as part of preoperative preparation. In this article, we describe 7 cases of complex orthopedic disorders that were surgically treated after preoperative planning that was based on both industry-provided models and use of our in-house 3-D printer. Commercially available 3-D printers allow for rapid in-office production of a high-quality realistic prototype at relatively low per-case cost. Using this technique, surgeons can assess the accuracy of their original surgical plans and, if necessary, correct them preoperatively. The ability to "perform surgery preoperatively" adds another element to surgeons' perceptions of the potential issues that may arise. PMID:25566552

  13. Optical performance monitoring technique using software-based synchronous amplitude histograms.

    PubMed

    Choi, H G; Chang, J H; Kim, Hoon; Chung, Y C

    2014-10-01

    We propose and demonstrate a simple technique to monitor both the optical signal-to-noise ratio (OSNR) and chromatic dispersion (CD) by using the software-based synchronous amplitude histogram (SAH) analysis. We exploit the software-based synchronization technique to construct SAHs from the asynchronously sampled intensities of the signal. The use of SAHs facilitates the accurate extraction of the monitoring parameters at the center of the symbol. Thus, unlike in the case of using the technique based on the asynchronous amplitude histogram (AAH), this technique is not affected by the transient characteristics of the modulated signals. The performance of the proposed monitoring technique is evaluated experimentally by using 10-Gbaud quadrature phase-shift keying (QPSK) and quadrature amplitude modulation (QAM) signals over wide ranges of OSNR and CD. We also evaluate the robustness of the proposed technique to the signal's transient characteristics. PMID:25321978
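    The software-synchronization idea -- building the amplitude histogram only at the symbol centers -- can be sketched for a two-level test signal. The pulse shape, samples-per-symbol count, and the variance criterion for locating the symbol center are illustrative assumptions, not the paper's algorithm:

    ```python
    import numpy as np

    sps = 7                                   # samples per symbol (assumed)
    rng = np.random.default_rng(2)
    sym = rng.choice([-1.0, 1.0], size=400)   # random two-level symbols
    pulse = np.sin(np.pi * (np.arange(sps) + 0.5) / sps)  # symbol shaping
    wave = (sym[:, None] * pulse[None, :]).ravel()
    wave += 0.05 * rng.normal(size=wave.size)             # additive noise

    phases = wave.reshape(-1, sps)            # one column per sampling phase
    best = int(np.argmax(phases.var(axis=0))) # symbol center: widest eye
    # the synchronous amplitude histogram, built only at the chosen phase
    hist, _ = np.histogram(phases[:, best], bins=30)
    ```

    Sampling only at the recovered center phase is what makes the histogram insensitive to the transitions between symbols, unlike an asynchronous histogram.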

  14. Study of hydrogen in coals, polymers, oxides, and muscle water by nuclear magnetic resonance; extension of solid-state high-resolution techniques. [Hydrogen molybdenum bronze

    SciTech Connect

    Ryan, L.M.

    1981-10-01

    Nuclear magnetic resonance (NMR) spectroscopy has been an important analytical and physical research tool for several decades. One area of NMR which has undergone considerable development in recent years is high resolution NMR of solids. In particular, high resolution solid state ¹³C NMR spectra exhibiting features similar to those observed in liquids are currently achievable using sophisticated pulse techniques. The work described in this thesis develops analogous methods for high resolution ¹H NMR of rigid solids. Applications include characterization of hydrogen aromaticities in fossil fuels, and studies of hydrogen in oxides and bound water in muscle.

  15. A content-based image retrieval method for optical colonoscopy images based on image recognition techniques

    NASA Astrophysics Data System (ADS)

    Nosato, Hirokazu; Sakanashi, Hidenori; Takahashi, Eiichi; Murakawa, Masahiro

    2015-03-01

    This paper proposes a content-based image retrieval method for optical colonoscopy images that can find images similar to ones being diagnosed. Optical colonoscopy is a method of direct observation for colons and rectums to diagnose bowel diseases. It is the most common procedure for screening, surveillance and treatment. However, diagnostic accuracy for intractable inflammatory bowel diseases, such as ulcerative colitis (UC), is highly dependent on the experience and knowledge of the medical doctor, because there is considerable variety in the appearance of inflamed colonic mucosa in UC. In order to solve this issue, this paper proposes a content-based image retrieval method based on image recognition techniques. The proposed retrieval method can find similar images from a database of images diagnosed as UC, and can potentially furnish the medical records associated with the retrieved images to assist the UC diagnosis. Within the proposed method, color histogram features and higher order local auto-correlation (HLAC) features are adopted to represent the color information and geometrical information of optical colonoscopy images, respectively. Moreover, considering various characteristics of UC colonoscopy images, such as vascular patterns and the roughness of the colonic mucosa, we also propose an image enhancement method to highlight the appearances of colonic mucosa in UC. In an experiment using 161 UC images from 32 patients, we demonstrate that our method improves the accuracy of retrieving similar UC images.
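    The color histogram features the abstract mentions can be illustrated with a minimal sketch. The quantized RGB histogram below is a generic construction, not the paper's exact parameterization; the bin count, the toy pixel data, and the L1 comparison are illustrative assumptions, and the HLAC features are omitted entirely:

    ```python
    def color_histogram(pixels, bins=4):
        """Quantized RGB color histogram: each 0-255 channel value is
        mapped to one of `bins` levels, giving a bins^3-dim feature."""
        hist = [0] * (bins ** 3)
        for r, g, b in pixels:
            idx = ((r * bins) // 256) * bins * bins \
                + ((g * bins) // 256) * bins \
                + ((b * bins) // 256)
            hist[idx] += 1
        total = float(len(pixels))
        return [c / total for c in hist]  # normalize so images of different sizes compare

    # Two tiny toy "images": reddish mucosa vs. pale mucosa
    reddish = [(200, 80, 80)] * 3 + [(180, 90, 90)]
    pale = [(230, 210, 200)] * 4
    h1, h2 = color_histogram(reddish), color_histogram(pale)
    # L1 distance between normalized histograms as a simple dissimilarity score
    print(sum(abs(a - b) for a, b in zip(h1, h2)))  # → 2.0 (no shared bins)
    ```

    In a retrieval setting, each database image would be reduced to such a feature vector once, and queries ranked by histogram distance.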

  16. BEaST: brain extraction based on nonlocal segmentation technique.

    PubMed

    Eskildsen, Simon F; Coupé, Pierrick; Fonov, Vladimir; Manjón, José V; Leung, Kelvin K; Guizard, Nicolas; Wassef, Shafik N; Østergaard, Lasse Riis; Collins, D Louis

    2012-02-01

    Brain extraction is an important step in the analysis of brain images. The variability in brain morphology and the difference in intensity characteristics due to imaging sequences make the development of a general purpose brain extraction algorithm challenging. To address this issue, we propose a new robust method (BEaST) dedicated to producing consistent and accurate brain extraction. This method is based on nonlocal segmentation embedded in a multi-resolution framework. A library of 80 priors is semi-automatically constructed from the NIH-sponsored MRI study of normal brain development, the International Consortium for Brain Mapping, and the Alzheimer's Disease Neuroimaging Initiative databases. In testing, a mean Dice similarity coefficient of 0.9834±0.0053 was obtained when performing leave-one-out cross validation selecting only 20 priors from the library. Validation using the online Segmentation Validation Engine resulted in a top ranking position with a mean Dice coefficient of 0.9781±0.0047. Robustness of BEaST is demonstrated on all baseline ADNI data, resulting in a very low failure rate. The segmentation accuracy of the method is better than two widely used publicly available methods and recent state-of-the-art hybrid approaches. BEaST provides results comparable to a recent label fusion approach, while being 40 times faster and requiring a much smaller library of priors. PMID:21945694
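    The Dice similarity coefficient used to validate BEaST is a standard overlap measure, 2·|A∩B| / (|A|+|B|). A minimal sketch over voxel-coordinate sets (the toy 1-D masks are illustrative, not the paper's data):

    ```python
    def dice_coefficient(mask_a, mask_b):
        """Dice similarity coefficient between two binary masks given as
        sets of voxel coordinates: 2*|A∩B| / (|A|+|B|). 1.0 = perfect overlap."""
        inter = len(mask_a & mask_b)
        total = len(mask_a) + len(mask_b)
        return 2.0 * inter / total if total else 1.0

    # Toy 1-D "masks": a reference brain mask vs. an automatic extraction
    reference = {1, 2, 3, 4, 5, 6}
    automatic = {2, 3, 4, 5, 6, 7}
    print(round(dice_coefficient(reference, automatic), 4))  # → 0.8333
    ```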

  17. Doubly robust multiple imputation using kernel-based techniques.

    PubMed

    Hsu, Chiu-Hsieh; He, Yulei; Li, Yisheng; Long, Qi; Friese, Randall

    2016-05-01

    We consider the problem of estimating the marginal mean of an incompletely observed variable and develop a multiple imputation approach. Using fully observed predictors, we first establish two working models: one predicts the missing outcome variable, and the other predicts the probability of missingness. The predictive scores from the two models are used to measure the similarity between the incomplete and observed cases. Based on the predictive scores, we construct a set of kernel weights for the observed cases, with higher weights indicating more similarity. Missing data are imputed by sampling from the observed cases with probability proportional to their kernel weights. The proposed approach can produce reasonable estimates for the marginal mean and has a double robustness property, provided that one of the two working models is correctly specified. It also shows some robustness against misspecification of both models. We demonstrate these patterns in a simulation study. In a real-data example, we analyze the total helicopter response time from injury in the Arizona emergency medical service data. PMID:26647734
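    The kernel-weighted donor sampling described above can be sketched as follows. This is a simplification, not the authors' estimator: the paper builds predictive scores from two working models (outcome and missingness) and uses them jointly, whereas this sketch collapses them into a single score; the Gaussian kernel, the bandwidth, and all data values are illustrative assumptions:

    ```python
    import math
    import random

    def kernel_weights(score_missing, scores_observed, bandwidth):
        """Gaussian kernel weights: observed cases whose predictive scores
        are closer to the missing case's score receive larger weights."""
        return [math.exp(-((s - score_missing) / bandwidth) ** 2 / 2.0)
                for s in scores_observed]

    def impute_once(score_missing, scores_observed, y_observed, rng, bandwidth=0.2):
        """Draw one imputed value from the observed outcomes with probability
        proportional to the kernel weights (hot-deck style donor sampling)."""
        w = kernel_weights(score_missing, scores_observed, bandwidth)
        return rng.choices(y_observed, weights=w, k=1)[0]

    rng = random.Random(0)
    scores_obs = [0.1, 0.2, 0.9, 1.0]   # predictive scores of observed cases
    y_obs = [10.0, 11.0, 30.0, 31.0]    # their observed outcomes
    # A missing case with score 0.15 should mostly borrow from the first two donors
    draws = [impute_once(0.15, scores_obs, y_obs, rng) for _ in range(1000)]
    print(sum(d < 20 for d in draws) / len(draws))  # fraction from near donors
    ```

    Repeating the draw M times yields M completed datasets, which are then combined with the usual multiple-imputation rules.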

  18. Influence of an extensive inquiry-based field experience on pre-service elementary student teachers' science teaching beliefs

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Sumita

    This study examined the effects of an extensive inquiry-based field experience on pre-service elementary teachers' personal agency beliefs (PAB) about teaching science and their ability to effectively implement science instruction. The research combined quantitative and qualitative approaches within an ethnographic research tradition. A comparison was made between the pre and posttest scores for two groups. The experimental group utilized the inquiry method; the control group did not. The experimental group had the stronger PAB pattern. The field experience caused no significant differences to the context beliefs of either group, but did to the capability beliefs. The number of college science courses taken by pre-service elementary teachers was positively related to their post capability belief (p = .0209). Qualitative information was collected through case studies which included observation of classrooms, assessment of lesson plans and open-ended, extended interviews of the participants about their beliefs in their teaching abilities (efficacy beliefs), and in teaching environments (context beliefs). The interview data were analyzed by the analytic induction method to look for themes. The emerging themes were then grouped under several attributes. Following a review of the attributes a number of hypotheses were formulated. Each hypothesis was then tested across all the cases by the constant comparative method. The pattern of relationship that emerged from the hypotheses testing clearly suggests a new hypothesis that there is a spiral relationship among the ability to establish communicative relationship with students, desire for personal growth and improvement, and greater content knowledge. The study concluded that inquiry based student teaching should be encouraged to train school science teachers. But the meaning and the practice of the inquiry method should be clearly delineated to ensure its correct implementation in the classroom. A survey should be

  19. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    NASA Astrophysics Data System (ADS)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd)= p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey, for the period 1964-2011. The analysis was based on the ISC earthquake catalogue which is homogeneous by construction with consistently determined hypocenters and magnitude. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP and has attributes of universality, as it holds for a broad

  20. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    PubMed Central

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  1. A new variance-based global sensitivity analysis technique

    NASA Astrophysics Data System (ADS)

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2013-11-01

    A new set of variance-based sensitivity indices, called W-indices, is proposed. Similar to Sobol's indices, both main and total effect indices are defined. The W-main effect indices measure the average reduction of model output variance when the ranges of a set of inputs are reduced, and the total effect indices quantify the average residual variance when the ranges of the remaining inputs are reduced. Geometrical interpretations show that the W-indices gather the full information of the variance ratio function, whereas Sobol's indices only reflect the marginal information. Then the double-loop-repeated-set Monte Carlo (MC) (denoted as DLRS MC) procedure, the double-loop-single-set MC (denoted as DLSS MC) procedure and the model emulation procedure are introduced for estimating the W-indices. It is shown that the DLRS MC procedure is suitable for computing all the W-indices despite its high computational cost. The DLSS MC procedure is computationally efficient; however, it is only applicable for computing low order indices. The model emulation is able to estimate all the W-indices with low computational cost as long as the model behavior is correctly captured by the emulator. The Ishigami function, a modified Sobol's function and two engineering models are utilized for comparing the W- and Sobol's indices and verifying the efficiency and convergence of the three numerical methods. Results show that, for even an additive model, the W-total effect index of one input may be significantly larger than its W-main effect index. This indicates that there may exist interaction effects among the inputs of an additive model when their distribution ranges are reduced.
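    The double-loop Monte Carlo idea behind these procedures can be sketched for the classical Sobol first-order index, which the W-indices generalize. This is illustrative only: it estimates S1 = Var(E[Y|X1])/Var(Y) for a toy additive model with uniform inputs, and does not implement the paper's range-reduction averaging; the sample sizes and model are assumptions:

    ```python
    import random

    def sobol_main_effect(model, n_outer=1000, n_inner=100, seed=0):
        """Double-loop Monte Carlo estimate of the Sobol first-order index
        S1 = Var(E[Y|X1]) / Var(Y) for model(x1, x2), inputs uniform on [0, 1].
        Outer loop fixes X1; inner loop averages over X2."""
        rng = random.Random(seed)
        cond_means, all_y = [], []
        for _ in range(n_outer):
            x1 = rng.random()                                   # fix X1
            ys = [model(x1, rng.random()) for _ in range(n_inner)]
            cond_means.append(sum(ys) / n_inner)                # E[Y | X1=x1]
            all_y.extend(ys)

        def var(v):
            m = sum(v) / len(v)
            return sum((y - m) ** 2 for y in v) / len(v)

        return var(cond_means) / var(all_y)

    # Additive toy model Y = X1 + 2*X2: exact S1 = Var(X1)/(Var(X1)+4*Var(X2)) = 0.2
    print(round(sobol_main_effect(lambda a, b: a + 2 * b), 2))
    ```

    The nested loops make the cost n_outer × n_inner model runs per index, which is the "highly computational cost" the abstract attributes to the DLRS MC procedure.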

  2. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  3. On-line hydrogen-isotope measurements of organic samples using elemental chromium: an extension for high temperature elemental-analyzer techniques.

    PubMed

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B; Meijer, Harro A J; Brand, Willi A; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ(2)H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ(2)H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while

  4. PDE-based Non-Linear Diffusion Techniques for Denoising Scientific and Industrial Images: An Empirical Study

    SciTech Connect

    Weeratunga, S K; Kamath, C

    2001-12-20

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, the authors focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. They complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. They explore the effects of various parameters such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator. They also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. The empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
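    The Perona-Malik scheme referenced above can be sketched in one dimension with an explicit discretization. The diffusivity g(s) = 1/(1 + (s/κ)²) is one of the two functions from the original paper; the step sizes, κ, and the toy signal are illustrative assumptions:

    ```python
    def perona_malik_1d(signal, n_iter=20, kappa=0.3, dt=0.2):
        """Explicit-scheme Perona-Malik diffusion on a 1-D signal.
        The diffusivity g(|∇u|) = 1 / (1 + (|∇u|/kappa)^2) shrinks near
        large gradients, so noise in flat regions is smoothed while
        edges diffuse very little. Endpoints are held fixed."""
        u = list(signal)
        for _ in range(n_iter):
            new = u[:]
            for i in range(1, len(u) - 1):
                ge = u[i + 1] - u[i]                     # forward difference
                gw = u[i - 1] - u[i]                     # backward difference
                ce = 1.0 / (1.0 + (ge / kappa) ** 2)     # edge-stopping weight
                cw = 1.0 / (1.0 + (gw / kappa) ** 2)
                new[i] = u[i] + dt * (ce * ge + cw * gw)
            u = new
        return u

    # Noisy step edge: the flats are smoothed, the jump near index 4 survives
    noisy = [0.0, 0.1, -0.1, 0.05, 1.0, 0.95, 1.1, 1.0]
    print([round(v, 2) for v in perona_malik_1d(noisy)])
    ```

    The explicit scheme is only stable for small dt (here dt·(ce+cw) ≤ 1 holds); the implicit discretizations the paper compares relax that restriction at the cost of solving a linear system per step.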

  5. GC-Based Techniques for Breath Analysis: Current Status, Challenges, and Prospects.

    PubMed

    Xu, Mingjun; Tang, Zhentao; Duan, Yixiang; Liu, Yong

    2016-07-01

    Breath analysis is a noninvasive diagnostic method that profiles a person's physical state by volatile organic compounds in the breath. It has huge potential in the field of disease diagnosis. In order to offer opportunities for practical applications, various GC-based techniques have been investigated for on-line breath analysis since GC is the most preferred technique for mixed gas separation. This article reviews the development of breath analysis and GC-based techniques in basic breath research, involving sampling methods, preconcentration methods, conventional GC-based techniques, and newly developed GC techniques for breath analysis. The combination of GC and newly developed detection techniques takes advantage of the virtues of each. In addition, portable GC or micro GC are poised to become field GC-based techniques in breath analysis. Challenges faced in GC-based techniques for breath analysis are discussed candidly. Effective cooperation of experts from different fields is urgent to promote the development of breath analysis. PMID:26529095

  6. A Widely Applicable Extension of the Random Effects Two-Way Layout: Its Definition and Statistical Analysis Based on Group Invariance

    ERIC Educational Resources Information Center

    Li, Heng

    2004-01-01

    A type of data layout that may be considered as an extension of the two-way random effects analysis of variance is characterized and modeled based on group invariance. The data layout seems to be suitable for several scenarios in psychometrics, including the one in which multiple measurements are taken on each of a set of variables, and the…

  7. Computer-based video digitizer analysis of surface extension in maize roots: kinetics of growth rate changes during gravitropism.

    PubMed

    Ishikawa, H; Hasenstein, K H; Evans, M L

    1991-02-01

    We used a video digitizer system to measure surface extension and curvature in gravistimulated primary roots of maize (Zea mays L.). Downward curvature began about 25 +/- 7 min after gravistimulation and resulted from a combination of enhanced growth along the upper surface and reduced growth along the lower surface relative to growth in vertically oriented controls. The roots curved at a rate of 1.4 +/- 0.5 degrees min-1 but the pattern of curvature varied somewhat. In about 35% of the samples the roots curved steadily downward and the rate of curvature slowed as the root neared 90 degrees. A final angle of about 90 degrees was reached 110 +/- 35 min after the start of gravistimulation. In about 65% of the samples there was a period of backward curvature (partial reversal of curvature) during the response. In some cases (about 15% of those showing a period of reverse bending) this period of backward curvature occurred before the root reached 90 degrees. Following transient backward curvature, downward curvature resumed and the root approached a final angle of about 90 degrees. In about 65% of the roots showing a period of reverse curvature, the roots curved steadily past the vertical, reaching maximum curvature about 205 +/- 65 min after gravistimulation. The direction of curvature then reversed back toward the vertical. After one or two oscillations about the vertical the roots obtained a vertical orientation and the distribution of growth within the root tip became the same as that prior to gravistimulation. The period of transient backward curvature coincided with and was evidently caused by enhancement of growth along the concave and inhibition of growth along the convex side of the curve, a pattern opposite to that prevailing in the earlier stages of downward curvature. 
There were periods during the gravitropic response when the normally unimodal growth-rate distribution within the elongation zone became bimodal with two peaks of rapid elongation separated by

  8. Computer-based video digitizer analysis of surface extension in maize roots: kinetics of growth rate changes during gravitropism

    NASA Technical Reports Server (NTRS)

    Ishikawa, H.; Hasenstein, K. H.; Evans, M. L.

    1991-01-01

    We used a video digitizer system to measure surface extension and curvature in gravistimulated primary roots of maize (Zea mays L.). Downward curvature began about 25 +/- 7 min after gravistimulation and resulted from a combination of enhanced growth along the upper surface and reduced growth along the lower surface relative to growth in vertically oriented controls. The roots curved at a rate of 1.4 +/- 0.5 degrees min-1 but the pattern of curvature varied somewhat. In about 35% of the samples the roots curved steadily downward and the rate of curvature slowed as the root neared 90 degrees. A final angle of about 90 degrees was reached 110 +/- 35 min after the start of gravistimulation. In about 65% of the samples there was a period of backward curvature (partial reversal of curvature) during the response. In some cases (about 15% of those showing a period of reverse bending) this period of backward curvature occurred before the root reached 90 degrees. Following transient backward curvature, downward curvature resumed and the root approached a final angle of about 90 degrees. In about 65% of the roots showing a period of reverse curvature, the roots curved steadily past the vertical, reaching maximum curvature about 205 +/- 65 min after gravistimulation. The direction of curvature then reversed back toward the vertical. After one or two oscillations about the vertical the roots obtained a vertical orientation and the distribution of growth within the root tip became the same as that prior to gravistimulation. The period of transient backward curvature coincided with and was evidently caused by enhancement of growth along the concave and inhibition of growth along the convex side of the curve, a pattern opposite to that prevailing in the earlier stages of downward curvature. 
There were periods during the gravitropic response when the normally unimodal growth-rate distribution within the elongation zone became bimodal with two peaks of rapid elongation separated by

  9. Maternal Mortality in Rural South Ethiopia: Outcomes of Community-Based Birth Registration by Health Extension Workers

    PubMed Central

    Yaya, Yaliso; Data, Tadesse; Lindtjørn, Bernt

    2015-01-01

    Introduction Rural communities in low-income countries lack vital registrations to track birth outcomes. We aimed to examine the feasibility of community-based birth registration and measure maternal mortality ratio (MMR) in rural south Ethiopia. Methods In 2010, health extension workers (HEWs) registered births and maternal deaths among 421,639 people in three districts (Derashe, Bonke, and Arba Minch Zuria). One nurse-supervisor per district provided administrative and technical support to HEWs. The primary outcomes were the feasibility of registration of a high proportion of births and measuring MMR. The secondary outcome was the proportion of skilled birth attendance. We validated the completeness of the registry and the MMR by conducting a house-to-house survey in 15 randomly selected villages in Bonke. Results We registered 10,987 births (81·4% of expected 13,492 births) with annual crude birth rate of 32 per 1,000 population. The validation study showed that, of 2,401 births occurred in the surveyed households within eight months of the initiation of the registry, 71·6% (1,718) were registered with similar MMRs (474 vs. 439) between the registered and unregistered births. Overall, we recorded 53 maternal deaths; MMR was 489 per 100,000 live births and 83% (44 of 53 maternal deaths) occurred at home. Ninety percent (9,863 births) were at home, 4% (430) at health posts, 2·5% (282) at health centres, and 3·5% (412) in hospitals. MMR increased if: the male partners were illiterate (609 vs. 346; p= 0·051) and the villages had no road access (946 vs. 410; p= 0·039). The validation helped to increase the registration coverage by 10% through feedback discussions. Conclusion It is possible to obtain a high-coverage birth registration and measure MMR in rural communities where a functional system of community health workers exists. The MMR was high in rural south Ethiopia and most births and maternal deaths occurred at home. PMID:25799229

  10. AQA-PM: Extension of the Air-Quality model for Austria with satellite based Particulate Matter estimates

    NASA Astrophysics Data System (ADS)

    Hirtl, M.; Mantovani, S.; Krüger, B. C.; Triebnig, G.

    2012-04-01

    Air quality is a key element for the well-being and quality of life of European citizens. Air pollution measurements and modeling tools are essential for assessment of air quality according to EU legislation. The responsibilities of ZAMG as the national weather service of Austria include the support of the federal states and the public in questions connected to the protection of the environment in the frame of advisory and counseling services as well as expert opinions. The Air Quality model for Austria (AQA) is operated at ZAMG in cooperation with the University of Natural Resources and Applied Life Sciences in Vienna (BOKU) by order of the regional governments since 2005. AQA conducts daily forecasts of gaseous and particulate (PM10) air pollutants over Austria. In the frame of the project AQA-PM (funded by FFG), satellite measurements of the Aerosol Optical Thickness (AOT) and ground-based PM10-measurements are combined to highly-resolved initial fields using assimilation techniques. It is expected that the assimilation of satellite measurements will significantly improve the quality of AQA. Currently no observations are considered in the modeling system. At the current stage of the project, different datasets have been collected (ground measurements, satellite measurements, fine resolved regional emission inventories) and are analyzed and prepared for further processing. This contribution gives an overview of the project working plan and the upcoming developments. The goal of this project is to improve the PM10-forecasts for Austria with the integration of satellite based measurements and to provide a comprehensive product-platform.

  11. AQA-PM: Extension of the Air-Quality Model For Austria with Satellite based Particulate Matter Estimates

    NASA Astrophysics Data System (ADS)

    Hirtl, Marcus; Mantovani, Simone; Krüger, Bernd C.; Triebnig, Gerhard; Flandorfer, Claudia

    2013-04-01

    Air quality is a key element for the well-being and quality of life of European citizens. Air pollution measurements and modeling tools are essential for assessment of air quality according to EU legislation. The responsibilities of ZAMG as the national weather service of Austria include the support of the federal states and the public in questions connected to the protection of the environment in the frame of advisory and counseling services as well as expert opinions. The Air Quality model for Austria (AQA) is operated at ZAMG in cooperation with the University of Natural Resources and Life Sciences in Vienna (BOKU) by order of the regional governments since 2005. AQA conducts daily forecasts of gaseous and particulate (PM10) air pollutants over Austria. In the frame of the project AQA-PM (funded by FFG), satellite measurements of the Aerosol Optical Thickness (AOT) and ground-based PM10-measurements are combined to highly-resolved initial fields using regression- and assimilation techniques. For the model simulations WRF/Chem is used with a resolution of 3 km over the alpine region. Interfaces have been developed to account for the different measurements as input data. The available local emission inventories provided by the different Austrian regional governments were harmonized and used for the model simulations. An episode in February 2010 is chosen for the model evaluation. During that month exceedances of PM10-thresholds occurred at many measurement stations of the Austrian network. Different model runs (only model/only ground stations assimilated/satellite and ground stations assimilated) are compared to the respective measurements. The goal of this project is to improve the PM10-forecasts for Austria with the integration of satellite based measurements and to provide a comprehensive product-platform.

  12. Block based image compression technique using rank reduction and wavelet difference reduction

    NASA Astrophysics Data System (ADS)

    Bolotnikova, Anastasia; Rasti, Pejman; Traumann, Andres; Lusi, Iiris; Daneshmand, Morteza; Noroozi, Fatemeh; Samuel, Kadri; Sarkar, Suman; Anbarjafari, Gholamreza

    2015-12-01

    In this paper, a new block-based lossy image compression technique is proposed that uses rank reduction of the image and the wavelet difference reduction (WDR) technique. Rank reduction is obtained by applying singular value decomposition (SVD). The input image is divided into blocks of equal size, after which quantization by SVD is carried out on each block followed by the WDR technique. Reconstruction is carried out by decompressing each block's bit stream and then merging all of them to obtain the decompressed image. The visual and quantitative experimental results of the proposed image compression technique are shown and also compared with those of the WDR technique and JPEG2000. From the results of the comparison, the proposed image compression technique outperforms the WDR and JPEG2000 techniques.
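    The per-block rank-reduction step can be sketched with a truncated SVD; the subsequent WDR entropy coding is omitted, and the block size, kept rank, and random test image are illustrative assumptions rather than the paper's settings:

    ```python
    import numpy as np

    def block_rank_reduce(image, block=8, rank=2):
        """Per-block rank reduction via truncated SVD: each block x block
        tile keeps only its `rank` largest singular values. (In the paper
        this quantization step is followed by WDR coding, omitted here.)"""
        h, w = image.shape
        out = np.empty((h, w), dtype=float)
        for i in range(0, h, block):
            for j in range(0, w, block):
                tile = image[i:i + block, j:j + block].astype(float)
                u, s, vt = np.linalg.svd(tile, full_matrices=False)
                s[rank:] = 0.0                    # discard small singular values
                out[i:i + block, j:j + block] = u @ np.diag(s) @ vt
        return out

    rng = np.random.default_rng(0)
    img = rng.random((16, 16))
    approx = block_rank_reduce(img, block=8, rank=4)
    print(round(float(np.abs(img - approx).mean()), 3))  # mean reconstruction error
    ```

    Storing rank r of a b×b block needs r·(2b+1) numbers instead of b², which is where the compression comes from before WDR coding is applied.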

  13. Wood lens design philosophy based on a binary additive manufacturing technique

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  14. Estimation of Missing Precipitation Data using Soft Computing based Spatial Interpolation Techniques

    NASA Astrophysics Data System (ADS)

    Teegavarapu, R. S.

    2007-12-01

    Deterministic and stochastic weighting methods are the most frequently used methods for estimating missing rainfall values at a gage based on values recorded at all other available recording gages. Traditional spatial interpolation techniques can be integrated with soft computing techniques to improve the estimation of missing precipitation data. An association rule mining based spatial interpolation approach, universal function approximation based kriging, and optimal function approximation and clustering methods are developed and investigated in the current study to estimate missing precipitation values at a gaging station. Historical daily precipitation data obtained from 15 rain gauging stations in a temperate climatic region, Kentucky, USA, are used to test this approach and derive conclusions about the efficacy of these methods in estimating missing precipitation data. Results suggest that the use of soft computing techniques in conjunction with a spatial interpolation technique can improve the precipitation estimates and help address a few limitations of traditional spatial interpolation techniques.
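
As a concrete example of the traditional deterministic weighting that the soft computing methods above build on, here is a minimal inverse-distance-weighting estimator. The gauge coordinates and rainfall values are invented for illustration and are not data from the study.

```python
import numpy as np

def idw_estimate(target_xy, gauge_xy, gauge_vals, power=2.0):
    """Inverse-distance weighting: the classic deterministic estimator for a
    missing rainfall value from the surrounding recording gauges."""
    d = np.linalg.norm(gauge_xy - target_xy, axis=1)
    if np.any(d == 0):                 # target coincides with a gauge
        return float(gauge_vals[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * gauge_vals) / np.sum(w))

# Demo: four hypothetical gauges around a missing-value location.
gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
rain = np.array([5.0, 7.0, 6.0, 8.0])
est = idw_estimate(np.array([5.0, 5.0]), gauges, rain)
```

The soft computing approaches in the abstract replace or augment the fixed distance-based weights with learned relationships between gauges.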

  15. [A Detection Technique for Gas Concentration Based on the Spectral Line Shape Function].

    PubMed

    Zhou, Mo; Yang, Bing-chu; Tao, Shao-hua

    2015-04-01

    The methods that can rapidly and precisely measure concentrations of various gases have extensive applications in fields such as air quality analysis, environmental pollution detection, and so on. The gas detection method based on tunable laser absorption spectroscopy is considered a promising technique. For infrared spectrum detection techniques, the line shape function of an absorption spectrum of a gas is an important parameter in qualitative and quantitative analysis of a gas. Specifically, how to obtain the line shape function of an absorption spectrum of a gas quickly and accurately is a key problem in the gas detection field. In this paper we analyzed several existing line shape functions, proposed a method to calculate precisely the line shape function of a gas, and investigated the relation between the gas concentration and the peak value of a line shape function. Then we experimentally measured the absorption spectra of acetylene gas in the wavelength range of 1,515-1,545 nm with a tunable laser source and a built-in spectrometer. With the Lambert-Beer law we calculated the peak values of the line shape function of the gas at the given frequencies, and obtained a fitting curve for the line shape function in the whole waveband by using a computer program. Comparing the measured results with the calculated results of the Voigt function, we found that there was a deviation between the experimental results and the calculated results. And we found that the measured concentration of the acetylene gas by using the fitting curve of the line shape function was more accurate and compatible with the actual situation. Hence, the empirical formula for the line shape function obtained from the experimental results would be more suitable for the concentration measurement of a gas. As the fitting curve for the line shape function of the acetylene gas has been deduced from the experiment, the corresponding peak values of the spectral lines can be
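
The quantitative step the abstract relies on, inverting the Lambert-Beer law at a line peak, can be sketched as follows. The cross-section, path length, and concentration below are hypothetical numbers, not values from the experiment.

```python
import numpy as np

def concentration_from_transmittance(I, I0, sigma_peak, path_length):
    """Invert the Lambert-Beer law I = I0*exp(-sigma*c*L) for concentration c."""
    return np.log(I0 / I) / (sigma_peak * path_length)

# Demo with hypothetical numbers (not values from the experiment).
sigma, L, c_true = 1.2e-20, 10.0, 2.5e18   # cm^2, cm, molecules/cm^3
I0 = 1.0
I = I0 * np.exp(-sigma * c_true * L)       # simulated transmitted intensity
c_est = concentration_from_transmittance(I, I0, sigma, L)
```

In practice the effective peak cross-section depends on the line shape function, which is why an accurate empirical line shape improves the recovered concentration.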

  16. Application of USP inlet extensions to the TSI impactor system 3306/3320 using HFA 227 based solution metered dose inhalers.

    PubMed

    Mogalian, Erik; Myrdal, Paul Brian

    2005-12-01

    The objective of this study was to further evaluate the need for a vertical inlet extension when testing solution metered dose inhalers using the TSI Model 3306 Impactor Inlet in conjunction with the TSI Model 3320 Aerodynamic Particle Sizer (APS). The configurations tested using the TSI system were compared to baseline measurements that were performed using the Andersen Mark II 8-stage cascade impactor (ACI). Seven pressurized solution metered dose inhalers were tested using varied concentrations of beclomethasone dipropionate (BDP), ethanol, and HFA 227 propellant. The inhalers were tested with the cascade impactor and with the TSI system. The TSI system was tested in three different configurations: as provided by the manufacturer (0 cm) or with inlet extensions of 20 or 40 cm. The extensions were located between the USP inlet and the Model 3306 Impactor Inlet. There were no practical differences between each system for the stem, actuator, or USP inlet. The fine particle mass (aerodynamic mass < 4.7 microm) was affected by extension length and correlated well with the ACI when an extension was present. APS particle size measurements were unaffected by the extension lengths and correlated well to particle size determined from the ACI analysis. It has been confirmed that an inlet extension may be necessary for the TSI system in order to give mass results that correlate to the ACI, especially for formulations having significant concentrations of low volatility excipients. Additionally, the results generated from this study were used to evaluate the product performance of HFA 227 based solution formulations that contain varying concentrations of ethanol as a cosolvent. PMID:16316853

  17. Finite element modelling of non-bonded piezo sensors for biomedical health monitoring of bones based on EMI technique

    NASA Astrophysics Data System (ADS)

    Srivastava, Shashank; Bhalla, Suresh; Madan, Alok; Gupta, Ashok

    2016-04-01

    Extensive research is currently underway across the world on employing piezo sensors for biomedical health monitoring in view of their obvious advantages, such as low cost, fast dynamic response and biocompatibility. However, one of the limitations of the piezo sensor in bonded mode based on the electro-mechanical impedance (EMI) technique is that it can cause harmful effects to humans in terms of irritation and bone and skin disease. This paper, which is in continuation of the recent demonstration of the non-bonded configuration, is a step towards simulating and analyzing the non-bonded configuration of the piezo sensor for gauging its effectiveness using FEA software. It has been noted that the conductance signatures obtained in non-bonded mode are significantly close to those of the conventional bonded configuration, thus giving a positive indication of its suitability for field use.

  18. A system identification technique based on the random decrement signatures. Part 1: Theory and simulation

    NASA Technical Reports Server (NTRS)

    Bedewi, Nabih E.; Yang, Jackson C. S.

    1987-01-01

    Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The mathematics of the technique is presented in addition to the results of computer simulations conducted to demonstrate the prediction of the response of the system and the random forcing function initially introduced to excite the system.
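
A minimal sketch of extracting a Random Decrement signature from a noisy response follows. The trigger level, segment length, and synthetic signal are illustrative, and the subsequent least-squares fit for the mass, damping, and stiffness matrices is omitted.

```python
import numpy as np

def random_decrement(x, trigger_level, seg_len):
    """Average all response segments that start at an up-crossing of the
    trigger level; random content averages out, leaving the signature."""
    starts = [i for i in range(1, len(x) - seg_len)
              if x[i-1] < trigger_level <= x[i]]
    segs = np.array([x[i:i+seg_len] for i in starts])
    return segs.mean(axis=0), len(starts)

# Demo: a decaying 1 Hz mode buried in broadband noise.
rng = np.random.default_rng(1)
t = np.arange(0.0, 60.0, 0.01)
resp = np.sin(2*np.pi*1.0*t) * np.exp(-0.02*t) + 0.3*rng.standard_normal(t.size)
sig, n_trig = random_decrement(resp, trigger_level=0.5, seg_len=400)
```

Because each averaged segment starts from the same trigger condition, the ensemble mean approximates the homogeneous (free-decay) component of the response, which is the quality the abstract highlights.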

  19. Performance analysis of compressive ghost imaging based on different signal reconstruction techniques.

    PubMed

    Kang, Yan; Yao, Yin-Ping; Kang, Zhi-Hua; Ma, Lin; Zhang, Tong-Yi

    2015-06-01

    We present different signal reconstruction techniques for implementation of compressive ghost imaging (CGI). The different techniques are validated on the data collected from ghost imaging with the pseudothermal light experimental system. Experiment results show that the technique based on total variance minimization gives high-quality reconstruction of the imaging object with less time consumption. The different performances among these reconstruction techniques and their parameter settings are also analyzed. The conclusion thus offers valuable information to promote the implementation of CGI in real applications. PMID:26367039
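
For orientation, the baseline second-order correlation reconstruction that compressive techniques improve upon can be sketched in one dimension. The object, speckle patterns, and sample count below are synthetic, and the total-variance-minimization solver discussed in the abstract is not shown.

```python
import numpy as np

def correlation_gi(patterns, bucket):
    """Baseline ghost imaging: O(x) = <B S(x)> - <B><S(x)>."""
    B = bucket - bucket.mean()
    S = patterns - patterns.mean(axis=0)
    return B @ S / len(bucket)

# Demo: a 1-D 'object' illuminated by pseudothermal speckle patterns.
rng = np.random.default_rng(2)
obj = np.zeros(64)
obj[20:30] = 1.0                      # transmissive region
patterns = rng.random((4000, 64))     # speckle realizations
bucket = patterns @ obj               # single-pixel (bucket) detector signals
gi_image = correlation_gi(patterns, bucket)
```

Compressive reconstruction techniques aim to recover a comparable image from far fewer pattern realizations by exploiting sparsity priors.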

  20. Using Fuzzy Logic Techniques for Assertion-Based Software Testing Metrics

    PubMed Central

    Alakeel, Ali M.

    2015-01-01

    Software testing is a very labor intensive and costly task. Therefore, many software testing techniques to automate the process of software testing have been reported in the literature. Assertion-Based automated software testing has been shown to be effective in detecting program faults as compared to traditional black-box and white-box software testing methods. However, the applicability of this approach in the presence of large numbers of assertions may be very costly. Therefore, software developers need assistance while making decision to apply Assertion-Based testing in order for them to get the benefits of this approach at an acceptable level of costs. In this paper, we present an Assertion-Based testing metrics technique that is based on fuzzy logic. The main goal of the proposed technique is to enhance the performance of Assertion-Based software testing in the presence of large numbers of assertions. To evaluate the proposed technique, an experimental study was performed in which the proposed technique is applied on programs with assertions. The result of this experiment shows that the effectiveness and performance of Assertion-Based software testing have improved when applying the proposed testing metrics technique. PMID:26060839
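
One way such a fuzzy metric could look is sketched below with invented membership functions and scales; the paper's actual rule base and input variables are not specified in the abstract, so everything here is hypothetical.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def assertion_testing_score(n_assertions, coverage_gain):
    """Fuzzy rule sketch: IF assertion cost is LOW AND coverage gain is HIGH
    THEN Assertion-Based testing is worthwhile (AND realized as min)."""
    low_cost = tri(n_assertions, -1.0, 0.0, 200.0)    # invented scale
    high_gain = tri(coverage_gain, 0.4, 1.0, 1.6)     # invented scale
    return min(low_cost, high_gain)

score_cheap = assertion_testing_score(50, 0.9)     # few assertions, high gain
score_costly = assertion_testing_score(190, 0.9)   # many assertions
```

A score near 1 would support applying Assertion-Based testing; a score near 0 would suggest the assertion count makes it too costly for the expected benefit.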

  1. Vocabulary Extension through Poetry.

    ERIC Educational Resources Information Center

    Surajlal, K. C.

    1986-01-01

    Based on the notion that teaching vocabulary extension in isolation makes little impact on students, a three-part exercise, designed to develop students' vocabulary through poetry while providing meaningful enjoyment, uses the poem "The Hawk" by A. C. Benson. In the first class period, students are introduced to both the exercise and the poem and…

  2. Multi technique amalgamation for enhanced information identification with content based image data.

    PubMed

    Das, Rik; Thepade, Sudeep; Ghosh, Saurav

    2015-01-01

    Image data has emerged as a resourceful foundation for information with proliferation of image capturing devices and social media. Diverse applications of images in areas including biomedicine, military, commerce, education have resulted in huge image repositories. Semantically analogous images can be fruitfully recognized by means of content based image identification. However, the success of the technique has been largely dependent on extraction of robust feature vectors from the image content. The paper has introduced three different techniques of content based feature extraction based on image binarization, image transform and morphological operator respectively. The techniques were tested with four public datasets namely, Wang Dataset, Oliva Torralba (OT Scene) Dataset, Corel Dataset and Caltech Dataset. The multi technique feature extraction process was further integrated for decision fusion of image identification to boost up the recognition rate. Classification result with the proposed technique has shown an average increase of 14.5 % in Precision compared to the existing techniques and the retrieval result with the introduced technique has shown an average increase of 6.54 % in Precision over state-of-the art techniques. PMID:26798574
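
The binarization-based feature extraction idea can be sketched as follows. The grid size and synthetic test image are illustrative; the transform- and morphology-based extractors and the decision fusion step are omitted.

```python
import numpy as np

def binarization_features(image, grid=4):
    """Binarize at the mean intensity, then use the fraction of 'on' pixels
    in each grid cell as a compact content-based feature vector."""
    binary = image >= image.mean()
    h, w = binary.shape
    gh, gw = h // grid, w // grid
    feats = [binary[i*gh:(i+1)*gh, j*gw:(j+1)*gw].mean()
             for i in range(grid) for j in range(grid)]
    return np.array(feats)

# Demo: an image whose top-left corner is bright.
rng = np.random.default_rng(3)
test_img = rng.uniform(0.0, 0.2, (32, 32))
test_img[:8, :8] = 1.0
feats = binarization_features(test_img)
```

Feature vectors of this kind can then be compared across images, and fusing decisions from several such extractors is what boosts the recognition rate in the paper.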

  3. Effect of filling technique on the bond strength of methacrylate and silorane-based composite restorations.

    PubMed

    Machado, Fernanda Weingartner; Borges, Fernanda Blos; Cenci, Maximiliano Sérgio; Moraes, Rafael Ratto de; Boscato, Noéli

    2016-01-01

    The bond strength of methacrylate (Z350, 3M ESPE) and silorane (P90, 3M ESPE) restorations, using different cavity filling techniques, was investigated. Cavities (6 × 3 × 3) in bovine teeth were filled using bulk, oblique, or horizontal increments. A push-out test was carried out after 24 h. Data were statistically analyzed (α = 5%). Methacrylate-based composites and the horizontal filling technique showed the highest bond strength values (10.2 ± 3.9, p < 0.05). Silorane-based composites showed no statistically significant differences regarding the filling techniques (p > 0.05). PMID:27050940

  4. Traditional versus rule-based programming techniques: Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    To the software design community, the concern over the costs associated with a program's execution time and implementation is great. It is always desirable, and sometimes imperative, that the proper programming technique is chosen which minimizes all costs for a given application or type of application. A study is described that compared cost-related factors associated with traditional programming techniques to rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.

  5. A novel background subtraction technique based on grayscale morphology for weld defect detection

    NASA Astrophysics Data System (ADS)

    Aminzadeh, Masoumeh; Kurfess, Thomas

    2016-04-01

    Optical inspection is a non-destructive quality monitoring technique for detecting defects in manufactured parts. Automating the defect detection by application of image processing removes the need for human operators, making the inspection more reliable, reproducible and faster. In this paper, a background subtraction technique based on morphological operations is proposed. The low computational load associated with the morphological operations used makes this technique more computationally efficient than background subtraction techniques such as spline approximation and surface fitting. The performance of the technique is tested by applying it to detect defects in a weld seam with a non-uniform intensity distribution, where the defects are precisely segmented. The proposed background subtraction technique is generalizable to sheet, surface, or part defect detection in various manufacturing applications.
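
A minimal version of morphological background subtraction (a white top-hat with a flat square structuring element) is sketched below; the image, intensity gradient, and defect are synthetic.

```python
import numpy as np

def grey_erode(img, k):
    """Greyscale erosion (min filter) with a flat k x k structuring element."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    return np.min([p[i:i+img.shape[0], j:j+img.shape[1]]
                   for i in range(k) for j in range(k)], axis=0)

def grey_dilate(img, k):
    """Greyscale dilation (max filter) with a flat k x k structuring element."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    return np.max([p[i:i+img.shape[0], j:j+img.shape[1]]
                   for i in range(k) for j in range(k)], axis=0)

def tophat_background_subtract(img, k=7):
    """Opening (erosion then dilation) estimates the slowly varying background;
    subtracting it isolates small bright defects (white top-hat)."""
    return img - grey_dilate(grey_erode(img, k), k)

# Demo: a 2x2 bright 'defect' on a smooth, non-uniform background.
x = np.linspace(0.0, 1.0, 64)
weld = np.outer(x, x) * 50.0
weld[30:32, 30:32] += 20.0
residual = tophat_background_subtract(weld, k=7)
```

Because the structuring element is larger than the defect, the opening reconstructs the smooth background but not the defect, so the residual isolates it regardless of the non-uniform illumination.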

  6. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, due to the evaluation of EBIC equipment performance and the numerical optimization of equipment items, the constant acquisition of high contrast images has become possible, improving the reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  7. Nondestructive testing techniques

    NASA Astrophysics Data System (ADS)

    Bray, Don E.; McBride, Don

    A comprehensive reference covering a broad range of techniques in nondestructive testing is presented. Based on years of extensive research and application at NASA and other government research facilities, the book provides practical guidelines for selecting the appropriate testing methods and equipment. Topics discussed include visual inspection, penetrant and chemical testing, nuclear radiation, sonic and ultrasonic, thermal and microwave, magnetic and electromagnetic techniques, and training and human factors. (No individual items are abstracted in this volume)

  8. A Novel Graph Based Fuzzy Clustering Technique For Unsupervised Classification Of Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Banerjee, B.; Krishna Moohan, B.

    2014-11-01

    This paper addresses the problem of unsupervised land-cover classification of multi-spectral remotely sensed images in the context of self-learning by exploring different graph based clustering techniques hierarchically. The only assumption used here is that the number of land-cover classes is known a priori. Object based image analysis paradigm which processes a given image at different levels, has emerged as a popular alternative to the pixel based approaches for remote sensing image segmentation considering the high spatial resolution of the images. A graph based fuzzy clustering technique is proposed here to obtain a better merging of an initially oversegmented image in the spectral domain compared to conventional clustering techniques. Instead of using Euclidean distance measure, the cumulative graph edge weight is used to find the distance between a pair of points to better cope with the topology of the feature space. In order to handle uncertainty in assigning class labels to pixels, which is not always a crisp allocation for remote sensing data, fuzzy set theoretic technique is incorporated to the graph based clustering. Minimum Spanning Tree (MST) based clustering technique is used to over-segment the image at the first level. Furthermore, considering that the spectral signature of different land-cover classes may overlap significantly, a self-learning based Maximum Likelihood (ML) classifier coupled with the Expectation Maximization (EM) based iterative unsupervised parameter retraining scheme is used to generate the final land-cover classification map. Results on two medium resolution images establish the superior performance of the proposed technique in comparison to the traditional fuzzy c-means clustering technique.
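
The first-level MST clustering step can be sketched on plain point data using Prim's algorithm on a dense distance matrix; the fuzzy merging, the ML/EM retraining, and all image-specific details are omitted, and the two point clouds are synthetic.

```python
import numpy as np

def mst_edges(d):
    """Prim's algorithm on a dense distance matrix; returns (weight, i, j) edges."""
    n = d.shape[0]
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = (np.inf, -1, -1)
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and d[i, j] < best[0]:
                    best = (d[i, j], i, j)
        edges.append(best)
        in_tree.add(best[2])
    return edges

def mst_clusters(points, n_clusters):
    """Cut the n_clusters-1 heaviest MST edges; the remaining connected
    components are the clusters."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=2)
    edges = mst_edges(d)
    keep = edges if n_clusters <= 1 else sorted(edges)[:-(n_clusters - 1)]
    parent = list(range(len(points)))          # union-find over kept edges
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    for _, i, j in keep:
        parent[find(i)] = find(j)
    roots = [find(i) for i in range(len(points))]
    return np.array([sorted(set(roots)).index(r) for r in roots])

# Demo: two well-separated point clouds.
rng = np.random.default_rng(4)
pts = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
labels = mst_clusters(pts, 2)
```

Cutting the heaviest tree edges is what produces the initial over-segmentation, which the graph-based fuzzy merging then refines using cumulative edge weights instead of Euclidean distances.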

  9. A framework for validating light fields created using physically based rendering techniques

    NASA Astrophysics Data System (ADS)

    Whittinghill, David M.

    This research study presents a framework for applying physically based global illumination techniques to the creation of software models of light fields that are then validated against actual light fields measured in physical experiments. A prior experiment was performed by horticulture scientists in which the light field of an empty plant growth chamber was measured using quantum sensors at fixed spatial intervals. The result was a light map consisting of a 9 x 45, fixed-width, two-dimensional graph of sensor readings that described the intensity of radiant energy present in the chamber at the chosen locations. A single observation of the growth chamber was made resulting in a single data set consisting of 45 different, location-sensitive irradiance observations. To test this framework a series of simulations were performed in which the physical attributes of the growth chamber were duplicated as closely as possible in a virtual growth chamber software model. Modeled attributes included physical dimensions, wall and light reflectivity, and full-spectrum light characterization. Light transport was modeled using a physically based, global illumination rendering technique called photon mapping. Virtual sensors that recorded the intensity of the light that transmitted through their surface were placed in the virtual chamber at the same position and interval as the ones that were used in the physical experiment. The output of the virtual chamber experiments were represented as a graph in the same configuration as the one in the physical experiment. The experiment was conducted using a modified version of pbrt, a physically based, extensible renderer developed by Matt Pharr and Greg Humphreys [1]. As photon mapping uses a stochastic algorithm, many repetitions of the virtual chamber experiment were performed and the mean and standard deviation were recorded as a global measure for each chamber as well as for each individual sensor location. The global means of the

  10. Landau parameters for isospin asymmetric nuclear matter based on a relativistic model of composite and finite extension nucleons

    SciTech Connect

    Aguirre, R. M.; Paoli, A. L. de

    2007-04-15

    We study the properties of cold asymmetric nuclear matter at high density, applying the quark meson coupling model with excluded volume corrections in the framework of the Landau theory of relativistic Fermi liquids. We discuss the role of the finite spatial extension of composite baryons on dynamical and statistical properties such as the Landau parameters, the compressibility, and the symmetry energy. We have also calculated the low-lying collective eigenfrequencies arising from the collisionless quasiparticle transport equation, considering both unstable and stable modes. An overall analysis of the excluded volume correlations on the collective properties is performed.

  11. Rational extensions of the trigonometric Darboux-Pöschl-Teller potential based on para-Jacobi polynomials

    NASA Astrophysics Data System (ADS)

    Bagchi, B.; Grandati, Y.; Quesne, C.

    2015-06-01

    The possibility for the Jacobi equation to admit, in some cases, general solutions that are polynomials has been recently highlighted by Calogero and Yi, who termed them para-Jacobi polynomials. Such polynomials are used here to build seed functions of a Darboux-Bäcklund transformation for the trigonometric Darboux-Pöschl-Teller potential. As a result, one-step regular rational extensions of the latter depending both on an integer index n and on a continuously varying parameter λ are constructed. For each n value, the eigenstates of these extended potentials are associated with a novel family of λ-dependent polynomials, which are orthogonal on [-1,1].

  12. IMACCS: A Progress Report on NASA/GSFC's COTS-Based Ground Data Systems, and their Extension into New Domains

    NASA Technical Reports Server (NTRS)

    Scheidker, E. J.; Pendley, R. D.; Rashkin, R. M.; Werking, R. D.; Cruse, B. G.; Bracken, M. A.

    1996-01-01

    The integrated monitoring, analysis and control commercial off-the-shelf system (IMACCS) for the provision of real-time satellite command and telemetry support, orbit and attitude determination, events prediction and data trend analysis, is considered. The upgrades made to the original commercial, off-the-shelf (COTS) prototype are described. These upgrades include automation capability, and spacecraft integration and testing capability. A further extension to the prototype is the establishment of a direct radio frequency interface to a spacecraft. The systems development approach employed is described.

  13. Repeat Customer Success in Extension

    ERIC Educational Resources Information Center

    Bess, Melissa M.; Traub, Sarah M.

    2013-01-01

    Four multi-session research-based programs were offered by two Extension specialists in one rural Missouri county. Eleven participants who came to multiple Extension programs could be called "repeat customers." Based on the total number of participants for all four programs, 25% could be deemed repeat customers. Repeat customers had…

  14. Investigation of laser Doppler anemometry in developing a velocity-based measurement technique

    NASA Astrophysics Data System (ADS)

    Jung, Ki Won

    2009-12-01

    Acoustic properties, such as the characteristic impedance and the complex propagation constant, of porous materials have traditionally been characterized with pressure-based measurement techniques using microphones. Although microphone techniques have evolved since their introduction, the most general form of the microphone technique employs two microphones to characterize the acoustic field in one continuous medium. The shortcomings of determining the acoustic field based on only two microphones can be overcome by using numerous microphones. However, the use of a number of microphones requires a careful and intricate calibration procedure. This dissertation uses laser Doppler anemometry (LDA) to establish a new measurement technique which resolves the issues that microphone techniques have: First, it is based on a single sensor, so calibration is unnecessary when only the overall ratio of the acoustic field is required for the characterization of a system. This includes the measurements of the characteristic impedance and the complex propagation constant of a system. Second, it can handle multiple positional measurements without calibrating the signal at each position. Third, it can measure three-dimensional components of velocity even in a system with a complex geometry. Fourth, it is flexibly adaptable to many types of apparatus, provided the apparatus is transparent. LDA is known to possess several disadvantages, such as the requirement of a transparent apparatus, high cost, and the necessity of seeding particles. The technique based on LDA combined with a curve-fitting algorithm is validated through measurements on three systems. First, the complex propagation constant of the air is measured in a rigidly terminated cylindrical pipe which has very low dissipation. Second, the radiation impedance of an open-ended pipe is measured. These two parameters can be characterized by the ratio of acoustic field measured at multiple

  15. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    PubMed

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiations and their dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). Based on neutron spectrometry, solar modulation potential can be deduced using neutron spectrometer measurements and ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential for using simulations of extensive air showers and neutron spectroscopy to monitor solar activity. PMID:26151172

  16. A Load-based Micro-indentation Technique for Mechanical Property and NDE Evaluation

    SciTech Connect

    Bruce S. Kang; Chuanyu Feng; Jared M. Tannenbaum; M.A. Alvin

    2009-06-04

    A load-based micro-indentation technique has been developed for evaluating mechanical properties of materials. Instead of using measured indentation depth or contact area as a necessary parameter, the new technique is based on the indentation load, coupled with a multiple-partial unloading procedure for mechanical property evaluation. The proposed load-based micro-indentation method is capable of determining Young’s modulus of metals, superalloys, and single crystal matrices, and stiffness of coated material systems with flat, tubular, or curved architectures. This micro-indentation technique can be viewed as a viable non-destructive evaluation (NDE) technique for determining as-manufactured and process-exposed metal, superalloy, single crystal, and TBC-coated material properties. Based on this technique, several bond coated substrates were tested at various stages of thermal cycles. The time-series evaluation of test material surface stiffness reveals the status of coating strength without any alternation of the coating surface, making it a true time-series NDE investigation. The microindentation test results show good correlation with post mortem microstructural analyses. This technique also shows promise for the development of a portable instrument for on-line, in-situ NDE and mechanical properties measurement of structural components.

  17. New Ways in Content-Based Instruction. New Ways in TESOL Series II. Innovative Classroom Techniques.

    ERIC Educational Resources Information Center

    Brinton, Donna M., Ed.; Master, Peter, Ed.

    A wide variety of techniques and classroom activities, contributed by teachers, for content-based instruction (CBI) in English as a second language (ESL) are presented. CBI is defined to include theme-based second language courses, sheltered content-area courses, and paired or adjunct arrangements in which language and content courses are taught…

  18. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    NASA Astrophysics Data System (ADS)

    Festa, G.; Pietropaolo, A.; Grazzi, F.; Sutton, L. F.; Scherillo, A.; Bognetti, L.; Bini, A.; Barzagli, E.; Schooneveld, E.; Andreani, C.

    2013-09-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation on materials. The aim of this study is to show the potential application of the approach using multiple and integrated neutron-based techniques for musical instruments. Such samples, in the broad scenario of cultural heritage, represent an exciting research field. They may represent an interesting link between different disciplines such as nuclear physics, metallurgy and acoustics.

  19. An optoelectrokinetic technique for programmable particle manipulation and bead-based biosignal enhancement.

    PubMed

    Wang, Kuan-Chih; Kumar, Aloke; Williams, Stuart J; Green, Nicolas G; Kim, Kyung Chun; Chuang, Han-Sheng

    2014-10-21

    Technologies that can enable concentration of low-abundance biomarkers are essential for early diagnosis of diseases. In this study, an optoelectrokinetic technique, termed Rapid Electrokinetic Patterning (REP), was used to enable dynamic particle manipulation in bead-based bioassays. Various manipulation capabilities, such as micro/nanoparticle aggregation, translation, sorting and patterning, were developed. The technique allows for versatile multi-parameter (voltage, light intensity and frequency) based modulation and dynamically addressable manipulation with simple device fabrication. Signal enhancement of a bead-based bioassay was demonstrated using dilute biotin-fluorescein isothiocyanate (FITC) solutions mixed with streptavidin-conjugated particles and rapidly concentrated with the technique. As compared with a conventional ELISA reader, the REP-enabled detection achieved a minimal readout of 3.87 nM, which was a 100-fold improvement in sensitivity. The multi-functional platform provides an effective measure to enhance detection levels in more bead-based bioassays. PMID:25109364

  20. Review of Fluorescence-Based Velocimetry Techniques to Study High-Speed Compressible Flows

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Johansen, Craig; Inman, Jennifer A.; Jones, Stephen B.; Danehy, Paul M.

    2013-01-01

    This paper reviews five laser-induced fluorescence-based velocimetry techniques that have been used to study high-speed compressible flows at NASA Langley Research Center. The techniques discussed in this paper include nitric oxide (NO) molecular tagging velocimetry (MTV), nitrogen dioxide photodissociation (NO2-to-NO) MTV, and NO and atomic oxygen (O-atom) Doppler-shift-based velocimetry. Measurements of both single-component and two-component velocity have been performed using these techniques. This paper details the specific application and experiment for which each technique has been used, the facility in which the experiment was performed, the experimental setup, sample results, and a discussion of the lessons learned from each experiment.
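
    At their core, both families of techniques rest on simple kinematic relations: MTV infers velocity from the displacement of a tagged region over a known delay, while Doppler-shift velocimetry infers it from the spectral shift of a fluorescence line. A sketch with illustrative numbers (not from the NASA experiments):

```python
# Both measurement principles reduce to simple kinematic relations;
# the numbers below are illustrative, not from the experiments reviewed.

def mtv_velocity(displacement_m, delay_s):
    """Molecular tagging velocimetry: tagged-line displacement over a known delay."""
    return displacement_m / delay_s

def doppler_velocity(freq_shift_hz, rest_freq_hz, c=299792458.0):
    """Doppler-shift velocimetry: u = c * (frequency shift) / (rest frequency)."""
    return c * freq_shift_hz / rest_freq_hz

# A tagged line that moves 1.2 mm during a 1 microsecond delay: ~1200 m/s
print(mtv_velocity(1.2e-3, 1e-6))
```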

  1. Acceleration techniques for reduced-order models based on proper orthogonal decomposition

    SciTech Connect

    Cizmas, P.; Richardson, B.; Brenner, T.; O'Brien, T.; Breault, R.

    2008-01-01

    This paper presents several acceleration techniques for reduced-order models based on the proper orthogonal decomposition (POD) method. The techniques proposed herein are: (i) an algorithm for splitting the database of snapshots generated by the full-order model; (ii) a method for solving quasi-symmetrical matrices; (iii) a strategy for reducing the frequency of the projection. The acceleration techniques were applied to a POD-based reduced-order model of the two-phase flows in fluidized beds. This reduced-order model was developed using numerical results from a full-order computational fluid dynamics model of a two-dimensional fluidized bed. Using these acceleration techniques, the computational time of the POD model was two orders of magnitude shorter than that of the full-order model.
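
    For context, the POD basis that such reduced-order models project onto is typically computed from a singular value decomposition of the snapshot matrix. The sketch below shows only that textbook step; the paper's acceleration strategies (database splitting, quasi-symmetric solves, reduced projection frequency) are not reproduced:

```python
import numpy as np

# Minimal POD sketch: the basis is the left singular vectors of the
# snapshot matrix; the reduced model evolves coordinates in that basis.
def pod_basis(snapshots, r):
    """snapshots: (n_dof, n_snap) array; returns the first r POD modes."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))   # toy full-order snapshots
Phi = pod_basis(X, 3)                # orthonormal reduced basis (100 x 3)
a = Phi.T @ X                        # reduced (3 x 20) coordinates
X_rank3 = Phi @ a                    # best rank-3 reconstruction of X
```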

  2. Interferometric Dynamic Measurement: Techniques Based on High-Speed Imaging or a Single Photodetector

    PubMed Central

    Fu, Yu; Pedrini, Giancarlo

    2014-01-01

    In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured in any instant and the Nyquist sampling theorem has to be satisfied along the time axis on each measurement point. Two types of techniques were developed for such measurements: one is based on high-speed cameras and the other uses a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capturing rate, while the photodetector-based technology can only do the measurement on a single point. In this paper, several aspects of these two technologies are discussed. For the camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For the detector-based interferometry, the discussion mainly focuses on the single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the effort done by researchers for the improvement of the measurement capabilities using interferometry-based techniques to cover the requirements needed for the industrial applications. PMID:24963503
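
    The camera-based branch ultimately reduces to recovering a wrapped phase from recorded intensities. A generic example of such a phase-extraction algorithm (not one specific to this review) is the four-step phase-shifting formula:

```python
import math

# Four-step phase-shifting: intensities recorded at phase shifts
# 0, pi/2, pi, 3pi/2 yield the wrapped phase via an arctangent.
def four_step_phase(I1, I2, I3, I4):
    """Returns the wrapped phase in radians."""
    return math.atan2(I4 - I2, I1 - I3)

# Synthetic fringe I_k = A + B*cos(phi + k*pi/2) with true phase 0.7 rad:
I = [1.0 + 0.5 * math.cos(0.7 + k * math.pi / 2) for k in range(4)]
print(four_step_phase(*I))  # recovers 0.7
```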

  3. Interferometric dynamic measurement: techniques based on high-speed imaging or a single photodetector.

    PubMed

    Fu, Yu; Pedrini, Giancarlo; Li, Xide

    2014-01-01

    In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured in any instant and the Nyquist sampling theorem has to be satisfied along the time axis on each measurement point. Two types of techniques were developed for such measurements: one is based on high-speed cameras and the other uses a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capturing rate, while the photodetector-based technology can only do the measurement on a single point. In this paper, several aspects of these two technologies are discussed. For the camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For the detector-based interferometry, the discussion mainly focuses on the single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the effort done by researchers for the improvement of the measurement capabilities using interferometry-based techniques to cover the requirements needed for the industrial applications. PMID:24963503

  4. Computer-aided diagnosis in breast MRI based on unsupervised clustering techniques

    NASA Astrophysics Data System (ADS)

    Meyer-Baese, Anke; Wismueller, Axel; Lange, Oliver; Leinsinger, Gerda

    2004-04-01

    Exploratory data analysis techniques are applied to the segmentation of lesions in MRI mammography as a first step of a computer-aided diagnosis system. Three new unsupervised clustering techniques are tested on biomedical time-series representing breast MRI scans: fuzzy clustering based on deterministic annealing, "neural gas" network, and topographic independent component analysis. While the first two methods enable a correct segmentation of the lesion, the latter, although incorporating a topographic mapping, fails to detect and subclassify lesions.
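
    Of the three methods, the first is a variant of fuzzy clustering; a bare-bones standard fuzzy c-means on a 1-D toy signal conveys the mechanics (the deterministic-annealing variant, "neural gas", and topographic ICA are not reproduced):

```python
# Standard fuzzy c-means for two clusters on 1-D data (illustrative only).
def fuzzy_cmeans_2(xs, m=2.0, iters=50):
    centers = [min(xs), max(xs)]          # deterministic init for 2 clusters
    for _ in range(iters):
        U = []                            # fuzzy memberships u_ij
        for x in xs:
            d = [abs(x - v) + 1e-12 for v in centers]
            U.append([1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0)) for k in range(2))
                      for j in range(2)])
        # centers are membership-weighted means
        centers = [sum(u[j] ** m * x for u, x in zip(U, xs)) /
                   sum(u[j] ** m for u in U) for j in range(2)]
    return centers

print(fuzzy_cmeans_2([0.1, 0.2, 0.15, 5.0, 5.1, 4.9]))  # centers near 0.15 and 5.0
```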

  5. Novel anti-jamming technique for OCDMA network through FWM in SOA based wavelength converter

    NASA Astrophysics Data System (ADS)

    Jyoti, Vishav; Kaler, R. S.

    2013-06-01

    In this paper, we propose a novel anti-jamming technique for optical code division multiple access (OCDMA) networks based on four wave mixing (FWM) in a semiconductor optical amplifier (SOA) based wavelength converter. An OCDMA signal can be easily jammed by a high power jamming signal. It is shown that wavelength conversion through four wave mixing in an SOA improves resistance to jamming. It is observed that, using the proposed technique, the jammer has no effect on the OCDMA network even at high jamming powers.

  6. Extensive soft-sediment deformation and peperite formation at the base of a rhyolite lava: Owyhee Mountains, SW Idaho, USA

    NASA Astrophysics Data System (ADS)

    McLean, Charlotte E.; Brown, David J.; Rawcliffe, Heather J.

    2016-06-01

    In the Northern Owyhee Mountains (SW Idaho), a >200-m-thick flow of the Miocene Jump Creek Rhyolite was erupted on to a sequence of tuffs, lapilli tuffs, breccias and lacustrine siltstones of the Sucker Creek Formation. The rhyolite lava flowed over steep palaeotopography, resulting in the forceful emplacement of lava into poorly consolidated sediments. The lava invaded this sequence, liquefying and mobilising the sediment, propagating sediment subvertically in large metre-scale fluidal diapirs and sediment injectites. The heat and the overlying pressure of the thick Jump Creek Rhyolite extensively liquefied and mobilised the sediment resulting in the homogenization of the Sucker Creek Formation units, and the formation of metre-scale loading structures (simple and pendulous load casts, detached pseudonodules). Density contrasts between the semi-molten rhyolite and liquefied sediment produced highly fluidal Rayleigh-Taylor structures. Local fluidisation formed peperite at the margins of the lava and elutriation structures in the disrupted sediment. The result is a 30-40-m zone beneath the rhyolite lava of extremely deformed stratigraphy. Brittle failure and folding is recorded in more consolidated sediments, indicating a differential response to loading due to the consolidation state of the sediments. The lava-sediment interaction is interpreted as being a function of (1) the poorly consolidated nature of the sediments, (2) the thickness and heat retention of the rhyolite lava, (3) the density contrast between the lava and the sediment and (4) the forceful emplacement of the lava. This study demonstrates how large lava bodies have the potential to extensively disrupt sediments and form significant lateral and vertical discontinuities that complicate volcanic facies architecture.

  7. A NURBS-based technique for subject-specific construction of knee bone geometry.

    PubMed

    Au, Anthony G; Palathinkal, Darren; Liggins, Adrian B; Raso, V James; Carey, Jason; Lambert, Robert G; Amirfazli, A

    2008-10-01

    Subject-specific finite element (FE) models of bones that form the knee joint require rapid and accurate geometry construction. The present study introduces a semi-automatic non-uniform rational B-spline (NURBS) technique to construct knee bone geometries from computed tomography (CT) images using a combination of edge extraction and CAD surface generation. In particular, this technique accurately constructs endosteal surfaces and can accommodate thin cortical bone by estimating the cortical thickness from well-defined surrounding bone. A procedure is also introduced to overcome the bifurcation at the femoral condyles during surface generation by combining transverse and sagittal plane CT data. Available voxel- and NURBS-based subject-specific construction techniques accurately capture periosteal surfaces but are limited in their ability to capture endosteal geometry. In this study, the proposed NURBS-based technique and a typical voxel mesh technique captured periosteal surfaces within an order of magnitude of image resolution. The endosteum of diaphyseal bone was also captured with similar accuracy by both techniques. However, the voxel mesh model failed to accurately capture the metaphyseal and epiphyseal endosteum due to the poor CT contrast of thin cortical bone, resulting in gross overestimation of cortical thickness. The proposed technique considered both the local and global nature of CT images to arrive at a description of cortical bone thickness accurate to within 2 pixel lengths. PMID:18644314
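
    As background, part of the appeal of NURBS for anatomical contours is their rational (weighted) form, which can represent conic sections exactly. A minimal sketch of that underlying idea using a rational quadratic Bezier curve, a NURBS building block (illustrative only, not the paper's construction pipeline):

```python
import math

# A rational quadratic Bezier with weights (1, sqrt(2)/2, 1) traces an
# exact quarter circle -- the kind of smooth contour NURBS capture well.
def rational_bezier(t, pts, ws):
    n = len(pts) - 1
    def bern(i, t):
        return math.comb(n, i) * t**i * (1 - t)**(n - i)
    num_x = sum(w * p[0] * bern(i, t) for i, (p, w) in enumerate(zip(pts, ws)))
    num_y = sum(w * p[1] * bern(i, t) for i, (p, w) in enumerate(zip(pts, ws)))
    den = sum(w * bern(i, t) for i, w in enumerate(ws))
    return (num_x / den, num_y / den)

# Quarter circle of radius 1, control points (1,0), (1,1), (0,1):
pt = rational_bezier(0.5, [(1, 0), (1, 1), (0, 1)], [1, math.sqrt(2) / 2, 1])
print(math.hypot(*pt))  # the midpoint lies on the unit circle
```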

  8. Fourier transform image processing techniques for grid-based phase contrast imaging

    NASA Astrophysics Data System (ADS)

    Tahir, Sajjad; Bashir, Sajid; Petruccelli, Jonathan C.; MacDonald, C. A.

    2014-09-01

    A recently developed technique for phase imaging using table top sources is to use multiple fine-pitch gratings. However, the strict manufacturing tolerances and precise alignment required have limited the widespread adoption of grating-based techniques. In this work, we employ a technique recently demonstrated by Bennett et al.1 that utilizes a single grid of much coarser pitch. Phase is extracted using Fourier processing on a single raw image taken using a focused mammography grid. The effects on the final image of varying the grid, object, and detector distances, the window widths, and the windowing functions used to separate the harmonics were investigated.
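
    The Fourier processing referred to here follows the usual single-shot pattern: transform the raw grid image, isolate one harmonic with a window, and inverse-transform to obtain a complex image whose argument carries the phase. A synthetic sketch (window shape and sizes are illustrative, not the values studied in the paper):

```python
import numpy as np

# Isolate one harmonic of a grid image in the Fourier domain and recover
# the phase it carries (rectangular window; illustrative parameters).
def extract_harmonic(img, carrier, half_width):
    F = np.fft.fftshift(np.fft.fft2(img))
    ny, nx = img.shape
    cy, cx = ny // 2 + carrier[0], nx // 2 + carrier[1]
    win = np.zeros_like(F)
    win[cy - half_width:cy + half_width, cx - half_width:cx + half_width] = 1.0
    return np.fft.ifft2(np.fft.ifftshift(F * win))   # complex harmonic image

# Synthetic 8-pixel-pitch grid carrying a weak phase ramp of 0.05 rad/px:
y, x = np.mgrid[0:64, 0:64]
img = 1.0 + 0.5 * np.cos(2 * np.pi * x / 8 + 0.05 * x)
h = extract_harmonic(img, carrier=(0, 8), half_width=4)
row_phase = np.unwrap(np.angle(h[32, 16:48]))        # phase along one row
slope = np.polyfit(np.arange(32), row_phase, 1)[0]   # ~ 2*pi/8 + 0.05
```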

  9. Bandwidth-Tunable Fiber Bragg Gratings Based on UV Glue Technique

    NASA Astrophysics Data System (ADS)

    Fu, Ming-Yue; Liu, Wen-Feng; Chen, Hsin-Tsang; Chuang, Chia-Wei; Bor, Sheau-Shong; Tien, Chuen-Lin

    2007-07-01

    In this study, we have demonstrated that a uniform fiber Bragg grating (FBG) can be transformed into a chirped fiber grating by a simple UV glue adhesive technique without shifting the reflection band with respect to the center wavelength of the FBG. The technique is based on the induced strain of an FBG due to the UV glue adhesive force on the fiber surface that causes a grating period variation and an effective index change. This technique can provide a fast and simple method of obtaining the required chirp value of a grating for applications in dispersion compensators, gain flattening of erbium-doped fiber amplifiers (EDFAs), or optical filters.

  10. Efficient techniques for wave-based sound propagation in interactive applications

    NASA Astrophysics Data System (ADS)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. This spherical harmonic-based representation of source directivity can support analytical, data

  11. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm

    PubMed Central

    Kim, Mi Jeong; Maeng, Sung Joon; Cho, Yong Soo

    2015-01-01

    In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation. PMID:26225974
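
    As a rough intuition for distributed synchronization (not the paper's algorithm, which handles propagation delay and operates on OFDMA signals), a consensus-style update in which each node nudges its clock toward the average of its 1-hop neighbors already converges without any central reference:

```python
# Consensus-style clock-offset synchronization on a 4-node ring
# (toy model: propagation delay ignored, neighbor lists assumed).
def sync_step(offsets, neighbors, gain=0.5):
    """Each node moves toward the mean offset of its 1-hop neighbors."""
    return [t + gain * sum(offsets[n] - t for n in neighbors[i]) / len(neighbors[i])
            for i, t in enumerate(offsets)]

offsets = [0.0, 3.0, 7.0, 2.0]                        # initial clock offsets (ms)
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # 1-hop neighbor lists
for _ in range(50):
    offsets = sync_step(offsets, ring)
print(offsets)  # all four nodes settle at the network average, 3.0 ms
```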

  12. Non-linear control logics for vibrations suppression: a comparison between model-based and non-model-based techniques

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Orsini, Lorenzo; Resta, Ferruccio

    2015-04-01

    Non-linear behavior is present in many mechanical system operating conditions. In these cases, a common engineering practice is to linearize the equation of motion around a particular operating point and to design a linear controller. The main disadvantage is that the stability properties and validity of the controller are only local. In order to improve controller performance, non-linear control techniques represent a very attractive solution for many smart structures. The aim of this paper is to compare non-linear model-based and non-model-based control techniques. In particular, the model-based sliding-mode-control (SMC) technique is considered because of its easy implementation and the strong robustness of the controller even under heavy model uncertainties. Among the non-model-based control techniques, fuzzy control (FC), which allows the controller to be designed according to if-then rules, has been considered. It defines the controller without a reference model of the system, offering advantages such as intrinsic robustness. Both techniques have been tested on a nonlinear pendulum system.

  13. Characterization of high resolution MR images reconstructed by a GRAPPA based parallel technique

    NASA Astrophysics Data System (ADS)

    Banerjee, Suchandrima; Majumdar, Sharmila

    2006-03-01

    This work implemented an auto-calibrating parallel imaging technique and applied it to in vivo magnetic resonance imaging (MRI) of trabecular bone micro-architecture. A Generalized auto-calibrating partially parallel acquisition (GRAPPA) based reconstruction technique using modified robust data fitting was developed. The MR data was acquired with an eight channel phased array receiver on three normal volunteers on a General Electric 3 Tesla scanner. Microstructures comprising the trabecular bone architecture are of the order of 100 microns and hence their depiction requires very high imaging resolution. This work examined the effects of GRAPPA based parallel imaging on signal and noise characteristics and effective spatial resolution in high resolution (HR) images, for the range of undersampling or reduction factors 2-4. Additionally, quantitative analysis was performed to obtain structural measures of trabecular bone from the images. Image quality in terms of contrast and depiction of structures was maintained in parallel images for reduction factors up to 3. Comparison between regular and parallel images suggested similar spatial resolution for both. However, differences in noise characteristics in parallel images compared to regular images affected the thresholding-based quantification. This suggested that GRAPPA based parallel images might require different analysis techniques. In conclusion, the study showed the feasibility of using parallel imaging techniques in HR-MRI of trabecular bone, although quantification strategies will have to be further investigated. Reduction of acquisition time using parallel techniques can improve the clinical feasibility of MRI of trabecular bone for prognosis and staging of the skeletal disorder osteoporosis.

  14. Microcapsule-based techniques for improving the safety of lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Baginska, Marta

    Lithium-ion batteries are vital energy storage devices due to their high specific energy density, lack of memory effect, and long cycle life. While they are predominantly used in small consumer electronics, new strategies for improving battery safety and lifetime are critical to the successful implementation of high-capacity, fast-charging materials required for advanced Li-ion battery applications. Currently, the presence of a volatile, combustible electrolyte and an oxidizing agent (Lithium oxide cathodes) make the Li-ion cell susceptible to fire and explosions. Thermal overheating, electrical overcharging, or mechanical damage can trigger thermal runaway, and if left unchecked, combustion of battery materials. To improve battery safety, autonomic, thermally-induced shutdown of Li-ion batteries is demonstrated by depositing thermoresponsive polymer microspheres onto battery anodes. When the internal temperature of the cell reaches a critical value, the microspheres melt and conformally coat the anode and/or separator with an ion insulating barrier, halting Li-ion transport and shutting down the cell permanently. Charge and discharge capacity is measured for Li-ion coin cells containing microsphere-coated anodes or separators as a function of capsule coverage. Scanning electron microscopy images of electrode surfaces from cells that have undergone autonomic shutdown provides evidence of melting, wetting, and re-solidification of polyethylene (PE) into the anode and polymer film formation at the anode/separator interface. As an extension of this autonomic shutdown approach, a particle-based separator capable of performing autonomic shutdown, but which reduces the shorting hazard posed by current bi- and tri-polymer commercial separators, is presented. This dual-particle separator is composed of hollow glass microspheres acting as a physical spacer between electrodes, and PE microspheres to impart autonomic shutdown functionality. An oil-immersion technique is

  15. Stabilizing operation point technique based on the tunable distributed feedback laser for interferometric sensors

    NASA Astrophysics Data System (ADS)

    Mao, Xuefeng; Zhou, Xinlei; Yu, Qingxu

    2016-02-01

    We describe a stabilizing operation point technique based on a tunable Distributed Feedback (DFB) laser for quadrature demodulation of interferometric sensors. By introducing automatic quadrature-point locking and periodic wavelength-tuning compensation into an interferometric system, the operation point of the system is stabilized even when it suffers various environmental perturbations. To demonstrate the feasibility of this stabilizing operation point technique, experiments were performed using a tunable DFB laser as the light source to interrogate an extrinsic Fabry-Perot interferometric vibration sensor and a diaphragm-based acoustic sensor. Experimental results show that good tracking of the Q-point was effectively realized.
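
    The reason a stabilized operation point matters can be seen from the fringe model itself: for a two-beam interference signal the sensitivity to small phase perturbations peaks at the quadrature point and vanishes at the fringe extrema. A one-line illustration:

```python
import math

# For a fringe I = A + B*cos(phi), sensitivity to a small phase change
# is the fringe slope dI/dphi, which is zero at the extrema and maximal
# at the quadrature (Q) point phi = pi/2.
def fringe_sensitivity(phi, B=1.0):
    return -B * math.sin(phi)   # dI/dphi

print(abs(fringe_sensitivity(0.0)))          # 0.0 at a fringe peak (dead zone)
print(abs(fringe_sensitivity(math.pi / 2)))  # 1.0 at the Q-point (maximum)
```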

  16. Chaotic Extension Neural Network Theory-Based XXY Stage Collision Fault Detection Using a Single Accelerometer Sensor

    PubMed Central

    Hsieh, Chin-Tsung; Yau, Her-Terng; Wu, Shang-Yi; Lin, Huo-Cheng

    2014-01-01

    The collision fault detection of a XXY stage is proposed for the first time in this paper. The stage characteristic signals are extracted and imported into the master and slave chaos error systems by signal filtering from the vibratory magnitude of the stage. The trajectory diagram is made from the chaos synchronization dynamic error signals E1 and E2. The distance between characteristic positive and negative centers of gravity, as well as the maximum and minimum distances of the trajectory diagram, are captured as the characteristics of fault recognition by observing the variation in various signal trajectory diagrams. The matter-element model of normal status and collision status is built by an extension neural network. The correlation grade of various fault statuses of the XXY stage was calculated for diagnosis. dSPACE is used for real-time analysis of stage fault status with an accelerometer sensor. Three stage fault statuses are detected in this study, including normal status, Y collision fault and X collision fault. It is shown that the scheme achieves at least a 75% diagnosis rate for collision faults of the XXY stage. As a result, the fault diagnosis system can be implemented using just one sensor, and consequently the hardware cost is significantly reduced. PMID:25405512

  17. An Extension of the Athena++ Code Framework for GRMHD Based on Advanced Riemann Solvers and Staggered-mesh Constrained Transport

    NASA Astrophysics Data System (ADS)

    White, Christopher J.; Stone, James M.; Gammie, Charles F.

    2016-08-01

    We present a new general relativistic magnetohydrodynamics (GRMHD) code integrated into the Athena++ framework. Improving upon the techniques used in most GRMHD codes, ours allows the use of advanced, less diffusive Riemann solvers, in particular HLLC and HLLD. We also employ a staggered-mesh constrained transport algorithm suited for curvilinear coordinate systems in order to maintain the divergence-free constraint of the magnetic field. Our code is designed to work with arbitrary stationary spacetimes in one, two, or three dimensions, and we demonstrate its reliability through a number of tests. We also report on its promising performance and scalability.

  18. A comparison of model-based and hyperbolic localization techniques as applied to marine mammal calls

    NASA Astrophysics Data System (ADS)

    Tiemann, Christopher O.; Porter, Michael B.

    2003-10-01

    A common technique for the passive acoustic localization of singing marine mammals is that of hyperbolic fixing. This technique assumes straight-line, constant wave speed acoustic propagation to associate travel time with range, but in some geometries, these assumptions can lead to localization errors. A new localization algorithm based on acoustic propagation models can account for waveguide and multipath effects, and it has successfully been tested against real acoustic data from three different environments (Hawaii, California, and Bahamas) and three different species (humpback, blue, and sperm whales). Accuracy of the model-based approach has been difficult to verify given the absence of concurrent visual and acoustic observations of the same animal. However, the model-based algorithm was recently exercised against a controlled source of known position broadcasting recorded whale sounds, and location estimates were then compared to hyperbolic techniques and true source position. In geometries where direct acoustic paths exist, both model-based and hyperbolic techniques perform equally well. However, in geometries where bathymetric and refractive effects are important, such as at long range, the model-based approach shows improved accuracy.
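
    As a baseline for comparison, hyperbolic fixing itself is simple to sketch: under the straight-line, constant-sound-speed assumption each time difference of arrival (TDOA) constrains the source to a hyperbola, and a brute-force grid search finds the position most consistent with all of them. A toy 2-D sketch with assumed receiver positions (not the paper's data):

```python
import math

# Classical hyperbolic fixing: straight-line propagation at a constant
# sound speed, solved here by grid search over a 2-D region.
def hyperbolic_fix(receivers, tdoas, c=1500.0):
    ref = receivers[0]
    best, best_err = None, float("inf")
    for gx in range(0, 801, 10):          # assumed search region, 10 m grid
        for gy in range(0, 801, 10):
            d0 = math.hypot(gx - ref[0], gy - ref[1])
            err = sum((math.hypot(gx - rx, gy - ry) - d0 - c * t) ** 2
                      for (rx, ry), t in zip(receivers[1:], tdoas))
            if err < best_err:
                best, best_err = (gx, gy), err
    return best

rec = [(0, 0), (800, 0), (0, 800), (800, 800)]        # hydrophone positions (m)
src = (300, 450)                                      # true source, for the demo
dist = [math.hypot(src[0] - x, src[1] - y) for x, y in rec]
tdoas = [(d - dist[0]) / 1500.0 for d in dist[1:]]    # time differences of arrival
print(hyperbolic_fix(rec, tdoas))  # recovers (300, 450)
```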

  19. Esthetic Craniofacial Bony and Skull Base Reconstruction Using Flap Wrapping Technique.

    PubMed

    Yano, Tomoyuki; Suesada, Nobuko; Usami, Satoshi

    2016-07-01

    For a safe and esthetic skull base reconstruction combined with repair of craniofacial bone defects, the authors introduce the flap wrapping technique in this study. This technique consists of skull base reconstruction using the vastus lateralis muscle of an anterolateral thigh (ALT) free flap, and structural craniofacial bony reconstruction using an autologous calvarial bone graft. The key to this technique is that all of the grafted autologous bone is wrapped with the vascularized fascia of the ALT free flap to protect the grafted bone from infection and exposure. Two anterior skull base tumors combined with craniofacial bony defects were included in this study. The subjects were a man and a woman, aged 18 and 64. Both patients had preoperative proton beam therapy. First, the skull base defect was filled with vastus lateralis muscle, and then structural reconstruction was performed with an autologous bone graft and a fabricated inner layer of calvarial bone, and then the grafted bone was completely wrapped in the vascularized fascia of the ALT free flap. By applying this technique, there was no intracranial infection or grafted bone exposure in these 2 patients postoperatively, even though both patients had preoperative proton beam therapy. Additionally, the vascularized fascia wrapped bone graft could provide a natural contour and prevent collapse of the craniofacial region, and this gives patients a better facial appearance even though they have had skull base surgery. PMID:27300454

  20. Reduction of large set data transmission using algorithmically corrected model-based techniques for bandwidth efficiency

    NASA Astrophysics Data System (ADS)

    Khair, Joseph Daniel

    Communication requirements and demands on deployed systems are increasing daily. This increase is due to the desire for more capability, but also, due to the changing landscape of threats on remote vehicles. As such, it is important that we continue to find new and innovative ways to transmit data to and from these remote systems, consistent with this changing landscape. Specifically, this research shows that data can be transmitted to a remote system effectively and efficiently with a model-based approach using real-time updates, called Algorithmically Corrected Model-based Technique (ACMBT), resulting in substantial savings in communications overhead. To demonstrate this model-based data transmission technique, a hardware-based test fixture was designed and built. Execution and analysis software was created to perform a series of characterizations demonstrating the effectiveness of the new transmission method. The new approach was compared to a traditional transmission approach in the same environment, and the results were analyzed and presented. A Figure of Merit (FOM) was devised and presented to allow standardized comparison of traditional and proposed data transmission methodologies alongside bandwidth utilization metrics. The results of this research have successfully shown the model-based technique to be feasible. Additionally, this research has opened the trade space for future discussion and implementation of this technique.

  1. Comparative study of manual liquid-based cytology (MLBC) technique and direct smear technique (conventional) on fine-needle cytology/fine-needle aspiration cytology samples

    PubMed Central

    Pawar, Prajkta Suresh; Gadkari, Rasika Uday; Swami, Sunil Y.; Joshi, Anil R.

    2014-01-01

    Background: Liquid-based cytology technique enables cells to be suspended in a liquid medium and spread in a monolayer, making for better morphological assessment. Automated techniques have been widely used, but are limited by cost and availability. Aim: The aim was to establish a manual liquid-based cytology (MLBC) technique on fine-needle aspiration cytology (FNAC) material and compare its results with the conventional technique. Materials and Methods: In this study, we examined cells trapped in the needle hubs used for the collection of FNAC samples. Fifty cases were examined by the MLBC technique and compared with the conventional FNAC technique. By centrifugation, sediment was obtained and an imprint was taken on a defined area. Papanicolaou (Pap) and May-Grünwald Giemsa (MGG) staining was done. Direct smears and MLBC smears were compared for cellularity, background, cellular preservation, and nuclear preservation. Slides were diagnosed independently by two cytologists with more than 5 years’ experience. Standard error of proportion was used for statistical analysis. Results: Cellularity was low in MLBC as compared with conventional smears, which is expected as remnant material in the needle hub was used. Nuclei overlapped to a lesser extent and hemorrhage and necrosis were reduced, so cell morphology could be better studied with the MLBC technique. The P value obtained was <0.05. Conclusion: This MLBC technique gives results comparable to the conventional technique with better morphology. In a setup where aspirators are learners, this technique will ensure adequacy, as the remnant in the needle hub is processed. PMID:25210235

  2. A Load-Based Multiple-Partial Unloading Micro-Indentation Technique for Mechanical Property Evaluation

    SciTech Connect

    C. Feng; J.M. Tannenbaum; B.S. Kang; M.A. Alvin

    2009-07-23

    A load-based multiple-partial unloading microindentation technique has been developed for evaluating mechanical properties of materials. Compared to the current prevailing nano/micro-indentation methods, which require precise measurements of the indentation depth and load, the proposed technique only measures the indentation load and the overall indentation displacement (i.e., including displacement of the loading apparatus). Coupled with a multiple-partial unloading procedure during the indentation process, this technique results in a load-depth sensing indentation system capable of determining the Young’s modulus of metallic alloys with flat, tubular, or curved architectures. Test results show consistent and correct elastic modulus values when performing indentation tests on standard alloys such as steel, aluminum, bronze, and single crystal superalloys. The proposed micro-indentation technique has led to the development of a portable load-depth sensing indentation system capable of on-site, in-situ material property measurement.
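
    The elastic-modulus extraction behind such load-depth sensing rests on separating apparatus compliance from contact compliance and applying the Sneddon relation. A minimal sketch with assumed values (the paper's load-based formulation is not reproduced):

```python
import math

# Measured unloading compliance = machine compliance + contact compliance;
# the contact stiffness then gives a reduced modulus via Sneddon's relation
# E_r = (sqrt(pi)/2) * S / sqrt(A). All numbers below are assumed.
def reduced_modulus(total_stiffness, contact_area, machine_compliance):
    contact_stiffness = 1.0 / (1.0 / total_stiffness - machine_compliance)
    return (math.sqrt(math.pi) / 2.0) * contact_stiffness / math.sqrt(contact_area)

# S_total = 5e5 N/m, machine compliance 2e-7 m/N, contact area 1e-9 m^2:
Er = reduced_modulus(5e5, 1e-9, 2e-7)
print(Er / 1e9)  # reduced modulus in GPa, ~15.6
```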

  3. Comparison of image compression techniques for high quality based on properties of visual perception

    NASA Astrophysics Data System (ADS)

    Algazi, V. Ralph; Reed, Todd R.

    1991-12-01

    The growing interest and importance of high quality imaging has several roots: Imaging and graphics, or more broadly multimedia, as the predominant means of man-machine interaction on computers, and the rapid maturing of advanced television technology. Because of their economic importance, proposed advanced television standards are being discussed and evaluated for rapid adoption. These advanced standards are based on well known image compression techniques, used for very low bit rate video communications as well. In this paper, we examine the expected improvement in image quality that advanced television and imaging techniques should bring about. We then examine and discuss the data compression techniques which are commonly used, to determine if they are capable of providing the achievable gain in quality, and to assess some of their limitations. We also discuss briefly the potential of these techniques for very high quality imaging and display applications, which extend beyond the range of existing and proposed television standards.

  4. A damage identification technique based on embedded sensitivity analysis and optimization processes

    NASA Astrophysics Data System (ADS)

    Yang, Chulho; Adams, Douglas E.

    2014-07-01

    A vibration-based structural damage identification method, using embedded sensitivity functions and optimization algorithms, is discussed in this work. The embedded sensitivity technique requires only measured or calculated frequency response functions to obtain the sensitivity of system responses to each component parameter. Therefore, this sensitivity analysis technique can be used effectively in the damage identification process. Optimization techniques are used to minimize the difference between the measured frequency response functions of the damaged structure and those calculated from the baseline system using embedded sensitivity functions. The amount of damage can be quantified directly in engineering units as changes in stiffness, damping, or mass. Various factors in the optimization process and structural dynamics are studied to enhance the performance and robustness of the damage identification process. This study shows that the proposed technique can improve the accuracy of damage identification, with an estimation error of less than 2 percent.
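
    The embedded-sensitivity idea can be illustrated on a small lumped-parameter model. The sketch below is a minimal illustration, not the authors' code: the two-DOF spring-mass chain, frequency grid, and damping value are all assumptions. It builds receptance FRFs, forms the embedded sensitivity of H11 with respect to the connecting spring k2 from baseline FRFs alone, and estimates the stiffness change by least squares over frequency:

```python
def frf_2dof(k1, k2, m1=1.0, m2=1.0, c=0.01, freqs=()):
    """Receptance FRFs H(w) = (K(1+ic) - w^2 M)^(-1) for a 2-DOF chain.

    Returns a list of 2x2 complex matrices, one per frequency (rad/s).
    """
    H = []
    for w in freqs:
        z11 = (k1 + k2) * (1 + 1j * c) - w ** 2 * m1
        z12 = -k2 * (1 + 1j * c)
        z22 = k2 * (1 + 1j * c) - w ** 2 * m2
        det = z11 * z22 - z12 * z12
        H.append(((z22 / det, -z12 / det), (-z12 / det, z11 / det)))
    return H

# frequencies kept below the first resonance so the first-order
# sensitivity approximation stays accurate
freqs = [0.2 + 0.05 * i for i in range(27)]
k1, k2, dk_true = 10.0, 10.0, -0.2           # damage: 2 % loss in spring k2
H_base = frf_2dof(k1, k2, freqs=freqs)
H_dam = frf_2dof(k1, k2 + dk_true, freqs=freqs)

num, den = 0 + 0j, 0.0
for hb, hd in zip(H_base, H_dam):
    # embedded sensitivity of H11 w.r.t. k2, built from baseline FRFs only:
    # dH11/dk2 = -(H11 - H12) * (H11 - H21)
    S = -(hb[0][0] - hb[0][1]) * (hb[0][0] - hb[1][0])
    dH = hd[0][0] - hb[0][0]                 # measured FRF change
    num += S.conjugate() * dH                # complex least squares over band
    den += abs(S) ** 2

dk_est = (num / den).real                    # damage in engineering units (N/m)
```

    The estimate lands within a few percent of the imposed stiffness change, which mirrors how the method quantifies damage directly as a change in stiffness rather than as an abstract index.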

  5. Evaluation of paint coating thickness variations based on pulsed Infrared thermography laser technique

    NASA Astrophysics Data System (ADS)

    Mezghani, S.; Perrin, E.; Vrabie, V.; Bodnar, J. L.; Marthe, J.; Cauwe, B.

    2016-05-01

    In this paper, a pulsed infrared thermography technique using homogeneous heating provided by a laser source is used for the non-destructive evaluation of paint coating thickness variations. Firstly, numerical simulations of the thermal response of a paint-coated sample are performed. By analyzing the thermal responses as a function of the thermal properties and thickness of both the coating and substrate layers, optimal excitation parameters of the heating source are determined. Two characteristic parameters were studied with respect to paint coating thickness variations. Results obtained using an experimental test bench based on the pulsed infrared thermography laser technique are compared with those given by a classical eddy current technique for paint coating variations from 5 to 130 μm. These results demonstrate the efficiency of this approach and suggest that the pulsed infrared thermography technique offers good prospects for characterizing the heterogeneity of paint coatings on large-scale samples with other heating sources.

  6. Scatterometry based 65nm node CDU analysis and prediction using novel reticle measurement technique

    NASA Astrophysics Data System (ADS)

    van Ingen Schenau, Koen; Vanoppen, Peter; van der Laan, Hans; Kiers, Ton; Janssen, Maurice

    2005-05-01

    Scatterometry was selected as CD metrology for the 65nm CDU system qualification. Because of the dominant reticle residuals component in the 65nm CD budget for dense lines, significant improvements in reticle CD metrology were required. SEM is an option but requires extensive measurements due to the scatterometry grating modules. Therefore a new technique was developed and called SERUM (Spot sensor Enabled Reticle Uniformity Measurements). It uses the on board exposure system metrology sensors to measure transmission that is converted to reticle CD. It has the advantage that an entire reticle is measured within two minutes with good repeatability. The reticle fingerprints correlate well to the SEM measurements. With the improvements in reticle CD metrology offered by SEM and SERUM the reticle residuals component no longer dominates the 65nm budget for CDU system qualification.

  7. Mobility Based Key Management Technique for Multicast Security in Mobile Ad Hoc Networks

    PubMed Central

    Madhusudhanan, B.; Chitra, S.; Rajan, C.

    2015-01-01

    In MANET multicasting, maintaining forward and backward secrecy results in an increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead, which increases energy consumption and end-to-end delay. In particular, the prevailing group key management techniques cope poorly with frequent mobility and disconnections. So there is a need to design a multicast key management technique that overcomes these problems. In this paper, we propose a mobility-based key management technique for multicast security in MANET. Initially, the nodes are categorized according to their stability index, which is estimated based on link availability and mobility. A multicast tree is constructed such that every weak node has a strong parent node. A session key-based encryption technique is utilized to transmit multicast data. The rekeying process is performed periodically by the initiator node. The rekeying interval is fixed depending on the node category, so this technique greatly minimizes the rekeying overhead. Through simulation results, we show that our proposed approach reduces the packet drop rate and improves data confidentiality. PMID:25834838
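
    The node-categorization step can be sketched as follows. The weighting, threshold, and interval values below are illustrative assumptions for the sketch, not taken from the paper:

```python
def stability_index(link_availability, speed, speed_max, w=0.5):
    """Combine link availability (0..1) and inverse normalised speed into a
    stability score in [0, 1]; higher means a more stable (strong) node."""
    return w * link_availability + (1 - w) * (1.0 - speed / speed_max)

def rekey_interval(index, base_interval=10.0, strong_threshold=0.6):
    """Strong nodes tolerate a longer rekeying interval, which is how the
    category-dependent interval cuts rekeying overhead."""
    return 2 * base_interval if index >= strong_threshold else base_interval

# a slow node with reliable links scores high and rekeys less often
idx = stability_index(link_availability=0.9, speed=2.0, speed_max=20.0)
interval = rekey_interval(idx)
```
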

  8. Web image retrieval using an effective topic and content-based technique

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Cheng; Prabhakara, Rashmi

    2005-03-01

    There has been exponential growth in the amount of image data available on the World Wide Web since the early development of the Internet. With such a large amount of information and imagery available, and given its usefulness, an effective image retrieval system is greatly needed. In this paper, we present an effective approach with both image matching and indexing techniques that improves on existing integrated image retrieval methods. The technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. In the first phase, topic-based image retrieval is performed using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. This technique consists of a focused crawler that allows the user to enter not only the keyword for the topic-based search but also the scope in which the user wants to find the images. In the second phase, we use query-by-example specification to perform a low-level content-based image match in order to retrieve a smaller set of results closer to the example image. From this, information related to the image features is automatically extracted from the query image. The main objective of our approach is to develop a functional image search and indexing technique and to demonstrate that better retrieval results can be achieved.

  9. Three-dimensional region-based adaptive image processing techniques for volume visualization applications

    NASA Astrophysics Data System (ADS)

    de Deus Lopes, Roseli; Zuffo, Marcelo K.; Rangayyan, Rangaraj M.

    1996-04-01

    Recent advances in three-dimensional (3D) imaging techniques have expanded the scope of applications of volume visualization to many areas such as medical imaging, scientific visualization, robotic vision, and virtual reality. Advanced image filtering, enhancement, and analysis techniques are being developed in parallel in the field of digital image processing. Although the fields cited have many aspects in common, it appears that many of the latest developments in image processing are not being applied to the fullest extent possible in visualization. It is common to encounter rather simple and elementary image pre-processing operations in visualization and 3D imaging applications. The purpose of this paper is to present an overview of selected topics from recent developments in adaptive image processing and to demonstrate or suggest their applications in volume visualization. The techniques include adaptive noise removal; improvement of contrast and visibility of objects; space-variant deblurring and restoration; segmentation-based lossless coding for data compression; and perception-based measures for analysis, enhancement, and rendering. The techniques share the common base of identifying adaptive regions by region growing, which lends them a perceptual basis related to the human visual system. Preliminary results obtained with some of the techniques implemented so far are used to illustrate the concepts involved and to indicate the potential performance capabilities of the methods.
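
    The region-growing step underlying these adaptive techniques can be sketched as below. This is a minimal illustration on a hypothetical grayscale image; the acceptance rule, comparing each candidate pixel to the running region mean, is one simple choice among many:

```python
def region_grow(img, seed, tol):
    """Adaptive region growing: starting from a seed pixel, repeatedly add
    4-connected neighbours whose value lies within tol of the running
    region mean."""
    h, w = len(img), len(img[0])
    region = {seed}
    total = img[seed[0]][seed[1]]
    stack = [seed]
    while stack:
        y, x = stack.pop()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in region:
                if abs(img[ny][nx] - total / len(region)) <= tol:
                    region.add((ny, nx))
                    total += img[ny][nx]
                    stack.append((ny, nx))
    return region

# toy image: a flat dark object (value 10) next to a bright background (50)
img = [[10, 10, 10, 50, 50, 50],
       [10, 10, 10, 50, 50, 50],
       [10, 10, 10, 50, 50, 50],
       [10, 10, 10, 50, 50, 50]]
grown = region_grow(img, seed=(0, 0), tol=5)
print(len(grown))   # 12 -> exactly the dark object's pixels
```
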

  10. Spatial and temporal variation of bulk snow properties in northern boreal and tundra environments based on extensive field measurements

    NASA Astrophysics Data System (ADS)

    Hannula, Henna-Reetta; Lemmetyinen, Juha; Kontu, Anna; Derksen, Chris; Pulliainen, Jouni

    2016-08-01

    An extensive in situ data set of snow depth, snow water equivalent (SWE), and snow density collected in support of the European Space Agency (ESA) SnowSAR-2 airborne campaigns in northern Finland during the winter of 2011-2012 is presented (ESA Earth Observation Campaigns data 2000-2016). The suitability of the in situ measurement protocol to provide an accurate reference for the simultaneous airborne SAR (synthetic aperture radar) data products over different land cover types was analysed in the context of spatial scale, sample spacing, and uncertainty. The analysis was executed by applying autocorrelation analysis and root mean square difference (RMSD) error estimation. The results showed overall higher variability for all three bulk snow parameters over tundra, open bogs, and lakes (due to wind processes); however, snow depth tended to vary over shorter distances in forests (due to snow-vegetation interactions). Sample spacing/sample size had a statistically significant effect on the mean snow depth over all land cover types. Analysis executed for 50, 100, and 200 m transects revealed that in most cases fewer than five samples were adequate to describe the mean snow depth with RMSD < 5 %, but for land cover with high overall variability the results indicated that a sample size 1.5-3 times larger was needed, depending on the scale and the desired maximum RMSD. Errors for most of the land cover types reached ˜ 10 % if only three measurements were considered. The collected measurements, which are available via the ESA website upon registration, compose an exceptionally large manually collected snow data set in Scandinavian taiga and tundra environments. This information represents a valuable contribution to the snow research community and can be applied to various snow studies.
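
    The sample-spacing analysis can be mimicked on a synthetic transect. The sketch below is illustrative only: the 200-point transect with a sinusoidal drift pattern is an assumption, not campaign data. It estimates the RMSD of the transect-mean snow depth as a function of the number of evenly spaced samples:

```python
import math

def subsample_rmsd(depths, n):
    """RMSD (%) of the mean of n evenly spaced samples relative to the
    full-transect mean, averaged over every possible starting offset."""
    full_mean = sum(depths) / len(depths)
    step = len(depths) // n
    sq_errs = []
    for start in range(step):
        sub = depths[start::step][:n]
        sq_errs.append((sum(sub) / len(sub) - full_mean) ** 2)
    return 100.0 * math.sqrt(sum(sq_errs) / len(sq_errs)) / full_mean

# synthetic transect: 80 cm mean depth with a wind-drift-like undulation
depths = [80.0 + 10.0 * math.sin(2 * math.pi * i / 50) for i in range(200)]
# sparse sampling (~9 % error here) vs dense sampling (error near zero)
sparse, dense = subsample_rmsd(depths, 4), subsample_rmsd(depths, 20)
```

    On this transect a handful of samples can miss or oversample the drift pattern, while denser, evenly spread sampling averages it out, which is the qualitative trade-off the transect analysis quantifies.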

  11. Use of extensively hydrolysed formula for refeeding neonates postnecrotising enterocolitis: a nationwide survey-based, cross-sectional study

    PubMed Central

    Lapillonne, Alexandre; Matar, Maroun; Adleff, Ariane; Chbihi, Marwa; Kermorvant-Duchemin, Elsa; Campeotto, Florence

    2016-01-01

    Objective To evaluate the prevalence of and reasons for using extensively hydrolysed formulas (EHFs) of cow's milk proteins in the French neonatal units as well as the modality of their prescription for refeeding infants recovering from necrotising enterocolitis (NEC). Methods A multicentre nationwide cross-sectional study using a questionnaire to address the prevalence of use and the reasons for prescribing EHF in hospitalised neonates and to examine the protocols and the actual reasons for their use for refeeding infants in recovery from NEC. The questionnaire was sent to only 1 senior neonatologist in each neonatal unit included in the study. Results More than half of the French neonatal units participated in the survey. 91% of the surveyed units used EHF. Of 1969 infants hospitalised on the day the survey was run, 12% were fed on an EHF. 11% of the EHF prescriptions were due to previous NEC. The main reasons for using an EHF to feed infants post-NEC were the absence of human milk (75%) and surgical management of NEC (17%). When given, EHF was mainly prescribed for a period varying between 15 days and 3 months. None of the involved units continued using the EHF after 6 months of age. More than half of the surveyed units acknowledged hospitalising infants for the initiation of weaning EHF but only 21% of them tested these infants for cow's milk allergy. Conclusions The prevalence of EHF use in the French neonatal units is high. Refeeding infants post-NEC is one of the main reasons for such a high prevalence. The main incentive for using an EHF is the absence of human breast milk, either maternal or donor. PMID:27388344

  12. DCT-Yager FNN: a novel Yager-based fuzzy neural network with the discrete clustering technique.

    PubMed

    Singh, A; Quek, C; Cho, S Y

    2008-04-01

    superior performance. Extensive experiments have been conducted to test the effectiveness of these two networks, using various clustering algorithms. It follows that the SDCT and UDCT clustering algorithms are particularly suited to networks based on the Yager inference rule. PMID:18390309

  13. Extensive Reading Coursebooks in China

    ERIC Educational Resources Information Center

    Renandya, Willy A.; Hu, Guangwei; Xiang, Yu

    2015-01-01

    This article reports on a principle-based evaluation of eight dedicated extensive reading coursebooks published in mainland China and used in many universities across the country. The aim is to determine the extent to which these coursebooks reflect a core set of nine second language acquisition and extensive reading principles. Our analysis shows…

  14. Novel technique for distributed fibre sensing based on coherent Rayleigh scattering measurements of birefringence

    NASA Astrophysics Data System (ADS)

    Lu, Xin; Soto, Marcelo A.; Thévenaz, Luc

    2016-05-01

    A novel distributed fibre sensing technique is described and experimentally validated, based on birefringence measurements using coherent Rayleigh scattering. It natively provides distributed measurements of temperature and strain with more than an order of magnitude higher sensitivity than Brillouin sensing, while requiring access to a single fibre end. Unlike traditional Rayleigh-based coherent optical time-domain reflectometry, this new method provides absolute measurements of the measurand and may lead to robust discrimination between temperature and strain in combination with another technique. Since birefringence is purposely induced in the fibre by design, large degrees of freedom are available to optimize and scale the sensitivity to a given quantity. The technique has been validated in two radically different types of birefringent fibre - elliptical-core and Panda polarization-maintaining fibres - with good repeatability.

  15. A general technique for computing evolutionarily stable strategies based on errors in decision-making.

    PubMed

    McNamara, J M; Webb, J N; Collins, E J; Székely, T; Houston, A I

    1997-11-21

    Realistic models of contests between animals will often involve a series of state-dependent decisions by the contestants. Computation of evolutionarily stable strategies for such state-dependent dynamic games is usually based on damped iterations of the best response map. Typically this map is discontinuous, so iterations may not converge, and even if they do converge it may not be clear whether the limiting strategy is a Nash equilibrium. We present a general computational technique based on errors in decision-making that removes these computational difficulties. We show that the computational technique works for a simple example (the Hawk-Dove game) where an analytic solution is known, and prove general results about the technique for more complex games. It is also argued that there is biological justification for the inclusion of the types of errors we have introduced. PMID:9405138
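
    For the Hawk-Dove example, the error-based smoothing can be sketched with a logit response in place of the discontinuous best-response map. The payoff values V = 2, C = 4 and the error parameter beta are illustrative choices; with V/C = 0.5 the smoothed fixed point coincides exactly with the analytic ESS p* = V/C:

```python
import math

def smoothed_ess_hawk_dove(V=2.0, C=4.0, beta=1.0, n_iter=500):
    """Damped iteration of an error-prone (logit) best response for the
    Hawk-Dove game.  p is the population frequency of Hawk; the analytic
    ESS is p* = V/C."""
    p = 0.9
    for _ in range(n_iter):
        pi_hawk = p * (V - C) / 2 + (1 - p) * V   # expected payoff to Hawk
        pi_dove = (1 - p) * V / 2                 # expected payoff to Dove
        # decision errors: choose Hawk with a logit probability instead of
        # the discontinuous argmax, which restores convergence
        target = 1.0 / (1.0 + math.exp(-beta * (pi_hawk - pi_dove)))
        p = 0.5 * p + 0.5 * target                # damping
    return p

p_star = smoothed_ess_hawk_dove()
print(round(p_star, 4))   # 0.5 -> the analytic ESS p* = V/C
```

    Because the logit map is continuous and (with damping) contractive here, the iteration converges where naive best-response iteration would oscillate between pure Hawk and pure Dove.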

  16. Surface analysis of cast aluminum by means of artificial vision and AI-based techniques

    NASA Astrophysics Data System (ADS)

    Platero, Carlos; Fernandez, Carlos; Campoy, Pascual; Aracil, Rafael

    1996-02-01

    An architecture for the surface analysis of continuously cast aluminum strip is described. The data volume to be processed has driven the development of a highly parallel architecture for high-speed image processing. A lighting system especially suited to enhancing defects in metallic surfaces has been developed. Special effort has been put into the design of the defect detection algorithm to reach two main objectives: robustness and low processing time. These goals have been achieved by combining local analysis with data interpretation based on syntactical analysis, which has allowed us to avoid morphological analysis. Defect classification is accomplished by means of rule-based systems along with data-based classifiers. The use of clustering techniques is discussed: partitions in R^n are performed by self-organizing maps (SOM), and divergence methods reduce the feature vector applied to the data-based classifiers. The combination of techniques inside a hybrid system leads to a classification success rate of nearly 100%.

  17. A Hybrid Algorithm for Clustering of Time Series Data Based on Affinity Search Technique

    PubMed Central

    Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A.; Shaygan, Mohammad Amin; Jalali, Alireza

    2014-01-01

    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped into subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using synthetic and real-world time series datasets. PMID:24982966
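
    A minimal sketch of the two-step idea follows: group series by raw similarity in time, then merge subcluster prototypes with k-medoids on a z-normalised, shape-oriented distance. The toy series, distance threshold, and greedy subclustering rule are illustrative assumptions, not the paper's exact algorithm:

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def znorm(s):
    """Z-normalisation removes offset and scale so distance reflects shape."""
    m = sum(s) / len(s)
    sd = math.sqrt(sum((x - m) ** 2 for x in s) / len(s)) or 1.0
    return [(x - m) / sd for x in s]

def subcluster_in_time(series, eps):
    """Step 1: greedy grouping by raw Euclidean distance (similarity in time)."""
    groups, reps = [], []
    for i, s in enumerate(series):
        for g, r in zip(groups, reps):
            if dist(s, r) < eps:
                g.append(i)
                break
        else:
            groups.append([i])
            reps.append(s)
    return groups

def k_medoids(points, k, n_iter=20):
    """Step 2: plain k-medoids with farthest-point seeding."""
    medoids = [0]
    while len(medoids) < k:
        medoids.append(max(range(len(points)),
                           key=lambda i: min(dist(points[i], points[m])
                                             for m in medoids)))
    for _ in range(n_iter):
        labels = [min(medoids, key=lambda m: dist(p, points[m])) for p in points]
        new = [min([i for i, l in enumerate(labels) if l == m],
                   key=lambda i: sum(dist(points[i], points[j])
                                     for j, l in enumerate(labels) if l == m))
               for m in medoids]
        if new == medoids:
            break
        medoids = new
    return [min(medoids, key=lambda m: dist(p, points[m])) for p in points]

# toy data: two shapes (sine, ramp) at two vertical offsets, duplicated
sine = [math.sin(t) for t in range(8)]
ramp = [t / 7 for t in range(8)]
series = [[x + off for x in shape]
          for shape in (sine, ramp) for off in (0.0, 10.0) for _ in range(2)]

groups = subcluster_in_time(series, eps=1.0)                 # 4 subclusters
shape_labels = k_medoids([znorm(series[g[0]]) for g in groups], k=2)
final = {i: lab for g, lab in zip(groups, shape_labels) for i in g}
```

    Step 1 keeps the expensive shape comparison off the raw series (low complexity); step 2 then reunites same-shape series that sit at different offsets.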

  18. Impedance-based health monitoring technique for massive structures and high-temperature structures

    NASA Astrophysics Data System (ADS)

    Park, Gyuhae; Cudney, Harley H.; Inman, Daniel J.

    1999-05-01

    This paper presents recent research on the impedance-based structural health monitoring technique at the Center for Intelligent Material Systems and Structures. The basic principle behind this technique is to use high-frequency structural excitation (typically greater than 30 kHz) through a surface-bonded piezoelectric sensor/actuator to detect changes in structural point impedance due to the presence of damage. Two examples are presented in this paper to explore its effectiveness in practical field applications. First, the possibility of implementing the impedance-based health monitoring technique to detect damage on massive, dense structures was investigated. The test structure considered is a massive, circular, three-inch-thick steel steam header pipe. Practical issues, such as the effects of external boundary condition changes and the extent of damage that could be detected, were identified. Through consistent repetition of tests, it has been determined that this impedance-based technique is able to detect a very small hole (4 x 20 mm), corresponding to a mass loss of 0.002% of the entire structure. The second example covers the implementation of this technique in high-temperature applications. With high-temperature piezoceramic materials, which have a Curie temperature higher than 2000 degrees F, experiments were performed to detect damage on a bolted joint structure in the temperature range of 900 - 1100 degrees F. Through the experimental investigations, the applicability of this impedance-based health monitoring technique to such an extreme application was verified, although some practical issues remain to be resolved. Data collected from the tests proved beyond a doubt the capability of this technology to detect both existing and imminent damage.
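
    Damage detection in this approach typically reduces to comparing impedance signatures with a scalar metric; a root-mean-square deviation between baseline and current signatures is one common choice. The sketch below uses synthetic signatures and is illustrative only:

```python
import math

def rmsd_metric(z_baseline, z_current):
    """RMSD damage metric (%) over the scanned frequency band: zero for an
    unchanged signature, growing as the impedance signature shifts."""
    num = sum((zc - zb) ** 2 for zb, zc in zip(z_baseline, z_current))
    den = sum(zb ** 2 for zb in z_baseline)
    return 100.0 * math.sqrt(num / den)

# synthetic real-part impedance signatures: damage shifts a resonance peak
freqs = range(200)
baseline = [1.0 + 5.0 * math.exp(-((f - 100) / 8.0) ** 2) for f in freqs]
damaged = [1.0 + 5.0 * math.exp(-((f - 104) / 8.0) ** 2) for f in freqs]

healthy_score = rmsd_metric(baseline, baseline)   # 0.0 -> pristine
damage_score = rmsd_metric(baseline, damaged)     # clearly nonzero -> damage
```

    A threshold on this metric, calibrated against benign effects such as boundary-condition changes, then separates "imminent or existing damage" from normal variation.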

  19. A spline-based parameter estimation technique for static models of elastic structures

    NASA Technical Reports Server (NTRS)

    Dutt, P.; Taasan, S.

    1986-01-01

    The problem of identifying the spatially varying coefficient of elasticity using an observed solution to the forward problem is considered. Under appropriate conditions this problem can be treated as a first order hyperbolic equation in the unknown coefficient. Some continuous dependence results are developed for this problem and a spline-based technique is proposed for approximating the unknown coefficient, based on these results. The convergence of the numerical scheme is established and error estimates obtained.

  20. A spline-based parameter estimation technique for static models of elastic structures

    NASA Technical Reports Server (NTRS)

    Dutt, P.; Ta'asan, S.

    1989-01-01

    The problem of identifying the spatially varying coefficient of elasticity using an observed solution to the forward problem is considered. Under appropriate conditions this problem can be treated as a first order hyperbolic equation in the unknown coefficient. Some continuous dependence results are developed for this problem and a spline-based technique is proposed for approximating the unknown coefficient, based on these results. The convergence of the numerical scheme is established and error estimates obtained.

  1. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
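
    As a toy illustration of the diffusion-simulation category (the ring network, probabilities, and parameters below are arbitrary choices for the sketch, not from the paper), the following agent-based model lets adoption spread through a small external influence plus word-of-mouth between ring neighbours:

```python
import random

def diffusion_abm(n_agents=200, p_external=0.01, p_peer=0.2,
                  n_steps=60, seed=1):
    """Each step, a non-adopter adopts with a probability combining a small
    external influence and pressure from already-adopted ring neighbours."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    counts = []
    for _ in range(n_steps):
        snapshot = adopted[:]                      # synchronous update
        for i in range(n_agents):
            if snapshot[i]:
                continue
            peers = snapshot[i - 1] + snapshot[(i + 1) % n_agents]
            if rng.random() < p_external + p_peer * peers:
                adopted[i] = True
        counts.append(sum(adopted))
    return counts

curve = diffusion_abm()   # cumulative adopters per step: an S-shaped curve
```

    The emergent S-curve is not coded anywhere explicitly; it arises from the local agent rules, which is the point of the agent-based approach.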

  2. Enhanced Detection of Multivariate Outliers Using Algorithm-Based Visual Display Techniques.

    ERIC Educational Resources Information Center

    Dickinson, Wendy B.

    This study uses an algorithm-based visual display technique (FACES) to provide enhanced detection of multivariate outliers within large-scale data sets. The FACES computer graphing algorithm (H. Chernoff, 1973) constructs a cartoon-like face, using up to 18 variables for each case. A major advantage of FACES is the ability to store and show the…

  3. Validation of Learning Effort Algorithm for Real-Time Non-Interfering Based Diagnostic Technique

    ERIC Educational Resources Information Center

    Hsu, Pi-Shan; Chang, Te-Jeng

    2011-01-01

    The objective of this research is to validate the algorithm of learning effort which is an indicator of a new real-time and non-interfering based diagnostic technique. IC3 Mentor, the adaptive e-learning platform fulfilling the requirements of intelligent tutor system, was applied to 165 university students. The learning records of the subjects…

  4. Studying Student Teachers' Concerns, Combining Image-Based and More Traditional Research Techniques

    ERIC Educational Resources Information Center

    Swennen, Anja; Jorg, Ton; Korthagen, Fred

    2004-01-01

    In a study of student teachers' concerns, a combination of image-based and more traditional research techniques was used. The first year student teachers appeared to be most concerned about matters that, in their view, form the core task of teaching, such as 'selecting and teaching content well', 'motivating pupils to learn' and 'adapting myself…

  5. Computer Based Techniques for School Bus Routing. Working Paper Series No. WP060690.

    ERIC Educational Resources Information Center

    Osborne, Kimberly A.; And Others

    This report details the data requirements and procedures used to develop new school bus routes for Six Mile Elementary School in South Carolina. The project examined the current routes of the school and applied computer based techniques to develop new routes given the existing bus stops. Bus routes were developed so that distance and travel time…

  6. Using a Written Journal Technique to Enhance Inquiry-Based Reflection about Teaching

    ERIC Educational Resources Information Center

    Fry, Jane; Carol, Klages; Venneman, Sandy

    2013-01-01

    The aim of this study was to explore the efficacy of two written journal techniques used to encourage teacher candidates' inquiry-based reflection regarding course textbook content. Ninety-six participants were randomly assigned to one of the two experimental conditions, journaling with Questions, Quotes and Reflections (Double Q R) or…

  7. Multidimensional Test Assembly Based on Lagrangian Relaxation Techniques. Research Report 98-08.

    ERIC Educational Resources Information Center

    Veldkamp, Bernard P.

    In this paper, a mathematical programming approach is presented for the assembly of ability tests measuring multiple traits. The values of the variance functions of the estimators of the traits are minimized, while test specifications are met. The approach is based on Lagrangian relaxation techniques and provides good results for the two…

  8. Phase demodulation from a single fringe pattern based on a correlation technique.

    PubMed

    Robin, Eric; Valle, Valéry

    2004-08-01

    We present a method for determining the demodulated phase from a single fringe pattern. This method, based on a correlation technique, searches in a zone of interest for the degree of similarity between a real fringe pattern and a mathematical model. This method, named modulated phase correlation, is tested with different examples. PMID:15298408
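
    The core idea, scoring the similarity between observed fringe intensities and a mathematical fringe model, can be sketched as a one-dimensional grid search. The cosine model, known local frequency, and brute-force search are simplifying assumptions; the actual method works on 2-D patches and jointly optimises more parameters:

```python
import math

def local_phase(signal, omega, n_phi=360):
    """Return the model phase maximising the correlation between the
    observed fringe signal and the model cos(omega * x + phi)."""
    best_phi, best_score = 0.0, -float("inf")
    for k in range(n_phi):
        phi = 2 * math.pi * k / n_phi
        score = sum(s * math.cos(omega * x + phi)   # correlation with model
                    for x, s in enumerate(signal))
        if score > best_score:
            best_phi, best_score = phi, score
    return best_phi

# synthetic fringe patch with a known phase of 1.0 rad
fringe = [math.cos(0.3 * x + 1.0) for x in range(50)]
phi = local_phase(fringe, omega=0.3)   # close to the true phase, 1.0 rad
```
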

  9. Molecular-Based Optical Measurement Techniques for Transition and Turbulence in High-Speed Flow

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Cutler, Andrew D.

    2013-01-01

    High-speed laminar-to-turbulent transition and turbulence affect the control of flight vehicles, the heat transfer rate to a flight vehicle's surface, the material selected to protect such vehicles from high heating loads, the ultimate weight of a flight vehicle due to the presence of thermal protection systems, the efficiency of fuel-air mixing processes in high-speed combustion applications, etc. Gaining a fundamental understanding of the physical mechanisms involved in the transition process will lead to the development of predictive capabilities that can identify transition location and its impact on parameters like surface heating. Currently, there is no general theory that can completely describe the transition-to-turbulence process. However, transition research has led to the identification of the predominant pathways by which this process occurs. For a truly physics-based model of transition to be developed, the individual stages in the paths leading to the onset of fully turbulent flow must be well understood. This requires that each pathway be computationally modeled and experimentally characterized and validated. This may also lead to the discovery of new physical pathways. This document is intended to describe molecular based measurement techniques that have been developed, addressing the needs of the high-speed transition-to-turbulence and high-speed turbulence research fields. In particular, we focus on techniques that have either been used to study high speed transition and turbulence or techniques that show promise for studying these flows. This review is not exhaustive. In addition to the probe-based techniques described in the previous paragraph, several other classes of measurement techniques that are, or could be, used to study high speed transition and turbulence are excluded from this manuscript. For example, surface measurement techniques such as pressure and temperature paint, phosphor thermography, skin friction measurements and

  10. Allele-specific extension allows base-pair neutral homozygotes to be discriminated by high-resolution melting of small amplicons.

    PubMed

    Cai, Yanning; Yuan, Yanpeng; Lin, Qingling; Chan, Piu

    2010-11-01

    Not all single-nucleotide polymorphisms (SNPs) can be determined using high-resolution melting (HRM) of small amplicons, especially class 3 and 4 SNPs. This is due mainly to the small shift in the melting temperature (Tm) between two types of homozygote. Choosing rs1869458 (a class 4 SNP) as a sample, we developed a modified small amplicon HRM assay. An allele-specific extension (ASE) primer, which ended at an SNP site and matched only one of the alleles, was added to the reaction as well as additional thermal steps for ASE. Following asymmetric polymerase chain reaction and melting curve analysis, heterozygotes were easily identified. Two types of homozygote were also distinguishable, indicating that extension primers 11 to 13 bases in length worked efficiently in an allele-specific way. Modification of the limiting amplification primer with locked nucleic acid increased the Tm difference between extension and amplification peaks and facilitated subsequent genotyping. In addition, 194 human genomic DNA samples were genotyped with the developed assay and by direct sequencing, with the different methods providing identical genotyping results. In conclusion, ASE-HRM is a simple, inexpensive, closed-tube genotyping method that can be used to examine all types of SNP. PMID:20599636

  11. An Extension to the Constructivist Coding Hypothesis as a Learning Model for Selective Feedback when the Base Rate Is High

    ERIC Educational Resources Information Center

    Ghaffarzadegan, Navid; Stewart, Thomas R.

    2011-01-01

    Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the 0.5 base rate condition. This commentary argues that…

  12. Encoding technique for high data compaction in data bases of fusion devices

    SciTech Connect

    Vega, J.; Cremy, C.; Sanchez, E.; Portas, A.

    1996-12-01

    At present, data requirements of hundreds of Mbytes/discharge are typical in devices such as JET, TFTR, DIII-D, etc., and these requirements continue to increase. With these rates, the amount of storage required to maintain discharge information is enormous. Compaction techniques are now essential to reduce storage. However, general compression techniques may distort signals, which is undesirable for fusion diagnostics. We have developed a general technique for data compression which is described here. The technique, which is based on delta compression, does not require an examination of the data as in delayed methods. Delta values are compacted according to general encoding forms which satisfy a prefix code property and which are defined prior to data capture. Several prefix codes, which are bit oriented and which have variable code lengths, have been developed. These encoding methods are independent of the signal analog characteristics and enable one to store undistorted signals. The technique has been applied to databases of the TJ-I tokamak and the TJ-IU torsatron. Compaction rates of over 80% with negligible computational effort were achieved. Computer programs were written in ANSI C, thus ensuring portability and easy maintenance. We also present an interpretation, based on information theory, of the high compression rates achieved without signal distortion. © 1996 American Institute of Physics.
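
    The scheme, delta values mapped into a bit-oriented, variable-length prefix code fixed before data capture, can be illustrated with a zigzag mapping followed by Elias gamma coding. Both are standard constructions chosen here for the sketch; the paper's actual code tables are not specified in the abstract:

```python
def elias_gamma(m):
    """Variable-length prefix code for integers m >= 1: a zero run whose
    length announces how many payload bits follow."""
    b = bin(m)[2:]
    return "0" * (len(b) - 1) + b

def encode(samples):
    """Delta-encode a signal, then prefix-code each delta bit by bit."""
    bits, prev = [], 0
    for s in samples:
        d, prev = s - prev, s
        z = 2 * d if d >= 0 else -2 * d - 1   # zigzag: signed -> unsigned
        bits.append(elias_gamma(z + 1))       # gamma needs values >= 1
    return "".join(bits)

def decode(bits):
    """Losslessly invert encode(); no examination of the data was needed
    to build the code, matching the 'defined prior to capture' property."""
    out, prev, i = [], 0, 0
    while i < len(bits):
        z = 0
        while bits[i] == "0":                 # count the zero prefix
            z += 1
            i += 1
        m = int(bits[i:i + z + 1], 2) - 1     # read z+1 payload bits
        i += z + 1
        d = m // 2 if m % 2 == 0 else -(m + 1) // 2
        prev += d
        out.append(prev)
    return out

signal = [100, 101, 101, 99, 120, 118]
packed = encode(signal)
assert decode(packed) == signal               # undistorted round trip
```

    Because consecutive diagnostic samples change slowly, most deltas are small and receive short codewords, which is where the storage saving comes from without any loss of signal fidelity.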

  13. Dimensional Changes of Acrylic Resin Denture Bases: Conventional Versus Injection-Molding Technique

    PubMed Central

    Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam

    2014-01-01

    Objective: Acrylic resin denture bases undergo dimensional changes during polymerization. Injection molding techniques are reported to reduce these changes and thereby improve physical properties of denture bases. The aim of this study was to compare dimensional changes of specimens processed by conventional and injection-molding techniques. Materials and Methods: SR-Ivocap Triplex Hot resin was used for the conventional pressure-packed technique and SR-Ivocap High Impact was used for the injection-molding technique. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. Results: After each water storage period, the acrylic specimens produced by injection exhibited smaller dimensional changes compared to those produced by the conventional technique. Curing shrinkage was compensated by water sorption, with dimensional changes decreasing as water storage time increased. Conclusion: Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used, and the SR-Ivocap injection procedure exhibited higher dimensional accuracy compared to conventional molding. PMID:25584050

  14. Effect of root canal filling techniques on the bond strength of epoxy resin-based sealers.

    PubMed

    Rached-Júnior, Fuad Jacob Abi; Souza, Angélica Moreira; Macedo, Luciana Martins Domingues; Raucci-Neto, Walter; Baratto-Filho, Flares; Silva, Bruno Marques; Silva-Sousa, Yara Teresinha Corrêa

    2016-01-01

    The aim of this study was to evaluate the effects of different root canal filling techniques on the bond strength of epoxy resin-based sealers. Sixty single-rooted canines were prepared using ProTaper (F5) and divided into the following groups based on the root filling technique: Lateral Compaction (LC), Single Cone (SC), and Tagger Hybrid Technique (THT). The following subgroups (n = 10) were also created based on the sealer material used: AH Plus and Sealer 26. Two-millimeter-thick slices were cut from all the root thirds and subjected to a push-out test. Data (MPa) were analyzed using ANOVA and Tukey's test (α = 0.05). The push-out values were significantly affected by the sealer, filling technique, and root third (p < 0.05). AH Plus (1.37 ± 1.04) exhibited higher values than Sealer 26 (0.92 ± 0.51), while LC (1.80 ± 0.98) showed greater bond strength than THT (1.16 ± 0.50) and SC (0.92 ± 0.25). The cervical (1.45 ± 1.14) third exhibited higher bond strength, followed by the middle (1.20 ± 0.72) and apical (0.78 ± 0.33) thirds. AH Plus/LC (2.26 ± 1.15) exhibited the highest bond strength values, followed by AH Plus/THT (1.32 ± 0.61), Sealer 26/LC (1.34 ± 0.42), and Sealer 26/THT (1.00 ± 0.27). The lowest values were obtained with AH Plus/SC and Sealer 26/SC. Thus, it can be concluded that the filling technique affects the bond strength of sealers. LC was associated with higher bond strength between the material and intra-radicular dentine than the THT and SC techniques. PMID:26910020

  15. Agarose-Based Substrate Modification Technique for Chemical and Physical Guiding of Neurons In Vitro.

    PubMed

    Krumpholz, Katharina; Rogal, Julia; El Hasni, Akram; Schnakenberg, Uwe; Bräunig, Peter; Bui-Göbbels, Katrin

    2015-08-26

    A new low cost and highly reproducible technique is presented that provides patterned cell culture substrates. These allow for selective positioning of cells and a chemically and mechanically directed guiding of their extensions. The patterned substrates consist of structured agarose hydrogels molded from reusable silicon micro templates. These templates consist of pins arranged equidistantly in squares, connected by bars, which mold corresponding wells and channels in the nonadhesive agarose hydrogel. Subsequent slice production with a standard vibratome, comprising the described template pattern, completes substrate production. Invertebrate neurons of locusts and pond snails are used for this application as they offer the advantage over vertebrate cells of being very large and suitable for cultivation at low cell density. Their neurons adhere to and grow only on the adhesive areas not covered by the agarose. Agarose slices of 50 μm thickness placed on glass, polystyrene, or MEA surfaces position and immobilize the neurons in the wells, and the channels guide their neurite outgrowth toward neighboring wells. In addition to the application with invertebrate neurons, the technique may also provide the potential for the application of a wide range of cell types. The long-term objective is the achievement of isolated low-density neuronal networks on MEAs or different culture substrates for various network analysis applications. PMID:26237337

  16. Rotational roadmapping: a new image-based navigation technique for the interventional room.

    PubMed

    Kukuk, Markus; Napel, Sandy

    2007-01-01

    For decades, conventional 2D-roadmapping has been the method of choice for image-based guidewire navigation during endovascular procedures. Only recently have 3D-roadmapping techniques become available that are based on the acquisition and reconstruction of a 3D image of the vascular tree. In this paper, we present a new image-based navigation technique called RoRo (Rotational Roadmapping) that eliminates the guess-work inherent to the conventional 2D method, but does not require a 3D image. Our preliminary clinical results show that there are situations in which RoRo is preferred over the existing two methods, thus demonstrating potential for filling a clinical niche and complementing the spectrum of available navigation tools. PMID:18044622

  17. Plasma-based ambient mass spectrometry techniques: The current status and future prospective.

    PubMed

    Ding, Xuelu; Duan, Yixiang

    2015-01-01

    Plasma-based ambient mass spectrometry is emerging as a frontier technology for direct analysis of samples that employs low-energy plasma as the ionization reagent. The versatile sources of ambient mass spectrometry (MS) can be classified according to the plasma formation approaches; namely, corona discharge, glow discharge, dielectric barrier discharge, and microwave-induced discharge. These techniques allow pretreatment-free detection of samples, ranging from biological materials (e.g., flies, bacteria, plants, tissues, peptides, metabolites, and lipids) to pharmaceuticals, food-stuffs, polymers, chemical warfare reagents, and daily-use chemicals. In most cases, plasma-based ambient MS performs well as a qualitative tool and as an analyzer for semi-quantitation. Herein, we provide an overview of the key concepts, mechanisms, and applications of plasma-based ambient MS techniques, and discuss the challenges and outlook. PMID:24338668

  18. Experimental comparison between speckle and grating-based imaging technique using synchrotron radiation X-rays.

    PubMed

    Kashyap, Yogesh; Wang, Hongchang; Sawhney, Kawal

    2016-08-01

    X-ray phase contrast and dark-field imaging techniques provide important and complementary information that is inaccessible to conventional absorption contrast imaging. Both grating-based imaging (GBI) and speckle-based imaging (SBI) are able to retrieve multi-modal images using synchrotron as well as lab-based sources. However, no systematic comparison has been made between the two techniques so far. We present an experimental comparison between the GBI and SBI techniques with a synchrotron radiation X-ray source. Apart from the simple experimental setup, we find SBI does not suffer from the issue of phase unwrapping, which can often be problematic for GBI. In addition, SBI is also superior to GBI since two orthogonal differential phase gradients can be simultaneously extracted by a one-dimensional scan. The GBI has less stringent requirements for detector pixel size and transverse coherence length when a second or third grating can be used. This study provides a reference for choosing the most suitable technique for diverse imaging applications at a synchrotron facility. PMID:27505829

  19. Planning and delivery comparison of six linac-based stereotactic radiosurgery techniques

    NASA Astrophysics Data System (ADS)

    Thakur, Varun Singh

    This work presents planning and delivery comparison of linac-based SRS treatment techniques currently available for single lesion cranial SRS. In total, two dedicated SRS systems (Novalis Tx, Cyberknife) and a HI-ART TomoTherapy system with six different delivery techniques are evaluated. Four delivery techniques are evaluated on a Novalis Tx system: circular cones, dynamic conformal arcs (DCA), static non-coplanar intensity modulated radiotherapy (NCP-IMRT), and volumetric modulated arc therapy (RapidArc) techniques are compared with intensity modulation based helical Tomotherapy on the HI-ART Tomotherapy system and with non-isocentric robotic radiosurgery based on multiple overlapping beams using the CyberKnife system. Thirteen patients are retrospectively selected for the study. The target volumes of each patient are transferred to a CT scan of a Lucy phantom (Standard Imaging Inc., Middleton, WI, USA) designed for end-to-end SRS QA. In order to evaluate the plans, several indices scoring the conformality, homogeneity and gradients in the plan are calculated and compared for each of the plans. Finally, to check the clinical deliverability of the plans and the delivery accuracy of different systems, a few targets are delivered on each system. A comparison between planned dose on the treatment planning system and dose delivered on Gafchromic EBT film (ISP, Wayne, New Jersey, USA) is carried out by comparing dose beam profiles, isodose lines and by calculating the gamma index.
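
    The gamma-index comparison used to check delivery accuracy combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1-D sketch (global normalization with 3%/3 mm defaults assumed here; the study's exact criteria are not stated in the abstract) might look like:

```python
import math

def gamma_index(ref, meas, spacing=1.0, dose_tol=0.03, dist_tol=3.0):
    """1-D global gamma analysis (3%/3 mm by default): for each reference
    point, take the minimum combined dose-difference / distance-to-agreement
    metric over all points of the evaluated profile."""
    d_max = max(ref)                                # global normalization dose
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, dm in enumerate(meas):
            dist = (i - j) * spacing / dist_tol     # normalized spatial offset
            dose = (dm - dr) / (dose_tol * d_max)   # normalized dose difference
            best = min(best, math.hypot(dist, dose))
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Fraction of points with gamma <= 1 (the usual pass criterion)."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```

    Pass rates from such an analysis are a common way to summarize agreement between the planned dose and the film measurement.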

  20. Nano-Al Based Energetics: Rapid Heating Studies and a New Preparation Technique

    NASA Astrophysics Data System (ADS)

    Sullivan, Kyle; Kuntz, Josh; Gash, Alex; Zachariah, Michael

    2011-06-01

    Nano-Al based thermites have become an attractive alternative to traditional energetic formulations due to their increased energy density and high reactivity. Understanding the intrinsic reaction mechanism has been a difficult task, largely due to the lack of experimental techniques capable of rapidly and uniformly heating a sample (~10^4-10^8 K/s). The current work presents several studies on nano-Al based thermites, using rapid heating techniques. A new mechanism termed a Reactive Sintering Mechanism is proposed for nano-Al based thermites. In addition, new experimental techniques for nanocomposite thermite deposition onto thin Pt electrodes will be discussed. This combined technique will offer more precise control of the deposition, and will serve to further our understanding of the intrinsic reaction mechanism of rapidly heated energetic systems. An improved mechanistic understanding will lead to the development of optimized formulations and architectures. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  1. DSP-based optical modulation technique for long-haul transmission

    NASA Astrophysics Data System (ADS)

    Yoshida, T.; Sugihara, T.; Uto, K.

    2015-01-01

    Fiber nonlinearity and equalization-enhanced phase noise (EEPN) generate rapid perturbations and critically limit the system capacity and range of long-haul optical transmission. It is possible to cancel the rapid perturbations by introducing a particular correlation between multiple signals at the transmitter and analyzing the received signals using digital signal processing. In this paper, we review our proposed techniques to cancel rapid perturbations of polarization multiplexed signals due to fiber nonlinearity and EEPN. Numerical simulation of quaternary phase-shift keying based signals shows 1.2 dB and 0.5 dB improvements from the proposed cancellation techniques for fiber nonlinearity and EEPN, respectively.

  2. Gradient-based multiobjective optimization using a distance constraint technique and point replacement

    NASA Astrophysics Data System (ADS)

    Sato, Yuki; Izui, Kazuhiro; Yamada, Takayuki; Nishiwaki, Shinji

    2016-07-01

    This paper proposes techniques to improve the diversity of the searching points during the optimization process in an Aggregative Gradient-based Multiobjective Optimization (AGMO) method, so that well-distributed Pareto solutions are obtained. First to be discussed is a distance constraint technique, applied among searching points in the objective space when updating design variables, that maintains a minimum distance between the points. Next, a scheme is introduced that deals with updated points that violate the distance constraint, by deleting the offending points and introducing new points in areas of the objective space where searching points are sparsely distributed. Finally, the proposed method is applied to example problems to illustrate its effectiveness.
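
    A minimal sketch of the two schemes described above (minimum-distance filtering among searching points, plus replacement of violating points near sparse regions of the objective space). The function names, the uniform perturbation rule, and the clamping to bounds are illustrative assumptions, not the paper's exact update:

```python
import math
import random

def enforce_spacing(points, d_min, bounds, rng=random.Random(0)):
    """Keep searching points at least d_min apart in the objective space;
    each violating point is deleted and a replacement is introduced near
    the surviving point whose neighbourhood is sparsest."""
    kept = []
    for p in points:
        if all(math.dist(p, q) >= d_min for q in kept):
            kept.append(p)          # satisfies the distance constraint
    for _ in range(len(points) - len(kept)):
        # sparsest = largest nearest-neighbour distance among survivors
        sparsest = max(kept, key=lambda p: min(
            (math.dist(p, q) for q in kept if q is not p), default=0.0))
        new = [min(max(x + rng.uniform(-d_min, d_min), lo), hi)
               for x, (lo, hi) in zip(sparsest, bounds)]
        kept.append(new)
    return kept
```

    In the actual AGMO method this maintenance step would run each time the design variables are updated, keeping the Pareto front well distributed.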

  3. Airframe structural damage detection: a non-linear structural surface intensity based technique.

    PubMed

    Semperlotti, Fabio; Conlon, Stephen C; Barnard, Andrew R

    2011-04-01

    The non-linear structural surface intensity (NSSI) based damage detection technique is extended to airframe applications. The selected test structure is an upper cabin airframe section from a UH-60 Blackhawk helicopter (Sikorsky Aircraft, Stratford, CT). Structural damage is simulated through an impact resonator device, designed to simulate the induced vibration effects typical of non-linear behaving damage. An experimental study is conducted to prove the applicability of NSSI on complex mechanical systems as well as to evaluate the minimum sensor and actuator requirements. The NSSI technique is shown to have high damage detection sensitivity, covering an extended substructure with a single sensing location. PMID:21476618

  4. Simplified Technique for Incorporating a Metal Mesh into Record Bases for Mandibular Implant Overdentures.

    PubMed

    Godoy, Antonio; Siegel, Sharon C

    2015-12-01

    Mandibular implant-retained overdentures have become the standard of care for patients with mandibular complete edentulism. As part of the treatment, the mandibular implant-retained overdenture may require a metal mesh framework to be incorporated to strengthen the denture and avoid fracture of the prosthesis. Integrating the metal mesh framework as part of the acrylic record base and wax occlusion rim before the jaw relation procedure will avoid the distortion of the record base and will minimize the chances of processing errors. A simplified method to incorporate the mesh into the record base and occlusion rim is presented in this technique article. PMID:25659988

  5. Dense estimation and object-based segmentation of the optical flow with robust techniques.

    PubMed

    Mémin, E; Pérez, P

    1998-01-01

    In this paper, we address the issue of recovering and segmenting the apparent velocity field in sequences of images. As for motion estimation, we minimize an objective function involving two robust terms. The first one cautiously captures the optical flow constraint, while the second (a priori) term incorporates a discontinuity-preserving smoothness constraint. To cope with the nonconvex minimization problem thus defined, we design an efficient deterministic multigrid procedure. It converges fast toward estimates of good quality, while revealing the large discontinuity structures of flow fields. We then propose an extension of the model by attaching to it a flexible object-based segmentation device based on deformable closed curves (different families of curve equipped with different kinds of prior can be easily supported). Experimental results on synthetic and natural sequences are presented, including an analysis of sensitivity to parameter tuning. PMID:18276286
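
    The "robust terms" referred to above are penalty functions that grow sub-quadratically, so large residuals at motion discontinuities are not over-penalized. The Charbonnier and Lorentzian functions below are common examples, shown for illustration (the paper's exact estimators may differ):

```python
import math

def quadratic(x):
    """Standard least-squares penalty: heavily punishes outliers."""
    return x * x

def charbonnier(x, eps=1e-3):
    """Charbonnier penalty: quadratic near zero but ~|x| for large
    residuals, so flow discontinuities are not over-smoothed."""
    return math.sqrt(x * x + eps * eps)

def lorentzian(x, sigma=1.0):
    """Lorentzian penalty: saturates even more aggressively for outliers."""
    return math.log(1.0 + 0.5 * (x / sigma) ** 2)
```

    Applying such a penalty to both the optical-flow constraint and the smoothness term is what preserves the large discontinuity structures of the flow field.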

  6. Phylogenetic relationships within the speciose family Characidae (Teleostei: Ostariophysi: Characiformes) based on multilocus analysis and extensive ingroup sampling

    PubMed Central

    2011-01-01

    Background With nearly 1,100 species, the fish family Characidae represents more than half of the species of Characiformes, and is a key component of Neotropical freshwater ecosystems. The composition, phylogeny, and classification of Characidae is currently uncertain, despite significant efforts based on analysis of morphological and molecular data. No consensus about the monophyly of this group or its position within the order Characiformes has been reached, challenged by the fact that many key studies to date have non-overlapping taxonomic representation and focus only on subsets of this diversity. Results In the present study we propose a new definition of the family Characidae and a hypothesis of relationships for the Characiformes based on phylogenetic analysis of DNA sequences of two mitochondrial and three nuclear genes (4,680 base pairs). The sequences were obtained from 211 samples representing 166 genera distributed among all 18 recognized families in the order Characiformes, all 14 recognized subfamilies in the Characidae, plus 56 of the genera so far considered incertae sedis in the Characidae. The phylogeny obtained is robust, with most lineages significantly supported by posterior probabilities in Bayesian analysis, and high bootstrap values from maximum likelihood and parsimony analyses. Conclusion A monophyletic assemblage strongly supported in all our phylogenetic analysis is herein defined as the Characidae and includes the characiform species lacking a supraorbital bone and with a derived position of the emergence of the hyoid artery from the anterior ceratohyal. To recognize this and several other monophyletic groups within characiforms we propose changes in the limits of several families to facilitate future studies in the Characiformes and particularly the Characidae. This work presents a new phylogenetic framework for a speciose and morphologically diverse group of freshwater fishes of significant ecological and evolutionary importance

  7. Studies of an extensively axisymmetric rocket based combined cycle (RBCC) engine powered single-stage-to-orbit (SSTO) vehicle

    SciTech Connect

    Foster, R.W.; Escher, W.J.D.; Robinson, J.W.

    1989-01-01

    The present comparative performance study has established that rocket-based combined cycle (RBCC) propulsion systems, when incorporated by essentially axisymmetric SSTO launch vehicle configurations whose conical forebody maximizes both capture-area ratio and total capture area, are capable of furnishing payload-delivery capabilities superior to those of most multistage, all-rocket launchers. Airbreathing thrust augmentation in the rocket-ejector mode of an RBCC powerplant is noted to make a major contribution to final payload capability, by comparison to nonair-augmented rocket engine propulsion systems. 16 refs.

  8. Studies of an extensively axisymmetric rocket based combined cycle (RBCC) engine powered single-stage-to-orbit (SSTO) vehicle

    NASA Technical Reports Server (NTRS)

    Foster, Richard W.; Escher, William J. D.; Robinson, John W.

    1989-01-01

    The present comparative performance study has established that rocket-based combined cycle (RBCC) propulsion systems, when incorporated by essentially axisymmetric SSTO launch vehicle configurations whose conical forebody maximizes both capture-area ratio and total capture area, are capable of furnishing payload-delivery capabilities superior to those of most multistage, all-rocket launchers. Airbreathing thrust augmentation in the rocket-ejector mode of an RBCC powerplant is noted to make a major contribution to final payload capability, by comparison to nonair-augmented rocket engine propulsion systems.

  9. Microsatellite marker based genetic linkage maps of Oreochromis aureus and O. niloticus (Cichlidae): extensive linkage group segment homologies revealed.

    PubMed

    McConnell, S K; Beynon, C; Leamon, J; Skibinski, D O

    2000-06-01

    Partial genetic linkage maps, based on microsatellite markers, were constructed for two tilapia species, Oreochromis aureus and Oreochromis niloticus using an interspecific backcross population. The linkage map for O. aureus comprised 28 markers on 10 linkage groups and covered 212.8 cM. Nine markers were mapped to four linkage groups on an O. niloticus female linkage map covering 40.6 cM. Results revealed a high degree of conservation of synteny between the linkage groups defined in O. aureus and the previously published genetic linkage map of O. niloticus. PMID:10895314
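
    The linkage-group lengths above are reported in centimorgans (cM); map distances of this kind are typically derived from pairwise recombination fractions via a mapping function such as Haldane's or Kosambi's (shown here for illustration; the abstract does not specify which mapping function or software was used):

```python
import math

def haldane_cM(r):
    """Haldane map distance (cM) from recombination fraction r (0 <= r < 0.5),
    assuming no crossover interference: d = -50 * ln(1 - 2r)."""
    return -50.0 * math.log(1.0 - 2.0 * r)

def kosambi_cM(r):
    """Kosambi map function, which allows for crossover interference:
    d = 25 * ln((1 + 2r) / (1 - 2r))."""
    return 25.0 * math.log((1.0 + 2.0 * r) / (1.0 - 2.0 * r))
```

    For small r both functions approach 100r (1 cM per 1% recombination); they diverge as r grows, with Kosambi giving shorter distances.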

  10. A cluster randomized control field trial of the ABRACADABRA web-based reading technology: replication and extension of basic findings

    PubMed Central

    Piquette, Noella A.; Savage, Robert S.; Abrami, Philip C.

    2014-01-01

    The present paper reports a cluster randomized control trial evaluation of teaching using ABRACADABRA (ABRA), an evidence-based and web-based literacy intervention (http://abralite.concordia.ca) with 107 kindergarten and 96 grade 1 children in 24 classes (12 intervention, 12 control classes) from all 12 elementary schools in one school district in Canada. Children in the intervention condition received 10–12 h of whole class instruction using ABRA between pre- and post-test. Hierarchical linear modeling of post-test results showed significant gains in letter-sound knowledge for intervention classrooms over control classrooms. In addition, medium effect sizes were evident for three of five outcome measures favoring the intervention: letter-sound knowledge (d = +0.66), phonological blending (d = +0.52), and word reading (d = +0.52), over effect sizes for regular teaching. It is concluded that regular teaching with ABRA technology adds significantly to literacy in the early elementary years. PMID:25538663

  11. A cluster randomized control field trial of the ABRACADABRA web-based reading technology: replication and extension of basic findings.

    PubMed

    Piquette, Noella A; Savage, Robert S; Abrami, Philip C

    2014-01-01

    The present paper reports a cluster randomized control trial evaluation of teaching using ABRACADABRA (ABRA), an evidence-based and web-based literacy intervention (http://abralite.concordia.ca) with 107 kindergarten and 96 grade 1 children in 24 classes (12 intervention, 12 control classes) from all 12 elementary schools in one school district in Canada. Children in the intervention condition received 10-12 h of whole class instruction using ABRA between pre- and post-test. Hierarchical linear modeling of post-test results showed significant gains in letter-sound knowledge for intervention classrooms over control classrooms. In addition, medium effect sizes were evident for three of five outcome measures favoring the intervention: letter-sound knowledge (d = +0.66), phonological blending (d = +0.52), and word reading (d = +0.52), over effect sizes for regular teaching. It is concluded that regular teaching with ABRA technology adds significantly to literacy in the early elementary years. PMID:25538663

  12. Reformulation linearization technique based branch-and-reduce approach applied to regional water supply system planning

    NASA Astrophysics Data System (ADS)

    Lan, Fujun; Bayraksan, Güzin; Lansey, Kevin

    2016-03-01

    A regional water supply system design problem that determines pipe and pump design parameters and water flows over a multi-year planning horizon is considered. A non-convex nonlinear model is formulated and solved by a branch-and-reduce global optimization approach. The lower bounding problem is constructed via a three-pronged effort that involves transforming the space of certain decision variables, polyhedral outer approximations, and the Reformulation Linearization Technique (RLT). Range reduction techniques are employed systematically to speed up convergence. Computational results demonstrate the efficiency of the proposed algorithm; in particular, the critical role range reduction techniques could play in RLT based branch-and-bound methods. Results also indicate using reclaimed water not only saves freshwater sources but is also a cost-effective non-potable water source in arid regions. Supplemental data for this article can be accessed at http://dx.doi.org/10.1080/0305215X.2015.1016508.
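
    The RLT lower-bounding problem replaces nonconvex (e.g., bilinear) terms with linear over- and under-estimators. The classic McCormick envelopes give the flavor of such a construction (an illustrative building block, not the paper's full three-pronged lower-bounding scheme):

```python
def mccormick_bounds(xL, xU, yL, yU, x, y):
    """Lower/upper bounds on the bilinear term w = x*y implied by the four
    McCormick envelope inequalities over the box [xL,xU] x [yL,yU].
    These linear inequalities are what an RLT-style relaxation would add
    in place of the nonconvex product."""
    lower = max(xL * y + yL * x - xL * yL,
                xU * y + yU * x - xU * yU)
    upper = min(xU * y + yL * x - xU * yL,
                xL * y + yU * x - xL * yU)
    return lower, upper
```

    Range reduction tightens [xL, xU] and [yL, yU] during branch-and-bound, which directly tightens these envelopes and explains why it speeds convergence.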

  13. Model-based super-resolution reconstruction techniques for underwater imaging

    NASA Astrophysics Data System (ADS)

    Chen, Yuzhang; Yang, Bofei; Xia, Min; Li, Wei; Yang, Kecheng; Zhang, Xiaohui

    2012-01-01

    The visibility of underwater imaging has been of long-standing interest to investigators working in many civilian and military areas such as oceanographic environments. Efforts such as image restoration techniques can help to enhance the image quality; however, the resolution is still limited. Image super resolution reconstruction (SRR) techniques are promising approaches for improving resolution beyond the limit of hardware; furthermore, with prior knowledge of the imaging system such as the point spread function and diffraction limit, the performance of super resolution reconstruction can be further enhanced, which can also extend the imaging range. In order to improve the resolution to the best possible level, an imaging model based on beam propagation is established and applied to image super-resolution reconstruction techniques for an underwater range-gated pulsed laser imaging system in the presented effort. Experimental results show that the proposed approaches can effectively enhance the resolution and quality of underwater imaging.

  14. Model-based super-resolution reconstruction techniques for underwater imaging

    NASA Astrophysics Data System (ADS)

    Chen, Yuzhang; Yang, Bofei; Xia, Min; Li, Wei; Yang, Kecheng; Zhang, Xiaohui

    2011-11-01

    The visibility of underwater imaging has been of long-standing interest to investigators working in many civilian and military areas such as oceanographic environments. Efforts such as image restoration techniques can help to enhance the image quality; however, the resolution is still limited. Image super resolution reconstruction (SRR) techniques are promising approaches for improving resolution beyond the limit of hardware; furthermore, with prior knowledge of the imaging system such as the point spread function and diffraction limit, the performance of super resolution reconstruction can be further enhanced, which can also extend the imaging range. In order to improve the resolution to the best possible level, an imaging model based on beam propagation is established and applied to image super-resolution reconstruction techniques for an underwater range-gated pulsed laser imaging system in the presented effort. Experimental results show that the proposed approaches can effectively enhance the resolution and quality of underwater imaging.

  15. A study on laser-based ultrasonic technique by the use of guided wave tomographic imaging

    SciTech Connect

    Park, Junpil; Lim, Juyoung; Cho, Younho; Krishnaswamy, Sridhar

    2015-03-31

    Guided wave tests are impractical for investigating specimens with limited accessibility, coarse surfaces, or geometrically complicated features. A non-contact setup with a laser ultrasonic transmitter and receiver is therefore attractive for guided wave inspection. The present work was done to develop a non-contact guided-wave tomography technique based on laser ultrasonics in a plate-like structure. A method for Lamb wave generation and detection in an aluminum plate with a pulsed laser ultrasonic transmitter and a Michelson interferometer receiver has been developed. In the images obtained by laser scanning, the defect shape and area showed good agreement with the actual defect. The proposed approach can be used as a non-contact online inspection and monitoring technique.

  16. Towards a balanced software team formation based on Belbin team role using fuzzy technique

    NASA Astrophysics Data System (ADS)

    Omar, Mazni; Hasan, Bikhtiyar; Ahmad, Mazida; Yasin, Azman; Baharom, Fauziah; Mohd, Haslina; Darus, Norida Muhd

    2016-08-01

    In software engineering (SE), team roles have a significant impact on determining project success. To ensure the optimal outcome of the project the team is working on, it is essential to ensure that the team members are assigned to the right role with the right characteristics. One of the prevalent team role frameworks is the Belbin team role model. A successful team must have a balance of team roles. Thus, this study demonstrates the steps taken to determine the balance of software team formation based on Belbin team roles using a fuzzy technique. The fuzzy technique was chosen because it allows analysis of imprecise data and classification of the selected criteria. In this study, two Belbin team roles, Shaper (Sh) and Plant (Pl), were chosen for assigning a specific role in a software team. Results show that the technique is able to be used for determining the balance of team roles. Future works will focus on the validation of the proposed method by using empirical data in an industrial setting.
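
    As one illustration of the kind of fuzzy classification described (the scoring scale and membership cut-offs below are invented for the example, not taken from the study), triangular membership functions can map a role-inventory score to Shaper/Plant memberships:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def role_membership(score):
    """Hypothetical fuzzification of a Belbin self-perception score (0-100)
    into membership degrees for the Shaper (Sh) and Plant (Pl) roles."""
    return {
        "Sh": tri(score, 40, 70, 100),
        "Pl": tri(score, 0, 30, 60),
    }

def assign_role(score):
    """Defuzzify by taking the role with the highest membership degree."""
    m = role_membership(score)
    return max(m, key=m.get)
```

    A balanced team would then be assembled by checking that the assigned roles cover the required mix rather than clustering on one role.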

  17. A Rapid, Fluorescence-Based Field Screening Technique for Organic Species in Soil and Water Matrices.

    PubMed

    Russell, Amber L; Martin, David P; Cuddy, Michael F; Bednar, Anthony J

    2016-06-01

    Real-time detection of hydrocarbon contaminants in the environment presents analytical challenges because traditional laboratory-based techniques are cumbersome and not readily field portable. In the current work, a method for rapid and semi-quantitative detection of organic contaminants, primarily crude oil, in natural water and soil matrices has been developed. Detection limits in the parts per million and parts per billion were accomplished when using visual and digital detection methods, respectively. The extraction technique was modified from standard methodologies used for hydrocarbon analysis and provides a straight-forward separation technique that can remove interference from complex natural constituents. For water samples this method is semi-quantitative, with recoveries ranging from 70 % to 130 %, while measurements of soil samples are more qualitative due to lower extraction efficiencies related to the limitations of field-deployable procedures. PMID:26988223

  18. Research on target recognition techniques of radar networking based on fuzzy mathematics

    NASA Astrophysics Data System (ADS)

    Guan, Chengbin; Wang, Guohong; Guan, Chengzhun; Pan, Jinshan

    2007-11-01

    Nowadays there are more and more targets, so it is more difficult for radar networking to track the important targets. To reduce the pressure on radar networking and the waste of ammunition, it is very necessary for radar networking to recognize the targets. Two target recognition approaches for radar networking based on fuzzy mathematics are proposed in this paper: the multi-level fuzzy synthetical evaluation technique and the lattice approaching degree technique. By analyzing the principles, the application techniques are given, the merits and shortcomings are analyzed, and suitable application environments are suggested. Another emphasis is the comparison between multiple mono-level fuzzy synthetical evaluation and multi-level fuzzy synthetical evaluation; an example is presented to illustrate the problem, the results are analyzed in theory, and conclusions are drawn that can serve as guidance for engineering applications.
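
    Multi-level fuzzy synthetical evaluation aggregates a weight vector with a fuzzy relation matrix at each level, then feeds the level results upward. A minimal sketch with the weighted-average composition operator (one common choice; the paper may use a different operator) is:

```python
def fuzzy_synthetic(weights, R):
    """Single-level fuzzy synthetic evaluation with the weighted-average
    operator M(*, +): B_j = sum_i w_i * R[i][j], where row i of R holds the
    membership degrees of factor i in each evaluation grade j."""
    return [sum(w * row[j] for w, row in zip(weights, R))
            for j in range(len(R[0]))]

def multilevel(level_weights, factor_weights, Rs):
    """Two-level evaluation: evaluate each factor block first, then use the
    first-level result vectors as the rows of the second-level relation."""
    B = [fuzzy_synthetic(w, R) for w, R in zip(factor_weights, Rs)]
    return fuzzy_synthetic(level_weights, B)
```

    The final vector gives the target's membership in each recognition class; the class with the largest membership would be reported.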

  19. X-ray phase imaging using a Gd-based absorption grating fabricated by imprinting technique

    NASA Astrophysics Data System (ADS)

    Yashiro, Wataru; Kato, Kosuke; Sadeghilaridjani, Maryam; Momose, Atsushi; Shinohara, Takenao; Kato, Hidemi

    2016-04-01

    A high-aspect-ratio absorption grating with a pitch of several µm is a key component of X-ray grating interferometry, an X-ray phase imaging technique that allows highly sensitive X-ray imaging with a compact laboratory X-ray source. Here, we report that X-ray phase imaging was successfully performed at 15 keV using a 23 ± 1-µm-height, 9-µm-pitch absorption grating (10 × 10 mm2) based on Gd (Gd60Cu25Al15) fabricated by a metallic glass imprinting technique. The imprinting technique is cost-efficient, offers a high production rate, and should find wide use in fabricating gratings not only for X-rays but also for neutrons in the near future.

  20. Optimal technique of linear accelerator-based stereotactic radiosurgery for tumors adjacent to brainstem.

    PubMed

    Chang, Chiou-Shiung; Hwang, Jing-Min; Tai, Po-An; Chang, You-Kang; Wang, Yu-Nong; Shih, Rompin; Chuang, Keh-Shih

    2016-01-01

    Stereotactic radiosurgery (SRS) is a well-established technique that is replacing whole-brain irradiation in the treatment of intracranial lesions, leading to better preservation of brain function and therefore a better quality of life for the patient. Several forms of linear accelerator (LINAC)-based SRS are available, and the goal of the present study is to identify which of these techniques performs best, as evaluated statistically by dosimetric outcomes, when the target lies adjacent to the brainstem. We collected the records of 17 patients with lesions close to the brainstem who had previously been treated with single-fraction radiosurgery. In all, 5 different lesion categories were represented, and the patients were divided into 2 distance groups: 7 patients with a target-to-brainstem distance of less than 0.5 cm, and 10 patients with a distance of ≥ 0.5 and < 1 cm. Comparison was then made among 3 types of LINAC-based radiosurgery: dynamic conformal arcs (DCA), intensity-modulated radiosurgery (IMRS), and volumetric modulated arc radiotherapy (VMAT). All techniques included multiple noncoplanar beams or arcs with or without intensity-modulated delivery. Gross tumor volume (GTV) ranged from 0.2 cm(3) to 21.9 cm(3). The dose homogeneity index (HIICRU) and conformity index (CIICRU) showed no statistically significant differences between techniques; however, the average CIICRU of 1.09 ± 0.56 achieved by VMAT was the best of the 3 techniques. Moreover, a notable improvement in gradient index (GI) was observed with VMAT (0.74 ± 0.13), significantly better than the 2 other techniques (p < 0.05). For V4Gy of brainstem, both VMAT (2.5%) and IMRS (2.7%) were significantly lower than DCA (4.9%), both at the p < 0.05 level. Regarding V2Gy of normal brain, VMAT plans had attained 6.4 ± 5%; this was significantly better (p < 0.05) than

  1. An extensively hydrolysed rice protein-based formula in the management of infants with cow's milk protein allergy: preliminary results after 1 month

    PubMed Central

    Vandenplas, Yvan; De Greef, Elisabeth; Hauser, Bruno

    2014-01-01

    Background Guidelines recommend extensively hydrolysed cow's milk protein formulas (eHF) for the treatment of infants diagnosed with cow's milk protein allergy (CMPA). Extensively hydrolysed rice protein infant formulas (eRHFs) have recently become available and could offer a valid alternative. Methods A prospective trial was performed to evaluate the clinical tolerance of a new eRHF in infants with confirmed CMPA. Patients were followed for 1 month. Clinical tolerance of the eRHF was evaluated with a symptom-based score (SBS), and growth (weight and length) was monitored. Results Thirty-nine infants (mean age 3.4 months, range 0.5–6 months) diagnosed with CMPA were enrolled. All infants tolerated the eRHF and showed normal growth. Conclusions In accordance with current guidelines, this eRHF is tolerated by more than 90% of children with proven CMPA (with 95% confidence), and is an adequate alternative to cow's milk-based eHF. Trial registration number ClinicalTrials.gov NCT01998074. PMID:24914098

  2. Review of pyroelectric thermal energy harvesting and new MEMs-based resonant energy conversion techniques

    NASA Astrophysics Data System (ADS)

    Hunter, Scott R.; Lavrik, Nickolay V.; Mostafa, Salwa; Rajic, Slo; Datskos, Panos G.

    2012-06-01

    Harvesting electrical energy from thermal energy sources using pyroelectric conversion techniques has been under investigation for over 50 years, but it has not received the attention that thermoelectric energy harvesting techniques have during this period. This lack of interest stems from early studies which found that the energy conversion efficiencies achievable with pyroelectric materials were several times lower than those potentially achievable with thermoelectrics. More recent modeling and experimental studies have shown that pyroelectric techniques can be cost competitive with thermoelectrics and, using new temperature cycling techniques, have the potential to be several times as efficient as thermoelectrics under comparable operating conditions. This paper reviews the recent history of the field and describes the techniques being developed to increase the opportunities for pyroelectric energy harvesting. The development of a new thermal energy harvester concept, based on temperature-cycled pyroelectric thermal-to-electrical energy conversion, is also outlined. The approach uses a resonantly driven, pyroelectric capacitive bimorph cantilever structure that can rapidly cycle the temperature in the energy harvester. The device has been modeled using a finite element multi-physics based method, in which the effects of the structure's material properties and system parameters on the frequency and magnitude of temperature cycling, and on the efficiency of energy recycling using the proposed structure, have been examined. Results show that thermal contact conductance and heat source temperature differences play key roles in determining the cantilever resonant frequency and the efficiency of the energy conversion technique. This paper outlines the modeling, fabrication and testing of cantilever and pyroelectric structures and single element devices that demonstrate the potential of this technology for the development of high efficiency thermal

  3. Development of evaluation technique of GMAW welding quality based on statistical analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shengqiang; Terasaki, Hidenri; Komizo, Yuichi; Hu, Shengsun; Chen, Donggao; Ma, Zhihua

    2014-11-01

    Nondestructive techniques for appraising gas metal arc welding (GMAW) faults play a very important role in on-line quality control and prediction for the GMAW process. On-line welding quality control and prediction suffer from several disadvantages, such as high cost, low efficiency, complexity, and strong sensitivity to the environment. An enhanced, efficient technique for evaluating welding faults based on the Mahalanobis distance (MD) and the normal distribution is presented. In addition, a new piece of equipment, designated the weld quality tester (WQT), has been developed based on the proposed evaluation technique. MD is superior to other multidimensional distances such as the Euclidean distance because the covariance matrix used in its calculation accounts for correlations and scaling in the data. The values of MD obtained from welding current and arc voltage are assumed to follow a normal distribution with mean µ and standard deviation σ. In the evaluation technique used by the WQT, values of MD in the range from zero to µ + 3σ are regarded as "good". Two experiments, involving changing the flow of shielding gas and smearing paint on the surface of the substrate, were conducted to verify the sensitivity of the proposed evaluation technique and the feasibility of using the WQT. The experimental results demonstrate the usefulness of the WQT for evaluating welding quality. The proposed technique can be applied to implement on-line welding quality control and prediction, which is of great importance in designing novel equipment for weld quality detection.
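    The MD-based pass/fail rule described above can be sketched as follows; the welding-current and arc-voltage statistics are synthetic, not the WQT's calibration data.

```python
import numpy as np

# Compute the Mahalanobis distance of each (current, voltage) sample from
# a reference set of known-good welds, then accept samples whose MD falls
# below mean(MD) + 3 * std(MD) of the reference distances.

rng = np.random.default_rng(0)
good = rng.normal([220.0, 28.0], [5.0, 0.8], size=(500, 2))  # reference welds

mu = good.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(good, rowvar=False))

def mahalanobis(x):
    d = np.asarray(x) - mu
    return float(np.sqrt(d @ cov_inv @ d))

md_ref = np.array([mahalanobis(x) for x in good])
threshold = md_ref.mean() + 3.0 * md_ref.std()

sample_ok = mahalanobis([221.0, 28.2])    # near the nominal operating point
sample_bad = mahalanobis([190.0, 33.0])   # far off-nominal: flagged as a fault
```

    Because the covariance matrix whitens the data, a deviation of a few volts counts as heavily as a deviation of tens of amperes, which is exactly the scaling advantage over the Euclidean distance noted in the abstract.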

  4. Review of pyroelectric thermal energy harvesting and new MEMs based resonant energy conversion techniques

    SciTech Connect

    Hunter, Scott Robert; Lavrik, Nickolay V; Mostafa, Salwa; Rajic, Slobodan; Datskos, Panos G

    2012-01-01

    Harvesting electrical energy from thermal energy sources using pyroelectric conversion techniques has been under investigation for over 50 years, but it has not received the attention that thermoelectric energy harvesting techniques have during this period. This lack of interest stems from early studies which found that the energy conversion efficiencies achievable with pyroelectric materials were several times lower than those potentially achievable with thermoelectrics. More recent modeling and experimental studies have shown that pyroelectric techniques can be cost competitive with thermoelectrics and, using new temperature cycling techniques, have the potential to be several times as efficient as thermoelectrics under comparable operating conditions. This paper reviews the recent history of the field and describes the techniques being developed to increase the opportunities for pyroelectric energy harvesting. The development of a new thermal energy harvester concept, based on temperature-cycled pyroelectric thermal-to-electrical energy conversion, is also outlined. The approach uses a resonantly driven, pyroelectric capacitive bimorph cantilever structure that can rapidly cycle the temperature in the energy harvester. The device has been modeled using a finite element multi-physics based method, in which the effects of the structure's material properties and system parameters on the frequency and magnitude of temperature cycling, and on the efficiency of energy recycling using the proposed structure, have been examined. Results show that thermal contact conductance and heat source temperature differences play key roles in determining the cantilever resonant frequency and the efficiency of the energy conversion technique. This paper outlines the modeling, fabrication and testing of cantilever and pyroelectric structures and single element devices that demonstrate the potential of this technology for the development of high efficiency thermal

  5. Sex-based differences in lifting technique under increasing load conditions: A principal component analysis.

    PubMed

    Sheppard, P S; Stevenson, J M; Graham, R B

    2016-05-01

    The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopted a similar lifting technique. In addition, as load increased male and female participants changed their lifting technique in a similar manner. PMID:26851478
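    The waveform-level PCA with the 90% trace criterion described above can be sketched as follows; the joint-angle waveforms here are synthetic stand-ins, not the study's motion-capture data.

```python
import numpy as np

# Stack each trial's joint-angle waveform as a row, centre the data, and
# keep the smallest number of principal components whose eigenvalues
# account for 90% of the trace (total variance).

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 101)                      # normalized lift cycle
scores = rng.normal(size=(40, 2))                   # 40 trials, 2 true modes
X = (scores[:, :1] * np.sin(np.pi * t)              # dominant flexion shape
     + 0.6 * scores[:, 1:] * np.sin(2 * np.pi * t)  # secondary shape
     + 0.01 * rng.normal(size=(40, t.size)))        # measurement noise

Xc = X - X.mean(axis=0)                             # centre each time point
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]  # descending
explained = np.cumsum(eigvals) / eigvals.sum()
n_pcs = int(np.searchsorted(explained, 0.90) + 1)   # 90% trace criterion
```

    Each retained PC score can then be entered into the repeated-measures ANOVA, which is how the study compares whole lifting waveforms rather than discrete points on the curve.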

  6. The Flap Sandwich Technique for a Safe and Aesthetic Skull Base Reconstruction.

    PubMed

    Yano, Tomoyuki; Okazaki, Mutsumi; Tanaka, Kentarou; Iida, Hideo

    2016-02-01

    For safe and reliable skull base reconstruction combined with repair of cranial bone defects, we introduce the flap sandwich technique in this study. A titanium mesh is often used to repair structural cranial bone defects because it has less donor site morbidity and is easy to handle. However, titanium mesh has disadvantages of exposure and infection postoperatively. To improve surgical outcomes, we applied the flap sandwich technique to 3 cases of skull base reconstruction combined with cranial bone defect repair. Two anterior skull base defects and 1 middle skull base defect were included in this study. The subjects were all women, aged 30, 58, and 62 years. One patient had former multiple craniotomies and another patient had preoperative radiotherapy. The flap sandwich technique involves structural cranial bone reconstruction with a titanium mesh and soft tissue reconstruction with a chimeric anterolateral thigh free flap. First, the dead space between the repaired dura and the titanium mesh is filled with vastus lateralis muscle, and then structural reconstruction is performed with a titanium mesh. Finally, the titanium mesh is totally covered with the adiposal flap of the anterolateral thigh free flap. The muscle flap protects the dead space from infection, and the adiposal flap covers the titanium mesh to reduce mechanical stress on the covered skin and thus prevent the exposure of the titanium mesh through the scalp. By applying this technique, there was no intracranial infection or titanium mesh exposure in these 3 cases postoperatively, even though 2 patients had postoperative radiotherapy. Additionally, the adiposal flap could provide a soft and natural contour to the scalp and forehead region, and this gives patients a better facial appearance even though they have had skull base surgery. PMID:25954846

  7. An efficient algorithm for multipole energies and derivatives based on spherical harmonics and extensions to particle mesh Ewald.

    PubMed

    Simmonett, Andrew C; Pickard, Frank C; Schaefer, Henry F; Brooks, Bernard R

    2014-05-14

    Next-generation molecular force fields deliver accurate descriptions of non-covalent interactions by employing more elaborate functional forms than their predecessors. Much work has been dedicated to improving the description of the electrostatic potential (ESP) generated by these force fields. A common approach to improving the ESP is by augmenting the point charges on each center with higher-order multipole moments. The resulting anisotropy greatly improves the directionality of the non-covalent bonding, with a concomitant increase in computational cost. In this work, we develop an efficient strategy for enumerating multipole interactions, by casting an efficient spherical harmonic based approach within a particle mesh Ewald (PME) framework. Although the derivation involves lengthy algebra, the final expressions are relatively compact, yielding an approach that can efficiently handle both finite and periodic systems without imposing any approximations beyond PME. Forces and torques are readily obtained, making our method well suited to modern molecular dynamics simulations. PMID:24832247
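    As a minimal illustration of why higher-order moments add directionality (not the paper's spherical-harmonic or PME machinery), the energy of a point charge in the field of a point dipole depends on orientation, unlike a charge-charge term; units and values below are arbitrary (Gaussian-style, no 4πε0 factor).

```python
import numpy as np

# Energy of a point charge q at position r relative to a point dipole p
# at the origin: U = q (p . r) / |r|^3. The charge-charge term q1*q2/r
# has no angular dependence; the dipole term does.

def charge_dipole_energy(q, p, r):
    r = np.asarray(r, dtype=float)
    dist = np.linalg.norm(r)
    return q * float(np.dot(np.asarray(p), r)) / dist**3

q = 1.0
p = np.array([0.0, 0.0, 1.0])
u_axial = charge_dipole_energy(q, p, [0.0, 0.0, 2.0])  # along the dipole axis
u_perp = charge_dipole_energy(q, p, [2.0, 0.0, 0.0])   # equatorial plane: zero
```

    The anisotropy visible here (maximal on-axis, vanishing in the equatorial plane) is what the multipole-augmented force fields exploit to improve directional non-covalent bonding.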

  8. caTissue Suite to OpenSpecimen: Developing an extensible, open source, web-based biobanking management system.

    PubMed

    McIntosh, Leslie D; Sharma, Mukesh K; Mulvihill, David; Gupta, Snehil; Juehne, Anthony; George, Bijoy; Khot, Suhas B; Kaushal, Atul; Watson, Mark A; Nagarajan, Rakesh

    2015-10-01

    The National Cancer Institute (NCI) Cancer Biomedical Informatics Grid® (caBIG®) program established standards and best practices for biorepository data management by creating an infrastructure to propagate biospecimen resource sharing while maintaining data integrity and security. caTissue Suite, a biospecimen data management software tool, evolved from this effort and more recently continues to evolve as an open source initiative known as OpenSpecimen. The essential functionality of OpenSpecimen includes the capture and representation of highly granular, hierarchically structured data for biospecimen processing, quality assurance, tracking, and annotation. Ideal for multi-user and multi-site biorepository environments, OpenSpecimen permits role-based access to specific sets of data operations through a user interface designed to accommodate varying workflows and unique user needs. The software is interoperable, both syntactically and semantically, with an array of other bioinformatics tools through its integration of standard vocabularies, thus enabling research involving biospecimens. End-users are encouraged to share their day-to-day experiences with the application, providing the community board with insight into the needs and limitations that should be addressed. Users are also asked to review and validate new features through group testing environments and mock screens. Through this user interaction, application flexibility and interoperability have been recognized as necessary development focuses, essential for accommodating diverse adoption scenarios and biobanking workflows and for catalyzing advances in biomedical research and operations. Given the diversity of biobanking practices and workforce roles, consistent efforts have been made to maintain robust data granularity while improving user accessibility, data discoverability, and security within and across applications by providing a lower learning curve in using Open

  9. Validation and extension of the PREMM1,2 model in a population-based cohort of colorectal cancer patients

    PubMed Central

    Balaguer, Francesc; Balmaña, Judith; Castellví-Bel, Sergi; Steyerberg, Ewout W.; Andreu, Montserrat; Llor, Xavier; Jover, Rodrigo; Syngal, Sapna; Castells, Antoni

    2008-01-01

    Summary Background and aims Early recognition of patients at risk for Lynch syndrome is critical but often difficult. Recently, a predictive algorithm - the PREMM1,2 model - has been developed to quantify the risk of carrying a germline mutation in the mismatch repair (MMR) genes MLH1 and MSH2. However, its performance in an unselected, population-based colorectal cancer cohort, as well as its performance in combination with tumor MMR testing, is unknown. Methods We included all colorectal cancer cases from the EPICOLON study, a prospective, multicenter, population-based cohort (n=1,222). All patients underwent tumor microsatellite instability analysis and immunostaining for MLH1 and MSH2, and those with MMR deficiency (n=91) underwent tumor BRAF V600E mutation analysis and MLH1/MSH2 germline testing. Results The PREMM1,2 model with a ≥5% cut-off had a sensitivity, specificity and positive predictive value (PPV) of 100%, 68% and 2%, respectively. A higher PREMM1,2 cut-off provided higher specificity and PPV at the expense of lower sensitivity. Combining the ≥5% cut-off with tumor MMR testing maintained 100% sensitivity with increased specificity (97%) and PPV (21%). The PPV of a PREMM1,2 score ≥20% alone (16%) approached the PPV obtained with a PREMM1,2 score ≥5% combined with tumor MMR testing. In addition, a PREMM1,2 score of <5% was associated with a high likelihood of a BRAF V600E mutation. Conclusions The PREMM1,2 model is useful for identifying MLH1/MSH2 mutation carriers among unselected colorectal cancer patients. Quantitative assessment of genetic risk might be useful for deciding on subsequent tumor MMR and germline testing. PMID:18061181
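    The interplay of sensitivity, specificity, and PPV reported above follows directly from Bayes' rule; the mutation prevalence used below is a hypothetical round value, not a figure from the EPICOLON cohort.

```python
# Positive predictive value from sensitivity, specificity, and prevalence:
# PPV = TP / (TP + FP), with TP = sens * p and FP = (1 - spec) * (1 - p).

def ppv(sensitivity, specificity, prevalence):
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# With sensitivity 1.00 and specificity 0.68 (the >=5% cut-off alone), a
# rare mutation (assumed here to be ~0.7% of cases) yields a PPV of only
# a few percent: a sensitive but unspecific score dilutes its positives.
low = ppv(1.00, 0.68, 0.007)
# Raising specificity to ~0.97 (the cut-off combined with tumor MMR
# testing, per the abstract) lifts the PPV by roughly an order of magnitude.
high = ppv(1.00, 0.97, 0.007)
```

    This is why the abstract's combination strategy keeps 100% sensitivity while pushing the PPV from 2% toward 21%: almost all of the gain comes from shrinking the false-positive term.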

  10. Preliminary study of an angiographic and angio-tomographic technique based on K-edge filters

    SciTech Connect

    Golosio, Bruno; Brunetti, Antonio; Oliva, Piernicola; Carpinelli, Massimo; Luca Masala, Giovanni; Meloni, Francesco; Battista Meloni, Giovanni

    2013-08-14

    Digital subtraction angiography is commonly affected by artifacts due to patient movement between the acquisition of the images without and with the contrast medium. This paper presents a preliminary study of an angiographic and angio-tomographic technique based on the quasi-simultaneous acquisition of two images obtained using two different filters at the exit of an X-ray tube. One of the two filters (the K-edge filter) contains the same chemical element used as the contrast agent (gadolinium in this study). This filter absorbs radiation with energy just above the so-called K-edge energy of gadolinium more strongly than radiation with energy just below it. The other filter (an aluminium filter in this study) simply suppresses the low-energy contribution to the spectrum. Using proper calibration curves, the two images are combined to obtain an image of the contrast agent distribution. In the angio-tomographic application of the proposed technique, two images, corresponding to the two filter types, are acquired for each viewing angle of the tomographic scan. From the two tomographic reconstructions, it is possible to obtain a three-dimensional map of the contrast agent distribution. The technique was tested on a sample consisting of a rat skull placed inside a container filled with water. Six small cylinders of 4.7 mm internal diameter containing the contrast medium at different concentrations were placed inside the skull. In the plain angiographic application of the technique, five of the six cylinders were visible, with gadolinium concentrations down to 0.96%. In the angio-tomographic application, all six cylinders were visible, with gadolinium concentrations down to 0.49%. This preliminary study shows that the proposed technique can provide images of the contrast medium at low concentration without most of the artifacts present in images produced by conventional techniques. The results encourage further investigation on the feasibility of a clinical
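    The two-filter combination step can be sketched schematically as a weighted subtraction in log-attenuation space; the attenuation coefficients and phantom geometry below are invented, whereas the real method uses measured calibration curves.

```python
import numpy as np

# In log-attenuation space the background contributes nearly equally to
# the two filtered images, while gadolinium attenuates much more strongly
# above its K-edge than below it, so a scaled difference of the two
# images isolates the contrast-agent thickness map.

mu_gd_above, mu_gd_below = 25.0, 6.0   # Gd attenuation above/below K-edge
mu_bg = 2.0                            # background attenuation (both bands)

gd_map = np.zeros((8, 8))
gd_map[3:5, 3:5] = 0.05                # Gd-equivalent thickness (cm)
bg_map = np.full((8, 8), 1.0)          # uniform background thickness (cm)

# Log-attenuation images seen through each filter.
img_above = mu_gd_above * gd_map + mu_bg * bg_map
img_below = mu_gd_below * gd_map + mu_bg * bg_map

# The background cancels in the difference; dividing by the attenuation
# step across the K-edge recovers the Gd map.
gd_recovered = (img_above - img_below) / (mu_gd_above - mu_gd_below)
```

    Because both images are acquired quasi-simultaneously, the subtraction does not suffer from the patient-motion misregistration that afflicts conventional temporal subtraction.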

  11. LiftingWiSe: a lifting-based efficient data processing technique in wireless sensor networks.

    PubMed

    Aboelela, Emad

    2014-01-01

    Monitoring thousands of objects deployed over large, hard-to-reach areas is an important application of wireless sensor networks (WSNs). Such an application requires disseminating a large amount of data within the WSN, including, but not limited to, each object's location and the environmental conditions at that location. WSNs require efficient data processing and dissemination because of the limited storage, processing power, and energy available in WSN nodes. The aim of this paper is to propose a data processing technique that can work under constrained storage, processing, and energy resources. The proposed technique utilizes the lifting procedure, usually used in discrete wavelet transform (DWT) operations, to process the disseminated data. The technique is referred to as LiftingWiSe, which stands for Lifting-based efficient data processing technique for Wireless Sensor Networks. LiftingWiSe has been tested and compared with other relevant techniques from the literature via a simulation of the monitored field and the deployed wireless sensor network nodes. The simulation results are analyzed and discussed. PMID:25116902
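    The lifting procedure underlying this kind of technique can be shown with the simplest case, the Haar wavelet; the sensor readings below are arbitrary example values, not data from the paper.

```python
# Lifting-scheme sketch (Haar wavelet): split samples into even/odd,
# predict each odd sample from its even neighbour (detail coefficients),
# and update the evens to preserve the running mean (approximation
# coefficients). The transform is in-place-friendly and exactly invertible,
# which is what makes lifting attractive on memory-constrained nodes.

def haar_lift(signal):
    even, odd = signal[0::2], signal[1::2]
    detail = [o - e for o, e in zip(odd, even)]            # predict step
    approx = [e + d / 2.0 for e, d in zip(even, detail)]   # update step
    return approx, detail

def haar_unlift(approx, detail):
    even = [a - d / 2.0 for a, d in zip(approx, detail)]   # undo update
    odd = [e + d for e, d in zip(even, detail)]            # undo predict
    out = []
    for e, o in zip(even, odd):
        out.extend([e, o])
    return out

readings = [21.0, 21.5, 22.0, 25.0, 24.5, 24.0, 23.0, 22.5]
approx, detail = haar_lift(readings)
assert haar_unlift(approx, detail) == readings   # perfect reconstruction
```

    A node can transmit only the approximation coefficients (half the samples) plus any large detail coefficients, reducing dissemination cost while bounding reconstruction error.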

  12. Weighted Least Squares Techniques for Improved Received Signal Strength Based Localization

    PubMed Central

    Tarrío, Paula; Bernardos, Ana M.; Casar, José R.

    2011-01-01

    The practical deployment of wireless positioning systems requires minimizing calibration procedures while improving location estimation accuracy. Received Signal Strength localization techniques using propagation channel models are the simplest alternative, but they are usually designed under the assumption that the radio propagation model is perfectly characterized a priori. In practice, this assumption does not hold, and the localization results are affected by the inaccuracies of the theoretical, roughly calibrated or simply imperfect channel models used to compute location. In this paper, we propose the use of weighted multilateration techniques to gain robustness against these inaccuracies, reducing the dependence on an optimal channel model. In particular, we propose two weighted least squares techniques, based on the standard hyperbolic and circular positioning algorithms, that specifically consider the accuracies of the different measurements to obtain a better estimate of the position. These techniques are compared with the standard hyperbolic and circular positioning techniques through both numerical simulations and an exhaustive set of real experiments on different types of wireless networks (a wireless sensor network, a WiFi network and a Bluetooth network). The algorithms not only produce better localization results with very limited overhead in terms of computational cost but also achieve greater robustness to inaccuracies in channel modeling. PMID:22164092
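    A weighted circular (range-based) multilateration step of the kind described above can be sketched as follows; the anchor layout, ranges, and weights are synthetic, and this is a linearized sketch rather than the paper's exact formulation.

```python
import numpy as np

# Linearize the range equations |a_i - x|^2 = d_i^2 against a reference
# anchor a_0 and solve the resulting linear system by weighted least
# squares, weighting each measurement by its assumed accuracy.

def wls_position(anchors, ranges, weights):
    a0, d0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (d0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    W = np.diag(weights[1:])
    # Weighted normal equations: (A^T W A) x = A^T W b
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)   # noise-free ranges
weights = np.array([1.0, 1.0, 1.0, 0.25])  # e.g. distant anchor trusted less

est = wls_position(anchors, ranges, weights)
```

    In an RSS system the ranges d_i come from inverting the path-loss model, and the weights would reflect the larger range variance at low received power; the reference anchor's own weight is unused here because its equation is eliminated by the subtraction.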

  13. [Research progress on urban carbon fluxes based on eddy covariance technique].

    PubMed

    Liu, Min; Fu, Yu-Ling; Yang, Fang

    2014-02-01

    Land use change and fossil fuel consumption due to urbanization have a significant effect on the global carbon cycle and climate change. Accurate estimation and understanding of the carbon budget and its characteristics are prerequisites for studying the carbon cycle and its driving mechanisms in urban systems. Based on the theory of the eddy covariance (EC) technique and the characteristics of the urban atmospheric boundary layer and carbon cycle, this study systematically reviewed the principles of CO2 flux monitoring in urban systems with the EC technique, and then summarized the problems faced in urban CO2 flux monitoring and the methods for data processing and assessment. The main research progress on urban carbon fluxes measured with the EC technique is also illustrated. The results showed that the urban surface mostly acts as a net carbon source. The CO2 exchange between the urban surface and the atmosphere shows obvious diurnal, weekly and seasonal variation resulting from vehicle exhaust, domestic heating and vegetation respiration. However, great uncertainties still exist in urban flux measurement and its interpretation owing to the high spatial heterogeneity and complex distribution of carbon sources/sinks in urban environments. Finally, we suggest that further research on the EC technique and data assessment in complex urban areas be strengthened. It is also necessary to develop models of the urban carbon cycle on a systems basis, and to investigate the influencing mechanisms and variability of the urban carbon cycle at regional scale with spatial analysis techniques. PMID:24830264
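    The core EC computation is the time-averaged covariance of fluctuations in vertical wind speed and CO2 concentration; the synthetic 10 Hz series below imposes a known correlated component so the recovered flux is predictable (real processing adds coordinate rotation, density corrections, and quality control).

```python
import numpy as np

# Eddy flux F = mean(w' * c'), where w' and c' are deviations of vertical
# wind speed and CO2 concentration from their averaging-period means.

rng = np.random.default_rng(2)
n = 20 * 60 * 10                          # 20-minute period at 10 Hz
w = rng.normal(0.0, 0.3, n)               # vertical wind fluctuations (m/s)
c = 400.0 + 5.0 * w + rng.normal(0.0, 0.5, n)   # CO2 (ppm), correlated with w

w_prime = w - w.mean()                    # Reynolds decomposition
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)         # eddy flux, here in ppm m/s
```

    A positive flux (updrafts carrying CO2-enriched air) marks the surface as a carbon source, which is the typical urban result the review reports.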

  14. LiftingWiSe: A Lifting-Based Efficient Data Processing Technique in Wireless Sensor Networks

    PubMed Central

    Aboelela, Emad

    2014-01-01

    Monitoring thousands of objects deployed over large, hard-to-reach areas is an important application of wireless sensor networks (WSNs). Such an application requires disseminating a large amount of data within the WSN, including, but not limited to, each object's location and the environmental conditions at that location. WSNs require efficient data processing and dissemination because of the limited storage, processing power, and energy available in WSN nodes. The aim of this paper is to propose a data processing technique that can work under constrained storage, processing, and energy resources. The proposed technique utilizes the lifting procedure, usually used in discrete wavelet transform (DWT) operations, to process the disseminated data. The technique is referred to as LiftingWiSe, which stands for Lifting-based efficient data processing technique for Wireless Sensor Networks. LiftingWiSe has been tested and compared with other relevant techniques from the literature via a simulation of the monitored field and the deployed wireless sensor network nodes. The simulation results are analyzed and discussed. PMID:25116902

  15. Evaluation of Clipping Based Iterative PAPR Reduction Techniques for FBMC Systems

    PubMed Central

    Kollár, Zsolt

    2014-01-01

    This paper investigates filter bank multicarrier (FBMC), a multicarrier modulation technique exhibiting an extremely low adjacent channel leakage ratio (ACLR) compared with the conventional orthogonal frequency division multiplexing (OFDM) technique. The low ACLR of the transmitted FBMC signal makes it especially favorable in cognitive radio applications, where strict requirements are posed on out-of-band radiation. A large dynamic range, resulting in a high peak-to-average power ratio (PAPR), is characteristic of all multicarrier signals. The advantageous spectral properties of the high-PAPR FBMC signal are significantly degraded if nonlinearities are present in the transceiver chain: spectral regrowth may appear, causing harmful interference in neighboring frequency bands. This paper presents novel clipping-based PAPR reduction techniques, evaluated and compared by simulations and measurements, with an emphasis on spectral aspects. It gives an overall comparison of PAPR reduction techniques, focusing on reducing the dynamic range of FBMC signals without increasing out-of-band radiation. An overview is presented of transmitter-oriented techniques employing baseband clipping, which can maintain system performance at a desired bit error rate (BER). PMID:24558338
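    PAPR measurement and plain envelope clipping, the building block of the iterative schemes discussed above, can be sketched on a generic multicarrier signal; the subcarrier count, modulation, and clipping ratio below are arbitrary, and a QPSK/IFFT signal stands in for a true FBMC waveform.

```python
import numpy as np

# Summing many independently modulated subcarriers produces occasional
# high peaks; clipping the envelope at a chosen multiple of the RMS
# bounds the PAPR at the cost of in-band distortion and spectral regrowth,
# which the iterative techniques then have to control.

rng = np.random.default_rng(3)
n_sc, n_sym = 64, 200
symbols = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=(n_sym, n_sc))
signal = np.fft.ifft(symbols, axis=1).ravel()   # multicarrier time signal

def papr_db(x):
    power = np.abs(x) ** 2
    return 10.0 * np.log10(power.max() / power.mean())

def clip(x, ratio):
    """Limit the envelope to ratio * RMS while preserving the phase."""
    limit = ratio * np.sqrt(np.mean(np.abs(x) ** 2))
    over = np.abs(x) > limit
    y = x.copy()
    y[over] = limit * x[over] / np.abs(x[over])
    return y

before = papr_db(signal)
after = papr_db(clip(signal, 1.6))   # bounded near 20*log10(1.6) ~ 4 dB
```

    A clipping ratio of 1.6 trades roughly 6 dB of PAPR for distortion; the iterative methods in the paper aim to keep that distortion out of the adjacent channels.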

  16. An acoustic-array based structural health monitoring technique for wind turbine blades

    NASA Astrophysics Data System (ADS)

    Aizawa, Kai; Poozesh, Peyman; Niezrecki, Christopher; Baqersad, Javad; Inalpolat, Murat; Heilmann, Gunnar

    2015-04-01

    This paper proposes a non-contact measurement technique for health monitoring of wind turbine blades using acoustic beamforming techniques. The technique works by mounting an audio speaker inside a wind turbine blade and observing the sound radiated from the blade to identify damage within the structure. The main hypothesis for the structural damage detection is that the structural damage (cracks, edge splits, holes etc.) on the surface of a composite wind turbine blade results in changes in the sound radiation characteristics of the structure. Preliminary measurements were carried out on two separate test specimens, namely a composite box and a section of a wind turbine blade to validate the methodology. The rectangular shaped composite box and the turbine blade contained holes with different dimensions and line cracks. An acoustic microphone array with 62 microphones was used to measure the sound radiation from both structures when the speaker was located inside the box and also inside the blade segment. A phased array beamforming technique and CLEAN-based subtraction of point spread function from a reference (CLSPR) were employed to locate the different damage types on both the composite box and the wind turbine blade. The same experiment was repeated by using a commercially available 48-channel acoustic ring array to compare the test results. It was shown that both the acoustic beamforming and the CLSPR techniques can be used to identify the damage in the test structures with sufficiently high fidelity.
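    The phased-array beamforming step used above can be sketched for the simplest narrowband case, delay-and-sum on a uniform linear array; the geometry, source frequency, and sampling here are invented, and the real system uses a 62-microphone 2-D array with CLSPR deconvolution.

```python
import numpy as np

# Phase-align the microphone signals for each candidate direction and
# pick the direction whose aligned sum has maximum power.

c = 343.0                        # speed of sound (m/s)
f = 2000.0                       # source tone frequency (Hz)
n_mics = 16
mic_x = np.arange(n_mics) * 0.04   # spacing below half a wavelength

true_angle = np.deg2rad(25.0)             # arrival angle from broadside
delays = mic_x * np.sin(true_angle) / c   # per-microphone propagation delays
t = np.arange(64) / 16000.0               # 64 snapshots at 16 kHz
snapshots = np.exp(2j * np.pi * f * (t[:, None] - delays[None, :]))

angles = np.deg2rad(np.linspace(-90.0, 90.0, 361))
steer = np.exp(-2j * np.pi * f * mic_x[None, :] * np.sin(angles)[:, None] / c)
power = (np.abs(snapshots @ steer.conj().T) ** 2).mean(axis=0)

est_angle = float(np.rad2deg(angles[np.argmax(power)]))
```

    In the damage-detection setting, the "source" is the sound leaking through a crack or hole, so a localized power peak on the blade surface flags the defect position.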

  17. Effectiveness of WISE colour-based selection techniques to uncover obscured AGN

    NASA Astrophysics Data System (ADS)

    Mateos, S.

    2014-07-01

    We present a highly reliable and efficient mid-infrared colour-based selection technique for luminous active galactic nuclei (AGN) using the Wide-field Infrared Survey Explorer (WISE) survey. Our technique is designed to identify objects with red mid-infrared power-law spectral energy distributions. We studied the dependence of our mid-infrared selection on the AGN intrinsic luminosity and the effectiveness of our technique at uncovering obscured AGN missed in X-ray surveys. To do so, we used two samples of luminous AGN independently selected in hard X-ray and optical surveys. We used the largest catalogue of 887 [OIII] λ5007-selected type 2 quasars (QSO2s) at z<~0.83 in the literature from the Sloan Digital Sky Survey (SDSS), and the 258 hard (>4.5 keV) X-ray-selected AGN from the Bright Ultrahard XMM-Newton Survey (BUXS). The effectiveness of our mid-infrared selection technique increases with AGN luminosity. At high luminosities, and at least up to z~1, our technique is very effective at identifying both Compton-thin and Compton-thick AGN.
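    A mid-infrared colour-based selection of this kind reduces, in its simplest form, to a cut in WISE colour space. The sketch below applies a single W1−W2 colour threshold to a toy catalogue; the 0.8 mag default and the example magnitudes are illustrative assumptions, not the colour wedge actually defined in the paper.

    ```python
    import numpy as np

    def select_agn_candidates(w1, w2, min_color=0.8):
        """Flag sources whose W1-W2 colour (Vega mag) is redder than a
        threshold, mimicking a red power-law SED cut. The 0.8 mag default
        is an illustrative value, not the paper's selection wedge."""
        w1 = np.asarray(w1, dtype=float)
        w2 = np.asarray(w2, dtype=float)
        return (w1 - w2) >= min_color

    # Toy catalogue of (W1, W2) magnitudes
    w1 = np.array([14.2, 15.0, 13.5, 16.1])
    w2 = np.array([13.1, 14.8, 12.9, 15.0])
    mask = select_agn_candidates(w1, w2)
    print(mask)   # the first and last sources are redder than the cut
    ```

    Because hot AGN-heated dust reddens W1−W2 regardless of nuclear obscuration, such cuts can recover obscured sources that X-ray surveys miss, which is the effect the paper quantifies.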

  18. Resonant fiber optic gyro based on a sinusoidal wave modulation and square wave demodulation technique.

    PubMed

    Wang, Linglan; Yan, Yuchao; Ma, Huilian; Jin, Zhonghe

    2016-04-20

    New developments are made in the resonant fiber optic gyro (RFOG), an optical sensor for the measurement of rotation rate. The digital signal processing system based on the phase modulation technique is capable of detecting the weak frequency difference induced by the Sagnac effect and suppressing the reciprocal noise in the circuit, which determines the detection sensitivity of the RFOG. A new technique based on sinusoidal wave modulation and square wave demodulation is implemented, and the demodulation curve of the system is simulated and measured. Compared with the previous technique using sinusoidal modulation and demodulation, it increases the slope of the demodulation curve by a factor of 1.56, improves the spectral efficiency of the modulated signal, and reduces the occupancy of the field-programmable gate array resource. On the basis of this new phase modulation technique, the loop is successfully locked and achieves a short-term bias stability of 1.08°/h, which is improved by a factor of 1.47. PMID:27140098
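    The square wave demodulation step is essentially a lock-in operation: multiply the detected signal by a square-wave reference at the modulation frequency and low-pass filter (here, average) the product, which extracts the fundamental scaled by the square wave's Fourier coefficient 2/π. The sketch below is a generic software illustration, not the authors' FPGA implementation; the sample rate, modulation frequency, and test tones are assumptions.

    ```python
    import numpy as np

    FS = 1_000_000      # sample rate, Hz
    F_MOD = 10_000      # modulation frequency, Hz

    def square_demod(signal, fs=FS, f_mod=F_MOD):
        """Multiply by a square-wave reference at f_mod and average
        (an ideal low-pass), as in a lock-in demodulator."""
        t = np.arange(len(signal)) / fs
        ref = np.sign(np.sin(2 * np.pi * f_mod * t))   # square-wave reference
        return np.mean(signal * ref)

    # A component at f_mod with amplitude a appears at the output scaled by
    # 2/pi; an off-frequency component averages out over the record.
    t = np.arange(100_000) / FS
    a = 0.5
    sig = (a * np.sin(2 * np.pi * F_MOD * t)
           + 0.2 * np.sin(2 * np.pi * 3.3 * F_MOD * t))
    out = square_demod(sig)
    print(out, a * 2 / np.pi)   # both ~0.318
    ```

    The slope improvement the paper reports comes from this 2/π-weighted correlation being steeper around resonance than sinusoidal demodulation, while a square reference is also cheap to generate in FPGA logic.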

  19. A scale space feature based registration technique for fusion of satellite imagery

    NASA Technical Reports Server (NTRS)

    Raghavan, Srini; Cromp, Robert F.; Campbell, William C.

    1997-01-01

    Feature-based registration is one of the most reliable methods for registering multi-sensor images (both active and passive imagery), since features are often more reliable than intensity or radiometric values. The only situation where a feature-based approach will fail is when the scene is completely homogeneous or densely textured, in which case a combination of feature-based and intensity-based methods may yield better results. In this paper, we present some preliminary results of testing our scale space feature-based registration technique, a modified version of the feature-based method developed earlier for classification of multi-sensor imagery. The proposed approach removes the sensitivity to parameter selection experienced in the earlier version, as explained later.
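    Once features have been extracted and matched across the two images, registration reduces to estimating a geometric transform from the matched point pairs. The sketch below fits an affine transform by least squares to synthetic matches; it is a generic illustration of the final step of any feature-based registration pipeline, not the scale space method of the paper, and the rotation/scale/shift test values are assumptions.

    ```python
    import numpy as np

    def fit_affine(src, dst):
        """Least-squares affine transform mapping matched feature points
        src -> dst. Returns a 2x3 matrix [A | t]."""
        src = np.asarray(src, dtype=float)
        dst = np.asarray(dst, dtype=float)
        X = np.hstack([src, np.ones((len(src), 1))])        # (n, 3)
        params, *_ = np.linalg.lstsq(X, dst, rcond=None)    # (3, 2)
        return params.T                                     # (2, 3)

    def apply_affine(M, pts):
        pts = np.asarray(pts, dtype=float)
        return pts @ M[:, :2].T + M[:, 2]

    # Synthetic matches: rotation by 30 deg, scale 1.2, shift (5, -3)
    theta = np.deg2rad(30)
    R = 1.2 * np.array([[np.cos(theta), -np.sin(theta)],
                        [np.sin(theta),  np.cos(theta)]])
    shift = np.array([5.0, -3.0])
    src = np.array([[0, 0], [10, 0], [0, 10], [7, 3], [2, 8]], dtype=float)
    dst = src @ R.T + shift
    M = fit_affine(src, dst)
    print(np.allclose(apply_affine(M, src), dst))   # True
    ```

    With real sensor data the matches are noisy and contain outliers, so a robust estimator (e.g. RANSAC around this least-squares core) would typically replace the plain fit.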

  20. Capillary electrophoresis of an 11-plex mtDNA coding region SNP single base extension assay for discrimination of the most common Caucasian HV1/HV2 mitotype.

    PubMed

    Vallone, Peter M

    2012-01-01

    The typing of single nucleotide polymorphisms (SNPs) located throughout the human mitochondrial genome assists in resolving individuals with an identical HV1/HV2 haplotype. A set of 11 sites selected for distinguishing individuals of a common Western European Caucasian HV1/HV2 mitotype was incorporated into a single base extension (SBE) assay. The assay was optimized for multiplex detection of sequence polymorphisms at positions 3010, 4793, 10211, 5004, 7028, 7202, 16519, 12858, 4580, 477, and 14470 in the mitochondrial genome. PCR primers were designed to allow multiplex amplification of unique regions in the mitochondrial genome, followed by an 11-plex SBE reaction using the SNaPshot® reagent kit. Separation and detection can be accomplished on a capillary-based electrophoresis platform commonly found in most forensic laboratories. PMID:22139659
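    Downstream of the electrophoresis, the 11 typed positions form a small SNP profile that can be compared between samples. The sketch below compares two such profiles represented as position-to-base dictionaries; the base calls shown are hypothetical and purely illustrative, not reference mitotypes, and the handling of untyped positions is an assumed convention.

    ```python
    # The 11 mtDNA coding-region positions typed by the SBE assay
    SBE_POSITIONS = (3010, 4793, 10211, 5004, 7028, 7202,
                     16519, 12858, 4580, 477, 14470)

    def compare_profiles(p1, p2):
        """Compare two SBE profiles (dicts of position -> called base).
        Returns the positions with differing calls; untyped positions
        (missing or 'N') are skipped rather than counted as exclusions."""
        diffs = []
        for pos in SBE_POSITIONS:
            a, b = p1.get(pos, "N"), p2.get(pos, "N")
            if "N" not in (a, b) and a != b:
                diffs.append(pos)
        return diffs

    # Hypothetical calls -- illustrative only
    evidence = {3010: "A", 4793: "A", 7028: "T", 16519: "C", 477: "T"}
    reference = {3010: "G", 4793: "A", 7028: "T", 16519: "C", 477: "T"}
    print(compare_profiles(evidence, reference))   # [3010]
    ```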