Science.gov

Sample records for base extension technique

  1. Fluorescence Based Primer Extension Technique to Determine Transcriptional Starting Points and Cleavage Sites of RNases In Vivo

    PubMed Central

    Schuster, Christopher F.; Bertram, Ralph

    2014-01-01

    Fluorescence based primer extension (FPE) is a molecular method to determine transcriptional starting points or processing sites of RNA molecules. This is achieved by reverse transcription of the RNA of interest using specific fluorescently labeled primers and subsequent analysis of the resulting cDNA fragments by denaturing polyacrylamide gel electrophoresis. Simultaneously, a traditional Sanger sequencing reaction is run on the gel to map the ends of the cDNA fragments to their exact corresponding bases. In contrast to 5'-RACE (Rapid Amplification of cDNA Ends), where the product must be cloned and multiple candidates sequenced, the bulk of cDNA fragments generated by primer extension can be simultaneously detected in one gel run. In addition, the whole procedure (from reverse transcription to final analysis of the results) can be completed in one working day. By using fluorescently labeled primers, the use of hazardous radioactive isotope labeled reagents can be avoided and processing times are reduced as products can be detected during the electrophoresis procedure. In the following protocol, we describe an in vivo fluorescent primer extension method to reliably and rapidly detect the 5' ends of RNAs to deduce transcriptional starting points and RNA processing sites (e.g., by toxin-antitoxin system components) in S. aureus, E. coli and other bacteria. PMID:25406941

  2. Extension of the preceding birth technique.

    PubMed

    Aguirre, A

    1994-01-01

    The Brass-inspired Preceding Birth Technique (PBT) is an indirect estimation technique with low costs of administration. PBT involves asking women at a time close to delivery about the survival of the preceding births. The proportion dead is close to the probability of dying between birth and the second birthday, an index of early childhood mortality (II or Q). Brass and Macrae have determined that II is an estimate of mortality between birth and an age lower than the birth interval, or around 4/5 of the birth interval. Hospital and clinic data are likely to include a concentration of women with lower risks of disease because of higher educational levels and socioeconomic status. A simulation of PBT data from the World Fertility Survey for Mexico and Peru found that the proportions of previously dead children for home deliveries were 0.156 in Peru and 0.092 in Mexico. Maternity clinic proportions were 0.088 in Peru and 0.066 in Mexico. Use of clinic and hospital data collection underestimated mortality by 32% in Peru and 15% in Mexico. Another alternative was proposed: interviewing women at a time other than delivery. If the interview was during a child/infant intervention after delivery, the subsample would still be subject to a bias, but this problem could be overcome by computing the weighted average of the actual probability of the older child being dead and the conditional probability of the younger child being dead or of both younger and older children being dead. Correction factors could be applied using the general standard of the logit life table system of Brass. Calculation of a simple average of the ages of the younger children could provide enough information to help decide which tables to use. Five surveys were selected for testing the factors of dependence between probabilities of death of successive siblings: Bangladesh, Lesotho, Kenya, Ghana, and Guyana. Higher mortality was related to lower dependency factors between the probabilities of death
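
    The selection-bias correction sketched above amounts to re-weighting subsample proportions. Below is a minimal illustration of that idea; the proportions are the Peru values quoted in the abstract, but the clinic-delivery share and the simple weighting scheme are assumptions for illustration, not Aguirre's exact procedure.

```python
# Hypothetical sketch: combine the proportion of preceding births reported dead
# in clinic and home subsamples into a population-level early-childhood
# mortality index. The clinic share (0.40) is an illustrative assumption.

def weighted_pbt(p_clinic: float, p_home: float, share_clinic: float) -> float:
    """Population-level proportion dead as a weighted average of subsamples."""
    return share_clinic * p_clinic + (1.0 - share_clinic) * p_home

# Peru proportions from the abstract, hypothetical 40% clinic-delivery share.
print(weighted_pbt(p_clinic=0.088, p_home=0.156, share_clinic=0.40))  # ~0.129
```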

  3. Two Extension Block Kirschner Wires' Technique for Bony Mallet Thumb

    PubMed Central

    Takase, Fumiaki; Ueda, Yasuhiro; Shinohara, Issei; Kuroda, Ryosuke; Kokubu, Takeshi

    2016-01-01

    Mallet finger, with an avulsion fracture of the distal phalanx or rupture of the terminal tendon of the extensor mechanism, is known as a common injury, while mallet thumb is very rare. In this paper, the case of a 19-year-old woman with a sprained left thumb sustained while playing basketball is presented. Plain radiographs and computed tomography revealed an avulsion fracture involving more than half of the articular surface at the base of the distal phalanx. Closed reduction and percutaneous fixation were performed using the two extension block Kirschner wires' technique under digital block anesthesia. At 4 months postoperatively, the patient had achieved excellent results according to Crawford's evaluation criteria and had no difficulties in working or playing basketball. Various conservative and operative treatment strategies have been reported for the management of mallet thumb. We chose the two extension block Kirschner wires' technique to minimize invasion of the extensor mechanism and nail bed and to stabilize the large fracture fragment. PMID:27774329

  4. Extension of an Itô-based general approximation technique for random vibration of a BBW general hysteresis model part II: Non-Gaussian analysis

    NASA Astrophysics Data System (ADS)

    Davoodi, H.; Noori, M.

    1990-07-01

    The work presented in this paper constitutes the second phase of on-going research aimed at developing mathematical models for representing general hysteretic behavior of structures and approximation techniques for the computation and analysis of the response of hysteretic systems to random excitations. In this second part, the technique previously developed by the authors for the Gaussian response analysis of non-linear systems with general hysteretic behavior is extended for the non-Gaussian analysis of these systems. This approximation technique is based on the approach proposed independently by Ibrahim and Wu-Lin. In this work up to fourth order moments of the response co-ordinates are obtained for the Bouc-Baber-Wen smooth hysteresis model. These higher order statistics previously have not been made available for general hysteresis models by using existing approximation methods. Second order moments obtained for the model by this non-Gaussian closure scheme are compared with equivalent linearization and Gaussian closure results via Monte Carlo simulation (MCS). Higher order moments are compared with the simulation results. The study performed for a wide range of degradation parameters and input power spectral density ( PSD) levels shows that the non-Gaussian responses obtained by this approach are in better agreement with the MCS results than the linearized and Gaussian ones. This approximation technique can provide information on higher order moments for general hysteretic systems. This information is valuable in random vibration and the reliability analysis of hysteretically yielding structures.
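
    For context, the smooth hysteresis family referred to above (Bouc-Wen with the Baber degradation extensions) is commonly written in the following textbook form; the symbols are generic and not necessarily the notation of Davoodi and Noori.

```latex
% SDOF oscillator with a smooth Bouc--Wen type hysteretic restoring force;
% nu(eps) and eta(eps) are strength/stiffness degradation functions of the
% dissipated hysteretic energy eps (Baber--Wen form).
\begin{aligned}
\ddot{x} + 2\zeta\omega_0\dot{x} + \alpha\omega_0^2 x + (1-\alpha)\omega_0^2 z &= f(t),\\
\dot{z} &= \frac{A\dot{x} - \nu(\varepsilon)\bigl(\beta\,\lvert\dot{x}\rvert\,\lvert z\rvert^{\,n-1}z
           + \gamma\,\dot{x}\,\lvert z\rvert^{\,n}\bigr)}{\eta(\varepsilon)}.
\end{aligned}
```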

  5. Growth Physics in Nitella: a Method for Continuous in Vivo Analysis of Extensibility Based on a Micro-manometer Technique for Turgor Pressure.

    PubMed

    Green, P B

    1968-08-01

    The view that the plant cell grows by the yielding of the cell wall to turgor pressure can be expressed in the equation: rate = cell extensibility x turgor. All growth rate responses can in principle be resolved into changes in the 2 latter variables. Extensibility will relate primarily to the yielding properties of the cell wall, turgor primarily to solute uptake or production. Use of this simple relationship in vivo requires that at least 2 of the 3 variables be measured in a growing cell. Extensibility is not amenable to direct measurement. Data on rate and turgor for single Nitella cells can, however, be continuously gathered to permit calculation of extensibility (rate/turgor). Rate is accurately obtained from measurements on time-lapse film. Turgor is estimated in the same cell, to within 0.1 atm or less, by measurement of the ability of the cell to compress gas trapped in the closed end of a capillary whose open end is in the cell vacuole. The method is independent of osmotic equilibrium. It operates continuously for several days, over a severalfold increase in cell length, and has a response time of less than one minute. Rapid changes in turgor, brought on by changes in tonicity of the medium, show that extensibility, as defined above, is not constant but has a value of zero unless the cell has about 80% of normal turgor. Because elastic changes are small, extensibility relates to growth. Over long periods of treatment in a variety of osmotica the threshold value for extensibility and growth is seen to fall to lower values to permit resumption of growth at reduced turgor. A brief period of rapid growth (5x normal) follows the return to normal turgor. All variables then become normal and the cycle can be repeated. The cell remains essentially at osmotic equilibrium, even while growing at 5x the normal rate. The method has potential for detailed in vivo analyses of "wall softening."
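
    A one-line worked illustration of the relation used above; the numbers are hypothetical, chosen only to show the arithmetic, and are not Green's measurements.

```latex
% rate = extensibility x turgor  =>  extensibility = rate / turgor
\text{extensibility} = \frac{\text{rate}}{\text{turgor}}
  = \frac{0.02\ \mathrm{h^{-1}}}{5\ \mathrm{atm}}
  = 4\times10^{-3}\ \mathrm{h^{-1}\,atm^{-1}}.
```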

  6. A comparison of four streamflow record extension techniques.

    USGS Publications Warehouse

    Hirsch, R.M.

    1982-01-01

    One approach to developing time series of streamflow, which may be used for simulation and optimization studies of water resources development activities, is to extend an existing gage record in time by exploiting the interstation correlation between the station of interest and some nearby (long-term) base station. Four methods of extension are described, and their properties are explored. -from Author
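
    The simplest member of the family of estimators compared here is ordinary least-squares regression of the short record on the concurrent base-station record. A minimal sketch of that flavor of record extension, with synthetic data and illustrative variable names (not a reproduction of the report's methods), is:

```python
# Minimal sketch of streamflow record extension by least-squares regression on a
# long-record base station. The synthetic flows below are illustrative only.
import numpy as np

def ols_extend(short_q, base_q_concurrent, base_q_missing):
    """Fit the short-record flows on concurrent base-station flows (log space),
    then estimate flows for the period when only the base station operated."""
    slope, intercept = np.polyfit(np.log(base_q_concurrent), np.log(short_q), 1)
    return np.exp(intercept + slope * np.log(base_q_missing))

rng = np.random.default_rng(0)
base = np.exp(rng.normal(3.0, 0.8, size=60))                 # long-record station
short = np.exp(0.5 + 0.9 * np.log(base[:40])
               + rng.normal(0.0, 0.2, size=40))              # 40 concurrent years
extended = ols_extend(short, base[:40], base[40:])           # fill the other 20
print(np.round(extended[:5], 1))
```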

  7. Percutaneous nephrostomy with extensions of the technique: step by step.

    PubMed

    Dyer, Raymond B; Regan, John D; Kavanagh, Peter V; Khatod, Elaine G; Chen, Michael Y; Zagoria, Ronald J

    2002-01-01

    Minimally invasive therapy in the urinary tract begins with renal access by means of percutaneous nephrostomy. Indications for percutaneous nephrostomy include urinary diversion, treatment of nephrolithiasis and complex urinary tract infections, ureteral intervention, and nephroscopy and ureteroscopy. Bleeding complications can be minimized by entering the kidney in a relatively avascular zone created by branching of the renal artery. The specific site of renal entry is dictated by the indication for access with consideration of the anatomic constraints. Successful percutaneous nephrostomy requires visualization of the collecting system for selection of an appropriate entry site. The definitive entry site is then selected; ideally, the entry site should be subcostal and lateral to the paraspinous musculature. Small-bore nephrostomy tracks can be created over a guide wire coiled in the renal pelvis. A large-diameter track may be necessary for percutaneous stone therapy, nephroscopy, or antegrade ureteroscopy. The most common extension of percutaneous nephrostomy is placement of a ureteral stent for treatment of obstruction. Transient hematuria occurs in virtually every patient after percutaneous nephrostomy, but severe bleeding that requires transfusion or intervention is uncommon. In patients with an obstructed urinary tract complicated by infection, extensive manipulations pose a risk of septic complications. PMID:12006684

  8. Flexible use and technique extension of logistics management

    NASA Astrophysics Data System (ADS)

    Xiong, Furong

    2011-10-01

    As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and is now expanding in China; this is the generally recognized historical track of modern logistics development. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port, China's modern logistics industry will develop by leaps and bounds at a strong pace and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores flexible uses and extensions of China's modern logistics management techniques, which has certain practical and guiding significance.

  9. Scan-Based Implementation of JPEG 2000 Extensions

    NASA Technical Reports Server (NTRS)

    Rountree, Janet C.; Webb, Brian N.; Flohr, Thomas J.; Marcellin, Michael W.

    2001-01-01

    JPEG 2000 Part 2 (Extensions) contains a number of technologies that are of potential interest in remote sensing applications. These include arbitrary wavelet transforms, techniques to limit boundary artifacts in tiles, multiple component transforms, and trellis-coded quantization (TCQ). We are investigating the addition of these features to the low-memory (scan-based) implementation of JPEG 2000 Part 1. A scan-based implementation of TCQ has been realized and tested, with a very small performance loss as compared with the full image (frame-based) version. A proposed amendment to JPEG 2000 Part 2 will effect the syntax changes required to make scan-based TCQ compatible with the standard.

  10. Scan-based implementation of JPEG 2000 extensions

    NASA Astrophysics Data System (ADS)

    Rountree, Janet C.; Webb, Brian N.; Flohr, Thomas J.; Marcellin, Michael W.

    2001-12-01

    JPEG 2000 Part 2 (extensions) contains a number of technologies that are of potential interest in remote sensing applications. These include arbitrary wavelet transforms, techniques to limit boundary artifacts in tiles, multiple component transforms, and trellis-coded quantization (TCQ). We are investigating the addition of these features to the low-memory (scan-based) implementation of JPEG 2000 Part 1. A scan-based implementation of TCQ has been realized and tested, with a very small performance loss as compared with the full image (frame-based) version. A proposed amendment to JPEG 2000 Part 2 will effect the syntax changes required to make scan-based TCQ compatible with the standard.

  11. Biomechanical analysis of press-extension technique on degenerative lumbar with disc herniation and staggered facet joint.

    PubMed

    Du, Hong-Gen; Liao, Sheng-Hui; Jiang, Zhong; Huang, Huan-Ming; Ning, Xi-Tao; Jiang, Neng-Yi; Pei, Jian-Wei; Huang, Qin; Wei, Hui

    2016-05-01

    This study investigates the effect of a new Chinese massage technique named "press-extension" on a degenerative lumbar spine with disc herniation and facet joint dislocation, and provides a biomechanical explanation of this massage technique. Self-developed biomechanical software was used to establish a normal L1-S1 lumbar 3D FE model, which integrated the anatomical structure based on spine CT and MRI data. Graphic techniques were then utilized to build a degenerative lumbar FE model with disc herniation and facet joint dislocation. Mechanical parameters were collected from actual press-extension experiments to set the boundary conditions for the FE analysis. The results demonstrated that the press-extension technique has a marked effect on the annulus fibrosus, pushing the central nucleus pulposus forward and increasing the pressure in the anterior part of the disc. The study concludes that finite element modelling of the lumbar spine is suitable for analysing the impact of the press-extension technique on lumbar intervertebral disc biomechanics and provides a basis for understanding the mechanism by which the technique acts on intervertebral disc herniation. PMID:27275119

  12. Extension and Home-Based Businesses.

    ERIC Educational Resources Information Center

    Loker, Suzanne; And Others

    1990-01-01

    Includes "Building Home Businesses in Rural Communities" (Loker et al.); "Home-Based Business...A Means to Economic Growth in Rural Areas" (Bastow-Shoop et al.); "Business Not As Usual" (Millar, Mallilo); and "Economic Options for Farm Families" (Williams). (SK)

  13. Space-based observation of the extensive airshowers

    NASA Astrophysics Data System (ADS)

    Ebisuzaki, T.

    2013-06-01

    Space based observations of extensive air showers constitute the next experimental challenge for the study of the universe at extreme energy. Space observation will allow a "quantum jump" in the observational area available to detect the UV light tracks produced by particles with energies higher than 10^20 eV. These are thought to reach the Earth almost undeflected by the cosmic magnetic field. This new technique will contribute to establishing the new field of astronomy and astrophysics performed with charged particles and neutrinos at the highest energies. This idea was created by the incredible efforts of three outstanding cosmic ray physicists: John Linsley, Livio Scarsi, and Yoshiyuki Takahashi. This challenging technique has four significant merits in comparison with ground-based observations: 1) a very large observational area, 2) well constrained distances of the showers, 3) clear and stable atmospheric transmission in the upper half of the troposphere, 4) uniform exposure across both the northern and southern skies. Four proposed and planned missions constitute the roadmap of the community: TUS, JEM-EUSO, KLPVE, and Super-EUSO will contribute step-by-step to establishing this challenging field of research.

  14. Extensible User-Based XML Grammar Matching

    NASA Astrophysics Data System (ADS)

    Tekli, Joe; Chbeir, Richard; Yetongnon, Kokou

    XML grammar matching has found considerable interest recently due to the growing number of heterogeneous XML documents on the web and the increasing need to integrate, and consequently search and retrieve, XML data originating from different data sources. In this paper, we provide an approach for automatic XML grammar matching and comparison aiming to minimize the amount of user effort required to perform the match task. We propose an open framework based on the concept of tree edit distance, integrating different matching criteria so as to capture XML grammar element semantic and syntactic similarities, cardinality and alternativeness constraints, as well as data-type correspondences and relative ordering. It is flexible, enabling the user to choose the mapping cardinality (1:1, 1:n, n:1, n:n), in comparison with existing static methods (constrained to 1:1), and considers user feedback to adjust matching results to the user's perception of correct matches. Conducted experiments demonstrate the efficiency of our approach in comparison with alternative methods.
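
    The framework above builds on tree edit distance between XML grammar trees. As a point of reference, a minimal unit-cost ordered-tree edit distance (a plain memoised recursion, not the authors' multi-criteria matcher) can be sketched as follows; the element names are invented for the example.

```python
# Minimal unit-cost edit distance between ordered labeled trees (insert, delete,
# rename all cost 1). Trees are (label, (children...)) tuples. This is a plain
# memoised recursion for illustration, not an optimised Zhang-Shasha variant.
from functools import lru_cache

def tree(label, *children):
    return (label, tuple(children))

def size(forest):
    return sum(1 + size(kids) for _, kids in forest)

@lru_cache(maxsize=None)
def forest_dist(f, g):
    if not f and not g:
        return 0
    if not f:
        return size(g)          # insert every node of g
    if not g:
        return size(f)          # delete every node of f
    (vlab, vkids), frest = f[-1], f[:-1]
    (wlab, wkids), grest = g[-1], g[:-1]
    delete = forest_dist(frest + vkids, g) + 1
    insert = forest_dist(f, grest + wkids) + 1
    match = (forest_dist(vkids, wkids) + forest_dist(frest, grest)
             + (0 if vlab == wlab else 1))
    return min(delete, insert, match)

def tree_edit_distance(t1, t2):
    return forest_dist((t1,), (t2,))

a = tree("book", tree("title"), tree("author"), tree("year"))
b = tree("book", tree("title"), tree("editor"))
print(tree_edit_distance(a, b))   # 2: rename author->editor, delete year
```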

  15. Research on Customer Value Based on Extension Data Mining

    NASA Astrophysics Data System (ADS)

    Chun-Yan, Yang; Wei-Hua, Li

    Extenics is a new discipline for dealing with contradiction problems using formalized models. Extension data mining (EDM) is a product of combining Extenics with data mining. It explores ways to acquire knowledge based on extension transformations, called extension knowledge (EK), taking advantage of extension methods and data mining technology. EK includes extensible classification knowledge, conductive knowledge and so on. Extension data mining technology (EDMT) is a new data mining technology that mines EK in databases or data warehouses. Customer value (CV) weighs the importance of a customer relationship for an enterprise, with the enterprise as the subject that assesses value and the customers as the objects being valued. CV varies continually. Mining the changing knowledge of CV in databases using EDMT, including quantitative change knowledge and qualitative change knowledge, can provide a foundation for an enterprise to decide its customer relationship management (CRM) strategy. It can also provide a new idea for studying CV.

  16. Technique for anisotropic extension of organic crystals: Application to temperature dependence of electrical resistance

    NASA Astrophysics Data System (ADS)

    Yamamoto, Takashi; Kato, Reizo; Yamamoto, Hiroshi M.; Fukaya, Atsuko; Yamasawa, Kenji; Takahashi, Ichiro; Akutsu, Hiroki; Akutsu-Sato, Akane; Day, Peter

    2007-08-01

    We have developed a technique for the anisotropic extension of fragile molecular crystals. The pressure medium and the instrument, which extends the pressure medium, are both made from epoxy resin. Since the thermal contraction of our instrument is identical to that of the pressure medium, the strain applied to the pressure medium has no temperature dependence down to 2K. Therefore, the degree of extension applied to the single crystal at low temperatures is uniquely determined from the degree of extension in the pressure medium and thermal contractions of the epoxy resin and the single crystal at ambient pressure. Using this novel instrument, we have measured the temperature dependence of the electrical resistance of metallic, superconducting, and insulating materials. The experimental results are discussed from the viewpoint of the extension (compression) of the lattice constants along the parallel (perpendicular) direction.

  17. Extension and Home-Based Business: A Collaborative Approach.

    ERIC Educational Resources Information Center

    Burns, Marilyn; Biers, Karen

    1991-01-01

    The Center for Home-Based Entrepreneurship at Oklahoma State University developed from collaborative efforts of extension, government agencies, business associations, and the vo-tech system. It provides education, directories, information services, and other assistance to people interested in establishing businesses in their homes. (SK)

  18. Anterior Ridge Extension Using Modified Kazanjian Technique in Mandible- A Clinical Study

    PubMed Central

    Kumar, Jagannadham Vijay; Chakravarthi, Pandi Srinivas; Sridhar, Meka; Devi, Kolli Naga Neelima; Lingamaneni, Krishna Prasad

    2016-01-01

    Introduction A good alveolar ridge is a prerequisite for a successful conventional or implant-supported partial or complete denture. Extensively resorbed ridges with a shallow vestibule and high insertion of muscles into the ridge crest lead to failure of the prosthesis. Success of the prosthesis depends on surgical repositioning of mucosa and muscle insertions, which increases the depth of the vestibule and the denture flange area for retention. The study was therefore planned to provide good attached gingiva with adequate vestibular depth using Modified Kazanjian Vestibuloplasty (MKV). Aim To evaluate the efficacy of the MKV technique for increasing vestibular depth in the anterior mandible so that a successful prosthesis can be delivered. Efficacy of the technique was evaluated through the operating time required, the vestibular depth achieved, scarring or relapse, and any postoperative complications associated with healing. Materials and Methods A total of 10 patients who had a minimum of 20 mm of bone height and less than 5 mm of vestibular depth were included in the study for the MKV procedure. The results were tabulated and statistical analysis was carried out to assess the vestibular depth achieved, i.e., from the crest of the ridge to the junction of attached mucosa, both pre- and postoperatively. The study results were compared with the existing literature. Results Healing of the raw surface was uneventful, with satisfactory achievement of vestibular depth. The average gain in vestibular depth was 11 mm. The patients had a good satisfaction index for the prosthesis. Conclusion Even in the era of implant prosthesis, the Modified Kazanjian technique is worth practising to achieve good results, and overcorrection is not required as in the standard Kazanjian technique. It provides adequate attached gingiva for a successful prosthesis. Extension of vestibular depth enables fabrication of a better denture flange with improved oral hygiene. This technique does not require hospitalization or additional surgery for grafts. PMID:27042579

  19. An Extension Dynamic Model Based on BDI Agent

    NASA Astrophysics Data System (ADS)

    Yu, Wang; Feng, Zhu; Hua, Geng; WangJing, Zhu

    This paper's research is based on the BDI Agent model. It first analyses the deficiencies of the traditional BDI Agent model and then proposes an extension dynamic model of BDI Agent based on the traditional one. The new model can quickly achieve the internal interaction of the traditional BDI Agent model, deal with complex issues in dynamic and open environments, and react quickly. The extension dynamic model is shown to be natural and reasonable by using it to verify the origin of civilization with a model of monkeys learning to eat sweet potatoes. It is verified to be feasible by comparing the extension dynamic BDI Agent model with the traditional BDI Agent model using SWARM, and it has important theoretical significance.

  20. The Search for Extension: 7 Steps to Help People Find Research-Based Information on the Internet

    ERIC Educational Resources Information Center

    Hill, Paul; Rader, Heidi B.; Hino, Jeff

    2012-01-01

    For Extension's unbiased, research-based content to be found by people searching the Internet, it needs to be organized in a way conducive to the ranking criteria of a search engine. With proper web design and search engine optimization techniques, Extension's content can be found, recognized, and properly indexed by search engines and…

  1. A rare case of oncocytic Schneiderian papilloma with intradural and intraorbital extension with notes of operative techniques.

    PubMed

    Bignami, Maurizio; Pistochini, Andrea; Meloni, Francesco; Delehaye, Emilio; Castelnuovo, Paolo

    2009-09-01

    The epithelial cells of cylindrical cell papilloma are oncocytes, which arise from the sinonasal respiratory epithelium, hence the term Oncocytic Schneiderian papilloma. This is a rare and benign neoplasm of the nose and paranasal sinuses and it should be considered in the work-up of all unilateral nasal polypoid lesions. Its clinical behaviour is comparable to that of inverted papilloma with respect to local recurrence and coexisting malignancy. We report a case arising from the nasoethmoidal space that extended to the anterior skull base through a bone dehiscence, with intradural invasion and orbital space involvement. Surgical therapy is the treatment of choice; the endonasal endoscopic approach can be used in most cases, and this surgical technique is safe and suitable even in the presence of extranasal extension. We describe our experience in the management of this kind of lesion and give some notes on our operative technique.

  2. Project Milestone. Analysis of Range Extension Techniques for Battery Electric Vehicles

    SciTech Connect

    Neubauer, Jeremy; Wood, Eric; Pesaran, Ahmad

    2013-07-01

    This report documents completion of the July 2013 milestone as part of NREL’s Vehicle Technologies Annual Operating Plan with the U.S. Department of Energy. The objective was to perform analysis on range extension techniques for battery electric vehicles (BEVs). This work represents a significant advancement over previous thru-life BEV analyses using NREL’s Battery Ownership Model, FastSim,* and DRIVE.* Herein, the ability of different charging infrastructure to increase achievable travel of BEVs in response to real-world, year-long travel histories is assessed. Effects of battery and cabin thermal response to local climate, battery degradation, and vehicle auxiliary loads are captured. The results reveal the conditions under which different public infrastructure options are most effective, and encourage continued study of fast charging and electric roadway scenarios.

  3. Extension of the broadband single-mode integrated optical waveguide technique to the ultraviolet spectral region and its applications.

    PubMed

    Wiederkehr, Rodrigo S; Mendes, Sergio B

    2014-03-21

    We report here the fabrication, characterization, and application of a single-mode integrated optical waveguide (IOW) spectrometer capable of acquiring optical absorbance spectra of surface-immobilized molecules in the visible and ultraviolet spectral region down to 315 nm. The UV-extension of the single-mode IOW technique to shorter wavelengths was made possible by our development of a low-loss single-mode dielectric waveguide in the UV region based on an alumina film grown by atomic layer deposition (ALD) over a high quality fused silica substrate, and by our design/fabrication of a broadband waveguide coupler formed by an integrated diffraction grating combined with a highly anamorphic optical beam of large numerical aperture. As an application of the developed technology, we report here the surface adsorption process of bacteriochlorophyll a on different interfaces using its Soret absorption band centred at 370 nm. The effects of different chemical compositions at the solid-liquid interface on the adsorption and spectral properties of bacteriochlorophyll a were determined from the polarized UV-Vis IOW spectra acquired with the developed instrumentation. The spectral extension of the single-mode IOW technique into the ultraviolet region is an important advance as it enables extremely sensitive studies in key characteristics of surface molecular processes (e.g., protein unfolding and solvation of aromatic amino-acid groups under surface binding) whose spectral features are mainly located at wavelengths below the visible spectrum.

  4. A graph-based approach for designing extensible pipelines

    PubMed Central

    2012-01-01

    Background In bioinformatics, it is important to build extensible and low-maintenance systems that are able to deal with the new tools and data formats that are constantly being developed. The traditional and simplest implementation of pipelines involves hardcoding the execution steps into programs or scripts. This approach can lead to problems when a pipeline is expanding because the incorporation of new tools is often error prone and time consuming. Current approaches to pipeline development such as workflow management systems focus on analysis tasks that are systematically repeated without significant changes in their course of execution, such as genome annotation. However, more dynamism on the pipeline composition is necessary when each execution requires a different combination of steps. Results We propose a graph-based approach to implement extensible and low-maintenance pipelines that is suitable for pipeline applications with multiple functionalities that require different combinations of steps in each execution. Here pipelines are composed automatically by compiling a specialised set of tools on demand, depending on the functionality required, instead of specifying every sequence of tools in advance. We represent the connectivity of pipeline components with a directed graph in which components are the graph edges, their inputs and outputs are the graph nodes, and the paths through the graph are pipelines. To that end, we developed special data structures and a pipeline system algorithm. We demonstrate the applicability of our approach by implementing a format conversion pipeline for the fields of population genetics and genetic epidemiology, but our approach is also helpful in other fields where the use of multiple software is necessary to perform comprehensive analyses, such as gene expression and proteomics analyses. The project code, documentation and the Java executables are available under an open source license at http
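
    The data structure described above — components as directed edges between input/output format nodes, with pipelines as paths through the graph — can be illustrated with a small sketch. Tool and format names below are invented for the example and are not taken from the paper.

```python
# Minimal sketch of the graph idea: components are directed edges between
# data-format nodes, and a pipeline is a path through the graph.
from collections import deque

# (input_format, output_format) -> component name (all names illustrative)
components = {
    ("vcf", "plink"): "vcf2plink",
    ("plink", "eigenstrat"): "plink2eigenstrat",
    ("vcf", "fasta"): "vcf2fasta",
}

def compose_pipeline(src: str, dst: str):
    """Breadth-first search for the shortest chain of components from src to dst."""
    queue, seen = deque([(src, [])]), {src}
    while queue:
        fmt, path = queue.popleft()
        if fmt == dst:
            return path
        for (i, o), tool in components.items():
            if i == fmt and o not in seen:
                seen.add(o)
                queue.append((o, path + [tool]))
    return None

print(compose_pipeline("vcf", "eigenstrat"))  # ['vcf2plink', 'plink2eigenstrat']
```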

  5. Balloon-based interferometric techniques

    NASA Technical Reports Server (NTRS)

    Rees, David

    1985-01-01

    A balloon-borne triple-etalon Fabry-Perot Interferometer, observing the Doppler shifts of absorption lines caused by molecular oxygen and water vapor in the far red/near infrared spectrum of backscattered sunlight, has been used to evaluate a passive spaceborne remote sensing technique for measuring winds in the troposphere and stratosphere. There have been two successful high altitude balloon flights of the prototype UCL instrument from the National Scientific Balloon Facility at Palestine, TE (May 80, Oct. 83). The results from these flights have demonstrated that an interferometer with adequate resolution, stability and sensitivity can be built. The wind data are of comparable quality to those obtained from operational techniques (balloon and rocket sonde, cloud-top drift analysis, and from the gradient wind analysis of satellite radiance measurements). However, the interferometric data can provide a regular global grid, over a height range from 5 to 50 km in regions of clear air. Between the middle troposphere (5 km) and the upper stratosphere (40 to 50 km), an optimized instrument can make wind measurements over the daylit hemisphere with an accuracy of about 3 to 5 m/sec (2 sigma). It is possible to obtain full height profiles between altitudes of 5 and 50 km, with 4 km height resolution, and a spatial resolution of about 200 km, along the orbit track. Below an altitude of about 10 km, Fraunhofer lines of solar origin are possible targets of the Doppler wind analysis. Above an altitude of 50 km, the weakness of the backscattered solar spectrum (decreasing air density) is coupled with the low absorption crosssection of all atmospheric species in the spectral region up to 800 nm (where imaging photon detectors can be used), causing the along-the-track resolution (or error) to increase beyond values useful for operational purposes. Within the region of optimum performance (5 to 50 km), however, the technique is a valuable potential complement to existing wind

  6. Extensibility in Model-Based Business Process Engines

    NASA Astrophysics Data System (ADS)

    Sánchez, Mario; Jiménez, Camilo; Villalobos, Jorge; Deridder, Dirk

    An organization’s ability to embrace change, greatly depends on systems that support their operation. Specifically, process engines might facilitate or hinder changes, depending on their flexibility, their extensibility and the changes required: current workflow engine characteristics create difficulties in organizations that need to incorporate some types of modifications. In this paper we present Cumbia, an extensible MDE platform to support the development of flexible and extensible process engines. In a Cumbia process, models represent participating concerns (control, resources, etc.), which are described with concern-specific languages. Cumbia models are executed in a coordinated way, using extensible engines specialized for each concern.

  7. Extension Clientele Preferences: Accessing Research-Based Information Online

    ERIC Educational Resources Information Center

    Davis, Jamie M.

    2014-01-01

    Research has indicated there are a number of benefits to Extension educators in delivering educational program and content through distance technology methods. However, Extension educators are commonly apprehensive about this transition due to assumptions made about their clientele, because little research has been conducted to examine…

  8. Comparison of Three Techniques to Monitor Bathymetric Evolution in a Spatially Extensive, Rapidly Changing Environment

    NASA Astrophysics Data System (ADS)

    Rutten, J.; Ruessink, G.

    2014-12-01

    The wide variety in spatial and temporal scales inherent to nearshore morphodynamics, together with site-specific environmental characteristics, complicate our current understanding and predictive capability of large (~ km)-scale, long-term (seasons-years) sediment transport patterns and morphologic evolution. The monitoring of this evolution at all relevant scales demands a smart combination of multiple techniques. Here, we compare depth estimates derived from operational optical (Argus video) and microwave (X-band radar) remote sensing with those from jet-ski echo-sounding in an approximately 2.5 km2 large region at the Sand Engine, a 20 Mm3 mega-nourishment at the Dutch coast. Using depth inversion techniques based on linear wave theory, frequent (hourly-daily) bathymetric maps were derived from instantaneous Argus video and X-band radar imagery. Jet-ski surveys were available every 2 to 3 months. Depth inversion on Argus imagery overestimates surveyed depths by up to 0.5 m in shallow water (< 2 m), but underestimates larger water depths (> 5m) by up to 1 m. Averaged over the entire subtidal study area, the errors canceled in volumetric budget computations. Additionally, estimates of shoreline and subtidal sandbar positions were derived from Argus imagery and jet-ski surveys. Sandbar crest positions extracted from daily low-tide time-exposure Argus images reveal a persistent onshore offset of some 20 m, but do show the smaller temporal variability not visible from jet-ski surveys. Potential improvements to the applied depth-inversion technique will be discussed.
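
    Both remote-sensing estimates above rest on depth inversion via linear wave theory. A minimal sketch of that inversion, with illustrative wave parameters rather than values from the study, is:

```python
# Minimal sketch of depth inversion from linear wave theory, as used (in far
# more sophisticated form) by video/radar algorithms: given a wave frequency
# and wavenumber estimated from imagery, invert the dispersion relation
#   omega^2 = g * k * tanh(k * h)   for the local depth h.
import math

G = 9.81  # gravitational acceleration, m/s^2

def invert_depth(omega: float, k: float) -> float:
    """Depth h from the linear dispersion relation; requires omega**2 < G*k."""
    ratio = omega**2 / (G * k)
    if not 0 < ratio < 1:
        raise ValueError("no finite depth satisfies the dispersion relation")
    return math.atanh(ratio) / k

# Illustrative: a 7 s swell with an observed wavelength of 55 m.
omega = 2 * math.pi / 7.0     # rad/s
k = 2 * math.pi / 55.0        # rad/m
print(f"estimated depth: {invert_depth(omega, k):.1f} m")   # ~7.9 m
```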

  9. Technique for Extension of Small Antenna Array Mutual-Coupling Data to Larger Antenna Arrays

    NASA Technical Reports Server (NTRS)

    Bailey, M. C.

    1996-01-01

    A technique is presented whereby the mutual interaction between a small number of elements in a planar array can be interpolated and extrapolated to accurately predict the combined interactions in a much larger array of many elements. An approximate series expression is developed, based upon knowledge of the analytical characteristic behavior of the mutual admittance between small aperture antenna elements in a conducting ground plane. This expression is utilized to analytically extend known values for a few spacings and orientations to other element configurations, thus eliminating the need to numerically integrate a large number of highly oscillating and slowly converging functions. This paper shows that the technique can predict very accurately the mutual coupling between elements in a very large planar array with a knowledge of the self-admittance of an isolated element and the coupling between only two-elements arranged in eight different pair combinations. These eight pair combinations do not necessarily have to correspond to pairs in the large array, although all of the individual elements must be identical.

  10. Two dimensional restoration of seismic reflection profiles from Mozambique: technique for assessing rift extension histories

    SciTech Connect

    Iliffe, J.E.; Debuyl, M.; Kendall, C.G.St.C.; Lerche, I.

    1986-05-01

    Seismic reflection data from offshore Mozambique between longitudes 25° and 26° and latitudes 34° and 35° reveals a V-shaped rift, the apex of which points northward, toward the coast. This study retraces the rift's extensional history by geometric reconstruction of seismic profiles, selected perpendicular to tectonic strike. Depth conversions are performed, followed by bed length and volume balancing to test the interpretations and calculate a total extension value for the extension factor. The sediments are then backstripped in sedimentary sequences, restoring the increments of throw on faults accordingly. After each sequence is removed, the sediments are decompacted in an attempt to recover the original volume prior to the sequence deposition. The extension factor is again calculated. This process is repeated down the sequences until the result is the pre-rift state of the basin. This analysis results in an extension estimate for each sequence-time increment, as a percentage of the total extension. From this method, a detailed crustal extension history is deduced, which, when coupled to the thermal history from subsidence backstripping and paleoheatflow studies, could be used in the basin analysis assessment of the oil potential of this and other rifts.
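
    The extension factor referred to above is conventionally a stretch ratio of present bed length to restored (pre-rift) bed length; one common bookkeeping convention, not necessarily the authors' exact accounting, is:

```latex
% L_0: restored pre-rift bed length; L_i: bed length after sequence i; L_n: present length.
\beta_i = \frac{L_i}{L_{i-1}}, \qquad
\beta_{\mathrm{total}} = \frac{L_n}{L_0} = \prod_{i=1}^{n}\beta_i, \qquad
\text{share of total extension for sequence } i = \frac{L_i - L_{i-1}}{L_n - L_0}.
```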

  11. Combined surgical and catheter-based treatment of extensive thoracic aortic aneurysm and aortic valve stenosis.

    PubMed

    De Backer, Ole; Lönn, Lars; Søndergaard, Lars

    2015-02-15

    An extensive thoracic aortic aneurysm (TAA) is a potentially life-threatening condition and remains a technical challenge to surgeons. Over the past decade, repair of aortic arch aneurysms has been accomplished using both hybrid (open and endovascular) and totally endovascular techniques. Thoracic endovascular aneurysm repair (TEVAR) has changed and extended management options in thoracic aorta disease, including in those patients deemed unfit or unsuitable for open surgery. Accordingly, transcatheter aortic valve replacement (TAVR) is increasingly used to treat patients with symptomatic severe aortic valve stenosis (AS) who are considered at high risk for surgical aortic valve replacement. In this report, we describe the combined surgical and catheter-based treatment of an extensive TAA and AS. To our knowledge, this is the first report of hybrid TAA repair combined with TAVR.

  12. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  13. Reliability of Goniometric and Trigonometric Techniques for Measuring Hip-Extension Range of Motion Using the Modified Thomas Test

    PubMed Central

    Wakefield, C. Brent; Halls, Amanda; Difilippo, Nicole; Cottrell, G. Trevor

    2015-01-01

    Context: Goniometric assessment of hip-extension range of motion is a standard practice in clinical rehabilitation settings. A weakness of goniometric measures is that small errors in landmarking may result in substantial measurement error. A less commonly used protocol for measuring hip range of motion involves applying trigonometric principles to the length and vertical displacement of the upper part of the lower extremity to determine hip angle; however, the reliability of this measure has never been assessed using the modified Thomas test. Objective: To compare the intrarater and interrater reliability of goniometric (GON) and trigonometric (TRIG) techniques for assessing hip-extension range of motion during the modified Thomas test. Design: Controlled laboratory study. Setting: Institutional athletic therapy facility. Patients or Other Participants: A total of 22 individuals (12 men, 10 women; age range, 18–36 years) with no pathologic knee or back conditions. Main Outcome Measure(s): Hip-extension range of motion of each participant during a modified Thomas test was assessed by 2 examiners with both GON and TRIG techniques in a randomly selected order on 2 separate days. Results: The intraclass correlation coefficient (ICC) revealed that the reliability of the GON technique was low for both the intrarater (ICC = 0.51, 0.54) and interrater (ICC = 0.30, 0.65) comparisons, but the reliability of the TRIG technique was high for both intrarater (ICC = 0.90, 0.95) and interrater (ICC = 0.91, 0.94) comparisons. Single-factorial repeated-measures analyses of variance revealed no mean differences in scoring within or between examiners for either measurement protocol, whereas a difference was observed when comparing the TRIG and GON tests due to the differences in procedures used to identify landmarks. Conclusions: Using the TRIG technique to measure hip-extension range of motion during the modified Thomas test results in superior intrarater and interrater
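
    As a hedged illustration of the trigonometric principle mentioned above (the actual TRIG protocol and reference frame may differ): if the thigh segment has measured length L and its distal end lies a vertical distance d below the hip joint axis, the segment's angle from the horizontal is

```latex
\theta = \arcsin\!\left(\frac{d}{L}\right), \qquad
\text{e.g. } L = 40\ \text{cm},\ d = 7\ \text{cm} \;\Rightarrow\; \theta \approx 10.1^{\circ}.
```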

  14. Tandem Rhomboid Flap Repair: A New Technique in Treatment of Extensive Pilonidal Disease of the Natal Cleft

    PubMed Central

    Kumar M, Kamal; Babu K, Ramesh; Dhanraj, Prema

    2014-01-01

    Pilonidal sinus is an annoying chronic benign disease causing disability in young adults, mainly affecting the intergluteal furrow. Treatment of this condition remains controversial and is represented by a myriad of techniques available. Most of the techniques are judged against open excision and secondary healing in terms of minimizing disease recurrence and patient discomfort. More recently superiority of flap reconstruction to non-flap techniques is accepted. An ideal operation should be simple, associated with minimal pain and wound care after surgery, minimize hospital stay and have a low recurrence rate. We hereby present a new type of rhomboid flap technique for an extensive pilonidal sinus disease. This technique has given good results in our hands considering the aforementioned factors of an ideal operation. The following case report is of our first stint with the procedure. PMID:25386481

  15. Group-Based Learning in an Authoritarian Setting? Novel Extension Approaches in Vietnam's Northern Uplands

    ERIC Educational Resources Information Center

    Schad, Iven; Roessler, Regina; Neef, Andreas; Zarate, Anne Valle; Hoffmann, Volker

    2011-01-01

    This study aims to analyze the potential and constraints of group-based extension approaches as an institutional innovation in the Vietnamese agricultural extension system. Our analysis therefore unfolds around the challenges of how to foster this kind of approach within the hierarchical extension policy setting and how to effectively shape and…

  16. Low-cost computer classification of land cover in the Portland area, Oregon, by signature extension techniques

    USGS Publications Warehouse

    Gaydos, Leonard

    1978-01-01

    The cost of classifying 5,607 square kilometers (2,165 sq. mi.) in the Portland area was less than 8 cents per square kilometer ($0.0788, or $0.2041 per square mile). Besides saving in costs, this and other signature extension techniques may be useful in completing land use and land cover mapping in other large areas where multispectral and multitemporal Landsat data are available in digital form but other source materials are generally lacking.

  17. Optical connection management in ASON based on LDP extensions

    NASA Astrophysics Data System (ADS)

    Qi, Tianlong; Zheng, Xiaoping; Zhang, Hanyi; Guo, Yili

    2002-09-01

    We make an extension to the label distribution protocol in order to realize optical connection management in an automatically switched optical network. Its characteristics, such as dynamic lightpath creation and deletion, and fast restoration, are investigated and tested experimentally. How to handle failures that occur during these two processes is also explained in detail. The experimental results are discussed and compared with others' previous work. The experimental setup, which consists of four optical nodes in a mesh-type network, is also described in detail in the paper.

  18. Extension of the Viscous Collision Limiting Direct Simulation Monte Carlo Technique to Multiple Species

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Burt, Jonathan M.

    2016-01-01

    There are many flow fields that span a wide range of length scales where regions of both rarefied and continuum flow exist and neither direct simulation Monte Carlo (DSMC) nor computational fluid dynamics (CFD) provides the appropriate solution everywhere. Recently, a new viscous collision limited (VCL) DSMC technique was proposed to incorporate effects of physical diffusion into collision limiter calculations to make the low Knudsen number regime normally limited to CFD more tractable for an all-particle technique. This original work had been derived for a single-species gas. The current work extends the VCL-DSMC technique to gases with multiple species. Similar derivations were performed to equate numerical and physical transport coefficients. However, a more rigorous treatment of determining the mixture viscosity is applied. In the original work, consideration was given to internal energy non-equilibrium, and this is also extended in the current work to chemical non-equilibrium.
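
    The abstract does not state which mixture-viscosity treatment is used; purely for orientation, one widely used semi-empirical mixing rule (Wilke's), for mole fractions x_i, species viscosities mu_i and molar masses M_i, is:

```latex
% Wilke's mixing rule (shown for context only; not necessarily the "more
% rigorous treatment" adopted in the multi-species VCL-DSMC extension).
\mu_{\mathrm{mix}} = \sum_i \frac{x_i\,\mu_i}{\sum_j x_j\,\phi_{ij}}, \qquad
\phi_{ij} = \frac{\bigl[1 + (\mu_i/\mu_j)^{1/2}(M_j/M_i)^{1/4}\bigr]^{2}}{\sqrt{8\,(1 + M_i/M_j)}}.
```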

  19. Some Novel Solidification Processing Techniques Being Investigated at MSFC: Their Extension for Study Aboard the ISS

    NASA Technical Reports Server (NTRS)

    Grugel, R. N.; Anilkumar, A. V.; Fedoseyev, A. I.; Mazuruk, K.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    The float-zone and the Bridgman techniques are two classical directional solidification processing methods that are used to improve materials properties. Unfortunately, buoyancy effects and gravity-driven convection due to unstable temperature and/or composition gradients still produce solidified products that exhibit segregation and, consequently, degraded properties. This presentation will briefly introduce how some novel processing applications can minimize detrimental gravitational effects and enhance microstructural uniformity. Discussion follows that to fully understand and model these procedures requires utilizing, in conjunction with a novel mixing technique, the facilities and quiescent microgravity environment available on the ISS.

  20. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    SciTech Connect

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based, accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in the directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.

  1. Development of CDMS-II Surface Event Rejection Techniques and Their Extensions to Lower Energy Thresholds

    NASA Astrophysics Data System (ADS)

    Hofer, Thomas James

    2014-10-01

    The CDMS-II phase of the Cryogenic Dark Matter Search, a dark matter direct-detection experiment, was operated at the Soudan Underground Laboratory from 2003 to 2008. The full payload consisted of 30 ZIP detectors, totaling approximately 1.1 kg of Si and 4.8 kg of Ge, operated at temperatures of 50 mK. The ZIP detectors read out both ionization and phonon pulses from scatters within the crystals; channel segmentation and analysis of pulse timing parameters allowed effective fiducialization of the crystal volumes and background rejection sufficient to set world-leading limits at the times of their publications. A full re-analysis of the CDMS-II data was motivated by an improvement in the event reconstruction algorithms which improved the resolution of ionization energy and timing information. The Ge data were re-analyzed using three distinct background-rejection techniques; the Si data from runs 125--128 were analyzed for the first time using the most successful of the techniques from the Ge re-analysis. The results of these analyses prompted a novel "mid-threshold" analysis, wherein energy thresholds were lowered but background rejection using phonon timing information was still maintained. This technique proved to have significant discrimination power, maintaining adequate signal acceptance and minimizing background leakage. The primary background for CDMS-II analyses comes from surface events, whose poor ionization collection make them difficult to distinguish from true nuclear recoil events. The novel detector technology of SuperCDMS, the successor to CDMS-II, uses interleaved electrodes to achieve full ionization collection for events occurring at the top and bottom detector surfaces. This, along with dual-sided ionization and phonon instrumentation, allows for excellent fiducialization and relegates the surface-event rejection techniques of CDMS-II to a secondary level of background discrimination. Current and future SuperCDMS results hold great promise for

  2. Extension of vibrational power flow techniques to two-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1987-01-01

    In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or Finite Element Analysis (FEA) are generally used. However, an alternative method is using vibrational power flow techniques which can be especially useful in the mid- frequencies between the optimum frequency regimes for FEA and SEA. Power flow analysis has in general been used on one-dimensional beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to two-dimensional plate like structures joined along a common edge without frequency or spatial averaging the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA at low frequencies and SEA at high frequencies. The agreement with FEA results is good but the power flow technique has an improved computational efficiency. Compared to the SEA results the power flow results show a closer representation of the actual response of the structure.

  3. Extension of vibrational power flow techniques to two-dimensional structures

    NASA Technical Reports Server (NTRS)

    Cuschieri, Joseph M.

    1988-01-01

    In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or finite element analysis (FEA) are generally used. However, an alternative method is using vibrational power flow techniques which can be especially useful in the mid frequencies between the optimum frequency regimes for SEA and FEA. Power flow analysis has in general been used on 1-D beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to 2-D plate-like structures joined along a common edge without frequency or spatial averaging the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA results at low frequencies and SEA at high frequencies. The agreement with FEA results is good but the power flow technique has an improved computational efficiency. Compared to the SEA results the power flow results show a closer representation of the actual response of the structure.
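
    For reference, the basic quantity evaluated in such power flow analyses is the time-averaged vibrational power transmitted through a junction; in generic frequency-domain notation (not specific to this paper's formulation), for a harmonic force F and collocated velocity v with phase difference phi:

```latex
\langle P \rangle = \tfrac{1}{2}\,\operatorname{Re}\{F\,v^{*}\}
                  = \tfrac{1}{2}\,\lvert F\rvert\,\lvert v\rvert\cos\varphi.
```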

  4. CAD techniques applied to diesel engine design. Extension of the RK range. [Ruston diesels

    SciTech Connect

    Sinha, S.K.; Buckthorpe, D.E.

    1980-01-01

    Ruston Diesels Ltd. produce three ranges of engines: the AP range covering engine powers from 500 to 1400 bhp (350 to 1000 kW electrical), the RK range covering 1410 to 4200 bhp (1 to 3 MW electrical), and the AT range covering 1650 to 4950 bhp (1.2 to 3.5 MW electrical). The AT engine range is available at speeds up to 600 rev/min, whereas the AP and RK ranges cover engine speeds from 600 to 1000 rev/min. The design philosophy and extension of the RK range of engines are investigated. This is a 251 mm (ten inch) bore by 305 mm (twelve inch) stroke engine and is available in 6-cylinder in-line form and 8-, 12-, and 16-cylinder vee form. The RK engine features a cast-iron crankcase and bedplate design with a forged alloy-steel crankshaft. Combustion-chamber components consist of a cast-iron cylinder head and liner, steel exhaust and inlet valves, and a single-piece aluminium piston. The durability and reliability of RK engines have been fully proven in service, with over 30 years' experience in numerous applications for power generation, traction, and marine propulsion.

  5. Development of CDMS-II Surface Event Rejection Techniques and Their Extensions to Lower Energy Thresholds

    SciTech Connect

    Hofer, Thomas James

    2014-12-01

    The CDMS-II phase of the Cryogenic Dark Matter Search, a dark matter direct-detection experiment, was operated at the Soudan Underground Laboratory from 2003 to 2008. The full payload consisted of 30 ZIP detectors, totaling approximately 1.1 kg of Si and 4.8 kg of Ge, operated at temperatures of 50 mK. The ZIP detectors read out both ionization and phonon pulses from scatters within the crystals; channel segmentation and analysis of pulse timing parameters allowed effective fiducialization of the crystal volumes and background rejection sufficient to set world-leading limits at the times of their publications. A full re-analysis of the CDMS-II data was motivated by an improvement in the event reconstruction algorithms which improved the resolution of ionization energy and timing information. The Ge data were re-analyzed using three distinct background-rejection techniques; the Si data from runs 125-128 were analyzed for the first time using the most successful of the techniques from the Ge re-analysis. The results of these analyses prompted a novel "mid-threshold" analysis, wherein energy thresholds were lowered but background rejection using phonon timing information was still maintained. This technique proved to have significant discrimination power, maintaining adequate signal acceptance and minimizing background leakage. The primary background for CDMS-II analyses comes from surface events, whose poor ionization collection make them difficult to distinguish from true nuclear recoil events. The novel detector technology of SuperCDMS, the successor to CDMS-II, uses interleaved electrodes to achieve full ionization collection for events occurring at the top and bottom detector surfaces. This, along with dual-sided ionization and phonon instrumentation, allows for excellent fiducialization and relegates the surface-event rejection techniques of CDMS-II to a secondary level of background discrimination. Current and future SuperCDMS results hold great promise for mid- to low

  6. Definition of data bases, codes, and technologies for cable life extension

    SciTech Connect

    Bustard, L.D.

    1986-01-01

    The substantial number of cables inside containment for a typical nuclear facility provides a strong motivation to extend cable life rather than replace cables. Hence, it is important to understand what information is necessary to accomplish life extension. This paper defines utility-specific as well as collective industry actions that would facilitate extending cable life. The focus of these recommendations is (1) to more realistically define the environmental profiles during which cables must function, (2) to better understand the validity of accelerated aging methodology through examination of naturally aged cables, (3) to better understand the validity of accelerated aging methodology via selected experimentation, (4) to support cable aging analysis by improving nonproprietary data bases, (5) to reduce the impact of the design basis accident assumptions on cable performance so additional cable aging can be accommodated during extended life, and (6) to complement life predictions with more powerful cable condition monitoring techniques than those currently available.

  7. Hierarchic plate and shell models based on p-extension

    NASA Technical Reports Server (NTRS)

    Szabo, Barna A.; Sahrmann, Glenn J.

    1988-01-01

    Formulations of finite element models for beams, arches, plates and shells based on the principle of virtual work were studied. The focus is on computer implementation of hierarchic sequences of finite element models suitable for numerical solution of a large variety of practical problems which may concurrently contain thin and thick plates and shells, stiffeners, and regions where three dimensional representation is required. The approximate solutions corresponding to the hierarchic sequence of models converge to the exact solution of the fully three dimensional model. The stopping criterion is based on: (1) estimation of the relative error in energy norm; (2) equilibrium tests; and (3) observation of the convergence of quantities of interest.
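
    The abstract does not say how the relative error in energy norm is estimated; a relation commonly used in p-extension codes (stated here for orientation, not quoted from the paper) ties the energy-norm error to strain energies, with the exact strain energy replaced by an extrapolated estimate:

        \frac{\lVert u_{EX}-u_{FE}\rVert_E}{\lVert u_{EX}\rVert_E}
        = \sqrt{\frac{U_{EX}-U_{FE}}{U_{EX}}}
        \approx \sqrt{\frac{U_{\infty}-U_{FE}}{U_{\infty}}}

    where U_FE is the strain energy of the current finite element solution and U_infinity is estimated by extrapolation over consecutive hierarchic levels.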

  8. Fibre based integral field unit constructional techniques

    NASA Astrophysics Data System (ADS)

    Murray, Graham J.

    2006-06-01

    Presented here is a selected overview of constructional techniques and principles that have been developed and implemented at the University of Durham in the manufacture of successful fibre-based integral field units. The information contained herein is specifically intended to highlight the constructional methods that have been devised to assemble an efficient fibre bundle. Potential pitfalls that need to be considered when embarking upon such a deceptively simple instrument are also discussed.

  9. Definition of data base, code, and technologies for cable life extension

    SciTech Connect

    Bustard, L.D.

    1987-03-01

    The substantial number of cables inside containment for a typical nuclear facility provides a strong motivation to extend cable life rather than replace cables as part of an overall plant life extension strategy. Hence, it is important to understand what information is necessary to accomplish life extension. This paper defines utility-specific as well as collective-industry actions that would facilitate extending cable life. The focus of these recommendations is (1) to more realistically define the environmental profiles during which cables must function, (2) to define plant configuration and operational changes which may enhance cable life, (3) to better understand the validity of accelerated aging methodology through examination of naturally aged cables, (4) to better understand the validity of accelerated aging methodology via selected experimentation, (5) to support cable aging analysis by improving nonproprietary data bases, (6) to reduce the impact of the design basis accident assumptions on cable performance so additional cable aging can be accommodated during extended life, and (7) to complement life predictions with more effective cable condition monitoring techniques than those currently available.

  10. Laser Remote Sensing: Velocimetry Based Techniques

    NASA Astrophysics Data System (ADS)

    Molebny, Vasyl; Steinvall, Ove

    Laser-based velocity measurement is an area of the field of remote sensing where the coherent properties of laser radiation are the most exposed. Much of the published literature deals with the theory and techniques of remote sensing. We restrict our discussion to current trends in this area, gathered from recent conferences and professional journals. Remote wind sensing and vibrometry are promising in their new scientific, industrial, military, and biomedical applications, including improving flight safety, precise weapon correction, non-contact mine detection, optimization of wind farm operation, object identification based on its vibration signature, fluid flow studies, and vibrometry-associated diagnosis.
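
    As a point of reference not stated in the abstract, coherent laser velocimetry rests on the Doppler shift of the backscattered light; for a monostatic lidar a target with radial velocity v_r produces

        f_D = \frac{2\,v_r}{\lambda}

    so, for example, a 1.55 µm fibre lidar observing a 5 m/s radial wind component sees a beat frequency of roughly 6.5 MHz.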

  11. Video based lifting technique coding system.

    PubMed

    Hsiang, S M; Brogmus, G E; Martin, S E; Bezverkhny, I B

    1998-03-01

    Despite automation and improved working conditions, many materials in industry are still handled manually. Among the basic activities involved in manual materials handling, lifting is the one most frequently associated with low-back pain (LBP). Biomechanical analysis techniques have been used to better understand the risk factors associated with manual handling, but because these techniques require specialized equipment, highly trained personnel, and interfere with normal business operations, they are limited in their usefulness. A video based lifting technique analysis system (the VidLiTeC™ System) is presented that provides for quantifiable non-invasive biomechanical analysis of the dynamic features of lifting with high inter-coder reliability and low sensitivity to absolute errors. Analysis of results from a laboratory experiment and from field-collected videotape are described that support the reliability, sensitivity, and accuracy claims of the VidLiTeC™ System. The VidLiTeC™ System allows technicians with minimal training and low-tech equipment (a camcorder) to collect large sets of lifting data without interfering with normal business operations. A reasonably accurate estimate of the peak compressive force on the L5/S1 joint can be made from the data collected. Such a system can be used to collect quantified data on lifting techniques that can be related to LBP reporting.

  12. Extension of POA based on Fiber Element to Girder Bridge

    SciTech Connect

    Li Zhenxin; Qiang Shizhong

    2010-05-21

    Because of its simplicity, practicality, lower computational cost, and relatively good results, pushover analysis (POA) has become an effective analytical tool during the last decade for the seismic assessment of buildings, but such work on bridges has been very limited. Hence, the aim of this study is to adapt POA for nonlinear seismic analysis of girder bridges and to investigate its applicability in the case of an existing river-spanning approach bridge. Nonlinear POA, which adopts a fiber-model nonlinear beam-column element based on the flexibility approach and a return period of about 2500 years, is carried out on three different types of bridge models. It can be concluded that POA is applicable to bridges, with some shortcomings associated with the method in general, even when it is applied to buildings. Finally, appropriate selections of the monitoring point and lateral load pattern are suggested according to the dynamic characteristics of girder bridges.

  13. A Smalltalk-based extension to traditional Geographic Information Systems

    SciTech Connect

    Korp, P.A.; Lurie, G.R.; Christiansen, J.H.

    1995-11-01

    The Dynamic Environmental Effects Model© (DEEM), under development at Argonne National Laboratory, is a fully object-based modeling software system that supports distributed, dynamic representation of the interlinked processes and behavior of the earth's surface and near-surface environment, at variable scales of resolution and aggregation. Many of these real world objects are not stored in a format conducive to efficient GIS usage. Their dynamic nature, complexity and number of possible DEEM entity classes precluded efficient integration with traditional GIS technologies due to the loosely coupled nature of their data representations. To address these shortcomings, an intelligent object-oriented GIS engine (OOGIS) was developed. This engine provides not only a spatially optimized object representation, but also direct linkages to the underlying object, its data and behaviors.

  14. Performance Evaluation of Extension Education Centers in Universities Based on the Balanced Scorecard

    ERIC Educational Resources Information Center

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-01-01

    This study aims at developing a set of appropriate performance evaluation indices mainly based on balanced scorecard (BSC) for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts who have real practical experiences in extension education, adequate performance…

  15. Design of extensible meteorological data acquisition system based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Yin-hua; Zhang, Hui-jun; Li, Xiao-hui

    2015-02-01

    In order to compensate for the tropospheric refraction error generated in satellite navigation and positioning, temperature, humidity and air pressure must be fed into the relevant models to calculate the value of this error. An FPGA (XC6SLX16) was used as the core processor, with the integrated silicon pressure sensor MPX4115A and the digital temperature-humidity sensor SHT75 as the basic meteorological parameter detection devices. The core processor was used to control the real-time sampling of the ADC AD7608 and to acquire the serial output data of the SHT75. The data were stored in the BRAM of the XC6SLX16 and used to generate standard meteorological parameters in NMEA format. The whole design was based on the Altium hardware platform and the ISE software platform. The system was described in VHDL and as a schematic diagram to realize correct detection of temperature, humidity and air pressure. The 8-channel synchronous sampling of the AD7608 and the programmable external resources of the FPGA lay the foundation for adding further analog or digital meteorological element signals. The designed meteorological data acquisition system features low cost, high performance and easy expansion.
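
    The abstract does not name the refraction model fed by these sensors; one widely used choice for turning surface pressure, temperature and humidity into a zenith tropospheric delay is the Saastamoinen model. The Python sketch below illustrates that step with the commonly quoted coefficients; the function name, default arguments and test values are ours.

        import math

        def saastamoinen_zenith_delay(pressure_hpa, temp_c, rel_humidity,
                                      latitude_deg=45.0, height_m=0.0):
            """Approximate zenith tropospheric delay (m) from surface pressure (hPa),
            temperature (deg C) and relative humidity (0-1), Saastamoinen-type model."""
            t_k = temp_c + 273.15
            # Saturation water vapour pressure (hPa), Magnus-type formula
            e_s = 6.11 * math.exp(17.27 * temp_c / (temp_c + 237.3))
            e = rel_humidity * e_s                      # partial water vapour pressure (hPa)
            phi = math.radians(latitude_deg)
            # Hydrostatic (dry) component
            zhd = 0.0022768 * pressure_hpa / (
                1.0 - 0.00266 * math.cos(2.0 * phi) - 0.00000028 * height_m)
            # Wet component
            zwd = 0.002277 * (1255.0 / t_k + 0.05) * e
            return zhd + zwd

        print(saastamoinen_zenith_delay(1013.25, 20.0, 0.6))   # about 2.4 m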

  16. XSemantic: An Extension of LCA Based XML Semantic Search

    NASA Astrophysics Data System (ADS)

    Supasitthimethee, Umaporn; Shimizu, Toshiyuki; Yoshikawa, Masatoshi; Porkaew, Kriengkrai

    One of the most convenient ways to query XML data is a keyword search because it does not require any knowledge of XML structure or learning a new user interface. However, keyword search is ambiguous: users may use different terms to search for the same information, and it is difficult for a system to decide which node is likely to be chosen as a return node and how much information should be included in the result. To address these challenges, we propose a keyword-based XML semantic search called XSemantic. On the one hand, we address three aspects of semantics. First, through semantic term expansion our system is made robust to ambiguous keywords by using a domain ontology. Second, to return semantically meaningful answers, we automatically infer the return information from the user queries and take advantage of the shortest path to return meaningful connections between keywords. Third, we present a semantic ranking that reflects the degree of similarity as well as the semantic relationship, so that search results with higher relevance are presented to the users first. On the other hand, as in the LCA and proximity search approaches, we investigated the problem of how much information to include in the search results. We therefore introduce the notion of the Lowest Common Element Ancestor (LCEA) and define a simple rule that requires no schema information such as a DTD or XML Schema. The first experiment indicated that XSemantic not only properly infers the return information but also generates compact, meaningful results. Additionally, the benefits of our proposed semantics are demonstrated by the second experiment.
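
    For readers unfamiliar with the LCA family of XML search methods, the Python sketch below (standard library only) computes a plain lowest common ancestor of two elements; it illustrates the base notion that LCEA refines, not the paper's own LCEA rule, and the tiny document is invented.

        import xml.etree.ElementTree as ET

        def lowest_common_ancestor(root, node_a, node_b):
            """Plain LCA of two elements in an ElementTree document."""
            parent = {child: p for p in root.iter() for child in p}
            ancestors = set()
            n = node_a
            while n is not None:            # collect node_a and all of its ancestors
                ancestors.add(n)
                n = parent.get(n)
            n = node_b
            while n not in ancestors:       # walk up from node_b until the paths meet
                n = parent.get(n)
            return n

        root = ET.fromstring("<bib><book><title>XML</title><author>Lee</author></book></bib>")
        title, author = root.find(".//title"), root.find(".//author")
        print(lowest_common_ancestor(root, title, author).tag)   # book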

  17. Artificial Intelligence based technique for BTS placement

    NASA Astrophysics Data System (ADS)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase of base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof against regulatory requirements. In this paper, an artificial-intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique objectively takes neighbour and regulatory considerations into account while determining the cell site, leading to a quantitatively unbiased decision-making process for BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm, and the results show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a genetic algorithm (GA) with a neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.
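
    The abstract does not give the algorithm's internals; the Python sketch below is a deliberately simple GA-style search for BTS coordinates in which a minimum-spacing (neighbour) requirement enters the fitness as a penalty term. Demand points, spacing limit and all parameters are invented for illustration.

        import math
        import random

        random.seed(1)
        DEMAND = [(random.uniform(0, 2000), random.uniform(0, 3000)) for _ in range(60)]
        MIN_NEIGHBOUR_DIST = 500.0          # required spacing between BTS sites (m)
        N_SITES, POP, GENS = 4, 40, 200

        def fitness(sites):
            # Coverage term: mean distance from each demand point to its nearest BTS.
            cover = sum(min(math.dist(d, s) for s in sites) for d in DEMAND) / len(DEMAND)
            # Penalty term: violations of the minimum spacing between BTS sites.
            penalty = sum(max(0.0, MIN_NEIGHBOUR_DIST - math.dist(a, b))
                          for i, a in enumerate(sites) for b in sites[i + 1:])
            return cover + 10.0 * penalty   # lower is better

        def random_solution():
            return [(random.uniform(0, 2000), random.uniform(0, 3000)) for _ in range(N_SITES)]

        def mutate(sites):
            return [(x + random.gauss(0, 50), y + random.gauss(0, 50)) for x, y in sites]

        pop = [random_solution() for _ in range(POP)]
        for _ in range(GENS):
            pop.sort(key=fitness)                       # elitist selection
            parents = pop[:POP // 2]
            children = [mutate(random.choice(parents)) for _ in range(POP - len(parents))]
            pop = parents + children

        print([(round(x), round(y)) for x, y in min(pop, key=fitness)])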

  18. An Extensible Space-Based Coordination Approach for Modeling Complex Patterns in Large Systems

    NASA Astrophysics Data System (ADS)

    Kühn, Eva; Mordinyi, Richard; Schreiber, Christian

    Coordination is frequently associated with shared data spaces employing Linda coordination, but in practice communication between parallel and distributed processes is carried out with message exchange patterns. What, actually, do shared data spaces contribute beyond these? In this paper we present a formal representation for a definition of shared spaces by introducing an "extensible tuple model", based on existing research on Linda coordination, some Linda extensions, and virtual shared memory. The main enhancements of the extensible tuple model comprise: means for structuring spaces, Internet-compatible addressing of resources, more powerful coordination capabilities, a clear separation of user data and coordination information, support of symmetric peer application architectures, and extensibility through programmable aspects. The advantage of the extensible tuple model (XTM) is that it allows for the specification of complex coordination patterns.
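
    As background for readers who have not met Linda-style coordination, the Python sketch below implements a minimal in-process tuple space with the classic out/rd/in operations and None as a wildcard; it is illustrative only and does not implement the paper's extensible tuple model (XTM).

        import threading

        class TupleSpace:
            """Minimal Linda-style tuple space (out/rd/in); None acts as a wildcard."""
            def __init__(self):
                self._tuples = []
                self._cond = threading.Condition()

            def out(self, tup):                       # write a tuple into the space
                with self._cond:
                    self._tuples.append(tup)
                    self._cond.notify_all()

            def _match(self, template, tup):
                return len(template) == len(tup) and all(
                    t is None or t == v for t, v in zip(template, tup))

            def _take(self, template, remove):
                with self._cond:
                    while True:
                        for tup in self._tuples:
                            if self._match(template, tup):
                                if remove:
                                    self._tuples.remove(tup)
                                return tup
                        self._cond.wait()             # block until a matching tuple arrives

            def rd(self, template):                   # blocking read, tuple stays in the space
                return self._take(template, remove=False)

            def in_(self, template):                  # blocking read, tuple is removed
                return self._take(template, remove=True)

        space = TupleSpace()
        space.out(("temperature", "room-42", 21.5))
        print(space.rd(("temperature", None, None)))  # ('temperature', 'room-42', 21.5)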

  19. The Knowledge Base as an Extension of Distance Learning Reference Service

    ERIC Educational Resources Information Center

    Casey, Anne Marie

    2012-01-01

    This study explores knowledge bases as an extension of reference services for distance learners. Through a survey and follow-up interviews with distance learning librarians, this paper discusses their interest in creating and maintaining a knowledge base as a resource for reference services to distance learners. It also investigates their perceptions…

  20. Graduate Followup: B.S. Occupational Education Extension Program at Naval Submarine Base, Bangor, Washington.

    ERIC Educational Resources Information Center

    Fellows, George; And Others

    Graduates of a military-base extension program at the Naval Submarine Base, Bangor, Washington, leading to a Bachelor of Science degree in occupational education were studied. Graduates are prepared to teach their occupational specialty at colleges as well as for occupational education work in government, private enterprise, and health care…

  1. Extension of 193 nm dry lithography to 45-nm half-pitch node: double exposure and double processing technique

    NASA Astrophysics Data System (ADS)

    Biswas, Abani M.; Li, Jianliang; Hiserote, Jay A.; Melvin, Lawrence S., III

    2006-10-01

    Immersion lithography and multiple exposure techniques are the most promising methods to extend lithography manufacturing to the 45-nm node. Although immersion lithography has attracted much attention recently as a promising optical lithography extension, it will not solve all the problems at the 45-nm node. The 'dry' option (i.e., double exposure/etch), which can be realized with standard processing practice, will extend 193-nm lithography to the end of the current industry roadmap. Double exposure/etch lithography is expensive in terms of cost, throughput time, and overlay registration accuracy. However, it is less challenging compared to other possible alternatives and has the ability to break through the k1 barrier (0.25). This process, in combination with attenuated PSM (att-PSM) masks, is a good imaging solution that can reach, and most likely go beyond, the 45-nm node. Mask making requirements in a double exposure scheme will be reduced significantly; the separation of a tightly pitched mask into two less demanding pitch patterns relaxes the stringent specifications for each mask. In this study, modeling of double exposure lithography (DEL) with att-PSM masks to target the 45-nm node is described. In addition, mask separation and implementation issues of optical proximity corrections (OPC) to improve the process window are studied. To understand the impact of OPC on the process window, Fourier analysis of the masks has been carried out as well.
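
    The k1 barrier mentioned above comes from the Rayleigh scaling of the printable half-pitch; a quick check with an assumed dry-scanner numerical aperture of NA = 0.93 (our assumption, not a figure from the paper) shows why a single 193-nm dry exposure falls short at the 45-nm half-pitch:

        HP = k_1 \frac{\lambda}{NA}, \qquad k_1 \ge 0.25 \ \text{(single exposure)}

        k_1 = \frac{HP \cdot NA}{\lambda} = \frac{45\,\mathrm{nm} \times 0.93}{193\,\mathrm{nm}} \approx 0.22 < 0.25

    Splitting the pattern into two exposures roughly doubles the pitch on each mask, bringing each individual exposure back above the single-exposure limit.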

  2. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2015-04-01

    Despite the extreme complexity that characterizes the earthquake generation process, simple phenomenology seems to apply in the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity, providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder: how can the collective properties of a set formed by all earthquakes in a given region be derived, and how does the structure of seismicity depend on its elementary constituents - the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, making the use of statistical physics necessary to understand the collective properties of earthquakes. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the microscale and crack opening level to the level of large earthquakes? An answer to the previous question could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi) fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon, introducing the new topic of non-extensive statistical seismology. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project.
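
    For reference (the abstract itself gives no formulas), the entropy introduced by Tsallis (1988) that underlies non-extensive statistical physics is

        S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
        \qquad
        \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i

    where the entropic index q measures the departure from the additive Boltzmann-Gibbs case recovered in the limit q → 1.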

  3. Towards Semantically Sensitive Text Clustering: A Feature Space Modeling Technology Based on Dimension Extension

    PubMed Central

    Liu, Yuanchao; Liu, Ming; Wang, Xin

    2015-01-01

    The objective of text clustering is to divide document collections into clusters based on the similarity between documents. In this paper, an extension-based feature modeling approach towards semantically sensitive text clustering is proposed along with the corresponding feature space construction and similarity computation method. By combining the similarity in traditional feature space and that in extension space, the adverse effects of the complexity and diversity of natural language can be addressed and clustering semantic sensitivity can be improved correspondingly. The generated clusters can be organized using different granularities. The experimental evaluations on well-known clustering algorithms and datasets have verified the effectiveness of our approach. PMID:25794172

  4. Towards semantically sensitive text clustering: a feature space modeling technology based on dimension extension.

    PubMed

    Liu, Yuanchao; Liu, Ming; Wang, Xin

    2015-01-01

    The objective of text clustering is to divide document collections into clusters based on the similarity between documents. In this paper, an extension-based feature modeling approach towards semantically sensitive text clustering is proposed along with the corresponding feature space construction and similarity computation method. By combining the similarity in traditional feature space and that in extension space, the adverse effects of the complexity and diversity of natural language can be addressed and clustering semantic sensitivity can be improved correspondingly. The generated clusters can be organized using different granularities. The experimental evaluations on well-known clustering algorithms and datasets have verified the effectiveness of our approach.
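
    The two records above describe the same approach; as a purely illustrative reading of its central idea, the Python sketch below combines cosine similarity in a traditional TF-IDF space with similarity in an added extension space through a weighted sum. The weight alpha, the vectors and the field names are invented and are not the paper's exact formulation.

        import numpy as np

        def cosine(u, v):
            return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

        def combined_similarity(doc_a, doc_b, alpha=0.7):
            """Weighted mix of similarity in the traditional (TF-IDF) space and in an
            extension space holding related-term dimensions; alpha is illustrative."""
            base = cosine(doc_a["tfidf"], doc_b["tfidf"])
            ext = cosine(doc_a["extension"], doc_b["extension"])
            return alpha * base + (1.0 - alpha) * ext

        a = {"tfidf": np.array([1.0, 0.0, 2.0]), "extension": np.array([0.5, 0.5])}
        b = {"tfidf": np.array([0.0, 1.0, 1.0]), "extension": np.array([0.4, 0.6])}
        print(round(combined_similarity(a, b), 3))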

  5. Analysis and amendment of flow control credit-based in SAN extension

    NASA Astrophysics Data System (ADS)

    Qin, Leihua; Yu, Shengsheng; Zhou, Jingli

    2005-11-01

    Organizations increasingly face an enormous influx of data that must be stored, protected, backed up and replicated, and one of the best ways to achieve this is to interconnect geographically dispersed SANs through reliable, high-speed links. In this storage extension application, flow control deals with the problem of a device receiving frames faster than it can process them; when this happens, the device is forced to drop some of the frames. The FC flow control protocol is a credit-based mechanism and is usually used for SAN extension over WDM and over SONET/SDH. With FC flow control, when a source storage device intends to send data to a target storage device, the initiating device must receive credits from the target device; for every credit the initiating device obtains, it is permitted to transmit one FC frame, so congestion in the network is avoided. This paper analyses the mechanism of FC flow control and its limitations in SAN extension as the extension distance increases. Computed results indicate that the maximum link efficiency and throughput in SAN extension depend on the credits, the frame size and the extension distance. In order to achieve the maximum link efficiency and throughput, an extended FC flow control mechanism is proposed.
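
    The dependence of link efficiency on credits, frame size and distance noted above can be made concrete with a standard sliding-window estimate: the sender may keep at most as many frames in flight as it holds credits before it must wait for credit returns. The Python sketch below uses illustrative figures (2 Gbps line rate, 2112-byte frames, 2 x 10^8 m/s propagation in fibre); it is not the model developed in the paper.

        def fc_link_efficiency(credits, frame_bytes=2112, line_rate_gbps=2.0, distance_km=100.0):
            """Upper bound on link utilisation under credit-based flow control
            (sliding-window estimate with illustrative parameter values)."""
            t_frame = frame_bytes * 8 / (line_rate_gbps * 1e9)   # frame serialisation time (s)
            rtt = 2.0 * distance_km * 1e3 / 2.0e8                # round-trip time in fibre (s)
            return min(1.0, credits * t_frame / (t_frame + rtt))

        for d in (10, 100, 500):
            print(d, "km:", round(fc_link_efficiency(credits=16, distance_km=d), 3))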

  6. DCT-based cyber defense techniques

    NASA Astrophysics Data System (ADS)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content in order to attack the end user. Most of the attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel, real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
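
    Both defenses operate on DCT coefficients; as a generic illustration of that kind of processing (not the paper's smart-threshold or anomaly-correction algorithms), the Python sketch below zeroes small DCT coefficients of an 8x8 block and transforms back. The threshold value and the random block are arbitrary.

        import numpy as np
        from scipy.fft import dctn, idctn

        def threshold_block(block, threshold=10.0):
            """Zero out small DCT coefficients of a block and transform back."""
            coeffs = dctn(block, norm="ortho")
            coeffs[np.abs(coeffs) < threshold] = 0.0
            return idctn(coeffs, norm="ortho")

        rng = np.random.default_rng(0)
        block = rng.integers(0, 256, size=(8, 8)).astype(float)
        filtered = threshold_block(block)
        print(np.round(filtered - block, 1))   # per-pixel change introduced by the filtering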

  7. Strategic Partnerships that Strengthen Extension's Community-Based Entrepreneurship Programs: An Example from Maine

    ERIC Educational Resources Information Center

    Bassano, Louis V.; McConnon, James C., Jr.

    2011-01-01

    This article explains how Extension can enhance and expand its nationwide community-based entrepreneurship programs by developing strategic partnerships with other organizations to create highly effective educational programs for rural entrepreneurs. The activities and impacts of the Down East Micro-Enterprise Network (DEMN), an alliance of three…

  8. Raising Awareness of Assistive Technology in Older Adults through a Community-Based, Cooperative Extension Program

    ERIC Educational Resources Information Center

    Sellers, Debra M.; Markham, Melinda Stafford

    2012-01-01

    The Fashion an Easier Lifestyle with Assistive Technology (FELAT) curriculum was developed as a needs-based, community educational program provided through a state Cooperative Extension Service. The overall goal for participants was to raise awareness of assistive technology. Program evaluation included a postassessment and subsequent interview to…

  9. School-Based Peer Mediation Programs: A Natural Extension of Developmental Guidance Programs.

    ERIC Educational Resources Information Center

    Robertson, Gwendolyn

    School-based peer mediation programs are natural extensions of the kindergarten-grade 12 developmental guidance programs. Peer mediation programs not only provide schools with alternatives to traditional discipline practices, but also teach students important life skills. Existing research on peer mediation is very limited, yet promising. This…

  10. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.

    2014-12-01

    Despite the extreme complexity that characterizes the earthquake generation process, simple phenomenology seems to apply in the collective properties of seismicity. The best known is the Gutenberg-Richter relation. Short and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity, providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how this can be derived from first principles, one may wonder: how can the collective properties of a set formed by all earthquakes in a given region be derived, and how does the structure of seismicity depend on its elementary constituents - the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, making the use of statistical physics necessary to understand the collective properties of earthquakes. Then a natural question arises. What type of statistical physics is appropriate to commonly describe effects from the microscale and crack opening level to the level of large earthquakes? An answer to the previous question could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi) fractal distributions of their elements and where long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon, introducing the new topic of non-extensive statistical seismology. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project.

  11. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  12. Flood alert system based on bayesian techniques

    NASA Astrophysics Data System (ADS)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects over the city of Malaga (600,000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have been emerging in the last decade as a very useful and reliable computational tool for water resources and for the decision making process. The joint use of Artificial Neural Networks (ANN) and BN has served us to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit, while also contributing an uncertainty estimate to the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also

  13. Performance evaluation of extension education centers in universities based on the balanced scorecard.

    PubMed

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-02-01

    This study aims at developing a set of appropriate performance evaluation indices, mainly based on the balanced scorecard (BSC), for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts who have real practical experience in extension education, adequate performance evaluation indices were selected; the decision making trial and evaluation laboratory (DEMATEL) and the analytic network process (ANP) were then applied, respectively, to establish the causality between the four BSC perspectives and the relative weights between evaluation indices. Based on this result, an empirical analysis of the performance evaluation of the extension education centers of three universities in Taoyuan County, Taiwan is illustrated by applying VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). The analysis results indicate that "Learning and growth" is the significant influential factor, affecting the other three perspectives. In addition, the "Internal process" and "Financial" perspectives play important roles in the performance evaluation of extension education centers. The top three key performance indices are "After-sales service", "Turnover volume", and "Net income". The proposed evaluation model could be considered as a reference for extension education centers in universities to prioritize their improvements on the key performance indices after performing VIKOR analyses.
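
    The final ranking step named above (VIKOR) is compact enough to sketch; the Python snippet below implements the basic S/R/Q computation for benefit-type criteria. The three hypothetical centers, their scores and the weights are invented and are not data from the study.

        import numpy as np

        def vikor(scores, weights, v=0.5):
            """Basic VIKOR: `scores` is an (alternatives x criteria) array of benefit-type
            scores, `weights` sums to 1; returns Q, where lower values rank better."""
            f_best, f_worst = scores.max(axis=0), scores.min(axis=0)
            norm = (f_best - scores) / (f_best - f_worst)
            s = (weights * norm).sum(axis=1)     # group utility
            r = (weights * norm).max(axis=1)     # individual regret
            q = v * (s - s.min()) / (s.max() - s.min()) \
                + (1 - v) * (r - r.min()) / (r.max() - r.min())
            return q

        scores = np.array([[7.0, 8.0, 6.5],      # center A: service, turnover, net income
                           [6.0, 9.0, 7.0],      # center B
                           [8.0, 6.5, 6.0]])     # center C
        weights = np.array([0.4, 0.35, 0.25])
        print(np.argsort(vikor(scores, weights)))  # alternative indices, best to worst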

  14. Performance evaluation of extension education centers in universities based on the balanced scorecard.

    PubMed

    Wu, Hung-Yi; Lin, Yi-Kuei; Chang, Chi-Hsiang

    2011-02-01

    This study aims at developing a set of appropriate performance evaluation indices, mainly based on the balanced scorecard (BSC), for extension education centers in universities by utilizing multiple criteria decision making (MCDM). Through literature reviews and experts who have real practical experience in extension education, adequate performance evaluation indices were selected; the decision making trial and evaluation laboratory (DEMATEL) and the analytic network process (ANP) were then applied, respectively, to establish the causality between the four BSC perspectives and the relative weights between evaluation indices. Based on this result, an empirical analysis of the performance evaluation of the extension education centers of three universities in Taoyuan County, Taiwan is illustrated by applying VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR). The analysis results indicate that "Learning and growth" is the significant influential factor, affecting the other three perspectives. In addition, the "Internal process" and "Financial" perspectives play important roles in the performance evaluation of extension education centers. The top three key performance indices are "After-sales service", "Turnover volume", and "Net income". The proposed evaluation model could be considered as a reference for extension education centers in universities to prioritize their improvements on the key performance indices after performing VIKOR analyses. PMID:20619892

  15. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    NASA Astrophysics Data System (ADS)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
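
    As a toy illustration of the lexical approximation step (the actual system also exploits inflectional variants and phrase-table extension), the Python sketch below maps each out-of-vocabulary token to its closest spelling variant in the training vocabulary using difflib; the vocabulary, cutoff and example tokens are invented.

        import difflib

        def replace_oov(tokens, vocabulary, cutoff=0.8):
            """Replace out-of-vocabulary tokens with their closest spelling variant
            found in the training vocabulary; tokens with no close match are kept."""
            out = []
            for tok in tokens:
                if tok in vocabulary:
                    out.append(tok)
                else:
                    match = difflib.get_close_matches(tok, vocabulary, n=1, cutoff=cutoff)
                    out.append(match[0] if match else tok)
            return out

        vocab = ["colour", "recognise", "translation", "system"]
        print(replace_oov(["color", "recognize", "translaton"], vocab))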

  16. Research on the alternatives in a strategic environmental assessment based on the extension theory.

    PubMed

    Du, Jing; Yang, Yang; Xu, Ling; Zhang, Shushen; Yang, Fenglin

    2012-09-01

    The main purpose of a strategic environmental assessment (SEA) is to facilitate the early consideration of potential environmental impacts in decision-making processes. SEA alternative identification is a core issue within the SEA framework. However, the current methods of SEA alternative formulation and selection are constrained by the limited setting range and lack of scientific evaluation. Thus, the current paper attempts to provide a new methodology based on the extension theory to identify a range of alternatives and screen the best one. Extension planning is applied to formulate a set of alternatives that satisfy the reasonable interests of the stakeholders. Extension priority evaluation is used to assess and optimize the alternatives and present a scientific methodology for the SEA alternative study. Thereafter, the urban traffic plan of Dalian City is used as an example to demonstrate the feasibility of the new method. The traffic planning scheme and the environmental protection scheme are organically combined based on the extension theory, and the reliability and practicality of this approach are examined.

  17. Association of Anterior and Lateral Extraprostatic Extensions with Base-Positive Resection Margins in Prostate Cancer

    PubMed Central

    Abalajon, Mark Joseph; Jang, Won Sik; Kwon, Jong Kyou; Yoon, Cheol Yong; Lee, Joo Yong; Cho, Kang Su; Ham, Won Sik

    2016-01-01

    Introduction Positive surgical margins (PSM) detected in the radical prostatectomy specimen increase the risk of biochemical recurrence (BCR). Still, with a formidable number of patients never experiencing BCR in their lifetime, the reason for this inconsistency has been attributed to artifacts and to spontaneous regression of micrometastatic sites. To investigate the origin of margin-positive cancers, we have looked into the influence of the extraprostatic extension location on the resection-margin-positive site and its implications for BCR risk. Materials & Methods The clinical information and follow-up data of 612 patients who had extraprostatic extension and a positive surgical margin at the time of robot-assisted radical prostatectomy (RARP) in a single center between 2005 and 2014 were modeled using Fine and Gray's competing risk regression analysis for BCR. Extraprostatic extensions were divided into categories according to location as apex, base, anterior, posterior, lateral, and posterolateral. Extraprostatic extensions were defined as the presence of tumor beyond the borders of the gland in the posterior and posterolateral regions. Tumor admixed with periprostatic fat was additionally considered as having extraprostatic extension if the capsule was vague in the anterior, apex, and base regions. Positive surgical margins were defined as the presence of tumor cells at the inked margin on inspection under microscopy. Association of these classifications with the site of PSM was evaluated by Cohen's Kappa analysis for concordance and logistic regression for the odds of apical and base PSMs. Results Median follow-up duration was 36.5 months (interquartile range [IQR] 20.1–36.5). Apex involvement was found in 158 (25.8%) patients and base involvement in 110 (18.0%) patients. PSMs generally were found to be associated with increased risk of BCR regardless of location, with BCR risk highest for base PSM (HR 1.94, 95% CI 1.40–2.68, p<0.001) after adjusting for age, initial

  18. Physics based modeling of a series parallel battery pack for asymmetry analysis, predictive control and life extension

    NASA Astrophysics Data System (ADS)

    Ganesan, Nandhini; Basu, Suman; Hariharan, Krishnan S.; Kolake, Subramanya Mayya; Song, Taewon; Yeo, Taejung; Sohn, Dong Kee; Doo, Seokgwang

    2016-08-01

    Lithium-Ion batteries used for electric vehicle applications are subject to large currents and various operation conditions, making battery pack design and life extension a challenging problem. With increase in complexity, modeling and simulation can lead to insights that ensure optimal performance and life extension. In this manuscript, an electrochemical-thermal (ECT) coupled model for a 6 series × 5 parallel pack is developed for Li ion cells with NCA/C electrodes and validated against experimental data. Contribution of the cathode to overall degradation at various operating conditions is assessed. Pack asymmetry is analyzed from a design and an operational perspective. Design based asymmetry leads to a new approach of obtaining the individual cell responses of the pack from an average ECT output. Operational asymmetry is demonstrated in terms of effects of thermal gradients on cycle life, and an efficient model predictive control technique is developed. Concept of reconfigurable battery pack is studied using detailed simulations that can be used for effective monitoring and extension of battery pack life.

  19. Liquid Tunable Microlenses based on MEMS techniques

    PubMed Central

    Zeng, Xuefeng; Jiang, Hongrui

    2013-01-01

    The recent rapid development in microlens technology has provided many opportunities for miniaturized optical systems and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest as their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlenses generally relies on microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses is first discussed, followed by a description of the three main categories of tunable microlenses involving MEMS techniques: mechanically driven, electrically driven, and those integrated within microfluidic systems. PMID:24163480

  20. ChemSem: an extensible and scalable RSS-based seminar alerting system for scientific collaboration.

    PubMed

    Rzepa, Henry S; Wheat, Andrew; Williamson, Mark J

    2006-01-01

    A seminar announcement system based on the extensive use of XML-based data structures, CML/MathML for carrying more domain-specific molecular content, and open source software components is described. The output is a resource description framework (RDF) site summary (RSS) feed, which potentially carries many advantages over conventional announcement mechanisms, including the ability to aggregate and then sort multiple and diverse RSS feeds on the basis of declared metadata and to feed into RDF-based mechanisms for establishing links between different subject areas. PMID:16711716

  1. A giant vagal schwannoma with unusual extension from skull base to the mediastinum.

    PubMed

    Vijendra, Shenoy S; Rao, Raghavendra A; Prasad, Vishnu; Haseena, S; Niprupama, M

    2015-01-01

    Cervical vagal schwannoma is an extremely rare neoplasm. Middle-aged people are usually affected. These tumors usually present as asymptomatic masses and are almost always benign. Preoperative diagnosis of these lesions is important because of the morbidity associated with their excision, yet preoperative tissue diagnosis is not accurate; imaging can be used to assess the extent and to plan the treatment. Surgical excision with preservation of the nerve of origin is the treatment option. Giant vagal schwannomas are extremely rare; only one case has been reported in the literature to date, and there has been no reported case of an extensive vagal schwannoma extending from the skull base to the mediastinum. Here, we describe the asymptomatic presentation of an unusual-appearing giant cervical vagal schwannoma with extension from the skull base to the mediastinum. PMID:26881559

  2. Service-Based Extensions to an OAIS Archive for Science Data Management

    NASA Astrophysics Data System (ADS)

    Flathers, E.; Seamon, E.; Gessler, P. E.

    2014-12-01

    With new data management mandates from major funding sources such as the National Institutes of Health and the National Science Foundation, the architecture of science data archive systems is becoming a critical concern for research institutions. The Consultative Committee for Space Data Systems (CCSDS), in 2002, released the first version of its Reference Model for an Open Archival Information System (OAIS). The CCSDS document (now an ISO standard) was updated in 2012 with additional focus on verifying the authenticity of data and developing concepts of access rights and a security model. The OAIS model is a good fit for research data archives, having been designed to support data collections of heterogeneous types, disciplines, storage formats, etc. for the space sciences. As fast, reliable, persistent Internet connectivity spreads, new network-available resources have been developed that can support the science data archive. A natural extension of an OAIS archive is the interconnection with network- or cloud-based services and resources. We use the Service Oriented Architecture (SOA) design paradigm to describe a set of extensions to an OAIS-type archive: the purpose and justification for each extension, where and how each extension connects to the model, and an example of a specific service that meets the purpose.

  3. Trends and Techniques for Space Base Electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1979-01-01

    Simulations of various phosphorus and boron diffusions in SOS were completed, and a sputtering system, furnaces, and photolithography-related equipment were set up. Double-layer metal experiments initially utilized wet chemistry techniques. By incorporating ultrasonic etching of the vias, premetal cleaning in a modified buffered HF, phosphorus-doped vapox, and extended sintering, yields of 98% were obtained using the standard test pattern. A two-dimensional modeling program was written for simulating short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption used is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. Although the program is incomplete, a solution of the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2-D MOSFET simulation programs is summarized.

  4. Key Point Based Data Analysis Technique

    NASA Astrophysics Data System (ADS)

    Yang, Su; Zhang, Yong

    In this paper, a new framework for data analysis based on the "key points" in data distribution is proposed. Here, the key points contain three types of data points: bridge points, border points, and skeleton points, where our main contribution is the bridge points. For each type of key points, we have developed the corresponding detection algorithm and tested its effectiveness with several synthetic data sets. Meanwhile, we further developed a new hierarchical clustering algorithm SPHC (Skeleton Point based Hierarchical Clustering) to demonstrate the possible applications of the key points acquired. Based on some real-world data sets, we experimentally show that SPHC performs better compared with several classical clustering algorithms including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.

  5. Latent practice profiles of substance abuse treatment counselors: do evidence-based techniques displace traditional techniques?

    PubMed

    Smith, Brenda D; Liu, Junqing

    2014-04-01

    As more substance abuse treatment counselors begin to use evidence-based treatment techniques, questions arise regarding the continued use of traditional techniques. This study aims to (1) assess whether there are meaningful practice profiles among practitioners reflecting distinct combinations of cognitive-behavioral and traditional treatment techniques; and (2) if so, identify practitioner characteristics associated with the distinct practice profiles. Survey data from 278 frontline counselors working in community substance abuse treatment organizations were used to conduct latent profile analysis. The emergent practice profiles illustrate that practitioners vary most in the use of traditional techniques. Multinomial regression models suggest that practitioners with less experience, more education, and less traditional beliefs about treatment and substance abuse are least likely to mix traditional techniques with cognitive-behavioral techniques. Findings add to the understanding of how evidence-based practices are implemented in routine settings and have implications for training and support of substance abuse treatment counselors.

  6. Accelerator based techniques for contraband detection

    NASA Astrophysics Data System (ADS)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays have the ability to penetrate various materials to large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and have the ability to image the object with an appreciable degree of reliability. Neutron-induced reactions such as (n, γ), (n, n'), and (n, p), or proton-induced γ-resonance absorption, are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed and their advantages, characteristics, and current progress are shown. Areas where use of these methods is currently under evaluation are detection of hidden explosives, illicit drug interdiction, chemical war agent identification, nuclear waste assay, nuclear weapons destruction, and others.

  7. Enhanced mechanical performance of biocompatible hemicelluloses-based hydrogel via chain extension.

    PubMed

    Qi, Xian-Ming; Chen, Ge-Gu; Gong, Xiao-Dong; Fu, Gen-Que; Niu, Ya-Shuai; Bian, Jing; Peng, Feng; Sun, Run-Cang

    2016-09-16

    Hemicelluloses are widely used to prepare gel materials because of their renewability, biodegradability, and biocompatibility. Here, molecular chain extension of hemicelluloses was obtained in a two-step process. Composite hydrogels were prepared via free radical graft copolymerization of crosslinked quaternized hemicelluloses (CQH) and acrylic acid (AA) in the presence of crosslinking agent N,N'-methylenebisacrylamide (MBA). This chain extension strategy significantly improved the mechanical performance of the resulting hydrogels. The crosslinking density, compression modulus, and swelling capacities of hydrogels were tuned by changing the AA/CQH and MBA/CQH contents. Moreover, the biocompatibility test suggests that the hemicelluloses-based hydrogels exhibited no toxicity to cells and allowed cell growth. Taken together, these properties demonstrated that the composite hydrogels have potential applications in the fields of water absorbents, cell culture, and other functional biomaterials.

  8. Enhanced mechanical performance of biocompatible hemicelluloses-based hydrogel via chain extension

    PubMed Central

    Qi, Xian-Ming; Chen, Ge-Gu; Gong, Xiao-Dong; Fu, Gen-Que; Niu, Ya-Shuai; Bian, Jing; Peng, Feng; Sun, Run-Cang

    2016-01-01

    Hemicelluloses are widely used to prepare gel materials because of their renewability, biodegradability, and biocompatibility. Here, molecular chain extension of hemicelluloses was obtained in a two-step process. Composite hydrogels were prepared via free radical graft copolymerization of crosslinked quaternized hemicelluloses (CQH) and acrylic acid (AA) in the presence of crosslinking agent N,N’-methylenebisacrylamide (MBA). This chain extension strategy significantly improved the mechanical performance of the resulting hydrogels. The crosslinking density, compression modulus, and swelling capacities of hydrogels were tuned by changing the AA/CQH and MBA/CQH contents. Moreover, the biocompatibility test suggests that the hemicelluloses-based hydrogels exhibited no toxicity to cells and allowed cell growth. Taken together, these properties demonstrated that the composite hydrogels have potential applications in the fields of water absorbents, cell culture, and other functional biomaterials. PMID:27634095

  9. Enhanced mechanical performance of biocompatible hemicelluloses-based hydrogel via chain extension.

    PubMed

    Qi, Xian-Ming; Chen, Ge-Gu; Gong, Xiao-Dong; Fu, Gen-Que; Niu, Ya-Shuai; Bian, Jing; Peng, Feng; Sun, Run-Cang

    2016-01-01

    Hemicelluloses are widely used to prepare gel materials because of their renewability, biodegradability, and biocompatibility. Here, molecular chain extension of hemicelluloses was obtained in a two-step process. Composite hydrogels were prepared via free radical graft copolymerization of crosslinked quaternized hemicelluloses (CQH) and acrylic acid (AA) in the presence of crosslinking agent N,N'-methylenebisacrylamide (MBA). This chain extension strategy significantly improved the mechanical performance of the resulting hydrogels. The crosslinking density, compression modulus, and swelling capacities of hydrogels were tuned by changing the AA/CQH and MBA/CQH contents. Moreover, the biocompatibility test suggests that the hemicelluloses-based hydrogels exhibited no toxicity to cells and allowed cell growth. Taken together, these properties demonstrated that the composite hydrogels have potential applications in the fields of water absorbents, cell culture, and other functional biomaterials. PMID:27634095

  10. A repository based on a dynamically extensible data model supporting multidisciplinary research in neuroscience

    PubMed Central

    2012-01-01

    Background Robust, extensible and distributed databases integrating clinical, imaging and molecular data represent a substantial challenge for modern neuroscience. It is even more difficult to provide extensible software environments able to effectively target the rapidly changing data requirements and structures of research experiments. There is an increasing request from the neuroscience community for software tools addressing technical challenges about: (i) supporting researchers in the medical field to carry out data analysis using integrated bioinformatics services and tools; (ii) handling multimodal/multiscale data and metadata, enabling the injection of several different data types according to structured schemas; (iii) providing high extensibility, in order to address different requirements deriving from a large variety of applications simply through a user runtime configuration. Methods A dynamically extensible data structure supporting collaborative multidisciplinary research projects in neuroscience has been defined and implemented. We have considered extensibility issues from two different points of view. First, the improvement of data flexibility has been taken into account. This has been done through the development of a methodology for the dynamic creation and use of data types and related metadata, based on the definition of a "meta" data model. This way, users are not constrained to a set of predefined data, and the model is easily extensible and applicable to different contexts. Second, users have been enabled to easily customize and extend the experimental procedures in order to track each step of acquisition or analysis. This has been achieved through a process-event data structure, a multipurpose taxonomic schema composed of two generic main objects: events and processes. Then, a repository has been built based on such a data model and structure, and deployed on distributed resources thanks to a Grid-based approach. Finally, data

  11. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    ERIC Educational Resources Information Center

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…

  12. Field of view extension and truncation correction for MR-based human attenuation correction in simultaneous MR/PET imaging

    SciTech Connect

    Blumhagen, Jan O. Ladebeck, Ralf; Fenchel, Matthias; Braun, Harald; Quick, Harald H.; Faul, David; Scheffler, Klaus

    2014-02-15

    Purpose: In quantitative PET imaging, it is critical to accurately measure and compensate for the attenuation of the photons absorbed in the tissue. While in PET/CT the linear attenuation coefficients can be easily determined from a low-dose CT-based transmission scan, in whole-body MR/PET the computation of the linear attenuation coefficients is based on the MR data. However, a constraint of the MR-based attenuation correction (AC) is the MR-inherent field-of-view (FoV) limitation due to static magnetic field (B₀) inhomogeneities and gradient nonlinearities. Therefore, the MR-based human AC map may be truncated or geometrically distorted toward the edges of the FoV and, consequently, the PET reconstruction with MR-based AC may be biased. This is especially relevant laterally, where the patient's arms rest beside the body and are not fully captured. Methods: A method is proposed to extend the MR FoV by determining an optimal readout gradient field which locally compensates B₀ inhomogeneities and gradient nonlinearities. This technique was used to reduce truncation in AC maps of 12 patients, and the impact on the PET quantification was analyzed and compared to truncated data without applying the FoV extension and additionally to an established approach of PET-based FoV extension. Results: The truncation artifacts in the MR-based AC maps were successfully reduced in all patients, and the mean body volume was thereby increased by 5.4%. In some cases large patient-dependent changes in SUV of up to 30% were observed in individual lesions when compared to the standard truncated attenuation map. Conclusions: The proposed technique successfully extends the MR FoV in MR-based attenuation correction and shows an improvement of PET quantification in whole-body MR/PET hybrid imaging. In comparison to the PET-based completion of the truncated body contour, the proposed method is also applicable to specialized PET tracers with little uptake in the arms and might

  13. FDI and Accommodation Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by a trained backpropagation NN and used for residual generation. Process supervision is then applied to discriminate faults in process sensors and process plant parameters. A rule-based expert system implements the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results show an efficient and robust FDI system that could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA when applied to a heat exchanger.

  14. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    NASA Astrophysics Data System (ADS)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

    In the present work, we propose a frequency-domain interferometric imaging (FII) technique to improve knowledge of the vertical distribution of the atmospheric scatterers detected by MST radars. It extends the dual frequency-domain interferometry (FDI) technique to multiple frequencies, with the objective of reducing the ambiguity, inherent in the FDI technique, that results from the use of only two adjacent frequencies. Different methods commonly used in antenna array processing are first described in the context of their application to the FII technique: Fourier-based imaging, Capon's method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems very promising.
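    For readers less familiar with the high-resolution estimators named above, the sketch below shows a generic Capon (minimum-variance) power estimate over trial ranges from a multi-frequency covariance matrix; the steering-vector model and parameters are simplifying assumptions, not the MU radar processing chain:

    ```python
    import numpy as np

    def capon_profile(R, freqs, ranges, c=3.0e8):
        # Capon power estimate P(r) = 1 / (a(r)^H R^-1 a(r)) over trial ranges,
        # where R is the covariance matrix of the frequency channels
        Rinv = np.linalg.pinv(R)
        profile = []
        for r in ranges:
            a = np.exp(-2j * np.pi * 2.0 * freqs * r / c)  # two-way phase steering vector
            profile.append(1.0 / np.real(a.conj() @ Rinv @ a))
        return np.array(profile)
    ```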

  15. Non-destructive techniques based on eddy current testing.

    PubMed

    García-Martín, Javier; Gómez-Gil, Jaime; Vázquez-Sánchez, Ernesto

    2011-01-01

    Non-destructive techniques are used widely in the metal industry in order to control the quality of materials. Eddy current testing is one of the most extensively used non-destructive techniques for inspecting electrically conductive materials at very high speeds that does not require any contact between the test piece and the sensor. This paper includes an overview of the fundamentals and main variables of eddy current testing. It also describes the state-of-the-art sensors and modern techniques such as multi-frequency and pulsed systems. Recent advances in complex models towards solving crack-sensor interaction, developments in instrumentation due to advances in electronic devices, and the evolution of data processing suggest that eddy current testing systems will be increasingly used in the future.

  17. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  18. Fast conjugate gradient algorithm extension for analyzer-based imaging reconstruction

    NASA Astrophysics Data System (ADS)

    Caudevilla, Oriol; Brankov, Jovan G.

    2016-04-01

    This paper presents an extension of the classic Conjugate Gradient Algorithm. Motivated by the Analyzer-Based Imaging (ABI) inverse problem, the novel method maximizes the Poisson regularized log-likelihood under a non-linear transformation of the parameters faster than other solutions. The new approach takes advantage of the special properties of the Poisson log-likelihood to conjugate each ascent direction with respect to all the previous directions taken by the algorithm. Our solution is compared with a general solution for non-quadratic unconstrained problems, the Polak-Ribière formula. Both methods are applied to the ABI reconstruction problem.
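    As context for the baseline named in the abstract, here is a minimal nonlinear conjugate gradient sketch using the Polak-Ribière update and a simple Armijo backtracking line search for a generic smooth objective; it is a generic minimizer, not the authors' Poisson-regularized ABI solver:

    ```python
    import numpy as np

    def ncg_polak_ribiere(f, grad, x0, iters=200, tol=1e-8):
        # minimize a smooth function f with Polak-Ribiere nonlinear conjugate gradient
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(iters):
            t, fx, slope = 1.0, f(x), g @ d
            while f(x + t * d) > fx + 1e-4 * t * slope and t > 1e-12:  # Armijo backtracking
                t *= 0.5
            x_new = x + t * d
            g_new = grad(x_new)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ restart rule
            d = -g_new + beta * d
            x, g = x_new, g_new
            if np.linalg.norm(g) < tol:
                break
        return x
    ```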

  19. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is key equipment of railway vehicles, yet rigorous practical evaluation of bogies remains a challenge, and current practice relies heavily on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and an entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159
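    A minimal sketch of the entropy weight step mentioned above; the index matrix, its normalization, and the example values are generic assumptions rather than the paper's bogie data:

    ```python
    import numpy as np

    def entropy_weights(X):
        # objective index weights from information entropy;
        # X is a (samples, indices) non-negative decision matrix
        P = X / X.sum(axis=0)                 # column-wise normalization
        P = np.where(P > 0, P, 1e-12)         # guard against log(0)
        k = 1.0 / np.log(X.shape[0])
        e = -k * (P * np.log(P)).sum(axis=0)  # entropy of each index
        d = 1.0 - e                           # degree of divergence
        return d / d.sum()

    # hypothetical example: 4 samples scored on 3 risk indices
    print(entropy_weights(np.array([[0.2, 0.8, 0.5],
                                    [0.4, 0.7, 0.6],
                                    [0.9, 0.1, 0.4],
                                    [0.3, 0.6, 0.5]])))
    ```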

  20. Knowledge based systems: A critical survey of major concepts, issues and techniques. Visuals

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry represents a collection of presentation visuals associated with the companion report entitled, Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques, USL/DBMS NASA/RECON Working Paper Series report number DBMS.NASA/RECON-9. The objectives of the report are to: examine various techniques used to build the KBS; to examine at least one KBS in detail, i.e., a case study; to list and identify limitations and problems with the KBS; to suggest future areas of research; and to provide extensive reference materials.

  1. A Word-Based Compression Technique for Text Files.

    ERIC Educational Resources Information Center

    Vernor, Russel L., III; Weiss, Stephen F.

    1978-01-01

    Presents a word-based technique for storing natural language text in compact form. The compressed text consists of a dictionary and a text that is a combination of actual running text and pointers to the dictionary. This technique has shown itself to be effective for both text storage and retrieval. (VT)
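    A simplified sketch of the dictionary-plus-pointers idea described above; it ignores punctuation and whitespace fidelity, which the original technique handles:

    ```python
    def compress(text):
        # build a word dictionary and replace the running text by pointers into it
        dictionary, codes, index = [], [], {}
        for word in text.split():
            if word not in index:
                index[word] = len(dictionary)
                dictionary.append(word)
            codes.append(index[word])
        return dictionary, codes

    def decompress(dictionary, codes):
        return " ".join(dictionary[i] for i in codes)

    # round-trip check
    d, c = compress("to be or not to be")
    assert decompress(d, c) == "to be or not to be"
    ```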

  2. A technique to identify solvable dynamical systems, and another solvable extension of the goldfish many-body problem

    NASA Astrophysics Data System (ADS)

    Calogero, Francesco

    2004-12-01

    We take advantage of the simple approach, recently discussed, which associates to (solvable) matrix equations (solvable) dynamical systems interpretable as (interesting) many-body problems, possibly involving auxiliary dependent variables in addition to those identifying the positions of the moving particles. Starting from a solvable matrix evolution equation, we obtain the corresponding many-body model and note that in one case the auxiliary variables can be altogether eliminated, obtaining thereby an (also Hamiltonian) extension of the "goldfish" model. The solvability of this novel model, and of its isochronous variant, is exhibited. A related, likewise solvable, model is also introduced, together with its isochronous variant. Finally, the small oscillations of the isochronous models around their equilibrium configurations are investigated, and from their isochronicity certain Diophantine relations are evinced.

  3. Polynomial optimization techniques for activity scheduling. Optimization based prototype scheduler

    NASA Technical Reports Server (NTRS)

    Reddy, Surender

    1991-01-01

    Polynomial optimization techniques for activity scheduling (optimization based prototype scheduler) are presented in the form of the viewgraphs. The following subject areas are covered: agenda; need and viability of polynomial time techniques for SNC (Space Network Control); an intrinsic characteristic of SN scheduling problem; expected characteristics of the schedule; optimization based scheduling approach; single resource algorithms; decomposition of multiple resource problems; prototype capabilities, characteristics, and test results; computational characteristics; some features of prototyped algorithms; and some related GSFC references.

  4. Evidence-Based Programming within Cooperative Extension: How Can We Maintain Program Fidelity While Adapting to Meet Local Needs?

    ERIC Educational Resources Information Center

    Olson, Jonathan R.; Welsh, Janet A.; Perkins, Daniel F.

    2015-01-01

    In this article, we describe how the recent movement towards evidence-based programming has impacted Extension. We review how the emphasis on implementing such programs with strict fidelity to an underlying program model may be at odds with Extension's strong history of adapting programming to meet the unique needs of children, youth, families,…

  5. In the Field: Increasing Undergraduate Students' Awareness of Extension through a Blended Project-Based Multimedia Production Course

    ERIC Educational Resources Information Center

    Loizzo, Jamie; Lillard, Patrick

    2015-01-01

    Undergraduate students at land-grant institutions across the country are often unaware of the depth and breadth of Extension services and careers. Agricultural communication students collaborated with an Extension programmatic team in a blended and project-based course at Purdue University to develop online videos about small farm agricultural…

  6. The detection of bulk explosives using nuclear-based techniques

    SciTech Connect

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  7. Application of glyph-based techniques for multivariate engineering visualization

    NASA Astrophysics Data System (ADS)

    Glazar, Vladimir; Marunic, Gordana; Percic, Marko; Butkovic, Zlatko

    2016-01-01

    This article presents a review of glyph-based techniques for engineering visualization as well as their practical application to the multivariate visualization process. Two glyph techniques uncommonly used in engineering practice, Chernoff faces and star glyphs, are described, applied to the selected data set, processed with the chosen optimization methods, and evaluated by users. As an example of how these techniques function, a data set for the optimization of a heat exchanger with a microchannel coil is adopted for visualization. The results acquired with the chosen visualization techniques are related to the results of optimization carried out by the response surface method and compared with the results of the user evaluation. Based on this data set from engineering research and practice, the advantages and disadvantages of these techniques for engineering visualization are identified and discussed.
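    For illustration, a minimal star-glyph sketch with matplotlib; the variable names are hypothetical heat-exchanger quantities and not the data set used in the article:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    def star_glyph(ax, values, labels):
        # one radial axis per variable; values assumed pre-scaled to [0, 1]
        angles = np.linspace(0, 2 * np.pi, len(values), endpoint=False)
        v = np.append(values, values[0])   # close the polygon
        a = np.append(angles, angles[0])
        ax.plot(a, v)
        ax.fill(a, v, alpha=0.25)
        ax.set_xticks(angles)
        ax.set_xticklabels(labels)

    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    star_glyph(ax, [0.7, 0.4, 0.9, 0.6, 0.3],
               ["dT", "dP", "mass flow", "area", "cost"])  # hypothetical variables
    plt.show()
    ```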

  8. Rule Induction with Extension Matrices.

    ERIC Educational Resources Information Center

    Wu, Xindong

    1998-01-01

    Presents a heuristic, attribute-based, noise-tolerant data mining program, HCV (Version 2.0) based on the newly-developed extension matrix approach. Outlines some techniques implemented in the HCV program for noise handling and discretization of continuous domains; an empirical comparison shows that rules generated by HCV are more compact than the…

  9. Extensive aqueous deposits at the base of the dichotomy boundary in Nilosyrtis Mensae, Mars

    NASA Astrophysics Data System (ADS)

    Bandfield, Joshua L.; Amador, Elena S.

    2016-09-01

    Thermal emission imaging system (THEMIS) and Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) spectral datasets were used to identify high bulk SiO2 and hydrated compositions throughout the Nilosyrtis Mensae region. Four isolated locations were identified across the region showing short wavelength silicate absorptions within the 8-12 μm spectral region, indicating surfaces dominated by high Si phases. Much more extensive exposures of hydrated compositions are present throughout the region, indicated by a spectral absorption near 1.9 μm in CRISM data. Although limited in spatial coverage, detailed spectral observations indicate that the hydrated materials contain Fe/Mg-smectites and hydrated silica along with minor exposures of Mg-carbonates and an unidentified hydrated phase. The high SiO2 and hydrated materials are present in layered sediments near the base of topographic scarps at the hemispheric dichotomy boundary, typically near or within low albedo sand deposits. The source of the high SiO2 and hydrated materials appears to be from groundwater discharge from Nili Fossae and Syrtis Major to the south, where there is evidence for extensive aqueous alteration of the subsurface. Although discontinuous, the exposures of high SiO2 and hydrated materials span a wide area and are present in a similar geomorphological context to previously identified deposits in western Hellas Basin. These regional deposits may reflect aqueous conditions and alteration within the adjacent crust of the martian highlands.

  10. Block Copolymer-Based Supramolecular Elastomers with High Extensibility and Large Stress Generation Capability

    NASA Astrophysics Data System (ADS)

    Noro, Atsushi; Hayashi, Mikihiro

    We prepared block copolymer-based supramolecular elastomers with high extensibility and large stress generation capability. Reversible addition fragmentation chain transfer polymerizations were conducted under normal pressure and high pressure to synthesize several large molecular weight polystyrene-b-[poly(butyl acrylate)-co-polyacrylamide]-b-polystyrene (S-Ba-S) block copolymers. Tensile tests revealed that the largest S-Ba-S with a middle-block molecular weight of 3140k achieved a breaking elongation of over 2000% with a maximum tensile stress of 3.6 MPa and a toughness of 28 MJ/m³, while the reference sample without any middle block hydrogen bonds, polystyrene-b-poly(butyl acrylate)-b-polystyrene with almost the same molecular weight, was merely viscous and not self-standing. Hence, incorporation of hydrogen bonds into a long soft middle block was found to be beneficial to attain high extensibility and large stress generation capability, probably due to a concerted combination of entropic changes and internal potential energy changes originating from the dissociation of multiple hydrogen bonds by elongation. This work was supported by JSPS KAKENHI Grant Numbers 13J02357, 24685035, 15K13785, and 23655213 for M.H. and A.N. A.N. also expresses his gratitude for the Tanaka Rubber Science & Technology Award from the Enokagaku-Shinko Foundation, Japan.

  11. Sound source localization technique using a seismic streamer and its extension for whale localization during seismic surveys.

    PubMed

    Abadi, Shima H; Wilcock, William S D; Tolstoy, Maya; Crone, Timothy J; Carbotte, Suzanne M

    2015-12-01

    Marine seismic surveys are under increasing scrutiny because of concern that they may disturb or otherwise harm marine mammals and impede their communications. Most of the energy from seismic surveys is low frequency, so concerns are particularly focused on baleen whales. Extensive mitigation efforts accompany seismic surveys, including visual and acoustic monitoring, but the possibility remains that not all animals in an area can be observed and located. One potential way to improve mitigation efforts is to utilize the seismic hydrophone streamer to detect and locate calling baleen whales. This study describes a method to localize low frequency sound sources with data recorded by a streamer. Beamforming is used to estimate the angle of arriving energy relative to sub-arrays of the streamer, which constrains the horizontal propagation velocity to each sub-array for a given trial location. A grid search method is then used to minimize the time residual for relative arrival times along the streamer estimated by cross correlation. Results from both simulation and experiment are shown, and data from the marine mammal observers and the passive acoustic monitoring conducted simultaneously with the seismic survey are used to verify the analysis.
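    A schematic version of the grid-search step described above, assuming a 2-D geometry, a constant sound speed, and relative delays already estimated by cross-correlation; all of these are simplifications of the published method:

    ```python
    import numpy as np

    def locate(candidates, receivers, observed_delays, c=1500.0):
        # candidates: (K, 2) trial x,y positions; receivers: (M, 2) sub-array centres
        # observed_delays: (M,) arrival times relative to the first sub-array
        best, best_res = None, np.inf
        for p in candidates:
            t = np.linalg.norm(receivers - p, axis=1) / c  # travel time to each sub-array
            pred = t - t[0]                                # predicted relative delays
            res = np.sum((pred - observed_delays) ** 2)    # time residual
            if res < best_res:
                best, best_res = p, res
        return best, best_res
    ```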

  12. Typing of 49 autosomal SNPs by single base extension and capillary electrophoresis for forensic genetic testing.

    PubMed

    Børsting, Claus; Tomas, Carmen; Morling, Niels

    2012-01-01

    We describe a method for simultaneous amplification of 49 autosomal single nucleotide polymorphisms (SNPs) by multiplex PCR and detection of the SNP alleles by single base extension (SBE) and capillary electrophoresis. All the SNPs may be amplified from only 100 pg of genomic DNA and the length of the amplicons range from 65 to 115 bp. The high sensitivity and the short amplicon sizes make the assay very suitable for typing of degraded DNA samples, and the low mutation rate of SNPs makes the assay very useful for relationship testing. Combined, these advantages make the assay well suited for disaster victim identifications, where the DNA from the victims may be highly degraded and the victims are identified via investigation of their relatives. The assay was validated according to the ISO 17025 standard and used for routine case work in our laboratory. PMID:22139655
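    Downstream of such an SBE assay, the allele-calling logic can be sketched roughly as follows; the signal threshold and heterozygote ratio are illustrative assumptions, not the validated laboratory values:

    ```python
    def call_genotype(h1, h2, allele1, allele2, min_signal=50.0, het_ratio=0.3):
        # h1, h2: fluorescence peak heights of the two expected SBE products at one SNP
        if max(h1, h2) < min_signal:
            return None                                   # no call: signal too weak
        ratio = min(h1, h2) / max(h1, h2)
        if ratio >= het_ratio:
            return allele1 + allele2                      # balanced peaks: heterozygote
        return allele1 * 2 if h1 > h2 else allele2 * 2    # one dominant peak: homozygote

    print(call_genotype(820.0, 35.0, "A", "G"))  # hypothetical peaks -> "AA"
    ```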

  13. Efficient Plant Supervision Strategy Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; Rolle, Jose Luis Calvo; Castelo, Francisco Javier Perez

    Most non-linear type-one and type-two control systems suffer from a lack of detectability when model-based techniques are applied to FDI (fault detection and isolation) tasks. In general, all types of processes suffer from a lack of detectability, also because of the ambiguity in discriminating between the process, sensors, and actuators when isolating a given fault. This work deals with a strategy to detect and isolate faults that combines massive neural-network-based functional approximation procedures with recursive rule-based techniques applied to a parity space approach.

  14. Multifunction extension of simplex optimization method for mutual information-based registration of ultrasound volumes

    NASA Astrophysics Data System (ADS)

    Zagrodsky, Vladimir; Shekhar, Raj; Cornhill, J. Fredrick

    2001-07-01

    Mutual information has been demonstrated to be an accurate and reliable criterion function to perform registration of medical data. Due to speckle noise, ultrasound volumes do not provide a smooth mutual information function. Consequently the optimization technique used must be robust enough to avoid local maxima and converge on the desired global maximum eventually. While the well-known downhill simplex optimization uses a single criterion function, our extension to multi-function optimization uses three criterion functions, namely mutual information computed at three levels of intensity quantization and hence three degrees of noise suppression. Registration was performed with rigid as well as simple non-rigid transformation modes for real-time 3D ultrasound datasets of the left ventricle. Pairs of frames corresponding to the most stationary end-diastolic cardiac phase were chosen, and an initial misalignment was artificially introduced between them. The multi-function simplex optimization reduced the failure rate by a factor of two in comparison to the standard simplex optimization, while the average accuracy for the successful cases was unchanged. A more robust registration resulted from the parallel use of criterion functions. The additional computational cost was negligible, as each of the three implementations of the mutual information used the same joint histogram and required no extra spatial transformation.
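    A sketch of the multi-criterion idea: mutual information computed from a joint histogram at three levels of intensity quantization. The bin counts are assumptions, and the simplex bookkeeping itself is omitted:

    ```python
    import numpy as np

    def mutual_information(a, b, bins):
        # requantize intensities into `bins` levels, then compute MI from the joint histogram
        h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        p = h / h.sum()
        px = p.sum(axis=1, keepdims=True)
        py = p.sum(axis=0, keepdims=True)
        nz = p > 0
        return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

    def criteria(fixed, moving):
        # three criterion functions: MI at three levels of intensity quantization
        return [mutual_information(fixed, moving, b) for b in (16, 32, 64)]
    ```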

  15. Diode laser based water vapor DIAL using modulated pulse technique

    NASA Astrophysics Data System (ADS)

    Pham, Phong Le Hoai; Abo, Makoto

    2014-11-01

    In this paper, we propose a diode laser based differential absorption lidar (DIAL) for measuring lower-tropospheric water vapor profiles using a modulated pulse technique. The transmitter is based on a single-mode diode laser and a tapered semiconductor optical amplifier with a peak power of 10 W in the 800 nm absorption band, and the receiver telescope diameter is 35 cm. The selected wavelengths are compared with reference wavelengths in terms of random and systematic errors. The key component of the modulated pulse technique, a macropulse, is generated with a repetition rate of 10 kHz, and the modulation within the macropulse is coded according to a pseudorandom sequence with a 100 ns chip width. We evaluate both the single pulse modulation and the pseudorandom coded pulse modulation techniques. The water vapor profiles retrieved with these modulation techniques are compared with real observation data from summer in Japan.

  16. Bond strength with custom base indirect bonding techniques.

    PubMed

    Klocke, Arndt; Shi, Jianmin; Kahl-Nieke, Bärbel; Bismayer, Ulrich

    2003-04-01

    Different types of adhesives for indirect bonding techniques have been introduced recently. But there is limited information regarding bond strength with these new materials. In this in vitro investigation, stainless steel brackets were bonded to 100 permanent bovine incisors using the Thomas technique, the modified Thomas technique, and light-cured direct bonding for a control group. The following five groups of 20 teeth each were formed: (1) modified Thomas technique with thermally cured base composite (Therma Cure) and chemically cured sealant (Maximum Cure), (2) Thomas technique with thermally cured base composite (Therma Cure) and chemically cured sealant (Custom I Q), (3) Thomas technique with light-cured base composite (Transbond XT) and chemically cured sealant (Sondhi Rapid Set), (4) modified Thomas technique with chemically cured base adhesive (Phase II) and chemically cured sealant (Maximum Cure), and (5) control group directly bonded with light-cured adhesive (Transbond XT). Mean bond strengths in groups 3, 4, and 5 were 14.99 +/- 2.85, 15.41 +/- 3.21, and 13.88 +/- 2.33 MPa, respectively, and these groups were not significantly different from each other. Groups 1 (mean bond strength 7.28 +/- 4.88 MPa) and 2 (mean bond strength 7.07 +/- 4.11 MPa) showed significantly lower bond strengths than groups 3, 4, and 5 and a higher probability of bond failure. Both the original (group 2) and the modified (group 1) Thomas technique were able to achieve bond strengths comparable to the light-cured direct bonded control group.

  17. Extensibility in local sensor based planning for hyper-redundant manipulators (robot snakes)

    NASA Technical Reports Server (NTRS)

    Choset, Howie; Burdick, Joel

    1994-01-01

    Partial Shape Modification (PSM) is a local sensor feedback method used for hyper-redundant robot manipulators, in which the redundancy is very large or infinite, such as in a robot snake. This redundancy enables local obstacle avoidance and end-effector placement in real time. Due to the large number of joints or actuators in a hyper-redundant manipulator, small displacement errors easily accumulate into large errors in the position of the tip relative to the base. Accuracy can be improved by a local sensor based planning method in which sensors are distributed along the length of the hyper-redundant robot. This paper extends the local sensor based planning strategy beyond the limitation of the manipulator's fixed length when its joint limits are met. This is achieved with an algorithm in which the length of the deforming part of the robot is variable. Thus, the robot's local avoidance of obstacles is improved through the enhancement of its extensibility.

  18. Community-based Ontology Development, Annotation and Discussion with MediaWiki extension Ontokiwi and Ontokiwi-based Ontobedia.

    PubMed

    Ong, Edison; He, Yongqun

    2016-01-01

    Hundreds of biological and biomedical ontologies have been developed to support data standardization, integration and analysis. Although ontologies are typically developed for community usage, community efforts in ontology development are limited. To support ontology visualization, distribution, and community-based annotation and development, we have developed Ontokiwi, an ontology extension to the MediaWiki software. Ontokiwi displays hierarchical classes and ontological axioms. Ontology classes and axioms can be edited and added using Ontokiwi form or MediaWiki source editor. Ontokiwi also inherits MediaWiki features such as Wikitext editing and version control. Based on the Ontokiwi/MediaWiki software package, we have developed Ontobedia, which targets to support community-based development and annotations of biological and biomedical ontologies. As demonstrations, we have loaded the Ontology of Adverse Events (OAE) and the Cell Line Ontology (CLO) into Ontobedia. Our studies showed that Ontobedia was able to achieve expected Ontokiwi features. PMID:27570653

  20. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  1. Extensions of the lost letter technique to divisive issues of creationism, darwinism, sex education, and gay and lesbian affiliations.

    PubMed

    Bridges, F Stephen; Anzalone, Debra A; Ryan, Stuart W; Anzalone, Fanancy L

    2002-04-01

    Two field studies using 1,004 "lost letters" were designed to test the hypotheses that returned responses would be greater in small towns than from a city, that addressees' affiliation with a group either (1) opposed to physical education in schools, (2) supporting gay and lesbian teachers, or (3) advocating Creationism or Darwinism would reduce the return rate. Of 504 letters "lost" in Study A, 163 (32.3%) were returned in the mail from residents of southeast Louisiana and indicated across 3 addressees and 2 sizes of community, addressees' affiliations were not associated with returned responses. Community size and addressees' affiliations were associated with significantly different rates of return in the city. Return rates from sites within a city were lower when letters were addressed to an organization which opposed (teaching) health education in the schools than to one supporting daily health education. Of 500 letters "lost" in Study B, 95 (19.0%) were returned from residents of northwest Florida and indicated across 5 addressees and 2 sizes of community, addressees' affiliations were significantly associated with returned responses overall (5 addressees) and in small towns (control, Creationism, Darwinism addressees), but not with community size. Community size and addressees' affiliations were associated with significantly different rates of return in small towns, with returns greater than or equal to those in the city (except for the addressee advocating teaching Darwinism in public schools). The present findings appear to show that applications of the lost letter technique to other divisive social issues are useful in assessing public opinion. PMID:12061574

  3. A Lyapunov-Based Extension to Particle Swarm Dynamics for Continuous Function Optimization

    PubMed Central

    Bhattacharya, Sayantani; Konar, Amit; Das, Swagatam; Han, Sang Yong

    2009-01-01

    The paper proposes three alternative extensions to the classical global-best particle swarm optimization dynamics, and compares their relative performance with the standard particle swarm algorithm. The first extension, which readily follows from the well-known Lyapunov's stability theorem, provides a mathematical basis of the particle dynamics with a guaranteed convergence at an optimum. The inclusion of local and global attractors to this dynamics leads to faster convergence speed and better accuracy than the classical one. The second extension augments the velocity adaptation equation by a negative randomly weighted positional term of individual particle, while the third extension considers the negative positional term in place of the inertial term. Computer simulations further reveal that the last two extensions outperform both the classical and the first extension in terms of convergence speed and accuracy. PMID:22303158
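    To make the second extension concrete, here is a hedged sketch of a global-best PSO velocity update augmented with a negative, randomly weighted positional term; the coefficients and the exact form of the term are assumptions, not the paper's equations:

    ```python
    import numpy as np

    def pso_extension2(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, c3=0.1, bounds=(-5.0, 5.0)):
        rng = np.random.default_rng(0)
        x = rng.uniform(*bounds, size=(n, dim))
        v = np.zeros((n, dim))
        pbest = x.copy()
        pval = np.apply_along_axis(f, 1, x)
        g = pbest[pval.argmin()].copy()
        for _ in range(iters):
            r1, r2, r3 = rng.random((3, n, dim))
            # classical gbest update plus a negative, randomly weighted positional term
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x) - c3 * r3 * x
            x = x + v
            val = np.apply_along_axis(f, 1, x)
            improved = val < pval
            pbest[improved], pval[improved] = x[improved], val[improved]
            g = pbest[pval.argmin()].copy()
        return g, pval.min()

    # hypothetical usage on the sphere function
    print(pso_extension2(lambda z: float((z ** 2).sum()), dim=5))
    ```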

  4. An improved method of gene synthesis based on DNA works software and overlap extension PCR.

    PubMed

    Dong, Bingxue; Mao, Runqian; Li, Baojian; Liu, Qiuyun; Xu, Peilin; Li, Gang

    2007-11-01

    A bottleneck in recent gene synthesis technologies is the high cost of oligonucleotide synthesis and post-synthesis sequencing. In this article, a simple and rapid method for low-cost gene synthesis was developed based on the DNAWorks program and an improved single-step overlap extension PCR (OE-PCR). This method enables any DNA sequence to be synthesized with few errors; any mutated sites can then be corrected by site-specific mutagenesis or by a PCR amplification-assembly method, which amplifies different DNA fragments of the target gene and assembles them into the entire gene through their overlapping regions. Eventually, an error-free full-length DNA sequence was obtained via this novel method. Our method is simple, rapid and low-cost, and also easily amenable to automation based on the DNAWorks design program and a defined set of OE-PCR reaction conditions suitable for different genes. Using this method, several genes, including the manganese peroxidase gene (Mnp) of Phanerochaete chrysosporium (P. chrysosporium), the laccase gene (Lac) of Trametes versicolor (T. versicolor) and the Cip1 peroxidase gene (cip 1) of Coprinus cinereus (C. cinereus), with sizes ranging from 1.0 kb to 1.5 kb, have been synthesized successfully.
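    An in-silico sketch of the overlap-based assembly idea, greatly simplified: it merges sense-strand fragments by shared ends, whereas real OE-PCR assembles alternating-strand oligonucleotides via polymerase extension:

    ```python
    def assemble(fragments, min_overlap=15):
        # greedily merge ordered fragments whose ends overlap by at least min_overlap bases
        seq = fragments[0]
        for frag in fragments[1:]:
            for k in range(min(len(seq), len(frag)), min_overlap - 1, -1):
                if seq.endswith(frag[:k]):
                    seq += frag[k:]
                    break
            else:
                raise ValueError("no sufficient overlap between consecutive fragments")
        return seq
    ```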

  5. Spatial representativeness of ground-based solar radiation measurements - Extension to the full Meteosat disk

    NASA Astrophysics Data System (ADS)

    Zyta Hakuba, Maria; Folini, Doris; Sanchez-Lorenzo, Arturo; Wild, Martin

    2015-04-01

    The spatial representativeness of a point measurement of surface solar radiation (SSR) with respect to its larger-scale surroundings, e.g. a collocated grid cell, is a potential source of uncertainty in the validation of climate models and satellite products. Here, we expand our previous study over Europe to the entire Meteosat disk, covering additional climate zones in Africa, the Middle East, and South America between 70°W and 70°E and between 70°S and 70°N. Using a high-resolution (0.03°) satellite-based SSR dataset (2001-2005), we quantify the spatial subgrid variability in grids of 1° and 3° resolution and the spatial representativeness of 887 surface sites with respect to site-centered surroundings of variable size. In the multi-annual mean, the subgrid variability is largest in some mountainous and coastal regions, but varies seasonally due to changes in the ITCZ location. The absolute mean representation errors at the surface sites with respect to surroundings of 1° and 3° are on average 1-2. Hakuba, M. Z., D. Folini, A. Sanchez-Lorenzo, and M. Wild, Spatial representativeness of ground-based solar radiation measurements - Extension to the full Meteosat disk, J. Geophys. Res. Atmos., 119, doi:10.1002/2014JD021946, 2014.

  6. The Role of Extension Specialists in Helping Entrepreneurs Develop Successful Food-Based Businesses.

    ERIC Educational Resources Information Center

    Holcomb, Rodney; Muske, Glenn

    2000-01-01

    Three areas in which extension specialists can assist food industry entrepreneurs include (1) awareness of the components of a business plan, (2) pro forma financial analysis, and (3) legal issues affecting the food industry. In addition to specialized expertise, extension professionals can help with making contacts, objectively review business…

  7. Promoting Behavior Change Using Social Norms: Applying a Community Based Social Marketing Tool to Extension Programming

    ERIC Educational Resources Information Center

    Chaudhary, Anil Kumar; Warner, Laura A.

    2015-01-01

    Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…

  8. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  9. Contraction-based classification of supersymmetric extensions of kinematical lie algebras

    SciTech Connect

    Campoamor-Stursberg, R.; Rausch de Traubenberg, M.

    2010-02-15

    We study supersymmetric extensions of classical kinematical algebras from the point of view of contraction theory. It is shown that contracting the supersymmetric extension of the anti-de Sitter algebra leads to a hierarchy similar in structure to the classical Bacry-Levy-Leblond classification.

  10. A Randomized Controlled Trial Assessing Growth of Infants Fed a 100% Whey Extensively Hydrolyzed Formula Compared With a Casein-Based Extensively Hydrolyzed Formula.

    PubMed

    Fields, David; Czerkies, Laura; Sun, Shumei; Storm, Heidi; Saavedra, José; Sorensen, Ricardo

    2016-01-01

    This study compared the growth of healthy infants fed a hypoallergenic 100% whey-based extensively hydrolyzed formula (EHF) with Bifidobacterium lactis (test) with that of infants fed an extensively hydrolyzed casein formula (control). Formula-fed infants (14 ± 3 days) were randomized to test or control groups until 112 days of age. Anthropometrics were assessed at 14, 28, 56, 84, and 112 days, and daily records were kept for 2 days prior to study visits. Serum albumin and plasma amino acids at 84 days were assessed in a subset. A total of 282 infants were randomized (124 test, 158 control). Significantly more infants dropped out of the control (56%) as compared with the test (41%) group. Mean daily weight gain was significantly higher in the test group compared with the control group (27.95 ± 5.91 vs 25.93 ± 6.12 g/d; P = .027) with the test group reporting significantly fewer stools (2.2 vs 3.6 stools/d; P < .0001). The control group reported significantly more days with >3 loose stools/d and a higher incidence of vomiting as compared with the test group. There were no differences in gas, mood, sleep, or serum albumin. Plasma arginine and valine were significantly lower in the test group, whereas leucine and lysine were higher; all values were within normal limits. Significantly more adverse events attributed to the study formula were reported in the control group. The 100% whey-based hypoallergenic EHF containing Bifidobacterium lactis and medium chain triglycerides supported growth of healthy infants. Future studies on the application of this formula in clinically indicated populations are warranted.

  13. multiplierz: an extensible API based desktop environment for proteomics data analysis

    PubMed Central

    Parikh, Jignesh R; Askenazi, Manor; Ficarro, Scott B; Cashorali, Tanya; Webber, James T; Blank, Nathaniel C; Zhang, Yi; Marto, Jarrod A

    2009-01-01

    Background Efficient analysis of results from mass spectrometry-based proteomics experiments requires access to disparate data types, including native mass spectrometry files, output from algorithms that assign peptide sequence to MS/MS spectra, and annotation for proteins and pathways from various database sources. Moreover, proteomics technologies and experimental methods are not yet standardized; hence a high degree of flexibility is necessary for efficient support of high- and low-throughput data analytic tasks. Development of a desktop environment that is sufficiently robust for deployment in data analytic pipelines, and simultaneously supports customization for programmers and non-programmers alike, has proven to be a significant challenge. Results We describe multiplierz, a flexible and open-source desktop environment for comprehensive proteomics data analysis. We use this framework to expose a prototype version of our recently proposed common API (mzAPI) designed for direct access to proprietary mass spectrometry files. In addition to routine data analytic tasks, multiplierz supports generation of information rich, portable spreadsheet-based reports. Moreover, multiplierz is designed around a "zero infrastructure" philosophy, meaning that it can be deployed by end users with little or no system administration support. Finally, access to multiplierz functionality is provided via high-level Python scripts, resulting in a fully extensible data analytic environment for rapid development of custom algorithms and deployment of high-throughput data pipelines. Conclusion Collectively, mzAPI and multiplierz facilitate a wide range of data analysis tasks, spanning technology development to biological annotation, for mass spectrometry-based proteomics research. PMID:19874609

  14. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework.

    PubMed

    Wei, Shengjing; Chen, Xiang; Yang, Xidong; Cao, Shuai; Zhang, Xu

    2016-01-01

    Sign language recognition (SLR) can provide a helpful tool for the communication between the deaf and the external world. This paper proposed a component-based vocabulary extensible SLR framework using data from surface electromyographic (sEMG) sensors, accelerometers (ACC), and gyroscopes (GYRO). In this framework, a sign word was considered to be a combination of five common sign components, including hand shape, axis, orientation, rotation, and trajectory, and sign classification was implemented based on the recognition of five components. Especially, the proposed SLR framework consisted of two major parts. The first part was to obtain the component-based form of sign gestures and establish the code table of target sign gesture set using data from a reference subject. In the second part, which was designed for new users, component classifiers were trained using a training set suggested by the reference subject and the classification of unknown gestures was performed with a code matching method. Five subjects participated in this study and recognition experiments under different size of training sets were implemented on a target gesture set consisting of 110 frequently-used Chinese Sign Language (CSL) sign words. The experimental results demonstrated that the proposed framework can realize large-scale gesture set recognition with a small-scale training set. With the smallest training sets (containing about one-third gestures of the target gesture set) suggested by two reference subjects, (82.6 ± 13.2)% and (79.7 ± 13.4)% average recognition accuracy were obtained for 110 words respectively, and the average recognition accuracy climbed up to (88 ± 13.7)% and (86.3 ± 13.7)% when the training set included 50~60 gestures (about half of the target gesture set). The proposed framework can significantly reduce the user's training burden in large-scale gesture recognition, which will facilitate the implementation of a practical SLR system. PMID:27104534
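    The code-matching step described above can be sketched as a nearest-code lookup over the five component labels; the component order and table format are assumptions made for illustration:

    ```python
    def classify(unknown_codes, code_table):
        # unknown_codes: labels produced by the five component classifiers, e.g.
        # (shape, axis, orientation, rotation, trajectory); code_table maps each
        # sign word to its reference component code
        def matches(codes, ref):
            return sum(c == r for c, r in zip(codes, ref))
        return max(code_table, key=lambda word: matches(unknown_codes, code_table[word]))

    # hypothetical two-word table and query
    table = {"thanks": ("flat", "x", "up", "none", "arc"),
             "help":   ("fist", "y", "up", "none", "lift")}
    print(classify(("flat", "x", "up", "none", "line"), table))  # -> "thanks"
    ```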

  15. From Input to Output: Communication-Based Teaching Techniques.

    ERIC Educational Resources Information Center

    Tschirner, Erwin

    1992-01-01

    Communication-based teaching techniques are described that lead German language students from input to output in a stimulating and motivating learning environment. Input activities are most useful for presenting speech acts, vocabulary, and grammar; output activities, for fine-tuning those areas as well as for expanding students' productive…

  16. "Ayeli": Centering Technique Based on Cherokee Spiritual Traditions.

    ERIC Educational Resources Information Center

    Garrett, Michael Tlanusta; Garrett, J. T.

    2002-01-01

    Presents a centering technique called "Ayeli," based on Cherokee spiritual traditions as a way of incorporating spirituality into counseling by helping clients identify where they are in their journey, where they want to be, and how they can get there. Relevant Native cultural traditions and meanings are explored. (Contains 25 references.) (GCP)

  17. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In the nonblind watermarking systems, the need of the original host file in the watermark recovery operation makes an overhead over the system resources, doubles memory capacity, and doubles communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570
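    A hedged sketch of the embedding side using PyWavelets: a three-level DWT of a host frame, with an Arnold-scrambled watermark added to the coarsest approximation band. The band choice, strength alpha, and iteration count are assumptions, not the paper's exact scheme:

    ```python
    import numpy as np
    import pywt

    def arnold(img, iterations=1):
        # Arnold cat map scrambling of a square watermark image
        n = img.shape[0]
        out = img.copy()
        for _ in range(iterations):
            scrambled = np.empty_like(out)
            for x in range(n):
                for y in range(n):
                    scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
            out = scrambled
        return out

    def embed(frame, watermark, alpha=0.05):
        # three-level DWT of the host frame; add the scrambled watermark to the
        # coarsest approximation band (assumes the watermark covers that band)
        coeffs = pywt.wavedec2(frame.astype(float), "haar", level=3)
        cA = coeffs[0]
        wm = arnold(watermark.astype(float), iterations=3)
        coeffs[0] = cA + alpha * wm[:cA.shape[0], :cA.shape[1]]
        return pywt.waverec2(coeffs, "haar")
    ```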

  18. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In the nonblind watermarking systems, the need of the original host file in the watermark recovery operation makes an overhead over the system resources, doubles memory capacity, and doubles communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth.

  19. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    PubMed Central

    Ibrahim, Mohamed M.; Abdel Kader, Neamat S.; Zorkany, M.

    2014-01-01

    Digital watermarking is an important technique for securing digital media files in the domains of data authentication and copyright protection. In nonblind watermarking systems, the need for the original host file during watermark recovery imposes an overhead on system resources, doubling both the required memory capacity and the communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, a three-level discrete wavelet transform (DWT) is used as the watermark embedding/extracting domain, the Arnold transform is used as the watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks, such as geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570

  20. GIS-based assessment of groundwater level on extensive karst areas

    NASA Astrophysics Data System (ADS)

    Kopecskó, Zsanett; Józsa, Edina

    2016-04-01

    Karst topographies represent unique geographical regions containing caves and extensive underground water systems developed especially on soluble rocks such as limestone, marble and gypsum. The significance of these areas is evident considering that 12% of the ice-free continental area consists of landscapes developed on carbonate rocks and 20-25% of the global population depends mostly on groundwater obtained from these systems. Karst water reservoirs already provide 25% of global freshwater resources. Comprehensive studies of these regions are key to exploring their exploitation potential and to analyzing the consequences of contamination, anthropogenic effects and natural processes within these specific hydro-geological settings. For the proposed work we chose several of the largest karst regions over the ice-free part of the continents, representing diverse climatic and topographic characteristics. An important aspect of the study is that no in situ hydrologic measurements are available over the entire research area that would provide discrete sampling of soil, ground and surface water. As a replacement for detailed surveys, multiple remote sensing datasets (Gravity Recovery and Climate Experiment (GRACE) satellite derivative products, Moderate Resolution Imaging Spectroradiometer (MODIS) satellite products and Tropical Rainfall Measuring Mission (TRMM) monthly rainfall datasets) are used along with model reanalysis data (Global Precipitation Climatology Centre (GPCC) data and the Global Land Data Assimilation System (GLDAS)) to study variation over extensive karst areas in response to the changing climate and anthropogenic effects. The analyses are carried out within an open source software environment to enable sharing of the proposed algorithm. The GRASS GIS geoinformatic software and the R statistical program proved to be adequate choices for collecting and analyzing the above mentioned datasets by taking advantage of their interoperability

  1. An extensive survey of dayside diffuse aurora based on optical observations at Yellow River Station

    NASA Astrophysics Data System (ADS)

    Han, De-Sheng; Chen, Xiang-Cai; Liu, Jian-Jun; Qiu, Qi; Keika, K.; Hu, Ze-Jun; Liu, Jun-Ming; Hu, Hong-Qiao; Yang, Hui-Gen

    2015-09-01

    Using 7 years of optical auroral observations obtained at Yellow River Station (magnetic latitude 76.24°N) at Ny-Alesund, Svalbard, we performed the first extensive survey of dayside diffuse auroras (DDAs) and obtained the following observational results. (1) The DDAs can be classified into two broad categories, i.e., unstructured and structured DDAs. The unstructured DDAs are mainly distributed in the morning and afternoon, but the structured DDAs predominantly occur around magnetic local noon (MLN). (2) The unstructured DDAs observed in the morning and afternoon present obviously different properties. The afternoon ones are much more stable and seldom show pulsating behavior. (3) The DDAs are more easily observed under geomagnetically quiet conditions. (4) The structured DDAs mainly show patchy, stripy, and irregular forms and are often pulsating and drifting. The drift directions are mostly westward (with speeds of ~5 km/s), but there are cases showing eastward or poleward drifting. (5) The stripy DDAs are exclusively observed near the MLN and, most importantly, their alignments are confirmed to be consistent with the direction of ionospheric convection near the MLN. (6) A new auroral form, called throat aurora, is found to develop from the stripy DDAs. Based on the observational results and previous studies, we propose explanations for the DDAs. We suggest that the unstructured DDAs observed in the morning are extensions of the nightside diffuse aurora to the dayside, whereas those observed in the afternoon are predominantly caused by proton precipitation. The structured DDAs occurring near the MLN are caused by interactions of cold plasma structures, which are supposed to originate from ionospheric outflows or plasmaspheric drainage plumes, with hot electrons from the plasma sheet. We suppose that the cold plasma structures producing the patchy DDAs are lumpy and are more likely from the plasmaspheric drainage plumes. The cold plasma structure for

  2. CDAPubMed: a browser extension to retrieve EHR-based biomedical literature

    PubMed Central

    2012-01-01

    Background Over the last few decades, the ever-increasing output of scientific publications has led to new challenges to keep up to date with the literature. In the biomedical area, this growth has introduced new requirements for professionals, e.g., physicians, who have to locate the exact papers that they need for their clinical and research work amongst a huge number of publications. Against this backdrop, novel information retrieval methods are even more necessary. While web search engines are widespread in many areas, facilitating access to all kinds of information, additional tools are required to automatically link information retrieved from these engines to specific biomedical applications. In the case of clinical environments, this also means considering aspects such as patient data security and confidentiality or structured contents, e.g., electronic health records (EHRs). In this scenario, we have developed a new tool to facilitate query building to retrieve scientific literature related to EHRs. Results We have developed CDAPubMed, an open-source web browser extension to integrate EHR features in biomedical literature retrieval approaches. Clinical users can use CDAPubMed to: (i) load patient clinical documents, i.e., EHRs based on the Health Level 7-Clinical Document Architecture Standard (HL7-CDA), (ii) identify relevant terms for scientific literature search in these documents, i.e., Medical Subject Headings (MeSH), automatically driven by the CDAPubMed configuration, which advanced users can optimize to adapt to each specific situation, and (iii) generate and launch literature search queries to a major search engine, i.e., PubMed, to retrieve citations related to the EHR under examination. Conclusions CDAPubMed is a platform-independent tool designed to facilitate literature searching using keywords contained in specific EHRs. CDAPubMed is visually integrated, as an extension of a widespread web browser, within the standard PubMed interface. It has

  3. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    SciTech Connect

    DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.; THOMAS, EDWARD V.; WUNSCH, DONALD

    2001-09-01

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.

  4. Extending a GTD-based image formation technique to EUV lithography

    NASA Astrophysics Data System (ADS)

    Khoh, Andrew; Flagello, Donis G.; Milster, Thomas D.; Choi, Byoung-Il; Samudra, Ganesh S.; Wu, Yihong

    2003-06-01

    An image formation technique based on the Geometrical Theory of Diffraction was presented at the last conference. The technique is a scalar technique applicable to infinitely thin, perfectly conducting masks. In this paper we explore the extension of the technique to 1D Extreme Ultra-Violet (EUV) lithography masks, taking into consideration both the material properties and the topography of the mask. The vectorial nature of light is incorporated in the treatment. The results obtained are promising and encouraging. Computation time is much shorter, and the technique can simulate the irradiance profile for any illumination angle. The technique is simple and elegant and lends understanding to image formation. We conclude that the asymmetry-through-focus characteristic usually found in EUV and phase-mask imaging is an imaging phenomenon. We also conclude that corrections for proximity effects and pattern infidelity will be needed when EUV lithography is introduced at the 32 nm node, assuming a system NA of 0.25. Lastly, for partially coherent illumination, it appears necessary to compute the irradiance corresponding to each illumination point individually.

  5. Graphene-based terahertz photodetector by noise thermometry technique

    SciTech Connect

    Wang, Ming-Jye; Wang, Ji-Wun; Wang, Chun-Lun; Chiang, Yen-Yu; Chang, Hsian-Hong

    2014-01-20

    We report the characteristics of a graphene-based terahertz (THz) photodetector based on a noise thermometry technique, measured via its noise power at frequencies from 4 to 6 GHz. A hot-electron system is generated in the graphene microbridge by THz photon pumping and creates extra noise power. The equivalent noise temperature and the electron temperature increase rapidly in the low-THz-pumping regime and saturate gradually in the high-THz-power regime, which is attributed to a faster energy relaxation process driven by stronger electron-phonon interaction. Based on this detector, a conversion efficiency of around 0.15 from THz power to noise power in the 4–6 GHz span has been achieved.

  6. Improving self-report measures of medication non-adherence using a cheating detection extension of the randomised-response-technique.

    PubMed

    Ostapczuk, Martin; Musch, Jochen; Moshagen, Morten

    2011-10-01

    Medication non-adherence is a serious problem for medical research and clinical practice. Self-reports are only moderately valid, and objective methods are cumbersome and expensive to administer. We sought to improve self-reports of medication non-adherence using a cheating detection extension of the randomised-response-technique (RRT). This RRT variant encourages more honest responses by offering interviewees a higher degree of anonymity while simultaneously allowing us to estimate the proportion of respondents disobeying the RRT instructions. A total of 597 patients were asked to report their lifetime prevalence of medication non-adherence under one of two questioning procedures: direct questioning or randomised response. When questioned directly, only 20.9% of patients admitted to intentional medication non-adherent behaviour, as opposed to 32.7% of patients under RRT conditions. Additionally, the cheating detection extension revealed a significant proportion of patients (47.1%) disobeying the instructions in the RRT condition. Assuming that either none or all of them were non-adherent, a lower and upper bound of 32.7% and 79.8%, respectively, could be estimated for the lifetime prevalence of non-adherent behaviour. The results demonstrate that self-report measures as well as traditional variants of the RRT, which do not take cheating into account, may provide considerably distorted estimates of the prevalence of medication non-adherence.
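
    As an illustration of how a cheating-detection RRT design can be evaluated, the sketch below solves a generic two-group forced-response model for the prevalence of the sensitive attribute and the proportion of non-compliant respondents. The model form, the group probabilities and the observed "yes" rates are assumptions for illustration and are not taken from the study.

    import numpy as np

    def cheating_detection_rrt(p1, p2, lam1, lam2):
        """Generic two-group forced-response model: in group j a respondent
        answers the sensitive question honestly with probability p_j and is
        otherwise instructed to answer 'yes'; 'cheaters' answer 'no' regardless.
        Expected 'yes' rate in group j:  lam_j = pi + (1 - p_j) * gamma,
        where pi = honest carriers, gamma = honest non-carriers,
        and beta = 1 - pi - gamma = non-compliant respondents."""
        A = np.array([[1.0, 1.0 - p1],
                      [1.0, 1.0 - p2]])
        pi, gamma = np.linalg.solve(A, np.array([lam1, lam2]))
        return pi, gamma, 1.0 - pi - gamma

    # hypothetical inputs, for illustration only
    pi, gamma, beta = cheating_detection_rrt(p1=0.75, p2=0.50, lam1=0.45, lam2=0.55)
    print(f"prevalence: {pi:.2f}, compliant non-carriers: {gamma:.2f}, cheating: {beta:.2f}")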

  7. An Evaluation of the Relationship between Supervisory Techniques and Organizational Outcomes among the Supervisors in the Agricultural Extension Service in the Eastern Region Districts of Uganda. Summary of Research 81.

    ERIC Educational Resources Information Center

    Padde, Paul; And Others

    A descriptive study examined the relationship between supervisory techniques and organizational outcomes among supervisors in the agricultural extension service in eight districts in eastern Uganda. Self-rating and rater forms of the Multifactor Leadership Questionnaire were sent to 220 extension agents, 8 field supervisors, and 8 deputy field…

  8. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    NASA Astrophysics Data System (ADS)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate the formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian motion is defined as P^2 = a^3, where P is the orbital period in units of years and a is the semi-major axis in units of Astronomical Units (AU). Keplerian motion works on small scales such as the size of the Solar System but not on large scales such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze the Keplerian motion of systems based on stellar age to seek whether Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods such as probabilistic, linear, and proximity-based models. In probabilistic and statistical models of outliers, the parameters of closed-form probability distributions are learned in order to detect the outliers. Linear models use regression-analysis-based techniques for detecting outliers. Proximity-based models use distance-based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density-based algorithms such as kernel density estimation. In this work, we use unsupervised learning algorithms with only the proximity-based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is whether the ratio of planetary mass to stellar mass is less than 0.001. In this work, we present our statistical analysis of the outliers thus detected.
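
    As a concrete illustration of the proximity-based branch described above, the hedged sketch below scores each system by its distance to the k-th nearest neighbour in (log P, log a) space and applies the stated mass-ratio validation cut; the array layout, the choice of k and the flagging threshold are assumptions, not details from the abstract.

    import numpy as np

    def knn_outliers(period_yr, sma_au, k=5, z_thresh=3.0):
        """Proximity-based outlier score: distance to the k-th nearest
        neighbour in (log P, log a) space, flagged if far above the median."""
        X = np.column_stack([np.log10(period_yr), np.log10(sma_au)])
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)                 # exclude self-distances
        kth = np.sort(d, axis=1)[:, k - 1]          # k-th NN distance per system
        score = (kth - np.median(kth)) / (np.std(kth) + 1e-12)
        return score > z_thresh

    def validate(planet_mass, stellar_mass):
        """Validation cut quoted in the abstract: keep only candidates with a
        planetary-to-stellar mass ratio below 0.001."""
        return (planet_mass / stellar_mass) < 1e-3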

  9. Fabrication of thermoplastics chips through lamination based techniques.

    PubMed

    Miserere, Sandrine; Mottet, Guillaume; Taniga, Velan; Descroix, Stephanie; Viovy, Jean-Louis; Malaquin, Laurent

    2012-04-24

    In this work, we propose a novel strategy for the fabrication of flexible thermoplastic microdevices entirely based on lamination processes. The same low-cost laminator apparatus can be used from master fabrication to microchannel sealing. This process is appropriate for rapid prototyping at laboratory scale, but it can also be easily upscaled to industrial manufacturing. For demonstration, we used here Cycloolefin Copolymer (COC), a thermoplastic polymer that is extensively used for microfluidic applications. COC is a thermoplastic polymer with good chemical resistance to common chemicals used in microfluidics such as acids, bases and most polar solvents. Its optical quality and mechanical resistance make this material suitable for a large range of applications in chemistry or biology. As an example, the electrokinetic separation of pollutants is proposed in the present study.

  10. MEMS-based power generation techniques for implantable biosensing applications.

    PubMed

    Lueke, Jonathan; Moussa, Walied A

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical systems (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving the functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  11. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    PubMed Central

    Lueke, Jonathan; Moussa, Walied A.

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical systems (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving the functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient. PMID:22319362

  12. Novel techniques and the future of skull base reconstruction.

    PubMed

    Meier, Joshua C; Bleier, Benjamin S

    2013-01-01

    The field of endoscopic skull base surgery has evolved considerably in recent years fueled largely by advances in both imaging and instrumentation. While the indications for these approaches continue to be extended, the ability to reconstruct the resultant defects has emerged as a rate-limiting obstacle. Postoperative failures with current multilayer grafting techniques remain significant and may increase as the indications for endoscopic resections continue to expand. Laser tissue welding represents a novel method of wound repair in which laser energy is applied to a chromophore doped biologic solder at the wound edge to create a laser weld (fig. 1). These repairs are capable of withstanding forces far exceeding those exerted by intracranial pressure with negligible collateral thermal tissue injury. Recent clinical trials have demonstrated the safety and feasibility of endoscopic laser welding while exposing the limitations of first generation hyaluronic acid based solders. Novel supersaturated gel based solders are currently being tested in clinical trials and appear to possess significantly improved viscoelastic properties. While laser tissue welding remains an experimental technique, continued success with these novel solder formulations may catalyze the widespread adoption of this technique for skull base repair in the near future.

  13. Gabor-based fusion technique for Optical Coherence Microscopy.

    PubMed

    Rolland, Jannick P; Meemon, Panomsak; Murali, Supraja; Thompson, Kevin P; Lee, Kye-sung

    2010-02-15

    We recently reported on an Optical Coherence Microscopy technique whose innovation intrinsically builds on a recently reported liquid-lens-based dynamic focusing optical probe with a 2 micron invariant lateral resolution by design throughout a 2 mm cubic full field of view [Murali et al., Optics Letters 34, 145-147, 2009]. We report in this paper on the image acquisition enabled by this optical probe when combined with an automatic data fusion method, developed and described here, to produce an in-focus high resolution image throughout the imaging depth of the sample. An African frog tadpole (Xenopus laevis) was imaged with the novel probe and the Gabor-based fusion technique, demonstrating subcellular resolution over a 0.5 mm (lateral) x 0.5 mm (axial) region without the need, for the first time, for x-y translation stages, depth scanning, high-cost adaptive optics, or manual intervention. In vivo images of human skin are also presented.

  14. Development and Implementation of an Extensible Interface-Based Spatiotemporal Geoprocessing and Modeling Toolbox

    NASA Astrophysics Data System (ADS)

    Cao, Y.; Ames, D. P.

    2011-12-01

    This poster presents an object oriented and interface-based spatiotemporal data processing and modeling toolbox that can be extended by third parties to include complete suites of new tools through the implementation of simple interfaces. The resulting software implementation includes both a toolbox and workflow designer or "model builder" constructed using the underlying open source DotSpatial library and MapWindow desktop GIS. The unique contribution of this research and software development activity is in the creation and use of an extensibility architecture for both specific tools (through a so-called "ITool" interface) and batches of tools (through a so-called "IToolProvider" interface.) This concept is introduced to allow for seamless integration of geoprocessing tools from various sources (e.g. distinct libraries of spatiotemporal processing code) - including online sources - within a single user environment. In this way, the IToolProvider interface allows developers to wrap large existing collections of data analysis code without having to re-write it for interoperability. Additionally, developers do not need to design the user interfaces for loading, displaying or interacting with their specific tools, but rather can simply implement the provided interfaces and have their tools and tool collections appear in the toolbox alongside other tools. The demonstration software presented here is based on an implementation of the interfaces and sample tool libraries using the C# .NET programming language. This poster will include a summary of the interfaces as well as a demonstration of the system using the Whitebox Geospatial Analysis Tools (GAT) as an example case of a large number of existing tools that can be exposed to users through this new system. Vector analysis tools which are native in DotSpatial are linked to the Whitebox raster analysis tools in the model builder environment for ease of execution and consistent/repeatable use. We expect that this
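
    To make the ITool/IToolProvider extensibility pattern above concrete, a minimal interface sketch is given below. The actual toolbox is built on the .NET DotSpatial/MapWindow stack (typically C#); Python abstract base classes are used here only to illustrate the pattern, and every member name other than ITool and IToolProvider is an assumption.

    from abc import ABC, abstractmethod
    from typing import Iterable

    class ITool(ABC):
        """A single geoprocessing tool the toolbox can host."""

        @property
        @abstractmethod
        def name(self) -> str: ...

        @abstractmethod
        def execute(self, inputs: dict) -> dict:
            """Run the tool on named inputs and return named outputs."""

    class IToolProvider(ABC):
        """A batch of tools contributed by a third-party library."""

        @abstractmethod
        def get_tools(self) -> Iterable[ITool]: ...

    class BufferTool(ITool):                 # hypothetical example tool
        name = "Buffer"
        def execute(self, inputs: dict) -> dict:
            # real work would call into a spatial analysis library here
            return {"result": f"buffered {inputs['features']} by {inputs['distance']}"}

    class WhiteboxProvider(IToolProvider):   # wraps an existing tool collection
        def get_tools(self) -> Iterable[ITool]:
            return [BufferTool()]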

  15. A Gabor-based technique for bias removal in MR images.

    PubMed

    Ardizzone, Edoardo; Pirrone, Roberto; Mastrella, Mario; Gambino, Orazio

    2007-01-01

    Magnetic Resonance images are often characterized by irregularly displaced luminance fluctuations that are called the bias artifact. This disturbance is due to a drop in signal intensity caused by the distance between the imaged sample and the receiver coil. An original approach to bias removal in Magnetic Resonance images is presented, which is based on the use of a Gabor filter to extract the artifact. The proposed technique restores the image using a correction model, which is derived from the attenuation of signal diffusion across the tissues. No hypotheses are made about the structure of the tissues under investigation or the MR spectrum used. The approach is presented in detail, and extensive experimental results are reported along with a comparison with other popular techniques for bias removal.

  16. An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    PubMed Central

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  17. An image morphing technique based on optimal mass preserving mapping.

    PubMed

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2007-06-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L(2) mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  18. DEVA: An extensible ontology-based annotation model for visual document collections

    NASA Astrophysics Data System (ADS)

    Jelmini, Carlo; Marchand-Maillet, Stephane

    2003-01-01

    The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, an early prototype that validates the DEVA features. Magritte allows users to manually annotate image collections. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.

  19. The Intelligent System of Cardiovascular Disease Diagnosis Based on Extension Data Mining

    NASA Astrophysics Data System (ADS)

    Sun, Baiqing; Li, Yange; Zhang, Lin

    This paper gives general definitions of the concepts of extension knowledge, extension data mining and the extension data mining theorem in high-dimensional space, builds an integrated intelligent diagnosis support system (IDSS) from rough sets, an expert system and a neural network, and develops the corresponding computer software. In diagnosis tests on the common diseases of myocardial infarction, angina pectoris and hypertension, with results compared against physicians, the sensitivity, specificity and accuracy of diagnoses made by the IDSS were all higher than those of the physicians. Used as an auxiliary aid, the system can improve the accuracy of physicians' diagnoses, which has clear significance for lowering mortality and disability rates and raising survival rates, and it has strong practical value and further social benefits.

  20. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search for an address only by address matching based on descriptive data. In addition, there are some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides users with different capabilities, such as the ability to search multi-part addresses, search for places based on their location, represent results as non-point features, and display search results based on their priority.
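
    To make the fuzzy distance map and fuzzy overlay steps concrete, the sketch below builds a nearness membership for each reference place from a Euclidean distance grid and combines the maps with a minimum-operator overlay. The Gaussian membership shape, the 500 m scale, the cell size and the example coordinates are assumptions, not values from the paper.

    import numpy as np

    def distance_grid(shape, point, cell_size_m=10.0):
        """Euclidean distance (in metres) from every grid cell to a reference point."""
        rows, cols = np.indices(shape)
        return cell_size_m * np.hypot(rows - point[0], cols - point[1])

    def nearness_membership(dist_m, scale_m=500.0):
        """Fuzzy 'near' membership: 1 at zero distance, decaying smoothly with
        distance (an assumed Gaussian-style membership function)."""
        return np.exp(-(dist_m / scale_m) ** 2)

    def fuzzy_overlay(membership_maps):
        """Combine several fuzzy distance maps with the minimum operator, so a
        cell scores highly only if it is near all reference places."""
        return np.minimum.reduce(membership_maps)

    shape = (200, 200)
    near_school = nearness_membership(distance_grid(shape, (50, 60)))
    near_park = nearness_membership(distance_grid(shape, (120, 150)))
    suitability = fuzzy_overlay([near_school, near_park])
    best_cell = np.unravel_index(np.argmax(suitability), shape)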

  1. A Review of Financial Accounting Fraud Detection based on Data Mining Techniques

    NASA Astrophysics Data System (ADS)

    Sharma, Anuj; Kumar Panigrahi, Prabin

    2012-02-01

    With the upsurge in financial accounting fraud in the current economic scenario, financial accounting fraud detection (FAFD) has become an emerging topic of great importance for academia, research and industry. The failure of organizations' internal auditing systems to identify accounting fraud has led to the use of specialized procedures to detect financial accounting fraud, collectively known as forensic accounting. Data mining techniques are providing great aid in financial accounting fraud detection, since the large volumes and complexity of financial data are big challenges for forensic accounting. This paper presents a comprehensive review of the literature on the application of data mining techniques for the detection of financial accounting fraud and proposes a framework for data-mining-based accounting fraud detection. The systematic and comprehensive literature review of the data mining techniques applicable to financial accounting fraud detection may provide a foundation for future research in this field. The findings of this review show that data mining techniques such as logistic models, neural networks, Bayesian belief networks, and decision trees have been applied most extensively to provide primary solutions to the problems inherent in the detection and classification of fraudulent data.

  2. Hydrocarbon microseepage mapping using signature based target detection techniques

    NASA Astrophysics Data System (ADS)

    Soydan, Hilal; Koz, Alper; Şebnem Düzgün, H.; Aydin Alatan, A.

    2015-10-01

    In this paper, we compare conventional methods for detecting hydrocarbon seepage anomalies with signature-based detection algorithms. The Crosta technique [1] is selected as the baseline conventional approach in the experimental comparisons. The Crosta technique utilizes the characteristic bands of the searched target in a principal component transformation in order to determine the components characterizing the target of interest. The Desired Target Detection and Classification Algorithm (DTDCA), Spectral Matched Filter (SMF), and Normalized Correlation (NC) are employed for signature-based target detection. Signature-based target detection algorithms are applied to the whole spectrum, benefiting from the information stored in all spectral bands. The selected methods are applied to a multispectral Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image of the study region, with an atmospheric correction applied prior to running the algorithms. ASTER provides multispectral bands covering the visible, shortwave, and thermal infrared regions, which serves as a useful tool for the interpretation of areas with hydrocarbon anomalies. The exploration area is the Gemrik Anticline, located in South East Anatolia, Adıyaman, in the Bozova Oil Field, where microseeps can be observed with almost no vegetation cover. The spectral signatures collected with an Analytical Spectral Devices Inc. (ASD) spectrometer from the reference valley [2] have been utilized as input to the signature-based detection algorithms. The experiments have indicated that DTDCA and SMF outperform the Crosta technique by locating the microseepage patterns along the migration pathways with better contrast. On the other hand, NC has not been able to map the searched target with a visible distinction. It is concluded that the signature-based algorithms can be more effective than the conventional methods for the detection of microseepage-induced anomalies.
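
    Of the signature-based detectors compared above, the spectral matched filter has a compact closed form. A minimal numpy sketch is given below, assuming the scene is arranged as a (pixels x bands) array and the target signature comes from the ASD field spectra; the regularization term is an added assumption for numerical stability.

    import numpy as np

    def spectral_matched_filter(pixels, target):
        """Classic SMF score for each pixel:
           score = (t - m)^T C^-1 (x - m) / ((t - m)^T C^-1 (t - m)),
        where m and C are the scene mean and covariance."""
        m = pixels.mean(axis=0)
        X = pixels - m
        C = np.cov(X, rowvar=False) + 1e-6 * np.eye(pixels.shape[1])  # regularized
        Cinv = np.linalg.inv(C)
        t = target - m
        return (X @ Cinv @ t) / (t @ Cinv @ t)

    # pixels: (n_pixels, n_bands) ASTER reflectances; target: (n_bands,) ASD signature
    # scores near 1 indicate pixels resembling the microseepage signature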

  3. The Influence of an Extensive Inquiry-Based Field Experience on Pre-Service Elementary Student Teachers' Science Teaching Beliefs

    ERIC Educational Resources Information Center

    Bhattacharyya, Sumita; Volk, Trudi; Lumpe, Andrew

    2009-01-01

    This study examined the effects of an extensive inquiry-based field experience on pre service elementary teachers' personal agency beliefs, a composite measure of context beliefs and capability beliefs related to teaching science. The research combined quantitative and qualitative approaches and included an experimental group that utilized the…

  4. An In-House Prototype for the Implementation of Computer-Based Extensive Reading in a Limited-Resource School

    ERIC Educational Resources Information Center

    Mayora, Carlos A.; Nieves, Idami; Ojeda, Victor

    2014-01-01

    A variety of computer-based models of Extensive Reading have emerged in the last decade. Different Information and Communication Technologies online usually support these models. However, such innovations are not feasible in contexts where the digital breach limits the access to Internet. The purpose of this paper is to report a project in which…

  5. Eat, Grow, Lead 4-H: An Innovative Approach to Deliver Campus- Based Field Experiences to Pre-Entry Extension Educators

    ERIC Educational Resources Information Center

    Weeks, Penny Pennington; Weeks, William G.

    2012-01-01

    Eat, Grow, Lead 4-H Club was created as a pilot program for college students seeking to gain experience as non-formal youth educators, specifically serving pre-entry level Extension educators through a university-based 4-H club. Seventeen student volunteers contributed an estimated 630 hours of service to the club during spring 2011. The club…

  6. 76 FR 12073 - Extension of Web-Based TRICARE Assistance Program Demonstration Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... Register Notice, 74 FR 3667, July 24, 2009. The demonstration was extended to March 31, 2011, as referenced... in 74 FR 3667 July 24, 2009 launched August 1, 2009, to provide the capability for short-term... original Federal Register Notice, 74 FR 3667 July 24, 2009, and the extension Federal Register...

  7. Designing a Competency-Based New County Extension Personnel Training Program: A Novel Approach

    ERIC Educational Resources Information Center

    Brodeur, Cheri Winton; Higgins, Cynthia; Galindo-Gonzalez, Sebastian; Craig, Diane D.; Haile, Tyann

    2011-01-01

    Voluntary county personnel turnover occurs for a multitude of reasons, including the lack of job satisfaction, organizational commitment, and job embeddedness and lack of proper training. Loss of personnel can be costly both economically and in terms of human capital. Retention of Extension professionals can be improved through proper training or…

  8. Noninvasive in vivo glucose sensing using an iris based technique

    NASA Astrophysics Data System (ADS)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, an ocular fixation device developed for this purpose was used. During the experimental time frame, near-infrared-illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations are within 20% of the reference values.
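
    The calibration step (an image-based Partial Least Squares model relating iris-image features to glucometer readings) can be sketched with scikit-learn's PLS regression. The feature layout, component count, train/validation split and the crude within-20% check below are assumptions for illustration, not the study's actual pipeline or its CEGA implementation.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    # X: (n_images, n_features) features derived from NIR iris images (assumed flattened)
    # y: (n_images,) reference blood glucose from the handheld glucometer, in mg/dL
    def calibrate_pls(X, y, n_components=8):
        X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
        pls = PLSRegression(n_components=n_components)
        pls.fit(X_tr, y_tr)
        y_pred = pls.predict(X_val).ravel()
        # crude stand-in for the "within 20% of reference" summary quoted above
        within_20pct = np.mean(np.abs(y_pred - y_val) <= 0.2 * y_val)
        return pls, within_20pct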

  9. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    NASA Astrophysics Data System (ADS)

    Singh Duksh, Yograj; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-05-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the finite-difference time-domain (FDTD) numerical technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. This method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE.
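
    A minimal 1-D FDTD update for a single lossy line illustrates the voltage/current leapfrog scheme and the Courant restriction mentioned above. The per-unit-length values, the source, the grid size and the crude far-end boundary are placeholders, and the coupled-line (crosstalk) terms analyzed in the paper are omitted for brevity.

    import numpy as np

    # assumed per-unit-length line parameters (placeholders, not the paper's values)
    R, L, C = 10.0, 2.5e-7, 1.0e-10           # ohm/m, H/m, F/m
    dx = 1e-4                                  # spatial step, m
    dt = 0.9 * dx * np.sqrt(L * C)             # Courant condition: dt <= dx*sqrt(L*C)
    nz, nt = 400, 2000

    V = np.zeros(nz + 1)                       # node voltages
    I = np.zeros(nz)                           # branch currents (staggered half-step grid)

    for n in range(nt):
        V[0] = np.sin(2 * np.pi * 5e9 * n * dt)             # placeholder source
        # current update from the spatial voltage gradient, with series loss R
        I += -(dt / L) * ((V[1:] - V[:-1]) / dx + R * I)
        # interior voltage update from the current divergence
        V[1:-1] += -(dt / C) * (I[1:] - I[:-1]) / dx
        V[-1] = V[-2]                                        # crude far-end boundary for this illustration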

  10. An extension to the constructivist coding hypothesis as a learning model for selective feedback when the base rate is high.

    PubMed

    Ghaffarzadegan, Navid; Stewart, Thomas R

    2011-07-01

    Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the .5 base rate condition. This commentary argues that the constructivist coding hypothesis imposes an ever-declining selection rate and overestimates base rate bias for high base rate conditions. We provide support based on a simulation model of learning under selective feedback with different base rates. Then we discuss possible extensions to constructivist coding that can help overcome the problem. PMID:21728470

  11. Reliable and fast allele-specific extension of 3'-LNA modified oligonucleotides covalently immobilized on a plastic base, combined with biotin-dUTP mediated optical detection.

    PubMed

    Michikawa, Yuichi; Fujimoto, Kentaro; Kinoshita, Kenji; Kawai, Seiko; Sugahara, Keisuke; Suga, Tomo; Otsuka, Yoshimi; Fujiwara, Kazuhiko; Iwakawa, Mayumi; Imai, Takashi

    2006-12-01

    In the present work, a convenient microarray SNP typing system has been developed using a plastic base that covalently immobilizes amino-modified oligonucleotides. Reliable SNP allele discrimination was achieved by using allele-specificity-enhanced enzymatic extension of the immobilized oligonucleotide primer, with a locked nucleic acid (LNA) modification at the SNP-discriminating 3'-end nucleotide. Incorporation of multiple biotin-dUTP molecules during primer extension, followed by binding of alkaline phosphatase-conjugated streptavidin, allowed optical detection of the genotyping results through precipitation of colored alkaline phosphatase substrates onto the surface of the plastic base. Notably, rapid primer extension was demonstrated without a preliminary annealing step for the double-stranded template DNA, allowing the overall process to be completed within a couple of hours. Simultaneous evaluation of three SNPs in the genes TGFB1, SOD2 and APEX1, previously investigated for association with radiation sensitivity, in 25 individuals showed perfect agreement with data obtained by another established technique (the MassARRAY system).

  12. Optical accelerometer based on grating interferometer with phase modulation technique.

    PubMed

    Zhao, Shuangshuang; Zhang, Juan; Hou, Changlun; Bai, Jian; Yang, Guoguang

    2012-10-10

    In this paper, an optical accelerometer based on a grating interferometer with a phase modulation technique is proposed. The device architecture consists of a laser diode, a sensing chip and an optoelectronic processing circuit. The sensing chip is a sandwich structure composed of a grating, a piezoelectric translator and a micromachined silicon structure consisting of a proof mass and four cantilevers. The detected signal is intensity-modulated using the phase modulation technique and processed with a lock-in amplifier for demodulation. Experimental results show that this optical accelerometer has an acceleration sensitivity of 619 V/g and a high-resolution acceleration detection limit of 3 μg in the linear region. PMID:23052079
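
    The phase-modulation and lock-in detection chain described above can be illustrated with a short digital lock-in sketch: the intensity-modulated detector signal is mixed with in-phase and quadrature references at the modulation frequency and low-pass filtered to recover amplitude and phase. The sampling rate, modulation frequency and filter order below are assumptions.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def lock_in(signal, fs, f_ref):
        """Digital lock-in: mix with quadrature references at f_ref and
        low-pass filter to recover the amplitude and phase at that frequency."""
        t = np.arange(len(signal)) / fs
        i_mix = signal * np.cos(2 * np.pi * f_ref * t)
        q_mix = signal * np.sin(2 * np.pi * f_ref * t)
        b, a = butter(4, f_ref / 10, fs=fs)          # low-pass cutoff well below f_ref
        I = 2 * filtfilt(b, a, i_mix)
        Q = 2 * filtfilt(b, a, q_mix)
        return np.hypot(I, Q), np.arctan2(Q, I)      # amplitude and phase versus time

    # e.g. fs = 100 kHz sampling, f_ref = 1 kHz phase-modulation frequency (assumed values)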

  13. An osmolyte-based micro-volume ultrafiltration technique.

    PubMed

    Ghosh, Raja

    2014-12-01

    This paper discusses a novel, simple, and inexpensive micro-volume ultrafiltration technique for protein concentration, desalting, buffer exchange, and size-based protein purification. The technique is suitable for processing protein samples in a high-throughput mode. It utilizes a combination of capillary action, and osmosis for drawing water and other permeable species from a micro-volume sample droplet applied on the surface of an ultrafiltration membrane. A macromolecule coated on the permeate side of the membrane functions as the osmolyte. The action of the osmolyte could, if required, be augmented by adding a supersorbent polymer layer over the osmolyte. The mildly hydrophobic surface of the polymeric ultrafiltration membrane used in this study minimized sample droplet spreading, thus making it easy to recover the retained material after separation, without sample interference and cross-contamination. High protein recoveries were observed in the micro-volume ultrafiltration experiments described in the paper. PMID:25284741

  14. Vision based techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to the automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image data base for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight, are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except in regions close to the focus of expansion (FOE). Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.

  15. Mapping lifecycle management activities for blockbuster drugs in Japan based on drug approvals and patent term extensions.

    PubMed

    Yamanaka, Takayuki; Kano, Shingo

    2016-02-01

    Drug lifecycle management (LCM), which entails acquiring drug approvals and patent protections, contributes to maximizing drug discovery investment returns. In a previous survey, a comparative analysis between Japan and the USA indicated that a unique patent term extension system has an important role in Japanese drug LCM. Therefore, in this survey, we focused on drug approvals and patent term extensions, and found that the LCM for blockbuster drugs in Japan can be categorized into three types (drug approval-oriented LCM, patent term extension-oriented LCM, and inactive-type LCM), of which the first two have been implemented recently. Here, we suggest a strategy for selecting a suitable LCM approach among these three types based on the prospects for drug improvements.

  16. Implementation of obstacle-avoidance control for an autonomous omni-directional mobile robot based on extension theory.

    PubMed

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system. PMID:23202029

  18. Implementation of Obstacle-Avoidance Control for an Autonomous Omni-Directional Mobile Robot Based on Extension Theory

    PubMed Central

    Pai, Neng-Sheng; Hsieh, Hung-Hui; Lai, Yi-Chung

    2012-01-01

    The paper demonstrates a following robot with omni-directional wheels, which is able to take action to avoid obstacles. The robot design is based on both fuzzy and extension theory. Fuzzy theory was applied to tune the PWM signal of the motor revolution and to correct path deviation issues encountered when the robot is moving. Extension theory was used to build a robot obstacle-avoidance model. Various mobile models were developed to handle different types of obstacles. The ultrasonic distance sensors mounted on the robot were used to estimate the distance to obstacles. If an obstacle is encountered, the correlation function is evaluated and the robot avoids the obstacle autonomously using the most appropriate mode. The effectiveness of the proposed approach was verified through several tracking experiments, which demonstrates the feasibility of a fuzzy path tracker as well as the extensible collision avoidance system. PMID:23202029

  19. Antimisting kerosene: Base fuel effects, blending and quality control techniques

    NASA Technical Reports Server (NTRS)

    Yavrouian, A. H.; Ernest, J.; Sarohia, V.

    1984-01-01

    The problems associated with blending the AMK additive with Jet A, and the effects of the base fuel on AMK properties, are addressed. Results from the evaluation of some of the quality control techniques for AMK are presented. The principal conclusions of this investigation are: significant compositional differences exist for the base fuel (Jet A) within the ASTM specification D1655; a higher aromatic content of the base fuel was found to be beneficial for polymer dissolution at ambient (20 °C) temperature; using static mixer technology, the antimisting additive (FM-9) is in-line blended with Jet A, producing AMK with adequate fire-protection properties 15 to 20 minutes after blending; degradability of freshly blended and equilibrated AMK indicated that maximum degradability is reached after adequate fire protection is obtained; the results of AMK degradability as measured by filter ratio confirmed previous RAE data showing that power requirements to degrade freshly blended AMK are significantly higher than for equilibrated AMK; blending of the additive by using FM-9 concentrate in Jet A produces equilibrated AMK almost instantly; nephelometry offers a simple continuous monitoring capability and is used as a real-time quality control device for AMK; and trajectory (jet thrust) and pressure drop tests are useful laboratory techniques for evaluating AMK quality.

  20. The Roland Maze Project school-based extensive air shower network

    NASA Astrophysics Data System (ADS)

    Feder, J.; Jȩdrzejczak, K.; Karczmarczyk, J.; Lewandowski, R.; Swarzyński, J.; Szabelska, B.; Szabelski, J.; Wibig, T.

    2006-01-01

    We plan to construct a large-area network of extensive air shower detectors placed on the roofs of high school buildings in the city of Łódź. Detection points will be connected via the INTERNET to the central server and their operation will be synchronized by GPS. The main scientific goal of the project is the study of ultra-high-energy cosmic rays. Using the existing town infrastructure (INTERNET, power supply, etc.) will significantly reduce the cost of the experiment. Engaging high school students in the research program should significantly increase their knowledge of science and modern technologies, and can be a very efficient way of popularising science. We performed simulations of the projected network's capability to register extensive air showers and reconstruct the energies of primary particles. Results of the simulations and the current status of the project's realisation will be presented.

  1. Pseudorandom Noise Code-Based Technique for Cloud and Aerosol Discrimination Applications

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Prasad, Narasimha S.; Flood, Michael A.; Harrison, Fenton Wallace

    2011-01-01

    NASA Langley Research Center is working on a continuous wave (CW) laser based remote sensing scheme for the detection of CO2 and O2 from space-based platforms, suitable for the ACTIVE SENSING OF CO2 EMISSIONS OVER NIGHTS, DAYS, AND SEASONS (ASCENDS) mission. ASCENDS is a future space-based mission to determine the global distribution of sources and sinks of atmospheric carbon dioxide (CO2). A unique, multi-frequency, intensity-modulated CW (IMCW) laser absorption spectrometer (LAS) operating at 1.57 micron for CO2 sensing has been developed. Effective aerosol and cloud discrimination techniques are being investigated in order to determine concentration values with accuracies better than 0.3%. In this paper, we discuss the demonstration of a PN code based technique for cloud and aerosol discrimination applications. The possibility of using maximum-length (ML) sequences for range and absorption measurements is investigated. A simple model for accomplishing this objective is formulated. Proof-of-concept experiments carried out using a SONAR-based LIDAR simulator built from simple audio hardware provided promising results for extension to optical wavelengths. Keywords: ASCENDS, CO2 sensing, O2 sensing, PN codes, CW lidar
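
    The ranging idea with maximum-length (ML) sequences can be sketched quickly: transmit a PN code, cross-correlate the received signal with the transmitted code, and read the round-trip delay off the correlation peak. The code length, noise level, chip rate and echo amplitude below are illustrative assumptions, not parameters of the instrument.

    import numpy as np
    from scipy.signal import max_len_seq

    rng = np.random.default_rng(0)

    code = 2.0 * max_len_seq(10)[0] - 1.0         # 1023-chip ML sequence mapped to +/-1
    true_delay = 137                              # delay in samples (assumed)
    echo = 0.2 * np.roll(code, true_delay) + 0.5 * rng.standard_normal(code.size)

    # circular cross-correlation via the FFT; the peak index gives the round-trip delay
    corr = np.fft.ifft(np.fft.fft(echo) * np.conj(np.fft.fft(code))).real
    est_delay = int(np.argmax(corr))

    chip_rate = 1e6                               # chips per second (assumed)
    range_m = 3e8 * (est_delay / chip_rate) / 2   # one-way range from the round-trip delay
    print(est_delay, range_m)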

  2. Water-based technique to produce porous PZT materials

    NASA Astrophysics Data System (ADS)

    Galassi, C.; Capiani, C.; Craciun, F.; Roncari, E.

    2005-09-01

    Water-based colloidal processing of PZT materials was investigated in order to reduce costs and employ more environmentally friendly manufacturing. The technique addressed was the production of porous thick samples by so-called “starch consolidation”. PZT “soft” compositions were used. The “starch consolidation” process allows the green body to be obtained by raising the temperature of a suspension of PZT powder, soluble starch and water cast into a metal mould. The influence of the processing parameters and composition on the morphology, pore volumes, pore size distributions and piezoelectric properties is investigated. Zeta potential determination and titration with different deflocculants were essential tools for adjusting the slurry formulation.

  3. Foreign fiber detecting system based on multispectral technique

    NASA Astrophysics Data System (ADS)

    Li, Qi; Han, Shaokun; Wang, Ping; Wang, Liang; Xia, Wenze

    2015-08-01

    This paper presents a foreign fiber detecting system based on a multi-spectral technique. The absorption rate and reflectivity of foreign fibers differ under different wavelengths of light, so the image characteristics differ under different illumination. An improved contrast pyramid image fusion algorithm with adaptive enhancement is used to extract the foreign fibers from the cotton background. The experimental results show that a single light source can detect six kinds of foreign fiber in cotton, while multi-spectral detection can detect eight kinds.

  4. Stopped depletion region extension in an AlGaN/GaN-HEMT: A new technique for improving high-frequency performance

    NASA Astrophysics Data System (ADS)

    Asad, Mohsen; Rahimian, Morteza

    2015-08-01

    We present a novel structure for AlGaN/GaN high electron mobility transistors. The structure consists of a multi-recess AlGaN barrier layer and a recessed metal ring (RBRM-HEMT). The barrier thickness narrowing between the gate and the source/drain regions minimizes the depletion region extension, which leads to smaller gate-drain (C_GD) and gate-source (C_GS) capacitances. This technique shows a great improvement in high-frequency and high-power applications. In high-frequency operation, the cut-off frequency (f_T) and the maximum oscillation frequency (f_max) of the RBRM-HEMT are found to be 133 GHz and 216 GHz respectively, significantly higher than the 94 GHz and 175 GHz obtained for the conventional GaN HEMT (C-HEMT). In addition, a more uniform, low-crowding electric field is obtained under the gate close to the drain side due to the recessed metal-ring structure. A 128% improvement in breakdown voltage (V_BR) is achieved compared to the C-HEMT. Consequently, the maximum output power density (P_max) is increased from 11.3 W/mm in the C-HEMT to 24.4 W/mm for the RBRM-HEMT.

  5. Evolutionary Based Techniques for Fault Tolerant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Lohn, Jason D.

    2006-01-01

    The use of SRAM-based Field Programmable Gate Arrays (FPGAs) is becoming more and more prevalent in space applications. Commercial-grade FPGAs are potentially susceptible to permanently debilitating Single-Event Latchups (SELs). Repair methods based on Evolutionary Algorithms may be applied to FPGA circuits to enable successful fault recovery. This paper presents the experimental results of applying such methods to repair four commonly used circuits (quadrature decoder, 3-by-3-bit multiplier, 3-by-3-bit adder, 440-7 decoder) into which a number of simulated faults have been introduced. The results suggest that evolutionary repair techniques can improve the process of fault recovery when used instead of or as a supplement to Triple Modular Redundancy (TMR), which is currently the predominant method for mitigating FPGA faults.

  6. Linear Frequency Estimation Technique for Reducing Frequency Based Signals

    PubMed Central

    Woodbridge, Jonathan; Bui, Alex; Sarrafzadeh, Majid

    2016-01-01

    This paper presents a linear frequency estimation (LFE) technique for data reduction of frequency-based signals. LFE converts a signal to the frequency domain by utilizing the Fourier transform and estimates both the real and imaginary parts with a series of vectors much smaller than the original signal size. The estimation is accomplished by selecting optimal points from the frequency domain and interpolating data between these points with a first order approximation. The difficulty of such a problem lies in determining which points are most significant. LFE is unique in that it is generic to a wide variety of frequency-based signals such as electromyography (EMG), voice, and electrocardiography (ECG). The only requirement is that spectral coefficients are spatially correlated. This paper presents the algorithm and results from both EMG and voice data. We complete the paper with a description of how this method can be applied to pattern recognition, signal indexing, and compression.
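
    The reduction step described above (keep a small set of spectral points and linearly interpolate between them) can be sketched in a few lines. The point-selection rule below, the largest-magnitude bins plus both endpoints, is a simple stand-in for the paper's optimal selection, so this illustrates the general idea rather than the authors' exact algorithm.

        import numpy as np

        def lfe_compress(signal, n_points):
            # Keep a subset of spectral samples; here simply the largest-magnitude
            # bins plus both endpoints (a stand-in for the paper's optimal selection).
            spec = np.fft.rfft(signal)
            idx = np.argsort(np.abs(spec))[-n_points:]
            idx = np.sort(np.union1d(idx, [0, spec.size - 1]))
            return idx, spec[idx]

        def lfe_reconstruct(idx, vals, n_bins, n_samples):
            # First-order (linear) interpolation of the real and imaginary parts
            # between the retained points, then an inverse FFT back to the time domain.
            grid = np.arange(n_bins)
            re = np.interp(grid, idx, vals.real)
            im = np.interp(grid, idx, vals.imag)
            return np.fft.irfft(re + 1j * im, n=n_samples)

        t = np.linspace(0.0, 1.0, 1000, endpoint=False)
        x = np.sin(2 * np.pi * 12 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)
        idx, vals = lfe_compress(x, 20)
        x_hat = lfe_reconstruct(idx, vals, n_bins=x.size // 2 + 1, n_samples=x.size)
        print("kept", idx.size, "of", x.size // 2 + 1, "bins, RMSE:",
              np.sqrt(np.mean((x - x_hat) ** 2)))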

  7. RBF-based technique for statistical demodulation of pathological tremor.

    PubMed

    Gianfelici, Francesco

    2013-10-01

    This paper presents an innovative technique based on the joint approximation capabilities of radial basis function (RBF) networks and the estimation capability of the multivariate iterated Hilbert transform (IHT) for the statistical demodulation of pathological tremor from electromyography (EMG) signals in patients with Parkinson's disease. We define a stochastic model of the multichannel high-density surface EMG by means of RBF networks applied to the reconstruction of the stochastic process (characterizing the disease) modeled by the multivariate relationships generated by the Karhunen-Loève transform in Hilbert spaces. Next, we perform a demodulation of the entire random field by means of the estimation capability of the multivariate IHT in a statistical setting. The proposed method is applied to both simulated signals and data recorded from three Parkinsonian patients, and the results show that the amplitude modulation components of the tremor oscillation can be estimated with a signal-to-noise ratio close to 30 dB, along with the root-mean-square error of the estimates of the tremor instantaneous frequency. Additionally, comparisons with a large number of techniques based on all combinations of RBF, extreme learning machine, backpropagation, and support vector machine in the first step of the algorithm, and IHT, empirical mode decomposition, multiband energy separation algorithm, and periodic algebraic separation and energy demodulation in the second step, clearly show the effectiveness of our technique. These results show that the proposed approach is a potentially useful tool for advanced neurorehabilitation technologies that aim at tremor characterization and suppression. PMID:24808594

  8. Modern Micro and Nanoparticle-Based Imaging Techniques

    PubMed Central

    Ryvolova, Marketa; Chomoucka, Jana; Drbohlavova, Jana; Kopel, Pavel; Babula, Petr; Hynek, David; Adam, Vojtech; Eckschlager, Tomas; Hubalek, Jaromir; Stiborova, Marie; Kaiser, Jozef; Kizek, Rene

    2012-01-01

    The requirements for early diagnostics as well as effective treatment of insidious diseases such as cancer constantly increase the pressure to develop efficient and reliable methods for targeted drug/gene delivery as well as imaging of treatment success or failure. One of the most recent approaches covering both the drug delivery and the imaging aspects benefits from the unique properties of nanomaterials, and a new field called nanomedicine is therefore attracting continuously growing attention. Nanoparticles, including fluorescent semiconductor nanocrystals (quantum dots) and magnetic nanoparticles, have proven their excellent properties for in vivo imaging techniques in a number of modalities such as magnetic resonance and fluorescence imaging, respectively. In this article, we review the main properties and applications of nanoparticles in various in vitro imaging techniques, including microscopy and/or laser breakdown spectroscopy, and in vivo methods such as magnetic resonance imaging and/or fluorescence-based imaging. Moreover, the advantages of drug delivery performed by nanocarriers such as iron oxides, gold, biodegradable polymers, dendrimers, and lipid-based carriers such as liposomes or micelles are also highlighted. PMID:23202187

  9. A polarization-based Thomson scattering technique for burning plasmas

    NASA Astrophysics Data System (ADS)

    Parke, E.; Mirnov, V. V.; Den Hartog, D. J.

    2014-02-01

    The traditional Thomson scattering diagnostic is based on measurement of the wavelength spectrum of scattered light, where electron temperature measurements are inferred from thermal broadening of the spectrum. At sufficiently high temperatures, especially those predicted for ITER and other burning plasmas, relativistic effects cause a change in the degree of polarization (P) of the scattered light; for fully polarized incident laser light, the scattered light becomes partially polarized. The resulting reduction of polarization is temperature dependent and has been proposed by other authors as a potential alternative to the traditional spectral decomposition technique. Following the previously developed Stokes vector approach, we analytically calculate the degree of polarization for incoherent Thomson scattering. For the first time, we obtain exact results valid for the full range of incident laser polarization states, scattering angles, and electron temperatures. While previous work focused only on linear polarization, we show that circularly polarized incident light optimizes the degree of depolarization for a wide range of temperatures relevant to burning plasmas. We discuss the feasibility of a polarization based Thomson scattering diagnostic for ITER-like plasmas with both linearly and circularly polarized light and compare to the traditional technique.

  10. Certification Aspects in Critical Embedded Software Development with Model Based Techniques: Detection of Unintended Functions

    NASA Astrophysics Data System (ADS)

    Atencia Yepez, A.; Autrán Cerqueira, J.; Urueña, S.; Jurado, R.

    2012-01-01

    This paper, developed under contract with the European Aviation Safety Agency (EASA), analyses in detail the certification implications that the application of model-level verification and validation techniques may have in the aeronautic industry. In particular, the paper focuses on the problem of detecting unintended functions by applying model coverage criteria at model level. This point is significantly important for the future extensive use of model-based approaches in safety-critical software, since the uncertainty in system performance introduced by unintended functions, which may also lead to unacceptable hazardous or catastrophic events, prevents the system from being compliant with certification requirements. The paper provides a definition and a categorization of unintended functions and gives some relevant examples to assess the efficiency of model-coverage techniques in the detection of UF. The paper explains how this analysis is supported by a methodology based on the study of sources for introducing unintended functions. Finally, the feasibility of using model-level verification techniques to support the software certification process is analysed.

  11. An extension of the immersed boundary method based on the distributed Lagrange multiplier approach

    NASA Astrophysics Data System (ADS)

    Feldman, Yuri; Gulberg, Yosef

    2016-10-01

    An extended formulation of the immersed boundary method, which facilitates simulation of incompressible isothermal and natural convection flows around immersed bodies and which may be applied for linear stability analysis of the flows, is presented. The Lagrangian forces and heat sources are distributed on the fluid-structure interface. The method treats pressure, the Lagrangian forces, and heat sources as distributed Lagrange multipliers, thereby implicitly providing the kinematic constraints of no-slip and the corresponding thermal boundary conditions for immersed surfaces. Extensive verification of the developed method for both isothermal and natural convection 2D flows is provided. Strategies for adapting the developed approach to realistic 3D configurations are discussed.

  12. Generalisation and extension of a web-based data collection system for clinical studies using Java and CORBA.

    PubMed

    Eich, H P; Ohmann, C

    1999-01-01

    Inadequate informatics support of multi-centre clinical trials leads to poor quality. In order to support a multi-centre clinical trial, a data collection system based on Java, operating via the WWW and the Internet, has been developed. In this study a generalization and extension of this prototype has been performed. The prototype has been applied to another clinical trial, and a knowledge server based on C++ has been integrated via CORBA. The investigation and implementation of security aspects of web-based data collection is now under evaluation.

  13. An interactive tutorial-based training technique for vertebral morphometry.

    PubMed

    Gardner, J C; von Ingersleben, G; Heyano, S L; Chesnut, C H

    2001-01-01

    The purpose of this work was to develop a computer-based procedure for training technologists in vertebral morphometry. The utility of the resulting interactive, tutorial based training method was evaluated in this study. The training program was composed of four steps: (1) review of an online tutorial, (2) review of analyzed spine images, (3) practice in fiducial point placement and (4) testing. During testing, vertebral heights were measured from digital, lateral spine images containing osteoporotic fractures. Inter-observer measurement precision was compared between research technicians, and between technologists and radiologist. The technologists participating in this study had no prior experience in vertebral morphometry. Following completion of the online training program, good inter-observer measurement precision was seen between technologists, showing mean coefficients of variation of 2.33% for anterior, 2.87% for central and 2.65% for posterior vertebral heights. Comparisons between the technicians and radiologist ranged from 2.19% to 3.18%. Slightly better precision values were seen with height measurements compared with height ratios, and with unfractured compared with fractured vertebral bodies. The findings of this study indicate that self-directed, tutorial-based training for spine image analyses is effective, resulting in good inter-observer measurement precision. The interactive tutorial-based approach provides standardized training methods and assures consistency of instructional technique over time.
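
    As a concrete illustration of the precision figures quoted above, inter-observer agreement is often summarised as a coefficient of variation computed per vertebra and then averaged. The sketch below uses that common convention with made-up height values; it is not the study's data or necessarily its exact statistical procedure.

        import numpy as np

        def interobserver_cv(obs_a, obs_b):
            # Per-vertebra coefficient of variation (SD over mean of the paired
            # measurements), averaged across vertebrae and expressed in percent.
            pairs = np.column_stack([obs_a, obs_b])
            cv = pairs.std(axis=1, ddof=1) / pairs.mean(axis=1)
            return 100.0 * cv.mean()

        # Hypothetical anterior vertebral height measurements (mm) for five vertebrae.
        technologist = np.array([22.1, 20.4, 25.3, 23.8, 19.9])
        radiologist  = np.array([22.6, 20.1, 24.8, 24.2, 20.3])
        print(round(interobserver_cv(technologist, radiologist), 2), "% CV")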

  14. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  15. Evaluations of mosquito age grading techniques based on morphological changes.

    PubMed

    Hugo, L E; Quick-Miles, S; Kay, B H; Ryan, P A

    2008-05-01

    Evaluations were made of the accuracy and practicality of mosquito age grading methods based on changes to mosquito morphology; including the Detinova ovarian tracheation, midgut meconium, Polovodova ovariole dilatation, ovarian injection, and daily growth line methods. Laboratory maintained Aedes vigilax (Skuse) and Culex annulirostris (Skuse) females of known chronological and physiological ages were used for these assessments. Application of the Detinova technique to laboratory reared Ae. vigilax females in a blinded trial enabled the successful identification of nulliparous and parous females in 83.7-89.8% of specimens. The success rate for identifying nulliparous females increased to 87.8-98.0% when observations of ovarian tracheation were combined with observations of the presence of midgut meconium. However, application of the Polovodova method only enabled 57.5% of nulliparous, 1-parous, 2-parous, and 3-parous Ae. vigilax females to be correctly classified, and ovarian injections were found to be unfeasible. Poor correlation was observed between the number of growth lines per phragma and the calendar age of laboratory reared Ae. vigilax females. In summary, morphological age grading methods that offer simple two-category predictions (ovarian tracheation and midgut meconium methods) were found to provide high-accuracy classifications, whereas methods that offer the separation of multiple age categories (ovariolar dilatation and growth line methods) were found to be extremely difficult and of low accuracy. The usefulness of the morphology-based methods is discussed in view of the availability of new mosquito age grading techniques based on cuticular hydrocarbon and gene transcription changes. PMID:18533427

  16. Plasma and trap-based techniques for science with positrons

    NASA Astrophysics Data System (ADS)

    Danielson, J. R.; Dubin, D. H. E.; Greaves, R. G.; Surko, C. M.

    2015-01-01

    In recent years, there has been a wealth of new science involving low-energy antimatter (i.e., positrons and antiprotons) at energies ranging from 10^2 to less than 10^-3 eV. Much of this progress has been driven by the development of new plasma-based techniques to accumulate, manipulate, and deliver antiparticles for specific applications. This article focuses on the advances made in this area using positrons. However, many of the resulting techniques are relevant to antiprotons as well. An overview is presented of relevant theory of single-component plasmas in electromagnetic traps. Methods are described to produce intense sources of positrons and to efficiently slow the typically energetic particles thus produced. Techniques are described to trap positrons efficiently and to cool and compress the resulting positron gases and plasmas. Finally, the procedures developed to deliver tailored pulses and beams (e.g., in intense, short bursts, or as quasimonoenergetic continuous beams) for specific applications are reviewed. The status of development in specific application areas is also reviewed. One example is the formation of antihydrogen atoms for fundamental physics [e.g., tests of invariance under charge conjugation, parity inversion, and time reversal (the CPT theorem), and studies of the interaction of gravity with antimatter]. Other applications discussed include atomic and materials physics studies and the study of the electron-positron many-body system, including both classical electron-positron plasmas and the complementary quantum system in the form of Bose-condensed gases of positronium atoms. Areas of future promise are also discussed. The review concludes with a brief summary and a list of outstanding challenges.

  17. Detecting Molecular Properties by Various Laser-Based Techniques

    SciTech Connect

    Hsin, Tse-Ming

    2007-01-01

    Four different laser-based techniques were applied to study physical and chemical characteristics of biomolecules and dye molecules. These techniques are hole burning spectroscopy, single molecule spectroscopy, time-resolved coherent anti-Stokes Raman spectroscopy and laser-induced fluorescence microscopy. Results from hole burning and single molecule spectroscopy suggested that two antenna states (C708 & C714) of photosystem I from cyanobacterium Synechocystis PCC 6803 are connected by effective energy transfer and the corresponding energy transfer time is ~6 ps. In addition, results from hole burning spectroscopy indicated that the chlorophyll dimer of the C714 state has a large distribution of dimer geometries. Direct observation of vibrational peaks and their evolution for coumarin 153 in the electronic excited state was demonstrated by using fs/ps CARS, a variation of time-resolved coherent anti-Stokes Raman spectroscopy. In three different solvents, methanol, acetonitrile, and butanol, a vibration peak related to the stretch of the carbonyl group exhibits different relaxation dynamics. Laser-induced fluorescence microscopy, along with biomimetic containers (liposomes), allows the measurement of the enzymatic activity of individual alkaline phosphatase from bovine intestinal mucosa without potential interference from glass surfaces. The result showed a wide distribution of enzyme reactivity. Protein structural variation is one of the major reasons responsible for this highly heterogeneous behavior.

  18. Investigations on landmine detection by neutron-based techniques.

    PubMed

    Csikai, J; Dóczi, R; Király, B

    2004-07-01

    Principles and techniques of some neutron-based methods used to identify antipersonnel landmines (APMs) are discussed. New results have been achieved in the field of neutron reflection, transmission, scattering and reaction techniques. Some conclusions are as follows: The neutron hand-held detector is suitable for the observation of an anomaly caused by a DLM2-like sample in different soils with a scanning speed of 1 m^2/1.5 min; the reflection cross section of thermal neutrons rendered the determination of equivalent thickness of different soil components possible; a simple method was developed for the determination of the thermal neutron flux perturbation factor needed for multi-elemental analysis of bulky samples; unfolded spectra of elastically backscattered neutrons using broad-spectrum sources render the identification of APMs possible; the knowledge of leakage spectra of different source neutrons is indispensable for the determination of the differential and integrated reaction rates and, through it, the dimension of the interrogated volume; the precise determination of the C/O atom fraction requires investigation of the angular distribution of the 6.13 MeV gamma-ray emitted in the (16)O(n,n'gamma) reaction. These results, in addition to the identification of landmines, render the improvement of the non-intrusive neutron methods possible.

  19. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.; Mahmood, Q.

    1978-01-01

    A sputtering system was developed to deposit aluminum and aluminum alloys by the dc sputtering technique. This system is designed for a high level of cleanliness and for monitoring the deposition parameters during film preparation. This system is now ready for studying the deposition and annealing parameters of double-level metal preparation. A technique recently applied for semiconductor analysis, the finite element method, was studied for use in the computer modeling of two-dimensional MOS transistor structures. It was concluded that the method has not been sufficiently well developed for confident use at this time. An algorithm was therefore developed for implementing a computer study based upon the finite difference method. The resulting program was modified and used to calculate redistribution data for boron and phosphorus which had been predeposited by ion implantation with given range and straggle conditions. Data were generated for (111)-oriented SOS films with redistribution in N2, dry O2 and steam ambients.
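
    The finite-difference redistribution calculation mentioned above essentially amounts to diffusing an implanted (Gaussian) dopant profile over the anneal time. Below is a minimal one-dimensional sketch of that kind of calculation; the diffusivity, implant range/straggle, dose and anneal schedule are hypothetical placeholders rather than values from the study.

        import numpy as np

        # Implanted Gaussian profile: projected range Rp, straggle dRp, dose Q.
        nx, dx = 400, 2e-9                              # 400 cells, 2 nm spacing (m)
        x = np.arange(nx) * dx
        Rp, dRp, Q = 100e-9, 30e-9, 1e19                # m, m, atoms/m^2 (hypothetical)
        C = Q / (np.sqrt(2 * np.pi) * dRp) * np.exp(-(x - Rp) ** 2 / (2 * dRp ** 2))

        D = 3e-18                                       # diffusivity, m^2/s (hypothetical)
        dt = 0.4 * dx ** 2 / D                          # obeys the FTCS limit D*dt/dx^2 <= 0.5
        for _ in range(int(1800.0 / dt)):               # ~30 min anneal
            lap = np.empty_like(C)
            lap[1:-1] = (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dx ** 2
            lap[0] = (C[1] - C[0]) / dx ** 2            # zero-flux surface boundary
            lap[-1] = (C[-2] - C[-1]) / dx ** 2         # zero-flux deep boundary
            C = C + D * dt * lap

        print("peak %.2e at depth %.0f nm" % (C.max(), 1e9 * x[np.argmax(C)]))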

  20. Surfer: An Extensible Pull-Based Framework for Resource Selection and Ranking

    NASA Technical Reports Server (NTRS)

    Zolano, Paul Z.

    2004-01-01

    Grid computing aims to connect large numbers of geographically and organizationally distributed resources to increase computational power, resource utilization, and resource accessibility. In order to effectively utilize grids, users need to be connected to the best available resources at any given time. As grids are in constant flux, users cannot be expected to keep up with the configuration and status of the grid; thus they must be provided with automatic resource brokering for selecting and ranking resources meeting constraints and preferences they specify. This paper presents a new OGSI-compliant resource selection and ranking framework called Surfer that has been implemented as part of NASA's Information Power Grid (IPG) project. Surfer is highly extensible and may be integrated into any grid environment by adding information providers knowledgeable about that environment.
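
    Stripped of the grid-specific machinery, resource brokering of this kind is a filter-then-rank operation over resource descriptions. The sketch below illustrates only that general pattern; the resource fields, constraints and weights are invented for the example and are not Surfer's actual data model or API.

        # Hypothetical resource descriptions, not Surfer's schema.
        resources = [
            {"name": "hostA", "cpus": 64,  "free_cpus": 12, "mem_gb": 256, "load": 0.31},
            {"name": "hostB", "cpus": 16,  "free_cpus": 16, "mem_gb": 64,  "load": 0.05},
            {"name": "hostC", "cpus": 128, "free_cpus": 2,  "mem_gb": 512, "load": 0.92},
        ]

        def select(resources, constraints):
            # Keep only resources meeting every hard constraint (minimum values).
            return [r for r in resources
                    if all(r[key] >= minimum for key, minimum in constraints.items())]

        def rank(resources, weights):
            # Order the surviving resources by a weighted preference score.
            score = lambda r: sum(w * r[key] for key, w in weights.items())
            return sorted(resources, key=score, reverse=True)

        candidates = select(resources, {"free_cpus": 8, "mem_gb": 32})
        for r in rank(candidates, {"free_cpus": 1.0, "load": -10.0}):
            print(r["name"])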

  1. A restrained-torque-based motion instructor: forearm flexion/extension-driving exoskeleton

    NASA Astrophysics Data System (ADS)

    Nishimura, Takuya; Nomura, Yoshihiko; Sakamoto, Ryota

    2013-01-01

    When learning complicated movements by ourselves, we encounter problems such as self-conviction: the belief that our own motion is correct leads to a lack of detail and objectivity, may cause us to miss essential points or even distort them, and so we sometimes fall into the habit of performing inappropriate motions. To solve, or at least alleviate, these problems, we have been developing mechanical man-machine interfaces to support the learning of motions such as cultural gestures and sports forms. One of the promising interfaces is a wearable exoskeleton mechanical system. As a first try, we have made a prototype of a 2-link, 1-DOF rotational elbow joint interface for teaching extension-flexion operations of the forearm and have found its potential for teaching the initiation and continuation of elbow flexion motion.

  2. DanteR: an extensible R-based tool for quantitative analysis of -omics data

    SciTech Connect

    Taverner, Thomas; Karpievitch, Yuliya; Polpitiya, Ashoka D.; Brown, Joseph N.; Dabney, Alan R.; Anderson, Gordon A.; Smith, Richard D.

    2012-09-15

    Motivation: The size and complex nature of LC-MS proteomics data sets motivates development of specialized software for statistical data analysis and exploration. We present DanteR, a graphical R package that features extensive statistical and diagnostic functions for quantitative proteomics data analysis, including normalization, imputation, hypothesis testing, interactive visualization and peptide-to-protein rollup. More importantly, users can easily extend the existing functionality by including their own algorithms under the Add-On tab. Availability: DanteR and its associated user guide are available for download at http://omics.pnl.gov/software/. For Windows, a single click automatically installs DanteR along with the R programming environment. For Linux and Mac OS X, users must first install R and then follow instructions on the DanteR web site for package installation.

  3. Testing of Large Diameter Fresnel Optics for Space Based Observations of Extensive Air Showers

    NASA Technical Reports Server (NTRS)

    Adams, James H.; Christl, Mark J.; Young, Roy M.

    2011-01-01

    The JEM-EUSO mission will detect extensive air showers produced by extreme energy cosmic rays. It operates from the ISS, looking down on Earth's night-time atmosphere to detect the nitrogen fluorescence and Cherenkov light produced by the charged particles in the EAS. The JEM-EUSO science objectives require a large field of view and sensitivity to energies below 50 EeV, and the instrument must fit within available ISS resources. The JEM-EUSO optic module uses three large diameter, thin plastic lenses with Fresnel surfaces to meet the instrument requirements. A bread-board model of the optic has been manufactured and has undergone preliminary tests. We report the results of optical performance tests and evaluate the present capability to manufacture these optical elements.

  4. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques.

    PubMed

    Parkash, Om; Shueb, Rafidah Hanim

    2015-10-19

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed.

  5. Protein elasticity probed with two synchrotron-based techniques.

    SciTech Connect

    Leu, B. M.; Alatas, A.; Sinn, H.; Alp, E. E.; Said, A.; Yavas, H.; Zhao, J.; Sage, J. T.; Sturhahn, W.; X-Ray Science Division; Hasylab; Northeastern Univ.

    2010-02-25

    Compressibility characterizes three interconnecting properties of a protein: dynamics, structure, and function. The compressibility values for the electron-carrying protein cytochrome c and for other proteins, as well, available in the literature vary considerably. Here, we apply two synchrotron-based techniques - nuclear resonance vibrational spectroscopy and inelastic x-ray scattering - to measure the adiabatic compressibility of this protein. This is the first report of the compressibility of any material measured with this method. Unlike the methods previously used, this novel approach probes the protein globally, at ambient pressure, does not require the separation of protein and solvent contributions to the total compressibility, and uses samples that contain the heme iron, as in the native state. We show, by comparing our results with molecular dynamics predictions, that the compressibility is almost independent of temperature. We discuss potential applications of this method to other materials beyond proteins.
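
    For context, both measurements ultimately tie the compressibility to a sound velocity extracted from the vibrational data; assuming the standard thermodynamic relation (not a formula quoted from the paper), the adiabatic compressibility follows from the mass density and the sound velocity:

        \beta_S \;=\; \frac{1}{\rho \, v_s^{2}}

    where \rho is the protein's mass density and v_s is the sound velocity inferred from the measured vibrational dynamics.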

  6. Validation techniques for fault emulation of SRAM-based FPGAs

    DOE PAGES

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness for mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation systems does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  7. Validation techniques for fault emulation of SRAM-based FPGAs

    SciTech Connect

    Quinn, Heather; Wirthlin, Michael

    2015-08-07

    A variety of fault emulation systems have been created to study the effect of single-event effects (SEEs) in static random access memory (SRAM) based field-programmable gate arrays (FPGAs). These systems are useful for augmenting radiation-hardness assurance (RHA) methodologies for verifying the effectiveness for mitigation techniques; understanding error signatures and failure modes in FPGAs; and failure rate estimation. For radiation effects researchers, it is important that these systems properly emulate how SEEs manifest in FPGAs. If the fault emulation systems does not mimic the radiation environment, the system will generate erroneous data and incorrect predictions of behavior of the FPGA in a radiation environment. Validation determines whether the emulated faults are reasonable analogs to the radiation-induced faults. In this study we present methods for validating fault emulation systems and provide several examples of validated FPGA fault emulation systems.

  8. Diagnosis of Dengue Infection Using Conventional and Biosensor Based Techniques

    PubMed Central

    Parkash, Om; Hanim Shueb, Rafidah

    2015-01-01

    Dengue is an arthropod-borne viral disease caused by four antigenically different serotypes of dengue virus. This disease is considered as a major public health concern around the world. Currently, there is no licensed vaccine or antiviral drug available for the prevention and treatment of dengue disease. Moreover, clinical features of dengue are indistinguishable from other infectious diseases such as malaria, chikungunya, rickettsia and leptospira. Therefore, prompt and accurate laboratory diagnostic test is urgently required for disease confirmation and patient triage. The traditional diagnostic techniques for the dengue virus are viral detection in cell culture, serological testing, and RNA amplification using reverse transcriptase PCR. This paper discusses the conventional laboratory methods used for the diagnosis of dengue during the acute and convalescent phase and highlights the advantages and limitations of these routine laboratory tests. Subsequently, the biosensor based assays developed using various transducers for the detection of dengue are also reviewed. PMID:26492265

  9. Mars laser altimeter based on a single photon ranging technique

    NASA Technical Reports Server (NTRS)

    Prochazka, Ivan; Hamal, Karel; Sopko, B.; Pershin, S.

    1993-01-01

    The Mars 94/96 mission will carry, among other things, a balloon probe experiment. The balloon, with its scientific cargo in the gondola underneath, will drift in the Mars atmosphere; its altitude will range from zero at night up to 5 km at noon. The gondola altitude will be determined accurately by an altimeter. As the balloon gondola mass is strictly limited, the altimeter's total mass and power consumption are critical; the maximum allowed is a few hundred grams and a few tens of milliwatts of average power consumption. We proposed, designed, and constructed a laser altimeter based on the single photon ranging technique. Topics covered include the following: principle of operation, altimeter construction, and ground tests.
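
    In single photon ranging, the altitude follows from the round-trip time of flight, h = c·t/2, and because each laser shot returns at most one detected photon, the delay is usually recovered from a histogram of arrival times accumulated over many shots. The sketch below illustrates that statistical principle with invented timing, jitter and background values, not the instrument's actual parameters.

        import numpy as np

        c = 3.0e8                                      # speed of light, m/s
        true_altitude = 3200.0                         # m (hypothetical)
        t_true = 2.0 * true_altitude / c               # round-trip time of flight

        rng = np.random.default_rng(1)
        n_shots = 2000
        jitter = 2e-9                                  # detector timing jitter, s (hypothetical)
        signal = t_true + jitter * rng.standard_normal(n_shots)
        background = rng.uniform(0.0, 60e-6, n_shots)  # dark/background counts over the range gate
        has_echo = rng.random(n_shots) < 0.3           # ~30% of shots return a signal photon
        arrivals = np.where(has_echo, signal, background)

        # Histogram the single-photon arrival times; the peak bin gives the delay.
        hist, edges = np.histogram(arrivals, bins=6000, range=(0.0, 60e-6))
        t_est = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
        print("estimated altitude: %.1f m" % (0.5 * c * t_est))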

  10. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    NASA Astrophysics Data System (ADS)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways for drift measurement depends on instrumentation based measurements, e.g. using an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift on the received signal, where the main cause of Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world or even Europe is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to an ionosonde, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two dimensional TEC map is constructed by using the IONOLAB-MAP tool which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real situation electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically with a period of as low as 30 seconds. By processing two consequent snapshots together and calculating the propagation paths, we estimate the drift measurements over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.
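
    The quantity exploited above is the change in propagation path length between two consecutive ray-traced snapshots: a path of length L that changes in time imposes a Doppler shift f_D = -(f_0/c)·dL/dt on a sounding signal of frequency f_0. The numbers below are hypothetical stand-ins for two ray-tracing results, used only to show the bookkeeping.

        c = 3.0e8                      # speed of light, m/s
        f0 = 5.0e6                     # sounding frequency, Hz (hypothetical)
        dt = 30.0                      # time between snapshots, s

        path_len_t0 = 612_430.0        # m, ray-traced path length at t0 (hypothetical)
        path_len_t1 = 612_418.0        # m, ray-traced path length at t0 + dt (hypothetical)

        dL_dt = (path_len_t1 - path_len_t0) / dt
        doppler_hz = -(f0 / c) * dL_dt         # f_D = -(f0/c) * dL/dt
        vertical_drift = -0.5 * dL_dt          # m/s, near-vertical reflection (path ~ 2h)
        print(doppler_hz, vertical_drift)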

  11. CANDU in-reactor quantitative visual-based inspection techniques

    NASA Astrophysics Data System (ADS)

    Rochefort, P. A.

    2009-02-01

    This paper describes two separate visual-based inspection procedures used at CANDU nuclear power generating stations. The techniques are quantitative in nature and are delivered and operated in highly radioactive environments with access that is restrictive, and in one case is submerged. Visual-based inspections at stations are typically qualitative in nature. For example a video system will be used to search for a missing component, inspect for a broken fixture, or locate areas of excessive corrosion in a pipe. In contrast, the methods described here are used to measure characteristic component dimensions that in one case ensure ongoing safe operation of the reactor and in the other support reactor refurbishment. CANDU reactors are Pressurized Heavy Water Reactors (PHWR). The reactor vessel is a horizontal cylindrical low-pressure calandria tank approximately 6 m in diameter and length, containing heavy water as a neutron moderator. Inside the calandria, 380 horizontal fuel channels (FC) are supported at each end by integral end-shields. Each FC holds 12 fuel bundles. The heavy water primary heat transport water flows through the FC pressure tube, removing the heat from the fuel bundles and delivering it to the steam generator. The general design of the reactor governs both the type of measurements that are required and the methods to perform the measurements. The first inspection procedure is a method to remotely measure the gap between FC and other in-core horizontal components. The technique involves delivering vertically a module with a high-radiation-resistant camera and lighting into the core of a shutdown but fuelled reactor. The measurement is done using a line-of-sight technique between the components. Compensation for image perspective and viewing elevation to the measurement is required. The second inspection procedure measures flaws within the reactor's end shield FC calandria tube rolled joint area. The FC calandria tube (the outer shell of the FC) is

  12. Skull base tumours part I: imaging technique, anatomy and anterior skull base tumours.

    PubMed

    Borges, Alexandra

    2008-06-01

    Advances in cross-sectional imaging, surgical technique and adjuvant treatment have largely contributed to ameliorate the prognosis, lessen the morbidity and mortality of patients with skull base tumours and to the growing medical investment in the management of these patients. Because clinical assessment of the skull base is limited, cross-sectional imaging became indispensable in the diagnosis, treatment planning and follow-up of patients with suspected skull base pathology and the radiologist is increasingly responsible for the fate of these patients. This review will focus on the advances in imaging technique; contribution to patient's management and on the imaging features of the most common tumours affecting the anterior skull base. Emphasis is given to a systematic approach to skull base pathology based upon an anatomic division taking into account the major tissue constituents in each skull base compartment. The most relevant information that should be conveyed to surgeons and radiation oncologists involved in patient's management will be discussed.

  13. Knee extension isometric torque production differences based on verbal motivation given to introverted and extroverted female children.

    PubMed

    McWhorter, J Wesley; Landers, Merrill; Young, Daniel; Puentedura, E Louie; Hickman, Robbin A; Brooksby, Candi; Liveratti, Marc; Taylor, Lisa

    2011-08-01

    To date, little research has been conducted to test the efficacy of different forms of motivation based on a female child's personality type. The purpose of this study was to evaluate the ability of female children to perform a maximal knee extension isometric torque test with varying forms of motivation, based on the child's personality type (introvert vs. extrovert). The subjects were asked to perform a maximal isometric knee extension test under three different conditions: 1) with no verbal motivation, 2) with verbal motivation from the evaluator only, and 3) with verbal motivation from a group of their peers and the evaluator combined. A 2×3 mixed ANOVA was significant for an interaction (F(2,62) = 17.530; p < 0.0005). Post hoc testing for the introverted group showed that scores without verbal motivation were significantly higher than with verbal motivation from the evaluator or the evaluator plus the peers. The extroverted group revealed that scores with verbal motivation from the evaluator or the evaluator plus the peers were significantly higher than without verbal motivation. Results suggest that verbal motivation has a varying effect on isometric knee extension torque production in female children with different personality types. Extroverted girls perform better with motivation, whereas introverted girls perform better without motivation from others. PMID:20812856

  14. Knee extension isometric torque production differences based on verbal motivation given to introverted and extroverted female children.

    PubMed

    McWhorter, J Wesley; Landers, Merrill; Young, Daniel; Puentedura, E Louie; Hickman, Robbin A; Brooksby, Candi; Liveratti, Marc; Taylor, Lisa

    2011-08-01

    To date, little research has been conducted to test the efficacy of different forms of motivation based on a female child's personality type. The purpose of this study was to evaluate the ability of female children to perform a maximal knee extension isometric torque test with varying forms of motivation, based on the child's personality type (introvert vs. extrovert). The subjects were asked to perform a maximal isometric knee extension test under three different conditions: 1) with no verbal motivation, 2) with verbal motivation from the evaluator only, and 3) with verbal motivation from a group of their peers and the evaluator combined. A 2×3 mixed ANOVA was significant for an interaction (F(2,62) = 17.530; p < 0.0005). Post hoc testing for the introverted group showed that scores without verbal motivation were significantly higher than with verbal motivation from the evaluator or the evaluator plus the peers. The extroverted group revealed that scores with verbal motivation from the evaluator or the evaluator plus the peers were significantly higher than without verbal motivation. Results suggest that verbal motivation has a varying effect on isometric knee extension torque production in female children with different personality types. Extroverted girls perform better with motivation, whereas introverted girls perform better without motivation from others.

  15. Human resource development for a community-based health extension program: a case study from Ethiopia

    PubMed Central

    2013-01-01

    Introduction Ethiopia is one of the sub-Saharan countries most affected by high disease burden, aggravated by a shortage and imbalance of human resources, geographical distance, and socioeconomic factors. In 2004, the government introduced the Health Extension Program (HEP), a primary care delivery strategy, to address the challenges and achieve the World Health Organization Millennium Development Goals (MDGs) within a context of limited resources. Case description The health system was reformed to create a platform for integration and institutionalization of the HEP with appropriate human capacity, infrastructure, and management structures. Human resources were developed through training of female health workers recruited from their prospective villages, designed to limit the high staff turnover and address gender, social and cultural factors in order to provide services acceptable to each community. The service delivery modalities include household, community and health facility care. Thus, the most basic health post infrastructure, designed to rapidly and cost-effectively scale up HEP, was built in each village. In line with the country’s decentralized management system, the HEP service delivery is under the jurisdiction of the district authorities. Discussion and evaluation The nationwide implementation of HEP progressed in line with its target goals. In all, 40 training institutions were established, and over 30,000 Health Extension Workers have been trained and deployed to approximately 15,000 villages. The potential health service coverage reached 92.1% in 2011, up from 64% in 2004. While most health indicators have improved, performance in skilled delivery and postnatal care has not been satisfactory. While HEP is considered the most important institutional framework for achieving the health MDGs in Ethiopia, quality of service, utilization rate, access and referral linkage to emergency obstetric care, management, and evaluation of the program are the key

  16. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    NASA Astrophysics Data System (ADS)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, acquired along a pre-defined alignment on a properly illuminated sample surface. The study investigated the possibility of applying HSI techniques for classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near infrared field (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.
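
    The PCA-then-PLS-DA chain described above can be prototyped in a few lines with standard tools. The sketch below runs on synthetic stand-in spectra (121 bands, three classes) rather than the authors' data, and implements PLS-DA in the common way, as PLS regression onto one-hot class labels followed by an argmax.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in spectra: 121 bands, three kernel classes with different
        # mean reflectance levels (not the authors' measured data).
        rng = np.random.default_rng(2)
        n_per_class, n_bands = 60, 121
        X = np.vstack([rng.normal(mu, 0.05, size=(n_per_class, n_bands))
                       for mu in (0.3, 0.5, 0.7)])
        y = np.repeat([0, 1, 2], n_per_class)

        # PCA reduces the dimensionality of the hyperspectral data.
        scores = PCA(n_components=10).fit_transform(X)

        # PLS-DA: PLS regression onto one-hot class labels, class = argmax of prediction.
        Xtr, Xte, ytr, yte = train_test_split(scores, y, test_size=0.3,
                                              random_state=0, stratify=y)
        pls = PLSRegression(n_components=5).fit(Xtr, np.eye(3)[ytr])
        pred = np.argmax(pls.predict(Xte), axis=1)
        print("classification accuracy:", np.mean(pred == yte))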

  17. Introducing Risk Management Techniques Within Project Based Software Engineering Courses

    NASA Astrophysics Data System (ADS)

    Port, Daniel; Boehm, Barry

    2002-03-01

    In 1996, USC switched its core two-semester software engineering course from a hypothetical-project, homework-and-exam course based on the Bloom taxonomy of educational objectives (knowledge, comprehension, application, analysis, synthesis, and evaluation). The revised course is a real-client team-project course based on the CRESST model of learning objectives (content understanding, problem solving, collaboration, communication, and self-regulation). We used the CRESST cognitive demands analysis to determine the necessary student skills required for software risk management and the other major project activities, and have been refining the approach over the last 5 years of experience, including revised versions for one-semester undergraduate and graduate project course at Columbia. This paper summarizes our experiences in evolving the risk management aspects of the project course. These have helped us mature more general techniques such as risk-driven specifications, domain-specific simplifier and complicator lists, and the schedule as an independent variable (SAIV) process model. The largely positive results in terms of review of pass / fail rates, client evaluations, product adoption rates, and hiring manager feedback are summarized as well.

  18. Parameter tuning of PVD process based on artificial intelligence technique

    NASA Astrophysics Data System (ADS)

    Norlina, M. S.; Diyana, M. S. Nor; Mazidah, P.; Rusop, M.

    2016-07-01

    In this study, an artificial intelligence technique is proposed for the parameter tuning of a PVD process. Due to its previous adoption in similar optimization problems, a genetic algorithm (GA) is selected to optimize the parameter tuning of the RF magnetron sputtering process. The most optimized parameter combination obtained from the GA is expected to produce the desired zinc oxide (ZnO) thin film from the sputtering process. The parameters involved in this study were RF power, deposition time and substrate temperature. The algorithm was tested on 25 datasets of parameter combinations. The results from the computational experiment were then compared with the actual result from the laboratory experiment. Based on the comparison, the GA was shown to be reliable for optimizing the parameter combination before the tuning is applied to the RF magnetron sputtering machine. In order to verify the result of the GA, the algorithm was also compared to other well-known optimization algorithms, namely particle swarm optimization (PSO) and the gravitational search algorithm (GSA). The results showed that the GA was reliable in solving this RF magnetron sputtering parameter tuning problem and showed better accuracy in the optimization based on the fitness evaluation.
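
    A genetic algorithm for this kind of three-parameter tuning problem follows the usual select/crossover/mutate loop over candidate recipes. The sketch below is generic: the parameter bounds and the fitness function are hypothetical placeholders (in practice the fitness would score a measured or modelled ZnO film property), and it does not reproduce the study's GA settings.

        import random

        # Hypothetical bounds for (RF power, deposition time, substrate temperature).
        BOUNDS = {"power_w": (50, 300), "time_min": (10, 120), "temp_c": (25, 400)}

        def fitness(ind):
            # Placeholder objective: peak at an arbitrary "good" recipe.
            target = {"power_w": 200, "time_min": 60, "temp_c": 300}
            return -sum(((ind[k] - target[k]) / (hi - lo)) ** 2
                        for k, (lo, hi) in BOUNDS.items())

        def random_individual():
            return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

        def crossover(a, b):
            # Uniform crossover: each parameter comes from one of the two parents.
            return {k: random.choice((a[k], b[k])) for k in BOUNDS}

        def mutate(ind, rate=0.2):
            # Occasionally resample a parameter anywhere within its bounds.
            return {k: (random.uniform(lo, hi) if random.random() < rate else ind[k])
                    for k, (lo, hi) in BOUNDS.items()}

        pop = [random_individual() for _ in range(30)]
        for _ in range(50):                               # generations
            pop.sort(key=fitness, reverse=True)
            parents = pop[:10]                            # truncation selection
            pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                             for _ in range(20)]
        print(max(pop, key=fitness))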

  19. Damage detection technique by measuring laser-based mechanical impedance

    SciTech Connect

    Lee, Hyeonseok; Sohn, Hoon

    2014-02-18

    This study proposes a method for measuring mechanical impedance using noncontact laser ultrasound. The measurement of mechanical impedance has been of great interest in nondestructive testing (NDT) and structural health monitoring (SHM), since mechanical impedance is sensitive even to small-sized structural defects. Conventional impedance measurements, however, have been based on electromechanical impedance (EMI) using contact-type piezoelectric transducers, whose performance is deteriorated by the effects of (a) Curie temperature limitations, (b) electromagnetic interference, (c) bonding layers, etc. This study aims to tackle the limitations of conventional EMI measurement by utilizing laser-based mechanical impedance (LMI) measurement. The LMI response, which is equivalent to a steady-state ultrasound response, is generated by shooting a pulsed laser beam at the target structure, and is acquired by measuring the out-of-plane velocity using a laser vibrometer. The formation of the LMI response is observed through thermo-mechanical finite element analysis. The feasibility of applying the LMI technique for damage detection is experimentally verified using a pipe specimen under a high temperature environment.

  20. Movement Analysis of Flexion and Extension of Honeybee Abdomen Based on an Adaptive Segmented Structure

    PubMed Central

    Zhao, Jieliang; Wu, Jianing; Yan, Shaoze

    2015-01-01

    Honeybees (Apis mellifera) curl their abdomens for daily rhythmic activities. Before this was established, it had been assumed that honeybees could curl their abdomens freely. An intriguing but less studied feature, however, is the possible unidirectional abdominal deformation in free-flying honeybees. A high-speed video camera was used to capture the curling and to analyze the changes in the arc length of the honeybee abdomen, not only in free-flying mode but also in fixed samples. Frozen sections and an environmental scanning electron microscope were used to investigate the microstructure and motion principle of the honeybee abdomen and to explore the physical structure restricting its curling. An adaptive segmented structure, especially the folded intersegmental membrane (FIM), plays a dominant role in the flexion and extension of the abdomen. The structural features of the FIM were utilized to mimic and exhibit the movement restriction of the honeybee abdomen. Combining experimental analysis and theoretical demonstration, a unidirectional bending mechanism of the honeybee abdomen was revealed. This finding offers a new perspective that aerospace vehicle design can imitate. PMID:26223946

  1. Shellac and Aloe vera gel based surface coating for shelf life extension of tomatoes.

    PubMed

    Chauhan, O P; Nanjappa, C; Ashok, N; Ravi, N; Roopa, N; Raju, P S

    2015-02-01

    Shellac (S) and Aloe vera gel (AG) were used to develop edible surface coatings for shelf-life extension of tomato fruits. The coating was prepared by dissolving de-waxed and bleached shellac in an alkaline aqueous medium, either alone or in combination with AG. Incorporation of AG in the shellac coating improved the permeability characteristics of the coating film towards oxygen, carbon dioxide and water vapour. The coatings, when applied to tomatoes, delayed senescence, which was characterized by restricted changes in respiration and ethylene synthesis rates during storage. Texture of the fruits, when measured in terms of firmness, showed restricted changes as compared to the untreated control. Similar observations were also recorded for instrumental colour (L*, a* and b* values). The developed coatings extended the shelf-life of tomatoes by 10, 8 and 12 days in the case of shellac (S), AG and composite (S + AG) coated fruits, respectively, when kept at ambient storage conditions (28 ± 2 °C).

  2. Use of extension-deformation-based crystallisation of silk fibres to differentiate their functions in nature.

    PubMed

    Numata, Keiji; Masunaga, Hiroyasu; Hikima, Takaaki; Sasaki, Sono; Sekiyama, Kazuhide; Takata, Masaki

    2015-08-21

    β-Sheet crystals play an important role in determining the stiffness, strength, and optical properties of silk and in the exhibition of silk-type-specific functions. It is important to elucidate the structural changes that occur during the stretching of silk fibres to understand the functions of different types of fibres. Herein, we elucidate the initial crystallisation behaviour of silk molecules during the stretching of three types of silk fibres using synchrotron radiation X-ray analysis. When spider dragline silk was stretched, it underwent crystallisation and the alignment of the β-sheet crystals became disordered initially but was later recovered. On the other hand, silkworm cocoon silk did not exhibit further crystallisation, whereas capture spiral silk was predominantly amorphous. Structural analyses showed that the crystallisation of silks following extension deformation has a critical effect on their mechanical and optical properties. These findings should aid the production of artificial silk fibres and facilitate the development of silk-inspired functional materials. PMID:26166211

  3. 78 FR 7654 - Extension of Exemptions for Security-Based Swaps

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-04

    ... (Jul. 18, 2012), 77 FR 48208 (Aug. 13, 2012). Title VII amended the Securities Act and the Exchange Act..., Release No. 34-63825 (Feb. 2, 2011), 76 FR 10948 (Feb. 28, 2011) (``Security-Based SEF Proposing Release... Security-Based Swaps Issued By Certain Clearing Agencies, Release No. 33-9308 (Mar. 30, 2012), 77 FR...

  4. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that are large in scale and have dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturization of world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and is applicable to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emerging patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  5. Dynamic digital watermark technique based on neural network

    NASA Astrophysics Data System (ADS)

    Gu, Tao; Li, Xu

    2008-04-01

    A dynamic watermarking algorithm based on a neural network is presented that is more robust against false-authentication attacks and watermark-tampering operations than single-watermark embedding methods. (1) Five binary images used as watermarks are coded into a binary array; each 0 or 1 is enlarged fivefold by an information-enlargement technique, giving 5*N bits in total, where N is the original number of watermark bits. (2) A seed pixel p(x,y) and its 3×3 neighbourhood p(x-1,y-1), p(x-1,y), p(x-1,y+1), p(x,y-1), p(x,y+1), p(x+1,y-1), p(x+1,y), p(x+1,y+1) form one sample: p(x,y) is the neural network target and the eight surrounding pixel values are the network inputs. (3) To train the network on this sample space, 5*N pixels and their closely related neighbouring pixel values are chosen with a password-seeded random selection from a colour BMP image. (4) A four-layer neural network is constructed to describe the nonlinear mapping between inputs and outputs. (5) Each bit of the array is embedded by adjusting the polarity between a chosen pixel value and the network output. (6) A randomizer determines how many watermarks to retrieve; the selected watermarks are recovered from the restored network outputs, the corresponding image pixel values, and the restoration function, without knowledge of the original image or watermarks (the restored coded-watermark bit is 1 if o(x,y) (restored) > p(x,y) (reconstructed), else 0). The retrieved watermarks differ on each extraction, so the technique offers more watermarking proofs than a single-watermark embedding algorithm. Experimental results show that the proposed technique is very robust against some image-processing operations and JPEG lossy compression. The algorithm can therefore be used to protect the copyright of an important image.
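
    The polarity-based embedding and retrieval of steps (2), (5) and (6) can be sketched as follows. This is a simplified, hypothetical Python rendering under several assumptions: a scikit-learn MLP stands in for the paper's four-layer network, grayscale values replace the colour BMP channels, the embedding strength delta is invented, and extraction reuses the original (unmarked) neighbourhoods.

        # Hypothetical sketch of steps (2), (5) and (6); names and parameters are illustrative.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def neighbourhoods(img, coords):
            """Return the 8 surrounding pixel values (inputs) and centre pixels (targets)."""
            X, y = [], []
            for r, c in coords:
                block = img[r-1:r+2, c-1:c+2].astype(float).ravel()
                y.append(block[4])                 # centre pixel p(x, y) is the target
                X.append(np.delete(block, 4))      # its 3x3 neighbours are the inputs
            return np.array(X), np.array(y)

        rng = np.random.default_rng(0)             # stands in for the password-seeded randomizer
        img = rng.integers(0, 256, size=(64, 64))
        idx = rng.choice(62 * 62, size=200, replace=False)
        coords = [(1 + i // 62, 1 + i % 62) for i in idx]
        X, y = neighbourhoods(img, coords)

        # A small multilayer network models the pixel/neighbourhood relationship (step 4).
        net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0).fit(X, y)
        pred = net.predict(X)

        bits = rng.integers(0, 2, size=len(coords))    # coded watermark bits (step 1, simplified)
        delta = 2.0                                    # embedding strength (assumed)
        marked = img.astype(float).copy()
        for (r, c), bit, p in zip(coords, bits, pred):
            # Step 5: force the polarity of (pixel - network output) to encode the bit.
            marked[r, c] = p + delta if bit == 1 else p - delta

        # Step 6 (simplified): read the polarity back using the restored network outputs.
        rows, cols = zip(*coords)
        recovered = (marked[list(rows), list(cols)] > net.predict(X)).astype(int)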

  6. Evidence-Based Programming: What Is a Process an Extension Agent Can Use to Evaluate a Program's Effectiveness?

    ERIC Educational Resources Information Center

    Fetsch, Robert J.; MacPhee, David; Boyer, Luann K.

    2012-01-01

    Extension agents and specialists have experienced increased pressure for greater program effectiveness and accountability and especially for evidence-based programs. This article builds on previously published evidence-based programming articles. It provides ideas that address three problems that Extension staff face with EBPs and that Extension…

  7. Pseudo Algebraically Closed Extensions

    NASA Astrophysics Data System (ADS)

    Bary-Soroker, Lior

    2009-07-01

    This PhD thesis deals with the notion of pseudo algebraically closed (PAC) extensions of fields. It develops group-theoretic machinery, based on a generalization of embedding problems, to study these extensions. Perhaps the main result is that although there are many PAC extensions, the Galois closure of a proper PAC extension is separably closed. The dissertation also covers the group-theoretic counterpart of pseudo algebraically closed extensions, the so-called projective pairs, and applications to seemingly unrelated subjects, e.g., an analog of Dirichlet's theorem about primes in arithmetic progression for polynomial rings in one variable over infinite fields.

  8. Physical, Chemical and Biochemical Modifications of Protein-Based Films and Coatings: An Extensive Review.

    PubMed

    Zink, Joël; Wyrobnik, Tom; Prinz, Tobias; Schmid, Markus

    2016-01-01

    Protein-based films and coatings are an interesting alternative to traditional petroleum-based materials. However, their mechanical and barrier properties need to be enhanced in order to match those of the latter. Physical, chemical, and biochemical methods can be used for this purpose. The aim of this article is to provide an overview of the effects of various treatments on whey, soy, and wheat gluten protein-based films and coatings. These three protein sources have been chosen since they are among the most abundantly used and are well described in the literature. Similar behavior might be expected for other protein sources. Most of the modifications are still not fully understood at a fundamental level, but all the methods discussed change the properties of the proteins and resulting products. Mastering these modifications is an important step towards the industrial implementation of protein-based films. PMID:27563881

  9. Physical, Chemical and Biochemical Modifications of Protein-Based Films and Coatings: An Extensive Review

    PubMed Central

    Zink, Joël; Wyrobnik, Tom; Prinz, Tobias; Schmid, Markus

    2016-01-01

    Protein-based films and coatings are an interesting alternative to traditional petroleum-based materials. However, their mechanical and barrier properties need to be enhanced in order to match those of the latter. Physical, chemical, and biochemical methods can be used for this purpose. The aim of this article is to provide an overview of the effects of various treatments on whey, soy, and wheat gluten protein-based films and coatings. These three protein sources have been chosen since they are among the most abundantly used and are well described in the literature. Similar behavior might be expected for other protein sources. Most of the modifications are still not fully understood at a fundamental level, but all the methods discussed change the properties of the proteins and resulting products. Mastering these modifications is an important step towards the industrial implementation of protein-based films. PMID:27563881

  10. An extensible simulation environment and movement metrics for testing walking behavior in agent-based models

    SciTech Connect

    Paul M. Torrens; Atsushi Nara; Xun Li; Haojie Zhu; William A. Griffin; Scott B. Brown

    2012-01-01

    Human movement is a significant ingredient of many social, environmental, and technical systems, yet the importance of movement is often discounted in considering systems complexity. Movement is commonly abstracted in agent-based modeling (which is perhaps the methodological vehicle for modeling complex systems), despite the influence of movement upon information exchange and adaptation in a system. In particular, agent-based models of urban pedestrians often treat movement in proxy form at the expense of faithfully treating movement behavior with realistic agency. There exists little consensus about which method is appropriate for representing movement in agent-based schemes. In this paper, we examine popularly-used methods to drive movement in agent-based models, first by introducing a methodology that can flexibly handle many representations of movement at many different scales and second, introducing a suite of tools to benchmark agent movement between models and against real-world trajectory data. We find that most popular movement schemes do a relatively poor job of representing movement, but that some schemes may well be 'good enough' for some applications. We also discuss potential avenues for improving the representation of movement in agent-based frameworks.

  11. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    NASA Astrophysics Data System (ADS)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy must change because the current imitational, gravitational, frontal, isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water (a global deficit of up to 4-15 times) and adverse effects on soils and landscapes. Current irrigation methods do not control the spread of water throughout the soil continuum; preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost to the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for Sustainable Natural Resources Management. The new paradigm of irrigation is based on the intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. Each individual volume of water is supplied as a vertical cylinder of preliminary soil watering, positioned at a depth of 10 to 30 cm and 1-2 cm in diameter. Within 5-10 min after injection, the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapour transfer, and a small amount is transferred gravitationally to a depth of 35-40 cm. The resulting soil-watering cylinder occupies depths of 5-50 cm in the profile and is 2-4 cm in diameter, with a lateral distance of 10-15 cm between successive cylinders along the plant row. The soil carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection, the soil structure in the cylinder restores quickly because there is no compression from the stable adjoining soil volume and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential

  12. Whole Genome Sequencing Based Characterization of Extensively Drug-Resistant Mycobacterium tuberculosis Isolates from Pakistan

    PubMed Central

    Ali, Asho; Hasan, Zahra; McNerney, Ruth; Mallard, Kim; Hill-Cawthorne, Grant; Coll, Francesc; Nair, Mridul; Pain, Arnab; Clark, Taane G.; Hasan, Rumina

    2015-01-01

    Improved molecular diagnostic methods for the detection of drug resistance in Mycobacterium tuberculosis (MTB) strains are required. Resistance to first- and second-line anti-tuberculous drugs has been associated with single nucleotide polymorphisms (SNPs) in particular genes. However, these SNPs can vary between MTB lineages; therefore, local data are required to describe different strain populations. We used whole genome sequencing (WGS) to characterize 37 extensively drug-resistant (XDR) MTB isolates from Pakistan and investigated 40 genes associated with drug resistance. Rifampicin resistance was attributable to SNPs in the rpoB hot-spot region. Isoniazid resistance was most commonly associated with the katG codon 315 (92%) mutation, followed by inhA S94A (8%); however, one strain did not have SNPs in katG, inhA or oxyR-ahpC. All strains were pyrazinamide resistant, but only 43% had pncA SNPs. Ethambutol-resistant strains predominantly had embB codon 306 (62%) mutations, but additional SNPs at embB codons 406, 378 and 328 were also present. Fluoroquinolone resistance was associated with gyrA codons 91–94 in 81% of strains; four strains had only gyrB mutations, while others did not have SNPs in either gyrA or gyrB. Streptomycin-resistant strains had mutations in ribosomal RNA genes: rpsL codon 43 (42%), the rrs 500 region (16%), and gidB (34%), while six strains did not have mutations in any of these genes. Amikacin/kanamycin/capreomycin resistance was associated with SNPs in rrs at nt1401 (78%) and nt1484 (3%), except in seven (19%) strains. We estimate that if only the common hot-spot region targets of current commercial assays were used, the concordance between phenotypic and genotypic testing for these XDR strains would vary between rifampicin (100%), isoniazid (92%), fluoroquinolones (81%), aminoglycosides (78%) and ethambutol (62%), while pncA sequencing would provide genotypic resistance in less than half the isolates. This work highlights the importance of expanded

  13. Research on technique of wavefront retrieval based on Foucault test

    NASA Astrophysics Data System (ADS)

    Yuan, Lvjun; Wu, Zhonghua

    2010-05-01

    During fine grinding of the best-fit sphere and the initial stage of polishing, the surface error of large-aperture aspheric mirrors is too large to test with a common interferometer. The Foucault test is widely used in fabricating large-aperture mirrors; however, the optical path is seriously disturbed by air turbulence, and changes in the light and dark zones cannot be identified, which often impairs judgement and leads to mistakes in diagnosing the surface error of the whole mirror. To solve this problem, this work presents wavefront retrieval based on the Foucault test through digital image processing and quantitative calculation. Firstly, a representative Foucault image is obtained by collecting a series of images with a CCD and averaging them to suppress air turbulence. Secondly, gray values are converted into surface-error values through principle derivation, mathematical modelling and software programming. Thirdly, the linear deviation introduced by defocus is removed by the least-squares method to obtain the real surface error. Finally, from the real surface error, the wavefront map, gray contour map and corresponding pseudo-colour contour map are plotted. The experimental results indicate that the three-dimensional wavefront map and two-dimensional contour map accurately and intuitively show the surface error over the whole mirror under test and help to grasp the surface error as a whole. The technique can be used to guide the fabrication of large-aperture, long-focal-length mirrors during grinding and the initial stage of polishing the aspheric surface, greatly improving fabrication efficiency and precision.
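
    The frame-averaging and least-squares defocus-removal steps described above can be illustrated with a short Python sketch. The synthetic frames, the gray-to-surface-error scale factor k, and the plane-fit form of the "linear deviation" are assumptions for illustration; the paper's derivation is not reproduced.

        # Illustrative sketch of the averaging and least-squares defocus-removal steps.
        import numpy as np

        def average_frames(frames):
            """Average repeated CCD exposures to suppress air-turbulence noise."""
            return np.mean(np.stack(frames), axis=0)

        def remove_linear_term(error_map):
            """Subtract the best-fit plane (the defocus-induced linear deviation) by least squares."""
            ny, nx = error_map.shape
            yy, xx = np.mgrid[0:ny, 0:nx]
            A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(nx * ny)])
            coeffs, *_ = np.linalg.lstsq(A, error_map.ravel(), rcond=None)
            return error_map - (A @ coeffs).reshape(ny, nx)

        rng = np.random.default_rng(1)
        frames = [rng.normal(0.5, 0.05, (128, 128)) for _ in range(20)]   # synthetic Foucault images
        mean_image = average_frames(frames)
        k = 0.1                                    # assumed gray-to-surface-error scale (units arbitrary)
        surface_error = remove_linear_term(k * mean_image)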

  14. Initial planetary base construction techniques and machine implementation

    NASA Technical Reports Server (NTRS)

    Crockford, William W.

    1987-01-01

    Conceptual designs of (1) initial planetary base structures, and (2) an unmanned machine to perform the construction of these structures using materials local to the planet are presented. Rock melting is suggested as a possible technique to be used by the machine in fabricating roads, platforms, and interlocking bricks. Identification of problem areas in machine design and materials processing is accomplished. The feasibility of the designs is contingent upon favorable results of an analysis of the engineering behavior of the product materials. The analysis requires knowledge of several parameters for solution of the constitutive equations of the theory of elasticity. An initial collection of these parameters is presented which helps to define research needed to perform a realistic feasibility study. A qualitative approach to estimating power and mass lift requirements for the proposed machine is used which employs specifications of currently available equipment. An initial, unmanned mission scenario is discussed with emphasis on identifying uncompleted tasks and suggesting design considerations for vehicles and primitive structures which use the products of the machine processing.

  15. Age estimation based on Kvaal's technique using digital panoramic radiographs

    PubMed Central

    Mittal, Samta; Nagendrareddy, Suma Gundareddy; Sharma, Manisha Lakhanpal; Agnihotri, Poornapragna; Chaudhary, Sunil; Dhillon, Manu

    2016-01-01

    Introduction: Age estimation is important for administrative and ethical reasons and also because of legal consequences. Dental pulp regresses in size with increasing age owing to secondary dentine deposition and can be used as a parameter for age estimation even beyond 25 years of age. Kvaal et al. developed a method for chronological age estimation based on pulp size using periapical dental radiographs. There is a need to test this method in the Indian population using simple tools such as digital imaging in living individuals, without requiring extraction of teeth. Aims and Objectives: To estimate the chronological age of subjects by Kvaal's method using digital panoramic radiographs and to test the validity of the regression equations given by Kvaal et al. Materials and Methods: The study sample included 152 subjects aged 14-60 years. Measurements were performed on standardized digital panoramic radiographs following Kvaal's method. Different regression formulae were derived and the age was assessed; the assessed age was then compared with the actual age of the patient using Student's t-test. Results: No significant difference between the mean chronological age and the estimated age was observed. However, the mean ages estimated using the regression equations given previously by Kvaal et al. significantly underestimated the chronological age in the present sample. Conclusion: The results support the feasibility of this technique when study-specific regression equations are calculated on digital panoramic radiographs, but they negate the applicability of the original regression equations of Kvaal et al. to this population. PMID:27555738
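
    The core numerical step, fitting study-specific regression equations that map radiographic pulp/tooth ratios to age and then comparing estimated with chronological age, can be sketched as below. The synthetic ratios, coefficients and three-predictor form are assumptions; Kvaal's actual measurement set is not reproduced.

        # Minimal sketch of deriving a study-specific age-regression formula (synthetic data).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n = 152
        ratios = rng.uniform(0.2, 0.9, size=(n, 3))                 # e.g. pulp/tooth length and width ratios
        age = 60 - 45 * ratios.mean(axis=1) + rng.normal(0, 4, n)   # synthetic chronological ages

        # Ordinary least squares: age ~ b0 + b1*r1 + b2*r2 + b3*r3
        A = np.column_stack([np.ones(n), ratios])
        coeffs, *_ = np.linalg.lstsq(A, age, rcond=None)
        estimated_age = A @ coeffs

        # Paired comparison of estimated vs. chronological age, mirroring the study's t-test step.
        t_stat, p_value = stats.ttest_rel(estimated_age, age)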

  16. Novel technique: a pupillometer-based objective chromatic perimetry

    NASA Astrophysics Data System (ADS)

    Rotenstreich, Ygal; Skaat, Alon; Sher, Ifat; Kolker, Andru; Rosenfeld, Elkana; Melamed, Shlomo; Belkin, Michael

    2014-02-01

    Evaluation of the visual field (VF) is important for clinical diagnosis and patient monitoring. Current VF methods are subjective and require patient cooperation. Here we developed a novel objective perimetry technique based on the pupil response (PR) to multifocal chromatic stimuli in normal subjects and in patients with glaucoma and retinitis pigmentosa (RP). A computerized infrared video pupillometer was used to record the PR to short- and long-wavelength stimuli (peaks at 485 nm and 620 nm, respectively) at light intensities of 15-100 cd-s/m2 at thirteen different points of the VF. The RP study included 30 eyes of 16 patients and 20 eyes of 12 healthy participants. The glaucoma study included 22 eyes of 11 patients and 38 eyes of 19 healthy participants. A significantly reduced PR was observed in RP patients in response to short-wavelength stimuli at 40 cd-s/m2 in nearly all perimetric locations (P <0.05). By contrast, RP patients demonstrated a nearly normal PR to long-wavelength stimuli in the majority of perimetric locations. The glaucoma group showed a significantly reduced PR to long- and short-wavelength stimuli at high intensity in all perimetric locations (P <0.05). The PR of glaucoma patients was significantly lower than normal in response to short-wavelength stimuli at low intensity, mostly in central and 20° locations (p<0.05). This study demonstrates the feasibility of using pupillometer-based chromatic perimetry for objectively assessing VF defects, retinal function and optic nerve damage in patients with retinal dystrophies and glaucoma. Furthermore, this method may be used to distinguish between the damaged cells underlying the VF defect.

  17. Weighted graph based ordering techniques for preconditioned conjugate gradient methods

    NASA Technical Reports Server (NTRS)

    Clift, Simon S.; Tang, Wei-Pai

    1994-01-01

    We describe the basis of a matrix ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDEs. Several new matrix ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDEs and compared with other matrix ordering techniques. A variation of RCM is shown to generally improve the quality of incomplete factorization preconditioners.
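
    A minimal Python sketch of the general idea (reorder the matrix, build an incomplete factorization, use it to precondition CG) is given below. It uses SciPy's reverse Cuthill-McKee routine and an incomplete LU in place of the paper's specialised weighted-graph orderings and incomplete factorizations, and the anisotropic test matrix is an invented stand-in.

        # Reorder a sparse SPD matrix, build an incomplete factorization, and precondition CG with it.
        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.csgraph import reverse_cuthill_mckee
        from scipy.sparse.linalg import cg, spilu, LinearOperator

        # Simple anisotropic 2-D Laplacian on an n-by-n grid (illustrative test matrix).
        n = 20
        e = np.ones(n)
        T = sp.diags([-e, 2 * e, -e], [-1, 0, 1])
        A = (sp.kron(sp.identity(n), 100 * T) + sp.kron(T, sp.identity(n))).tocsr()
        b = np.ones(A.shape[0])

        perm = reverse_cuthill_mckee(A, symmetric_mode=True)
        Ap = A[perm][:, perm].tocsc()               # reordered matrix

        ilu = spilu(Ap, drop_tol=1e-3)              # incomplete LU used as the preconditioner
        M = LinearOperator(Ap.shape, matvec=ilu.solve)
        x_perm, info = cg(Ap, b[perm], M=M)

        x = np.empty_like(x_perm)
        x[perm] = x_perm                            # undo the permutation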

  18. A Spatial Division Clustering Method and Low Dimensional Feature Extraction Technique Based Indoor Positioning System

    PubMed Central

    Mo, Yun; Zhang, Zhongzhao; Meng, Weixiao; Ma, Lin; Wang, Yao

    2014-01-01

    Indoor positioning systems based on the fingerprint method are widely used due to the large number of existing devices with a wide range of coverage. However, extensive positioning regions with a massive fingerprint database may cause high computational complexity and large error margins; therefore, clustering methods are widely applied as a solution. However, traditional clustering methods in positioning systems can only measure the similarity of the Received Signal Strength without being concerned with the continuity of physical coordinates. In addition, outages of access points could result in asymmetric matching problems which severely affect the fine positioning procedure. To solve these issues, in this paper we propose a positioning system based on the Spatial Division Clustering (SDC) method for clustering the fingerprint dataset subject to physical distance constraints. With the Genetic Algorithm and Support Vector Machine techniques, SDC can achieve higher coarse positioning accuracy than traditional clustering algorithms. In terms of fine localization, based on the Kernel Principal Component Analysis method, the proposed positioning system outperforms its counterparts based on other feature extraction methods in low dimensionality. Apart from balancing the online matching computational burden, the new positioning system exhibits advantageous performance on radio map clustering and also shows better robustness and adaptability with respect to the asymmetric matching problem. PMID:24451470
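
    A rough Python sketch of the coarse-then-fine pipeline follows. KMeans on physical coordinates is only a stand-in for the paper's Spatial Division Clustering with GA/SVM, and the synthetic RSS model, access-point count and neighbour counts are all assumptions.

        # Coarse clustering of reference points plus KPCA + kNN fine localisation (illustrative).
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import KernelPCA
        from sklearn.neighbors import KNeighborsRegressor

        rng = np.random.default_rng(3)
        coords = rng.uniform(0, 50, size=(500, 2))                    # reference-point positions (m)
        aps = rng.uniform(0, 50, size=(8, 2))                         # access-point positions
        dist = np.linalg.norm(coords[:, None, :] - aps[None, :, :], axis=2)
        rss = -40 - 0.5 * dist + rng.normal(0, 2, dist.shape)         # synthetic RSS fingerprints

        # Coarse stage: cluster reference points with a physical-coordinate criterion
        # (plain KMeans on coordinates here, standing in for SDC).
        clusters = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(coords)

        # Fine stage inside the matched cluster: KPCA feature extraction + kNN regression to coordinates.
        mask = clusters == 0
        kpca = KernelPCA(n_components=3, kernel="rbf").fit(rss[mask])
        knn = KNeighborsRegressor(n_neighbors=4).fit(kpca.transform(rss[mask]), coords[mask])
        estimate = knn.predict(kpca.transform(rss[mask][:1]))         # locate one query fingerprint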

  19. Water-based oligochitosan and nanowhisker chitosan as potential food preservatives for shelf-life extension of minced pork.

    PubMed

    Chantarasataporn, Patomporn; Tepkasikul, Preenapha; Kingcha, Yutthana; Yoksan, Rangrong; Pichyangkura, Rath; Visessanguan, Wonnop; Chirachanchai, Suwabun

    2014-09-15

    Water-based chitosans in the forms of oligochitosan (OligoCS) and nanowhisker chitosan (CSWK) are proposed as novel food preservatives based on a minced pork model study. The high surface area with a positive charge over the neutral pH range (pH 5-8) of OligoCS and CSWK leads to inhibition of Gram-positive (Staphylococcus aureus, Listeria monocytogenes, and Bacillus cereus) and Gram-negative microbes (Salmonella enteritidis and Escherichia coli O157:H7). In the minced pork model, OligoCS performs effectively as a food preservative for shelf-life extension, as shown by the retardation of microbial growth, biogenic amine formation and lipid oxidation during storage. OligoCS largely preserves the myosin heavy chain protein from degradation, as observed by electrophoresis. The present work points out that water-based chitosan with its unique morphology not only exhibits significant antimicrobial activity but also maintains meat quality with an extension of shelf-life, and thus has the potential to be used as a food preservative.

  20. Extension of a Kolmogorov Atmospheric Turbulence Model for Time-Based Simulation Implementation

    NASA Technical Reports Server (NTRS)

    McMinn, John D.

    1997-01-01

    The development of any super/hypersonic aircraft requires the interaction of a wide variety of technical disciplines to maximize vehicle performance. For flight and engine control system design and development on this class of vehicle, realistic mathematical simulation models of atmospheric turbulence, including winds and the varying thermodynamic properties of the atmosphere, are needed. A model which has been tentatively selected by a government/industry group of flight and engine/inlet controls representatives working on the High Speed Civil Transport is one based on the Kolmogorov spectrum function. This report compares the Dryden and Kolmogorov turbulence forms, and describes enhancements that add functionality to the selected Kolmogorov model. These added features are: an altitude variation of the eddy dissipation rate based on Dryden data, the mapping of the eddy dissipation rate database onto a regular latitude and longitude grid, a method to account for flight at large vehicle attitude angles, and a procedure for transitioning smoothly across turbulence segments.
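
    For reference, the inertial-range spectrum underlying Kolmogorov-type turbulence models is commonly written as below, with the eddy dissipation rate entering through the ε^{2/3} factor. This is the generic textbook form, not the report's specific implementation or its altitude parameterisation.

        % Kolmogorov inertial-range energy spectrum (generic form; \alpha is the Kolmogorov constant).
        \[
          E(k) \;=\; \alpha\,\varepsilon^{2/3}\,k^{-5/3}, \qquad \alpha \approx 1.5,
        \]
        % so an altitude-dependent eddy dissipation rate \varepsilon(h) scales the whole spectrum.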

  1. Nanobatteries in redox-based resistive switches require extension of memristor theory.

    PubMed

    Valov, I; Linn, E; Tappertzhofen, S; Schmelzer, S; van den Hurk, J; Lentz, F; Waser, R

    2013-01-01

    Redox-based nanoionic resistive memory cells are one of the most promising emerging nanodevices for future information technology with applications for memory, logic and neuromorphic computing. Recently, the serendipitous discovery of the link between redox-based nanoionic-resistive memory cells and memristors and memristive devices has further intensified the research in this field. Here we show on both a theoretical and an experimental level that nanoionic-type memristive elements are inherently controlled by non-equilibrium states resulting in a nanobattery. As a result, the memristor theory must be extended to fit the observed non-zero-crossing I-V characteristics. The initial electromotive force of the nanobattery depends on the chemistry and the transport properties of the materials system but can also be introduced during redox-based nanoionic-resistive memory cell operations. The emf has a strong impact on the dynamic behaviour of nanoscale memories, and thus, its control is one of the key factors for future device development and accurate modelling.
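
    One generic way to express the required extension, not the authors' exact formulation, is to add a series electromotive force term to the standard memristive-system equations so that the pinched hysteresis loop no longer passes through the origin:

        % Standard (ideal) memristive element:  I = G(x) V,  dx/dt = f(x, V).
        % A series nanobattery emf V_emf shifts the zero crossing (illustrative extension only):
        \[
          I \;=\; G(x)\,\bigl(V - V_{\mathrm{emf}}\bigr), \qquad
          \frac{dx}{dt} \;=\; f(x, V),
        \]
        % so I = 0 now occurs at V = V_emf \neq 0, matching the observed non-zero-crossing I-V curves.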

  2. Nanobatteries in redox-based resistive switches require extension of memristor theory

    PubMed Central

    Valov, I.; Linn, E.; Tappertzhofen, S.; Schmelzer, S.; van den Hurk, J.; Lentz, F.; Waser, R.

    2013-01-01

    Redox-based nanoionic resistive memory cells are one of the most promising emerging nanodevices for future information technology with applications for memory, logic and neuromorphic computing. Recently, the serendipitous discovery of the link between redox-based nanoionic-resistive memory cells and memristors and memristive devices has further intensified the research in this field. Here we show on both a theoretical and an experimental level that nanoionic-type memristive elements are inherently controlled by non-equilibrium states resulting in a nanobattery. As a result, the memristor theory must be extended to fit the observed non-zero-crossing I–V characteristics. The initial electromotive force of the nanobattery depends on the chemistry and the transport properties of the materials system but can also be introduced during redox-based nanoionic-resistive memory cell operations. The emf has a strong impact on the dynamic behaviour of nanoscale memories, and thus, its control is one of the key factors for future device development and accurate modelling. PMID:23612312

  3. Adhesive-based bonding technique for PDMS microfluidic devices.

    PubMed

    Thompson, C Shea; Abate, Adam R

    2013-02-21

    We present a simple and inexpensive technique for bonding PDMS microfluidic devices. The technique uses only adhesive tape and an oven; plasma bonders and cleanroom facilities are not required. It also produces channels that are immediately hydrophobic, allowing formation of aqueous-in-oil emulsions.

  4. Comparison Of Four FFT-Based Frequency-Acquisition Techniques

    NASA Technical Reports Server (NTRS)

    Shah, Biren N.; Hinedi, Sami M.; Holmes, Jack K.

    1993-01-01

    Report presents a comparative theoretical analysis of four conceptual techniques for initial estimation of the carrier frequency of a suppressed-carrier, binary-phase-shift-keyed radio signal. Each technique is effected by an open-loop analog/digital signal-processing subsystem that forms part of a Costas-loop phase-error detector functioning in a closed-loop manner overall.
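
    As a generic illustration of open-loop FFT-based frequency acquisition (not any of the report's four specific techniques), the sketch below squares a noisy BPSK baseband signal to strip the data modulation and reads the carrier offset from the FFT peak; the sample rate, noise level and signal model are assumptions.

        # Open-loop FFT frequency estimate for a noisy BPSK baseband signal (illustrative).
        import numpy as np

        fs = 1.0e4                                  # sample rate (Hz), assumed
        n = 4096
        f_true = 1234.5                             # carrier frequency offset to be acquired
        t = np.arange(n) / fs
        rng = np.random.default_rng(4)
        data = rng.choice([-1.0, 1.0], size=n)      # BPSK symbols (one per sample, for simplicity)
        x = data * np.exp(2j * np.pi * f_true * t) \
            + 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))

        # Squaring removes the +/-1 modulation and doubles the carrier offset.
        spectrum = np.abs(np.fft.fft(x ** 2, n))
        freqs = np.fft.fftfreq(n, d=1 / fs)
        f_hat = freqs[np.argmax(spectrum)] / 2      # halve to undo the squaring; resolution is fs/(2n)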

  5. Anterolateral Ligament Reconstruction Technique: An Anatomic-Based Approach.

    PubMed

    Chahla, Jorge; Menge, Travis J; Mitchell, Justin J; Dean, Chase S; LaPrade, Robert F

    2016-06-01

    Restoration of anteroposterior laxity after an anterior cruciate ligament reconstruction has been predictable with traditional open and endoscopic techniques. However, anterolateral rotational stability has been difficult to achieve in a subset of patients, even with appropriate anatomic techniques. Therefore, differing techniques have attempted to address this rotational laxity by augmenting or reconstructing lateral-sided structures about the knee. In recent years, there has been a renewed interest in the anterolateral ligament as a potential contributor to residual anterolateral rotatory instability in anterior cruciate ligament-deficient patients. Numerous anatomic and biomechanical studies have been performed to further define the functional importance of the anterolateral ligament, highlighting the need for surgical techniques to address these injuries in the unstable knee. This article details our technique for an anatomic anterolateral ligament reconstruction using a semitendinosus tendon allograft. PMID:27656361

  6. Carbon Storage in an Extensive Karst-distributed Region of Southwestern China based on Multiple Methods

    NASA Astrophysics Data System (ADS)

    Guo, C.; Wu, Y.; Yang, H.; Ni, J.

    2015-12-01

    Accurate estimation of carbon storage is crucial to better understand the processes of global and regional carbon cycles and to more precisely project ecological and economic scenarios for the future. Southwestern China has a broad and continuous distribution of karst landscapes with harsh and fragile habitats that can lead to rocky desertification, an ecological disaster which has significantly hindered vegetation succession and economic development in the karst regions of southwestern China. In this study we evaluated the carbon storage in eight political divisions of southwestern China based on four methods: forest inventory, carbon density based on field investigations, the CASA model driven by remote sensing data, and the BIOME4/LPJ global vegetation models driven by climate data. The results show that: (1) The total vegetation carbon storage (including the agricultural ecosystem) is 6763.97 Tg C based on carbon density, and the soil organic carbon (SOC) storage (above 20 cm depth) is 12475.72 Tg C. Sichuan Province (including Chongqing) possesses the highest carbon storage in both vegetation and soil (1736.47 Tg C and 4056.56 Tg C, respectively) among the eight political divisions because of its higher carbon density and larger distribution area. The vegetation carbon storage in Hunan Province is the smallest (565.30 Tg C), and the smallest SOC storage (1127.40 Tg C) is in Guangdong Province. (2) Based on forest inventory data, the total aboveground carbon storage in woody vegetation is 2103.29 Tg C. The carbon storage in Yunnan Province (819.01 Tg C) is significantly higher than in other areas, with tropical rainforests and seasonal forests in Yunnan contributing most of the woody vegetation carbon storage (accounting for 62.40% of the total). (3) The net primary production (NPP) simulated by the CASA model is 68.57 Tg C/yr, while the forest NPP in the non-karst region (accounting for 72.50% of the total) is higher than that in the karst region. (4) BIOME4 and LPJ

  7. Development of a Flexible and Extensible Computer-based Simulation Platform for Healthcare Students.

    PubMed

    Bindoff, Ivan; Cummings, Elizabeth; Ling, Tristan; Chalmers, Leanne; Bereznicki, Luke

    2015-01-01

    Accessing appropriate clinical placement positions for all health profession students can be expensive and challenging. Increasingly, simulation in a range of modes is being used to enhance student learning and prepare students for clinical placement. Commonly these simulations focus on the use of simulated patient mannequins, which are typically presented as single-event scenarios, are difficult to organise, and usually include only a single healthcare profession. Computer-based simulation is relatively under-researched and under-utilised but is beginning to demonstrate potential benefits. This paper describes the development and trialling of an entirely virtual 3D simulated environment for inter-professional student education. PMID:25676952

  8. Application of Condition-Based Monitoring Techniques for Remote Monitoring of a Simulated Gas Centrifuge Enrichment Plant

    SciTech Connect

    Hooper, David A; Henkel, James J; Whitaker, Michael

    2012-01-01

    This paper presents research into the adaptation of monitoring techniques from maintainability and reliability (M&R) engineering for remote unattended monitoring of gas centrifuge enrichment plants (GCEPs) for international safeguards. Two categories of techniques are discussed: the sequential probability ratio test (SPRT) for diagnostic monitoring, and sequential Monte Carlo (SMC or, more commonly, particle filtering) for prognostic monitoring. Development and testing of the application of condition-based monitoring (CBM) techniques was performed on the Oak Ridge Mock Feed and Withdrawal (F&W) facility as a proof of principle. CBM techniques have been extensively developed for M&R assessment of physical processes, such as manufacturing and power plants. These techniques are normally used to locate and diagnose the effects of mechanical degradation of equipment to aid in planning of maintenance and repair cycles. In a safeguards environment, however, the goal is not to identify mechanical deterioration but to detect and diagnose (and potentially predict) attempts to circumvent normal, declared facility operations, such as through protracted diversion of enriched material. The CBM techniques are first explained from the traditional perspective of maintenance and reliability engineering. The adaptation of CBM techniques to inspector monitoring is then discussed, focusing on the unique challenges of decision-based effects rather than equipment degradation effects. These techniques are then applied to the Oak Ridge Mock F&W facility, a water-based physical simulation of a material feed and withdrawal process used at enrichment plants that is used to develop and test online monitoring techniques for fully information-driven safeguards of GCEPs. Advantages and limitations of the CBM approach to online monitoring are discussed, as well as the potential challenges of adapting CBM concepts to safeguards applications.
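
    The diagnostic side of the approach can be illustrated with a generic Wald sequential probability ratio test on a Gaussian measurement stream; the mean-shift magnitudes, error rates and data here are invented and do not reflect the facility model used in the paper.

        # Generic Wald SPRT for detecting a sustained mean shift in a monitored measurement stream.
        import numpy as np

        def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
            """Return ('H0' | 'H1' | 'undecided', samples used) for a Gaussian mean-shift test."""
            upper = np.log((1 - beta) / alpha)      # accept H1 (shift present) above this
            lower = np.log(beta / (1 - alpha))      # accept H0 (normal operation) below this
            llr = 0.0
            for i, x in enumerate(samples, start=1):
                llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
                if llr >= upper:
                    return "H1", i
                if llr <= lower:
                    return "H0", i
            return "undecided", len(samples)

        rng = np.random.default_rng(5)
        normal_ops = rng.normal(0.0, 1.0, 500)      # declared operation
        diverted = rng.normal(0.4, 1.0, 500)        # small sustained shift, e.g. protracted diversion
        print(sprt(normal_ops, 0.0, 0.4, 1.0), sprt(diverted, 0.0, 0.4, 1.0))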

  9. Improved mesh based photon sampling techniques for neutron activation analysis

    SciTech Connect

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-07-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
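
    The O(1) alias-table sampling named above can be sketched in a few lines. This is a generic Vose-style construction in Python, not the MCNP subroutine itself, and the voxel source strengths are synthetic.

        # Sketch of Vose's alias method for O(1) sampling of voxel indices from arbitrary weights.
        import numpy as np

        def build_alias(weights):
            p = np.asarray(weights, dtype=float)
            p = p / p.sum() * len(p)                # scale so the average weight is 1
            prob = np.zeros(len(p))
            alias = np.zeros(len(p), dtype=int)
            small = [i for i, v in enumerate(p) if v < 1.0]
            large = [i for i, v in enumerate(p) if v >= 1.0]
            while small and large:
                s, l = small.pop(), large.pop()
                prob[s], alias[s] = p[s], l
                p[l] -= 1.0 - p[s]
                (small if p[l] < 1.0 else large).append(l)
            for i in large + small:                 # leftovers get probability 1
                prob[i] = 1.0
            return prob, alias

        def sample(prob, alias, rng, size):
            i = rng.integers(0, len(prob), size)    # pick a column uniformly
            accept = rng.random(size) < prob[i]     # O(1) per sample: one compare, one lookup
            return np.where(accept, i, alias[i])

        rng = np.random.default_rng(6)
        voxel_strengths = rng.random(1000)          # e.g. photon source strength per voxel
        prob, alias = build_alias(voxel_strengths)
        draws = sample(prob, alias, rng, 100000)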

  10. Development of acoustic model-based iterative reconstruction technique for thick-concrete imaging

    NASA Astrophysics Data System (ADS)

    Almansouri, Hani; Clayton, Dwight; Kisner, Roger; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2016-02-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An application example space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.
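
    Since the abstract does not give the acoustic forward model, the following is only a generic stand-in for the MBIR idea: reconstruct an image by iteratively minimising a data-fit term plus a prior, here a plain quadratic prior solved by gradient descent, with a dense random matrix playing the role of the forward operator.

        # Generic model-based iterative reconstruction sketch: minimise 0.5*||A x - y||^2 + 0.5*lam*||x||^2.
        import numpy as np

        rng = np.random.default_rng(7)
        n_pix, n_meas = 256, 400
        A = rng.normal(size=(n_meas, n_pix))        # stand-in forward model (not an acoustic model)
        x_true = np.zeros(n_pix)
        x_true[100:110] = 1.0                       # a small "reflector"
        y = A @ x_true + 0.05 * rng.normal(size=n_meas)

        lam = 1.0                                   # prior weight (assumed)
        step = 1.0 / np.linalg.norm(A, 2) ** 2      # conservative gradient step for the data term
        x = np.zeros(n_pix)
        for _ in range(500):
            grad = A.T @ (A @ x - y) + lam * x      # data-fit gradient plus quadratic-prior gradient
            x -= step * grad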

  11. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    SciTech Connect

    Almansouri, Hani; Clayton, Dwight A; Kisner, Roger A; Polsky, Yarom; Bouman, Charlie; Santos-Villalobos, Hector J

    2016-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An application example space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.

  12. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    SciTech Connect

    Almansouri, Hani; Clayton, Dwight A; Kisner, Roger A; Polsky, Yarom; Bouman, Charlie; Santos-Villalobos, Hector J

    2015-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An application example space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.

  13. A fast and accurate PCA based radiative transfer model: Extension to the broadband shortwave region

    NASA Astrophysics Data System (ADS)

    Kopparla, Pushkar; Natraj, Vijay; Spurr, Robert; Shia, Run-Lie; Crisp, David; Yung, Yuk L.

    2016-04-01

    Accurate radiative transfer (RT) calculations are necessary for many earth-atmosphere applications, from remote sensing retrieval to climate modeling. A Principal Component Analysis (PCA)-based spectral binning method has been shown to provide an order of magnitude increase in computational speed while maintaining an overall accuracy of 0.01% (compared to line-by-line calculations) over narrow spectral bands. In this paper, we have extended the PCA method to RT calculations over the entire shortwave region of the spectrum from 0.3 to 3 microns. The region is divided into 33 spectral fields covering all major gas absorption regimes. We find that the RT runtimes are shorter by factors between 10 and 100, while root mean square errors are of order 0.01%.
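
    The compression step at the heart of such schemes, representing the many correlated monochromatic optical-property profiles within a bin by a few principal components, can be sketched as follows. The synthetic optical depths and the two-component truncation are assumptions, and the paper's radiative-transfer correction mapping is not reproduced.

        # Toy PCA compression of correlated layer optical depths within one spectral bin.
        import numpy as np

        rng = np.random.default_rng(8)
        n_wavenumbers, n_layers = 5000, 20
        profile = rng.random(n_layers)                               # a reference optical-depth profile
        scale = np.exp(rng.normal(size=(n_wavenumbers, 1)))          # spectrally varying strength
        logtau = np.log(scale * profile + 1e-6)                      # correlated log optical depths

        mean = logtau.mean(axis=0)
        cov = np.cov(logtau - mean, rowvar=False)
        evals, evecs = np.linalg.eigh(cov)
        pcs = evecs[:, ::-1][:, :2]                                  # keep the two leading components
        scores = (logtau - mean) @ pcs                               # per-wavenumber PC scores
        # The expensive line-by-line RT is then run only for the mean and PC-perturbed profiles,
        # and a mapping from these scores corrects a fast RT calculation across the whole bin.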

  14. Research Extension and Education Programs on Bio-based Energy Technologies and Products

    SciTech Connect

    Jackson, Sam; Harper, David; Womac, Al

    2010-03-02

    The overall objectives of this project were to provide enhanced educational resources for the general public, educational and development opportunities for University faculty in the Southeast region, and enhanced research knowledge concerning biomass preprocessing and deconstruction. All of these efforts combine to create a research and education program that enhances the biomass-based industries of the United States. This work was broken into five primary objective areas: • Task A - Technical research in the area of biomass preprocessing, analysis, and evaluation. • Tasks B&C - Technical research in the areas of Fluidized Beds for the Chemical Modification of Lignocellulosic Biomass and Biomass Deconstruction and Evaluation. • Task D - Analyses for the non-scientific community to provide a comprehensive analysis of the current state of biomass supply, demand, technologies, markets and policies; identify a set of feasible alternative paths for biomass industry development; and quantify the impacts associated with each alternative path. • Task E - Efforts to build research capacity and develop partnerships through faculty fellowships with DOE national labs. The research and education programs conducted through this grant led to three primary results: • A better knowledge base related to, and understanding of, biomass deconstruction, through both mechanical size reduction and chemical processing • A better source of information related to biomass, bioenergy, and bioproducts for researchers and general public users through the BioWeb system • Stronger research ties between land-grant universities and DOE National Labs through the faculty fellowship program. In addition to the scientific knowledge and resources developed, funding through this program produced a minimum of eleven (11) scientific publications and contributed to the research behind at least one patent.

  15. Blood culture technique based on centrifugation: clinical evaluation.

    PubMed Central

    Dorn, G L; Burson, G G; Haynes, J R

    1976-01-01

    A total of 1,000 blood samples from patients suspected of having a bacteremia were analyzed concurrently, where possible, by three methods: (i) Trypticase soy broth with sodium polyanethol sulfonate and a CO2 atmosphere; (ii) pour plates with either brain heart infusion agar or Sabouraud dextrose agar; and (iii) centrifugation of the suspected organism in a hypertonic solution. There were 176 positive cultures. The centrifugation technique recovered 73% of the positive cultures. The broth and pour plate techniques recovered 38 and 49%, respectively. The centrifugation technique showed an increased isolation rate for Pseudomonas, fungi, and gram-positive cocci. In general, for each organism the time required for the detection of a positive culture was shortest for the centrifugation technique. PMID:1270591

  16. A local technique based on vectorized surfaces for craniofacial reconstruction.

    PubMed

    Tilotta, Françoise M; Glaunès, Joan A; Richard, Frédéric J P; Rozenholc, Yves

    2010-07-15

    In this paper, we focus on the automation of facial reconstruction. Since they consider the whole head as the object of interest, the usual reconstruction techniques are global and involve a large number of parameters to be estimated. We present a local technique which aims at reaching a good trade-off between bias and variance, following the paradigm of non-parametric statistics. The estimation is localized on patches delimited by surface geodesics between anatomical points of the skull. The technique relies on a continuous representation of the individual surfaces embedded in the vectorial space of extended normal vector fields, which allows deformations and averages of surfaces to be computed. It consists of estimating the soft-tissue surface over patches. Using a homogeneous database described in [31], we obtain results on the chin and nasal regions with an average error below 1 mm, outperforming the global reconstruction techniques.

  17. Evaluation of a school-based diabetes education intervention, an extension of Program ENERGY

    NASA Astrophysics Data System (ADS)

    Conner, Matthew David

    Background: The prevalence of both obesity and type 2 diabetes in the United States has increased over the past two decades and rates remain high. The latest data from the National Center for Health Statistics estimates that 36% of adults and 17% of children and adolescents in the US are obese (CDC Adult Obesity, CDC Childhood Obesity). Being overweight or obese greatly increases one's risk of developing several chronic diseases, such as type 2 diabetes. Approximately 8% of adults in the US have diabetes, type 2 diabetes accounts for 90-95% of these cases. Type 2 diabetes in children and adolescents is still rare, however clinical reports suggest an increase in the frequency of diagnosis (CDC Diabetes Fact Sheet, 2011). Results from the Diabetes Prevention Program show that the incidence of type 2 diabetes can be reduced through the adoption of a healthier lifestyle among high-risk individuals (DPP, 2002). Objectives: This classroom-based intervention included scientific coverage of energy balance, diabetes, diabetes prevention strategies, and diabetes management. Coverage of diabetes management topics were included in lesson content to further the students' understanding of the disease. Measurable short-term goals of the intervention included increases in: general diabetes knowledge, diabetes management knowledge, and awareness of type 2 diabetes prevention strategies. Methods: A total of 66 sixth grade students at Tavelli Elementary School in Fort Collins, CO completed the intervention. The program consisted of nine classroom-based lessons; students participated in one lesson every two weeks. The lessons were delivered from November of 2005 to May of 2006. Each bi-weekly lesson included a presentation and interactive group activities. Participants completed two diabetes knowledge questionnaires at baseline and post intervention. A diabetes survey developed by Program ENERGY measured general diabetes knowledge and awareness of type 2 diabetes prevention strategies

  18. A study of trends and techniques for space base electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1978-01-01

    Furnaces and photolithography-related equipment were applied to experiments on double-layer metal. The double-layer metal activity emphasized wet-chemistry techniques. By incorporating the following techniques: (1) ultrasonic etching of the vias; (2) a pre-metal clean using a modified buffered hydrogen fluoride; (3) phosphorus-doped vapor; and (4) extended sintering, yields of 98 percent were obtained using the standard test pattern. The two-dimensional modeling problems stemmed alternately from instability and from excessive computation time to achieve convergence.

  19. Biogeosystem technique as a base of Sustainable Irrigated Agriculture

    NASA Astrophysics Data System (ADS)

    Batukaev, Abdulmalik

    2016-04-01

    The world water strategy must change because the current imitational, gravitational, frontal, isotropic-continual paradigm of irrigation is not sustainable. This paradigm causes excessive consumption of fresh water (a global deficit of up to 4-15 times) and adverse effects on soils and landscapes. Current irrigation methods do not control the spread of water throughout the soil continuum; preferential downward fluxes of irrigation water form, and up to 70% or more of the water supply is lost to the vadose zone. The moisture of irrigated soil is high, the soil loses structure through flotation decomposition of its granulometric fractions, the stomatal apparatus of the plant leaf is fully open, and the transpiration rate is maximal. We propose the Biogeosystem technique - transcendental, uncommon and non-imitating methods for Sustainable Natural Resources Management. The new paradigm of irrigation is based on the intra-soil pulse discrete method of water supply into the soil continuum by injection in small discrete portions. Each individual volume of water is supplied as a vertical cylinder of preliminary soil watering, positioned at a depth of 10 to 30 cm and 1-2 cm in diameter. Within 5-10 min after injection, the water spreads from the cylinder of preliminary watering into the surrounding soil by capillary, film and vapour transfer, and a small amount is transferred gravitationally to a depth of 35-40 cm. The resulting soil-watering cylinder occupies depths of 5-50 cm in the profile and is 2-4 cm in diameter, with a lateral distance of 10-15 cm between successive cylinders along the plant row. The soil carcass of non-watered soil surrounding the cylinder remains relatively dry and mechanically stable. After water injection, the soil structure in the cylinder restores quickly because there is no compression from the stable adjoining soil volume and because of soil structure memory. The mean soil thermodynamic water potential of the watered zone is -0.2 MPa. At this potential

  20. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings

    PubMed Central

    Ghayvat, Hemant; Mukhopadhyay, Subhas; Gui, Xiang; Suryadevara, Nagender

    2015-01-01

    Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home environment and performs his or her daily routine in order to live a long and healthy life. We initiate the research with the development of the smart home approach and implement it in different home conditions (different houses) to monitor the activity of an inhabitant for wellness detection. Additionally, our research extends the smart home system to smart buildings and models the design issues related to the smart building environment; these design issues are linked with system performance and reliability. This research paper also discusses and illustrates the possible mitigation to handle the ISM band interference and attenuation losses without compromising optimum system performance. PMID:25946630

  1. WSN- and IOT-Based Smart Homes and Their Extension to Smart Buildings.

    PubMed

    Ghayvat, Hemant; Mukhopadhyay, Subhas; Gui, Xiang; Suryadevara, Nagender

    2015-05-04

    Our research approach is to design and develop reliable, efficient, flexible, economical, real-time and realistic wellness sensor networks for smart home systems. The heterogeneous sensor and actuator nodes based on wireless networking technologies are deployed into the home environment. These nodes generate real-time data related to the object usage and movement inside the home, to forecast the wellness of an individual. Here, wellness stands for how efficiently someone stays fit in the home environment and performs his or her daily routine in order to live a long and healthy life. We initiate the research with the development of the smart home approach and implement it in different home conditions (different houses) to monitor the activity of an inhabitant for wellness detection. Additionally, our research extends the smart home system to smart buildings and models the design issues related to the smart building environment; these design issues are linked with system performance and reliability. This research paper also discusses and illustrates the possible mitigation to handle the ISM band interference and attenuation losses without compromising optimum system performance.

  2. Retention of denture bases fabricated by three different processing techniques – An in vivo study

    PubMed Central

    Chalapathi Kumar, V. H.; Surapaneni, Hemchand; Ravikiran, V.; Chandra, B. Sarat; Balusu, Srilatha; Reddy, V. Naveen

    2016-01-01

    Aim: Distortion due to polymerization shrinkage compromises retention. The aim was to evaluate the retention of denture bases fabricated by conventional, anchorized, and injection-molding polymerization techniques. Materials and Methods: Ten completely edentulous patients were selected, impressions were made, and the master cast obtained was duplicated to fabricate denture bases by the three polymerization techniques. A loop was attached to each finished denture base, and the force required to dislodge it was measured with a retention apparatus. Readings were subjected to nonparametric Friedman two-way analysis of variance followed by Bonferroni correction and the Wilcoxon matched-pairs signed-ranks test. Results: Denture bases fabricated by the injection-molding (3740 g) and anchorized (2913 g) techniques recorded greater retention values than the conventional technique (2468 g), and the differences between techniques were significant. Conclusions: Denture bases obtained by the injection-molding polymerization technique exhibited maximum retention, followed by the anchorized technique, with the least retention seen in the conventional molding technique. PMID:27382542

  3. GC-Based Techniques for Breath Analysis: Current Status, Challenges, and Prospects.

    PubMed

    Xu, Mingjun; Tang, Zhentao; Duan, Yixiang; Liu, Yong

    2016-07-01

    Breath analysis is a noninvasive diagnostic method that profiles a person's physical state via volatile organic compounds in the breath. It has huge potential in the field of disease diagnosis. In order to offer opportunities for practical applications, various GC-based techniques have been investigated for on-line breath analysis, since GC is the preferred technique for separating gas mixtures. This article reviews the development of breath analysis and of GC-based techniques in basic breath research, covering sampling methods, preconcentration methods, conventional GC-based techniques, and newly developed GC techniques for breath analysis. The combination of GC and newly developed detection techniques takes advantage of the virtues of each. In addition, portable GC or micro-GC systems are poised to become field GC-based techniques in breath analysis. Challenges faced by GC-based techniques for breath analysis are discussed candidly. Effective cooperation of experts from different fields is urgently needed to promote the development of breath analysis.

  4. Assessing the Utility of the Nominal Group Technique as a Consensus-Building Tool in Extension-Led Avian Influenza Response Planning

    ERIC Educational Resources Information Center

    Kline, Terence R.

    2013-01-01

    The intent of the project described was to apply the Nominal Group Technique (NGT) to achieve a consensus on Avian Influenza (AI) planning in Northeastern Ohio. Nominal Group Technique is a process first developed by Delbecq, Vande Ven, and Gustafsen (1975) to allow all participants to have an equal say in an open forum setting. A very diverse…

  5. C. elegans lifespan extension by osmotic stress requires FUdR, base excision repair, FOXO, and sirtuins.

    PubMed

    Anderson, Edward N; Corkins, Mark E; Li, Jia-Cheng; Singh, Komudi; Parsons, Sadé; Tucey, Tim M; Sorkaç, Altar; Huang, Huiyan; Dimitriadi, Maria; Sinclair, David A; Hart, Anne C

    2016-03-01

    Moderate stress can increase lifespan by hormesis, a beneficial low-level induction of stress response pathways. 5'-fluorodeoxyuridine (FUdR) is commonly used to sterilize Caenorhabditis elegans in aging experiments. However, FUdR alters lifespan in some genotypes and induces resistance to thermal and proteotoxic stress. We report that hypertonic stress in combination with FUdR treatment or inhibition of the FUdR target thymidylate synthase, TYMS-1, extends C. elegans lifespan by up to 30%. By contrast, in the absence of FUdR, hypertonic stress decreases lifespan. Adaptation to hypertonic stress requires diminished Notch signaling and loss of Notch co-ligands leads to lifespan extension only in combination with FUdR. Either FUdR treatment or TYMS-1 loss induced resistance to acute hypertonic stress, anoxia, and thermal stress. FUdR treatment increased expression of DAF-16 FOXO and the osmolyte biosynthesis enzyme GPDH-1. FUdR-induced hypertonic stress resistance was partially dependent on sirtuins and base excision repair (BER) pathways, while FUdR-induced lifespan extension under hypertonic stress conditions requires DAF-16, BER, and sirtuin function. Combined, these results demonstrate that FUdR, through inhibition of TYMS-1, activates stress response pathways in somatic tissues to confer hormetic resistance to acute and chronic stress. C. elegans lifespan studies using FUdR may need re-interpretation in light of this work.

  6. Effects of a group-based reproductive management extension programme on key management outcomes affecting reproductive performance.

    PubMed

    Brownlie, Tom S; Morton, John M; Heuer, Cord; McDougall, Scott

    2015-02-01

    A group-based reproductive management extension programme has been designed to help managers of dairy herds improve herd reproductive performance. The aims of this study were, firstly, to assess effects of participation by key decision makers (KDMs) in a farmer action group programme in 2009 and 2010 on six key management outcomes (KMOs) that affect reproductive performance over 2 years (2009-2010 and 2010-2011), and secondly, to describe KDM intentions to change management behaviour(s) affecting each management outcome after participation in the programme. Seasonal calving dairy herds from four regions of New Zealand were enrolled in the study. Intentions to modify management behaviour were recorded using the formal written action plans developed during the extension programme. KMOs assessed were calving pattern of the herd, pre-calving heifer liveweight, pre-calving and premating body condition score (BCS), oestrus detection, anoestrus cow management and bull management. Participation was associated with improvements in heifer liveweight, more heifers calving in the first 6 weeks of the seasonal calving period, premating BCS and oestrus detection. No significant effects were observed on anoestrus cow management or bull management. KDMs with greater numbers of proposed actions had lower 6 week in-calf rates in the second study year than KDMs who proposed fewer actions. A more effective strategy to ensure more appropriate objectives is proposed. Strategies to help KDMs to implement proposed actions more successfully should be investigated to improve the programme further.

  7. Copyright protection for multimedia data based on asymmetric cryptographic techniques

    NASA Astrophysics Data System (ADS)

    Herrigel, Alexander

    1998-09-01

    This paper presents a new approach for the copyright protection of digital multimedia data. The system applies cryptographic protocols and a public key technique for different purposes, namely encoding/decoding a digital watermark generated by any spread spectrum technique and the secure transfer of watermarked data from the sender to the receiver in a commercial business process. The public key technique is applied for the construction of a one-way watermark embedding and verification function to identify and prove the uniqueness of the watermark. In addition, our approach provides secure authentication of the owner who initiated the watermarking process for a specific data set. Legal dispute resolution is supported for multiple watermarking of digital data without revealing the confidential keying information.
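
    The cryptographic protocol layer is not detailed in the abstract, but the spread-spectrum watermarking step it builds on can be sketched as below; this is a generic additive DCT-domain embed/detect toy (the coefficient count, strength alpha and test image are arbitrary assumptions), not the authors' system.

```python
# Minimal spread-spectrum watermarking sketch: additive embedding into large
# DCT coefficients and blind correlation detection.
import numpy as np
from scipy.fft import dctn, idctn

def embed(image, key, alpha=0.2, n=2000):
    c = dctn(image.astype(float), norm="ortho")
    flat = c.ravel()
    idx = np.argsort(np.abs(flat))[::-1][1:n + 1]      # n largest coefficients, skip DC
    w = np.random.default_rng(key).standard_normal(n)  # pseudo-random watermark sequence
    flat[idx] += alpha * np.abs(flat[idx]) * w          # strength scales with coefficient
    return idctn(c, norm="ortho"), idx, w

def detect(image, idx, w):
    """Blind detection: correlate received coefficients with the known watermark
    and return a z-like score (large positive => watermark present)."""
    c = dctn(image.astype(float), norm="ortho").ravel()
    response = c[idx] * w
    return response.mean() / (response.std() / np.sqrt(len(w)))

img = np.random.default_rng(1).integers(0, 256, (128, 128)).astype(float)
marked, idx, w = embed(img, key=42)
print("score, watermarked image: %.1f" % detect(marked, idx, w))
print("score, unmarked image:    %.1f" % detect(img, idx, w))
```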

  8. AGRICULTURAL EXTENSION.

    ERIC Educational Resources Information Center

    FARQUHAR, R.N.

    Australian agricultural extension has long emphasized technical advisory service at the expense of the socioeconomic aspects of farm production and farm life. Only in Tasmania has farm management been stressed. Demands for the whole-farm approach have produced a trend toward generalism for district officers in most states. The federal government,…

  9. Intraoperative Vagus Nerve Monitoring: A Transnasal Technique during Skull Base Surgery

    PubMed Central

    Schutt, Christopher A.; Paskhover, Boris; Judson, Benjamin L.

    2014-01-01

    Objectives Intraoperative vagus nerve monitoring during skull base surgery has been reported with the use of an oral nerve monitoring endotracheal tube. However, the intraoral presence of an endotracheal tube can limit exposure by its location in the operative field during transfacial approaches and by limiting superior mobilization of the mandible during transcervical approaches. We describe a transnasal vagus nerve monitoring technique. Design and Participants Ten patients underwent open skull base surgery. Surgical approaches included transcervical (five), transfacial/maxillary swing (three), and double mandibular osteotomy (two). The vagus nerve was identified, stimulated, and monitored in all cases. Main Outcome Measures Intraoperative nerve stimulation, pre- and postoperative vagus nerve function through the use of flexible laryngoscopy in conjunction with assessment of subjective symptoms of hoarseness, voice change, and swallowing difficulty. Results Three patients had extensive involvement of the nerve by tumor with complete postoperative nerve deficit, one patient had a transient deficit following dissection of tumor off of nerve with resolution, and the remaining patients had nerve preservation. One patient experienced minor epistaxis during monitor tube placement that was managed conservatively. Conclusions Transnasal vagal nerve monitoring is a simple method that allows for intraoperative monitoring during nerve preservation surgery without limiting surgical exposure. PMID:25844292

  10. A Cost Benefit Technique for R & D Based Information.

    ERIC Educational Resources Information Center

    Stern, B. T.

    A cost benefit technique consisting of the following five phases is proposed: (a) specific objectives of the service, (b) measurement of work flow, (c) work costing, (d) charge to users of the information service, and (e) equating demand and cost. In this approach, objectives are best stated by someone not routinely concerned with the individual…

  11. Ground-based intercomparison of nitric acid measurement techniques

    NASA Astrophysics Data System (ADS)

    Fehsenfeld, Fred C.; Huey, L. Greg; Sueper, Donna T.; Norton, Richard B.; Williams, Eric J.; Eisele, Fred L.; Mauldin, R. Lee; Tanner, David J.

    1998-02-01

    An informal intercomparison of gas-phase nitric acid (HNO3) measuring techniques was carried out. The intercomparison involved two new chemical ionization mass spectrometers (CIMSs) that have been developed for the measurement of HNO3 along with an older, more established filter pack (FP) technique. The filter pack was composed of a teflon prefilter which collected aerosols followed by a nylon filter which collected the gas-phase HNO3. The study was carried out during the late winter and early spring of 1996 at a site located on the western edge of the Denver metropolitan area. Throughout the study the two CIMS techniques were in general agreement. However, under certain conditions the HNO3 levels obtained from the nylon filter of the FP gave values for the gas-phase concentration of HNO3 that were somewhat higher than that recorded by the two CIMS systems. The formation of ammonium nitrate (NH4NO3) containing aerosols is common during the colder months in this area. An analysis of these results suggests that the HNO3 collected by the nylon filter in the FP suffers an interference associated with the disproportionation of NH4NO3 from aerosols containing that compound that were initially collected on the teflon prefilter. This problem with the FP technique has been suggested from results obtained in previous intercomparisons.

  12. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    NASA Astrophysics Data System (ADS)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power laws consistent with NESP and has attributes of universality, as it holds for a broad

  13. Wavelet-based techniques for the gamma-ray sky

    NASA Astrophysics Data System (ADS)

    McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias; Lee, Samuel K.

    2016-07-01

    We demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
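
    A toy version of this scale separation can be sketched with a 2-D discrete wavelet transform on a flat mock map (PyWavelets, db2 wavelet chosen arbitrarily); the authors' analysis operates on the real gamma-ray sky and may use a different wavelet family.

```python
# Toy illustration of scale separation with a 2-D wavelet decomposition
# (PyWavelets on a flat mock map, not the full gamma-ray sky).
import numpy as np
import pywt

rng = np.random.default_rng(0)
y, x = np.mgrid[0:256, 0:256]
diffuse = np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / (2 * 60.0 ** 2))   # broad "extended signal"
point_sources = np.zeros_like(diffuse)
point_sources[rng.integers(0, 256, 40), rng.integers(0, 256, 40)] = 5.0
sky = diffuse + point_sources + 0.05 * rng.standard_normal(diffuse.shape)

coeffs = pywt.wavedec2(sky, wavelet="db2", level=5)

# Keep only the coarsest approximation to isolate the extended emission.
smooth = [coeffs[0]] + [tuple(np.zeros_like(c) for c in detail)
                        for detail in coeffs[1:]]
extended_only = pywt.waverec2(smooth, wavelet="db2")
extended_only = extended_only[:sky.shape[0], :sky.shape[1]]   # guard against off-by-one padding
print("residual small-scale power:", np.var(sky - extended_only))
```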

  14. The Influence of an Extensive Inquiry-Based Field Experience on Pre-Service Elementary Student Teachers' Science Teaching Beliefs

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Sumita; Volk, Trudi; Lumpe, Andrew

    2009-06-01

    This study examined the effects of an extensive inquiry-based field experience on pre-service elementary teachers' personal agency beliefs, a composite measure of context beliefs and capability beliefs related to teaching science. The research combined quantitative and qualitative approaches and included an experimental group that utilized the inquiry method and a control group that used traditional teaching methods. Pre- and post-test scores for the experimental and control groups were compared. The context beliefs of both groups showed no significant change as a result of the experience. However, the control group's capability belief scores, lower than those of the experimental group to start with, declined significantly; the experimental group's scores remained unchanged. Thus, the inquiry-based field experience led to an increase in personal agency beliefs. The qualitative data suggested a new hypothesis that there is a spiral relationship among teachers' ability to establish communicative relationships with students, desire for personal growth and improvement, ability to implement multiple instructional strategies, and possession of substantive content knowledge. The study concludes that inquiry-based student teaching should be encouraged in the training of elementary school science teachers. However, the meaning and practice of the inquiry method should be clearly delineated to ensure its correct implementation in the classroom.

  15. Research and development of LANDSAT-based crop inventory techniques

    NASA Technical Reports Server (NTRS)

    Horvath, R.; Cicone, R. C.; Malila, W. A. (Principal Investigator)

    1982-01-01

    A wide spectrum of technology pertaining to the inventory of crops using LANDSAT without in situ training data is addressed. Methods considered include Bayesian based through-the-season methods, estimation technology based on analytical profile fitting methods, and expert-based computer aided methods. Although the research was conducted using U.S. data, the adaptation of the technology to the Southern Hemisphere, especially Argentina was considered.

  16. Estimations of One Repetition Maximum and Isometric Peak Torque in Knee Extension Based on the Relationship Between Force and Velocity.

    PubMed

    Sugiura, Yoshito; Hatanaka, Yasuhiko; Arai, Tomoaki; Sakurai, Hiroaki; Kanada, Yoshikiyo

    2016-04-01

    We aimed to investigate whether a linear regression formula based on the relationship between joint torque and angular velocity measured using a high-speed video camera and image measurement software is effective for estimating 1 repetition maximum (1RM) and isometric peak torque in knee extension. Subjects comprised 20 healthy men (mean ± SD; age, 27.4 ± 4.9 years; height, 170.3 ± 4.4 cm; and body weight, 66.1 ± 10.9 kg). The exercise load ranged from 40% to 150% 1RM. Peak angular velocity (PAV) and peak torque were used to estimate 1RM and isometric peak torque. To elucidate the relationship between force and velocity in knee extension, the relationship between the relative proportion of 1RM (% 1RM) and PAV was examined using simple regression analysis. The concordance rate between the estimated value and actual measurement of 1RM and isometric peak torque was examined using intraclass correlation coefficients (ICCs). Reliability of the regression line of PAV and % 1RM was 0.95. The concordance rate between the actual measurement and estimated value of 1RM resulted in an ICC(2,1) of 0.93 and that of isometric peak torque had an ICC(2,1) of 0.87 and 0.86 for 6 and 3 levels of load, respectively. Our method for estimating 1RM was effective for decreasing the measurement time and reducing patients' burden. Additionally, isometric peak torque can be estimated using 3 levels of load, as we obtained the same results as those reported previously. We plan to expand the range of subjects and examine the generalizability of our results. PMID:26382131
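
    The load-velocity idea behind the estimation can be sketched as a simple linear regression that is extrapolated to a minimal velocity at 1RM; the loads, velocities and the v_min calibration constant below are invented placeholders, not the study's data or exact formula.

```python
# Sketch of the load-velocity approach: fit load against peak angular velocity
# (PAV) from submaximal trials, then extrapolate to estimate 1RM.
import numpy as np
from scipy import stats

pav  = np.array([6.2, 5.1, 4.0, 2.9, 1.8])     # rad/s at each trial (hypothetical)
load = np.array([20, 30, 40, 50, 60])          # kg lifted at each trial (hypothetical)

# Load is assumed to decrease linearly with PAV; the 1RM estimate is the load
# at which predicted PAV reaches the minimal velocity v_min observed at 100 %1RM
# (v_min is an assumed calibration constant here).
slope, intercept, r, p, se = stats.linregress(pav, load)
v_min = 0.9                                    # rad/s at 1RM (assumption)
one_rm_estimate = slope * v_min + intercept
print(f"r = {r:.3f}, estimated 1RM ~ {one_rm_estimate:.1f} kg")
```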

  18. Techniques for characterizing waveguide gratings and grating-based devices

    NASA Astrophysics Data System (ADS)

    Brinkmeyer, Ernst; Kieckbusch, Sven; Knappe, Frank

    2006-09-01

    Waveguide gratings used in laser technology, optical sensing or optical communications must serve different specific purposes and hence must have specific optical properties, which can be tailored to a large extent. Characterization methods are required not only to measure the actual effect of the Bragg grating or long-period grating under consideration but also to unveil its cause, i.e. to determine its spatial structure. This paper reviews the present status of the respective experimental characterization techniques. The methods emphasized rely on phase-sensitive reflectometry together with advanced inverse-scattering evaluation algorithms.

  19. Surgical Approaches Based on Biological Objectives: GTR versus GBR Techniques

    PubMed Central

    Pagni, Giorgio; Rasperini, Giulio

    2013-01-01

    Guided tissue regenerative (GTR) therapies are performed to regenerate previously lost tooth-supporting structure, thus maintaining the aesthetics and masticatory function of the available dentition. Alveolar ridge augmentation (guided bone regeneration, GBR) procedures intend to regain the alveolar bone lost following tooth extraction and/or periodontal disease. Several biomaterials and surgical approaches have been proposed. In this paper we report biomaterials and surgical techniques used for periodontal and bone regenerative procedures. Particular attention is given to highlighting the biological basis for the different therapeutic approaches. PMID:23843792

  20. Fast Multigrid Techniques in Total Variation-Based Image Reconstruction

    NASA Technical Reports Server (NTRS)

    Oman, Mary Ellen

    1996-01-01

    Existing multigrid techniques are used to effect an efficient method for reconstructing an image from noisy, blurred data. Total Variation minimization yields a nonlinear integro-differential equation which, when discretized using cell-centered finite differences, yields a full matrix equation. A fixed point iteration is applied with the intermediate matrix equations solved via a preconditioned conjugate gradient method which utilizes multi-level quadrature (due to Brandt and Lubrecht) to apply the integral operator and a multigrid scheme (due to Ewing and Shen) to invert the differential operator. With effective preconditioning, the method presented seems to require O(n) operations. Numerical results are given for a two-dimensional example.
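
    As a greatly simplified stand-in for the solver described above, the sketch below minimizes a smoothed Total Variation denoising objective by explicit gradient descent; it omits the blur operator, the fixed point iteration, the multigrid components and the preconditioned conjugate gradient solver.

```python
# Simplified Total Variation minimization: explicit gradient descent on a
# denoising-only objective with a smoothed TV term.
import numpy as np

def tv_denoise(f, lam=0.1, beta=1e-2, tau=0.1, iters=200):
    """Gradient descent on 0.5*||u - f||^2 + lam * sum(sqrt(|grad u|^2 + beta))."""
    u = f.copy()
    for _ in range(iters):
        # forward differences with replicated (Neumann-like) boundaries
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux ** 2 + uy ** 2 + beta)
        px, py = ux / mag, uy / mag
        # discrete divergence (backward differences) of the normalized gradient
        div = (np.diff(px, axis=1, prepend=px[:, :1])
               + np.diff(py, axis=0, prepend=py[:1, :]))
        u -= tau * ((u - f) - lam * div)
    return u

rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0    # piecewise-constant test image
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = tv_denoise(noisy)
print("MSE vs clean, noisy: %.4f  denoised: %.4f"
      % (np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2)))
```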

  1. Applying Knowledge-Based Techniques to Software Development.

    ERIC Educational Resources Information Center

    Harandi, Mehdi T.

    1986-01-01

    Reviews the overall structure and design principles of a knowledge-based programming support tool, the Knowledge-Based Programming Assistant, which is being developed at the University of Illinois Urbana-Champaign. The system's major units (program design, program coding, and intelligent debugging) and additional functions are described. (MBR)

  2. Bioluminescence-based imaging technique for pressure measurement in water

    NASA Astrophysics Data System (ADS)

    Watanabe, Yasunori; Tanaka, Yasufumi

    2011-07-01

    The dinoflagellate Pyrocystis lunula emits light in response to water motion. We developed a new imaging technique for measuring pressure using plankton that emits light in response to mechanical stimulation. The bioluminescence emitted by P. lunula was used to measure impact water pressure produced using weight-drop tests. The maximum mean luminescence intensity correlated with the maximum impact pressure that the cells receive when the circadian and diurnal biological rhythms are appropriately controlled. Thus, with appropriate calibration of experimentally determined parameters, the dynamic impact pressure can be estimated by measuring the cell-flash distribution. Statistical features of the evolution of flash intensity and the probability distribution during the impacting event, which are described by both biological and mechanical response parameters, are also discussed in this paper. The practical applicability of this bioluminescence imaging technique is examined through a water drop test. The maximum dynamic pressure, occurring at the impact of a water jet against a wall, was estimated from the flash intensity of the dinoflagellate.

  3. Computer-vision-based registration techniques for augmented reality

    NASA Astrophysics Data System (ADS)

    Hoff, William A.; Nguyen, Khoi; Lyon, Torsten

    1996-10-01

    Augmented reality is a term used to describe systems in which computer-generated information is superimposed on top of the real world; for example, through the use of a see-through head-mounted display. A human user of such a system could still see and interact with the real world, but have valuable additional information, such as descriptions of important features or instructions for performing physical tasks, superimposed on the world. For example, the computer could identify real-world objects and overlay them with graphic outlines, labels, and schematics. The graphics are registered to the real-world objects and appear to be 'painted' onto those objects. Augmented reality systems can be used to make productivity aids for tasks such as inspection, manufacturing, and navigation. One of the most critical requirements for augmented reality is to recognize and locate real-world objects with respect to the person's head. Accurate registration is necessary in order to overlay graphics accurately on top of the real-world objects. At the Colorado School of Mines, we have developed a prototype augmented reality system that uses head-mounted cameras and computer vision techniques to accurately register the head to the scene. The current system locates and tracks a set of pre-placed passive fiducial targets placed on the real-world objects. The system computes the pose of the objects and displays graphics overlays using a see-through head-mounted display. This paper describes the architecture of the system and outlines the computer vision techniques used.
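
    The registration step can be illustrated with a standard perspective-n-point pose estimate from fiducial correspondences; the sketch below uses OpenCV's solvePnP with made-up fiducial coordinates and camera intrinsics rather than the system's actual calibration.

```python
# Fiducial-based registration sketch: estimate the camera pose from known 3-D
# fiducial positions and their detected 2-D image locations with solvePnP.
import numpy as np
import cv2

# 3-D fiducial positions in the object/world frame (metres, made up).
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.2, 0.0, 0.0],
                          [0.2, 0.1, 0.0],
                          [0.0, 0.1, 0.0],
                          [0.1, 0.05, 0.05]], dtype=np.float64)

# Corresponding fiducial centroids detected in the image (pixels, made up).
image_points = np.array([[320.0, 240.0], [420.0, 238.0], [421.0, 190.0],
                         [321.0, 191.0], [371.0, 212.0]], dtype=np.float64)

K = np.array([[800.0, 0.0, 320.0],      # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                       # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)               # rotation of the object frame w.r.t. the camera
print("pose found:", ok, "\nR =\n", R, "\nt =", tvec.ravel())
```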

  4. Planning/scheduling techniques for VQ-based image compression

    NASA Technical Reports Server (NTRS)

    Short, Nicholas M., Jr.; Manohar, Mareboyana; Tilton, James C.

    1994-01-01

    The enormous size of the data holdings and the complexity of the information system resulting from the EOS system pose several challenges to computer scientists, one of which is data archival and dissemination. More than ninety percent of the data holdings of NASA are in the form of images which will be accessed by users across the computer networks. Accessing the image data in its full resolution creates data traffic problems. Image browsing using a lossy compression reduces this data traffic, as well as storage, by a factor of 30-40. Of the several image compression techniques, VQ is most appropriate for this application since the decompression of VQ-compressed images is a table lookup process which makes minimal additional demands on the user's computational resources. Lossy compression of image data needs expert-level knowledge in general and is not straightforward to use. This is especially true in the case of VQ: it involves the selection of appropriate codebooks for a given data set and of vector dimensions for each compression ratio, etc. A planning and scheduling system is described for using the VQ compression technique in the data access and ingest of raw satellite data.
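
    The core VQ encode/decode step referred to above can be sketched with a k-means codebook over small image blocks; the planning/scheduling layer that selects codebooks and vector dimensions per data set is not shown, and the image and codebook size below are arbitrary.

```python
# Vector quantization sketch: learn a codebook over 4x4 image blocks with
# k-means and encode each block as a codebook index (table-lookup decoding).
import numpy as np
from sklearn.cluster import KMeans

def to_blocks(img, b=4):
    h, w = img.shape
    return (img[:h - h % b, :w - w % b]
            .reshape(h // b, b, w // b, b)
            .swapaxes(1, 2)
            .reshape(-1, b * b))

rng = np.random.default_rng(0)
image = rng.integers(0, 256, (128, 128)).astype(float)    # stand-in for real satellite data

blocks = to_blocks(image)
codebook_size = 64
kmeans = KMeans(n_clusters=codebook_size, n_init=4, random_state=0).fit(blocks)

indices = kmeans.predict(blocks)                           # stored/transmitted codes
decoded_blocks = kmeans.cluster_centers_[indices]          # table-lookup decompression
mse = np.mean((blocks - decoded_blocks) ** 2)
print(f"codebook={codebook_size}, blocks={len(blocks)}, reconstruction MSE={mse:.1f}")
```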

  5. Radiation synthesized protein-based nanoparticles: A technique overview

    NASA Astrophysics Data System (ADS)

    Varca, Gustavo H. C.; Perossi, Gabriela G.; Grasselli, Mariano; Lugão, Ademar B.

    2014-12-01

    In the search for alternative routes to protein engineering, a novel technique - radiation-induced synthesis of protein nanoparticles - that achieves size-controlled particles with preserved bioactivity has recently been reported. This work aimed to evaluate different process conditions in order to optimize the technique and to provide an overview of it using γ-irradiation. Papain was used as a model protease, and the samples were irradiated in a gamma cell irradiator in phosphate buffer (pH=7.0) containing ethanol (0-35%). The dose effect was evaluated by exposure to distinct γ-irradiation doses (2.5, 5, 7.5 and 10 kGy), and scale-up experiments involving distinct protein concentrations (12.5-50 mg mL-1) were also performed. Characterization involved size monitoring using dynamic light scattering. Bityrosine detection was performed using fluorescence measurements in order to provide experimental evidence of the mechanism involved. The best dose effects with regard to size were achieved at 10 kGy, and no relevant changes were observed as a function of papain concentration, highlighting a very broad operational concentration range. Bityrosine changes were identified for the samples as a function of the process, confirming that such linkages play an important role in nanoparticle formation.

  6. Satellite communication performance evaluation: Computational techniques based on moments

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1980-01-01

    Computational techniques that efficiently compute bit error probabilities when only moments of the various interference random variables are available are presented. The approach taken is a generalization of the well known Gauss-Quadrature rules used for numerically evaluating single or multiple integrals. In what follows, basic algorithms are developed. Some of its properties and generalizations are shown and its many potential applications are described. Some typical interference scenarios for which the results are particularly applicable include: intentional jamming, adjacent and cochannel interferences; radar pulses (RFI); multipath; and intersymbol interference. While the examples presented stress evaluation of bit error probilities in uncoded digital communication systems, the moment techniques can also be applied to the evaluation of other parameters, such as computational cutoff rate under both normal and mismatched receiver cases in coded systems. Another important application is the determination of the probability distributions of the output of a discrete time dynamical system. This type of model occurs widely in control systems, queueing systems, and synchronization systems (e.g., discrete phase locked loops).
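
    The kind of moment-based quadrature rule that this approach generalizes can be sketched via the Golub-Welsch construction from a Hankel matrix of moments; the check below uses moments of the uniform density on [-1, 1], not an actual interference scenario.

```python
# Gauss quadrature rule built directly from moments (Golub-Welsch via the
# Hankel moment matrix), checked against the uniform density on [-1, 1].
import numpy as np

def gauss_from_moments(m):
    """m = [m_0, ..., m_{2N}] moments of a normalized density; returns N nodes/weights."""
    N = (len(m) - 1) // 2
    H = np.array([[m[i + j] for j in range(N + 1)] for i in range(N + 1)])
    R = np.linalg.cholesky(H).T                      # upper-triangular factor, H = R^T R
    alpha = np.array([R[k, k + 1] / R[k, k]
                      - (R[k - 1, k] / R[k - 1, k - 1] if k > 0 else 0.0)
                      for k in range(N)])
    beta = np.array([R[k + 1, k + 1] / R[k, k] for k in range(N - 1)])
    J = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)   # Jacobi matrix
    nodes, vecs = np.linalg.eigh(J)
    weights = m[0] * vecs[0, :] ** 2
    return nodes, weights

# Moments of the uniform density on [-1, 1]: m_k = (1 + (-1)^k) / (2 (k + 1)).
m = [(1 + (-1) ** k) / (2 * (k + 1)) for k in range(7)]         # m_0 ... m_6, N = 3
x, w = gauss_from_moments(m)
print("nodes:", x, "weights:", w)
print("E[x^4] exact vs quadrature:", 1 / 5, np.sum(w * x ** 4))
```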

  7. A neighbourhood analysis based technique for real-time error concealment in H.264 intra pictures

    NASA Astrophysics Data System (ADS)

    Beesley, Steven T. C.; Grecos, Christos; Edirisinghe, Eran

    2007-02-01

    H.264's extensive use of context-based adaptive binary arithmetic or variable length coding makes streams highly susceptible to channel errors, a common occurrence over networks such as those used by mobile devices. Even a single bit error will cause a decoder to discard all stream data up to the next fixed-length resynchronisation point; in the worst case an entire slice is lost. In cases where retransmission and forward error concealment are not possible, a decoder should conceal any erroneous data in order to minimise the impact on the viewer. Stream errors can often be spotted early in the decode cycle of a macroblock which, if aborted, frees up processor cycles; these can instead be used to conceal errors at minimal cost, even as part of a real-time system. This paper demonstrates a technique that utilises Sobel convolution kernels to quickly analyse the neighbourhood surrounding erroneous macroblocks before performing a weighted multi-directional interpolation. This generates significantly improved statistical (PSNR) and visual (IEEE structural similarity) results when compared to the commonly used weighted pixel value averaging. Furthermore it is also computationally scalable, both during analysis and concealment, achieving maximum performance from the spare processing power available.
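
    A greatly simplified, two-direction variant of this neighbourhood analysis is sketched below: Sobel gradients around a lost macroblock decide whether the block is filled by vertical or horizontal interpolation. The paper's weighted multi-directional interpolation and its real-time scheduling are not reproduced, and the toy frame is synthetic.

```python
# Simplified concealment sketch: Sobel analysis of the neighbourhood around a
# lost 16x16 macroblock selects vertical or horizontal interpolation.
import numpy as np
from scipy import ndimage

def conceal_block(frame, top, left, size=16):
    y0, y1, x0, x1 = top, top + size, left, left + size
    # In a real decoder only correctly received neighbours would be analysed.
    ring = frame[y0 - 4:y1 + 4, x0 - 4:x1 + 4].astype(float)

    gx = ndimage.sobel(ring, axis=1)     # horizontal intensity changes
    gy = ndimage.sobel(ring, axis=0)     # vertical intensity changes
    # Strong horizontal gradients (gx) indicate vertical edges; preserve them by
    # interpolating vertically between the rows above and below, and vice versa.
    vertical_edges = np.abs(gx).sum() > np.abs(gy).sum()

    out = frame.astype(float).copy()
    if vertical_edges:
        a, b = frame[y0 - 1, x0:x1].astype(float), frame[y1, x0:x1].astype(float)
        t = (np.arange(size) + 1)[:, None] / (size + 1)
        out[y0:y1, x0:x1] = (1 - t) * a + t * b
    else:
        a, b = frame[y0:y1, x0 - 1].astype(float), frame[y0:y1, x1].astype(float)
        t = (np.arange(size) + 1)[None, :] / (size + 1)
        out[y0:y1, x0:x1] = (1 - t) * a[:, None] + t * b[:, None]
    return out

frame = np.tile(np.linspace(0, 255, 64), (64, 1))   # toy frame varying only horizontally
concealed = conceal_block(frame, top=24, left=24)
print("mean concealment error:", np.abs(concealed - frame)[24:40, 24:40].mean())
```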

  8. Combining Phylogenetic Profiling-Based and Machine Learning-Based Techniques to Predict Functional Related Proteins

    PubMed Central

    Lin, Tzu-Wen; Wu, Jian-Wei; Chang, Darby Tien-Hao

    2013-01-01

    Annotating protein functions and linking proteins with similar functions are important in systems biology. The rapid growth rate of newly sequenced genomes calls for the development of computational methods to help experimental techniques. Phylogenetic profiling (PP) is a method that exploits the evolutionary co-occurrence pattern to identify functional related proteins. However, PP-based methods delivered satisfactory performance only on prokaryotes but not on eukaryotes. This study proposed a two-stage framework to predict protein functional linkages, which successfully enhances a PP-based method with machine learning. The experimental results show that the proposed two-stage framework achieved the best overall performance in comparison with three PP-based methods. PMID:24069454

  9. Study of systems and techniques for data base management

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Data management areas were studied to identify pertinent problems and issues that will affect future NASA data users in terms of performance and cost. Specific topics discussed include the identification of potential NASA data users other than those normally discussed, considerations affecting the clustering of minicomputers, low-cost computer systems for information retrieval and analysis, the testing of minicomputer-based data base management systems, ongoing work related to the use of dedicated systems for data base management, and the problems of data interchange among a community of NASA data users.

  10. Tornado wind-loading requirements based on risk assessment techniques

    SciTech Connect

    Deobald, T.L.; Coles, G.A.; Smith, G.L.

    1991-06-01

    Regulations require that nuclear power plants be protected from tornado winds. If struck by a tornado, a plant must be capable of safely shutting down and removing decay heat. Probabilistic techniques are used to show that risk to the public from the US Department of Energy (DOE) SP-100 reactor is acceptable without tornado hardening parts of the secondary system. Relaxed requirements for design wind loadings will result in significant cost savings. To demonstrate an acceptable level of risk, this document examines tornado-initiated accidents. The two tornado-initiated accidents examined in detail are loss of cooling resulting in core damage and loss of secondary system boundary integrity leading to sodium release. Loss of core cooling is analyzed using fault/event tree models. Loss of secondary system boundary integrity is analyzed by comparing the consequences to acceptance criteria for the release of radioactive material or alkali metal aerosol. 4 refs., 4 figs.

  11. A fast Stokes inversion technique based on quadratic regression

    NASA Astrophysics Data System (ADS)

    Teng, Fei; Deng, Yuan-Yong

    2016-05-01

    Stokes inversion calculation is a key process in resolving polarization information on radiation from the Sun and obtaining the associated vector magnetic fields. Even in the cases of simple local thermodynamic equilibrium (LTE) and where the Milne-Eddington approximation is valid, the inversion problem may not be easy to solve. The initial values for the iterations are important in handling the case with multiple minima. In this paper, we develop a fast inversion technique without iterations. The time taken for computation is only 1/100 the time that the iterative algorithm takes. In addition, it can provide available initial values even in cases with lower spectral resolutions. This strategy is useful for a filter-type Stokes spectrograph, such as SDO/HMI and the developed two-dimensional real-time spectrograph (2DS).

  12. Conductivity-Based Detection Techniques in Nanofluidic Devices

    PubMed Central

    Harms, Zachary D.; Haywood, Daniel G.; Kneller, Andrew R.

    2016-01-01

    This review covers conductivity detection in fabricated nanochannels and nanopores. Improvements in nanoscale sensing are a direct result of advances in fabrication techniques, which produce devices with channels and pores with reproducible dimensions and in a variety of materials. Analytes of interest are detected by measuring changes in conductance as the analyte accumulates in the channel or passes transiently through the pore. These detection methods take advantage of phenomena enhanced at the nanoscale, such as ion current rectification, surface conductance, and dimensions comparable to the analytes of interest. The end result is the development of sensing technologies for a broad range of analytes, e.g., ions, small molecules, proteins, nucleic acids, and particles. PMID:25988434

  13. An optical image segmentor using neural based wavelet filtering techniques

    NASA Astrophysics Data System (ADS)

    Veronin, Christopher P.; Rogers, Steven K.; Kabrisky, Matthew; Priddy, Kevin L.; Ayer, Kevin W.

    1991-10-01

    This paper presents a neural based optical image segmentation scheme for locating potential targets in cluttered FLIR images. The advantage of such a scheme is speed, i.e., the speed of light. Such a design is critical to achieve real-time segmentation and classification for machine vision applications. The segmentation scheme used was based on texture discrimination and employed biologically based orientation specific filters (wavelet filters) as its main component. These filters are well understood impulse response functions of mammalian vision systems from input to striate cortex. By using the proper choice of aperture pair separation, dilation, and orientation, targets in FLIR imagery were optically segmented. Wavelet filtering is illustrated for glass template slides, as well as segmentation for static and real-time FLIR imagery displayed on a liquid crystal television.

  14. Optical image segmentation using neural-based wavelet filtering techniques

    NASA Astrophysics Data System (ADS)

    Veronin, Christopher P.; Priddy, Kevin L.; Rogers, Steven K.; Ayer, Kevin W.; Kabrisky, Matthew; Welsh, Byron M.

    1992-02-01

    This paper presents a neural based optical image segmentation scheme for locating potential targets in cluttered FLIR images. The advantage of such a scheme is speed, i.e., the speed of light. Such a design is critical to achieve real-time segmentation and classification for machine vision applications. The segmentation scheme used was based on texture discrimination and employed biologically based orientation specific filters (wavelet filters) as its main component. These filters are well understood impulse response functions of mammalian vision systems from input to striate cortex. By using the proper choice of aperture pair separation, dilation, and orientation, targets in FLIR imagery were optically segmented. Wavelet filtering is illustrated for glass template slides, as well as segmentation for static and real-time FLIR imagery displayed on a liquid crystal television.

  15. Large area photodetector based on microwave cavity perturbation techniques

    SciTech Connect

    Braggio, C.; Carugno, G.; Sirugudu, R. K.; Lombardi, A.; Ruoso, G.

    2014-07-28

    We present a preliminary study to develop a large area photodetector, based on a semiconductor crystal placed inside a superconducting resonant cavity. Laser pulses are detected through a variation of the cavity impedance, as a consequence of the conductivity change in the semiconductor. A novel method, whereby the designed photodetector is simulated by finite element analysis, makes it possible to perform pulse-height spectroscopy on the reflected microwave signals. We measure an energy sensitivity of 100 fJ in the average mode without the employment of low noise electronics and suggest possible ways to further reduce the single-shot detection threshold, based on the results of the described method.

  16. BCC skin cancer diagnosis based on texture analysis techniques

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Chang, Wen-Yu; Chen, Gwo-Shing; Huang, Adam; Li, Jiang; McKenzie, Frederic D.

    2011-03-01

    In this paper, we present a texture analysis based method for diagnosing the Basal Cell Carcinoma (BCC) skin cancer using optical images taken from the suspicious skin regions. We first extracted the Run Length Matrix and Haralick texture features from the images and used a feature selection algorithm to identify the most effective feature set for the diagnosis. We then utilized a Multi-Layer Perceptron (MLP) classifier to classify the images to BCC or normal cases. Experiments showed that detecting BCC cancer based on optical images is feasible. The best sensitivity and specificity we achieved on our data set were 94% and 95%, respectively.
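
    The feature-plus-classifier pipeline can be sketched with grey-level co-occurrence (Haralick-style) features feeding a multi-layer perceptron; the patches below are synthetic stand-ins for skin images, and the paper's run-length features and feature-selection step are omitted.

```python
# Texture-classification sketch: GLCM (Haralick-style) features + MLP classifier.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def haralick_features(patch):
    glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):                   # 0: "normal"-like, 1: "lesion"-like texture
    for _ in range(100):
        base = rng.integers(0, 256, (32, 32))
        if label:                      # coarser texture for the positive class
            base = np.repeat(np.repeat(base[::4, ::4], 4, axis=0), 4, axis=1)
        X.append(haralick_features(base.astype(np.uint8)))
        y.append(label)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y), random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```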

  17. Evaluation of SGML-based Information through Fuzzy Techniques.

    ERIC Educational Resources Information Center

    Fontana, Francesca Arcelli

    2001-01-01

    Discussion of knowledge management, information retrieval, information filtering, and information evaluation focuses on knowledge evaluation and proposes some evaluation methods based on L-grammars which are fuzzy grammars. Applies these methods to the evaluation of documents in SGML and to the evaluation of pages in HTML in the World Wide Web.…

  18. Problem-Based Learning Supported by Semantic Techniques

    ERIC Educational Resources Information Center

    Lozano, Esther; Gracia, Jorge; Corcho, Oscar; Noble, Richard A.; Gómez-Pérez, Asunción

    2015-01-01

    Problem-based learning has been applied over the last three decades to a diverse range of learning environments. In this educational approach, different problems are posed to the learners so that they can develop different solutions while learning about the problem domain. When applied to conceptual modelling, and particularly to Qualitative…

  19. Affinity-based screening techniques for enhancing lead discovery.

    PubMed

    Comess, Kenneth M; Schurdak, Mark E

    2004-07-01

    Contemporary, rational small-molecule lead discovery methods, comprising target identification, assay development, high-throughput screening (HTS), hit characterization and medicinal chemistry optimization, dominate early-stage drug discovery strategies in many pharmaceutical companies. There is a growing disparity between the increasing cost of funding these methods and the decreasing number of new drugs reaching the market. New strategies must be adopted to reverse this trend. The use of genomics- and proteomics-based target discovery efforts can aid the process by dramatically increasing the number of novel, more highly validated targets entering the discovery process, but HTS must meet this increased demand with faster, cheaper technologies. Although activity-based screening strategies are typically efficient, allowing one scientist to interrogate tens of thousands of compounds per day, affinity-based screening strategies can allow much greater efficiency in the overall process. Affinity-based methods can play a role in both facilitating the screening of a greater number of targets and in efficiently characterizing the primary hits discovered.

  20. Key techniques for space-based solar pumped semiconductor lasers

    NASA Astrophysics Data System (ADS)

    He, Yang; Xiong, Sheng-jun; Liu, Xiao-long; Han, Wei-hua

    2014-12-01

    In space, laser transmission is free of atmospheric turbulence, absorption, dispersion and aerosol effects. Therefore, space-based lasers have important value in satellite communication, satellite attitude control, space debris clearing, long-distance energy transmission, etc. On the other hand, solar energy is a clean and renewable resource; the average intensity of solar irradiation on the Earth is 1353 W/m2, and it is even higher in space. Therefore, space-based solar pumped lasers have attracted much research in recent years; most work focuses on solar pumped solid-state lasers and solar pumped fiber lasers. Both lasing principles are based on stimulated emission of rare earth ions such as Nd, Yb, and Cr. The rare earth ions absorb light only in narrow bands. This leads to inefficient absorption of the broad-band solar spectrum and increases the system heating load, which makes the solar-to-laser power conversion efficiency very low. A solar pumped semiconductor laser, by contrast, can absorb all photons with energy greater than the bandgap. Thus, solar pumped semiconductor lasers could have considerably higher efficiencies than other solar pumped lasers. Besides, solar pumped semiconductor lasers have smaller chip volume, simpler structure and better heat dissipation; they can be mounted on a small satellite platform and composed into satellite arrays, which can greatly improve the output power of the system, and they offer flexibility. This paper summarizes the research progress of space-based solar pumped semiconductor lasers and analyses the key technologies for several application areas, including the processing of the semiconductor chip, the design of small and efficient solar condensers, and the cooling system of the lasers. We conclude that solar pumped vertical cavity surface-emitting semiconductor lasers will have wide application prospects in space.

  1. Kernel-based machine learning techniques for infrasound signal classification

    NASA Astrophysics Data System (ADS)

    Tuma, Matthias; Igel, Christian; Mialle, Pierrick

    2014-05-01

    Infrasound monitoring is one of four remote sensing technologies continuously employed by the CTBTO Preparatory Commission. The CTBTO's infrasound network is designed to monitor the Earth for potential evidence of atmospheric or shallow underground nuclear explosions. Upon completion, it will comprise 60 infrasound array stations distributed around the globe, of which 47 were certified in January 2014. Three stages can be identified in CTBTO infrasound data processing: automated processing at the level of single array stations, automated processing at the level of the overall global network, and interactive review by human analysts. At station level, the cross correlation-based PMCC algorithm is used for initial detection of coherent wavefronts. It produces estimates for trace velocity and azimuth of incoming wavefronts, as well as other descriptive features characterizing a signal. Detected arrivals are then categorized into potentially treaty-relevant versus noise-type signals by a rule-based expert system. This corresponds to a binary classification task at the level of station processing. In addition, incoming signals may be grouped according to their travel path in the atmosphere. The present work investigates automatic classification of infrasound arrivals by kernel-based pattern recognition methods. It aims to explore the potential of state-of-the-art machine learning methods vis-a-vis the current rule-based and task-tailored expert system. To this purpose, we first address the compilation of a representative, labeled reference benchmark dataset as a prerequisite for both classifier training and evaluation. Data representation is based on features extracted by the CTBTO's PMCC algorithm. As classifiers, we employ support vector machines (SVMs) in a supervised learning setting. Different SVM kernel functions are used and adapted through different hyperparameter optimization routines. The resulting performance is compared to several baseline classifiers. All
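
    The supervised kernel stage can be sketched with an RBF-kernel SVM whose hyperparameters are grid-searched; the features and labels below are random placeholders standing in for the PMCC-derived benchmark set described above.

```python
# SVM classification sketch with hyperparameter grid search on placeholder
# "PMCC-style" features (trace velocity, back-azimuth, dominant frequency).
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
n = 600
X = np.column_stack([rng.normal(340, 30, n),     # trace velocity (m/s)
                     rng.uniform(0, 360, n),     # back-azimuth (deg)
                     rng.lognormal(0, 1, n)])    # dominant frequency (Hz)
# Placeholder labels: treat low-frequency, near-acoustic-speed arrivals as "signal".
y = ((X[:, 2] < 1.0) & (np.abs(X[:, 0] - 340) < 20)).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)
grid = GridSearchCV(make_pipeline(StandardScaler(), SVC(kernel="rbf")),
                    param_grid={"svc__C": [1, 10, 100],
                                "svc__gamma": ["scale", 0.1, 1.0]},
                    cv=5)
grid.fit(Xtr, ytr)
print("best params:", grid.best_params_, "test accuracy:", grid.score(Xte, yte))
```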

  2. Acrylic Resin Molding Based Head Fixation Technique in Rodents.

    PubMed

    Roh, Mootaek; Lee, Kyungmin; Jang, Il-Sung; Suk, Kyoungho; Lee, Maan-Gee

    2016-01-12

    Head fixation is a technique for immobilizing an animal's head by attaching a head-post to the skull for rigid clamping. Traditional head fixation requires surgical attachment of metallic frames to the skull. The attached frames are then clamped to a stationary platform, resulting in immobilization of the head. However, metallic frames for head fixation have been technically difficult to design and implement in a general laboratory environment. In this study, we provide a novel head fixation method. Using a custom-made head fixation bar, a head mounter is constructed during implantation surgery. After the application of acrylic resin to affix implants such as electrodes and cannulas to the skull, additional resin is applied on top of that to build a mold matching the port of the fixation bar. The molded head mounter serves as a guide rail, and investigators can conveniently fixate the animal's head by inserting the head mounter into the port of the fixation bar. This method is easily applicable whenever implantation surgery using dental acrylics is necessary and might be useful for laboratories that cannot easily fabricate CNC-machined metal head-posts.

  3. Calculation of free fall trajectories based on numerical optimization techniques

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The development of a means of computing free-fall (nonthrusting) trajectories from one specified point in the solar system to another specified point in the solar system in a given amount of time was studied. The problem is that of solving a two-point boundary value problem for which the initial slope is unknown. Two standard methods of attack exist for solving two-point boundary value problems. The first method is known as the initial value or shooting method. The second method of attack for two-point boundary value problems is to approximate the nonlinear differential equations by an appropriate linearized set. Parts of both boundary value problem solution techniques described above are used. A complete velocity history is guessed such that the corresponding position history satisfies the given boundary conditions at the appropriate times. An iterative procedure is then followed until the last guessed velocity history and the velocity history obtained from integrating the acceleration history agree to some specified tolerance everywhere along the trajectory.
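
    The shooting approach outlined above can be sketched for two-body free-fall motion: guess the departure velocity, integrate the equations of motion, and drive the terminal position error to zero with a root finder. Units are normalized (mu = 1), the transfer geometry is made up, and convergence depends on the quality of the initial guess.

```python
# Shooting-method sketch for a two-point boundary value problem in free fall.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve

MU = 1.0                                         # gravitational parameter (normalized)

def two_body(t, s):
    r, v = s[:3], s[3:]
    return np.concatenate([v, -MU * r / np.linalg.norm(r) ** 3])

r0 = np.array([1.0, 0.0, 0.0])                   # departure position
rf = np.array([0.0, 1.2, 0.1])                   # arrival position
tof = 1.8                                        # required time of flight

def terminal_error(v0):
    sol = solve_ivp(two_body, (0.0, tof), np.concatenate([r0, v0]),
                    rtol=1e-9, atol=1e-9)
    return sol.y[:3, -1] - rf                    # miss vector at the arrival time

v0_guess = (rf - r0) / tof                       # crude straight-line initial guess
v0 = fsolve(terminal_error, v0_guess)
print("departure velocity:", v0, "residual:", terminal_error(v0))
```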

  4. Active-charging based powertrain control in series hybrid electric vehicles for efficiency improvement and battery lifetime extension

    NASA Astrophysics Data System (ADS)

    Zhang, Xi; Mi, Chris Chunting; Yin, Chengliang

    2014-01-01

    This paper presents a powertrain control strategy for a series hybrid electric vehicle (SHEV) based on the integrated design of an active charging scenario and fixed-boundary-layer sliding mode controllers (FBLSMCs). An optimized charging curve for the battery is predetermined rather than being subject to engine output and vehicle power demand, which is the inverse of the normal SHEV powertrain control process. This aims to remove surge and high-frequency charge currents, keep the battery in a high state-of-charge (SOC) region and avoid persistently high charge power, all of which are positive factors for battery lifetime extension. Two robust chattering-free FBLSMCs are then designed to keep the engine operating in its optimal efficiency area: one is in charge of engine speed control, and the other is for engine/generator torque control. Consequently, not only is fuel economy improved but battery life expectancy could also be extended. Finally, simulation and experimental results confirm the validity and application feasibility of the proposed strategy.
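
    The boundary-layer idea behind the FBLSMCs can be illustrated with a toy first-order engine-speed loop in which the switching term is saturated inside a fixed layer to suppress chattering; the plant model, gains and setpoint below are invented and far simpler than the vehicle controllers described.

```python
# Toy fixed-boundary-layer sliding mode controller: sign() is replaced by a
# saturation inside a fixed layer of width phi to avoid chattering.
import numpy as np

def sat(x):
    return np.clip(x, -1.0, 1.0)

J, b = 0.2, 0.05                              # assumed inertia and damping
w_ref, dt = 180.0, 0.001                      # target speed (rad/s), time step
k_switch, phi, lam = 8.0, 2.0, 5.0            # switching gain, layer width, surface slope

w, history = 0.0, []
for _ in range(6000):
    e = w_ref - w                             # tracking error (constant reference)
    s = lam * e                               # sliding surface for a 1st-order plant
    u = b * w + J * lam * e + k_switch * sat(s / phi)   # equivalent + switching control
    w += dt * (u - b * w) / J                 # plant: J*dw/dt = u - b*w
    history.append(w)

print("final speed: %.1f rad/s (target %.1f)" % (history[-1], w_ref))
```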

  5. Pathological leucocyte segmentation algorithm based on hyperspectral imaging technique

    NASA Astrophysics Data System (ADS)

    Guan, Yana; Li, Qingli; Wang, Yiting; Liu, Hongying; Zhu, Ziqiang

    2012-05-01

    White blood cells (WBCs) are significant components of the human blood system, and they have a pathological relationship with some blood-related diseases. To analyze this disease information accurately, the most essential task is to segment the WBCs. We propose a new method for pathological WBC segmentation based on a hyperspectral imaging system. This imaging system is used to capture WBC images and is characterized by acquiring 1-D spectral information and 2-D spatial information for each pixel. A spectral information divergence algorithm is presented to segment pathological WBCs into four parts. In order to evaluate the performance of the new approach, K-means and spectral angle mapper-based segmentation methods are tested for comparison on six groups of blood smears. Experimental results show that the presented method can segment pathological WBCs more accurately, regardless of their irregular shapes, sizes, and gray values.
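
    The spectral information divergence (SID) measure at the heart of the method can be sketched directly: each spectrum is normalized to a probability vector and the divergence is the sum of the two relative entropies. The hyperspectral cube and reference spectra below are random placeholders, not blood-smear data.

```python
# Spectral information divergence (SID) for per-pixel assignment to reference spectra.
import numpy as np

def sid(p_spec, q_spec, eps=1e-12):
    p = p_spec / (p_spec.sum() + eps)
    q = q_spec / (q_spec.sum() + eps)
    return (np.sum(p * np.log((p + eps) / (q + eps)))
            + np.sum(q * np.log((q + eps) / (p + eps))))

rng = np.random.default_rng(0)
bands = 60
cube = rng.random((100, 100, bands))            # hypothetical hyperspectral image
references = rng.random((4, bands))             # four class reference spectra (placeholders)

# Assign each pixel to the reference with the smallest divergence.
flat = cube.reshape(-1, bands)
scores = np.array([[sid(px, ref) for ref in references] for px in flat[:1000]])
labels = scores.argmin(axis=1)
print("label counts (first 1000 pixels):", np.bincount(labels, minlength=4))
```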

  6. Remaining Creep Life Assessment Techniques Based on Creep Cavitation Modeling

    NASA Astrophysics Data System (ADS)

    Ankit, Kumar

    2009-05-01

    A boiler and its components are built to an assumed nominal design with a reasonable operating life of about two to three decades (one to two hundred thousand hours). These units are generally replaced, or their life is extended, at the end of this period. Under normal operating conditions, after the initial period of teething troubles, the reliability of these units remains fairly constant for up to about two decades of normal operation. The failure rate then increases as a result of time-dependent material damage. Further running of these units may become uneconomical, and dangerous in some cases. In the following article, a step-by-step methodology to quantify creep cavitation based on statistical probability analysis and continuum damage mechanics is described. The concepts of creep cavity nucleation are also discussed, with a special emphasis on the need to develop a model based on creep cavity growth kinetics.

  7. Compressive spectrum sensing of radar pulses based on photonic techniques.

    PubMed

    Guo, Qiang; Liang, Yunhua; Chen, Minghua; Chen, Hongwei; Xie, Shizhong

    2015-02-23

    We present a photonic-assisted compressive sampling (CS) system which can acquire about 10^6 radar pulses per second spanning from 500 MHz to 5 GHz with a 520-MHz analog-to-digital converter (ADC). A rectangular pulse, a linear frequency modulated (LFM) pulse and a pulse stream are each reconstructed faithfully through this system with a sliding window-based recovery algorithm, demonstrating the feasibility of the proposed photonic-assisted CS system for spectral estimation of radar pulses.
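
    A generic compressive-sampling recovery can be sketched as follows: a sparse vector observed through a random projection is reconstructed with orthogonal matching pursuit. This stands in for, but is not, the photonic front end or the sliding window-based algorithm of the paper.

```python
# Generic compressive sampling illustration: random projection + OMP recovery.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 512, 128, 5                           # ambient dimension, measurements, sparsity

x = np.zeros(n)                                 # sparse "spectrum" to be sensed
support = rng.choice(n, k, replace=False)
x[support] = rng.uniform(1, 3, k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement matrix
y = Phi @ x                                     # m << n compressive measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False).fit(Phi, y)
print("true support:     ", np.sort(support))
print("recovered support:", np.sort(np.flatnonzero(omp.coef_)))
```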

  8. Clinical relevance of multiple single-nucleotide polymorphisms in Pneumocystis jirovecii Pneumonia: development of a multiplex PCR-single-base-extension methodology.

    PubMed

    Esteves, F; Gaspar, J; De Sousa, B; Antunes, F; Mansinho, K; Matos, O

    2011-05-01

    Pneumocystis jirovecii pneumonia (PcP) is a major cause of respiratory illness in patients with AIDS. The identification of multiple single-nucleotide polymorphisms (SNPs) at three distinct P. jirovecii loci encoding dihydrofolate reductase (DHFR), mitochondrial large-subunit rRNA (mtLSU rRNA), and superoxide dismutase (SOD) was achieved using multiplex-PCR (MPCR) followed by direct sequencing and two single-base extension (SBE) techniques. Four SNPs (DHFR312, mt85, SOD215, and SOD110), correlated previously with parameters of disease, were amplified and genotyped simultaneously. The concordance of results between the standard sequencing technique (direct sequencing) and SBE analysis was 96.9% for the acrylamide gel electrophoresis and 98.4% for the capillary electrophoresis. The cross-genetic analysis established several statistical associations among the SNPs studied: mt85C-SOD110T, SOD110T-SOD215C, and SOD110C-SOD215T. These results were confirmed by cluster analysis. Data showed that among the isolates with low to moderate parasite burden, the highest percentages of DHFR312C, mt85C, SOD110T, and SOD215C were detected, whereas for high parasite burden cases the highest frequencies were observed among isolates with DHFR312T, mt85T, SOD110C, and SOD215T. The polymorphisms studied were shown to be suitable genetic targets potentially correlated with PcP clinical data that can be used as predictors of outcome in further studies to help clinical decision-making in the management of PcP. The MPCR/SBE protocol described for the first time in the present study was shown to be a rapid, highly accurate method for genotyping P. jirovecii SNPs encoded by different loci that could be used for epidemiological studies and as an additional procedure for the prognostic classification and diagnosis of PcP.

  10. Study of hydrogen in coals, polymers, oxides, and muscle water by nuclear magnetic resonance; extension of solid-state high-resolution techniques. [Hydrogen molybdenum bronze]

    SciTech Connect

    Ryan, L.M.

    1981-10-01

    Nuclear magnetic resonance (NMR) spectroscopy has been an important analytical and physical research tool for several decades. One area of NMR which has undergone considerable development in recent years is high-resolution NMR of solids. In particular, high-resolution solid-state ¹³C NMR spectra exhibiting features similar to those observed in liquids are currently achievable using sophisticated pulse techniques. The work described in this thesis develops analogous methods for high-resolution ¹H NMR of rigid solids. Applications include characterization of hydrogen aromaticities in fossil fuels, and studies of hydrogen in oxides and bound water in muscle.

  11. An Extensive Survey of Dayside Diffuse Aurora (DDA) Based on Optical Observations at Yellow River Station (YRS)

    NASA Astrophysics Data System (ADS)

    Desheng, H.

    2015-12-01

    Using seven years of optical auroral observations obtained at Yellow River Station at Ny-Alesund, Svalbard, we performed the first extensive survey of dayside diffuse auroras (DDAs) and obtained the following observational results. (1) The DDAs can be classified into two broad categories, unstructured and structured DDAs. The unstructured DDAs are mainly distributed in the morning and afternoon, whereas the structured DDAs predominantly occur around magnetic local noon (MLN). (2) The unstructured DDAs observed in the morning and afternoon show clearly different properties; the afternoon ones are much more stable and seldom show pulsation. (3) The DDAs are more easily observed during geomagnetically quiet times. (4) The structured DDAs mainly show patchy, stripy, and irregular forms, and are often pulsating and drifting. The drift directions are mostly westward (with speeds of ~5 km/s), but there are cases of eastward or poleward drift. (5) The stripy DDAs are observed exclusively near the MLN and, most importantly, their alignments are confirmed to be consistent with the direction of ionospheric convection near the MLN. (6) A new auroral form, called throat aurora, is found to develop from the stripy DDAs. Based on these observational results and previous studies, we propose explanations for the DDAs. We suggest that the unstructured DDAs observed in the morning are extensions of the nightside diffuse aurora onto the dayside, whereas those observed in the afternoon are predominantly caused by proton precipitation. The structured DDAs occurring near the MLN are caused by interactions of cold plasma structures, thought to originate from ionospheric outflows or plasmaspheric drainage plumes, with hot electrons from the plasma sheet. We suppose that the cold plasma structures producing the patchy DDAs are lumpy and more likely come from the plasmaspheric drainage plumes. The cold plasma structure producing the stripy DDAs should

  12. [A Terahertz Spectral Database Based on Browser/Server Technique].

    PubMed

    Zhang, Zhuo-yong; Song, Yue

    2015-09-01

    With the solution of key scientific and technical problems and the development of instrumentation, the application of terahertz technology in various fields has attracted more and more attention. Owing to its unique advantages, terahertz technology shows broad promise for fast, non-destructive detection, as well as for many other fields. Terahertz technology combined with other complementary methods can be used to cope with many difficult practical problems which could not be solved before. Further development of practical terahertz detection methods depends critically on a good and reliable terahertz spectral database. We recently developed a browser/server (B/S)-based terahertz spectral database. We designed the main structure and main functions to fulfill practical requirements. The terahertz spectral database now includes more than 240 items, and the spectral information was collected from three sources: (1) collection and citation from other terahertz spectral databases abroad; (2) collection from the published literature; and (3) spectral data measured in our laboratory. The present paper introduces the basic structure and fundamental functions of the terahertz spectral database developed in our laboratory. One of the key functions of this THz database is the calculation of optical parameters. Some optical parameters, including the absorption coefficient and refractive index, can be calculated from the input THz time-domain spectra. The other main functions and searching methods of the browser/server-based terahertz spectral database are also discussed. The database search system provides users with convenient functions including user registration, queries, display of spectral figures and molecular structures, spectral matching, etc. The THz database system provides an on-line searching function for registered users. Registered users can compare an input THz spectrum with the spectra in the database, according to
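
    For context, such optical parameters are conventionally extracted from the complex transmission T(ω) = E_sample(ω)/E_ref(ω) formed from the Fourier transforms of the two time-domain waveforms; under the common thick-slab approximation (a generic textbook relation, not necessarily the exact formulas implemented in this database):

        n(\omega) \approx 1 + \frac{c\,\Delta\phi(\omega)}{\omega\,d},
        \qquad
        \alpha(\omega) \approx -\frac{2}{d}\,
        \ln\!\left[\frac{\bigl(n(\omega)+1\bigr)^{2}}{4\,n(\omega)}\,\lvert T(\omega)\rvert\right],

    where Δφ(ω) is the unwrapped phase of T(ω) and d is the sample thickness.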

  13. Grid Based Techniques for Visualization in the Geosciences

    NASA Astrophysics Data System (ADS)

    Bollig, E. F.; Sowell, B.; Lu, Z.; Erlebacher, G.; Yuen, D. A.

    2005-12-01

    As experiments and simulations in the geosciences grow larger and more complex, it has become increasingly important to develop methods of processing and sharing data in a distributed computing environment. In recent years, the scientific community has shown growing interest in exploiting the powerful assets of Grid computing to this end, but the complexity of the Grid has prevented many scientists from converting their applications and embracing this possibility. We are investigating methods for development and deployment of data extraction and visualization services across the NaradaBrokering [1] Grid infrastructure. With the help of gSOAP [2], we have developed a series of C/C++ services for wavelet transforms, earthquake clustering, and basic 3D visualization. We will demonstrate the deployment and collaboration of these services across a network of NaradaBrokering nodes, concentrating on the challenges faced in inter-service communication, service/client division, and particularly web service visualization. Renderings in a distributed environment can be handled in three ways: 1) the data extraction service computes and renders everything locally and sends results to the client as a bitmap image, 2) the data extraction service sends results to a separate visualization service for rendering, which in turn sends results to a client as a bitmap image, and 3) the client itself renders images locally. The first two options allow for large visualizations in a distributed and collaborative environment, but limit the interactivity of the client. To address this problem we are investigating the advantages of the JOGL OpenGL library [3] to perform renderings on the client side using the client's hardware for increased performance. We will present benchmarking results to ascertain the relative advantage of the three aforementioned techniques as a function of data size and visualization task. [1] The NaradaBrokering Project, http://www.naradabrokering.org [2] gSOAP: C/C++ Web

  14. An RSA-Based Leakage-Resilient Authenticated Key Exchange Protocol Secure against Replacement Attacks, and Its Extensions

    NASA Astrophysics Data System (ADS)

    Shin, Seonghan; Kobara, Kazukuni; Imai, Hideki

    Secure channels can be realized by an authenticated key exchange (AKE) protocol that generates authenticated session keys between the parties involved. In [32], Shin et al. proposed a new kind of AKE (RSA-AKE) protocol whose goal is to provide high efficiency and security against leakage of stored secrets as much as possible. Let us consider more powerful attacks where an adversary completely controls the communications and the stored secrets (the latter is denoted by “replacement” attacks). In this paper, we first show that the RSA-AKE protocol [32] is no longer secure against such an adversary. The main contributions of this paper are as follows: (1) we propose an RSA-based leakage-resilient AKE (RSA-AKE2) protocol that is secure against active attacks as well as replacement attacks; (2) we prove that the RSA-AKE2 protocol is secure against replacement attacks based on number-theoretic results; (3) we show that it is provably secure in the random oracle model, by showing the reduction to the RSA one-wayness, under an extended model that covers active attacks and replacement attacks; (4) in terms of efficiency, the RSA-AKE2 protocol is comparable to [32] in the sense that the client needs to compute only one modular multiplication with pre-computation; and (5) we also discuss extensions of the RSA-AKE2 protocol for several security properties (i.e., synchronization of stored secrets, privacy of client and solution to server compromise-impersonation attacks).

  15. Antenna pointing compensation based on precision optical measurement techniques

    NASA Technical Reports Server (NTRS)

    Schumacher, L. L.; Vivian, H. C.

    1988-01-01

    The pointing control loops of the Deep Space Network 70 meter antennas extend only to the Intermediate Reference Structure (IRS). Thus, distortion of the structure forward of the IRS due to unpredictable environmental loads can result in uncompensated boresight shifts which degrade blind pointing accuracy. A system is described which can provide real time bias commands to the pointing control system to compensate for environmental effects on blind pointing performance. The bias commands are computed in real time based on optical ranging measurements of the structure from the IRS to a number of selected points on the primary and secondary reflectors.

  16. Pseudorandom Noise Code-Based Technique for Thin Cloud Discrimination with CO2 and O2 Absorption Measurements

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Prasad, Narasimha S.; Flood, Michael A.

    2011-01-01

    NASA Langley Research Center is working on a continuous-wave (CW) laser based remote sensing scheme for the detection of CO2 and O2 from space-based platforms suitable for the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission. ASCENDS is a future space-based mission to determine the global distribution of sources and sinks of atmospheric carbon dioxide (CO2). A unique, multi-frequency, intensity-modulated CW (IMCW) laser absorption spectrometer (LAS) operating at 1.57 micron for CO2 sensing has been developed. Effective aerosol and cloud discrimination techniques are being investigated in order to determine concentration values with accuracies of better than 0.3%. In this paper, we discuss the demonstration of a pseudo-noise (PN) code based technique for cloud and aerosol discrimination applications. The possibility of using maximum-length (ML) sequences for range and absorption measurements is investigated. A simple model for accomplishing this objective is formulated. Proof-of-concept experiments carried out using a SONAR-based LIDAR simulator built from simple audio hardware provided promising results for extension into optical wavelengths.
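
    As a toy illustration of ML-sequence ranging (the feedback taps, delay, attenuation and noise level are arbitrary choices for the sketch, not the instrument's parameters):

        import numpy as np

        def mls(taps, nbits):
            """Generate a maximal-length sequence (mapped to +/-1) from a Fibonacci LFSR."""
            state = np.ones(nbits, dtype=int)
            seq = np.empty(2**nbits - 1, dtype=int)
            for i in range(seq.size):
                seq[i] = state[-1]
                feedback = 0
                for t in taps:
                    feedback ^= state[t - 1]
                state[1:] = state[:-1]
                state[0] = feedback
            return 2 * seq - 1  # map {0,1} -> {-1,+1}

        # 7-bit register, taps (7, 6): primitive feedback, 127-chip maximal-length code
        code = mls(taps=(7, 6), nbits=7)

        # simulate a return delayed by 40 samples with attenuation and additive noise
        delay, atten = 40, 0.3
        rx = np.zeros(3 * code.size)
        rx[delay:delay + code.size] += atten * code
        rx += 0.1 * np.random.randn(rx.size)

        # cross-correlate: the peak position gives the range (delay), its height the echo amplitude
        corr = np.correlate(rx, code, mode="valid")
        print("estimated delay:", int(np.argmax(corr)))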

  17. Image processing technique based on image understanding architecture

    NASA Astrophysics Data System (ADS)

    Kuvychko, Igor

    2000-12-01

    The effectiveness of image applications depends directly on their ability to resolve ambiguity and uncertainty in real images. That requires tight integration of low-level image processing with high-level knowledge-based reasoning, which is the essence of the image understanding problem. This article presents a generic computational framework for the solution of the image understanding problem -- the Spatial Turing Machine. Instead of a tape of symbols, it works with hierarchical networks dually represented as discrete and continuous structures. The dual representation provides a natural transformation of continuous image information into discrete structures, making it available for analysis. Such structures are data and algorithms at the same time and can perform the graph and diagrammatic operations that are the basis of intelligence. They can create derivative structures that play the role of context, or a 'measurement device,' giving the ability to analyze and to run top-down algorithms. Symbols naturally emerge there, and symbolic operations work in combination with new, simplified methods of computational intelligence. That makes images and scenes self-describing and provides flexible ways of resolving uncertainty. Classification of images truly invariant to any transformation could be done by matching their derivative structures. The proposed architecture does not require supercomputers, opening the way to new image technologies.

  18. Immobilization, stabilization and patterning techniques for enzyme based sensor systems.

    SciTech Connect

    Flounders, A.W.; Carichner, S.C.; Singh, A.K.; Volponi, J.V.; Schoeniger, J.S.; Wally, K.

    1997-01-01

    Sandia National Laboratories has recently opened the Chemical and Radiation Detection Laboratory (CRDL) in Livermore CA to address the detection needs of a variety of government agencies (e.g., Department of Energy, Environmental Protection Agency, Department of Agriculture) as well as provide a fertile environment for the cooperative development of new industrial technologies. This laboratory consolidates a variety of existing chemical and radiation detection efforts and enables Sandia to expand into the novel area of biochemically based sensors. One aspect of this biosensor effort is further development and optimization of enzyme modified field effect transistors (EnFETs). Recent work has focused upon covalent attachment of enzymes to silicon dioxide and silicon nitride surfaces for EnFET fabrication. They are also investigating methods to pattern immobilized proteins; a critical component for development of array-based sensor systems. Novel enzyme stabilization procedures are key to patterning immobilized enzyme layers while maintaining enzyme activity. Results related to maximized enzyme loading, optimized enzyme activity and fluorescent imaging of patterned surfaces will be presented.

  19. On-line hydrogen-isotope measurements of organic samples using elemental chromium: an extension for high temperature elemental-analyzer techniques.

    PubMed

    Gehre, Matthias; Renpenning, Julian; Gilevska, Tetyana; Qi, Haiping; Coplen, Tyler B; Meijer, Harro A J; Brand, Willi A; Schimmelmann, Arndt

    2015-01-01

    The high temperature conversion (HTC) technique using an elemental analyzer with a glassy carbon tube and filling (temperature conversion/elemental analysis, TC/EA) is a widely used method for hydrogen isotopic analysis of water and many solid and liquid organic samples with analysis by isotope-ratio mass spectrometry (IRMS). However, the TC/EA IRMS method may produce inaccurate δ(2)H results, with values deviating by more than 20 mUr (milliurey = 0.001 = 1‰) from the true value for some materials. We show that a single-oven, chromium-filled elemental analyzer coupled to an IRMS substantially improves the measurement quality and reliability for hydrogen isotopic compositions of organic substances (Cr-EA method). Hot chromium maximizes the yield of molecular hydrogen in a helium carrier gas by irreversibly and quantitatively scavenging all reactive elements except hydrogen. In contrast, under TC/EA conditions, heteroelements like nitrogen or chlorine (and other halogens) can form hydrogen cyanide (HCN) or hydrogen chloride (HCl) and this can cause isotopic fractionation. The Cr-EA technique thus expands the analytical possibilities for on-line hydrogen-isotope measurements of organic samples significantly. This method yielded reproducibility values (1-sigma) for δ(2)H measurements on water and caffeine samples of better than 1.0 and 0.5 mUr, respectively. To overcome handling problems with water as the principal calibration anchor for hydrogen isotopic measurements, we have employed an effective and simple strategy using reference waters or other liquids sealed in silver-tube segments. These crimped silver tubes can be employed in both the Cr-EA and TC/EA techniques. They simplify considerably the normalization of hydrogen-isotope measurement data to the VSMOW-SLAP (Vienna Standard Mean Ocean Water-Standard Light Antarctic Precipitation) scale, and their use improves accuracy of the data by eliminating evaporative loss and associated isotopic fractionation while
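
    For reference, the normalization to the VSMOW-SLAP scale mentioned here is conventionally a two-point linear scaling of the raw delta values; in standard notation (the assigned SLAP value is the definition of the scale, not a result of this work):

        \delta^{2}\mathrm{H} = \frac{R_{\mathrm{sample}}}{R_{\mathrm{VSMOW}}} - 1,
        \qquad
        \delta^{2}\mathrm{H}_{\mathrm{norm}} = \delta^{2}\mathrm{H}_{\mathrm{SLAP}}^{\mathrm{assigned}}
        \cdot \frac{\delta^{2}\mathrm{H}_{\mathrm{raw}} - \delta^{2}\mathrm{H}_{\mathrm{raw,VSMOW}}}
                   {\delta^{2}\mathrm{H}_{\mathrm{raw,SLAP}} - \delta^{2}\mathrm{H}_{\mathrm{raw,VSMOW}}},

    with the assigned value δ²H of SLAP being −428 mUr (−428 ‰) relative to VSMOW.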

  1. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    PubMed Central

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  2. Office-based rapid prototyping in orthopedic surgery: a novel planning technique and review of the literature.

    PubMed

    Schwartz, Adam; Money, Kyle; Spangehl, Mark; Hattrup, Steven; Claridge, Richard J; Beauchamp, Christopher

    2015-01-01

    Three-dimensional (3-D) prototyping, based on high-quality axial images, may allow for more accurate and extensive preoperative planning and may even allow surgeons to perform procedures as part of preoperative preparation. In this article, we describe 7 cases of complex orthopedic disorders that were surgically treated after preoperative planning that was based on both industry-provided models and use of our in-house 3-D printer. Commercially available 3-D printers allow for rapid in-office production of a high-quality realistic prototype at relatively low per-case cost. Using this technique, surgeons can assess the accuracy of their original surgical plans and, if necessary, correct them preoperatively. The ability to "perform surgery preoperatively" adds another element to surgeons' perceptions of the potential issues that may arise. PMID:25566552

  3. Techniques and Issues in Agent-Based Modeling Validation

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABM has many differences in model development, usage and validation. An ABM is inherently easier to build than a classical simulation, but more difficult to describe formally, since it is closer to human cognition. Using multi-agent models to study complex systems has attracted criticism because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

  4. Developing Visualization Techniques for Semantics-based Information Networks

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Hall, David R.

    2003-01-01

    Information systems incorporating complex network structured information spaces with a semantic underpinning - such as hypermedia networks, semantic networks, topic maps, and concept maps - are being deployed to solve some of NASA's critical information management problems. This paper describes some of the human interaction and navigation problems associated with complex semantic information spaces and describes a set of new visual interface approaches to address these problems. A key strategy is to leverage semantic knowledge represented within these information spaces to construct abstractions and views that will be meaningful to the human user. Human-computer interaction methodologies will guide the development and evaluation of these approaches, which will benefit deployed NASA systems and also apply to information systems based on the emerging Semantic Web.

  5. Constellation choosing based on multi-dimensional sphere packing technique

    NASA Astrophysics Data System (ADS)

    Jinghe, Li; Guijun, Hu; Kashero, Enock; Zhaoxi, Li

    2016-09-01

    In this paper we address the problem of selecting sphere-packing lattice points for use as constellation points in high-dimensional modulation. We propose a new type of point-selection method based on threshold theory. Theoretically, this method improves the transmission performance of high-dimensional signal modulation systems. We find that the BER of a 4D modulation signal is reduced when the threshold-based point-selection method is used. Compared with random and distant point-selection methods at a BER of 10⁻³, the required SNR is reduced by about 2 dB. At a BER of 10⁻³, an 8D modulation signal with points selected using the threshold method achieves an SNR reduction of about 3 dB, and a 16D modulation signal achieves an SNR reduction of about 3.5 dB.

  6. Study of dynamic weighing system based on photoelectric detecting technique

    NASA Astrophysics Data System (ADS)

    Song, Gui-cai; Na, Yan-xiang; Cao, Shi-hao; Yang, Fei-yu

    2011-08-01

    Dynamic weighing is a process that estimates the weight of a vehicle by measuring its tires while they are in motion. It makes use of sensors and other auxiliary apparatus to measure the appearance of a vehicle and its tires, then calculates the weight and speed of the vehicle; finally, this information is recorded and read out. A review of dynamic weighing systems at home and abroad shows that they are based on electrical sensors, whose disadvantages are obvious: for example, when vehicles are weighed dynamically, speed and accuracy cannot be ensured at the same time. In this paper, a dynamic weighing system is designed in which a linear CCD is used as the sensor in the weighing module. The paper describes the dynamic weighing system, analyses the dynamics of the system, and also investigates the modules of the dynamic weighing system.

  7. Techniques to derive geometries for image-based Eulerian computations

    PubMed Central

    Dillard, Seth; Buchholz, James; Vigmostad, Sarah; Kim, Hyunggun; Udaykumar, H.S.

    2014-01-01

    Purpose The performance of three frequently used level set-based segmentation methods is examined for the purpose of defining features and boundary conditions for image-based Eulerian fluid and solid mechanics models. The focus of the evaluation is to identify an approach that produces the best geometric representation from a computational fluid/solid modeling point of view. In particular, extraction of geometries from a wide variety of imaging modalities and noise intensities, to supply to an immersed boundary approach, is targeted. Design/methodology/approach Two- and three-dimensional images, acquired from optical, X-ray CT, and ultrasound imaging modalities, are segmented with active contours, k-means, and adaptive clustering methods. Segmentation contours are converted to level sets and smoothed as necessary for use in fluid/solid simulations. Results produced by the three approaches are compared visually and with contrast ratio, signal-to-noise ratio, and contrast-to-noise ratio measures. Findings While the active contours method possesses built-in smoothing and regularization and produces continuous contours, the clustering methods (k-means and adaptive clustering) produce discrete (pixelated) contours that require smoothing using speckle-reducing anisotropic diffusion (SRAD). Thus, for images with high contrast and low to moderate noise, active contours are generally preferable. However, adaptive clustering is found to be far superior to the other two methods for images possessing high levels of noise and global intensity variations, due to its more sophisticated use of local pixel/voxel intensity statistics. Originality/value It is often difficult to know a priori which segmentation will perform best for a given image type, particularly when geometric modeling is the ultimate goal. This work offers insight to the algorithm selection process, as well as outlining a practical framework for generating useful geometric surfaces in an Eulerian setting. PMID

  8. A Temperature-Based Gain Calibration Technique for Precision Radiometry

    NASA Astrophysics Data System (ADS)

    Parashare, Chaitali Ravindra

    Detecting extremely weak signals in radio astronomy demands high sensitivity and stability of the receivers. The gain of a typical radio astronomy receiver is extremely large, and therefore, even very small gain instabilities can dominate the received noise power and degrade the instrument sensitivity. Hence, receiver stabilization is of prime importance. Gain variations occur mainly due to ambient temperature fluctuations. We take a new approach to receiver stabilization, which makes use of active temperature monitoring and corrects for the gain fluctuations in post processing. This approach is purely passive and does not include noise injection or switching for calibration. This system is to be used for the Precision Array for Probing the Epoch of Reionization (PAPER), which is being developed to detect the extremely faint neutral hydrogen (HI) signature of the Epoch of Reionization (EoR). The epoch of reionization refers to the period in the history of the Universe when the first stars and galaxies started to form. When there are N antenna elements in the case of a large scale array, all elements may not be subjected to the same environmental conditions at a given time. Hence, we expect to mitigate the gain variations by monitoring the physical temperature of each element of the array. This stabilization approach will also benefit experiments like EDGES (Experiment to Detect the Global EoR Signature) and DARE (Dark Ages Radio Explorer), which involve a direct measurement of the global 21 cm signal using a single antenna element and hence, require an extremely stable system. This dissertation focuses on the development and evaluation of a calibration technique that compensates for the gain variations caused due to temperature fluctuations of the RF components. It carefully examines the temperature dependence of the components in the receiver chain. The results from the first-order field instrument, called a Gainometer (GoM), highlight the issue with the cable
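
    As a rough post-processing illustration of the idea, assuming a simple log-linear gain-temperature model with synthetic numbers (this is an assumption for the sketch, not the PAPER receiver's actual gain model or calibration pipeline):

        import numpy as np

        # Hypothetical time series: RF-chain temperature (deg C) and measured power (arb. units).
        temp = 25.0 + 2.0 * np.sin(np.linspace(0, 6, 600))        # slow ambient drift
        true_power = np.ones(600)                                  # constant sky signal
        gain = 10 ** (-0.01 * (temp - temp.mean()))                # ~ -0.1 dB per 10 deg C (illustrative)
        measured = true_power * gain + 0.002 * np.random.randn(600)

        # Fit log(power) against temperature on calibration data, then divide the modeled gain out.
        coeff = np.polyfit(temp - temp.mean(), np.log(measured), 1)
        gain_model = np.exp(np.polyval(coeff, temp - temp.mean()))
        corrected = measured / gain_model

        print("fractional rms before:", measured.std() / measured.mean())
        print("fractional rms after :", corrected.std() / corrected.mean())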

  9. In situ synchrotron based x-ray techniques as monitoring tools for atomic layer deposition

    SciTech Connect

    Devloo-Casier, Kilian; Detavernier, Christophe; Dendooven, Jolien

    2014-01-15

    Atomic layer deposition (ALD) is a thin film deposition technique that has been studied with a variety of in situ techniques. By exploiting the high photon flux and energy tunability of synchrotron based x-rays, a variety of new in situ techniques become available. X-ray reflectivity, grazing incidence small angle x-ray scattering, x-ray diffraction, x-ray fluorescence, x-ray absorption spectroscopy, and x-ray photoelectron spectroscopy are reviewed as possible in situ techniques during ALD. All these techniques are especially sensitive to changes on the (sub-)nanometer scale, allowing a unique insight into different aspects of the ALD growth mechanisms.

  10. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  11. BEaST: brain extraction based on nonlocal segmentation technique.

    PubMed

    Eskildsen, Simon F; Coupé, Pierrick; Fonov, Vladimir; Manjón, José V; Leung, Kelvin K; Guizard, Nicolas; Wassef, Shafik N; Østergaard, Lasse Riis; Collins, D Louis

    2012-02-01

    Brain extraction is an important step in the analysis of brain images. The variability in brain morphology and the difference in intensity characteristics due to imaging sequences make the development of a general purpose brain extraction algorithm challenging. To address this issue, we propose a new robust method (BEaST) dedicated to produce consistent and accurate brain extraction. This method is based on nonlocal segmentation embedded in a multi-resolution framework. A library of 80 priors is semi-automatically constructed from the NIH-sponsored MRI study of normal brain development, the International Consortium for Brain Mapping, and the Alzheimer's Disease Neuroimaging Initiative databases. In testing, a mean Dice similarity coefficient of 0.9834±0.0053 was obtained when performing leave-one-out cross validation selecting only 20 priors from the library. Validation using the online Segmentation Validation Engine resulted in a top ranking position with a mean Dice coefficient of 0.9781±0.0047. Robustness of BEaST is demonstrated on all baseline ADNI data, resulting in a very low failure rate. The segmentation accuracy of the method is better than two widely used publicly available methods and recent state-of-the-art hybrid approaches. BEaST provides results comparable to a recent label fusion approach, while being 40 times faster and requiring a much smaller library of priors.
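
    For reference, the Dice similarity coefficient used for validation here is the standard overlap measure 2|A∩B|/(|A|+|B|); a minimal sketch on toy binary masks (the arrays are placeholders, not ADNI data):

        import numpy as np

        def dice(mask_a, mask_b):
            """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # toy example: two overlapping "brain" masks
        auto = np.zeros((10, 10), dtype=bool);   auto[2:8, 2:8] = True
        manual = np.zeros((10, 10), dtype=bool); manual[3:9, 2:8] = True
        print(f"Dice = {dice(auto, manual):.4f}")   # ~0.83 for this toy pair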

  12. Laser-based techniques for living cell pattern formation

    NASA Astrophysics Data System (ADS)

    Hopp, Béla; Smausz, Tomi; Papdi, Bence; Bor, Zsolt; Szabó, András; Kolozsvári, Lajos; Fotakis, Costas; Nógrádi, Antal

    2008-10-01

    In the production of biosensors or artificial tissues a basic step is the immobilization of living cells along the required pattern. In this paper the ability of some promising laser-based methods to influence the interaction between cells and various surfaces is presented. In the first set of experiments laser-induced patterned photochemical modification of polymer foils was used to achieve guided adherence and growth of cells on the modified areas: (a) Polytetrafluoroethylene was irradiated with an ArF excimer laser (λ=193 nm, FWHM=20 ns, F=9 mJ/cm2) in the presence of a triethylene tetramine liquid photoreagent; (b) a thin carbon layer was produced by KrF excimer laser (λ=248 nm, FWHM=30 ns, F=35 mJ/cm2) irradiation on a polyimide surface to influence the cell adherence. It was found that the incorporation of amine groups in the PTFE polymer chain in place of the fluorine atoms can both promote and prevent the adherence of living cells (depending on the cell type) on the treated surfaces, while the laser-generated carbon layer on the polyimide surface did not effectively improve adherence. Our attempts to influence cell adherence by morphological modifications created by ArF laser irradiation of a polyethylene terephthalate surface showed a dependence on surface roughness. This method was effective only when the Ra roughness parameter of the developed structure did not exceed 0.1 micrometer. Pulsed laser deposition with femtosecond KrF excimer lasers (F=2.2 J/cm2) was effectively used to deposit structured thin films from biomaterials (endothelial cell growth supplement and collagen embedded in a starch matrix) to promote the adherence and growth of cells. These results present evidence that some surfaces can be successfully altered to induce guided cell growth.

  13. Analysis of ISO/IEEE 11073 built-in security and its potential IHE-based extensibility.

    PubMed

    Rubio, Óscar J; Trigo, Jesús D; Alesanco, Álvaro; Serrano, Luis; García, José

    2016-04-01

    The ISO/IEEE 11073 standard for Personal Health Devices (X73PHD) aims to ensure interoperability between Personal Health Devices and aggregators-e.g. health appliances, routers-in ambulatory setups. The Integrating the Healthcare Enterprise (IHE) initiative promotes the coordinated use of different standards in healthcare systems (e.g. Personal/Electronic Health Records, alert managers, Clinical Decision Support Systems) by defining profiles intended for medical use cases. X73PHD provides a robust syntactic model and a comprehensive terminology, but it places limited emphasis on security and on interoperability with IHE-compliant systems and frameworks. However, the implementation of eHealth/mHealth applications in environments such as health and fitness monitoring, independent living and disease management (i.e. the X73PHD domains) increasingly requires features such as secure connections to mobile aggregators-e.g. smartphones, tablets-, the sharing of devices among different users with privacy, and interoperability with certain IHE-compliant healthcare systems. This work proposes a comprehensive IHE-based X73PHD extension consisting of additive layers adapted to different eHealth/mHealth applications, after having analyzed the features of X73PHD (especially its built-in security), IHE profiles related with these applications and other research works. Both the new features proposed for each layer and the procedures to support them have been carefully chosen to minimize the impact on X73PHD, on its architecture (in terms of delays and overhead) and on its framework. Such implications are thoroughly analyzed in this paper. As a result, an extended model of X73PHD is proposed, preserving its essential features while extending them with added value.

  14. The efficacy and toxicity of individualized intensity-modulated radiotherapy based on the tumor extension patterns of nasopharyngeal carcinoma

    PubMed Central

    Zhou, Guan-Qun; Guo, Rui; Zhang, Fan; Zhang, Yuan; Xu, Lin; Zhang, Lu-Lu; Lin, Ai-Hua; Ma, Jun; Sun, Ying

    2016-01-01

    Background To evaluate the efficacy and toxicity of intensity-modulated radiotherapy (IMRT) using individualized clinical target volumes (CTVs) based on the loco-regional extension patterns of nasopharyngeal carcinoma (NPC). Methods From December 2009 to February 2012, 220 patients with histologically-proven, non-disseminated NPC were prospectively treated with IMRT according to an individualized delineation protocol. CTV1 encompassed the gross tumor volume, entire nasopharyngeal mucosa and structures within the pharyngobasilar fascia with a margin. CTV2 encompassed bilateral high-risk anatomic sites and downstream anatomic sites adjacent to the primary tumor, the bilateral retropharyngeal regions, and levels II, III and Va; prophylactic irradiation was given to one or two levels beyond the clinically involved lymph node levels. Clinical outcomes and toxicities were evaluated. Results Median follow-up was 50.8 (range, 1.3–68.0) months; four-year local relapse-free, regional relapse-free, distant metastasis-free, disease-free and overall survival rates were 94.7%, 97.0%, 91.7%, 87.2% and 91.9%, respectively. Acute severe (≥ grade 3) mucositis, dermatitis and xerostomia were observed in 27.6%, 3.6% and 0% of patients, respectively. At 1 year, xerostomia was mild, with frequencies of Grade 0, 1, 2 and 3 xerostomia of 27.9%, 63.3%, 8.3% and 0.5%, respectively. Conclusions IMRT using individualized CTVs provided high rates of local and regional control and a favorable toxicity profile in NPC. The individualized CTV delineation strategy is a promising approach that may effectively avoid unnecessary or missed irradiation, and deserves further optimization to define more precise individualized CTVs. PMID:26980744

  15. Accuracy of a contour-based biplane fluoroscopy technique for tracking knee joint kinematics of different speeds.

    PubMed

    Giphart, J Erik; Zirker, Christopher A; Myers, Casey A; Pennington, W Wesley; LaPrade, Robert F

    2012-11-15

    While measuring knee motion in all six degrees of freedom is important for understanding and treating orthopaedic knee pathologies, traditional motion capture techniques lack the required accuracy. A variety of model-based biplane fluoroscopy techniques have been developed with sub-millimeter accuracy. However, no studies have statistically evaluated the consistency of the accuracy across motions of varying intensity or between degrees of freedom. Therefore, this study evaluated the bias and precision of a contour-based tracking technique by comparing it to a marker-based method (gold standard) during three movements of increasing intensity. Six cadaveric knees with implanted tantalum markers were used to simulate knee extension, walking and drop landings, while motion was recorded by a custom biplane fluoroscopy system. The 3D geometries of the bones were reconstructed from CT scans and anatomical coordinate systems were assigned. The position and orientation of the bone and marker models were determined for an average of 27 frames for each trial and knee joint kinematics were compared. The average bias and precision were 0.01 ± 0.65° for rotations and 0.01 ± 0.59 mm for joint translations. Rotational precision was affected by motion (p=0.04) and depended on the axis of rotation (p=0.02). However, the difference in average precision among motions or axes was small (≤ 0.13°) and not likely of consequence for kinematic measurements. No other differences were found. The contour-based technique demonstrated sub-millimeter and sub-degree accuracy, indicating it is a highly accurate tool for measuring complex three-dimensional knee movements of any intensity.

  16. PDE-based Non-Linear Diffusion Techniques for Denoising Scientific and Industrial Images: An Empirical Study

    SciTech Connect

    Weeratunga, S K; Kamath, C

    2001-12-20

    Removing noise from data is often the first step in data analysis. Denoising techniques should not only reduce the noise, but do so without blurring or changing the location of the edges. Many approaches have been proposed to accomplish this; in this paper, they focus on one such approach, namely the use of non-linear diffusion operators. This approach has been studied extensively from a theoretical viewpoint ever since the 1987 work of Perona and Malik showed that non-linear filters outperformed the more traditional linear Canny edge detector. They complement this theoretical work by investigating the performance of several isotropic diffusion operators on test images from scientific domains. They explore the effects of various parameters such as the choice of diffusivity function, explicit and implicit methods for the discretization of the PDE, and approaches for the spatial discretization of the non-linear operator etc. They also compare these schemes with simple spatial filters and the more complex wavelet-based shrinkage techniques. The empirical results show that, with an appropriate choice of parameters, diffusion-based schemes can be as effective as competitive techniques.
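
    For orientation, the classic Perona-Malik scheme referred to here evolves the image under an edge-stopping diffusivity; a minimal explicit-update sketch (the diffusivity choice, kappa, step size and the periodic border handling via np.roll are illustrative simplifications, not the configuration studied in the paper):

        import numpy as np

        def perona_malik(img, niter=50, kappa=20.0, dt=0.2):
            """Explicit Perona-Malik diffusion with exponential diffusivity g(s) = exp(-(s/kappa)^2)."""
            u = img.astype(float).copy()
            for _ in range(niter):
                # differences to the four neighbours (np.roll gives periodic borders; fine for a sketch)
                dn = np.roll(u, -1, axis=0) - u
                ds = np.roll(u, 1, axis=0) - u
                de = np.roll(u, -1, axis=1) - u
                dw = np.roll(u, 1, axis=1) - u
                # edge-stopping coefficients: diffusion is suppressed across strong gradients
                cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
                ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
                u += dt * (cn * dn + cs * ds + ce * de + cw * dw)   # dt <= 0.25 for stability
            return u

        # toy usage: denoise a noisy step image while preserving the edge
        img = np.zeros((64, 64)); img[:, 32:] = 100.0
        noisy = img + 10.0 * np.random.randn(*img.shape)
        smoothed = perona_malik(noisy)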

  17. A content-based image retrieval method for optical colonoscopy images based on image recognition techniques

    NASA Astrophysics Data System (ADS)

    Nosato, Hirokazu; Sakanashi, Hidenori; Takahashi, Eiichi; Murakawa, Masahiro

    2015-03-01

    This paper proposes a content-based image retrieval method for optical colonoscopy images that can find images similar to ones being diagnosed. Optical colonoscopy is a method of direct observation for colons and rectums to diagnose bowel diseases. It is the most common procedure for screening, surveillance and treatment. However, diagnostic accuracy for intractable inflammatory bowel diseases, such as ulcerative colitis (UC), is highly dependent on the experience and knowledge of the medical doctor, because there is considerable variety in the appearances of colonic mucosa within inflammations with UC. In order to solve this issue, this paper proposes a content-based image retrieval method based on image recognition techniques. The proposed retrieval method can find similar images from a database of images diagnosed as UC, and can potentially furnish the medical records associated with the retrieved images to assist the UC diagnosis. Within the proposed method, color histogram features and higher order local auto-correlation (HLAC) features are adopted to represent the color information and geometrical information of optical colonoscopy images, respectively. Moreover, considering various characteristics of UC colonoscopy images, such as vascular patterns and the roughness of the colonic mucosa, we also propose an image enhancement method to highlight the appearances of colonic mucosa in UC. In an experiment using 161 UC images from 32 patients, we demonstrate that our method improves the accuracy of retrieving similar UC images.
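
    As a schematic of the color-histogram half of such a retrieval pipeline (the HLAC features and the UC-specific mucosa enhancement are omitted; the bin count, similarity measure and random test images are arbitrary choices, not the paper's settings):

        import numpy as np

        def color_histogram(img, bins=8):
            """Normalized joint RGB histogram of an HxWx3 uint8 image, flattened to a feature vector."""
            hist, _ = np.histogramdd(img.reshape(-1, 3), bins=(bins, bins, bins),
                                     range=((0, 256),) * 3)
            return (hist / hist.sum()).ravel()

        def retrieve(query_img, database_imgs, k=3):
            """Rank database images by histogram-intersection similarity to the query."""
            q = color_histogram(query_img)
            sims = [np.minimum(q, color_histogram(d)).sum() for d in database_imgs]
            return np.argsort(sims)[::-1][:k]       # indices of the k most similar images

        # toy usage with random stand-in frames
        rng = np.random.default_rng(0)
        db = [rng.integers(0, 256, (64, 64, 3), dtype=np.uint8) for _ in range(10)]
        print(retrieve(db[0], db, k=3))             # index 0 should rank first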

  18. Extension Russian: A Few Notes

    ERIC Educational Resources Information Center

    Matthews, Irene J.

    1974-01-01

    Presents observations on the teaching of Russian in an evening extension course and four teaching techniques which involve variety in lesson planning, flexibility in presentation, a relaxed atmosphere and encouragement. (LG)

  19. Influence of an extensive inquiry-based field experience on pre-service elementary student teachers' science teaching beliefs

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Sumita

    This study examined the effects of an extensive inquiry-based field experience on pre-service elementary teachers' personal agency beliefs (PAB) about teaching science and their ability to effectively implement science instruction. The research combined quantitative and qualitative approaches within an ethnographic research tradition. A comparison was made between the pre- and posttest scores for two groups. The experimental group utilized the inquiry method; the control group did not. The experimental group had the stronger PAB pattern. The field experience caused no significant differences in the context beliefs of either group, but did affect the capability beliefs. The number of college science courses taken by pre-service elementary teachers was positively related to their post-test capability beliefs (p = .0209). Qualitative information was collected through case studies which included observation of classrooms, assessment of lesson plans and open-ended, extended interviews of the participants about their beliefs in their teaching abilities (efficacy beliefs) and in teaching environments (context beliefs). The interview data were analyzed by the analytic induction method to look for themes. The emerging themes were then grouped under several attributes. Following a review of the attributes a number of hypotheses were formulated. Each hypothesis was then tested across all the cases by the constant comparative method. The pattern of relationships that emerged from the hypothesis testing clearly suggests a new hypothesis: that there is a spiral relationship among the ability to establish communicative relationships with students, the desire for personal growth and improvement, and greater content knowledge. The study concluded that inquiry-based student teaching should be encouraged to train school science teachers. However, the meaning and the practice of the inquiry method should be clearly delineated to ensure its correct implementation in the classroom. A survey should be

  20. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    NASA Astrophysics Data System (ADS)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory and systems that evolve in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance by means of well-defined power laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction, with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity. It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power laws consistent with NESP and has attributes of universality, as it holds for a broad
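
    The NESP power laws referred to above are usually written with the q-exponential function obtained from maximizing the Tsallis entropy; as general background (not the exact bivariate model fitted in this study):

        \exp_q(x) = \bigl[1 + (1-q)\,x\bigr]^{\frac{1}{1-q}},
        \qquad
        P(>X) = \exp_q\!\left(-\frac{X}{X_0}\right)
              = \left[1 - (1-q)\,\frac{X}{X_0}\right]^{\frac{1}{1-q}},

    which reduces to an ordinary exponential as q → 1 and, for q > 1 and X ≫ X₀, decays as a power law with exponent 1/(q−1).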

  1. Wood lens design philosophy based on a binary additive manufacturing technique

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  2. The extension of school-based inter- and intraracial children's friendships: influences on psychosocial well-being.

    PubMed

    Fletcher, Anne C; Rollins, Alethea; Nickerson, Pamela

    2004-07-01

    Children's (N=142) school friendships with same versus different race peers were coded for prevalence and the extent to which parents maintained social relationships with these friends (a proxy for extension of friendships beyond the school context). Membership in integrated versus nonintegrated social networks at school was unassociated with psychosocial well-being. Out-of-school extension of interracial friendships was linked with greater social competence among Black children. Black children whose friendships with both same and different race peers were extended beyond the school context reported higher levels of self-esteem.

  3. [A Detection Technique for Gas Concentration Based on the Spectral Line Shape Function].

    PubMed

    Zhou, Mo; Yang, Bing-chu; Tao, Shao-hua

    2015-04-01

    Methods that can rapidly and precisely measure the concentrations of various gases have extensive applications in fields such as air quality analysis, environmental pollution detection, and so on. The gas detection method based on tunable laser absorption spectroscopy is considered a promising technique. For infrared spectral detection techniques, the line shape function of a gas absorption spectrum is an important parameter in the qualitative and quantitative analysis of a gas. Specifically, how to obtain the line shape function of an absorption spectrum of a gas quickly and accurately is a key problem in the gas detection field. In this paper we analyzed several existing line shape functions, proposed a method to calculate the line shape function of a gas precisely, and investigated the relation between the gas concentration and the peak value of the line shape function. We then experimentally measured the absorption spectra of acetylene gas in the wavelength range of 1,515-1,545 nm with a tunable laser source and a built-in spectrometer. With the Lambert-Beer law we calculated the peak values of the line shape function of the gas at the given frequencies, and obtained a fitting curve for the line shape function over the whole waveband by using a computer program. Comparing the measured results with the calculated results of the Voigt function, we found that there was a deviation between the experimental results and the calculated results, and that the concentration of the acetylene gas measured using the fitted line shape function was more accurate and more consistent with the actual situation. Hence, the empirical formula for the line shape function obtained from the experimental results is more suitable for the concentration measurement of a gas. As the fitting curve for the line shape function of the acetylene gas has been deduced from the experiment, the corresponding peak values of the spectral lines can be
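
    For reference, the Lambert-Beer relation used here ties the transmitted intensity to the concentration through the (area-normalized) line shape function φ; in generic notation (a standard statement of the law, not the paper's exact fitting model):

        I(\nu) = I_0(\nu)\,\exp\bigl[-S\,N\,\phi(\nu-\nu_0)\,L\bigr],
        \qquad
        N = \frac{\ln\bigl[I_0(\nu_0)/I(\nu_0)\bigr]}{S\,\phi(0)\,L},

    so the accuracy of the peak value φ(0) of the line shape function enters the retrieved number density N directly (S is the line strength and L the optical path length).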

  4. Bispectrum-based feature extraction technique for devising a practical brain-computer interface

    NASA Astrophysics Data System (ADS)

    Shahid, Shahjahan; Prasad, Girijesh

    2011-04-01

    The extraction of distinctly separable features from electroencephalogram (EEG) is one of the main challenges in designing a brain-computer interface (BCI). Existing feature extraction techniques for a BCI are mostly developed based on traditional signal processing techniques assuming that the signal is Gaussian and has linear characteristics. But the motor imagery (MI)-related EEG signals are highly non-Gaussian, non-stationary and have nonlinear dynamic characteristics. This paper proposes an advanced, robust but simple feature extraction technique for a MI-related BCI. The technique uses one of the higher order statistics methods, the bispectrum, and extracts the features of nonlinear interactions over several frequency components in MI-related EEG signals. Along with a linear discriminant analysis classifier, the proposed technique has been used to design an MI-based BCI. Three performance measures, classification accuracy, mutual information and Cohen's kappa have been evaluated and compared with a BCI using a contemporary power spectral density-based feature extraction technique. It is observed that the proposed technique extracts nearly recording-session-independent distinct features resulting in significantly much higher and consistent MI task detection accuracy and Cohen's kappa. It is therefore concluded that the bispectrum-based feature extraction is a promising technique for detecting different brain states.
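
    For orientation, a minimal direct (segment-averaged FFT) bispectrum estimate of the kind such features are built on; the segment length, window and the synthetic quadratically phase-coupled test signal are illustrative choices, not the paper's BCI processing chain:

        import numpy as np

        def bispectrum(x, nfft=128):
            """Direct bispectrum estimate B(f1,f2) = <X(f1) X(f2) X*(f1+f2)> averaged over segments."""
            segs = x[: (len(x) // nfft) * nfft].reshape(-1, nfft)
            X = np.fft.fft(segs * np.hanning(nfft), axis=1)
            B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
            for f1 in range(nfft // 2):
                for f2 in range(nfft // 2):
                    B[f1, f2] = np.mean(X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2]))
            return B

        # quadratically phase-coupled test signal: components at 20, 30 and 50 Hz with locked phases
        fs = 256
        t = np.arange(4096) / fs
        x = (np.cos(2*np.pi*20*t) + np.cos(2*np.pi*30*t)
             + 0.5*np.cos(2*np.pi*50*t) + 0.1*np.random.randn(t.size))
        B = bispectrum(x)
        f1, f2 = np.unravel_index(np.argmax(np.abs(B[1:, 1:])), B[1:, 1:].shape)
        print("strongest coupling near bins:", f1 + 1, f2 + 1)   # expect (10, 15) or (15, 10) at 2 Hz/bin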

  5. Computer-based video digitizer analysis of surface extension in maize roots: kinetics of growth rate changes during gravitropism

    NASA Technical Reports Server (NTRS)

    Ishikawa, H.; Hasenstein, K. H.; Evans, M. L.

    1991-01-01

    We used a video digitizer system to measure surface extension and curvature in gravistimulated primary roots of maize (Zea mays L.). Downward curvature began about 25 +/- 7 min after gravistimulation and resulted from a combination of enhanced growth along the upper surface and reduced growth along the lower surface relative to growth in vertically oriented controls. The roots curved at a rate of 1.4 +/- 0.5 degrees min-1 but the pattern of curvature varied somewhat. In about 35% of the samples the roots curved steadily downward and the rate of curvature slowed as the root neared 90 degrees. A final angle of about 90 degrees was reached 110 +/- 35 min after the start of gravistimulation. In about 65% of the samples there was a period of backward curvature (partial reversal of curvature) during the response. In some cases (about 15% of those showing a period of reverse bending) this period of backward curvature occurred before the root reached 90 degrees. Following transient backward curvature, downward curvature resumed and the root approached a final angle of about 90 degrees. In about 65% of the roots showing a period of reverse curvature, the roots curved steadily past the vertical, reaching maximum curvature about 205 +/- 65 min after gravistimulation. The direction of curvature then reversed back toward the vertical. After one or two oscillations about the vertical the roots obtained a vertical orientation and the distribution of growth within the root tip became the same as that prior to gravistimulation. The period of transient backward curvature coincided with and was evidently caused by enhancement of growth along the concave and inhibition of growth along the convex side of the curve, a pattern opposite to that prevailing in the earlier stages of downward curvature. There were periods during the gravitropic response when the normally unimodal growth-rate distribution within the elongation zone became bimodal with two peaks of rapid elongation separated by

  6. AQA-PM: Extension of the Air-Quality Model For Austria with Satellite based Particulate Matter Estimates

    NASA Astrophysics Data System (ADS)

    Hirtl, Marcus; Mantovani, Simone; Krüger, Bernd C.; Triebnig, Gerhard; Flandorfer, Claudia

    2013-04-01

    Air quality is a key element for the well-being and quality of life of European citizens. Air pollution measurements and modeling tools are essential for the assessment of air quality according to EU legislation. The responsibilities of ZAMG as the national weather service of Austria include the support of the federal states and the public in questions connected to the protection of the environment in the frame of advisory and counseling services as well as expert opinions. The Air Quality model for Austria (AQA) has been operated at ZAMG in cooperation with the University of Natural Resources and Life Sciences in Vienna (BOKU) on behalf of the regional governments since 2005. AQA conducts daily forecasts of gaseous and particulate (PM10) air pollutants over Austria. In the frame of the project AQA-PM (funded by FFG), satellite measurements of the Aerosol Optical Thickness (AOT) and ground-based PM10 measurements are combined into highly resolved initial fields using regression and assimilation techniques. For the model simulations WRF/Chem is used with a resolution of 3 km over the alpine region. Interfaces have been developed to account for the different measurements as input data. The available local emission inventories provided by the different Austrian regional governments were harmonized and used for the model simulations. An episode in February 2010 was chosen for the model evaluation. During that month exceedances of PM10 thresholds occurred at many measurement stations of the Austrian network. Different model runs (model only/only ground stations assimilated/satellite and ground stations assimilated) are compared to the respective measurements. The goal of this project is to improve the PM10 forecasts for Austria with the integration of satellite-based measurements and to provide a comprehensive product platform.
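
    The regression and assimilation steps are only named in the abstract; the sketch below, with hypothetical array names and an arbitrary blending weight, illustrates the general idea of fitting a linear AOT-to-PM10 relation at the ground stations and then nudging the gridded first guess toward the station observations.

```python
import numpy as np

def aot_to_pm10_field(aot_grid, station_aot, station_pm10):
    """Fit PM10 = a * AOT + b at the stations and apply the regression
    to the gridded AOT to obtain a first-guess PM10 field."""
    a, b = np.polyfit(station_aot, station_pm10, 1)
    return a * aot_grid + b

def blend_with_stations(first_guess, obs_values, obs_cells, weight=0.7):
    """Very simple stand-in for the assimilation step: nudge the grid
    cells that contain a station toward the observed PM10 value."""
    field = first_guess.copy()
    for (i, j), value in zip(obs_cells, obs_values):
        field[i, j] = weight * value + (1.0 - weight) * field[i, j]
    return field
```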

  7. AQA-PM: Extension of the Air-Quality model for Austria with satellite based Particulate Matter estimates

    NASA Astrophysics Data System (ADS)

    Hirtl, M.; Mantovani, S.; Krüger, B. C.; Triebnig, G.

    2012-04-01

    Air quality is a key element for the well-being and quality of life of European citizens. Air pollution measurements and modeling tools are essential for the assessment of air quality according to EU legislation. The responsibilities of ZAMG as the national weather service of Austria include the support of the federal states and the public in questions connected to the protection of the environment in the frame of advisory and counseling services as well as expert opinions. The Air Quality model for Austria (AQA) has been operated at ZAMG in cooperation with the University of Natural Resources and Applied Life Sciences in Vienna (BOKU) on behalf of the regional governments since 2005. AQA conducts daily forecasts of gaseous and particulate (PM10) air pollutants over Austria. In the frame of the project AQA-PM (funded by FFG), satellite measurements of the Aerosol Optical Thickness (AOT) and ground-based PM10 measurements are combined into highly resolved initial fields using assimilation techniques. It is expected that the assimilation of satellite measurements will significantly improve the quality of AQA; currently no observations are considered in the modeling system. At the current stage of the project, different datasets have been collected (ground measurements, satellite measurements, finely resolved regional emission inventories) and are being analyzed and prepared for further processing. This contribution gives an overview of the project working plan and the upcoming developments. The goal of this project is to improve the PM10 forecasts for Austria with the integration of satellite-based measurements and to provide a comprehensive product platform.

  8. Photoacoustic Techniques for Trace Gas Sensing Based on Semiconductor Laser Sources

    PubMed Central

    Elia, Angela; Lugarà, Pietro Mario; Di Franco, Cinzia; Spagnolo, Vincenzo

    2009-01-01

    The paper provides an overview on the use of photoacoustic sensors based on semiconductor laser sources for the detection of trace gases. We review the results obtained using standard, differential and quartz enhanced photoacoustic techniques. PMID:22303143

  9. Traditional versus rule-based programming techniques - Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    A traditional programming technique for controlling the display of optional flight information in a civil transport cockpit is compared to a rule-based technique for the same function. This application required complex decision logic and a frequently modified rule base. The techniques are evaluated for execution efficiency and ease of implementation: the criterion used to assess execution efficiency is the total number of steps required to isolate the hypotheses that were true, while implementability is judged by ease of modification, ease of verification, and explanation capability. It is observed that the traditional program is more efficient than the rule-based program; however, the rule-based programming technique is better suited to improving programmer productivity.
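
    The contrast between the two programming styles can be illustrated with a toy version of the display-control decision logic; the conditions and display items below are invented for illustration and are not taken from the study.

```python
# Traditional, hard-coded decision logic (illustrative only).
def select_displays_traditional(state):
    shown = []
    if state["phase"] == "approach" and state["gear_down"]:
        shown.append("landing_checklist")
    if state["fuel_low"]:
        shown.append("fuel_advisory")
    return shown

# Rule-based equivalent: the decision logic lives in a modifiable rule
# base, which is what eases the frequent modifications the study cites.
RULES = [
    (lambda s: s["phase"] == "approach" and s["gear_down"], "landing_checklist"),
    (lambda s: s["fuel_low"], "fuel_advisory"),
]

def select_displays_rule_based(state):
    return [info for condition, info in RULES if condition(state)]
```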

  10. Effects of Extensive Engagement with Computer-Based Reading and Language Arts Instructional Software on Reading Achievement for Sixth Graders

    ERIC Educational Resources Information Center

    Securro, Samuel, Jr.; Jones, Jerry D.; Cantrell, Danny R.

    2010-01-01

    K-12 school practitioners and school administrators need reliable results about the effects of instructional technology products as they strive to meet achievement compliance levels in politically accountable local and national contexts in the U.S. This study presents evidence regarding the effects of extensive engagement with computer-based…

  11. A quality control technique based on UV-VIS absorption spectroscopy for tequila distillery factories

    NASA Astrophysics Data System (ADS)

    Barbosa Garcia, O.; Ramos Ortiz, G.; Maldonado, J. L.; Pichardo Molina, J.; Meneses Nava, M. A.; Landgrave, Enrique; Cervantes, M. J.

    2006-02-01

    A low cost technique based on UV-VIS absorption spectroscopy is presented for the quality control of the spirit drink known as tequila. It is shown that such spectra offer enough information to discriminate a given spirit drink from a group of bottled commercial tequilas. The technique was applied to white tequilas. In contrast to reference analytical methods such as chromatography, this technique requires neither special personnel training nor sophisticated instrumentation. By using hand-held instrumentation the technique can be applied in situ during the production process.
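
    The abstract does not detail the discrimination algorithm; one simple possibility, sketched below with hypothetical names, is to score the measured absorption spectrum against a library of reference spectra with a correlation measure and report the best match.

```python
import numpy as np

def match_tequila(spectrum, reference_spectra):
    """Score a measured UV-VIS absorption spectrum against reference
    spectra (one per brand, sampled at the same wavelengths) and return
    the best-matching brand together with all scores."""
    scores = {}
    s = (spectrum - spectrum.mean()) / spectrum.std()
    for brand, ref in reference_spectra.items():
        r = (ref - ref.mean()) / ref.std()
        scores[brand] = float(np.dot(s, r) / len(s))   # Pearson-like score
    best = max(scores, key=scores.get)
    return best, scores
```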

  12. A system identification technique based on the random decrement signatures. Part 2: Experimental results

    NASA Technical Reports Server (NTRS)

    Bedewi, Nabih E.; Yang, Jackson C. S.

    1987-01-01

    Identification of the system parameters of a randomly excited structure may be treated using a variety of statistical techniques. Of all these techniques, the Random Decrement is unique in that it provides the homogeneous component of the system response. Using this quality, a system identification technique was developed based on a least-squares fit of the signatures to estimate the mass, damping, and stiffness matrices of a linear randomly excited system. The results of an experiment conducted on an offshore platform scale model to verify the validity of the technique and to demonstrate its application in damage detection are presented.
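
    The extraction of the Random Decrement signature itself can be sketched as a level-crossing average, as below; the trigger condition and segment length are illustrative, and the subsequent least-squares fit of the mass, damping and stiffness matrices is not shown.

```python
import numpy as np

def random_decrement(y, trigger, seg_len):
    """Level-crossing Random Decrement signature: average all segments
    of length seg_len that start where the response y crosses the
    trigger level with positive slope."""
    starts = [i for i in range(1, len(y) - seg_len)
              if y[i - 1] < trigger <= y[i]]
    if not starts:
        raise ValueError("no trigger crossings found")
    return np.mean([y[i:i + seg_len] for i in starts], axis=0)
```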

  13. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea.

    PubMed

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-08-18

    In an electronic tongue, preprocessing of the raw data precedes pattern analysis, and the choice of an appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis, and then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.
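
    A minimal sketch of such a comparison is shown below: two common preprocessing options (mean-centring and autoscaling) are ranked by a crude between-class to within-class scatter ratio computed on PCA scores. Both the preprocessing options and the separability measure are stand-ins, not necessarily those used in the paper.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project mean-centred data onto its first principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def separability(scores, labels):
    """Crude Fisher-style ratio of between-class to within-class scatter."""
    labels = np.asarray(labels)
    overall = scores.mean(axis=0)
    sb = sum(np.sum((scores[labels == c].mean(axis=0) - overall) ** 2)
             for c in np.unique(labels))
    sw = sum(np.sum((scores[labels == c] - scores[labels == c].mean(axis=0)) ** 2)
             for c in np.unique(labels))
    return sb / sw

def compare_preprocessing(X, labels):
    centred = X - X.mean(axis=0)
    autoscaled = centred / X.std(axis=0)
    return {"mean-centred": separability(pca_scores(centred), labels),
            "autoscaled": separability(pca_scores(autoscaled), labels)}
```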

  14. Finite element modelling of non-bonded piezo sensors for biomedical health monitoring of bones based on EMI technique

    NASA Astrophysics Data System (ADS)

    Srivastava, Shashank; Bhalla, Suresh; Madan, Alok; Gupta, Ashok

    2016-04-01

    Extensive research is currently underway across the world on employing piezo sensors for biomedical health monitoring in view of their obvious advantages such as low cost, fast dynamic response and bio-compatibility. However, one limitation of the piezo sensor in bonded mode based on the electro-mechanical impedance (EMI) technique is that it can cause harmful effects to humans, such as irritation and bone and skin disease. This paper, which continues the recent demonstration of the non-bonded configuration, is a step towards simulating and analyzing the non-bonded configuration of the piezo sensor to gauge its effectiveness using FEA software. It is noted that the conductance signatures obtained in non-bonded mode are very close to those of the conventional bonded configuration, giving a positive indication of its suitability for field use.

  15. Multi technique amalgamation for enhanced information identification with content based image data.

    PubMed

    Das, Rik; Thepade, Sudeep; Ghosh, Saurav

    2015-01-01

    Image data has emerged as a resourceful foundation for information with the proliferation of image capturing devices and social media. Diverse applications of images in areas including biomedicine, military, commerce and education have resulted in huge image repositories. Semantically analogous images can be fruitfully recognized by means of content based image identification. However, the success of the technique has been largely dependent on the extraction of robust feature vectors from the image content. The paper introduces three different techniques of content based feature extraction, based on image binarization, image transforms and morphological operators respectively. The techniques were tested with four public datasets, namely the Wang Dataset, Oliva Torralba (OT Scene) Dataset, Corel Dataset and Caltech Dataset. The multi technique feature extraction process was further integrated for decision fusion of image identification to boost the recognition rate. Classification results with the proposed technique have shown an average increase of 14.5 % in Precision compared to the existing techniques, and the retrieval results have shown an average increase of 6.54 % in Precision over state-of-the-art techniques. PMID:26798574

  17. Using Fuzzy Logic Techniques for Assertion-Based Software Testing Metrics

    PubMed Central

    Alakeel, Ali M.

    2015-01-01

    Software testing is a very labor intensive and costly task. Therefore, many software testing techniques to automate the process of software testing have been reported in the literature. Assertion-Based automated software testing has been shown to be effective in detecting program faults as compared to traditional black-box and white-box software testing methods. However, the applicability of this approach in the presence of large numbers of assertions may be very costly. Therefore, software developers need assistance when deciding whether to apply Assertion-Based testing so that they can obtain the benefits of this approach at an acceptable cost. In this paper, we present an Assertion-Based testing metrics technique that is based on fuzzy logic. The main goal of the proposed technique is to enhance the performance of Assertion-Based software testing in the presence of large numbers of assertions. To evaluate the proposed technique, an experimental study was performed in which the proposed technique was applied to programs with assertions. The results of this experiment show that the effectiveness and performance of Assertion-Based software testing improved when applying the proposed testing metrics technique. PMID:26060839

  18. A novel background subtraction technique based on grayscale morphology for weld defect detection

    NASA Astrophysics Data System (ADS)

    Aminzadeh, Masoumeh; Kurfess, Thomas

    2016-04-01

    Optical inspection is a non-destructive quality monitoring technique to detect defects in manufactured parts. Automating defect detection through image processing removes the need for human operators, making the inspection more reliable, reproducible and faster. In this paper, a background subtraction technique based on morphological operations is proposed. The low computational load of the morphological operations used makes this technique more computationally efficient than background subtraction techniques such as spline approximation and surface fitting. The performance of the technique is tested by applying it to detect defects in a weld seam with a non-uniform intensity distribution, where the defects are precisely segmented. The proposed background subtraction technique is generalizable to sheet, surface, or part defect detection in various manufacturing applications.
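
    A minimal sketch of this kind of morphological background subtraction is given below: a grayscale opening with a structuring element larger than any defect estimates the slowly varying background, and the residual is thresholded. The structuring-element size, the threshold rule and the assumption that defects appear brighter than the background are illustrative choices.

```python
import numpy as np
from scipy.ndimage import grey_opening

def detect_weld_defects(image, bg_size=51, k=3.0):
    """Grayscale-morphology background subtraction for defect detection.

    A grey opening with a large structuring element suppresses small
    bright structures and so estimates the non-uniform background;
    subtracting it leaves the defects, which are segmented with a
    simple global threshold (dark defects would use the negated
    residual or a grey closing instead)."""
    background = grey_opening(image, size=(bg_size, bg_size))
    residual = image.astype(float) - background
    threshold = residual.mean() + k * residual.std()
    return residual > threshold            # boolean defect mask
```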

  19. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, due to the evaluation of EBIC equipment performance and the numerical optimization of equipment items, the constant acquisition of high contrast images has become possible, improving the reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  20. Traditional versus rule-based programming techniques: Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    To the software design community, the concern over the costs associated with a program's execution time and implementation is great. It is always desirable, and sometimes imperative, that the proper programming technique is chosen which minimizes all costs for a given application or type of application. A study is described that compared cost-related factors associated with traditional programming techniques to rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.

  1. Vocabulary Extension through Poetry.

    ERIC Educational Resources Information Center

    Surajlal, K. C.

    1986-01-01

    Based on the notion that teaching vocabulary extension in isolation makes little impact on students, a three-part exercise, designed to develop students' vocabulary through poetry while providing meaningful enjoyment, uses the poem "The Hawk" by A. C. Benson. In the first class period, students are introduced to both the exercise and the poem and…

  2. Multiplex fluorescence-based primer extension method for quantitative mutation analysis of mitochondrial DNA and its diagnostic application for Alzheimer's disease.

    PubMed Central

    Fahy, E; Nazarbaghi, R; Zomorrodi, M; Herrnstadt, C; Parker, W D; Davis, R E; Ghosh, S S

    1997-01-01

    A sensitive and highly reproducible multiplexed primer extension assay is described for quantitative mutation analysis of heterogeneous DNA populations. Wild-type and mutant target DNA are simultaneously probed in competitive primer extension reactions using fluorophor-labeled primers and high fidelity, thermostable DNA polymerases in the presence of defined mixtures of deoxy- and dideoxynucleotides. Primers are differentially extended and the resulting products are distinguished by size and dye label. Wild-type:mutant DNA ratios are determined from the fluorescence intensities associated with electrophoretically resolved reaction products. Multiple nucleotide sites can be simultaneously interrogated with uniquely labeled primers of different lengths. The application of this quantitative technique is shown in the analysis of heteroplasmic point mutations in mitochondrial DNA that are associated with Alzheimer's disease. PMID:9224611
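
    At its core the quantitation reduces to a ratio of the fluorescence intensities of the resolved extension products, as in the toy function below; the calibration against known wild-type:mutant mixtures that a real assay would require to correct for unequal extension efficiencies is omitted.

```python
def heteroplasmy_fraction(intensity_mutant, intensity_wildtype):
    """Fraction of mutant mtDNA estimated from the fluorescence
    intensities of the electrophoretically resolved extension products
    (uncalibrated sketch)."""
    total = intensity_mutant + intensity_wildtype
    return intensity_mutant / total if total > 0 else float("nan")
```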

  3. Pre-drift extension of the Atlantic margins of North America and Europe based on paths of Permo-Triassic apparent polar wander

    NASA Astrophysics Data System (ADS)

    Beck, Myrl E.; Housen, Bernard A.

    2003-01-01

    We reconstruct the relative configuration of North America and Europe prior to separation using paths of apparent polar wander (APW) for the interval 300 to 200 Ma. The Bullard et al. (1965) reconstruction closely superimposes the 300 Ma points on the two APW paths but leaves the 200 Ma points far apart. Conversely, anomaly-based reconstructions for later times approximately superimpose the 200 Ma ends of the paths but leave the older ends far apart. This indicates that separation of the interiors of the two continents began during the interval 300 to 200 Ma, long before surficial rifting commenced in the late Mesozoic. This in turn requires pre-rift extension in the two continental margins. Extension appears to have occurred in two phases of approximately equal magnitude but significantly different direction; the change in direction occurred at about 200 Ma. The earlier (300 to 200 Ma) episode of extension appears to have involved a strong element of sinistral shear. Based on our preferred reconstruction, the total amount of pre-rift extension of the two continental margins may have been as much as 1400 km.

  4. Detection of alpha electro-encephalogram onset following eye closure using four location-based techniques.

    PubMed

    Searle, A; Kirkup, L

    2001-07-01

    Detection of alpha activity in the electro-encephalogram (EEG) has been used extensively in neurophysiological studies. Previously applied alpha parameterisation techniques, which utilise the amplitude information from a pair of differential electrodes, are often susceptible to interference from artifact signals. This is a particular concern when the detected change in alpha wave synchronisation is to serve as the basis of an environmental control system (ECS). An alternative approach to alpha activity detection is proposed that utilises the information from an array of electrodes on the scalp to estimate the apparent location of alpha activity in the brain. Four methods are described that successfully detect the onset of the alpha EEG increase following eye closure by monitoring the apparent location of alpha activity in the head. The methods use Bartlett beamforming, a four-sphere anatomical head model, the MUSIC algorithm and a new 'power vector' technique. Of the methods described, the power vector technique is found to be the most successful. The power vector technique detects the alpha increase associated with eye closure in times that are, on average, 33% lower than previously applied alpha detection methods. PMID:11523732

  5. Repeat Customer Success in Extension

    ERIC Educational Resources Information Center

    Bess, Melissa M.; Traub, Sarah M.

    2013-01-01

    Four multi-session research-based programs were offered by two Extension specialists in one rural Missouri county. Eleven participants who came to multiple Extension programs could be called "repeat customers." Based on the total number of participants for all four programs, 25% could be deemed repeat customers. Repeat customers had…

  6. A Load-based Micro-indentation Technique for Mechanical Property and NDE Evaluation

    SciTech Connect

    Bruce S. Kang; Chuanyu Feng; Jared M. Tannenbaum; M.A. Alvin

    2009-06-04

    A load-based micro-indentation technique has been developed for evaluating mechanical properties of materials. Instead of using measured indentation depth or contact area as a necessary parameter, the new technique is based on the indentation load, coupled with a multiple-partial unloading procedure for mechanical property evaluation. The proposed load-based micro-indentation method is capable of determining Young’s modulus of metals, superalloys, and single crystal matrices, and stiffness of coated material systems with flat, tubular, or curved architectures. This micro-indentation technique can be viewed as a viable non-destructive evaluation (NDE) technique for determining as-manufactured and process-exposed metal, superalloy, single crystal, and TBC-coated material properties. Based on this technique, several bond coated substrates were tested at various stages of thermal cycles. The time-series evaluation of test material surface stiffness reveals the status of coating strength without any alteration of the coating surface, making it a true time-series NDE investigation. The micro-indentation test results show good correlation with post mortem microstructural analyses. This technique also shows promise for the development of a portable instrument for on-line, in-situ NDE and mechanical properties measurement of structural components.

  7. Simultaneous and integrated neutron-based techniques for material analysis of a metallic ancient flute

    NASA Astrophysics Data System (ADS)

    Festa, G.; Pietropaolo, A.; Grazzi, F.; Sutton, L. F.; Scherillo, A.; Bognetti, L.; Bini, A.; Barzagli, E.; Schooneveld, E.; Andreani, C.

    2013-09-01

    A metallic 19th century flute was studied by means of integrated and simultaneous neutron-based techniques: neutron diffraction, neutron radiative capture analysis and neutron radiography. This experiment follows benchmark measurements devoted to assessing the effectiveness of a multitask beamline concept for neutron-based investigation on materials. The aim of this study is to show the potential application of the approach using multiple and integrated neutron-based techniques for musical instruments. Such samples, in the broad scenario of cultural heritage, represent an exciting research field. They may represent an interesting link between different disciplines such as nuclear physics, metallurgy and acoustics.

  8. An optoelectrokinetic technique for programmable particle manipulation and bead-based biosignal enhancement.

    PubMed

    Wang, Kuan-Chih; Kumar, Aloke; Williams, Stuart J; Green, Nicolas G; Kim, Kyung Chun; Chuang, Han-Sheng

    2014-10-21

    Technologies that can enable concentration of low-abundance biomarkers are essential for early diagnosis of diseases. In this study, an optoelectrokinetic technique, termed Rapid Electrokinetic Patterning (REP), was used to enable dynamic particle manipulation in bead-based bioassays. Various manipulation capabilities, such as micro/nanoparticle aggregation, translation, sorting and patterning, were developed. The technique allows for versatile multi-parameter (voltage, light intensity and frequency) based modulation and dynamically addressable manipulation with simple device fabrication. Signal enhancement of a bead-based bioassay was demonstrated using dilute biotin-fluorescein isothiocyanate (FITC) solutions mixed with streptavidin-conjugated particles and rapidly concentrated with the technique. As compared with a conventional ELISA reader, the REP-enabled detection achieved a minimal readout of 3.87 nM, which was a 100-fold improvement in sensitivity. The multi-functional platform provides an effective measure to enhance detection levels in more bead-based bioassays. PMID:25109364

  9. A spread-spectrum based synchronization technique for digital broadcast systems

    NASA Astrophysics Data System (ADS)

    Holden, Thomas P.; Feher, Kamilo

    1990-09-01

    An experimentally verified technique is presented to improve the synchronization efficiency of digital communication systems over present systems without sacrificing reliability. This technique is called the spread-spectrum pilot technique (SSPT) and incorporates elements of tone-calibrated techniques and spread-spectrum systems. In the SSPT system, the desired pilot tone is modulated by a pseudorandom binary sequence (PRBS) generator of finite duration and then linearly added to the source (customer) data. At the receiver, the same finite PRBS sequence is used to decode the received signal. Based on the research presented, it is expected that this type of system will lead to better performance than the techniques currently used in multipath fading environments, which are especially problematic in mobile broadcast and communication applications.

  10. An efficient technique for nuclei segmentation based on ellipse descriptor analysis and improved seed detection algorithm.

    PubMed

    Xu, Hongming; Lu, Cheng; Mandal, Mrinal

    2014-09-01

    In this paper, we propose an efficient method for segmenting cell nuclei in the skin histopathological images. The proposed technique consists of four modules. First, it separates the nuclei regions from the background with an adaptive threshold technique. Next, an elliptical descriptor is used to detect the isolated nuclei with elliptical shapes. This descriptor classifies the nuclei regions based on two ellipticity parameters. Nuclei clumps and nuclei with irregular shapes are then localized by an improved seed detection technique based on voting in the eroded nuclei regions. Finally, undivided nuclei regions are segmented by a marked watershed algorithm. Experimental results on 114 different image patches indicate that the proposed technique provides a superior performance in nuclei detection and segmentation.
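
    A sketch of the four-module pipeline using common image-processing primitives is given below; the adaptive-threshold block size, the eccentricity/solidity screen that stands in for the ellipse descriptor, and the distance-transform peak detection that stands in for the voting-based seed detector are all simplifying assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import feature, filters, measure, segmentation

def segment_nuclei(gray, block_size=101):
    """Sketch of the four modules: adaptive thresholding, screening of
    isolated elliptical nuclei, seed detection for clumps, and
    marker-controlled watershed."""
    # 1) adaptive threshold (nuclei assumed darker than background)
    mask = gray < filters.threshold_local(gray, block_size)
    labels = measure.label(mask)

    # 2) isolated, roughly elliptical nuclei are accepted directly
    isolated = [r.label for r in measure.regionprops(labels)
                if r.eccentricity < 0.9 and r.solidity > 0.9]

    # 3) seeds for the remaining (clumped or irregular) regions
    distance = ndi.distance_transform_edt(mask)
    peaks = feature.peak_local_max(distance, min_distance=10, labels=labels)
    markers = np.zeros_like(labels)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

    # 4) marker-controlled watershed splits touching nuclei
    nuclei = segmentation.watershed(-distance, markers, mask=mask)
    return nuclei, isolated
```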

  11. Review of Fluorescence-Based Velocimetry Techniques to Study High-Speed Compressible Flows

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Johansen, Craig; Inman, Jennifer A.; Jones, Stephen B.; Danehy, Paul M.

    2013-01-01

    This paper reviews five laser-induced fluorescence-based velocimetry techniques that have been used to study high-speed compressible flows at NASA Langley Research Center. The techniques discussed in this paper include nitric oxide (NO) molecular tagging velocimetry (MTV), nitrogen dioxide photodissociation (NO2-to-NO) MTV, and NO and atomic oxygen (O-atom) Doppler-shift-based velocimetry. Measurements of both single-component and two-component velocity have been performed using these techniques. This paper details the specific application and experiment for which each technique has been used, the facility in which the experiment was performed, the experimental setup, sample results, and a discussion of the lessons learned from each experiment.

  12. Interferometric dynamic measurement: techniques based on high-speed imaging or a single photodetector.

    PubMed

    Fu, Yu; Pedrini, Giancarlo; Li, Xide

    2014-01-01

    In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured in any instant and the Nyquist sampling theorem has to be satisfied along the time axis on each measurement point. Two types of techniques were developed for such measurements: one is based on high-speed cameras and the other uses a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capturing rate, while the photodetector-based technology can only do the measurement on a single point. In this paper, several aspects of these two technologies are discussed. For the camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For the detector-based interferometry, the discussion mainly focuses on the single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the effort done by researchers for the improvement of the measurement capabilities using interferometry-based techniques to cover the requirements needed for the industrial applications. PMID:24963503

  13. Interferometric Dynamic Measurement: Techniques Based on High-Speed Imaging or a Single Photodetector

    PubMed Central

    Fu, Yu; Pedrini, Giancarlo

    2014-01-01

    In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured in any instant and the Nyquist sampling theorem has to be satisfied along the time axis on each measurement point. Two types of techniques were developed for such measurements: one is based on high-speed cameras and the other uses a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capturing rate, while the photodetector-based technology can only do the measurement on a single point. In this paper, several aspects of these two technologies are discussed. For the camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For the detector-based interferometry, the discussion mainly focuses on the single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the effort done by researchers for the improvement of the measurement capabilities using interferometry-based techniques to cover the requirements needed for the industrial applications. PMID:24963503

  14. Gene expression data clustering using a multiobjective symmetry based clustering technique.

    PubMed

    Saha, Sriparna; Ekbal, Asif; Gupta, Kshitija; Bandyopadhyay, Sanghamitra

    2013-11-01

    The invention of microarrays has rapidly changed the state of biological and biomedical research. Clustering algorithms play an important role in clustering microarray data sets, where identifying groups of co-expressed genes is a very difficult task. Here we have posed the problem of clustering microarray data as a multiobjective clustering problem. A new symmetry based fuzzy clustering technique is developed to solve this problem. The effectiveness of the proposed technique is demonstrated on five publicly available benchmark data sets. Results are compared with some widely used microarray clustering techniques. Statistical and biological significance tests have also been carried out. PMID:24209942

  15. Radiation Effects Investigations Based on Atmospheric Radiation Model (ATMORAD) Considering GEANT4 Simulations of Extensive Air Showers and Solar Modulation Potential.

    PubMed

    Hubert, Guillaume; Cheminet, Adrien

    2015-07-01

    The natural radiative atmospheric environment is composed of secondary cosmic rays produced when primary cosmic rays hit the atmosphere. Understanding atmospheric radiations and their dynamics is essential for evaluating single event effects, so that radiation risks in aviation and the space environment (space weather) can be assessed. In this article, we present an atmospheric radiation model, named ATMORAD (Atmospheric Radiation), which is based on GEANT4 simulations of extensive air showers according to primary spectra that depend only on the solar modulation potential (force-field approximation). Based on neutron spectrometry, solar modulation potential can be deduced using neutron spectrometer measurements and ATMORAD. Some comparisons between our methodology and standard approaches or measurements are also discussed. This work demonstrates the potential for using simulations of extensive air showers and neutron spectroscopy to monitor solar activity.
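
    The force-field approximation referred to above is commonly written as the following relation between the spectrum at Earth and the local interstellar spectrum; ATMORAD's exact parameterisation may differ.

```latex
% J(E,\phi): differential spectrum at Earth, J_LIS: local interstellar
% spectrum, \phi: solar modulation potential, Z and A: charge and mass
% numbers, E: kinetic energy per nucleon, E_0: nucleon rest energy.
\[
  J(E,\phi) \;=\; J_{\mathrm{LIS}}\!\left(E + \tfrac{Z}{A}\,\phi\right)
  \frac{E\,(E + 2E_0)}
       {\left(E + \tfrac{Z}{A}\,\phi\right)\left(E + \tfrac{Z}{A}\,\phi + 2E_0\right)}
\]
```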

  16. Repeatable construction method for engineered zinc finger nuclease based on overlap extension PCR and TA-cloning.

    PubMed

    Fujii, Wataru; Kano, Kiyoshi; Sugiura, Koji; Naito, Kunihiko

    2013-01-01

    Zinc finger nuclease (ZFN) is a useful tool for endogenous site-directed genome modification. The development of an easier, less expensive and repeatedly usable construction method for various sequences of ZFNs should contribute to the further widespread use of this technology. Here, we establish a novel construction method for ZFNs. Zinc finger (ZF) fragments were synthesized by PCR using short primers coding DNA recognition helices of the ZF domain. DNA-binding domains composed of 4 to 6 ZFs were synthesized by overlap extension PCR of these PCR products, and the DNA-binding domains were joined with a nuclease vector by TA cloning. The short primers coding unique DNA recognition helices can be used repeatedly for other ZFN constructions. By using this novel OLTA (OverLap extension PCR and TA-cloning) method, arbitrary ZFN vectors were synthesized within 3 days, from the designing to the sequencing of the vector. Four different ZFN sets synthesized by OLTA showed nuclease activities at endogenous target loci. Genetically modified mice were successfully generated using ZFN vectors constructed by OLTA. This method, which enables the construction of intended ZFNs repeatedly and inexpensively in a short period of time, should contribute to the advancement of ZFN technology.

  17. Functional restoration of elbow extension after spinal-cord injury using a neural network-based synergistic FES controller.

    PubMed

    Giuffrida, Joseph P; Crago, Patrick E

    2005-06-01

    Individuals with a C5/C6 spinal-cord injury (SCI) have paralyzed elbow extensors, yet retain weak to strong voluntary control of elbow flexion and some shoulder movements. They lack elbow extension, which is critical during activities of daily living. This research focuses on the functional evaluation of a developed synergistic controller employing remaining voluntary elbow flexor and shoulder electromyography (EMG) to control elbow extension with functional electrical stimulation (FES). Remaining voluntarily controlled upper extremity muscles were used to train an artificial neural network (ANN) to control stimulation of the paralyzed triceps. Surface EMG was collected from SCI subjects while they produced isometric endpoint force vectors of varying magnitude and direction using triceps stimulation levels predicted by a biomechanical model. ANNs were trained with the collected EMG and stimulation levels. We hypothesized that once trained and implemented in real-time, the synergistic controller would provide several functional benefits. We anticipated the synergistic controller would provide a larger range of endpoint force vectors, the ability to grade and maintain forces, the ability to complete a functional overhead reach task, and use less overall stimulation than a constant stimulation scheme.

  18. Distributed Synchronization Technique for OFDMA-Based Wireless Mesh Networks Using a Bio-Inspired Algorithm

    PubMed Central

    Kim, Mi Jeong; Maeng, Sung Joon; Cho, Yong Soo

    2015-01-01

    In this paper, a distributed synchronization technique based on a bio-inspired algorithm is proposed for an orthogonal frequency division multiple access (OFDMA)-based wireless mesh network (WMN) with a time difference of arrival. The proposed time- and frequency-synchronization technique uses only the signals received from the neighbor nodes, by considering the effect of the propagation delay between the nodes. It achieves a fast synchronization with a relatively low computational complexity because it is operated in a distributed manner, not requiring any feedback channel for the compensation of the propagation delays. In addition, a self-organization scheme that can be effectively used to construct 1-hop neighbor nodes is proposed for an OFDMA-based WMN with a large number of nodes. The performance of the proposed technique is evaluated with regard to the convergence property and synchronization success probability using a computer simulation. PMID:26225974
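
    The abstract does not spell out the bio-inspired update rule; a common choice in this literature is a firefly-style pulse-coupled oscillator scheme, sketched below with an explicit propagation-delay correction. The update rule, coupling constant and data structures are assumptions for illustration.

```python
def pco_step(phases, neighbors, delays, alpha=0.1, period=1.0):
    """One round of a pulse-coupled-oscillator update: every node nudges
    its phase toward the delay-corrected phases it hears from its
    neighbours, so the network converges to a common timing without a
    feedback channel."""
    new_phases = dict(phases)
    for node, phi in phases.items():
        for nb in neighbors[node]:
            heard = (phases[nb] + delays[node][nb]) % period   # delay compensation
            diff = (heard - phi + period / 2) % period - period / 2
            new_phases[node] = (new_phases[node] + alpha * diff) % period
    return new_phases
```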

  19. Efficient techniques for wave-based sound propagation in interactive applications

    NASA Astrophysics Data System (ADS)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. This spherical harmonic-based representation of source directivity can support analytical, data

  20. A risk assessment modeling technique based on knowledge extraction and information diffusion with support specification

    NASA Astrophysics Data System (ADS)

    Zhang, Ren; Peng, Peng; Xu, Zhisheng; Li, Jiaxun; Niu, Shengjie

    2013-11-01

    Because samples and cases are scarce, or even absent, in actual risk evaluation and decision support, risk assessment modeling with general statistical methods and data mining techniques is difficult. An approach to knowledge extraction and risk assessment based on support specification is presented, and its basic operating steps and modeling route are expounded. Using an example from aviation weather support, a risk evaluation model of how weather influences aircraft take-off and landing safety was established, and the corresponding evaluation experiments were carried out.

  1. Stabilizing operation point technique based on the tunable distributed feedback laser for interferometric sensors

    NASA Astrophysics Data System (ADS)

    Mao, Xuefeng; Zhou, Xinlei; Yu, Qingxu

    2016-02-01

    We describe a stabilizing operation-point technique based on a tunable Distributed Feedback (DFB) laser for quadrature demodulation of interferometric sensors. By introducing automatic quadrature-point locking and periodic wavelength-tuning compensation into an interferometric system, the operation point is stabilized when the system suffers various environmental perturbations. To demonstrate the feasibility of this technique, experiments were performed using a tunable DFB laser as the light source to interrogate an extrinsic Fabry-Perot interferometric vibration sensor and a diaphragm-based acoustic sensor. Experimental results show that good tracking of the Q-point was effectively realized.

  2. Microcapsule-based techniques for improving the safety of lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Baginska, Marta

    Lithium-ion batteries are vital energy storage devices due to their high specific energy density, lack of memory effect, and long cycle life. While they are predominantly used in small consumer electronics, new strategies for improving battery safety and lifetime are critical to the successful implementation of high-capacity, fast-charging materials required for advanced Li-ion battery applications. Currently, the presence of a volatile, combustible electrolyte and an oxidizing agent (Lithium oxide cathodes) make the Li-ion cell susceptible to fire and explosions. Thermal overheating, electrical overcharging, or mechanical damage can trigger thermal runaway, and if left unchecked, combustion of battery materials. To improve battery safety, autonomic, thermally-induced shutdown of Li-ion batteries is demonstrated by depositing thermoresponsive polymer microspheres onto battery anodes. When the internal temperature of the cell reaches a critical value, the microspheres melt and conformally coat the anode and/or separator with an ion insulating barrier, halting Li-ion transport and shutting down the cell permanently. Charge and discharge capacity is measured for Li-ion coin cells containing microsphere-coated anodes or separators as a function of capsule coverage. Scanning electron microscopy images of electrode surfaces from cells that have undergone autonomic shutdown provides evidence of melting, wetting, and re-solidification of polyethylene (PE) into the anode and polymer film formation at the anode/separator interface. As an extension of this autonomic shutdown approach, a particle-based separator capable of performing autonomic shutdown, but which reduces the shorting hazard posed by current bi- and tri-polymer commercial separators, is presented. This dual-particle separator is composed of hollow glass microspheres acting as a physical spacer between electrodes, and PE microspheres to impart autonomic shutdown functionality. An oil-immersion technique is

  3. A hospital-wide clinical findings dictionary based on an extension of the International Classification of Diseases (ICD).

    PubMed Central

    Bréant, C.; Borst, F.; Campi, D.; Griesser, V.; Momjian, S.

    1999-01-01

    The use of a controlled vocabulary set in a hospital-wide clinical information system is of crucial importance for many departmental database systems to communicate and exchange information. In the absence of an internationally recognized clinical controlled vocabulary set, a new extension of the International statistical Classification of Diseases (ICD) is proposed. It expands the scope of the standard ICD beyond diagnosis and procedures to clinical terminology. In addition, the common Clinical Findings Dictionary (CFD) further records the definition of clinical entities. The construction of the vocabulary set and the CFD is incremental and manual. Tools have been implemented to facilitate the tasks of defining/maintaining/publishing dictionary versions. The design of database applications in the integrated clinical information system is driven by the CFD which is part of the Medical Questionnaire Designer tool. Several integrated clinical database applications in the field of diabetes and neuro-surgery have been developed at the HUG. PMID:10566451

  4. Comparative study of manual liquid-based cytology (MLBC) technique and direct smear technique (conventional) on fine-needle cytology/fine-needle aspiration cytology samples

    PubMed Central

    Pawar, Prajkta Suresh; Gadkari, Rasika Uday; Swami, Sunil Y.; Joshi, Anil R.

    2014-01-01

    Background: Liquid-based cytology enables cells to be suspended in a liquid medium and spread in a monolayer, allowing better morphological assessment. Automated techniques have been widely used but are limited by cost and availability. Aim: The aim was to establish a manual liquid-based cytology (MLBC) technique on fine-needle aspiration cytology (FNAC) material and compare its results with the conventional technique. Materials and Methods: In this study, we examined cells trapped in the needle hubs used for the collection of FNAC samples. Fifty cases were examined by the MLBC technique and compared with the conventional FNAC technique. By centrifugation, sediment was obtained and an imprint was taken on a defined area. Papanicolaou (Pap) and May-Grünwald Giemsa (MGG) staining was done. Direct smears and MLBC smears were compared for cellularity, background, cellular preservation, and nuclear preservation. Slides were diagnosed independently by two cytologists with more than 5 years’ experience. The standard error of proportion was used for statistical analysis. Results: Cellularity was lower in MLBC than in conventional smears, which is expected as remnant material in the needle hub was used. Nuclei overlapped to a lesser extent and hemorrhage and necrosis were reduced, so cell morphology can be better studied with the MLBC technique. The P value obtained was <0.05. Conclusion: The MLBC technique gives results comparable to the conventional technique with better morphology. In a setup where the aspirators are learners, this technique will ensure adequacy because the remnant in the needle hub is processed. PMID:25210235

  5. Extensive soft-sediment deformation and peperite formation at the base of a rhyolite lava: Owyhee Mountains, SW Idaho, USA

    NASA Astrophysics Data System (ADS)

    McLean, Charlotte E.; Brown, David J.; Rawcliffe, Heather J.

    2016-06-01

    In the Northern Owyhee Mountains (SW Idaho), a >200-m-thick flow of the Miocene Jump Creek Rhyolite was erupted on to a sequence of tuffs, lapilli tuffs, breccias and lacustrine siltstones of the Sucker Creek Formation. The rhyolite lava flowed over steep palaeotopography, resulting in the forceful emplacement of lava into poorly consolidated sediments. The lava invaded this sequence, liquefying and mobilising the sediment, propagating sediment subvertically in large metre-scale fluidal diapirs and sediment injectites. The heat and the overlying pressure of the thick Jump Creek Rhyolite extensively liquefied and mobilised the sediment resulting in the homogenization of the Sucker Creek Formation units, and the formation of metre-scale loading structures (simple and pendulous load casts, detached pseudonodules). Density contrasts between the semi-molten rhyolite and liquefied sediment produced highly fluidal Rayleigh-Taylor structures. Local fluidisation formed peperite at the margins of the lava and elutriation structures in the disrupted sediment. The result is a 30-40-m zone beneath the rhyolite lava of extremely deformed stratigraphy. Brittle failure and folding is recorded in more consolidated sediments, indicating a differential response to loading due to the consolidation state of the sediments. The lava-sediment interaction is interpreted as being a function of (1) the poorly consolidated nature of the sediments, (2) the thickness and heat retention of the rhyolite lava, (3) the density contrast between the lava and the sediment and (4) the forceful emplacement of the lava. This study demonstrates how large lava bodies have the potential to extensively disrupt sediments and form significant lateral and vertical discontinuities that complicate volcanic facies architecture.

  6. Error-reduction techniques and error analysis for fully phase- and amplitude-based encryption.

    PubMed

    Javidi, B; Towghi, N; Maghzi, N; Verrall, S C

    2000-08-10

    The performance of fully phase- and amplitude-based encryption processors is analyzed. The effects of noise perturbations on the encrypted information are considered. A thresholding method of decryption that further reduces the mean-squared error (MSE) for the fully phase- and amplitude-based encryption processes is provided. The proposed thresholding scheme significantly improves the performance of fully phase- and amplitude-based encryption, as measured by the MSE metric. We obtain analytical MSE bounds when thresholding is used for both decryption methods, and we also present computer-simulation results. These results show that the fully phase-based method is more robust. We also give a formal proof of a conjecture about the decrypted distribution of distorted encrypted information. This allows the analytical bounds of the MSE to be extended to more general non-Gaussian, nonadditive, nonstationary distortions. Computer simulations support this extension.

  7. A comparison of model-based and hyperbolic localization techniques as applied to marine mammal calls

    NASA Astrophysics Data System (ADS)

    Tiemann, Christopher O.; Porter, Michael B.

    2003-10-01

    A common technique for the passive acoustic localization of singing marine mammals is that of hyperbolic fixing. This technique assumes straight-line, constant wave speed acoustic propagation to associate travel time with range, but in some geometries, these assumptions can lead to localization errors. A new localization algorithm based on acoustic propagation models can account for waveguide and multipath effects, and it has successfully been tested against real acoustic data from three different environments (Hawaii, California, and Bahamas) and three different species (humpback, blue, and sperm whales). Accuracy of the model-based approach has been difficult to verify given the absence of concurrent visual and acoustic observations of the same animal. However, the model-based algorithm was recently exercised against a controlled source of known position broadcasting recorded whale sounds, and location estimates were then compared to hyperbolic techniques and true source position. In geometries where direct acoustic paths exist, both model-based and hyperbolic techniques perform equally well. However, in geometries where bathymetric and refractive effects are important, such as at long range, the model-based approach shows improved accuracy.
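
    The hyperbolic-fixing baseline can be sketched as a nonlinear least-squares fit of straight-line, constant-speed travel-time differences to the measured TDOAs, as below; the nominal sound speed and the two-dimensional geometry are simplifying assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def hyperbolic_fix(receivers, tdoas, c=1500.0, x0=(0.0, 0.0)):
    """Estimate a source position from time-differences-of-arrival,
    assuming straight-line propagation at a constant speed c (m/s).
    `receivers` holds hydrophone coordinates; `tdoas` holds the delays
    of receivers 1..N-1 relative to receiver 0."""
    receivers = np.asarray(receivers, dtype=float)

    def residuals(p):
        ranges = np.linalg.norm(receivers - p, axis=1)
        predicted = (ranges[1:] - ranges[0]) / c
        return predicted - np.asarray(tdoas, dtype=float)

    return least_squares(residuals, x0).x
```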

  8. ISAR imaging of maneuvering targets based on the range centroid Doppler technique.

    PubMed

    Lv, Xiaolei; Xing, Mengdao; Wan, Chunru; Zhang, Shouhong

    2010-01-01

    A new inverse synthetic aperture radar (ISAR) imaging approach is presented for application in situations where the maneuverability of noncooperative target is not too severe and the Doppler variation of subechoes from scatterers can be approximated as a first-order polynomial. The proposed algorithm is referred to as the range centroid Doppler (RCD) ISAR imaging technique and is based on the stretch Keystone-Wigner transform (SKWT). The SKWT introduces a stretch weight factor containing a range of chirp rate into the autocorrelation function of each cross-range profile and uses a 1-D interpolation of the phase history which we call stretch keystone formatting. The processing simultaneously eliminates the effects of linear frequency migration for all signal components regardless of their unknown chirp rate in time-frequency plane, but not for the noise or for the cross terms. By utilizing this novel technique, clear ISAR imaging can be achieved for maneuvering targets without an exhaustive search procedure for the motion parameters. Performance comparison is carried out to evaluate the improvement of the RCD technique versus other methods such as the conventional range Doppler (RD) technique, the range instantaneous Doppler (RID) technique, and adaptive joint time-frequency (AJTF) technique. Examples provided demonstrate the effectiveness of the RCD technique with both simulated and experimental ISAR data.

  9. Evaluation of paint coating thickness variations based on pulsed Infrared thermography laser technique

    NASA Astrophysics Data System (ADS)

    Mezghani, S.; Perrin, E.; Vrabie, V.; Bodnar, J. L.; Marthe, J.; Cauwe, B.

    2016-05-01

    In this paper, a pulsed Infrared thermography technique using homogeneous heating provided by a laser source is used for the non-destructive evaluation of paint coating thickness variations. Firstly, numerical simulations of the thermal response of a paint coated sample are performed. By analyzing the thermal responses as a function of the thermal properties and thickness of both the coating and substrate layers, optimal excitation parameters of the heating source are determined. Two characteristic parameters were studied with respect to the paint coating layer thickness variations. Results obtained using an experimental test bench based on the pulsed Infrared thermography laser technique are compared with those given by a classical Eddy current technique for paint coating variations from 5 to 130 μm. These results demonstrate the efficiency of this approach and suggest that the pulsed Infrared thermography technique offers good prospects for characterizing the heterogeneity of paint coatings on large scale samples with other heating sources.

  10. Mobility Based Key Management Technique for Multicast Security in Mobile Ad Hoc Networks

    PubMed Central

    Madhusudhanan, B.; Chitra, S.; Rajan, C.

    2015-01-01

    In MANET multicasting, forward and backward secrecy result in increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead which increases energy consumption and end-to-end delay. Particularly, the prevailing group key management techniques cause frequent mobility and disconnections. So there is a need to design a multicast key management technique to overcome these problems. In this paper, we propose the mobility based key management technique for multicast security in MANET. Initially, the nodes are categorized according to their stability index which is estimated based on the link availability and mobility. A multicast tree is constructed such that for every weak node, there is a strong parent node. A session key-based encryption technique is utilized to transmit a multicast data. The rekeying process is performed periodically by the initiator node. The rekeying interval is fixed depending on the node category so that this technique greatly minimizes the rekeying overhead. By simulation results, we show that our proposed approach reduces the packet drop rate and improves the data confidentiality. PMID:25834838
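
    The sketch below illustrates the general idea of the stability index and the category-dependent rekeying interval; the weighting, threshold and interval values are invented for illustration and are not those defined in the paper.

```python
def stability_index(link_availability, mobility, w=0.5):
    """Combine normalised link availability (higher is better) and
    mobility (lower is better) into a score in [0, 1]."""
    return w * link_availability + (1.0 - w) * (1.0 - mobility)

def rekey_interval(index, strong_interval=300.0, weak_interval=60.0):
    """Stable (strong) nodes tolerate longer rekeying intervals, which
    is how the scheme reduces rekeying overhead."""
    return strong_interval if index >= 0.5 else weak_interval
```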

  11. Vision-based system identification technique for building structures using a motion capture system

    NASA Astrophysics Data System (ADS)

    Oh, Byung Kwan; Hwang, Jin Woo; Kim, Yousok; Cho, Tongjun; Park, Hyo Seon

    2015-11-01

    This paper presents a new vision-based system identification (SI) technique for building structures by using a motion capture system (MCS). The MCS, with outstanding capabilities for dynamic response measurement, can provide gage-free measurements of vibrations through the convenient installation of multiple markers. In this technique, the dynamic characteristics (natural frequencies, mode shapes, and damping ratios) of building structures are extracted from the dynamic displacement responses measured by the MCS, after converting the displacements to accelerations and conducting SI by frequency domain decomposition (FDD). A free vibration experiment on a three-story shear frame was conducted to validate the proposed technique. The SI results from the conventional accelerometer-based method were compared with those from the proposed technique and showed good agreement, which confirms the validity and applicability of the proposed vision-based SI technique for building structures. Furthermore, SI applying the MCS-measured displacements directly to FDD was performed and showed results identical to those of the conventional SI method.
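
    As a hedged illustration of the processing chain described above (displacement to acceleration, then frequency domain decomposition), the following sketch uses a second-order finite difference and an SVD of the cross-spectral density matrix; the window length and estimator choices are assumptions, not the authors' settings.

    ```python
    import numpy as np
    from scipy.signal import csd

    def disp_to_accel(disp, fs):
        """Second-order central difference: a ~ (x[i+1] - 2 x[i] + x[i-1]) * fs**2.
        disp: (n_samples, n_channels) displacement from the motion capture markers."""
        return np.diff(disp, n=2, axis=0) * fs**2

    def fdd_first_singular_value(acc, fs, nperseg=1024):
        """FDD sketch: build the cross-spectral density matrix at every frequency
        line and return its first singular value; peaks of s1(f) indicate natural
        frequencies, and the corresponding singular vectors approximate mode shapes."""
        n_ch = acc.shape[1]
        f, _ = csd(acc[:, 0], acc[:, 0], fs=fs, nperseg=nperseg)
        G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
        for i in range(n_ch):
            for j in range(n_ch):
                _, G[:, i, j] = csd(acc[:, i], acc[:, j], fs=fs, nperseg=nperseg)
        s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(len(f))])
        return f, s1
    ```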

  12. Mobility based key management technique for multicast security in mobile ad hoc networks.

    PubMed

    Madhusudhanan, B; Chitra, S; Rajan, C

    2015-01-01

    In MANET multicasting, forward and backward secrecy result in increased packet drop rate owing to mobility. Frequent rekeying causes large message overhead which increases energy consumption and end-to-end delay. Particularly, the prevailing group key management techniques cause frequent mobility and disconnections. So there is a need to design a multicast key management technique to overcome these problems. In this paper, we propose the mobility based key management technique for multicast security in MANET. Initially, the nodes are categorized according to their stability index which is estimated based on the link availability and mobility. A multicast tree is constructed such that for every weak node, there is a strong parent node. A session key-based encryption technique is utilized to transmit a multicast data. The rekeying process is performed periodically by the initiator node. The rekeying interval is fixed depending on the node category so that this technique greatly minimizes the rekeying overhead. By simulation results, we show that our proposed approach reduces the packet drop rate and improves the data confidentiality.

  13. DCT-Yager FNN: a novel Yager-based fuzzy neural network with the discrete clustering technique.

    PubMed

    Singh, A; Quek, C; Cho, S Y

    2008-04-01

    superior performance. Extensive experiments have been conducted to test the effectiveness of these two networks, using various clustering algorithms. It follows that the SDCT and UDCT clustering algorithms are particularly suited to networks based on the Yager inference rule.

  14. Novel technique for distributed fibre sensing based on coherent Rayleigh scattering measurements of birefringence

    NASA Astrophysics Data System (ADS)

    Lu, Xin; Soto, Marcelo A.; Thévenaz, Luc

    2016-05-01

    A novel distributed fibre sensing technique is described and experimentally validated, based on birefringence measurements using coherent Rayleigh scattering. It natively provides distributed measurements of temperature and strain with more than an order of magnitude higher sensitivity than Brillouin sensing, while requiring access to only a single fibre end. Unlike traditional Rayleigh-based coherent optical time-domain reflectometry, this new method provides absolute measurements of the measurand and may lead to a robust discrimination between temperature and strain in combination with another technique. Since birefringence is purposely induced in the fibre by design, large degrees of freedom are offered to optimize and scale the sensitivity to a given quantity. The technique has been validated in two radically different types of birefringent fibres (elliptical-core and Panda polarization-maintaining fibres) with good repeatability.

  15. Segmentation techniques evaluation based on a single compact breast mass classification scheme

    NASA Astrophysics Data System (ADS)

    Matheus, Bruno R. N.; Marcomini, Karem D.; Schiabel, Homero

    2016-03-01

    In this work several segmentation techniques are evaluated by using a simple centroid-based classification system for breast mass delineation in digital mammography images. The aim is to determine the best one for future CADx developments. Six techniques were tested: Otsu, SOM, EICAMM, Fuzzy C-Means, K-Means and Level-Set. All of them were applied to segment 317 mammography images from the DDSM database. A single compact set of attributes was extracted and two centroids were defined, one for malignant and another for benign cases. The final classification was based on proximity to a given centroid, and the best results were obtained by the Level-Set technique with an accuracy of 68.1%, which indicates this method as the most promising for breast mass segmentation aiming at more precise interpretation in CADx schemes.
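
    The centroid-based classification step lends itself to a very small sketch; the snippet below (feature extraction omitted, attribute set assumed) fits one centroid per class and labels a new case by proximity, in the spirit of the scheme described above.

    ```python
    import numpy as np

    def fit_centroids(X, y):
        """X: (n_samples, n_features) attribute vectors; y: 0 = benign, 1 = malignant."""
        return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

    def classify(x, centroids):
        """Assign the label of the nearest centroid (Euclidean distance)."""
        return min(centroids, key=lambda label: np.linalg.norm(x - centroids[label]))
    ```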

  16. Growth of healthy term infants fed ready-to-feed and powdered forms of an extensively hydrolyzed casein-based infant formula: a randomized, blinded, controlled trial.

    PubMed

    Borschel, Marlene W; Baggs, Geraldine E; Barrett-Reis, Bridget

    2014-06-01

    Extensively hydrolyzed formulas present a complex matrix subject to adverse conditions during manufacture that could influence growth and tolerance of infants fed these formulas. A masked, randomized, parallel growth study was conducted in infants fed a ready-to-feed (RTF) or powdered (PWD) form of an extensively hydrolyzed casein-based formula. Infants were enrolled between 0 and 9 days and studied to 112 days of age. Growth, formula intake, and stool patterns were assessed. There were no significant differences between groups in weight, length, head circumference, or their respective gains. Tolerance was similar between groups except that the RTF group had greater formula intakes and passed more stools/day compared to the PWD group. This study demonstrates that the PWD formulation of this RTF formula supports similar growth and tolerance in infants during the first 4 months of life.

  17. Randomization techniques for the intensity modulation-based quantum stream cipher and progress of experiment

    NASA Astrophysics Data System (ADS)

    Kato, Kentaro; Hirota, Osamu

    2011-08-01

    The quantum-noise-based direct encryption protocol Y-00 is expected to provide physical-complexity-based security, which is thought to be comparable to information-theoretic security in mathematical cryptography, for the physical layer of fiber-optic communication systems. So far, several randomization techniques for the quantum stream cipher by the Y-00 protocol have been proposed, but most of them were developed under the assumption that phase shift keying is used as the modulation format. On the other hand, recent progress in the experimental study of the intensity modulation based quantum stream cipher by the Y-00 protocol raises expectations for its realization. The purpose of this paper is to present design and implementation methods for a composite model of the intensity modulation based quantum stream cipher with some randomization techniques. As a result, this paper gives a viewpoint on how the Y-00 cryptosystem can be miniaturized.

  18. Extensive Reading Coursebooks in China

    ERIC Educational Resources Information Center

    Renandya, Willy A.; Hu, Guangwei; Xiang, Yu

    2015-01-01

    This article reports on a principle-based evaluation of eight dedicated extensive reading coursebooks published in mainland China and used in many universities across the country. The aim is to determine the extent to which these coursebooks reflect a core set of nine second language acquisition and extensive reading principles. Our analysis shows…

  19. Molecular-Based Optical Measurement Techniques for Transition and Turbulence in High-Speed Flow

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Danehy, Paul M.; Cutler, Andrew D.

    2013-01-01

    High-speed laminar-to-turbulent transition and turbulence affect the control of flight vehicles, the heat transfer rate to a flight vehicle's surface, the material selected to protect such vehicles from high heating loads, the ultimate weight of a flight vehicle due to the presence of thermal protection systems, the efficiency of fuel-air mixing processes in high-speed combustion applications, etc. Gaining a fundamental understanding of the physical mechanisms involved in the transition process will lead to the development of predictive capabilities that can identify transition location and its impact on parameters like surface heating. Currently, there is no general theory that can completely describe the transition-to-turbulence process. However, transition research has led to the identification of the predominant pathways by which this process occurs. For a truly physics-based model of transition to be developed, the individual stages in the paths leading to the onset of fully turbulent flow must be well understood. This requires that each pathway be computationally modeled and experimentally characterized and validated. This may also lead to the discovery of new physical pathways. This document is intended to describe molecular based measurement techniques that have been developed, addressing the needs of the high-speed transition-to-turbulence and high-speed turbulence research fields. In particular, we focus on techniques that have either been used to study high speed transition and turbulence or techniques that show promise for studying these flows. This review is not exhaustive. In addition to the probe-based techniques described in the previous paragraph, several other classes of measurement techniques that are, or could be, used to study high speed transition and turbulence are excluded from this manuscript. For example, surface measurement techniques such as pressure and temperature paint, phosphor thermography, skin friction measurements and

  20. Enhanced Detection of Multivariate Outliers Using Algorithm-Based Visual Display Techniques.

    ERIC Educational Resources Information Center

    Dickinson, Wendy B.

    This study uses an algorithm-based visual display technique (FACES) to provide enhanced detection of multivariate outliers within large-scale data sets. The FACES computer graphing algorithm (H. Chernoff, 1973) constructs a cartoon-like face, using up to 18 variables for each case. A major advantage of FACES is the ability to store and show the…

  1. Validation of Learning Effort Algorithm for Real-Time Non-Interfering Based Diagnostic Technique

    ERIC Educational Resources Information Center

    Hsu, Pi-Shan; Chang, Te-Jeng

    2011-01-01

    The objective of this research is to validate the algorithm of learning effort which is an indicator of a new real-time and non-interfering based diagnostic technique. IC3 Mentor, the adaptive e-learning platform fulfilling the requirements of intelligent tutor system, was applied to 165 university students. The learning records of the subjects…

  2. Using a Written Journal Technique to Enhance Inquiry-Based Reflection about Teaching

    ERIC Educational Resources Information Center

    Fry, Jane; Carol, Klages; Venneman, Sandy

    2013-01-01

    The aim of this study was to explore the efficacy of two written journal techniques used to encourage teacher candidates' inquiry-based reflection regarding course textbook content. Ninety-six participants were randomly assigned to one of the two experimental conditions, journaling with Questions, Quotes and Reflections (Double Q R) or…

  3. Phase demodulation from a single fringe pattern based on a correlation technique.

    PubMed

    Robin, Eric; Valle, Valéry

    2004-08-01

    We present a method for determining the demodulated phase from a single fringe pattern. This method, based on a correlation technique, searches in a zone of interest for the degree of similarity between a real fringe pattern and a mathematical model. This method, named modulated phase correlation, is tested with different examples. PMID:15298408
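
    To make the idea concrete, here is a one-dimensional toy version of correlation-based phase demodulation; it is only a sketch of the principle (searching for the model parameters that maximise similarity within a local window), not the authors' two-dimensional modulated phase correlation implementation, and the search grids are arbitrary.

    ```python
    import numpy as np

    def demodulate_1d(signal, half_win=8,
                      freqs=np.linspace(0.05, 0.5, 40),
                      phases=np.linspace(-np.pi, np.pi, 90)):
        """At each sample, pick the (frequency, phase) of the cosine model that best
        correlates with a small window of the fringe signal; return the phase map."""
        n = len(signal)
        idx = np.arange(-half_win, half_win + 1)
        phase_map = np.zeros(n)
        for k in range(half_win, n - half_win):
            window = signal[k - half_win:k + half_win + 1]
            window = (window - window.mean()) / (window.std() + 1e-12)
            best_phase, best_score = 0.0, -np.inf
            for f in freqs:
                for p in phases:
                    model = np.cos(2 * np.pi * f * idx + p)
                    score = np.dot(window, model) / (np.linalg.norm(model) + 1e-12)
                    if score > best_score:              # degree of similarity
                        best_score, best_phase = score, p
            phase_map[k] = best_phase
        return phase_map
    ```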

  4. Two Student Self-Management Techniques Applied to Data-Based Program Modification.

    ERIC Educational Resources Information Center

    Wesson, Caren

    Two student self-management techniques, student charting and student selection of instructional activities, were applied to ongoing data-based program modification. Forty-two elementary school resource room students were assigned randomly (within teacher) to one of three treatment conditions: Teacher Chart-Teacher Select Instructional Activities…

  5. Characterization and performance of carbon films deposited by plasma and ion beam based techniques

    SciTech Connect

    Walter, K.C.; Kung, H.; Levine, T.

    1994-12-31

    Plasma and ion beam based techniques have been used to deposit carbon-based films. The ion beam based method, a cathodic arc process, used a magnetically mass analyzed beam and is inherently a line-of-sight process. Two hydrocarbon plasma-based, non-line-of-sight techniques were also used and have the advantage of being capable of coating complicated geometries. The self-bias technique can produce hard carbon films, but is dependent on rf power and the surface area of the target. The pulsed-bias technique can also produce hard carbon films but has the additional advantage of being independent of rf power and target surface area. Tribological results indicated the coefficient of friction is nearly the same for carbon films from each deposition process, but the wear rate of the cathodic arc film was five times less than for the self-bias or pulsed-bias films. Although the cathodic arc film was the hardest, contained the highest fraction of sp³ bonds and exhibited the lowest wear rate, the cathodic arc film also produced the highest wear on the 440C stainless steel counterface during tribological testing. Thus, for tribological applications requiring low wear rates for both counterfaces, coating one surface with a very hard, wear resistant film may detrimentally affect the tribological behavior of the counterface.

  6. An Autocorrelation-Based Technique for Built-Up Area Detection in ALOS PALSAR Imagery

    NASA Astrophysics Data System (ADS)

    Stasolla, Mattia; Gamba, Paolo

    2008-11-01

    The paper describes the results of the human extent analysis developed in the framework of project ALOS-3648, "Human settlements mapping in Africa by SAR". The approach exploits textural indexes to detect human extent masks that are finally combined. Extensive tests on different and sparsely located sites show that the approach is valid and robust enough to be considered for regional processing of ALOS/PALSAR data for human settlement extraction in Africa. Further improvements are possible and will be considered, but the achieved results are indeed promising for producing the first regional urban extent database extracted from VHR SAR data.

  7. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
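
    As a minimal, self-contained illustration of the agent-based style of simulation discussed above (not an example taken from the article), the sketch below models diffusion of an innovation through spontaneous adoption and imitation; the parameters p and q are arbitrary.

    ```python
    import random

    def simulate_diffusion(n_agents=1000, p=0.01, q=0.3, steps=50, seed=0):
        """Each agent adopts either spontaneously (prob. p) or by imitating the
        current fraction of adopters (prob. q * fraction); returns adopters per step."""
        rng = random.Random(seed)
        adopted = [False] * n_agents
        history = []
        for _ in range(steps):
            frac = sum(adopted) / n_agents
            for i in range(n_agents):
                if not adopted[i] and rng.random() < p + q * frac:
                    adopted[i] = True
            history.append(sum(adopted))
        return history

    print(simulate_diffusion()[-1])   # number of adopters after 50 steps
    ```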

  8. Texture-based measurement of spatial frequency response using the dead leaves target: extensions, and application to real camera systems

    NASA Astrophysics Data System (ADS)

    McElvain, Jon; Campbell, Scott P.; Miller, Jonathan; Jin, Elaine W.

    2010-01-01

    The dead leaves model was recently introduced as a method for measuring the spatial frequency response (SFR) of camera systems. The target consists of a series of overlapping opaque circles with a uniform gray level distribution and radii distributed as r^-3. Unlike the traditional knife-edge target, the SFR derived from the dead leaves target will be penalized for systems that employ aggressive noise reduction. Initial studies have shown that the dead leaves SFR correlates well with sharpness/texture blur preference, and thus the target can potentially be used as a surrogate for more expensive subjective image quality evaluations. In this paper, the dead leaves target is analyzed for measurement of camera system spatial frequency response. It was determined that the power spectral density (PSD) of the ideal dead leaves target does not exhibit simple power law dependence, and scale invariance is only loosely obeyed. An extension to the ideal dead leaves PSD model is proposed, including a correction term to account for system noise. With this extended model, the SFR of several camera systems with a variety of formats was measured, ranging from 3 to 10 megapixels; the effects of handshake motion blur are also analyzed via the dead leaves target.
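
    A basic dead leaves chart of the kind described above can be generated in a few lines; in this hedged sketch the radii are drawn by inverse-transform sampling from a truncated r^-3 law, while the image size, radius bounds, and disc count are arbitrary choices.

    ```python
    import numpy as np

    def sample_radius(rng, r_min=2.0, r_max=100.0):
        """Inverse-CDF sampling from a pdf proportional to r^-3 on [r_min, r_max]."""
        u = rng.random()
        return (r_min**-2 - u * (r_min**-2 - r_max**-2)) ** -0.5

    def dead_leaves(size=512, n_discs=5000, seed=0):
        """Occluding discs with uniform random grey levels; later discs cover earlier ones."""
        rng = np.random.default_rng(seed)
        img = np.full((size, size), 0.5)
        yy, xx = np.mgrid[0:size, 0:size]
        for _ in range(n_discs):
            r = sample_radius(rng)
            cx, cy = rng.uniform(0, size, 2)
            img[(xx - cx)**2 + (yy - cy)**2 <= r**2] = rng.random()
        return img
    ```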

  9. Chaotic extension neural network theory-based XXY stage collision fault detection using a single accelerometer sensor.

    PubMed

    Hsieh, Chin-Tsung; Yau, Her-Terng; Wu, Shang-Yi; Lin, Huo-Cheng

    2014-11-14

    The collision fault detection of a XXY stage is proposed for the first time in this paper. The stage characteristic signals are extracted and imported into the master and slave chaos error systems by signal filtering from the vibratory magnitude of the stage. The trajectory diagram is made from the chaos synchronization dynamic error signals E1 and E2. The distance between characteristic positive and negative centers of gravity, as well as the maximum and minimum distances of trajectory diagram, are captured as the characteristics of fault recognition by observing the variation in various signal trajectory diagrams. The matter-element model of normal status and collision status is built by an extension neural network. The correlation grade of various fault statuses of the XXY stage was calculated for diagnosis. The dSPACE is used for real-time analysis of stage fault status with an accelerometer sensor. Three stage fault statuses are detected in this study, including normal status, Y collision fault and X collision fault. It is shown that the scheme can have at least 75% diagnosis rate for collision faults of the XXY stage. As a result, the fault diagnosis system can be implemented using just one sensor, and consequently the hardware cost is significantly reduced.

  10. An Extension of the Athena++ Code Framework for GRMHD Based on Advanced Riemann Solvers and Staggered-mesh Constrained Transport

    NASA Astrophysics Data System (ADS)

    White, Christopher J.; Stone, James M.; Gammie, Charles F.

    2016-08-01

    We present a new general relativistic magnetohydrodynamics (GRMHD) code integrated into the Athena++ framework. Improving upon the techniques used in most GRMHD codes, ours allows the use of advanced, less diffusive Riemann solvers, in particular HLLC and HLLD. We also employ a staggered-mesh constrained transport algorithm suited for curvilinear coordinate systems in order to maintain the divergence-free constraint of the magnetic field. Our code is designed to work with arbitrary stationary spacetimes in one, two, or three dimensions, and we demonstrate its reliability through a number of tests. We also report on its promising performance and scalability.

  11. A hybrid algorithm for clustering of time series data based on affinity search technique.

    PubMed

    Aghabozorgi, Saeed; Ying Wah, Teh; Herawan, Tutut; Jalab, Hamid A; Shaygan, Mohammad Amin; Jalali, Alireza

    2014-01-01

    Time series clustering is an important solution to various problems in numerous fields of research, including business, medical science, and finance. However, conventional clustering algorithms are not practical for time series data because they are essentially designed for static data. This impracticality results in poor clustering accuracy in several systems. In this paper, a new hybrid clustering algorithm is proposed based on the similarity in shape of time series data. Time series data are first grouped as subclusters based on similarity in time. The subclusters are then merged using the k-Medoids algorithm based on similarity in shape. This model has two contributions: (1) it is more accurate than other conventional and hybrid approaches and (2) it determines the similarity in shape among time series data with a low complexity. To evaluate the accuracy of the proposed model, the model is tested extensively using synthetic and real-world time series datasets.
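
    The two-phase structure described above can be sketched as follows; the pre-clustering rule, the shape distance (1 - Pearson correlation), and the choice of k are stand-ins for the paper's affinity search and similarity measures, chosen only to illustrate the flow.

    ```python
    import numpy as np

    def pre_cluster(X, eps):
        """Phase 1: leader-style grouping by similarity in time (Euclidean distance)."""
        groups, protos = [], []
        for x in X:
            d = [np.linalg.norm(x - p) for p in protos]
            if d and min(d) < eps:
                j = int(np.argmin(d)); groups[j].append(x)
                protos[j] = np.mean(groups[j], axis=0)
            else:
                groups.append([x]); protos.append(x.copy())
        return np.array(protos)

    def shape_dist(a, b):
        """Shape dissimilarity: 1 - Pearson correlation."""
        return 1.0 - np.corrcoef(a, b)[0, 1]

    def k_medoids(P, k, iters=20):
        """Phase 2: merge prototypes with k-medoids under the shape distance."""
        D = np.array([[shape_dist(a, b) for b in P] for a in P])
        medoids = list(range(k))
        for _ in range(iters):
            labels = np.argmin(D[:, medoids], axis=1)
            new = []
            for c in range(k):
                members = np.where(labels == c)[0]
                if len(members) == 0:
                    new.append(medoids[c]); continue
                costs = D[np.ix_(members, members)].sum(axis=1)
                new.append(int(members[np.argmin(costs)]))
            if new == medoids:
                break
            medoids = new
        return labels, medoids
    ```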

  12. An Error Diagnosis Technique Based on Location Sets to Rectify Subcircuits

    NASA Astrophysics Data System (ADS)

    Shioki, Kosuke; Okada, Narumi; Ishihara, Toshiro; Hirose, Tetsuya; Kuroki, Nobutaka; Numa, Masahiro

    This paper presents an error diagnosis technique for incremental synthesis, called EXLLS (Extended X-algorithm for LUT-based circuit model based on Location sets to rectify Subcircuits), which rectifies five or more functional errors in the whole circuit based on location sets to rectify subcircuits. The conventional error diagnosis technique, called EXLIT, tries to rectify five or more functional errors by incremental rectification of subcircuits. However, its solution depends on the selection and order of modifications to the subcircuits, which increases the number of locations to be changed. To overcome this problem, we propose EXLLS, based on location sets to rectify subcircuits, which obtains two or more solutions by separating i) the extraction of location sets to be rectified and ii) the rectification of the whole circuit based on those location sets. Thereby, EXLLS can rectify five or more errors with fewer locations to change. Experimental results have shown that EXLLS reduces the increase in the number of locations to be rectified by 90.1% compared with the conventional technique.

  13. Extrusion based rapid prototyping technique: an advanced platform for tissue engineering scaffold fabrication.

    PubMed

    Hoque, M Enamul; Chuan, Y Leng; Pashby, Ian

    2012-02-01

    Advances in scaffold design and fabrication technology have brought the tissue engineering field into a new era. Conventional techniques used to develop scaffolds have inherent limitations, such as lack of control over pore morphology and architecture as well as reproducibility. Rapid prototyping (RP) technology, a layer-by-layer additive approach, offers a unique opportunity to build complex 3D architectures that overcome those limitations and can ultimately be tailored for patient-specific applications. Using RP methods, researchers have been able to customize scaffolds to closely mimic the biomechanical properties (in terms of structural integrity, strength, and microenvironment) of the organ or tissue to be repaired or replaced. This article provides an intensive description of various extrusion-based scaffold fabrication techniques and reviews their potential utility for TE applications. The extrusion-based technique extrudes the molten polymer as a thin filament through a nozzle onto a platform layer by layer, thus building a 3D scaffold. The technique allows full control over pore architecture and dimensions in the x- and y-planes; however, the pore height in the z-direction is predetermined by the diameter of the extruding nozzle rather than by the technique itself. This review attempts to assess the current state and future prospects of this technology.

  14. Encoding technique for high data compaction in data bases of fusion devices

    SciTech Connect

    Vega, J.; Cremy, C.; Sanchez, E.; Portas, A.

    1996-12-01

    At present, data requirements of hundreds of Mbytes/discharge are typical in devices such as JET, TFTR, DIII-D, etc., and these requirements continue to increase. With these rates, the amount of storage required to maintain discharge information is enormous. Compaction techniques are now essential to reduce storage. However, general compression techniques may distort signals, but this is undesirable for fusion diagnostics. We have developed a general technique for data compression which is described here. The technique, which is based on delta compression, does not require an examination of the data as in delayed methods. Delta values are compacted according to general encoding forms which satisfy a prefix code property and which are defined prior to data capture. Several prefix codes, which are bit oriented and which have variable code lengths, have been developed. These encoding methods are independent of the signal analog characteristics and enable one to store undistorted signals. The technique has been applied to databases of the TJ-I tokamak and the TJ-IU torsatron. Compaction rates of over 80% with negligible computational effort were achieved. Computer programs were written in ANSI C, thus ensuring portability and easy maintenance. We also present an interpretation, based on information theory, of the high compression rates achieved without signal distortion. © 1996 American Institute of Physics.
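
    To illustrate the general idea of delta compaction with a bit-oriented, variable-length prefix code (the specific encoding forms defined in the paper are not reproduced), here is a small sketch using a zig-zag mapping of signed deltas and an Elias-gamma-style code.

    ```python
    def zigzag(d):
        """Map signed deltas to non-negative ints: 0, -1, 1, -2, 2 -> 0, 1, 2, 3, 4."""
        return 2 * d if d >= 0 else -2 * d - 1

    def elias_gamma(n):
        """Prefix-free code for n >= 1: (len-1) zero bits followed by binary(n)."""
        b = bin(n)[2:]
        return "0" * (len(b) - 1) + b

    def encode(samples):
        """Delta-encode then prefix-code a sequence of integer samples
        (the first sample is taken as a delta from zero)."""
        bits, prev = [], 0
        for s in samples:
            bits.append(elias_gamma(zigzag(s - prev) + 1))   # +1: gamma needs n >= 1
            prev = s
        return "".join(bits)

    print(encode([100, 101, 101, 99]))
    ```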

  15. Dimensional Changes of Acrylic Resin Denture Bases: Conventional Versus Injection-Molding Technique

    PubMed Central

    Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam

    2014-01-01

    Objective: Acrylic resin denture bases undergo dimensional changes during polymerization. Injection molding techniques are reported to reduce these changes and thereby improve the physical properties of denture bases. The aim of this study was to compare the dimensional changes of specimens processed by conventional and injection-molding techniques. Materials and Methods: SR-Ivocap Triplex Hot resin was used for the conventional pressure-packed technique and SR-Ivocap High Impact for the injection-molding technique. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using the t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. Results: After each water storage period, the acrylic specimens produced by injection exhibited smaller dimensional changes than those produced by the conventional technique. Curing shrinkage was compensated by water sorption, and increasing the water storage time decreased the dimensional changes. Conclusion: Within the limitations of this study, the dimensional changes of acrylic resin specimens were influenced by the molding technique used, and the SR-Ivocap injection procedure exhibited higher dimensional accuracy than conventional molding. PMID:25584050

  16. Spatial and temporal variation of bulk snow properties in northern boreal and tundra environments based on extensive field measurements

    NASA Astrophysics Data System (ADS)

    Hannula, Henna-Reetta; Lemmetyinen, Juha; Kontu, Anna; Derksen, Chris; Pulliainen, Jouni

    2016-08-01

    An extensive in situ data set of snow depth, snow water equivalent (SWE), and snow density collected in support of the European Space Agency (ESA) SnowSAR-2 airborne campaigns in northern Finland during the winter of 2011-2012 is presented (ESA Earth Observation Campaigns data 2000-2016). The suitability of the in situ measurement protocol to provide an accurate reference for the simultaneous airborne SAR (synthetic aperture radar) data products over different land cover types was analysed in the context of spatial scale, sample spacing, and uncertainty. The analysis was carried out by applying autocorrelation analysis and root mean square difference (RMSD) error estimation. The results showed overall higher variability of all three bulk snow parameters over tundra, open bogs, and lakes (due to wind processes), whereas snow depth tended to vary over shorter distances in forests (due to snow-vegetation interactions). Sample spacing/sample size had a statistically significant effect on the mean snow depth over all land cover types. Analysis of 50, 100, and 200 m transects revealed that in most cases fewer than five samples were adequate to describe the mean snow depth with RMSD < 5 %, but for land cover types with high overall variability a sample size 1.5-3 times larger was indicated, depending on the scale and the desired maximum RMSD. Errors for most of the land cover types reached ~10 % if only three measurements were considered. The collected measurements, which are available via the ESA website upon registration, constitute an exceptionally large manually collected snow data set in Scandinavian taiga and tundra environments. This information represents a valuable contribution to the snow research community and can be applied to various snow studies.

  17. Adapting Agriculture Platforms for Nutrition: A Case Study of a Participatory, Video-Based Agricultural Extension Platform in India

    PubMed Central

    Kadiyala, Suneetha; Morgan, Emily H.; Cyriac, Shruthi; Margolies, Amy; Roopnaraine, Terry

    2016-01-01

    Successful integration of nutrition interventions into large-scale development programmes from nutrition-relevant sectors, such as agriculture, can address critical underlying determinants of undernutrition and enhance the coverage and effectiveness of on-going nutrition-specific activities. However, evidence on how this can be done is limited. This study examines the feasibility of delivering maternal, infant, and young child nutrition behaviour change communication through an innovative agricultural extension programme serving nutritionally vulnerable groups in rural India. The existing agriculture programme involves participatory production of low-cost videos promoting best practices and broad dissemination through village-level women’s self-help groups. For the nutrition intervention, 10 videos promoting specific maternal, infant, and young child nutrition practices were produced and disseminated in 30 villages. A range of methods was used to collect data, including in-depth interviews with project staff, frontline health workers, and self-help group members and their families; structured observations of mediated video dissemination sessions; nutrition knowledge tests with project staff and self-help group members; and a social network questionnaire to assess diffusion of promoted nutrition messages. We found the nutrition intervention to be well-received by rural communities and viewed as complementary to existing frontline health services. However, compared to agriculture, nutrition content required more time, creativity, and technical support to develop and deliver. Experimentation with promoted nutrition behaviours was high, but sharing of information from the videos with non-viewers was limited. Key lessons learned include the benefits of and need for collaboration with existing health services; continued technical support for implementing partners; engagement with local cultural norms and beliefs; empowerment of women’s group members to champion

  18. Use of extensively hydrolysed formula for refeeding neonates postnecrotising enterocolitis: a nationwide survey-based, cross-sectional study

    PubMed Central

    Lapillonne, Alexandre; Matar, Maroun; Adleff, Ariane; Chbihi, Marwa; Kermorvant-Duchemin, Elsa; Campeotto, Florence

    2016-01-01

    Objective To evaluate the prevalence of and reasons for using extensively hydrolysed formulas (EHFs) of cow's milk proteins in the French neonatal units as well as the modality of their prescription for refeeding infants recovering from necrotising enterocolitis (NEC). Methods A multicentre nationwide cross-sectional study using a questionnaire to address the prevalence of use and the reasons for prescribing EHF in hospitalised neonates and to examine the protocols and the actual reasons for their use for refeeding infants in recovery from NEC. The questionnaire was sent to only 1 senior neonatologist in each neonatal unit included in the study. Results More than half of the French neonatal units participated in the survey. 91% of the surveyed units used EHF. Of 1969 infants hospitalised on the day the survey was run, 12% were fed on an EHF. 11% of the EHF prescriptions were due to previous NEC. The main reasons for using an EHF to feed infants post-NEC were the absence of human milk (75%) and surgical management of NEC (17%). When given, EHF was mainly prescribed for a period varying between 15 days and 3 months. None of the involved units continued using the EHF after 6 months of age. More than half of the surveyed units acknowledged hospitalising infants for the initiation of weaning EHF but only 21% of them tested these infants for cow's milk allergy. Conclusions The prevalence of EHF use in the French neonatal units is high. Refeeding infants post-NEC is one of the main reasons for such a high prevalence. The main incentive for using an EHF is the absence of human breast milk, either maternal or donor. PMID:27388344

  19. Effect of root canal filling techniques on the bond strength of epoxy resin-based sealers.

    PubMed

    Rached-Júnior, Fuad Jacob Abi; Souza, Angélica Moreira; Macedo, Luciana Martins Domingues; Raucci-Neto, Walter; Baratto-Filho, Flares; Silva, Bruno Marques; Silva-Sousa, Yara Teresinha Corrêa

    2016-01-01

    The aim of this study was to evaluate the effects of different root canal filling techniques on the bond strength of epoxy resin-based sealers. Sixty single-rooted canines were prepared using ProTaper (F5) and divided into the following groups based on the root filling technique: Lateral Compaction (LC), Single Cone (SC), and Tagger Hybrid Technique (THT). The following subgroups (n = 10) were also created based on sealer material used: AH Plus and Sealer 26. Two-millimeter-thick slices were cut from all the root thirds and subjected to push-out test. Data (MPa) was analyzed using ANOVA and Tukey's test (α = 0.05). The push-out values were significantly affected by the sealer, filling technique, and root third (p < 0.05). AH Plus (1.37 ± 1.04) exhibited higher values than Sealer 26 (0.92 ± 0.51), while LC (1.80 ± 0.98) showed greater bond strength than THT (1.16 ± 0.50) and SC (0.92 ± 0.25). The cervical (1.45 ± 1.14) third exhibited higher bond strength, followed by the middle (1.20 ± 0.72) and apical (0.78 ± 0.33) thirds. AH Plus/LC (2.26 ± 1.15) exhibited the highest bond strength values, followed by AH Plus/THT (1.32 ± 0.61), Sealer 26/LC (1.34 ± 0.42), and Sealer 26/THT (1.00 ± 0.27). The lowest values were obtained with AH Plus/SC and Sealer 26/SC. Thus, it can be concluded that the filling technique affects the bond strength of sealers. LC was associated with higher bond strength between the material and intra-radicular dentine than THT and SC techniques. PMID:26910020

  20. Agarose-Based Substrate Modification Technique for Chemical and Physical Guiding of Neurons In Vitro.

    PubMed

    Krumpholz, Katharina; Rogal, Julia; El Hasni, Akram; Schnakenberg, Uwe; Bräunig, Peter; Bui-Göbbels, Katrin

    2015-08-26

    A new low-cost and highly reproducible technique is presented that provides patterned cell culture substrates. These allow selective positioning of cells and chemically and mechanically directed guiding of their extensions. The patterned substrates consist of structured agarose hydrogels molded from reusable silicon micro templates. These templates consist of pins arranged equidistantly in squares, connected by bars, which mold corresponding wells and channels in the nonadhesive agarose hydrogel. Subsequent slice production with a standard vibratome, comprising the described template pattern, completes substrate production. Invertebrate neurons of locusts and pond snails are used for this application as they offer the advantage over vertebrate cells of being very large and suitable for cultivation at low cell density. The neurons adhere to and grow only on the adhesive areas not covered by the agarose. Agarose slices of 50 μm thickness placed on glass, polystyrene, or MEA surfaces position and immobilize the neurons in the wells, and the channels guide their neurite outgrowth toward neighboring wells. In addition to this application with invertebrate neurons, the technique may also be applicable to a wide range of other cell types. The long-term objective is to achieve isolated low-density neuronal networks on MEAs or other culture substrates for various network analysis applications. PMID:26237337

  1. Airframe structural damage detection: a non-linear structural surface intensity based technique.

    PubMed

    Semperlotti, Fabio; Conlon, Stephen C; Barnard, Andrew R

    2011-04-01

    The non-linear structural surface intensity (NSSI) based damage detection technique is extended to airframe applications. The selected test structure is an upper cabin airframe section from a UH-60 Blackhawk helicopter (Sikorsky Aircraft, Stratford, CT). Structural damage is simulated through an impact resonator device, designed to simulate the induced vibration effects typical of non-linear behaving damage. An experimental study is conducted to prove the applicability of NSSI on complex mechanical systems as well as to evaluate the minimum sensor and actuator requirements. The NSSI technique is shown to have high damage detection sensitivity, covering an extended substructure with a single sensing location.

  2. MR-Based Cardiac and Respiratory Motion-Compensation Techniques for PET-MR Imaging.

    PubMed

    Munoz, Camila; Kolbitsch, Christoph; Reader, Andrew J; Marsden, Paul; Schaeffter, Tobias; Prieto, Claudia

    2016-04-01

    Cardiac and respiratory motion cause image quality degradation in PET imaging, affecting diagnostic accuracy of the images. Whole-body simultaneous PET-MR scanners allow for using motion information estimated from MR images to correct PET data and produce motion-compensated PET images. This article reviews methods that have been proposed to estimate motion from MR images and different techniques to include this information in PET reconstruction, in order to overcome the problem of cardiac and respiratory motion in PET-MR imaging. MR-based motion correction techniques significantly increase lesion detectability and contrast, and also improve accuracy of uptake values in PET images.

  3. Airframe structural damage detection: a non-linear structural surface intensity based technique.

    PubMed

    Semperlotti, Fabio; Conlon, Stephen C; Barnard, Andrew R

    2011-04-01

    The non-linear structural surface intensity (NSSI) based damage detection technique is extended to airframe applications. The selected test structure is an upper cabin airframe section from a UH-60 Blackhawk helicopter (Sikorsky Aircraft, Stratford, CT). Structural damage is simulated through an impact resonator device, designed to simulate the induced vibration effects typical of non-linear behaving damage. An experimental study is conducted to prove the applicability of NSSI on complex mechanical systems as well as to evaluate the minimum sensor and actuator requirements. The NSSI technique is shown to have high damage detection sensitivity, covering an extended substructure with a single sensing location. PMID:21476618

  4. Gradient-based multiobjective optimization using a distance constraint technique and point replacement

    NASA Astrophysics Data System (ADS)

    Sato, Yuki; Izui, Kazuhiro; Yamada, Takayuki; Nishiwaki, Shinji

    2016-07-01

    This paper proposes techniques to improve the diversity of the searching points during the optimization process in an Aggregative Gradient-based Multiobjective Optimization (AGMO) method, so that well-distributed Pareto solutions are obtained. First to be discussed is a distance constraint technique, applied among searching points in the objective space when updating design variables, that maintains a minimum distance between the points. Next, a scheme is introduced that deals with updated points that violate the distance constraint, by deleting the offending points and introducing new points in areas of the objective space where searching points are sparsely distributed. Finally, the proposed method is applied to example problems to illustrate its effectiveness.
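
    The two mechanisms described above can be sketched independently of the gradient-based update itself; in the snippet below the minimum-distance value, the replacement rule, and the perturbation size are assumptions made only for illustration.

    ```python
    import numpy as np

    def enforce_distance(F, X, d_min):
        """Greedily keep points whose pairwise distance in objective space is >= d_min.
        F: (n, m) objective values, X: (n, d) corresponding design variables."""
        keep = []
        for i in range(len(F)):
            if all(np.linalg.norm(F[i] - F[j]) >= d_min for j in keep):
                keep.append(i)
        return F[keep], X[keep]

    def replace_points(F, X, n_new, sigma=0.05, rng=None):
        """Re-seed deleted points around the searching point whose nearest neighbour
        (in objective space) is farthest away, i.e. in the sparsest region."""
        rng = rng or np.random.default_rng(0)
        D = np.linalg.norm(F[:, None, :] - F[None, :, :], axis=-1)
        np.fill_diagonal(D, np.inf)
        sparsest = int(np.argmax(D.min(axis=1)))
        return X[sparsest] + sigma * rng.standard_normal((n_new, X.shape[1]))
    ```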

  5. Planning and delivery comparison of six linac-based stereotactic radiosurgery techniques

    NASA Astrophysics Data System (ADS)

    Thakur, Varun Singh

    This work presents planning and delivery comparison of linac-based SRS treatment techniques currently available for single lesion cranial SRS. In total, two dedicated SRS systems (Novalis Tx, Cyberknife) and a HI-ART TomoTherapy system with six different delivery techniques are evaluated. Four delivery techniques are evaluated on a Novalis Tx system: circular cones, dynamic conformal arcs (DCA), static non-coplanar intensity modulated radiotherapy (NCP-IMRT), and volumetric modulated arc therapy (RapidArc) techniques are compared with intensity modulation based helical Tomotherapy on the HI-ART Tomotherapy system and with non-isocentric, multiple overlapping based robotic radiosurgery using the CyberKnife system. Thirteen patients are retrospectively selected for the study. The target volumes of each patient are transferred to a CT scan of a Lucy phantom (Standard Imaging Inc., Middleton, WI, USA) designed for end-to-end SRS QA. In order to evaluate the plans, several indices scoring the conformality, homogeneity and gradients in the plan are calculated and compared for each of the plans. Finally, to check the clinical deliverability of the plans and the delivery accuracy of different systems, a few targets are delivered on each system. A comparison between planned dose on treatment planning system and dose delivered on Gafchromic EBT film (ISP, Wayne, New Jersey, USA) is carried out by comparing dose beam profiles, isodose lines and by calculating gamma index.

  6. Experimental comparison between speckle and grating-based imaging technique using synchrotron radiation X-rays.

    PubMed

    Kashyap, Yogesh; Wang, Hongchang; Sawhney, Kawal

    2016-08-01

    X-ray phase contrast and dark-field imaging techniques provide important and complementary information that is inaccessible to the conventional absorption contrast imaging. Both grating-based imaging (GBI) and speckle-based imaging (SBI) are able to retrieve multi-modal images using synchrotron as well as lab-based sources. However, no systematic comparison has been made between the two techniques so far. We present an experimental comparison between GBI and SBI techniques with synchrotron radiation X-ray source. Apart from the simple experimental setup, we find SBI does not suffer from the issue of phase unwrapping, which can often be problematic for GBI. In addition, SBI is also superior to GBI since two orthogonal differential phase gradients can be simultaneously extracted by one dimensional scan. The GBI has less stringent requirements for detector pixel size and transverse coherence length when a second or third grating can be used. This study provides the reference for choosing the most suitable technique for diverse imaging applications at synchrotron facility.

  7. Rotational roadmapping: a new image-based navigation technique for the interventional room.

    PubMed

    Kukuk, Markus; Napel, Sandy

    2007-01-01

    For decades, conventional 2D-roadmaping has been the method of choice for image-based guidewire navigation during endovascular procedures. Only recently have 3D-roadmapping techniques become available that are based on the acquisition and reconstruction of a 3D image of the vascular tree. In this paper, we present a new image-based navigation technique called RoRo (Rotational Roadmapping) that eliminates the guess-work inherent to the conventional 2D method, but does not require a 3D image. Our preliminary clinical results show that there are situations in which RoRo is preferred over the existing two methods, thus demonstrating potential for filling a clinical niche and complementing the spectrum of available navigation tools. PMID:18044622

  8. The use of computer vision techniques to augment home based sensorised environments.

    PubMed

    Uhríková, Zdenka; Nugent, Chris D; Hlavác, Václav

    2008-01-01

    Technology within the home environment is becoming widely accepted as a means to facilitate independent living. Nevertheless, the practical issues of detecting different tasks performed by multiple persons within the same environment, and of managing the uncertainty associated with recorded sensor data, are two key challenges yet to be fully solved. This work presents details of how computer vision techniques can be used as both an alternative and a complementary means in the assessment of behaviour in home-based sensorised environments. Within our work we assessed the ability of vision processing techniques, in conjunction with sensor-based data, to deal with instances of multiple occupancy. Our results indicate that the inclusion of the video data improved the overall process of task identification by detecting and recognizing multiple people in the environment using a color-based tracking algorithm.

  9. Application of mass spectrometry-based proteomics techniques for the detection of protein doping in sports.

    PubMed

    Kay, Richard G; Creaser, Colin S

    2010-04-01

    Mass spectrometry-based proteomic approaches have been used to develop methodologies capable of detecting the abuse of protein therapeutics such as recombinant human erythropoietin and recombinant human growth hormone. Existing detection methods use antibody-based approaches that, although effective, suffer from long assay development times and specificity issues. The application of liquid chromatography with tandem mass spectrometry and selected reaction-monitoring-based analysis has demonstrated the ability to detect and quantify existing protein therapeutics in plasma. Furthermore, the multiplexing capability of selected reaction-monitoring analysis has also aided in the detection of multiple downstream biomarkers in a single analysis, requiring less sample than existing immunological techniques. The flexibility of mass spectrometric instrumentation has shown that the technique is capable of detecting the abuse of novel and existing protein therapeutics, and has a vital role in the fight to keep sports drug-free.

  10. Towards a balanced software team formation based on Belbin team role using fuzzy technique

    NASA Astrophysics Data System (ADS)

    Omar, Mazni; Hasan, Bikhtiyar; Ahmad, Mazida; Yasin, Azman; Baharom, Fauziah; Mohd, Haslina; Darus, Norida Muhd

    2016-08-01

    In software engineering (SE), team roles play significant impact in determining the project success. To ensure the optimal outcome of the project the team is working on, it is essential to ensure that the team members are assigned to the right role with the right characteristics. One of the prevalent team roles is Belbin team role. A successful team must have a balance of team roles. Thus, this study demonstrates steps taken to determine balance of software team formation based on Belbin team role using fuzzy technique. Fuzzy technique was chosen because it allows analyzing of imprecise data and classifying selected criteria. In this study, two roles in Belbin team role, which are Shaper (Sh) and Plant (Pl) were chosen to assign the specific role in software team. Results show that the technique is able to be used for determining the balance of team roles. Future works will focus on the validation of the proposed method by using empirical data in industrial setting.

  11. Research on target recognition techniques of radar networking based on fuzzy mathematics

    NASA Astrophysics Data System (ADS)

    Guan, Chengbin; Wang, Guohong; Guan, Chengzhun; Pan, Jinshan

    2007-11-01

    Nowadays there are more and more targets, so it is increasingly difficult for radar networking to track the important ones. To reduce the pressure on radar networking and the waste of ammunition, it is necessary for radar networking to recognize targets. Two target recognition approaches for radar networking based on fuzzy mathematics are proposed in this paper: the multi-level fuzzy synthetical evaluation technique and the lattice approaching degree technique. By analyzing their principles, the application techniques are given, the merits and shortcomings are analyzed, and suitable application environments are suggested. Another emphasis is the comparison between multiple mono-level fuzzy synthetical evaluation and multi-level fuzzy synthetical evaluation; an example is worked through to illustrate the problem, the results are analyzed theoretically, and conclusions are drawn that can serve as guidance for engineering applications.
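
    A single level of fuzzy synthetical evaluation can be written compactly; in the sketch below the factors, weights, and membership values are invented for illustration (the abstract does not specify them).

    ```python
    import numpy as np

    def fuzzy_evaluate(W, R, composition="weighted"):
        """W: (n_factors,) weight vector summing to 1.
        R: (n_factors, n_classes) membership of each factor in each target class.
        Returns the normalised evaluation vector B = W o R."""
        W, R = np.asarray(W, float), np.asarray(R, float)
        if composition == "weighted":                       # M(., +) operator
            B = W @ R
        else:                                               # M(min, max) operator
            B = np.max(np.minimum(W[:, None], R), axis=0)
        return B / B.sum()

    W = [0.5, 0.3, 0.2]                    # e.g. weights for RCS, speed, altitude factors
    R = [[0.7, 0.2, 0.1],                  # membership in hypothetical target classes
         [0.6, 0.3, 0.1],
         [0.2, 0.5, 0.3]]
    print(fuzzy_evaluate(W, R))            # the class with the largest grade is chosen
    ```

    In the multi-level variant, the evaluation vector B of each sub-factor block becomes one row of the next level's membership matrix R, and the composition is applied again with the higher-level weights.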

  12. A study on laser-based ultrasonic technique by the use of guided wave tomographic imaging

    SciTech Connect

    Park, Junpil; Lim, Juyoung; Cho, Younho; Krishnaswamy, Sridhar

    2015-03-31

    Guided wave tests with contact transducers are impractical for investigating specimens with limited accessibility, coarse surfaces, or geometrically complicated features. A non-contact setup with a laser ultrasonic transmitter and receiver is thus an attractive option for guided wave inspection. The present work was carried out to develop a non-contact guided-wave tomography technique based on laser ultrasonics for plate-like structures. A method for Lamb wave generation and detection in an aluminum plate with a pulsed laser ultrasonic transmitter and a Michelson interferometer receiver has been developed. In the images obtained by laser scanning, the defect shape and area showed good agreement with the actual defect. The proposed approach can be used as a non-contact online inspection and monitoring technique.

  13. Optical implementation of improved resolution with intermediate-view reconstruction technique based on integral imaging

    NASA Astrophysics Data System (ADS)

    Lee, Keong-Jin; Lee, Sang-Tae; Oh, Yong-Seok; Hong, Suk-Pyo; Kim, Chang-Keun; Kim, Eun-Soo

    2008-02-01

    To overcome the viewing resolution limit defined by the Nyquist sampling theorem for a given lenslet pitch, the Moving Array-Lens Technique (MALT) was developed for 3-D integral imaging. Even though MALT is an effective method for improving the resolution of integral imaging, it cannot be applied to a real-time 3-D integral imaging display system because of its mechanical movement. In this paper, we propose an integral imaging display using a computational pickup method based on the Intermediate-View Reconstruction Technique (IVRT) instead of optical moving pickup. Optical experiments show that the proposed system can provide resolution-improved 3-D integral images by using elemental images (EIs) generated by the IVRT.

  14. Reformulation linearization technique based branch-and-reduce approach applied to regional water supply system planning

    NASA Astrophysics Data System (ADS)

    Lan, Fujun; Bayraksan, Güzin; Lansey, Kevin

    2016-03-01

    A regional water supply system design problem that determines pipe and pump design parameters and water flows over a multi-year planning horizon is considered. A non-convex nonlinear model is formulated and solved by a branch-and-reduce global optimization approach. The lower bounding problem is constructed via a three-pronged effort that involves transforming the space of certain decision variables, polyhedral outer approximations, and the Reformulation Linearization Technique (RLT). Range reduction techniques are employed systematically to speed up convergence. Computational results demonstrate the efficiency of the proposed algorithm; in particular, the critical role range reduction techniques could play in RLT based branch-and-bound methods. Results also indicate using reclaimed water not only saves freshwater sources but is also a cost-effective non-potable water source in arid regions. Supplemental data for this article can be accessed at http://dx.doi.org/10.1080/0305215X.2015.1016508.
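
    For readers unfamiliar with the Reformulation Linearization Technique, the first-level RLT (McCormick) treatment of a single bilinear term w = xy illustrates how the lower bounding problem is linearized and why range reduction tightens it; the constraints below are the standard bound-factor products, not the paper's full formulation.

    ```latex
    % First-level RLT / McCormick relaxation of w = x y
    % with x in [x^L, x^U] and y in [y^L, y^U]:
    \begin{aligned}
    (x - x^L)(y - y^L) \ge 0 \;\Rightarrow\; & w \ge x^L y + y^L x - x^L y^L,\\
    (x^U - x)(y^U - y) \ge 0 \;\Rightarrow\; & w \ge x^U y + y^U x - x^U y^U,\\
    (x - x^L)(y^U - y) \ge 0 \;\Rightarrow\; & w \le x^L y + y^U x - x^L y^U,\\
    (x^U - x)(y - y^L) \ge 0 \;\Rightarrow\; & w \le x^U y + y^L x - x^U y^L.
    \end{aligned}
    ```

    Range reduction shrinks [x^L, x^U] and [y^L, y^U], which directly tightens these four linear cuts and hence the lower bound used in the branch-and-reduce search.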

  15. A Rapid, Fluorescence-Based Field Screening Technique for Organic Species in Soil and Water Matrices.

    PubMed

    Russell, Amber L; Martin, David P; Cuddy, Michael F; Bednar, Anthony J

    2016-06-01

    Real-time detection of hydrocarbon contaminants in the environment presents analytical challenges because traditional laboratory-based techniques are cumbersome and not readily field portable. In the current work, a method for rapid and semi-quantitative detection of organic contaminants, primarily crude oil, in natural water and soil matrices has been developed. Detection limits in the parts per million and parts per billion were accomplished when using visual and digital detection methods, respectively. The extraction technique was modified from standard methodologies used for hydrocarbon analysis and provides a straight-forward separation technique that can remove interference from complex natural constituents. For water samples this method is semi-quantitative, with recoveries ranging from 70 % to 130 %, while measurements of soil samples are more qualitative due to lower extraction efficiencies related to the limitations of field-deployable procedures.

  16. Inspection technique for cleaved optical fiber ends based on Fabry-Perot resonator

    NASA Astrophysics Data System (ADS)

    Kihara, Mitsuru; Watanabe, Hiroshi; Yajima, Yuichi; Toyonaga, Masanobu

    2011-05-01

    We present a novel inspection technique for cleaved optical fiber ends based on the Fabry-Perot resonator. The technique mainly uses laser diodes, an optical power meter, a 3-dB coupler, and an XY lateral adjustment stage. It is simpler to implement than current image processing that uses a charge-coupled device camera and a video monitor. The inspected fiber end is judged as good or defective depending on whether the measured return losses from the fiber end at two wavelengths are both approximately 14.7 dB. Experimentally obtained fiber end images were in good agreement with scanning electron microscope observation images. Thus, the proposed technique provides a simple and cost-effective way to inspect cleaved optical fiber ends.
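
    The ~14.7 dB figure quoted above is consistent with the Fresnel reflection at a clean silica-air interface; the short check below assumes an effective index of about 1.45 for the fiber.

    ```python
    import math

    # Return loss of a flat, clean cleave set by Fresnel reflection at glass/air.
    n_fiber, n_air = 1.45, 1.0                          # assumed effective index
    R = ((n_fiber - n_air) / (n_fiber + n_air)) ** 2    # power reflectance ~0.034
    return_loss_dB = -10 * math.log10(R)
    print(round(return_loss_dB, 1))                     # ~14.7 dB
    ```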

  17. Preliminary Tests of a Practical Fuzzy FES Controller Based on Cycle-to-Cycle Control in the Knee Flexion and Extension Control

    NASA Astrophysics Data System (ADS)

    Watanabe, Takashi; Masuko, Tomoya; Arifin, Achmad

    The fuzzy controller based on cycle-to-cycle control with output value adjustment factors (OAFs) was developed for restoring the gait of paralyzed subjects by using functional electrical stimulation (FES). Results of maximum knee flexion and extension control with neurologically intact subjects suggested that the OAFs would be effective in reaching the target within a small number of cycles and in reducing the error after reaching the target. Oscillating responses between cycles were also suppressed. The fuzzy controller is expected to be examined further, to optimize the OAFs with more subjects including paralyzed patients, for clinical application.

  18. Experience with school-based interventions against soil-transmitted helminths and extension of coverage to non-enrolled children.

    PubMed

    Olsen, Annette

    2003-05-01

    This paper reviews the experience with school-based interventions against soil-transmitted helminths with regard to reduction in prevalence, intensity of infection and morbidity. It also examines the existing experience with coverage of school-based programmes to non-enrolled children. However, as this experience is limited, the paper also seeks to give an overview of the need for school control programmes to include other segments of the community. The experiences from the programmes indicate that treatment should be performed twice or thrice yearly without prior diagnosis, should be school-based and involving schoolteachers assisted by health staff, if possible. The drugs of choice are a single dose of 400 mg albendazole or 500 mg mebendazole. If intensities of Trichuris trichiura or hookworm infections are high, a double or triple dose of one of these drugs could be considered to maximise reduction in intensities. For the benefit of growth and iron status, it should be considered to supplement with iron and other micronutrients. School-based programmes should include non-enrolled school age children and pre-school children, and the system of having 'treatment days' at school, where these groups are invited for treatment, seems to be a promising strategy. While antenatal clinics have been involved in the anthelminthic treatment of pregnant women, they have not covered non-pregnant adolescent girls and women. These could be offered treatment through the 'treatment days' at school mentioned earlier.

  19. An Extension to the Constructivist Coding Hypothesis as a Learning Model for Selective Feedback when the Base Rate Is High

    ERIC Educational Resources Information Center

    Ghaffarzadegan, Navid; Stewart, Thomas R.

    2011-01-01

    Elwin, Juslin, Olsson, and Enkvist (2007) and Henriksson, Elwin, and Juslin (2010) offered the constructivist coding hypothesis to describe how people code the outcomes of their decisions when availability of feedback is conditional on the decision. They provided empirical evidence only for the 0.5 base rate condition. This commentary argues that…

  20. Optimal technique of linear accelerator-based stereotactic radiosurgery for tumors adjacent to brainstem.

    PubMed

    Chang, Chiou-Shiung; Hwang, Jing-Min; Tai, Po-An; Chang, You-Kang; Wang, Yu-Nong; Shih, Rompin; Chuang, Keh-Shih

    2016-01-01

    Stereotactic radiosurgery (SRS) is a well-established technique that is replacing whole-brain irradiation in the treatment of intracranial lesions, which leads to better preservation of brain functions, and therefore a better quality of life for the patient. There are several available forms of linear accelerator (LINAC)-based SRS, and the goal of the present study is to identify which of these techniques is best (as evaluated statistically by dosimetric outcomes) when the target is located adjacent to the brainstem. We collected the records of 17 patients with lesions close to the brainstem who had previously been treated with single-fraction radiosurgery. In all, 5 different lesion categories were collected, and the patients were divided into 2 distance groups: one consisting of 7 patients with a target-to-brainstem distance of less than 0.5 cm, and the other of 10 patients with a target-to-brainstem distance of ≥ 0.5 and < 1 cm. Comparison was then made among the following 3 types of LINAC-based radiosurgery: dynamic conformal arcs (DCA), intensity-modulated radiosurgery (IMRS), and volumetric modulated arc radiotherapy (VMAT). All techniques included multiple noncoplanar beams or arcs with or without intensity-modulated delivery. The gross tumor volume (GTV) ranged from 0.2 cm³ to 21.9 cm³. The dose homogeneity index (HI(ICRU)) and conformity index (CI(ICRU)) showed no statistically significant differences between the techniques. However, the average CI(ICRU) = 1.09 ± 0.56 achieved by VMAT was the best of the 3 techniques. Moreover, a notable improvement in gradient index (GI) was observed when VMAT was used (0.74 ± 0.13), and this result was significantly better than those achieved by the 2 other techniques (p < 0.05). For V4Gy of the brainstem, both VMAT (2.5%) and IMRS (2.7%) were significantly lower than DCA (4.9%), both at the p < 0.05 level. Regarding V2Gy of normal brain, VMAT plans attained 6.4 ± 5%; this was significantly better (p < 0.05) than

  1. Review of pyroelectric thermal energy harvesting and new MEMs based resonant energy conversion techniques

    SciTech Connect

    Hunter, Scott Robert; Lavrik, Nickolay V; Mostafa, Salwa; Rajic, Slobodan; Datskos, Panos G

    2012-01-01

    Harvesting electrical energy from thermal energy sources using pyroelectric conversion techniques has been under investigation for over 50 years, but it has not received the attention that thermoelectric energy harvesting techniques have during this time period. This lack of interest stems from early studies which found that the energy conversion efficiencies achievable using pyroelectric materials were several times less than those potentially achievable with thermoelectrics. More recent modeling and experimental studies have shown that pyroelectric techniques can be cost competitive with thermoelectrics and, using new temperature cycling techniques, have the potential to be several times as efficient as thermoelectrics under comparable operating conditions. This paper reviews the recent history of this field and describes the techniques that are being developed to increase the opportunities for pyroelectric energy harvesting. The development of a new thermal energy harvester concept, based on temperature-cycled pyroelectric thermal-to-electrical energy conversion, is also outlined. The approach uses a resonantly driven, pyroelectric capacitive bimorph cantilever structure that can be used to rapidly cycle the temperature in the energy harvester. The device has been modeled using a finite element multi-physics based method, where the effects of the structure material properties and system parameters on the frequency and magnitude of temperature cycling, and the efficiency of energy recycling using the proposed structure, have been modeled. Results show that thermal contact conductance and heat source temperature differences play key roles in determining the cantilever resonant frequency and the efficiency of the energy conversion technique. This paper outlines the modeling, fabrication and testing of cantilever and pyroelectric structures and single element devices that demonstrate the potential of this technology for the development of high efficiency thermal

  2. Sex-based differences in lifting technique under increasing load conditions: A principal component analysis.

    PubMed

    Sheppard, P S; Stevenson, J M; Graham, R B

    2016-05-01

    The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopted a similar lifting technique. In addition, as load increased male and female participants changed their lifting technique in a similar manner. PMID:26851478
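
    The following is a hedged sketch of the waveform-PCA step this record relies on: trials-by-time joint-angle curves are decomposed, and components are retained until 90% of the trace (total variance) is explained. The array shapes, the synthetic data and the function name are illustrative, not taken from the study.

```python
# Hedged sketch of waveform PCA with a 90% trace criterion: rows are trials,
# columns are time-normalized joint-angle samples for one joint/axis.
import numpy as np

def retained_components(waveforms: np.ndarray, trace_criterion: float = 0.90):
    """Return the number of retained PCs and their scores for one joint/axis."""
    centered = waveforms - waveforms.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]          # largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(explained, trace_criterion)) + 1
    scores = centered @ eigvecs[:, :k]         # PC scores used in the ANOVA step
    return k, scores

rng = np.random.default_rng(0)
demo = rng.standard_normal((25, 101))          # 25 lifts, 101 time points (synthetic)
k, scores = retained_components(demo)
print(k, scores.shape)
```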

  3. Biosensor-based microRNA detection: techniques, design, performance, and challenges.

    PubMed

    Johnson, Blake N; Mutharasan, Raj

    2014-04-01

    The current state of biosensor-based techniques for amplification-free microRNA (miRNA) detection is critically reviewed. Comparison with non-sensor and amplification-based molecular techniques (MTs), such as polymerase-based methods, is made in terms of transduction mechanism, associated protocol, and sensitivity. Challenges associated with miRNA hybridization thermodynamics which affect assay selectivity and amplification bias are briefly discussed. Electrochemical, electromechanical, and optical classes of miRNA biosensors are reviewed in terms of transduction mechanism, limit of detection (LOD), time-to-results (TTR), multiplexing potential, and measurement robustness. Current trends suggest that biosensor-based techniques (BTs) for miRNA assay will complement MTs due to the advantages of amplification-free detection, LOD being femtomolar (fM)-attomolar (aM), short TTR, multiplexing capability, and minimal sample preparation requirement. Areas of future importance in miRNA BT development are presented which include focus on achieving high measurement confidence and multiplexing capabilities.

  4. Cell micropatterning on an albumin-based substrate using an inkjet printing technique.

    PubMed

    Yamazoe, Hironori; Tanabe, Toshizumi

    2009-12-15

    Positioning of cells in a desired pattern on a substrate is an important technique for cell-based technologies, including the fundamental investigation of cell functions, tissue-engineering applications, and the fabrication of cell-based biosensors and cell arrays. Recently, the inkjet printing technique was recognized as a promising approach to the creation of cellular patterns on substrates, and it has been achieved by the printing of living cells or cell adhesive proteins. In this article, we created complex cellular patterns by using an albumin-based substrate and inkjet printing technique. Albumin was cross-linked using ethylene glycol diglycidyl ether. Subsequent casting of the cross-linked albumin solution onto glass plates prevented cells from adhering to their surfaces. Through screening various chemical reagents, we found that these cross-linked albumin surfaces dramatically changed into cell adhesive surfaces after immersion in cationic polymer solutions. Based on this finding, cell adhesive regions were prepared with a desired pattern by printing the polyethyleneimine (PEI) solution onto a cross-linked albumin substrate using a modified commercial inkjet printer. Various cellular patterns including figures, letters, and gradients could be fabricated by seeding mouse L929 fibroblast cells or mouse Neuro-2a neuroblastoma cells onto the printed PEI-patterned substrate. Compared with the printing of fragile living cells or proteins, printing of stable PEI circumvents clogging of printer head nozzles and enables reproducible printing. Therefore, the present method will allow the creation of complex cell patterns.

  5. Preliminary study of an angiographic and angio-tomographic technique based on K-edge filters

    SciTech Connect

    Golosio, Bruno; Brunetti, Antonio; Oliva, Piernicola; Carpinelli, Massimo; Luca Masala, Giovanni; Meloni, Francesco; Battista Meloni, Giovanni

    2013-08-14

    Digital Subtraction Angiography is commonly affected by artifacts due to the patient movements during the acquisition of the images without and with the contrast medium. This paper presents a preliminary study on an angiographic and angio-tomographic technique based on the quasi-simultaneous acquisition of two images, obtained using two different filters at the exit of an X-ray tube. One of the two filters (K-edge filter) contains the same chemical element used as a contrast agent (gadolinium in this study). This filter absorbs more radiation with energy just above the so called K-edge energy of gadolinium than the radiation with energy just below it. The other filter (an aluminium filter in this study) is simply used to suppress the low-energy contribution to the spectrum. Using proper calibration curves, the two images are combined to obtain an image of the contrast agent distribution. In the angio-tomographic application of the proposed technique two images, corresponding to the two filter types, are acquired for each viewing angle of the tomographic scan. From the two tomographic reconstructions, it is possible to obtain a three-dimensional map of the contrast agent distribution. The technique was tested on a sample consisting of a rat skull placed inside a container filled with water. Six small cylinders with 4.7 mm internal diameter containing the contrast medium at different concentrations were placed inside the skull. In the plain angiographic application of the technique, five out of six cylinders were visible, with gadolinium concentration down to 0.96%. In the angio-tomographic application, all six cylinders were visible, with gadolinium concentration down to 0.49%. This preliminary study shows that the proposed technique can provide images of the contrast medium at low concentration without most of the artifacts that are present in images produced by conventional techniques. The results encourage further investigation on the feasibility of a clinical
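
    A hedged sketch of the image-combination idea described here: assuming a simple linearized attenuation model instead of the paper's measured calibration curves, the two log-transformed filtered images are solved pixel-wise for a tissue and a gadolinium component. All coefficients and the phantom are placeholders.

```python
# Hedged sketch of combining the K-edge-filtered and Al-filtered images into a
# contrast-agent map, assuming a linear two-material attenuation model.
import numpy as np

# Placeholder coefficients: rows = [K-edge-filtered beam, Al-filtered beam],
# columns = [tissue, gadolinium]. Not the paper's calibration data.
MU = np.array([[1.0, 4.0],
               [1.2, 1.5]])

def gadolinium_map(img_kedge, img_al, mu=MU):
    """Estimate a contrast-agent map from two flat-field-normalized images."""
    log_pair = np.stack([-np.log(np.clip(img_kedge, 1e-6, None)).ravel(),
                         -np.log(np.clip(img_al, 1e-6, None)).ravel()])
    tissue_gd = np.linalg.solve(mu, log_pair)      # pixel-wise 2x2 solve
    return tissue_gd[1].reshape(img_kedge.shape)   # row 1 = gadolinium component

# Toy phantom: uniform "tissue" with a small gadolinium insert in the middle.
tissue = np.full((64, 64), 0.8)
gd = np.zeros_like(tissue)
gd[28:36, 28:36] = 0.05
img_kedge = np.exp(-(MU[0, 0] * tissue + MU[0, 1] * gd))
img_al = np.exp(-(MU[1, 0] * tissue + MU[1, 1] * gd))
print(gadolinium_map(img_kedge, img_al)[32, 32])   # ~0.05 inside the insert
```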

  6. Preliminary study of an angiographic and angio-tomographic technique based on K-edge filters

    NASA Astrophysics Data System (ADS)

    Golosio, Bruno; Oliva, Piernicola; Brunetti, Antonio; Luca Masala, Giovanni; Carpinelli, Massimo; Meloni, Francesco; Battista Meloni, Giovanni

    2013-08-01

    Digital Subtraction Angiography is commonly affected by artifacts due to the patient movements during the acquisition of the images without and with the contrast medium. This paper presents a preliminary study on an angiographic and angio-tomographic technique based on the quasi-simultaneous acquisition of two images, obtained using two different filters at the exit of an X-ray tube. One of the two filters (K-edge filter) contains the same chemical element used as a contrast agent (gadolinium in this study). This filter absorbs more radiation with energy just above the so called K-edge energy of gadolinium than the radiation with energy just below it. The other filter (an aluminium filter in this study) is simply used to suppress the low-energy contribution to the spectrum. Using proper calibration curves, the two images are combined to obtain an image of the contrast agent distribution. In the angio-tomographic application of the proposed technique two images, corresponding to the two filter types, are acquired for each viewing angle of the tomographic scan. From the two tomographic reconstructions, it is possible to obtain a three-dimensional map of the contrast agent distribution. The technique was tested on a sample consisting of a rat skull placed inside a container filled with water. Six small cylinders with 4.7 mm internal diameter containing the contrast medium at different concentrations were placed inside the skull. In the plain angiographic application of the technique, five out of six cylinders were visible, with gadolinium concentration down to 0.96%. In the angio-tomographic application, all six cylinders were visible, with gadolinium concentration down to 0.49%. This preliminary study shows that the proposed technique can provide images of the contrast medium at low concentration without most of the artifacts that are present in images produced by conventional techniques. The results encourage further investigation on the feasibility of a clinical

  7. Directed assembly techniques for nano-manufacturing of scalable single walled carbon nanotube based devices

    NASA Astrophysics Data System (ADS)

    Makaram, Prashanth

    Single Walled Carbon Nanotubes (SWNTs) are being considered as building blocks for next generation electronics due to their unique electrical, mechanical and thermal properties. A number of SWNT based devices including scanning probes, field emitters, field effect transistors, biological and chemical sensors, and memory devices have been demonstrated. Despite successful demonstration of these single devices, the success of SWNT based nanoelectronics is hampered by the lack of a successful nano-manufacturing method. Precise alignment and placement of SWNTs is necessary for successful integration of SWNTs into nanoelectronics. The work described in this thesis is focused on developing electric field assisted assembly techniques for precise placement and controlled orientation of SWNTs. In a first set of experiments we evaluate the use of micro/nano finger shaped metal electrodes to assemble SWNTs. Even though this assembly technique helps in understanding the electrophoretic behavior of SWNTs, problems related to orientation, assembly at the nanoscale and electrode degradation demanded evaluating alternative techniques. Nanotemplates that use trenches made in PMMA on a conductive substrate are utilized for the directed, controlled assembly of SWNTs. This technique uses a combination of electrophoretic forces and fluidic forces to assemble and align the SWNTs. We were able to assemble SWNTs in trenches that are as small as 80 nm wide and 100,000 nm long over a 2.25 cm2 area in 30-90 seconds. Based on the experimental results and analysis a model is proposed to explain the assembly and alignment mechanism of SWNTs. The technique has been utilized to fabricate interconnects and field effect transistors to demonstrate the feasibility of making devices. Finally, we introduce a novel room temperature assembly technique for fabricating a three dimensional single walled carbon nanotube platform. A top down lithographic approach is used to fabricate the platform while a bottom

  8. Evaluation of clipping based iterative PAPR reduction techniques for FBMC systems.

    PubMed

    Kollár, Zsolt; Varga, Lajos; Horváth, Bálint; Bakki, Péter; Bitó, János

    2014-01-01

    This paper investigates filter bank multicarrier (FBMC), a multicarrier modulation technique exhibiting an extremely low adjacent channel leakage ratio (ACLR) compared to the conventional orthogonal frequency division multiplexing (OFDM) technique. The low ACLR of the transmitted FBMC signal makes it especially favorable in cognitive radio applications, where strict requirements are posed on out-of-band radiation. A large dynamic range resulting in high peak-to-average power ratio (PAPR) is characteristic of all sorts of multicarrier signals. The advantageous spectral properties of the high-PAPR FBMC signal are significantly degraded if nonlinearities are present in the transceiver chain. Spectral regrowth may appear, causing harmful interference in the neighboring frequency bands. This paper presents novel clipping based PAPR reduction techniques, evaluated and compared by simulations and measurements, with an emphasis on spectral aspects. The paper gives an overall comparison of PAPR reduction techniques, focusing on the reduction of the dynamic range of FBMC signals without increasing out-of-band radiation. An overview is presented of transmitter oriented techniques employing baseband clipping, which can maintain the system performance with a desired bit error rate (BER). PMID:24558338
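
    As a generic illustration of the clipping-based family of PAPR-reduction methods this record evaluates, the sketch below applies iterative clipping and frequency-domain filtering to a plain OFDM baseband signal rather than FBMC; the clipping ratio and iteration count are illustrative assumptions.

```python
# Hedged sketch of iterative clipping-and-filtering for PAPR reduction of a
# generic multicarrier baseband signal (OFDM used for simplicity, not FBMC).
import numpy as np

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def clip_and_filter(x, n_used, clip_ratio=1.5, iterations=3):
    """Amplitude clipping followed by frequency-domain removal of out-of-band regrowth."""
    n = len(x)
    for _ in range(iterations):
        limit = clip_ratio * np.sqrt(np.mean(np.abs(x) ** 2))
        over = np.abs(x) > limit
        x = np.where(over, limit * x / np.abs(np.where(over, x, 1)), x)  # clip magnitude, keep phase
        spectrum = np.fft.fft(x)
        mask = np.zeros(n, dtype=bool)
        mask[:n_used // 2] = mask[-n_used // 2:] = True                  # keep only in-band carriers
        x = np.fft.ifft(spectrum * mask)
    return x

rng = np.random.default_rng(1)
n_carriers, oversample = 256, 4
symbols = (rng.choice([-1, 1], n_carriers) + 1j * rng.choice([-1, 1], n_carriers)) / np.sqrt(2)
spectrum = np.zeros(n_carriers * oversample, dtype=complex)
spectrum[:n_carriers // 2], spectrum[-n_carriers // 2:] = symbols[:n_carriers // 2], symbols[n_carriers // 2:]
signal = np.fft.ifft(spectrum)
print(f"PAPR before: {papr_db(signal):.1f} dB, after: {papr_db(clip_and_filter(signal, n_carriers)):.1f} dB")
```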

  9. [Research progress on urban carbon fluxes based on eddy covariance technique].

    PubMed

    Liu, Min; Fu, Yu-Ling; Yang, Fang

    2014-02-01

    Land use change and fossil fuel consumption due to urbanization have had a significant effect on the global carbon cycle and climate change. Accurate estimation and understanding of the carbon budget and its characteristics are prerequisites for studying the carbon cycle and its driving mechanisms in an urban system. Based on the theory of the eddy covariance (EC) technique and the characteristics of the atmospheric boundary layer and carbon cycle in urban areas, this study systematically reviewed the principles of CO2 flux monitoring in urban systems with the EC technique, and then summarized the problems faced in urban CO2 flux monitoring and the methods for data processing and assessment. The main research progress on urban carbon fluxes with the EC technique was also illustrated. The results showed that the urban surface mostly acts as a net carbon source. The CO2 exchange between the urban surface and the atmosphere shows obvious diurnal, weekly and seasonal variation resulting from vehicle exhaust, domestic heating and vegetation respiration. However, there still exist great uncertainties in urban flux measurement and its interpretation due to the high spatial heterogeneity and complex distribution of carbon sources/sinks in urban environments. In the end, we suggest that further research on the EC technique and data assessment in complex urban areas should be strengthened. It is also requisite to develop models of the urban carbon cycle on the basis of system principles, and to investigate the influencing mechanisms and variability of the urban carbon cycle at the regional scale with spatial analysis techniques. PMID:24830264
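
    A minimal sketch of the core eddy-covariance calculation behind the fluxes discussed here: the block-averaged covariance of vertical-wind and CO2-density fluctuations, Fc = mean(w'c'). Despiking, coordinate rotation and density (WPL) corrections required in practice are omitted, and the data are synthetic.

```python
# Hedged sketch of the eddy-covariance CO2 flux over one averaging block.
import numpy as np

def co2_flux(w: np.ndarray, c: np.ndarray) -> float:
    """w: vertical wind (m s-1), c: CO2 density (mg m-3), same high-frequency block.
    Returns flux in mg m-2 s-1 (positive = emission to the atmosphere)."""
    w_prime = w - w.mean()
    c_prime = c - c.mean()
    return float(np.mean(w_prime * c_prime))

rng = np.random.default_rng(2)
n = 20 * 60 * 30                              # 30 min at 20 Hz
w = rng.normal(0.0, 0.3, n)
c = 700 + 5.0 * w + rng.normal(0.0, 2.0, n)   # correlated fluctuations -> upward flux
print(f"Fc ≈ {co2_flux(w, c):.2f} mg m-2 s-1")
```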

  11. An acoustic-array based structural health monitoring technique for wind turbine blades

    NASA Astrophysics Data System (ADS)

    Aizawa, Kai; Poozesh, Peyman; Niezrecki, Christopher; Baqersad, Javad; Inalpolat, Murat; Heilmann, Gunnar

    2015-04-01

    This paper proposes a non-contact measurement technique for health monitoring of wind turbine blades using acoustic beamforming techniques. The technique works by mounting an audio speaker inside a wind turbine blade and observing the sound radiated from the blade to identify damage within the structure. The main hypothesis for the structural damage detection is that the structural damage (cracks, edge splits, holes etc.) on the surface of a composite wind turbine blade results in changes in the sound radiation characteristics of the structure. Preliminary measurements were carried out on two separate test specimens, namely a composite box and a section of a wind turbine blade to validate the methodology. The rectangular shaped composite box and the turbine blade contained holes with different dimensions and line cracks. An acoustic microphone array with 62 microphones was used to measure the sound radiation from both structures when the speaker was located inside the box and also inside the blade segment. A phased array beamforming technique and CLEAN-based subtraction of point spread function from a reference (CLSPR) were employed to locate the different damage types on both the composite box and the wind turbine blade. The same experiment was repeated by using a commercially available 48-channel acoustic ring array to compare the test results. It was shown that both the acoustic beamforming and the CLSPR techniques can be used to identify the damage in the test structures with sufficiently high fidelity.
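
    A hedged sketch of conventional delay-and-sum beamforming, the basic phased-array step used here to map where sound radiates through the structure. The array geometry, sampling rate and scan grid are illustrative, not the 62- or 48-channel arrays of the study, and the CLSPR deconvolution is not reproduced.

```python
# Hedged sketch of delay-and-sum beamforming on a scan plane parallel to the array.
import numpy as np

C_SOUND = 343.0   # m/s
FS = 48_000       # Hz

def delay_and_sum_map(signals, mic_xy, grid_xy, plane_dist):
    """signals: (n_mics, n_samples); mic_xy, grid_xy: (n, 2) coordinates in metres.
    Returns beamformer output power at each grid point on a plane plane_dist away."""
    n_mics, n_samples = signals.shape
    power = np.zeros(len(grid_xy))
    for g, (gx, gy) in enumerate(grid_xy):
        dists = np.sqrt((mic_xy[:, 0] - gx) ** 2 + (mic_xy[:, 1] - gy) ** 2 + plane_dist ** 2)
        delays = np.round((dists - dists.min()) / C_SOUND * FS).astype(int)
        summed = np.zeros(n_samples)
        for m in range(n_mics):                        # align each channel and sum
            summed[: n_samples - delays[m]] += signals[m, delays[m]:]
        power[g] = np.mean((summed / n_mics) ** 2)
    return power

# Toy demo: one broadband source at x = 0.2 m on a plane 1 m from an 8-mic line array.
rng = np.random.default_rng(3)
mic_xy = np.column_stack([np.linspace(-0.5, 0.5, 8), np.zeros(8)])
src = np.array([0.2, 0.0])
d_src = np.sqrt((mic_xy[:, 0] - src[0]) ** 2 + (mic_xy[:, 1] - src[1]) ** 2 + 1.0 ** 2)
noise = np.convolve(rng.standard_normal(4096), np.ones(12) / 12, mode="same")
signals = np.stack([np.roll(noise, int(round(d / C_SOUND * FS))) for d in d_src])
grid_xy = np.column_stack([np.linspace(-0.5, 0.5, 21), np.zeros(21)])
peak = grid_xy[np.argmax(delay_and_sum_map(signals, mic_xy, grid_xy, plane_dist=1.0))]
print(peak)   # the beamformed map peaks near the true source at x ≈ 0.2 m
```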

  12. Modelling of a novel x-ray phase contrast imaging technique based on coded apertures

    NASA Astrophysics Data System (ADS)

    Olivo, A.; Speller, R.

    2007-11-01

    X-ray phase contrast imaging is probably the most relevant among emerging x-ray imaging techniques, and it has the proven potential of revolutionizing the field of diagnostic radiology. Impressive images of a wide range of samples have been obtained, mostly at synchrotron radiation facilities. The necessity of relying on synchrotron radiation has prevented to a large extent a widespread diffusion of phase contrast imaging, thus precluding its transfer to clinical practice. A new technique, based on the use of coded apertures, was recently developed at UCL. This technique was demonstrated to provide intense phase contrast signals with conventional x-ray sources and detectors. Unlike other attempts at making phase contrast imaging feasible with conventional sources, the coded-aperture approach does not impose substantial limitations and/or filtering of the radiation beam, and it therefore allows, for the first time, exposures compatible with clinical practice. The technique has been thoroughly modelled, and this paper describes the technique in detail by going through the different steps of the modelling. All the main factors influencing image quality are discussed, alongside the viability of realizing a prototype suitable for clinical use. The model has been experimentally validated and a section of the paper shows the comparison between simulated and experimental results.

  13. A cluster randomized control field trial of the ABRACADABRA web-based reading technology: replication and extension of basic findings.

    PubMed

    Piquette, Noella A; Savage, Robert S; Abrami, Philip C

    2014-01-01

    The present paper reports a cluster randomized control trial evaluation of teaching using ABRACADABRA (ABRA), an evidence-based and web-based literacy intervention (http://abralite.concordia.ca), with 107 kindergarten and 96 grade 1 children in 24 classes (12 intervention, 12 control classes) from all 12 elementary schools in one school district in Canada. Children in the intervention condition received 10-12 h of whole class instruction using ABRA between pre- and post-test. Hierarchical linear modeling of post-test results showed significant gains in letter-sound knowledge for intervention classrooms over control classrooms. In addition, medium effect sizes favoring the intervention over regular teaching were evident for three of five outcome measures: letter-sound knowledge (d = +0.66), phonological blending (d = +0.52), and word reading (d = +0.52). It is concluded that regular teaching with ABRA technology adds significantly to literacy in the early elementary years.

  14. Nano-based drug delivery system enhances the oral absorption of lipophilic drugs with extensive presystemic metabolism.

    PubMed

    Zhang, Zhiwen; Gao, Fang; Jiang, Shijun; Ma, Li; Li, Yaping

    2012-10-01

    Oral administration remains the most preferred route for the treatment of many diseases due to its convenience and adaptability. However, the presystemic metabolism may be an important barrier that prevents lipophilic drugs from achieving their pharmacological effects following oral delivery. Nano-based drug delivery system provides an effective strategy to reduce the presystemic metabolism and increase the systemic exposure of lipophilic drugs. In this review, we described the physiological factors affecting the presystemic metabolism of lipophilic drugs, intestinal transport of nanosystems, strategy of nanosystems to avoid the presystemic metabolism, and the current application of various oral nanosystems including lipid and polymeric nanocarriers. The nano-based drug delivery system has a lot of potential for reducing the presystemic metabolism and enhancing the bioavailability of orally administrated lipophilic drugs.

  15. Sound field separation technique based on equivalent source method and its application in nearfield acoustic holography.

    PubMed

    Bi, Chuan-Xing; Chen, Xin-Zhao; Chen, Jian

    2008-03-01

    A technique for separating sound fields, using two closely spaced parallel measurement surfaces and based on the equivalent source method, is proposed. The method can separate wave components crossing the two measurement surfaces in opposite directions, which makes it possible to apply nearfield acoustic holography (NAH) in fields where sources exist on both sides of the hologram surface, in reverberant fields, or in scattered fields. The method is flexible in application, simple in computation, and very easy to implement. The measurement surfaces can be arbitrarily shaped, and they are not restricted to be regular as in the traditional field separation technique. In addition, because the method performs field separation calculations directly in the spatial domain, not in the wave number domain, it avoids the errors and limitations (the window effects, etc.) associated with the traditional field separation technique based on the spatial Fourier transform method. In the paper, a theoretical description is first given, and the performance of the proposed field separation technique and its application in NAH are then evaluated through experiments.

  16. Stable adaptive PI control for permanent magnet synchronous motor drive based on improved JITL technique.

    PubMed

    Zheng, Shiqi; Tang, Xiaoqi; Song, Bao; Lu, Shaowu; Ye, Bosheng

    2013-07-01

    In this paper, a stable adaptive PI control strategy based on the improved just-in-time learning (IJITL) technique is proposed for the permanent magnet synchronous motor (PMSM) drive. Firstly, the traditional JITL technique is improved. The new IJITL technique has less computational burden than traditional JITL and is more suitable for online identification of the PMSM drive system, which is highly time-critical. In this way, the PMSM drive system is identified by the IJITL technique, which provides information to an adaptive PI controller. Secondly, the adaptive PI controller is designed in the discrete time domain and is composed of a PI controller and a supervisory controller. The PI controller is capable of automatically tuning the control gains online based on the gradient descent method, and the supervisory controller is developed to eliminate the effect of the approximation error introduced by the PI controller on system stability in the Lyapunov sense. Finally, experimental results on the PMSM drive system show accurate identification and favorable tracking performance.
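
    A hedged sketch of the adaptive-PI idea in this record: a discrete-time PI controller whose gains are tuned online by gradient descent on the squared tracking error. The first-order plant, the learning rates, and the omission of the IJITL identifier and supervisory controller are simplifications for illustration only.

```python
# Hedged sketch of a discrete-time PI controller with gradient-descent gain adaptation.
def run_adaptive_pi(steps=3000, dt=1e-3, ref=100.0,
                    a=0.99, b=0.05,           # assumed discrete plant: y[k+1] = a*y[k] + b*u[k]
                    eta_p=1e-4, eta_i=1e-5):  # learning rates (illustrative)
    kp, ki = 0.5, 0.1
    y, integ = 0.0, 0.0
    for _ in range(steps):
        err = ref - y
        integ += err * dt
        u = kp * err + ki * integ
        # gradient descent on 0.5*err**2, using the plant gain b as the dy/du sensitivity
        kp += eta_p * err * b * err
        ki += eta_i * err * b * integ
        y = a * y + b * u                     # plant update
    return y, kp, ki

y, kp, ki = run_adaptive_pi()
print(f"final speed ≈ {y:.1f} (reference 100.0), adapted gains Kp = {kp:.2f}, Ki = {ki:.3f}")
```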

  17. Resonant fiber optic gyro based on a sinusoidal wave modulation and square wave demodulation technique.

    PubMed

    Wang, Linglan; Yan, Yuchao; Ma, Huilian; Jin, Zhonghe

    2016-04-20

    New developments are made in the resonant fiber optic gyro (RFOG), which is an optical sensor for the measurement of rotation rate. The digital signal processing system based on the phase modulation technique is capable of detecting the weak frequency difference induced by the Sagnac effect and suppressing the reciprocal noise in the circuit, which determines the detection sensitivity of the RFOG. A new technique based on the sinusoidal wave modulation and square wave demodulation is implemented, and the demodulation curve of the system is simulated and measured. Compared with the past technique using sinusoidal modulation and demodulation, it increases the slope of the demodulation curve by a factor of 1.56, improves the spectrum efficiency of the modulated signal, and reduces the occupancy of the field-programmable gate array resource. On the basis of this new phase modulation technique, the loop is successfully locked and achieves a short-term bias stability of 1.08°/h, which is improved by a factor of 1.47. PMID:27140098
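
    A hedged sketch of square-wave lock-in demodulation, the signal-processing step this record builds on: the detected signal carries a component at the modulation frequency whose amplitude encodes the error signal, and multiplying by a square-wave reference and averaging recovers it. The resonator physics and FPGA implementation are not modelled; all numbers are illustrative.

```python
# Hedged sketch of square-wave demodulation of a sinusoidally modulated signal.
import numpy as np

FS = 1_000_000        # sample rate, Hz
F_MOD = 10_000        # modulation frequency, Hz
N = FS // 100         # 10 ms block (an integer number of modulation periods)

def square_wave_demod(signal: np.ndarray) -> float:
    t = np.arange(len(signal)) / FS
    reference = np.sign(np.sin(2 * np.pi * F_MOD * t))   # square-wave reference
    return float(np.mean(signal * reference))            # low-pass = block average

t = np.arange(N) / FS
error_amplitude = 0.2                                     # stands in for the frequency detuning
detected = (error_amplitude * np.sin(2 * np.pi * F_MOD * t)
            + 0.05 * np.random.default_rng(4).standard_normal(N))
# A unit-amplitude sine demodulated by a square wave yields 2/pi, so the output
# is proportional to the injected error amplitude.
print(square_wave_demod(detected), error_amplitude * 2 / np.pi)
```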

  18. Intelligent technique for knowledge reuse of dental medical records based on case-based reasoning.

    PubMed

    Gu, Dong-Xiao; Liang, Chang-Yong; Li, Xing-Guo; Yang, Shan-Lin; Zhang, Pei

    2010-04-01

    With the rapid development of both information technology and the management of modern medical regulation, the generation of medical records tends to be increasingly intelligent. In this paper, case-based reasoning is applied to the process of generating records of dental cases. Based on an analysis of the features of dental records, a case base is constructed. A mixed case retrieval method (FAIES) is proposed for the knowledge reuse of dental records by adopting fuzzy mathematics, an improved similarity algorithm based on Euclidean-Lagrangian distance, and a PULL & PUSH weight adjustment strategy. Finally, an intelligent system for dental case generation (CBR-DENT) is constructed. The effectiveness of the system, the efficiency of the retrieval method, the extent of adaptation and the adaptation efficiency are tested using the constructed case base. It is demonstrated that FAIES is very effective in terms of reducing the time needed to write medical records and improving efficiency and quality. FAIES is also proven to be an effective aid for diagnosis and provides a new idea for the management of medical records and its applications.
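
    A hedged sketch of the retrieval step of case-based reasoning as used here: stored cases are numeric feature vectors, similarity is derived from a weighted Euclidean distance, and the most similar record is reused. Feature names, weights and cases are hypothetical, and the fuzzy matching and PULL & PUSH weight adjustment of FAIES are not reproduced.

```python
# Hedged sketch of weighted-Euclidean case retrieval from a small case base.
import numpy as np

FEATURES = ["age", "tooth_no", "pain_level", "pocket_depth_mm"]   # hypothetical features
WEIGHTS = np.array([0.1, 0.3, 0.3, 0.3])                          # assumed relative importance

CASE_BASE = [
    (np.array([34, 16, 6, 5.0]), "record: acute pulpitis, root canal treatment"),
    (np.array([52, 36, 3, 7.5]), "record: chronic periodontitis, scaling + flap surgery"),
    (np.array([25, 11, 8, 2.0]), "record: enamel fracture, composite restoration"),
]

def retrieve(query: np.ndarray):
    """Return the stored case with the highest weighted similarity to the query."""
    def similarity(case_vec):
        d = np.sqrt(np.sum(WEIGHTS * (case_vec - query) ** 2))
        return 1.0 / (1.0 + d)                 # map distance to (0, 1] similarity
    return max(CASE_BASE, key=lambda item: similarity(item[0]))

best_vec, best_record = retrieve(np.array([49, 36, 4, 7.0]))
print(best_record)   # -> the periodontitis case, the closest stored match
```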

  19. A scale space feature based registration technique for fusion of satellite imagery

    NASA Technical Reports Server (NTRS)

    Raghavan, Srini; Cromp, Robert F.; Campbell, William C.

    1997-01-01

    Feature-based registration is one of the most reliable methods for registering multi-sensor images (both active and passive imagery), since features are often more reliable than intensity or radiometric values. The only situation where a feature-based approach will fail is when the scene is completely homogeneous or densely textural, in which case a combination of feature- and intensity-based methods may yield better results. In this paper, we present some preliminary results of testing our scale space feature based registration technique, a modified version of a feature-based method developed earlier for classification of multi-sensor imagery. The proposed approach removes the sensitivity to parameter selection experienced in the earlier version, as explained later.

  20. Mining, compressing and classifying with extensible motifs

    PubMed Central

    Apostolico, Alberto; Comin, Matteo; Parida, Laxmi

    2006-01-01

    Background Motif patterns of maximal saturation emerged originally in contexts of pattern discovery in biomolecular sequences and have recently proven a valuable notion also in the design of data compression schemes. Informally, a motif is a string of intermittently solid and wild characters that recurs more or less frequently in an input sequence or family of sequences. Motif discovery techniques and tools tend to be computationally imposing, however, special classes of "rigid" motifs have been identified of which the discovery is affordable in low polynomial time. Results In the present work, "extensible" motifs are considered such that each sequence of gaps comes endowed with some elasticity, whereby the same pattern may be stretched to fit segments of the source that match all the solid characters but are otherwise of different lengths. A few applications of this notion are then described. In applications of data compression by textual substitution, extensible motifs are seen to bring savings on the size of the codebook, and hence to improve compression. In germane contexts, in which compressibility is used in its dual role as a basis for structural inference and classification, extensible motifs are seen to support unsupervised classification and phylogeny reconstruction. Conclusion Off-line compression based on extensible motifs can be used advantageously to compress and classify biological sequences. PMID:16722593

  1. Atmospheric phase screen correction in ground-based SAR with PS technique.

    PubMed

    Qiu, Zhiwei; Ma, Yuxiao; Guo, Xiantao

    2016-01-01

    Ground-based synthetic aperture radar (GBSAR) is a powerful tool used in monitoring structures such as bridges and dams. However, despite the extremely short range of GBSAR interferometry, atmospheric effects cannot be neglected. The permanent scatterer technique is an effective operational tool that utilizes a long series of SAR data and detects information with high accuracy. An algorithm based on the permanent scatterer technique is developed in accordance with the phase model used in GBSAR interferometry. In this study, atmospheric correction is carried out on a real campaign (Geheyan Dam, China). Compared with using plumb line data, the atmospheric effects can be reduced effectively with this method, which utilizes the SAR data themselves. PMID:27652167

  2. Deformation grating fabrication technique based on the solvent-assisted microcontact molding.

    PubMed

    Dai, Xianglu; Xie, Huimin; Wang, Huaixi

    2014-10-20

    A deformation grating fabrication technique based on solvent-assisted microcontact molding (SAMIM) is reported in this paper. The fabrication process can be divided into three steps: imprinting a grating on a medium polymer substrate (MPS) by SAMIM, coating a thin metal film on the MPS, and transferring the film to the measured surface. In order to increase the stiffness of the elastic mold without decreasing its ability to form conformal contact, a re-usable, glass-embedded polydimethylsiloxane (PDMS) mold is used. In addition, a characterization method based on the Fourier transform and phase analysis is proposed to check the quality of the fabricated grating. Experiments verify that the proposed technique can fabricate a high-frequency, large-area grating on different specimens, which can serve as a qualified deformation sensor for the moiré method. PMID:25402792

  3. Quantification of Virus Particles Using Nanopore-Based Resistive-Pulse Sensing Techniques

    PubMed Central

    Yang, Lu; Yamamoto, Takatoki

    2016-01-01

    Viruses have drawn much attention in recent years due to increased recognition of their important roles in virology, immunology, clinical diagnosis, and therapy. Because the biological and physical properties of viruses significantly impact their applications, quantitative detection of individual virus particles has become a critical issue. However, due to various inherent limitations of conventional enumeration techniques such as infectious titer assays, immunological assays, and electron microscopic observation, this issue remains challenging. Thanks to significant advances in nanotechnology, nanostructure-based electrical sensors have emerged as promising platforms for real-time, sensitive detection of numerous bioanalytes. In this paper, we review recent progress in nanopore-based electrical sensing, with particular emphasis on the application of this technique to the quantification of virus particles. Our aim is to provide insights into this novel nanosensor technology, and highlight its ability to enhance current understanding of a variety of viruses. PMID:27713738
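
    A hedged sketch of the counting step in resistive-pulse sensing reviewed here: each translocating particle produces a transient dip in the ionic current, so events are detected by thresholding below the baseline and counted, with dip depth and dwell time relating to particle size and speed. The current trace and all numbers are synthetic.

```python
# Hedged sketch of pulse detection and counting in a resistive-pulse current trace.
import numpy as np

def detect_pulses(current: np.ndarray, n_sigma: float = 8.0):
    """Return (number of pulses, list of (start, end) sample indices)."""
    baseline = np.median(current)
    noise = np.std(current[current > baseline - 0.1])   # crude noise estimate excluding dips
    below = current < baseline - n_sigma * noise
    edges = np.diff(below.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    events = list(zip(starts, ends))
    return len(events), events

rng = np.random.default_rng(5)
trace = 10.0 + rng.normal(0, 0.02, 50_000)       # 10 nA baseline with noise
for start in (8_000, 21_500, 40_200):            # three translocation events
    trace[start:start + 60] -= 0.5               # 0.5 nA blockade, 60-sample dwell
count, events = detect_pulses(trace)
print(count, [(e - s) for s, e in events])       # 3 events, ~60-sample dwell times
```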

  4. Recent advancements in sensing techniques based on functional materials for organophosphate pesticides.

    PubMed

    Kumar, Pawan; Kim, Ki-Hyun; Deep, Akash

    2015-08-15

    The use of organophosphate pesticides (OPs) for pest control in agriculture has caused serious environmental problems throughout the world. OPs are highly toxic with the potential to cause neurological disorders in humans. As the application of OPs has greatly increased in various agriculture activities, it has become imperative to accurately monitor their concentration levels for the protection of ecological systems and food supplies. Although there are many conventional methods available for the detection of OPs, the development of portable sensors is necessary to facilitate routine analysis with more convenience. Some of these potent alternative techniques based on functional materials include fluorescence nanomaterials based sensors, molecular imprinted (MIP) sensors, electrochemical sensors, and biosensors. This review explores the basic features of these sensing approaches through evaluation of their performance. The discussion is extended further to describe the challenges and opportunities for these unique sensing techniques.

  5. Novel On-wafer Radiation Pattern Measurement Technique for MEMS Actuator Based Reconfigurable Patch Antennas

    NASA Technical Reports Server (NTRS)

    Simons, Rainee N.

    2002-01-01

    The paper presents a novel on-wafer, antenna far field pattern measurement technique for microelectromechanical systems (MEMS) based reconfigurable patch antennas. The measurement technique significantly reduces the time and the cost associated with the characterization of printed antennas, fabricated on a semiconductor wafer or dielectric substrate. To measure the radiation patterns, the RF probe station is modified to accommodate an open-ended rectangular waveguide as the rotating linearly polarized sampling antenna. The open-ended waveguide is attached through a coaxial rotary joint to a Plexiglas(Trademark) arm and is driven along an arc by a stepper motor. Thus, the spinning open-ended waveguide can sample the relative field intensity of the patch as a function of the angle from bore sight. The experimental results include the measured linearly polarized and circularly polarized radiation patterns for MEMS-based frequency reconfigurable rectangular and polarization reconfigurable nearly square patch antennas, respectively.

  6. Method based on the double sideband technique for the dynamic tracking of micrometric particles

    NASA Astrophysics Data System (ADS)

    Ramirez, Claudio; Lizana, Angel; Iemmi, Claudio; Campos, Juan

    2016-06-01

    Digital holography (DH) methods are of interest in a large number of applications. Recently, the double sideband (DSB) technique was proposed, which is a DH based method that, by using double filtering, provides reconstructed images without distortions and is free of twin images by using an in-line configuration. In this work, we implement a method for the investigation of the mobility of particles based on the DSB technique. Particle holographic images obtained using the DSB method are processed with digital picture recognition methods, allowing us to accurately track the spatial position of particles. The dynamic nature of the method is achieved experimentally by using a spatial light modulator. The suitability of the proposed tracking method is validated by determining the trajectory and velocity described by glass microspheres in movement.

  7. LIMB demonstration project extension

    SciTech Connect

    Not Available

    1990-09-21

    The purpose of the DOE limestone injection multistage burner (LIMB) Demonstration Project Extension is to extend the data base on LIMB technology and to expand DOE's list of Clean Coal Technologies by demonstrating the Coolside process as part of the project. The main objectives of this project are: to demonstrate the general applicability of LIMB technology by testing 3 coals and 4 sorbents (total of 12 coal/sorbent combinations) at the Ohio Edison Edgewater plant; and to demonstrate that Coolside is a viable technology for improving precipitator performance and reducing sulfur dioxide emissions while acceptable operability is maintained. Progress is reported. 3 figs.

  8. Phylogenetic relationships within the speciose family Characidae (Teleostei: Ostariophysi: Characiformes) based on multilocus analysis and extensive ingroup sampling

    PubMed Central

    2011-01-01

    Background With nearly 1,100 species, the fish family Characidae represents more than half of the species of Characiformes, and is a key component of Neotropical freshwater ecosystems. The composition, phylogeny, and classification of Characidae is currently uncertain, despite significant efforts based on analysis of morphological and molecular data. No consensus about the monophyly of this group or its position within the order Characiformes has been reached, challenged by the fact that many key studies to date have non-overlapping taxonomic representation and focus only on subsets of this diversity. Results In the present study we propose a new definition of the family Characidae and a hypothesis of relationships for the Characiformes based on phylogenetic analysis of DNA sequences of two mitochondrial and three nuclear genes (4,680 base pairs). The sequences were obtained from 211 samples representing 166 genera distributed among all 18 recognized families in the order Characiformes, all 14 recognized subfamilies in the Characidae, plus 56 of the genera so far considered incertae sedis in the Characidae. The phylogeny obtained is robust, with most lineages significantly supported by posterior probabilities in Bayesian analysis, and high bootstrap values from maximum likelihood and parsimony analyses. Conclusion A monophyletic assemblage strongly supported in all our phylogenetic analysis is herein defined as the Characidae and includes the characiform species lacking a supraorbital bone and with a derived position of the emergence of the hyoid artery from the anterior ceratohyal. To recognize this and several other monophyletic groups within characiforms we propose changes in the limits of several families to facilitate future studies in the Characiformes and particularly the Characidae. This work presents a new phylogenetic framework for a speciose and morphologically diverse group of freshwater fishes of significant ecological and evolutionary importance

  9. Studies of an extensively axisymmetric rocket based combined cycle (RBCC) engine powered single-stage-to-orbit (SSTO) vehicle

    NASA Technical Reports Server (NTRS)

    Foster, Richard W.; Escher, William J. D.; Robinson, John W.

    1989-01-01

    The present comparative performance study has established that rocket-based combined cycle (RBCC) propulsion systems, when incorporated by essentially axisymmetric SSTO launch vehicle configurations whose conical forebody maximizes both capture-area ratio and total capture area, are capable of furnishing payload-delivery capabilities superior to those of most multistage, all-rocket launchers. Airbreathing thrust augmentation in the rocket-ejector mode of an RBCC powerplant is noted to make a major contribution to final payload capability, by comparison to nonair-augmented rocket engine propulsion systems.

  10. Studies of an extensively axisymmetric rocket based combined cycle (RBCC) engine powered single-stage-to-orbit (SSTO) vehicle

    SciTech Connect

    Foster, R.W.; Escher, W.J.D.; Robinson, J.W.

    1989-01-01

    The present comparative performance study has established that rocket-based combined cycle (RBCC) propulsion systems, when incorporated by essentially axisymmetric SSTO launch vehicle configurations whose conical forebody maximizes both capture-area ratio and total capture area, are capable of furnishing payload-delivery capabilities superior to those of most multistage, all-rocket launchers. Airbreathing thrust augmentation in the rocket-ejector mode of an RBCC powerplant is noted to make a major contribution to final payload capability, by comparison to nonair-augmented rocket engine propulsion systems. 16 refs.

  11. Calibrating and training of neutron based NSA techniques with less SNM standards

    SciTech Connect

    Geist, William H; Swinhoe, Martyn T; Bracken, David S; Freeman, Corey R; Newell, Matthew R

    2010-01-01

    Accessing special nuclear material (SNM) standards for the calibration of and training on nondestructive assay (NDA) instruments has become increasingly difficult in light of enhanced safeguards and security regulations. Limited or nonexistent access to SNM has affected neutron based NDA techniques more than gamma ray techniques because the effects of multiplication require a range of masses to accurately measure the detector response. Neutron based NDA techniques can also be greatly affected by the matrix and impurity characteristics of the item. The safeguards community has been developing techniques for calibrating instrumentation and training personnel with dwindling numbers of SNM standards. Monte Carlo methods have become increasingly important for the design and calibration of instrumentation. Monte Carlo techniques have the ability to accurately predict the detector response for passive techniques. The Monte Carlo results are usually benchmarked to neutron source measurements such as californium. For active techniques, the modeling becomes more difficult because of the interaction of the interrogation source with the detector and nuclear material, and the results cannot be simply benchmarked with neutron sources. A Monte Carlo calculated calibration curve for material test reactor (MTR) fuel elements assayed with an active well coincidence counter (AWCC), used in a training course in Indonesia, will be presented as an example. Performing training activities with reduced amounts of nuclear material makes it difficult to demonstrate how the multiplication and matrix properties of the item affect the detector response and limits the knowledge that can be obtained with hands-on training. A neutron pulse simulator (NPS) has been developed that can produce a pulse stream representative of a real pulse stream output from a detector measuring SNM. The NPS has been used by the International Atomic Energy Agency (IAEA) for detector testing and training applications at the

  12. [A comprehensive approach to designing of magnetotherapy techniques based on the Atos device].

    PubMed

    Raĭgorodskiĭ, Iu M; Semiachkin, G P; Tatarenko, D A

    1995-01-01

    The paper describes how to apply a comprehensive approach to the design of magnetotherapy techniques based on concomitant exposure to two or more physical factors. It shows the advantages of a running pattern of magnetic field and photostimuli in terms of optimization of physiotherapeutic exposure. The Atos apparatus with an Amblio-1 attachment is used as an example to demonstrate how to apply the comprehensive approach in ophthalmology.

  13. A survey of partition-based techniques for copy-move forgery detection.

    PubMed

    Diane, Wandji Nanda Nathalie; Xingming, Sun; Moise, Fah Kue

    2014-01-01

    A copy-move forged image results from a specific type of image tampering procedure carried out by copying a part of an image and pasting it on one or more parts of the same image generally to maliciously hide unwanted objects/regions or clone an object. Therefore, detecting such forgeries mainly consists in devising ways of exposing identical or relatively similar areas in images. This survey attempts to cover existing partition-based copy-move forgery detection techniques.
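
    A hedged sketch of the basic block-partitioning idea covered by this survey: overlapping blocks are summarised by a compact feature vector, the vectors are sorted lexicographically, and near-identical neighbours whose blocks lie far apart in the image are flagged as candidate copy-move pairs. This illustrates the principle only, not any specific surveyed algorithm; the quadrant-mean feature stands in for truncated DCT coefficients.

```python
# Hedged sketch of block-based copy-move detection via feature sorting.
import numpy as np

def copy_move_candidates(gray: np.ndarray, block=8, min_shift=16, tol=1e-6):
    h, w = gray.shape
    feats, coords = [], []
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            b = gray[y:y + block, x:x + block].astype(float)
            # compact feature: mean of each quadrant (stand-in for truncated DCT coefficients)
            feats.append([b[:4, :4].mean(), b[:4, 4:].mean(), b[4:, :4].mean(), b[4:, 4:].mean()])
            coords.append((y, x))
    feats, coords = np.array(feats), np.array(coords)
    order = np.lexsort(feats.T[::-1])                    # lexicographic sort on all features
    feats, coords = feats[order], coords[order]
    pairs = []
    for i in range(len(feats) - 1):                      # compare lexicographic neighbours only
        if np.allclose(feats[i], feats[i + 1], atol=tol):
            dy, dx = coords[i + 1] - coords[i]
            if dy * dy + dx * dx >= min_shift ** 2:      # ignore trivially close blocks
                pairs.append((tuple(map(int, coords[i])), tuple(map(int, coords[i + 1]))))
    return pairs

rng = np.random.default_rng(6)
img = rng.integers(0, 256, (64, 64)).astype(float)
img[40:48, 40:48] = img[8:16, 8:16]                      # simulate a copy-move forgery
print(copy_move_candidates(img))                         # reports the copied pair ((8, 8), (40, 40))
```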

  14. Analytical Method for Selecting a Rectification Technique for a Piezoelectric Generator based on Admittance Measurement

    NASA Astrophysics Data System (ADS)

    Mateu, Loreto; Zessin, Henrik; Spies, Peter

    2013-12-01

    AC-DC converters employed for harvesting power from piezoelectric transducers can be divided into linear (i.e. diode bridge) and non-linear (i.e. synchronized switch harvesting on inductor, SSHI). This paper presents an analytical technique based on the measurement of the impedance circle of the piezoelectric element to determine whether either diode bridge or SSHI converter harvests more of the available power at the piezoelectric element.

  15. A Survey of Partition-Based Techniques for Copy-Move Forgery Detection

    PubMed Central

    Nathalie Diane, Wandji Nanda; Xingming, Sun; Moise, Fah Kue

    2014-01-01

    A copy-move forged image results from a specific type of image tampering procedure carried out by copying a part of an image and pasting it on one or more parts of the same image generally to maliciously hide unwanted objects/regions or clone an object. Therefore, detecting such forgeries mainly consists in devising ways of exposing identical or relatively similar areas in images. This survey attempts to cover existing partition-based copy-move forgery detection techniques. PMID:25152931

  16. A functional technique based on the Euclidean algorithm with applications to 2-D acoustic diffractal diffusers

    NASA Astrophysics Data System (ADS)

    Cortés-Vega, Luis

    2015-09-01

    Based on the Euclidean algorithm, we build a functional technique that allows us to discover a direct proof of the Chinese Remainder Theorem. Afterwards, by using this functional approach, we present some applications to 2-D acoustic diffractal diffusers. The novelty of the method is its functional algorithmic character, which improves on ideas and other results of the author and his collaborators in a previous work.
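
    A worked sketch of the number-theoretic core this record builds on: the extended Euclidean algorithm supplies Bezout coefficients, which are then used to assemble the Chinese Remainder Theorem solution. The acoustic diffuser application itself is not reproduced.

```python
# Extended Euclidean algorithm and an iterative Chinese Remainder Theorem solver.
def extended_gcd(a: int, b: int):
    """Return (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

def crt(residues, moduli):
    """Solve x ≡ r_i (mod m_i) for pairwise coprime moduli."""
    x, m = 0, 1
    for r, mod in zip(residues, moduli):
        g, p, _ = extended_gcd(m, mod)         # p is the inverse of m modulo mod
        assert g == 1, "moduli must be pairwise coprime"
        x = (x + (r - x) * p % mod * m) % (m * mod)
        m *= mod
    return x

# x ≡ 2 (mod 3), x ≡ 3 (mod 5), x ≡ 2 (mod 7)  ->  x = 23 (classic example)
print(crt([2, 3, 2], [3, 5, 7]))
```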

  17. Monolayer MoS2 metal insulator transition based memcapacitor modeling with extension to a ternary device

    NASA Astrophysics Data System (ADS)

    Khan, Abdul Karim; Lee, Byoung Hun

    2016-09-01

    A memcapacitor model based on one possible physical realization is developed and simulated in order to understand its limitations before a real device is made. The proposed device structure consists of a vertically stacked dielectric layer and a MoS2 monolayer between two external metal plates. The Metal Insulator Transition (MIT) phenomenon of the MoS2 monolayer is represented in terms of a percolation probability, which is used as the system state. Cluster-based site percolation theory is used to mimic the MIT of MoS2, which shows a slightly discontinuous change in MoS2 monolayer conductivity. The metal-to-insulator transition switches the capacitance of the device in a hysteretic way. The Ioffe-Regel criterion is used to determine the MIT state of the MoS2 monolayer. Good control of the MIT time in the picosecond range is also achieved by changing a single parameter in the model. The model shows memcapacitive behavior with the advantage of fast switching (in the picosecond range) over previous general models. The model is then extended into a vertically cascaded version, which behaves like a ternary device instead of a binary one.

  18. A novel software-based technique for quantifying placental calcifications and infarctions from ultrasound

    NASA Astrophysics Data System (ADS)

    Ryan, John T.; McAuliffe, Fionnuala; Higgins, Mary; Stanton, Marie; Brennan, Patrick

    2008-03-01

    In obstetrics, antenatal ultrasound assessment of placental morphology comprises an important part of the estimation of fetal health. Ultrasound analysis of the placenta may reveal abnormalities such as placental calcification and infarcts. Current methods of quantifying these abnormalities are subjective and involve a grading system of Grannum stages I-III. The aim of this project is to develop a software tool that semi-automatically quantifies placental ultrasound images and facilitates the assessment of placental morphology. We have developed a 2D ultrasound imaging software tool that allows the obstetrician or sonographer to define the placental region of interest. A secondary reference map is created for use in our quantification algorithm. Using a slider technique, the user can easily define an upper threshold based on high intensity for calcification classification and a lower threshold based on low intensity to define infarction regions within the defined region of interest. The percentage of the placental area that is calcified and the percentage that is infarcted are calculated, and these form the basis of our new metric. Ultrasound images of abnormal and normal placentas have been acquired to aid our software development. A full clinical prospective evaluation is currently being performed, and we are applying this technology to the three-dimensional ultrasound domain. We have developed a novel software-based technique for calculating the extent of placental calcification and infarction, providing a new metric in this field. Our new metric may provide a more accurate measurement of placental calcification and infarction than current techniques.
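
    A hedged sketch of the quantification step described here: within a user-drawn placental region of interest, pixels above an upper intensity threshold are counted as calcification and pixels below a lower threshold as infarction, each expressed as a percentage of the ROI area. The thresholds and image data are synthetic placeholders.

```python
# Hedged sketch of ROI thresholding for calcification and infarction percentages.
import numpy as np

def placental_metrics(image: np.ndarray, roi_mask: np.ndarray,
                      upper_thresh: int, lower_thresh: int):
    roi = image[roi_mask]
    pct_calcified = 100.0 * np.count_nonzero(roi >= upper_thresh) / roi.size
    pct_infarcted = 100.0 * np.count_nonzero(roi <= lower_thresh) / roi.size
    return pct_calcified, pct_infarcted

rng = np.random.default_rng(7)
img = rng.integers(60, 180, (256, 256))          # synthetic B-mode-like intensities
img[100:120, 100:140] = 240                      # bright "calcified" patch
img[180:200, 60:90] = 15                         # dark "infarcted" patch
mask = np.zeros(img.shape, dtype=bool)
mask[50:220, 40:220] = True                      # stands in for the user-drawn ROI
calc, infarct = placental_metrics(img, mask, upper_thresh=200, lower_thresh=30)
print(f"calcified: {calc:.1f}% of ROI, infarcted: {infarct:.1f}% of ROI")
```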

  19. Development of PCR-based technique for detection of purity of Pashmina fiber from textile materials.

    PubMed

    Kumar, Rajiv; Shakyawar, D B; Pareek, P K; Raja, A S M; Prince, L L L; Kumar, Satish; Naqvi, S M K

    2015-04-01

    Pashmina fiber is one of the major specialty animal fibers in India. The quality of Pashmina obtained from Changthangi and Chegu goats in India is very good. Due to restricted availability and high prices, adulteration of prized natural fibers is becoming a common practice among manufacturers. Sheep wool is a cheap substitute, which is usually used for adulteration and false declaration of Pashmina-based products. Presently, there is a lack of cost-effective and readily available methodology to identify the adulteration of Pashmina products with other similar-looking substitutes like sheep wool. A polymerase chain reaction (PCR)-based detection method can be used to identify the origin of animal fiber. Extraction of quality DNA from dyed and processed animal fiber and textile materials is a limiting factor in the development of such detection methods. In the present study, quality DNA was extracted from textile materials, and a PCR-based technique using mitochondrial gene (12S rRNA) specific primers was developed for detection of Pashmina in textile blends. This technique has been used for detection of the adulteration of Pashmina products with sheep wool. The technique can detect adulteration levels up to 10 % of sheep/goat fibers in textile blends.

  20. The application of a DNA-based identification technique to over-the-counter herbal medicines.

    PubMed

    Kazi, Tazimuddin; Hussain, Nazreen; Bremner, Paul; Slater, Adrian; Howard, Caroline

    2013-06-01

    Reliable methods to identify medicinal plant material are becoming more important in an increasingly regulated marketplace. DNA-based methods have been recognised as a valuable tool in this area, with benefits such as being unaffected by the age of the plant material, growth conditions and harvesting techniques. However, the methods used to produce medicinal plant products may degrade or remove DNA, so how applicable are these techniques to processed products? A simple PCR-based identification technique has been developed for St. John's Wort, Hypericum perforatum L. Thirteen St. John's Wort products were purchased, including capsules, tablets and tinctures. DNA was extracted from each product and the species-specific PCR test conducted. DNA was successfully extracted from all thirteen products, using a fast and efficient modified method for extracting DNA from tinctures. Only four products yielded the full-length ITS region (850 bp), owing to the quality of the DNA. All of the products tested positive for H. perforatum DNA. DNA-based identification methods can complement existing methods of authentication. This paper shows that these methods are applicable to a wide range of processed products, provided that they are designed to account for the possibility of DNA degradation. PMID:23500384

  1. A review of the different techniques for solid surface acid-base characterization.

    PubMed

    Sun, Chenhang; Berg, John C

    2003-09-18

    In this work, various techniques for solid surface acid-base (AB) characterization are reviewed. Different techniques employ different scales to rank acid-base properties. Based on results from the literature and the authors' own investigations of mineral oxides, these scales are compared. The comparison shows that the isoelectric point (IEP), the most commonly used AB scale, is not a description of the absolute basicity or acidity of a surface but of their relative strength. That is, a high-IEP surface shows more basic functionality compared with its acidic functionality, whereas a low-IEP surface shows less basic functionality compared with its acidic functionality. The choice of technique and scale for AB characterization depends on the specific application. For cases in which the overall AB property is of interest, IEP (by electrokinetic titration) and H(0,max) (by indicator dye adsorption) are appropriate. For cases in which the absolute AB property is of interest, such as in the study of adhesion, it is more pertinent to use the chemical shift (by XPS) and the heat of adsorption of probe gases (by calorimetry or IGC).

  2. An extensively hydrolysed rice protein-based formula in the management of infants with cow's milk protein allergy: preliminary results after 1 month

    PubMed Central

    Vandenplas, Yvan; De Greef, Elisabeth; Hauser, Bruno

    2014-01-01

    Background Guidelines recommend extensively hydrolysed cow's milk protein formulas (eHF) in the treatment of infants diagnosed with cow's milk protein allergy (CMPA). Extensively hydrolysed rice protein infant formulas (eRHFs) have recently become available and could offer a valid alternative. Methods A prospective trial was performed to evaluate the clinical tolerance of a new eRHF in infants with confirmed CMPA. Patients were followed for 1 month. Clinical tolerance of the eRHF was evaluated with a symptom-based score (SBS), and growth (weight and length) was monitored. Results Thirty-nine infants (mean age 3.4 months, range 0.5–6 months) diagnosed with CMPA were enrolled. All infants tolerated the eRHF and exhibited normal growth. Conclusions In accordance with current guidelines, this eRHF is tolerated by more than 90% of children with proven CMPA (with 95% confidence) and is an adequate alternative to cow's milk-based eHF. Trial registration number ClinicalTrials.gov NCT01998074. PMID:24914098

  3. An efficient algorithm for multipole energies and derivatives based on spherical harmonics and extensions to particle mesh Ewald

    PubMed Central

    Simmonett, Andrew C.; Pickard, Frank C.; Schaefer, Henry F.; Brooks, Bernard R.

    2014-01-01

    Next-generation molecular force fields deliver accurate descriptions of non-covalent interactions by employing more elaborate functional forms than their predecessors. Much work has been dedicated to improving the description of the electrostatic potential (ESP) generated by these force fields. A common approach to improving the ESP is by augmenting the point charges on each center with higher-order multipole moments. The resulting anisotropy greatly improves the directionality of the non-covalent bonding, with a concomitant increase in computational cost. In this work, we develop an efficient strategy for enumerating multipole interactions, by casting an efficient spherical harmonic based approach within a particle mesh Ewald (PME) framework. Although the derivation involves lengthy algebra, the final expressions are relatively compact, yielding an approach that can efficiently handle both finite and periodic systems without imposing any approximations beyond PME. Forces and torques are readily obtained, making our method well suited to modern molecular dynamics simulations. PMID:24832247

  4. Validation and extension of the PREMM1,2 model in a population-based cohort of colorectal cancer patients

    PubMed Central

    Balaguer, Francesc; Balmaña, Judith; Castellví-Bel, Sergi; Steyerberg, Ewout W.; Andreu, Montserrat; Llor, Xavier; Jover, Rodrigo; Syngal, Sapna; Castells, Antoni

    2008-01-01

    Summary Background and aims Early recognition of patients at risk for Lynch syndrome is critical but often difficult. Recently, a predictive algorithm, the PREMM1,2 model, has been developed to quantify the risk of carrying a germline mutation in the mismatch repair (MMR) genes MLH1 and MSH2. However, its performance in an unselected, population-based colorectal cancer population, as well as its performance in combination with tumor MMR testing, is unknown. Methods We included all colorectal cancer cases from the EPICOLON study, a prospective, multicenter, population-based cohort (n=1,222). All patients underwent tumor microsatellite instability analysis and immunostaining for MLH1 and MSH2, and those with MMR deficiency (n=91) underwent tumor BRAF V600E mutation analysis and MLH1/MSH2 germline testing. Results The PREMM1,2 model with a ≥5% cut-off had a sensitivity, specificity and positive predictive value (PPV) of 100%, 68% and 2%, respectively. A higher PREMM1,2 cut-off provided higher specificity and PPV at the expense of lower sensitivity. Combining the ≥5% cut-off with tumor MMR testing maintained 100% sensitivity with an increased specificity (97%) and PPV (21%). The PPV of a PREMM1,2 score ≥20% alone (16%) approached the PPV obtained with a PREMM1,2 score ≥5% combined with tumor MMR testing. In addition, a PREMM1,2 score of <5% was associated with a high likelihood of a BRAF V600E mutation. Conclusions The PREMM1,2 model is useful for identifying MLH1/MSH2 mutation carriers among unselected colorectal cancer patients. Quantitative assessment of genetic risk might be useful in deciding on subsequent tumor MMR and germline testing. PMID:18061181
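
    The sketch below illustrates, on synthetic data rather than the EPICOLON cohort, how sensitivity, specificity and PPV change when a PREMM-style score cutoff is used alone or combined with tumor MMR testing; the prevalence and score distribution are invented for the example.

```python
# Illustrative sketch (synthetic data): evaluating a risk-score cutoff, alone
# and combined with tumor MMR testing, via sensitivity, specificity and PPV.
import numpy as np

def diagnostics(predicted, carrier):
    tp = np.sum(predicted & carrier)
    fp = np.sum(predicted & ~carrier)
    fn = np.sum(~predicted & carrier)
    tn = np.sum(~predicted & ~carrier)
    return tp / (tp + fn), tn / (tn + fp), tp / (tp + fp)   # sens, spec, PPV

rng = np.random.default_rng(0)
n = 1222
carrier = rng.random(n) < 0.01                               # ~1% germline carriers (assumed)
score = np.clip(rng.normal(3, 4, n) + 25 * carrier, 0, 100)  # synthetic PREMM-like score (%)
mmr_deficient = carrier | (rng.random(n) < 0.06)             # tumor MMR testing, imperfectly specific

print("score >= 5%:          ", diagnostics(score >= 5, carrier))
print("score >= 5% + MMR def:", diagnostics((score >= 5) & mmr_deficient, carrier))
```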

  5. A model-based technique for predicting pilot opinion ratings for large commercial transports

    NASA Technical Reports Server (NTRS)

    Levison, W. H.

    1982-01-01

    A model-based technique for predicting pilot opinion ratings is described. Features of this procedure, which is based on the optimal-control model for pilot/vehicle systems, include (1) capability to treat "unconventional" aircraft dynamics, (2) a relatively free-form pilot model, (3) a simple scalar metric for attentional workload, and (4) a straightforward manner of proceeding from descriptions of the flight task environment and requirements to a prediction of pilot opinion rating. The method was able to provide a good match to a set of pilot opinion ratings obtained in a manned simulation study of large commercial aircraft in landing approach.

  6. A model-based technique for predicting pilot opinion ratings for large commercial transports

    NASA Technical Reports Server (NTRS)

    Levison, W. H.

    1980-01-01

    A model-based technique for predicting pilot opinion ratings is described. Features of this procedure, which is based on the optimal-control model for pilot/vehicle systems, include (1) capability to treat 'unconventional' aircraft dynamics, (2) a relatively free-form pilot model, (3) a simple scalar metric for attentional workload, and (4) a straightforward manner of proceeding from descriptions of the flight task environment and requirements to a prediction of pilot opinion rating. The method is able to provide a good match to a set of pilot opinion ratings obtained in a manned simulation study of large commercial aircraft in landing approach.

  7. [Thinking on TCM literature evaluation methods and techniques based on mass information].

    PubMed

    Xie, Qi; Cui, Meng; Pan, Yan-li

    2007-08-01

    The necessity and feasibility of TCM literature evaluation based on mass TCM literature information are discussed in this paper. Beginning with a description of the current state of research on mass TCM literature information, the authors offer a tentative plan, method and technique for evaluating scientific and technological TCM literature, and systematically analyze the key issues, such as subject selection, document screening and sorting, literature analysis, and development of a software analysis platform. The methodology and technology for constructing a mass-information-based TCM literature evaluation system are then systematically clarified.

  8. Robust and discriminating method for face recognition based on correlation technique and independent component analysis model.

    PubMed

    Alfalou, A; Brosseau, C

    2011-03-01

    We demonstrate a novel technique for face recognition. Our approach relies on the performance of a strongly discriminating optical correlation method along with the robustness of the independent component analysis (ICA) model. Simulations were performed to illustrate how this algorithm can identify a face using images from the Pointing Head Pose Image Database. While maintaining algorithmic simplicity, this approach based on the ICA representation significantly increases the true recognition rate compared with that obtained using our previously developed all-numerical ICA identity recognition method and another method based on optical correlation and a standard composite filter. PMID:21368935
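
    As a rough, all-numerical analogue of the approach (it does not reproduce the optical correlator), the sketch below projects synthetic "face" vectors onto an ICA basis with scikit-learn and matches a noisy probe by normalized correlation of the ICA coefficients; all data are invented for the example.

```python
# Simplified sketch of ICA-plus-correlation matching on synthetic vectors.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
gallery = rng.random((20, 32 * 32))                         # 20 flattened "face" images
probe = gallery[7] + 0.05 * rng.standard_normal(32 * 32)    # noisy copy of face #7

ica = FastICA(n_components=10, random_state=0, max_iter=1000)
codes = ica.fit_transform(gallery)                          # ICA representation of each face
probe_code = ica.transform(probe[None, :])[0]

def ncorr(a, b):
    """Normalized correlation between two coefficient vectors."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

scores = [ncorr(probe_code, c) for c in codes]
print("best match:", int(np.argmax(scores)))                # expected: 7
```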

  9. A chromism-based assay (CHROBA) technique for in situ detection of protein kinase activity.

    PubMed

    Tomizaki, Kin-ya; Jie, Xu; Mihara, Hisakazu

    2005-03-15

    A unique chromism-based assay (CHROBA) technique using photochromic spiropyran-containing peptides has been established for the first time for the detection of protein kinase A-catalyzed phosphorylation. This alternative method avoids the isolation and/or immobilization of kinase substrates otherwise required to remove excess reagents, including nonreactive isotope-labeled ATP or fluorescently labeled anti-phosphoamino acid antibodies, from the reaction mixture. Such a protocol, based on the thermocoloration of the spiropyran moiety in the peptide, can offer not only an efficient screening method for potent kinase substrates but also a versatile analytical tool for monitoring other post-translational modification activities. PMID:15745830

  10. Precoding techniques for PAPR reduction in asymmetrically clipped OFDM based optical wireless system

    NASA Astrophysics Data System (ADS)

    Ranjha, Bilal; Kavehrad, Mohsen

    2013-01-01

    In this paper, we analyze different precoding-based peak-to-average power ratio (PAPR) reduction techniques for asymmetrically clipped Orthogonal Frequency Division Multiplexing (OFDM) optical wireless communication systems. Intensity modulation with direct detection (IM/DD) is among the popular techniques for optical wireless communication systems. OFDM cannot be directly applied to IM systems because of the bipolar nature of the output signal, so several variants of OFDM have been proposed for IM/DD optical wireless systems, among them DC-biased OFDM, Asymmetrically Clipped Optical OFDM (ACO-OFDM) [2] and Pulse Amplitude Modulated Discrete Multitone (PAM-DMT) [3]. Both ACO-OFDM and PAM-DMT require low average power and are thus very attractive for optical wireless systems. OFDM systems suffer from a high PAPR, which can limit performance due to the non-linear characteristics of LEDs, so PAPR reduction techniques have to be employed. This paper analyzes precoding-based PAPR reduction methods for ACO-OFDM and PAM-DMT. We use Discrete Fourier Transform (DFT) coding, the Zadoff-Chu Transform (ZCT) [8] and the Discrete Cosine Transform (DCT) for ACO-OFDM, and only the DCT for PAM-DMT, since its modulating symbols are real. We compare the performance of these precoding techniques using different QAM modulation schemes. Simulation results show that both DFT and ZCT offer more PAPR reduction than DCT in ACO-OFDM. For PAM-DMT, DCT precoding yields significant PAPR reduction compared with the conventional PAM-DMT signal. These precoding schemes also offer the advantage of zero signaling overhead.
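
    The sketch below, on synthetic 4-QAM data, shows the basic mechanics of DFT precoding for ACO-OFDM and compares the PAPR of one precoded and one conventional frame; the FFT size and mapping details are illustrative assumptions rather than the paper's exact configuration.

```python
# Hedged sketch: PAPR of a conventional vs a DFT-precoded ACO-OFDM frame.
# Data occupy only the odd subcarriers with Hermitian symmetry so the IFFT
# output is real, and negative samples are clipped to zero.
import numpy as np

def qam4(n, rng):
    return (rng.choice([-1, 1], n) + 1j * rng.choice([-1, 1], n)) / np.sqrt(2)

def aco_ofdm(data, n_fft):
    """Map data to odd subcarriers with Hermitian symmetry, IFFT, clip at zero."""
    X = np.zeros(n_fft, dtype=complex)
    odd = np.arange(1, n_fft // 2, 2)
    X[odd] = data
    X[n_fft - odd] = np.conj(data)
    x = np.fft.ifft(X).real
    return np.maximum(x, 0.0)

def papr_db(x):
    return 10 * np.log10(np.max(x**2) / np.mean(x**2))

rng = np.random.default_rng(0)
n_fft = 256
n_data = len(np.arange(1, n_fft // 2, 2))                    # usable odd subcarriers

d = qam4(n_data, rng)
plain = aco_ofdm(d, n_fft)                                   # conventional ACO-OFDM
precoded = aco_ofdm(np.fft.fft(d) / np.sqrt(n_data), n_fft)  # DFT-precoded ACO-OFDM

print(f"PAPR plain    : {papr_db(plain):.2f} dB")
print(f"PAPR precoded : {papr_db(precoded):.2f} dB")
```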

  11. Carbon Dioxide Capture and Separation Techniques for Gasification-based Power Generation Point Sources

    SciTech Connect

    Pennline, H.W.; Luebke, D.R.; Jones, K.L.; Morsi, B.I.; Heintz, Y.J.; Ilconich, J.B.

    2007-06-01

    The capture/separation step for carbon dioxide (CO2) from large point sources is a critical one with respect to the technical feasibility and cost of the overall carbon sequestration scenario. For large point sources, such as those found in power generation, the carbon dioxide capture techniques being investigated by the in-house research area of the National Energy Technology Laboratory possess the potential for improved efficiency and reduced costs compared with more conventional technologies. The investigated techniques can have wide applications, but the research has focused on capture/separation of carbon dioxide from flue gas (post-combustion from fossil fuel-fired combustors) and from fuel gas (pre-combustion, such as integrated gasification combined cycle, or IGCC). With respect to fuel gas applications, novel concepts are being developed in wet scrubbing with physical absorption; chemical absorption with solid sorbents; and separation by membranes. In one concept, a wet scrubbing technique is being investigated that uses a physical solvent process to remove CO2 from the fuel gas of an IGCC system at elevated temperature and pressure. The need to define an ideal solvent has led to the study of the solubility and mass transfer properties of various solvents. Pertaining to another separation technology, fabrication techniques and mechanistic studies for membranes separating CO2 from the fuel gas produced by coal gasification are also being performed. Membranes that consist of CO2-philic ionic liquids encapsulated in a polymeric substrate have been investigated for permeability and selectivity. Finally, dry, regenerable processes based on sorbents are additional techniques for CO2 capture from fuel gas. An overview of these novel techniques is presented along with the research progress status of technologies related to membranes and physical solvents.

  12. A Web-Based Data Architecture for Problem-Solving Environments: Application of Distributed Authoring and Versioning to the Extensible Computational Chemistry Environment

    SciTech Connect

    Schuchardt, Karen L.; Myers, James D.; Stephan, Eric G.

    2001-12-01

    Next-generation problem-solving environments (PSEs) promise significant advances over those now available. They will span scientific disciplines and incorporate collaboration capabilities. They will host feature-detection and other agents, allow data mining and pedigree tracking, and provide access from a wide range of devices. Fundamental changes in PSE architecture are required to realize these and other PSE goals. This paper focuses specifically on issues related to data management and recommends an approach based on open, metadata-driven repositories with loosely defined, dynamic schemas. Benefits of this approach are discussed, and the redesign of the Extensible Computational Chemistry Environment's (Ecce) data storage architecture to use such a repository is described, based on the distributed authoring and versioning (DAV) standard. The suitability of DAV for scientific data, the mapping of the Ecce schema to DAV, and promising initial results are presented.
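
    The sketch below illustrates the general idea of attaching and querying application-defined metadata over WebDAV with plain HTTP verbs; the server URL, XML namespace and property name are hypothetical placeholders, not Ecce's actual schema.

```python
# Hedged sketch of metadata-driven storage over WebDAV: PROPPATCH attaches a
# custom property to a resource, PROPFIND reads it back. All names are placeholders.
import requests

BASE = "https://dav.example.org/ecce/calculations/h2o_b3lyp"   # hypothetical resource
NS = "http://example.org/ecce-schema"                          # hypothetical namespace

proppatch_body = f"""<?xml version="1.0" encoding="utf-8"?>
<D:propertyupdate xmlns:D="DAV:" xmlns:e="{NS}">
  <D:set><D:prop><e:basisSet>6-31G*</e:basisSet></D:prop></D:set>
</D:propertyupdate>"""

propfind_body = f"""<?xml version="1.0" encoding="utf-8"?>
<D:propfind xmlns:D="DAV:" xmlns:e="{NS}">
  <D:prop><e:basisSet/></D:prop>
</D:propfind>"""

# Attach application-defined metadata to the resource.
r1 = requests.request("PROPPATCH", BASE, data=proppatch_body,
                      headers={"Content-Type": "application/xml"})
# Read it back; Depth: 0 restricts the query to this resource.
r2 = requests.request("PROPFIND", BASE, data=propfind_body,
                      headers={"Content-Type": "application/xml", "Depth": "0"})
print(r1.status_code, r2.status_code)
print(r2.text)
```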

  13. Spatial and Temporal Variation of Bulk Snow Properties in North Boreal and Tundra Environments Based on Extensive Field Measurements

    NASA Astrophysics Data System (ADS)

    Hannula, H. R.; Lemmetyinen, J.; Pulliainen, J.; Kontu, A.; Derksen, C.

    2015-12-01

    A large collection of in situ snow data was collected in support of ESA SnowSAR airborne acquisitions in Northern Finland (Lemmetyinen et al., 2014). The purpose was to demonstrate the mission concept of the proposed ESA CoReH2O (Cold Regions Hydrology High-resolution Observatory, Rott et al., 2010) mission, a candidate in the ESA Earth Explorer series of Earth observing satellites. Around 21,400 snow depth measurements, 600 SWE measurements and a number of distributed snow pit measurements were collected during 19 days between December 2011 and March 2012. In this study, these field measurements are used to analyse the variation of snow properties within and between different land-cover types in north boreal and tundra environments. The wide heterogeneity of snow properties poses a challenge for remote retrieval of snow information, as even in flat areas the amount and type of heterogeneity vary on a number of different scales. In particular, information on hemispheric-scale SWE variation suffers from large uncertainties, although assimilation of ground-based and space-borne information and land-cover-specific algorithms have lowered these uncertainties (Takala et al., 2011). This study aims to contribute to more reliable SWE retrievals by statistically describing snow parameter variation in these northern environments. References: Lemmetyinen, J., J. Pulliainen, A. Kontu, A. Wiesmann, C. Mätzler, H. Rott, K. Voglmeier, T. Nagler, A. Meta, A. Coccia, M. Schneebeli, M. Proksch, M. Davidson, D. Schüttemeyer, Chung-Chi Lin, and M. Kern, 2014. Observations of seasonal snow cover at X- and Ku bands during the NoSREx campaign. Proc. EUSAR 2014, 3-5 June 2014, Berlin. Rott, H., S.H. Yueh, D.W. Cline, C. Duguay, R. Essery et al., 2010. Cold Regions Hydrology High-resolution Observatory for Snow and Cold Land Processes. Proc. IEEE, 98(5), 752-765. Takala, M., K. Luojus, J. Pulliainen, C. Derksen, J. Lemmetyinen, J-P. Kärnä, J. Koskinen

  14. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    PubMed

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. The device of an array of linear polarizers is introduced to achieve linear truncation on the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced in order to perform elliptical polarized light reconstruction from two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method to a known public key attack is also provided.
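
    A minimal Jones-calculus sketch of the linear truncation step is given below: an elliptically polarized field passes through an ideal linear polarizer at angle theta and the transmitted intensity is computed. It illustrates only this building block, not the full cryptosystem, and the field parameters are arbitrary.

```python
# Minimal Jones-calculus sketch: elliptical polarization state truncated by an
# ideal linear polarizer, with the transmitted intensity recorded.
import numpy as np

def elliptical_field(amplitude, psi, delta):
    """Jones vector with amplitude-ratio angle psi and phase retardation delta."""
    return amplitude * np.array([np.cos(psi), np.sin(psi) * np.exp(1j * delta)])

def linear_polarizer(theta):
    """Jones matrix of an ideal linear polarizer with transmission axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]])

E_in = elliptical_field(1.0, psi=np.deg2rad(30), delta=np.deg2rad(45))
for theta_deg in (0, 45, 90):
    E_out = linear_polarizer(np.deg2rad(theta_deg)) @ E_in
    intensity = float(np.sum(np.abs(E_out) ** 2))
    print(f"theta = {theta_deg:3d} deg  ->  transmitted intensity = {intensity:.3f}")
```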

  15. Functional model of monofin swimming technique based on the construction of neural networks.

    PubMed

    Rejman, Marek; Ochmann, Bartosz

    2007-01-01

    In this study we employed an artificial neural network to analyze the forces flexing the monofin in reaction to water resistance. In addition, we selected and characterized key kinematic parameters of leg and monofin movements that define how to use a monofin efficiently and economically to achieve maximum swimming speed. By collecting the data recorded by strain gauges placed throughout the monofin, we were able to demonstrate the distribution of forces flexing the monofin in a single movement cycle. Kinematic and dynamic data were synchronized and used as input variables to build a Multi-Layer Perceptron network. The horizontal velocity of the swimmer's center of body mass was used as the output variable. The network response graphs indicated the criteria for achieving maximum swimming speed. Our results pointed to the need to intensify the angular velocity of thigh extension and dorsal flexion of the feet, to increase the velocity of attack of the tail and to accelerate the attack of the distal part of the fin. Two further parameters which should be taken into account are the dynamics of the change in tail flexion in the downbeat and the dynamics of the change in angle of attack in the upbeat. Key points: The one-dimensional structure of monofin swimming creates favorable conditions for studying swimming technique. Monofin swimming modeling allows unequivocal interpretation of the propulsion structure, which further permits defining the mechanisms that determine efficient propulsion. This study is the first in which neural networks were applied to construct a functional model of monofin swimming technique applicable in practice. The objective suggestions lead to formulating criteria for monofin swimming technique, which play a crucial role in achieving maximal swimming speed. Theoretical and empirical verification of the parameters indicated by the neural networks paves the way for creating suitable models that could be employed in other sports.

  16. Functional Model of Monofin Swimming Technique Based on the Construction of Neural Networks

    PubMed Central

    Rejman, Marek; Ochmann, Bartosz

    2007-01-01

    In this study we employed an artificial neural network to analyze the forces flexing the monofin in reaction to water resistance. In addition, we selected and characterized key kinematic parameters of leg and monofin movements that define how to use a monofin efficiently and economically to achieve maximum swimming speed. By collecting the data recorded by strain gauges placed throughout the monofin, we were able to demonstrate the distribution of forces flexing the monofin in a single movement cycle. Kinematic and dynamic data were synchronized and used as input variables to build a Multi-Layer Perceptron network. The horizontal velocity of the swimmer's center of body mass was used as the output variable. The network response graphs indicated the criteria for achieving maximum swimming speed. Our results pointed to the need to intensify the angular velocity of thigh extension and dorsal flexion of the feet, to increase the velocity of attack of the tail and to accelerate the attack of the distal part of the fin. Two further parameters which should be taken into account are the dynamics of the change in tail flexion in the downbeat and the dynamics of the change in angle of attack in the upbeat. Key points: The one-dimensional structure of monofin swimming creates favorable conditions for studying swimming technique. Monofin swimming modeling allows unequivocal interpretation of the propulsion structure, which further permits defining the mechanisms that determine efficient propulsion. This study is the first in which neural networks were applied to construct a functional model of monofin swimming technique applicable in practice. The objective suggestions lead to formulating criteria for monofin swimming technique, which play a crucial role in achieving maximal swimming speed. Theoretical and empirical verification of the parameters indicated by the neural networks paves the way for creating suitable models that could be employed in other sports. PMID
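
    The sketch below trains a small multilayer perceptron on synthetic data to map illustrative kinematic features (e.g., thigh-extension angular velocity, foot dorsal-flexion velocity) to horizontal swimming velocity, in the spirit of the network described in the two records above; all feature names and data are invented for the example.

```python
# Hedged sketch: an MLP regressor mapping synthetic kinematic features to a
# synthetic horizontal-velocity target (not the authors' data or network).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 500
# Illustrative inputs: thigh-extension angular velocity, foot dorsal-flexion
# angular velocity, tail attack velocity, distal-fin attack acceleration.
X = rng.normal(size=(n, 4))
# Synthetic target: horizontal velocity as a noisy nonlinear mix of the inputs.
y = 1.5 + 0.4 * X[:, 0] + 0.3 * np.tanh(X[:, 1]) + 0.2 * X[:, 2] * X[:, 3] \
    + 0.05 * rng.normal(size=n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                   random_state=0))
model.fit(X[:400], y[:400])
print("R^2 on held-out cycles:", round(model.score(X[400:], y[400:]), 3))
```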

  17. Tests of an extension of the dual pathway model of bulimic symptoms to the state-based level.

    PubMed

    Holmes, Millicent; Fuller-Tyszkiewicz, Matthew; Skouteris, Helen; Broadbent, Jaclyn

    2014-04-01

    The dual pathway model proposes that trait body dissatisfaction leads to bulimic symptoms via two distinct pathways: dieting and trait negative affect. As many of these modelled variables have state-based equivalents, the present study evaluated the generalisability of this model to predict associations between state body dissatisfaction and instances of disordered eating. 124 women aged 18 to 40 years completed an online survey (accessed via a mobile phone device with web access) over a 7-day period. The mobile phone device prompted participants at random intervals seven times daily to self-report their state body dissatisfaction, current mood experiences, dieting attempts, and disordered eating practices. Multi-level mediation modelling revealed that both negative mood states and dieting significantly mediated the state body dissatisfaction-disordered eating relationships, although the strength of these associations depended on the aspect of disordered eating measured and individual differences in trait body dissatisfaction, internalization of appearance standards, tendency towards dieting, and BMI. Collectively, these results not only support adapting the dual pathway model to the state-level, but also suggest that several of the model implied pathways may be more relevant for individuals with more pathological eating- and body-related concerns and behaviours. PMID:24854819

  18. Orders of magnitude extension of the effective dynamic range of TDC-based TOFMS data through maximum likelihood estimation.

    PubMed

    Ipsen, Andreas; Ebbels, Timothy M D

    2014-10-01

    In a recent article, we derived a probability distribution that was shown to closely approximate that of the data produced by liquid chromatography time-of-flight mass spectrometry (LC/TOFMS) instruments employing time-to-digital converters (TDCs) as part of their detection system. The approach of formulating detailed and highly accurate mathematical models of LC/MS data via probability distributions that are parameterized by quantities of analytical interest does not appear to have been fully explored before. However, we believe it could lead to a statistically rigorous framework for addressing many of the data analytical problems that arise in LC/MS studies. In this article, we present new procedures for correcting for TDC saturation using such an approach and demonstrate that there is potential for significant improvements in the effective dynamic range of TDC-based mass spectrometers, which could make them much more competitive with the alternative analog-to-digital converters (ADCs). The degree of improvement depends on our ability to generate mass and chromatographic peaks that conform to known mathematical functions and on our ability to accurately describe the state of the detector dead time; these tasks may be best addressed through engineering efforts.
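
    As a simplified illustration of the idea (not the authors' full detector model), the sketch below assumes a single-stop TDC that records at most one ion per extraction in a given bin; under a Poisson arrival model the maximum likelihood estimate of the mean ion rate is -ln(1 - k/n), which decompresses saturated peaks.

```python
# Simplified illustration: for a TDC that records at most one ion per
# extraction in a bin, the observed hit fraction k/n saturates. With a Poisson
# arrival rate lam per extraction, P(hit) = 1 - exp(-lam), so the MLE is
# lam_hat = -ln(1 - k/n).
import numpy as np

def mle_rate(k, n):
    """ML estimate of the mean ions per extraction from k hits in n extractions."""
    return float(-np.log1p(-k / n))

rng = np.random.default_rng(0)
n_extractions = 10_000
true_rates = np.array([0.01, 0.1, 0.5, 1.0, 2.0])     # ions per extraction per bin
hits = rng.binomial(n_extractions, 1.0 - np.exp(-true_rates))

for lam, k in zip(true_rates, hits):
    naive = k / n_extractions
    print(f"true {lam:4.2f}   naive {naive:5.3f}   MLE {mle_rate(k, n_extractions):5.3f}")
```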

  19. Extensive Set of 16S rRNA-Based Probes for Detection of Bacteria in Human Feces

    PubMed Central

    Harmsen, Hermie J. M.; Raangs, Gerwin C.; He, Tao; Degener, John E.; Welling, Gjalt W.

    2002-01-01

    For the detection of six groups of anaerobic bacteria in human feces, we designed seven new 16S rRNA-based oligonucleotide probes. This set of probes extends the current set of probes and gives more data on the composition of the human gut flora. Probes were designed for Phascolarctobacterium and relatives (Phasco741), Veillonella (Veil223), Eubacterium hallii and relatives (Ehal1469), Lachnospira and relatives (Lach571), and Eubacterium cylindroides and relatives (Ecyl387), and two probes were designed for Ruminococcus and relatives (Rbro730 and Rfla729). The hybridization conditions for the new probes were optimized for fluorescent in situ hybridization, and the probes were validated against a set of reference organisms. The probes were applied to fecal samples of 11 volunteers to enumerate their target bacterial groups. The Phasco741 and Veil223 probes both detected average numbers below 1% of the total number of bacteria as determined with the bacterial kingdom-specific Bact338 probe. The Ecyl387 probe detected about 1.4%, the Lach571 and Ehal1469 probes detected 3.8 and 3.6%, respectively, and a combination of the Rbro730 and Rfla729 probes detected 10.3%. A set of 15 probes consisting of probes previously described and those presented here were evaluated in hybridization with the fecal samples of the same volunteers. Together, the group-specific probes detected 90% of the total bacterial cells. PMID:12039758

  20. Document-level classification of CT pulmonary angiography reports based on an extension of the ConText algorithm.

    PubMed

    Chapman, Brian E; Lee, Sean; Kang, Hyunseok Peter; Chapman, Wendy W

    2011-10-01

    In this paper we describe an application called peFinder for document-level classification of CT pulmonary angiography reports. peFinder is based on a generalized version of the ConText algorithm, a simple text processing algorithm for identifying features in clinical report documents. peFinder was used to answer questions about the disease state (pulmonary emboli present or absent), the certainty state of the diagnosis (uncertainty present or absent), the temporal state of an identified pulmonary embolus (acute or chronic), and the technical quality state of the exam (diagnostic or not diagnostic). Gold standard answers for each question were determined from the consensus classifications of three human annotators. peFinder results were compared to naive Bayes classifiers using unigrams and bigrams. The sensitivities (and positive predictive values) for peFinder were 0.98 (0.83), 0.86 (0.96), 0.94 (0.93), and 0.60 (0.90) for the disease, quality, certainty, and temporal states, respectively, compared to 0.68 (0.77), 0.67 (0.87), 0.62 (0.82), and 0.04 (0.25) for the naive Bayes classifier using unigrams, and 0.75 (0.79), 0.52 (0.69), 0.59 (0.84), and 0.04 (0.25) for the naive Bayes classifier using bigrams. PMID:21459155
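
    The sketch below shows a heavily simplified ConText-style rule, not peFinder itself: a target finding is labelled absent when a negation trigger occurs within a small window of preceding words; the trigger and target lists are illustrative only.

```python
# Simplified sketch of ConText-style negation scoping for one finding type.
# A target mention is "absent" if a negation trigger appears shortly before it.
import re

NEGATION_TRIGGERS = ["no ", "without", "negative for", "no evidence of"]
TARGET = re.compile(r"pulmonary embol(?:us|i|ism)", re.IGNORECASE)

def classify_sentence(sentence, window=6):
    """Return 'absent', 'present', or None if the finding is not mentioned."""
    match = TARGET.search(sentence)
    if not match:
        return None
    preceding = sentence[:match.start()].lower().split()[-window:]
    preceding_text = " ".join(preceding) + " "
    if any(trigger in preceding_text for trigger in NEGATION_TRIGGERS):
        return "absent"
    return "present"

reports = [
    "There is no evidence of pulmonary embolism.",
    "Filling defect consistent with acute pulmonary embolus in the right lower lobe.",
    "Lungs are clear bilaterally.",
]
for r in reports:
    print(classify_sentence(r), "-", r)
```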