Sample records for commonly applied techniques

  1. Performance Characterization of an Instrument.

    ERIC Educational Resources Information Center

    Salin, Eric D.

    1984-01-01

    Describes an experiment designed to teach students to apply the same statistical awareness to instrumentation they commonly apply to classical techniques. Uses propagation of error techniques to pinpoint instrumental limitations and breakdowns and to demonstrate capabilities and limitations of volumetric and gravimetric methods. Provides lists of…

  2. Closed-form Static Analysis with Inertia Relief and Displacement-Dependent Loads Using a MSC/NASTRAN DMAP Alter

    NASA Technical Reports Server (NTRS)

    Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.

    1995-01-01

    Solving for the displacements of free-free coupled systems acted upon by static loads is commonly performed throughout the aerospace industry. Many times, these problems are solved using static analysis with inertia relief. This solution technique allows for a free-free static analysis by balancing the applied loads with inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus displacement-dependent loads. Solving for the final displacements of such systems is commonly performed using iterative solution techniques. Unfortunately, these techniques can be time-consuming and labor-intensive. Since the coupled system equations for free-free systems with displacement-dependent loads can be written in closed form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. Using an MSC/NASTRAN DMAP Alter, displacement-dependent loads have been included in static analysis with inertia relief. Such an Alter has been used successfully to solve efficiently a common aerospace problem typically solved using an iterative technique.
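
    A minimal numerical sketch of the contrast drawn above, assuming the free-free system has already been balanced by inertia relief and constrained so that the equations take the form K u = F0 + A u (matrix names here are illustrative, not DMAP variables): the closed form solves (K - A) u = F0 once, while the iterative route repeats a fixed-point update until convergence.

    ```python
    # Closed-form versus iterative solution for displacement-dependent loads.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5
    K = np.diag(rng.uniform(10.0, 20.0, n))    # stand-in (constrained) stiffness
    A = 0.1 * rng.standard_normal((n, n))      # displacement-dependent load matrix
    F0 = rng.standard_normal(n)                # original applied loads

    u_closed = np.linalg.solve(K - A, F0)      # closed-form solution

    u = np.zeros(n)                            # iterative (fixed-point) solution
    for _ in range(200):
        u = np.linalg.solve(K, F0 + A @ u)

    print(np.allclose(u, u_closed))            # both approaches agree
    ```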

  3. MSC/NASTRAN DMAP Alter Used for Closed-Form Static Analysis With Inertia Relief and Displacement-Dependent Loads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Solving for the displacements of free-free coupled systems acted upon by static loads is a common task in the aerospace industry. Often, these problems are solved by static analysis with inertia relief. This technique allows for a free-free static analysis by balancing the applied loads with the inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus the displacement-dependent loads. A launch vehicle being acted upon by an aerodynamic loading can have such applied loads. The final displacements of such systems are commonly determined with iterative solution techniques. Unfortunately, these techniques can be time consuming and labor intensive. Because the coupled system equations for free-free systems with displacement-dependent loads can be written in closed form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. An MSC/NASTRAN (MacNeal-Schwendler Corporation/NASA Structural Analysis) DMAP (Direct Matrix Abstraction Program) Alter was used to include displacement-dependent loads in static analysis with inertia relief. It efficiently solved a common aerospace problem that typically has been solved with an iterative technique.

  4. Techniques for characterizing lignin

    Treesearch

    Nicole M. Stark; Daniel J. Yelle; Umesh P. Agarwal

    2016-01-01

    Many techniques are available to characterize lignin. The techniques presented in this chapter are considered nondegradative and are commonly applied to lignin. A brief discussion of lignin structure is included in this chapter to aid the reader in understanding why the discussed characterization techniques are appropriate for the study of lignin. Because the...

  5. Vocal cord paralysis in children.

    PubMed

    King, Ericka F; Blumin, Joel H

    2009-12-01

    Vocal fold paralysis (VFP) is a problem identified increasingly often in pediatric patients. Diagnostic and management techniques honed in adult laryngologic practice have been successfully applied to children. Iatrogenic causes, including cardiothoracic procedures, remain a common cause of unilateral VFP. Neurologic disorders predominate in the cause of bilateral VFP. Diagnosis with electromyography is currently being evaluated in children. Treatment of VFP is centered on symptomatology, which is commonly divided between voice and airway concerns. Speech therapy shows promise in older children. Surgical management for unilateral VFP with injection laryngoplasty is commonly performed and well tolerated. Laryngeal reinnervation is currently being applied to the pediatric population as a permanent treatment and offers several advantages over laryngeal framework procedures. For bilateral VFP, tracheotomy is still commonly performed. Glottic dilation procedures are performed both openly and endoscopically with a high degree of success. VFP is a well-recognized problem in pediatric patients with disordered voice and breathing. Some patients will spontaneously recover their laryngeal function. For those who do not, a variety of reliable techniques are available for rehabilitative treatment.

  6. Pairwise-Comparison Software

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.

    1995-01-01

    Pairwise comparison (PWC) is a computer program that collects data for psychometric scaling techniques now used in cognitive research. It applies the technique of pairwise comparisons, one of many techniques commonly used to acquire the data necessary for such analyses. PWC administers the task, collects data from the test subject, and formats the data for analysis. Written in Turbo Pascal v6.0.
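
    PWC itself administers the task and formats the data; the scaling happens downstream. As one hedged illustration of such an analysis (Thurstone Case V scaling, a classic pairwise-comparison technique, not necessarily the one used with PWC), with an invented win matrix:

    ```python
    # Thurstone Case V scaling from pairwise-comparison counts.
    # wins[i][j] = number of subjects who chose stimulus i over stimulus j.
    from statistics import NormalDist

    wins = [[0, 8, 9],
            [2, 0, 7],
            [1, 3, 0]]
    n_subjects = 10
    k = len(wins)

    def zscore(p):
        p = min(max(p, 0.01), 0.99)        # clip to avoid infinite z at 0 or 1
        return NormalDist().inv_cdf(p)

    # Scale value of stimulus i = mean z of P(i preferred over j), j != i.
    scale = [sum(zscore(wins[i][j] / n_subjects) for j in range(k) if j != i) / (k - 1)
             for i in range(k)]
    print(scale)                           # higher = stronger latent preference
    ```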

  7. Runtime support for parallelizing data mining algorithms

    NASA Astrophysics Data System (ADS)

    Jin, Ruoming; Agrawal, Gagan

    2002-03-01

    With recent technological advances, shared memory parallel machines have become more scalable, and offer large main memories and high bus bandwidths. They are emerging as good platforms for data warehousing and data mining. In this paper, we focus on shared memory parallelization of data mining algorithms. We have developed a series of techniques for parallelization of data mining algorithms, including full replication, full locking, fixed locking, optimized full locking, and cache-sensitive locking. Unlike previous work on shared memory parallelization of specific data mining algorithms, all of our techniques apply to a large number of common data mining algorithms. In addition, we propose a reduction-object based interface for specifying a data mining algorithm. We show how our runtime system can apply any of the techniques we have developed starting from a common specification of the algorithm.
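
    A sketch of the full-replication idea, under stated assumptions and in Python rather than the authors' runtime system: each thread accumulates into its own private reduction object, so no locks are needed during the scan, and the replicas are merged once at the end. (Python's GIL limits real speedup; this only shows the structure.)

    ```python
    # Full replication: one private reduction object (Counter) per thread.
    from collections import Counter
    from threading import Thread

    data = [i % 7 for i in range(100_000)]        # toy items to "mine"

    def worker(chunk, local):                     # touches only its own replica
        for item in chunk:
            local[item] += 1

    n_threads = 4
    replicas = [Counter() for _ in range(n_threads)]
    chunks = [data[i::n_threads] for i in range(n_threads)]
    threads = [Thread(target=worker, args=(c, r)) for c, r in zip(chunks, replicas)]
    for t in threads: t.start()
    for t in threads: t.join()

    totals = sum(replicas, Counter())             # lock-free merge step
    print(totals[0])
    ```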

  8. Auricular reconstruction for microtia: Part II. Surgical techniques.

    PubMed

    Walton, Robert L; Beahm, Elisabeth K

    2002-07-01

    Reconstruction of the microtic ear represents one of the most demanding challenges in reconstructive surgery. In this review the two most commonly used techniques for ear reconstruction, the Brent and Nagata techniques, are addressed in detail. Unique to this endeavor, the originator of each technique has been allowed to submit representative case material and to address the pros and cons of the other's technique. What follows is a detailed, insightful overview of microtia reconstruction, as a state of the art. The review then details commonly encountered problems in ear reconstruction and pertinent technical points. Finally, a glimpse into the future is offered with an accounting of the advances made in tissue engineering as this technology applies to auricular reconstruction.

  9. Mechanisms of behavior modification in clinical behavioral medicine in China.

    PubMed

    Yang, Zhiyin; Su, Zhonghua; Ji, Feng; Zhu, Min; Bai, Bo

    2014-08-01

    Behavior modification, as the core of clinical behavioral medicine, is often used in clinical settings. We seek to summarize behavior modification techniques that are commonly used in clinical practice of behavioral medicine in China and discuss possible biobehavioral mechanisms. We reviewed common behavior modification techniques in clinical settings in China, and we reviewed studies that explored possible biobehavioral mechanisms. Commonly used clinical approaches of behavior modification in China include behavior therapy, cognitive therapy, cognitive-behavioral therapy, health education, behavior management, behavioral relaxation training, stress management intervention, desensitization therapy, biofeedback therapy, and music therapy. These techniques have been applied in the clinical treatment of a variety of diseases, such as chronic diseases, psychosomatic diseases, and psychological disorders. The biobehavioral mechanisms of these techniques involve the autonomic nervous system, neuroendocrine system, neurobiochemistry, and neuroplasticity. Behavior modification techniques are commonly used in the treatment of a variety of somatic and psychological disorders in China. Multiple biobehavioral mechanisms are involved in successful behavior modification.

  10. Autoclave heat treatment for prealloyed powder products

    NASA Technical Reports Server (NTRS)

    Freche, J. C.; Ashbrook, R. L.

    1973-01-01

    The technique could be applied directly to loose powders as part of a hot pressing process that forms them to any required shape. This would eliminate the initial extrusion step commonly applied to prealloyed powders, substantially reduce the cost of the forming operation, and result in optimum properties.

  11. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of a physical process, or time series data. These data lend themselves to a common set of statistical techniques and models designed to determine the trends and variability (e.g., seasonality) of these repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g. interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
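
    A guess at the kind of pre-processing and trend/seasonality workflow described, sketched with pandas; the file name, column names, and template layout are assumptions, not the Toolbox's actual format.

    ```python
    # Interpolate gaps, aggregate, and split a hydrologic series into
    # trend and seasonal components.
    import pandas as pd

    df = pd.read_csv("gage_flow.csv", parse_dates=["date"], index_col="date")
    daily = df["flow"].asfreq("D").interpolate()      # fill gaps in the record
    monthly = daily.resample("MS").mean()             # aggregate to monthly means
    trend = monthly.rolling(12, center=True).mean()   # 12-month moving average
    seasonality = (monthly - trend).groupby(monthly.index.month).mean()
    print(seasonality)                                # mean departure by month
    ```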

  12. Solid Phase Extraction (SPE) for Biodiesel Processing and Analysis

    DTIC Science & Technology

    2017-12-13

    ...sources. There are several methods that can be applied to the development of separation techniques that may replace necessary water wash steps in...biodiesel refinement. Unfortunately, the most common methods are poorly suited or face high costs when applied to diesel purification. Distillation is

  13. Techniques to evaluate the importance of common cause degradation on reliability and safety of nuclear weapons.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darby, John L.

    2011-05-01

    As the nuclear weapon stockpile ages, there is increased concern about common degradation ultimately leading to common cause failure of multiple weapons that could significantly impact reliability or safety. Current acceptable limits for the reliability and safety of a weapon are based on upper limits on the probability of failure of an individual item, assuming that failures among items are independent. We expanded the current acceptable limits to apply to situations with common cause failure. Then, we developed a simple screening process to quickly assess the importance of observed common degradation for both reliability and safety to determine if further action is necessary. The screening process conservatively assumes that common degradation is common cause failure. For a population with between 100 and 5000 items we applied the screening process and conclude the following. In general, for a reliability requirement specified in the Military Characteristics (MCs) for a specific weapon system, common degradation is of concern if more than 100(1-x)% of the weapons are susceptible to common degradation, where x is the required reliability expressed as a fraction. Common degradation is of concern for the safety of a weapon subsystem if more than 0.1% of the population is susceptible to common degradation. Common degradation is of concern for the safety of a weapon component or overall weapon system if two or more components/weapons in the population are susceptible to degradation. Finally, we developed a technique for detailed evaluation of common degradation leading to common cause failure for situations that are determined to be of concern using the screening process. The detailed evaluation requires that best estimates of common cause and independent failure probabilities be produced. Using these techniques, observed common degradation can be evaluated for effects on reliability and safety.
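
    The screening thresholds quoted above translate directly into code; this sketch is a plain transcription of the three rules, with the function name and return keys invented for illustration.

    ```python
    # Screening rules for observed common degradation (per the abstract).
    def screen_common_degradation(pop_size, n_susceptible, required_reliability):
        frac = n_susceptible / pop_size
        return {
            # reliability: concern if more than 100(1-x)% are susceptible
            "reliability_concern": frac > (1.0 - required_reliability),
            # subsystem safety: concern if more than 0.1% are susceptible
            "subsystem_safety_concern": frac > 0.001,
            # component/system safety: concern if two or more are susceptible
            "component_safety_concern": n_susceptible >= 2,
        }

    print(screen_common_degradation(pop_size=1000, n_susceptible=5,
                                    required_reliability=0.99))
    ```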

  14. New Researches and Application Progress of Commonly Used Optical Molecular Imaging Technology

    PubMed Central

    Chen, Zhi-Yi; Yang, Feng; Lin, Yan; Zhou, Qiu-Lan; Liao, Yang-Ying

    2014-01-01

    Optical molecular imaging, a new medical imaging technique, is developed based on genomics, proteomics and modern optical imaging techniques, and is characterized by non-invasiveness, non-radiativity, high cost-effectiveness, high resolution, high sensitivity and simple operation in comparison with conventional imaging modalities. Currently, it has become one of the most widely used molecular imaging techniques and has been applied in gene expression regulation and activity detection, biological development and cytological detection, drug research and development, pathogenesis research, pharmaceutical effect evaluation and therapeutic effect evaluation, and so forth. This paper reviews the latest research and application progress of commonly used optical molecular imaging techniques such as bioluminescence imaging and fluorescence molecular imaging. PMID:24696850

  15. Application of augmented reality for inferior alveolar nerve block anesthesia: A technical note

    PubMed Central

    2017-01-01

    Efforts to apply augmented reality (AR) technology in the medical field include the introduction of AR techniques into dental practice. The present report introduces a simple method of applying AR during an inferior alveolar nerve block, a procedure commonly performed in dental clinics. PMID:28879340

  16. Application of augmented reality for inferior alveolar nerve block anesthesia: A technical note.

    PubMed

    Won, Yu-Jin; Kang, Sang-Hoon

    2017-06-01

    Efforts to apply augmented reality (AR) technology in the medical field include the introduction of AR techniques into dental practice. The present report introduces a simple method of applying AR during an inferior alveolar nerve block, a procedure commonly performed in dental clinics.

  17. Functional Magnetic Resonance Imaging in Alzheimer's Disease Drug Development.

    PubMed

    Holiga, Stefan; Abdulkadir, Ahmed; Klöppel, Stefan; Dukart, Juergen

    2018-01-01

    While functional magnetic resonance imaging is now commonly applied to study human brain function, its value in drug development has only recently been recognized. Here we describe the different functional magnetic resonance imaging techniques applied in Alzheimer's disease drug development, with their applications, implementation guidelines, and potential pitfalls.

  18. Spatial Assessment of Model Errors from Four Regression Techniques

    Treesearch

    Lianjun Zhang; Jeffrey H. Gove

    2005-01-01

    Forest modelers have attempted to account for the spatial autocorrelations among trees in growth and yield models by applying alternative regression techniques such as linear mixed models (LMM), generalized additive models (GAM), and geographically weighted regression (GWR). However, the model errors are commonly assessed using average errors across the entire study...

  19. Self-Normalized Photoacoustic Technique for the Quantitative Analysis of Paper Pigments

    NASA Astrophysics Data System (ADS)

    Balderas-López, J. A.; Gómez y Gómez, Y. M.; Bautista-Ramírez, M. E.; Pescador-Rojas, J. A.; Martínez-Pérez, L.; Lomelí-Mejía, P. A.

    2018-03-01

    A self-normalized photoacoustic technique was applied for quantitative analysis of pigments embedded in solids. Paper samples (filter paper, Whatman No. 1) treated with the pigment Direct Fast Turquoise Blue GL were used for this study. This pigment is a blue dye commonly used in industry to dye paper and other fabrics. The optical absorption coefficient, at a wavelength of 660 nm, was measured for this pigment at various concentrations in the paper substrate. It was shown that the Beer-Lambert model for light absorption applies well for pigments in solid substrates, and optical absorption coefficients as large as 220 cm^{-1} can be measured with this photoacoustic technique.
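
    A small sketch of the linearity check implied by Beer-Lambert behaviour: the measured absorption coefficient should scale linearly with pigment concentration. The values below are synthetic placeholders, not the paper's data.

    ```python
    # Linear fit of absorption coefficient versus pigment concentration.
    import numpy as np

    conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])           # relative concentration
    alpha = np.array([14.0, 27.0, 56.0, 109.0, 220.0])   # absorption coeff., 1/cm

    slope, intercept = np.polyfit(conc, alpha, 1)
    r = np.corrcoef(conc, alpha)[0, 1]
    print(f"alpha = {slope:.1f} * c + {intercept:.1f}, r = {r:.4f}")
    ```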

  20. Forces of Commonly Used Chiropractic Techniques for Children: A Review of the Literature.

    PubMed

    Todd, Angela J; Carroll, Matthew T; Mitchell, Eleanor K L

    2016-01-01

    The purpose of this study is to review the available literature that describes forces of the most commonly used chiropractic techniques for children. Review of the English-language literature using search terms Chiropract* and technique, protocol, or approach in databases PubMed, Cumulative Index to Nursing and Allied Health Literature, Allied and Complementary Medicine, and Index to Chiropractic Literature and direct contact with authors of articles and book chapters. Eleven articles that discussed the 7 most commonly used pediatric chiropractic techniques and the forces applied were identified. Chiropractic techniques reviewed described forces that were modified based on the age of the patient. Force data for mechanically assisted devices were varied, with the minimum force settings for some devices outside the age-specific safe range recommended in the literature when not modified in some way. This review found that technique selection and application by chiropractors treating infants and young children are typically modified in force and speed to suit the age and development of the child.

  1. Fire in Eastern Hardwood Forests through 14,000 Years

    Treesearch

    Martin A. Spetich; Roger W. Perry; Craig A. Harper; Stacy L. Clark

    2011-01-01

    Fire helped shape the structure and species composition of hardwood forests of the eastern United States over the past 14,000 years. Periodic fires were common in much of this area prior to European settlement, and fire-resilient species proliferated. Early European settlers commonly adopted Native American techniques of applying fire to the landscape. As the demand...

  2. Using Single Drop Microextraction for Headspace Analysis with Gas Chromatography

    ERIC Educational Resources Information Center

    Riccio, Daniel; Wood, Derrick C.; Miller, James M.

    2008-01-01

    Headspace (HS) gas chromatography (GC) is commonly used to analyze samples that contain non-volatiles. In 1996, a new sampling technique called single drop microextraction, SDME, was introduced, and in 2001 it was applied to HS analysis. It is a simple technique that uses equipment normally found in the undergraduate laboratory, making it ideal…

  3. A Procedure for Estimating a Criterion-Referenced Standard to Identify Educationally Deprived Children for Title I Services. Final Report.

    ERIC Educational Resources Information Center

    Ziomek, Robert L.; Wright, Benjamin D.

    Techniques such as the norm-referenced and average score techniques, commonly used in the identification of educationally disadvantaged students, are critiqued. This study applied latent trait theory, specifically the Rasch Model, along with teacher judgments relative to the mastery of instructional/test decisions, to derive a standard setting…

  4. Circumcision-incision orchidopexy: A novel technique for palpable, low inguinal undescended testis.

    PubMed

    Chua, Michael E; Silangcruz, Jan Michael A; Gomez, Odina; Dy, Jun S; Morales, Marcelino L

    2017-11-01

    Given that both orchidopexy and circumcision are commonly done in a single operative setting, we adopted a technique of combined orchidopexy and circumcision using a single circumcision incision. We applied this new technique to boys with palpable, low inguinal cryptorchidism. Here we describe a case series of 7 boys who underwent concurrent orchidopexy via the circumcision site. We present this novel technique and discuss our preliminary outcomes, including the anatomic basis and feasibility. The technique appears to be an alternative for concurrent circumcision and cryptorchid cases with palpable, low inguinal testes.

  5. Soil Microbial Forensics.

    PubMed

    Santiago-Rodriguez, Tasha M; Cano, Raúl J

    2016-08-01

    Soil microbial forensics can be defined as the study of how microorganisms can be applied to forensic investigations. The field of soil microbial forensics is of increasing interest and applies techniques commonly used in diverse disciplines in order to identify microbes and determine their abundances, complexities, and interactions with soil and surrounding objects. Emerging new techniques are also providing insights into the complexity of microbes in soil. Soil may harbor unique microbes that may reflect specific physical and chemical characteristics indicating site specificity. While applications of some of these techniques in the field of soil microbial forensics are still in early stages, we are still gaining insight into how microorganisms may be more robustly used in forensic investigations.

  6. Which Technique Is Most Effective for Learning Declarative Concepts--Provided Examples, Generated Examples, or Both?

    ERIC Educational Resources Information Center

    Zamary, Amanda; Rawson, Katherine A.

    2018-01-01

    Students in many courses are commonly expected to learn declarative concepts, which are abstract concepts denoted by key terms with short definitions that can be applied to a variety of scenarios as reported by Rawson et al. ("Educational Psychology Review" 27:483-504, 2015). Given that declarative concepts are common and foundational in…

  7. SBL-Online: Implementing Studio-Based Learning Techniques in an Online Introductory Programming Course to Address Common Programming Errors and Misconceptions

    ERIC Educational Resources Information Center

    Polo, Blanca J.

    2013-01-01

    Much research has been done in regards to student programming errors, online education and studio-based learning (SBL) in computer science education. This study furthers this area by bringing together this knowledge and applying it to proactively help students overcome impasses caused by common student programming errors. This project proposes a…

  8. Multi-frame image processing with panning cameras and moving subjects

    NASA Astrophysics Data System (ADS)

    Paolini, Aaron; Humphrey, John; Curt, Petersen; Kelmelis, Eric

    2014-06-01

    Imaging scenarios commonly involve erratic, unpredictable camera behavior or subjects that are prone to movement, complicating multi-frame image processing techniques. To address these issues, we developed three techniques that can be applied to multi-frame image processing algorithms in order to mitigate the adverse effects observed when cameras are panning or subjects within the scene are moving. We provide a detailed overview of the techniques and discuss the applicability of each to various movement types. In addition to this, we evaluated algorithm efficacy with demonstrated benefits using field test video, which has been processed using our commercially available surveillance product. Our results show that algorithm efficacy is significantly improved in common scenarios, expanding our software's operational scope. Our methods introduce little computational burden, enabling their use in real-time and low-power solutions, and are appropriate for long observation periods. Our test cases focus on imaging through turbulence, a common use case for multi-frame techniques. We present results of a field study designed to test the efficacy of these techniques under expanded use cases.

  9. Low cost MATLAB-based pulse oximeter for deployment in research and development applications.

    PubMed

    Shokouhian, M; Morling, R C S; Kale, I

    2013-01-01

    Problems such as motion artifact and the effects of ambient light have forced developers to design different signal processing techniques and algorithms to increase the reliability and accuracy of the conventional pulse oximeter device. To evaluate the robustness of these techniques, they are applied either to recorded data or are implemented on chip to be applied to real-time data. Recorded data is the most common method of evaluation; however, it is not as reliable as real-time measurement. On the other hand, hardware implementation can be both expensive and time consuming. This paper presents a low cost MATLAB-based pulse oximeter that can be used for rapid evaluation of newly developed signal processing techniques and algorithms. Flexibility to apply different signal processing techniques, provision of both processed and unprocessed data, and low implementation cost are the important features of this design, which make it ideal for research and development purposes, as well as commercial, hospital and healthcare applications.
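
    For orientation, a hedged sketch of the standard ratio-of-ratios estimate that such a test bench lets a developer prototype; the signals are synthetic, and the linear calibration constants (110, 25) are a textbook approximation rather than values from this paper.

    ```python
    # Ratio-of-ratios SpO2 estimate from synthetic red/infrared PPG signals.
    import numpy as np

    t = np.linspace(0, 10, 1000)
    red = 2.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)   # DC level + pulsatile AC
    ir = 1.8 + 0.03 * np.sin(2 * np.pi * 1.2 * t)

    def ac_dc(x):
        return (x.max() - x.min()) / x.mean()        # AC amplitude over DC level

    R = ac_dc(red) / ac_dc(ir)                       # ratio of ratios
    spo2 = 110.0 - 25.0 * R                          # empirical linear calibration
    print(f"R = {R:.3f}, estimated SpO2 = {spo2:.1f}%")
    ```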

  10. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    PubMed

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.

  11. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    PubMed

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices, like soils, sediments, sludge, and airborne particulate matter. In the first part of this review we reported an overview on these procedures and described the applications of chemometric uni- and bivariate techniques and of multivariate pattern recognition techniques based on variable reduction to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values and the calculation of element fractionation. We will describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied and the key findings obtained. The following topics will be critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA) and other less common techniques; modelling by multiple linear regression (MLR); investigation of spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; other multivariate techniques less commonly applied.

  12. Timing analysis by model checking

    NASA Technical Reports Server (NTRS)

    Naydich, Dimitri; Guaspari, David

    2000-01-01

    The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.
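
    A toy illustration of the target property rather than the authors' tool: with time discretized so the clock stays finite, every state of a two-task fixed-priority scheduler over one hyperperiod is enumerated and each deadline is asserted. Task parameters are invented.

    ```python
    # Exhaustive check of schedulability over one hyperperiod (deadline = period).
    from math import gcd

    tasks = [(4, 1), (6, 2)]             # (period, execution time) per task
    hyper = (4 * 6) // gcd(4, 6)         # behaviour repeats after this horizon

    remaining = [0, 0]                   # unserved work per task
    for t in range(hyper):
        for i, (period, wcet) in enumerate(tasks):
            if t % period == 0:          # job release: previous job must be done
                assert remaining[i] == 0, f"task {i} missed a deadline at t={t}"
                remaining[i] = wcet
        for i in range(len(tasks)):      # highest priority = shortest period
            if remaining[i] > 0:
                remaining[i] -= 1        # run one time unit
                break

    assert all(r == 0 for r in remaining)
    print("all deadlines met over one hyperperiod")
    ```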

  13. Mapping brain activity in gradient-echo functional MRI using principal component analysis

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Singh, Manbir; Don, Manuel

    1997-05-01

    The detection of sites of brain activation in functional MRI has been a topic of immense research interest, and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noises. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. This technique is well suited to EPI image sequences where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two methods that are based on PCA to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique where a single image sequence with alternating on and off stages is subject to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis technique (CSF). As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained by using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that PCA and CSF methods have good potential in detecting the true stimulus correlated changes in the presence of other interfering signals.
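
    The core PCA step can be sketched in a few lines on synthetic data: the leading component's time course is extracted without any reference to the stimulus paradigm and is only compared with it after the fact, which is the advantage the abstract highlights.

    ```python
    # PCA (via SVD) of a time-by-voxel matrix with a hidden block design.
    import numpy as np

    rng = np.random.default_rng(1)
    n_t, n_vox = 80, 500
    paradigm = np.tile([0.0] * 10 + [1.0] * 10, 4)   # on/off block design
    data = rng.standard_normal((n_t, n_vox))         # background noise
    data[:, :40] += 2.0 * paradigm[:, None]          # 40 "activated" voxels

    X = data - data.mean(axis=0)                     # center each voxel
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    pc1 = U[:, 0] * S[0]                             # leading time course

    r = np.corrcoef(pc1, paradigm)[0, 1]
    print(f"|correlation with paradigm| = {abs(r):.2f}")   # close to 1
    ```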

  14. Order reduction for a model of marine bacteriophage evolution

    NASA Astrophysics Data System (ADS)

    Pagliarini, Silvia; Korobeinikov, Andrei

    2017-02-01

    A typical mechanistic model of viral evolution necessarily includes several time scales which can differ by orders of magnitude. Such a diversity of time scales makes analysis of these models difficult. Reducing the order of a model is highly desirable when handling such a model. A typical approach applied to such slow-fast (or singularly perturbed) systems is the time scale separation technique. Constructing the so-called quasi-steady-state approximation is the usual first step in applying the technique. While this technique is commonly applied, in some cases its straightforward application can lead to unsatisfactory results. In this paper we construct the quasi-steady-state approximation for a model of evolution of marine bacteriophages based on the Beretta-Kuang model. We show that for this particular model the quasi-steady-state approximation is able to produce only a qualitative but not a quantitative fit.
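
    In generic form (this is the shape of the reduction step, not the Beretta-Kuang equations themselves), a slow-fast system with slow variables x and fast variables y reads:

    ```latex
    \dot{x} = f(x, y), \qquad
    \epsilon\,\dot{y} = g(x, y), \qquad 0 < \epsilon \ll 1 .
    ```

    The quasi-steady-state approximation sets epsilon to zero, solves g(x, y) = 0 for the fast variable as y = h(x), and studies the reduced slow system dx/dt = f(x, h(x)).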

  15. Circumcision-incision orchidopexy: A novel technique for palpable, low inguinal undescended testis

    PubMed Central

    Chua, Michael E.; Silangcruz, Jan Michael A.; Gomez, Odina; Dy, Jun S.; Morales, Marcelino L.

    2017-01-01

    Given that both orchidopexy and circumcision are commonly done in a single operative setting, we adopted a technique of combined orchidopexy and circumcision using a single circumcision incision. We applied this new technique to boys with palpable, low inguinal cryptorchidism. Here we describe a case series of 7 boys who underwent concurrent orchidopexy via the circumcision site. We present this novel technique and discuss our preliminary outcomes, including the anatomic basis and feasibility. The technique appears to be an alternative for concurrent circumcision and cryptorchid cases with palpable, low inguinal testes. PMID:29124248

  16. Concepts and methods in neuromodulation and functional electrical stimulation: an introduction.

    PubMed

    Holsheimer, J

    1998-04-01

    This article introduces two clinical fields in which stimulation is applied to the nervous system: neuromodulation and functional electrical stimulation. The concepts underlying these fields and their main clinical applications, as well as the methods and techniques used in each field, are described. Concepts and techniques common in one field that might be beneficial to the other are discussed.

  17. Knowledge Innovation System: The Common Language.

    ERIC Educational Resources Information Center

    Rogers, Debra M. Amidon

    1993-01-01

    The Knowledge Innovation System is a management technique in which a networked enterprise uses knowledge flow as a collaborative advantage. Enterprise Management System-Architecture, which can be applied to collaborative activities, has five domains: economic, sociological, psychological, managerial, and technological. (SK)

  18. A novel analytical technique suitable for the identification of plastics.

    PubMed

    Nečemer, Marijan; Kump, Peter; Sket, Primož; Plavec, Janez; Grdadolnik, Jože; Zvanut, Maja

    2013-01-01

    The enormous development and production of plastic materials in the last century resulted in increasing numbers of such kinds of objects. Development of a simple and fast technique to classify different types of plastics could be used in many activities dealing with plastic materials, such as packaging of food and sorting of used plastic materials, and also, if the technique were non-destructive, for conservation of plastic artifacts in museum collections, a relatively new field of interest since 1990. In our previous paper we introduced a non-destructive technique for fast identification of unknown plastics based on EDXRF spectrometry [1], using as a case study some plastic artifacts archived in the Museum in order to show the advantages of the nondestructive identification of plastic material. To validate our technique, it was necessary to compare our analyses with some of the analytical techniques that are more established and so far rather widely applied in identifying the most common sorts of plastic materials.

  19. A Simple Lightning Assimilation Technique For Improving Retrospective WRF Simulations

    EPA Science Inventory

    Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain...

  20. OOPs!

    ERIC Educational Resources Information Center

    Margush, Tim

    2001-01-01

    Discussion of Object Oriented Programming (OOP) focuses on criticism of an earlier article that addressed problems of applying specific functionality to controls across several forms in a Visual Basic project. Examines the Object Oriented techniques, inheritance and composition, commonly employed to extend the functionality of an object.…

  1. A simple lightning assimilation technique for improving retrospective WRF simulations.

    EPA Science Inventory

    Convective rainfall is often a large source of error in retrospective modeling applications. In particular, positive rainfall biases commonly exist during summer months due to overactive convective parameterizations. In this study, lightning assimilation was applied in the Kain-F...

  2. Transverse Pupil Shifts for Adaptive Optics Non-Common Path Calibration

    NASA Technical Reports Server (NTRS)

    Bloemhof, Eric E.

    2011-01-01

    A simple new way of obtaining absolute wavefront measurements with a laboratory Fizeau interferometer was recently devised. In that case, the observed wavefront map is the difference of two cavity surfaces, those of the mirror under test and of an unknown reference surface on the Fizeau's transmission flat. The absolute surface of each can be determined by applying standard wavefront reconstruction techniques to two grids of absolute surface height differences of the mirror under test, obtained from pairs of measurements made with slight transverse shifts in X and Y. Adaptive optics systems typically provide an actuated periscope between the wavefront sensor (WFS) and the common-mode optics, used for lateral registration of the deformable mirror (DM) to the WFS. This periscope permits independent adjustment of either the pupil or the focal spot incident on the WFS. It would be used to give the required lateral pupil motion between common and non-common segments, analogous to the lateral shifts of the two phase contributions in the lab Fizeau. The technique is based on a completely new approach to calibration of phase. It offers unusual flexibility with regard to the transverse spatial frequency scales probed, and will give results quite quickly, making use of no auxiliary equipment other than that built into the adaptive optics system. The new technique may be applied to provide novel calibration information about other optical systems in which the beam may be shifted transversely in a controlled way.
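
    A one-dimensional toy version of the underlying reconstruction idea, with all quantities synthetic: a lateral shift by one sample turns the measured map into first differences of the unknown phase, and a cumulative sum inverts those differences up to an unknown piston term.

    ```python
    # Reconstructing a wavefront from shifted-minus-unshifted differences.
    import numpy as np

    x = np.linspace(0, 1, 101)
    phi = 0.3 * np.sin(2 * np.pi * x) + 0.1 * x**2       # unknown wavefront (toy)

    diffs = phi[1:] - phi[:-1]                           # shifted-difference data
    phi_rec = np.concatenate([[0.0], np.cumsum(diffs)])  # integrate back up

    err = (phi_rec - phi) - (phi_rec - phi).mean()       # compare up to piston
    print(f"max reconstruction error: {np.abs(err).max():.2e}")
    ```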

  3. A new technique for measuring gas conversion factors for hydrocarbon mass flowmeters

    NASA Technical Reports Server (NTRS)

    Singh, J. J.; Sprinkle, D. R.

    1983-01-01

    A technique for measuring calibration conversion factors for hydrocarbon mass flowmeters was developed. It was applied to a widely used type of commercial thermal mass flowmeter for hydrocarbon gases. The values of conversion factors for two common hydrocarbons measured using this technique are in good agreement with the empirical values cited by the manufacturer. Similar agreement can be expected for all other hydrocarbons. The technique is based on the Nernst theorem for matching the partial pressure of oxygen in the combustion product gases with that in normal air. It is simple, quick and relatively safe, particularly for toxic/poisonous hydrocarbons.

  4. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) of expert systems. To achieve this, some background in V&V as applied to conventionally implemented software is required. Part one will discuss the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one will also overview some common analysis techniques that are applied when performing V&V of software. All of these materials will be presented based on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  5. Modeling Multidisciplinary Science: Incorporating a Common Research Theme into Biology and Chemistry Courses.

    ERIC Educational Resources Information Center

    Reed, Kelynne E.; Stewart, Betty H.; Redshaw, Peggy A.

    2003-01-01

    Describes a project using a multidisciplinary approach for the simultaneous integration of a theme into several disciplines in which participating students apply techniques they learned during the semester and report their findings with a poster presentation. (YDS)

  6. A comparison of linear and nonlinear statistical techniques in performance attribution.

    PubMed

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks using factors derived from some commonly used cross-sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on the standard linear multifactor model and three nonlinear techniques (model selection, additive models, and neural networks) are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.
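
    A hedged sketch of the comparison's flavour, using scikit-learn on synthetic cross-sectional data rather than the paper's portfolio or exact models: a linear factor regression against a nonlinear learner.

    ```python
    # Linear versus nonlinear factor models on synthetic returns.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)
    F = rng.standard_normal((2000, 3))       # e.g. value/size/momentum exposures
    ret = (0.02 * F[:, 0] + 0.01 * np.tanh(3 * F[:, 1])
           + 0.005 * rng.standard_normal(2000))

    X_tr, X_te, y_tr, y_te = train_test_split(F, ret, random_state=0)
    for model in (LinearRegression(), GradientBoostingRegressor(random_state=0)):
        score = model.fit(X_tr, y_tr).score(X_te, y_te)
        print(type(model).__name__, f"out-of-sample R^2 = {score:.3f}")
    ```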

  7. A grid-doubling finite-element technique for calculating dynamic three-dimensional spontaneous rupture on an earthquake fault

    USGS Publications Warehouse

    Barall, Michael

    2009-01-01

    We present a new finite-element technique for calculating dynamic 3-D spontaneous rupture on an earthquake fault, which can reduce the required computational resources by a factor of six or more, without loss of accuracy. The grid-doubling technique employs small cells in a thin layer surrounding the fault. The remainder of the modelling volume is filled with larger cells, typically two or four times as large as the small cells. In the resulting non-conforming mesh, an interpolation method is used to join the thin layer of smaller cells to the volume of larger cells. Grid-doubling is effective because spontaneous rupture calculations typically require higher spatial resolution on and near the fault than elsewhere in the model volume. The technique can be applied to non-planar faults by morphing, or smoothly distorting, the entire mesh to produce the desired 3-D fault geometry. Using our FaultMod finite-element software, we have tested grid-doubling with both slip-weakening and rate-and-state friction laws, by running the SCEC/USGS 3-D dynamic rupture benchmark problems. We have also applied it to a model of the Hayward fault, Northern California, which uses realistic fault geometry and rock properties. FaultMod implements fault slip using common nodes, which represent motion common to both sides of the fault, and differential nodes, which represent motion of one side of the fault relative to the other side. We describe how to modify the traction-at-split-nodes method to work with common and differential nodes, using an implicit time stepping algorithm.
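
    In symbols, one standard way to write the common/differential decomposition described above (conventions can differ by a constant factor), with u+ and u- the motions of the two fault faces:

    ```latex
    u_c = \tfrac{1}{2}\,(u^{+} + u^{-}), \qquad
    u_d = u^{+} - u^{-}, \qquad
    u^{\pm} = u_c \pm \tfrac{1}{2}\,u_d ,
    ```

    so the differential nodes carry the slip u_d while the common nodes carry the motion shared by both sides of the fault.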

  8. Prostate Cancer Probability Prediction By Machine Learning Technique.

    PubMed

    Jović, Srđan; Miljković, Milica; Ivanović, Miljan; Šaranović, Milena; Arsić, Milena

    2017-11-26

    The main goal of the study was to explore the possibility of prostate cancer prediction by machine learning techniques. In order to improve the survival probability of prostate cancer patients, it is essential to make suitable prediction models of prostate cancer. If one makes a relevant prediction of prostate cancer, it is easy to create a suitable treatment based on the prediction results. Machine learning techniques are the most common techniques for the creation of predictive models. Therefore, in this study several machine learning techniques were applied and compared. The obtained results were analyzed and discussed. It was concluded that machine learning techniques could be used for the relevant prediction of prostate cancer.

  9. VEG: An intelligent workbench for analysing spectral reflectance data

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann; Harrison, Patrick R.; Kimes, Daniel S.

    1994-01-01

    An Intelligent Workbench (VEG) was developed for the systematic study of remotely sensed optical data from vegetation. A goal of the remote sensing community is to infer the physical and biological properties of vegetation cover (e.g. cover type, hemispherical reflectance, ground cover, leaf area index, biomass, and photosynthetic capacity) using directional spectral data. VEG collects together, in a common format, techniques previously available from many different sources in a variety of formats. The decision as to when a particular technique should be applied is nonalgorithmic and requires expert knowledge. VEG has codified this expert knowledge into a rule-based decision component for determining which technique to use. VEG provides a comprehensive interface that makes applying the techniques simple and aids a researcher in developing and testing new techniques. VEG also provides a classification algorithm that can learn new classes of surface features. The learning system uses the database of historical cover types to learn class descriptions of one or more classes of cover types.

  10. Data Mining Methods for Recommender Systems

    NASA Astrophysics Data System (ADS)

    Amatriain, Xavier; Jaimes, Alejandro; Oliver, Nuria; Pujol, Josep M.

    In this chapter, we give an overview of the main Data Mining techniques used in the context of Recommender Systems. We first describe common preprocessing methods such as sampling or dimensionality reduction. Next, we review the most important classification techniques, including Bayesian Networks and Support Vector Machines. We describe the k-means clustering algorithm and discuss several alternatives. We also present association rules and related algorithms for an efficient training process. In addition to introducing these techniques, we survey their uses in Recommender Systems and present cases where they have been successfully applied.
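
    Since the chapter singles out the k-means algorithm, here is a minimal numpy version clustering toy user-preference vectors, the kind of grouping a recommender might exploit. The data and cluster count are invented.

    ```python
    # Plain k-means: alternate nearest-center assignment and centroid update.
    import numpy as np

    rng = np.random.default_rng(3)
    users = np.vstack([rng.normal(0, 0.3, (50, 2)) + c
                       for c in ([0, 0], [3, 0], [0, 3])])   # three taste groups

    k = 3
    centers = users[rng.choice(len(users), k, replace=False)]
    for _ in range(20):
        dists = np.linalg.norm(users[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                 # assign to nearest center
        centers = np.array([users[labels == j].mean(axis=0) if (labels == j).any()
                            else centers[j] for j in range(k)])

    print(np.round(centers, 2))                       # recovered group centroids
    ```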

  11. Archaeology Through Computational Linguistics: Inscription Statistics Predict Excavation Sites of Indus Valley Artifacts.

    PubMed

    Recchia, Gabriel L; Louwerse, Max M

    2016-11-01

    Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley Civilization, applying methods commonly used in cognitive science to the Indus script. We show that these methods can accurately predict the relative locations of archeological sites on the basis of artifacts of known provenance, and we further apply these techniques to determine the most probable excavation sites of four sealings of unknown provenance. These findings suggest that inscription statistics reflect historical interactions among locations in the Indus Valley region, and they illustrate how computational methods can help localize inscribed archeological artifacts of unknown origin. The success of this method offers opportunities for the cognitive sciences in general and for computational anthropology specifically.
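
    A hedged reconstruction of the pipeline's shape, not the paper's corpus or exact estimator: co-occurrence counts are converted into dissimilarities and embedded with classical multidimensional scaling to recover relative positions.

    ```python
    # Co-occurrence counts -> dissimilarities -> classical MDS coordinates.
    import numpy as np

    co = np.array([[10, 8, 2, 1],          # toy co-occurrence counts among
                   [8, 10, 3, 1],          # four place names
                   [2, 3, 10, 7],
                   [1, 1, 7, 10]], dtype=float)

    D = 1.0 / (co + 1.0)                   # frequent co-mention = small distance
    np.fill_diagonal(D, 0.0)

    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n    # double-centering matrix
    B = -0.5 * J @ (D ** 2) @ J            # classical MDS Gram matrix
    vals, vecs = np.linalg.eigh(B)         # eigenvalues in ascending order
    coords = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0.0))
    print(np.round(coords, 3))             # relative 2-D positions
    ```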

  12. A methodology for commonality analysis, with applications to selected space station systems

    NASA Technical Reports Server (NTRS)

    Thomas, Lawrence Dale

    1989-01-01

    The application of commonality in a system represents an attempt to reduce costs by reducing the number of unique components. A formal method for conducting commonality analysis has not been established. In this dissertation, commonality analysis is characterized as a partitioning problem. The cost impacts of commonality are quantified in an objective function, and the solution is that partition which minimizes this objective function. Clustering techniques are used to approximate a solution, and sufficient conditions are developed which can be used to verify the optimality of the solution. This method for commonality analysis is general in scope. It may be applied to the various types of commonality analysis required in the conceptual, preliminary, and detail design phases of the system development cycle.

  13. A Computer Based Moire Technique To Measure Very Small Displacements

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Amadshahi, Mansour A.; Subbaraman, B.

    1987-02-01

    The accuracy that can be achieved in the measurement of very small displacements with techniques such as moire, holography and speckle is limited by the noise inherent to the optical devices used. To reduce the noise-to-signal ratio, the moire method can be utilized. Two systems of carrier fringes are introduced, an initial system before the load is applied and a final system when the load is applied. The moire pattern of these two systems contains the sought displacement information, and the noise common to the two patterns is eliminated. The whole process is performed by a computer on digitized versions of the patterns. Examples of application are given.

  14. Separation of phytochemicals from Helichrysum italicum: An analysis of different isolation techniques and biological activity of prepared extracts.

    PubMed

    Maksimovic, Svetolik; Tadic, Vanja; Skala, Dejan; Zizovic, Irena

    2017-06-01

    Helichrysum italicum presents a valuable source of natural bioactive compounds. In this work, a literature review of terpenes, phenolic compounds, and other less common phytochemicals from H. italicum with regard to application of different separation methods is presented. Data including extraction/separation methods and experimental conditions applied, obtained yields, number of identified compounds, content of different compound groups, and analytical techniques applied are shown as corresponding tables. Numerous biological activities of both isolates and individual compounds are emphasized. In addition, the data reported are discussed, and the directions for further investigations are proposed.

  15. Mixture Modeling: Applications in Educational Psychology

    ERIC Educational Resources Information Center

    Harring, Jeffrey R.; Hodis, Flaviu A.

    2016-01-01

    Model-based clustering methods, commonly referred to as finite mixture modeling, have been applied to a wide variety of cross-sectional and longitudinal data to account for heterogeneity in population characteristics. In this article, we elucidate 2 such approaches: growth mixture modeling and latent profile analysis. Both techniques are…

  16. Impact of application site on the efficacy of two topically-applied insecticides to Culex quinquefasciatus Say

    USDA-ARS?s Scientific Manuscript database

    Ultra-low volume and low volume insecticide treatments commonly used to control mosquito populations were evaluated for efficacy against Culex quinquefasciatus using a topical bioassay technique. Traditional topical bioassays focus pesticide application to the mesothoracic pleural area. Although, in...

  17. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
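
    As a minimal example of the response-surface idea the survey covers, a quadratic polynomial metamodel is fit to a stand-in "expensive" analysis code; the function and design values are invented.

    ```python
    # Quadratic response-surface metamodel of an expensive analysis code.
    import numpy as np

    def expensive_analysis(x1, x2):              # stand-in for the real code
        return (x1 - 1.0) ** 2 + 2.0 * (x2 + 0.5) ** 2 + 0.1 * x1 * x2

    rng = np.random.default_rng(4)
    X = rng.uniform(-2, 2, (30, 2))              # small design of experiments
    y = expensive_analysis(X[:, 0], X[:, 1])

    # Basis: 1, x1, x2, x1^2, x2^2, x1*x2
    def basis(x1, x2):
        return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

    coef, *_ = np.linalg.lstsq(basis(X[:, 0], X[:, 1]), y, rcond=None)

    xt = np.array([0.3]), np.array([-0.2])
    print(basis(*xt) @ coef, expensive_analysis(0.3, -0.2))  # metamodel vs truth
    ```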

  18. Arched needle technique for inferior alveolar mandibular nerve block.

    PubMed

    Chakranarayan, Ashish; Mukherjee, B

    2013-03-01

    One of the most commonly used local anesthetic techniques in dentistry is Fischer's technique for the inferior alveolar nerve block. Incidentally, this technique also suffers the highest failure rate, approximately 35-45%. We studied a method of inferior alveolar nerve block in which a local anesthetic solution is injected into the pterygomandibular space by arching the needle and changing the approach angle of the conventional technique, and we estimated its efficacy. After the initial insertion, the needle is arched and inserted in a manner such that it approaches the medial surface of the ramus at an angle almost perpendicular to it. The technique was applied to 100 patients for mandibular molar extraction and the anesthetic effects were assessed. A success rate of 98% was obtained.

  19. A Teacher's Guide to Memory Techniques.

    ERIC Educational Resources Information Center

    Hodges, Daniel L.

    1982-01-01

    To aid instructors in teaching their students to use effective methods of memorization, this article outlines major memory methods, provides examples of their use, evaluates the methods, and discusses ways students can be taught to apply them. First, common, but less effective, memory methods are presented, including reading and re-reading…

  20. The Role of the Sampling Distribution in Understanding Statistical Inference

    ERIC Educational Resources Information Center

    Lipson, Kay

    2003-01-01

    Many statistics educators believe that few students develop the level of conceptual understanding essential for them to apply correctly the statistical techniques at their disposal and to interpret their outcomes appropriately. It is also commonly believed that the sampling distribution plays an important role in developing this understanding.…

  1. A prospective pilot study on early toxicity from a simultaneously integrated boost technique for canine sinonasal tumours using image-guided intensity-modulated radiation therapy.

    PubMed

    Soukup, A; Meier, V; Pot, S; Voelter, K; Rohrer Bley, C

    2018-05-14

    In order to overcome the common local treatment failure of canine sinonasal tumours, integrated boost techniques were tried in the cobalt/orthovoltage era, but dismissed because of unacceptable early (acute) toxicity. Intriguingly, a recent calculation study of a simultaneously integrated boost (SIB) technique for sinonasal irradiation using intensity-modulated radiation therapy (IMRT) predicted theoretical feasibility. In this prospective pilot study we applied a commonly used protocol of 10 × 4.2 Gy to the planning target volume (PTV) with a 20%-SIB dose to the gross tumour volume (GTV). We hypothesized that this dose escalation would be clinically tolerable if applied with image-guided IMRT. We included 9 dogs diagnosed with sinonasal tumours without local/distant metastases. For treatment planning, organs at risk were contoured according to strict anatomical guidelines. Planning volume extensions (GTV/CTV/PTV) were standardized to minimize interplanner variability. Treatments were applied with rigid patient positioning and verified daily with image guidance. After radiation therapy, we focused on early ophthalmologic complications as well as mucosal and cutaneous toxicity. Early toxicity was evaluated at weeks 1, 2, 3, 8 and 12 after radiotherapy. Only mild ophthalmologic complications were found. Three patients (33%) had self-limiting moderate to severe early toxicity (grade 3 mucositis), which was managed medically. No patient developed ulcerations/haemorrhage/necrosis of skin/mucosa. The SIB protocol applied with image-guided IMRT to treat canine sinonasal tumours led to clinically acceptable side effects. The suspected increase in tumour control probability and the risk of late toxicity with the 20% dose escalation used have to be investigated further. © 2018 John Wiley & Sons Ltd.

  2. A dynamic mechanical analysis technique for porous media

    PubMed Central

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version was developed to validate the approach by comparing complex modulus estimates with the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 to 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency, with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high-quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in hydraulic conductivity as well. PMID:25248170
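
    The model-based fitting step can be sketched in a simplified viscoelastic form: the hedged Python example below inverts synthetic storage/loss-modulus data for the parameters of a Kelvin-Voigt model (G* = G + i*omega*eta) by non-linear least squares. The model choice, data, and noise levels are assumptions for illustration; the paper's actual inversion is FE-based and poroelastic.

      import numpy as np
      from scipy.optimize import least_squares

      # Synthetic DMA data: storage (G') and loss (G'') moduli over 1-14 Hz,
      # generated from a Kelvin-Voigt model plus noise. The model choice and
      # "true" values below are assumptions for illustration only.
      freq = np.linspace(1.0, 14.0, 14)           # Hz
      omega = 2.0 * np.pi * freq                  # rad/s
      G_true, eta_true = 5.0e3, 40.0              # Pa, Pa*s (hypothetical)
      rng = np.random.default_rng(0)
      G_storage = G_true + 50.0 * rng.standard_normal(freq.size)
      G_loss = eta_true * omega + 50.0 * rng.standard_normal(freq.size)

      # Residuals between the model response and the "measured" data.
      def residuals(p):
          G, eta = p
          return np.concatenate([G_storage - G, G_loss - eta * omega])

      # Non-linear inversion for the material properties that best fit the
      # data, in the spirit of the model-based DMA processing described above.
      fit = least_squares(residuals, x0=[1.0e3, 10.0])
      print("estimated G, eta:", fit.x)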

  3. Applying computational biology and "big data" to develop multiplex diagnostics for complex chronic diseases such as osteoarthritis.

    PubMed

    Ren, Guomin; Krawetz, Roman

    2015-01-01

    The data explosion in the last decade is revolutionizing diagnostics research and the healthcare industry, offering both opportunities and challenges. These high-throughput "omics" techniques have generated more scientific data in the last few years than in the entire history of mankind. Here we present a brief summary of how "big data" have influenced early diagnosis of complex diseases. We will also review some of the most commonly used "omics" techniques and their applications in diagnostics. Finally, we will discuss the issues brought by these new techniques when translating laboratory discoveries to clinical practice.

  4. The development of laser speckle velocimetry for the study of vortical flows

    NASA Technical Reports Server (NTRS)

    Krothapalli, A.

    1991-01-01

    A new experimental technique commonly known as PIDV (particle image displacement velocimetry) was developed to measure an instantaneous two-dimensional velocity field in a selected plane of the flow field. This technique was successfully applied to the study of several problems: (1) unsteady flows with large scale vortical structures; (2) the instantaneous two-dimensional flow in the transition region of a rectangular air jet; and (3) the instantaneous flow over a circular bump in a transonic flow. In several other experiments PIDV is routinely used as a non-intrusive measurement technique to obtain instantaneous two-dimensional velocity fields.

  5. Image-Based 3d Reconstruction and Analysis for Orthodontia

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.

    2012-08-01

    Among the main tasks of orthodontia are the analysis of dental arches and treatment planning to provide the correct position for every tooth. The treatment plan is based on measurement of tooth parameters and on designing the ideal arch curve that the teeth are to form after treatment. The most common technique for moving teeth uses standard brackets placed on the teeth and a wire of given shape clamped by these brackets to produce the forces needed to move each tooth in a given direction. The disadvantages of the standard bracket technique are the low accuracy of tooth dimension measurements and the difficulty of applying a standard approach to a wide variety of complex orthodontic cases. An image-based technique for orthodontic planning, treatment and documentation, aimed at overcoming these disadvantages, is proposed. The proposed approach allows accurate measurement of the tooth parameters needed for adequate planning, design of correct tooth positions and monitoring of the treatment process. The developed technique applies photogrammetric methods to dental arch 3D model generation, bracket position determination and tooth movement analysis.

  6. The requirements and feasibility of business planning in the office of space and terrestrial applications

    NASA Technical Reports Server (NTRS)

    Greenberg, J. S.; Miller, B. P.

    1979-01-01

    The feasibility of applying strategic business planning techniques which are developed and used in the private sector to the planning of certain projects within the NASA Office of Space and Terrestrial Applications was assessed. The methods of strategic business planning that are currently in use in the private sector are examined. The typical contents of a private sector strategic business plan and the techniques commonly used to develop the contents of the plan are described, along with modifications needed to apply these concepts to public sector projects. The current long-range planning process in the Office of Space and Terrestrial Applications is reviewed and program initiatives that might be candidates for the use of strategic business planning techniques are identified. In order to more fully illustrate the information requirements of a strategic business plan for a NASA program, a sample business plan is prepared for a hypothetical Operational Earth Resources Satellite program.

  7. Thermal Network Modelling Handbook

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Thermal mathematical modelling is discussed in detail. A three-fold purpose was established: (1) to acquaint the new user with the terminology and concepts used in thermal mathematical modelling, (2) to present the more experienced and occasional user with quick formulas and methods for solving everyday problems, coupled with study cases which lend insight into the relationships that exist among the various solution techniques and parameters, and (3) to begin to catalog in an orderly fashion the common formulas which may be applied to automated conversational language techniques.

  8. Preservation of commonly applied fluorescent tracers in complex water samples

    NASA Astrophysics Data System (ADS)

    Cao, Viet; Schaffer, Mario; Jin, Yulan; Licha, Tobias

    2017-06-01

    Water sample preservation and pre-treatment are important steps for achieving accurate and reproducible results from tracer tests. However, this is particularly challenging for complex water mixtures prior to fluorescence analysis. In this study, the interference of iron and calcium precipitation with nine commonly applied conservative tracers, uranine, eosin, 1-naphthalene sulfonate, 1,5-naphthalene disulfonate, 2,6-naphthalene disulfonate, 4-amino-1-naphthalene sulfonate, 6-hydroxy-2-naphthalene sulfonate, 1,3,6-naphthalene trisulfonate, and 1,3,6,8-pyrene tetrasulfonate, was investigated in batch experiments. In general, the observed results are influenced by precipitates. A technique consisting of pH adjustment and centrifugation is described for preserving samples and avoiding the impact of these precipitates on the tracer test results.

  9. BOREHOLE NEUTRON ACTIVATION: THE RARE EARTHS.

    USGS Publications Warehouse

    Mikesell, J.L.; Senftle, F.E.

    1987-01-01

    Neutron-induced borehole gamma-ray spectroscopy has been widely used as a geophysical exploration technique by the petroleum industry, but its use for mineral exploration is not as common. Nuclear methods can be applied to mineral exploration, for determining stratigraphy and bed correlations, for mapping ore deposits, and for studying mineral concentration gradients. High-resolution detectors are essential for mineral exploration, and by using them an analysis of the major element concentrations in a borehole can usually be made. A number of economically important elements can be detected at typical ore-grade concentrations using this method. Because of the application of the rare-earth elements to high-temperature superconductors, these elements are examined in detail as an example of how nuclear techniques can be applied to mineral exploration.

  10. Applying Metrological Techniques to Satellite Fundamental Climate Data Records

    NASA Astrophysics Data System (ADS)

    Woolliams, Emma R.; Mittaz, Jonathan PD; Merchant, Christopher J.; Hunt, Samuel E.; Harris, Peter M.

    2018-02-01

    Quantifying long-term environmental variability, including climatic trends, requires decadal-scale time series of observations. The reliability of such trend analysis depends on the long-term stability of the data record, and understanding the sources of uncertainty in historic, current and future sensors. We give a brief overview of how metrological techniques can be applied to historical satellite data sets. In particular we discuss the implications of error correlation at different spatial and temporal scales and the forms of such correlation, and consider how uncertainty is propagated with partial correlation. We give a form of the Law of Propagation of Uncertainties that considers the propagation of uncertainties associated with common errors to give the covariance associated with Earth observations in different spectral channels.
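
    For reference, the generic law of propagation of uncertainty with correlated inputs, of which the record above develops a particular form, is the standard GUM expression (quoted here as background, not as the authors' exact formulation):

      u_c^2(y) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i)
                 + 2 \sum_{i=1}^{N-1} \sum_{j=i+1}^{N} \frac{\partial f}{\partial x_i} \frac{\partial f}{\partial x_j} \, u(x_i, x_j)

    where u(x_i, x_j) = r(x_i, x_j) u(x_i) u(x_j); a fully common (shared) error corresponds to r = 1, so its contribution does not average down as more observations are combined.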

  11. Establishing a Common Vocabulary of Key Concepts for the Effective Implementation of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Cihon, Traci M.; Cihon, Joseph H.; Bedient, Guy M.

    2016-01-01

    The technical language of behavior analysis is arguably necessary to share ideas and research with precision among each other. However, it can hinder effective implementation of behavior analytic techniques when it prevents clear communication between the supervising behavior analyst and behavior technicians. The present paper provides a case…

  12. Quantile Regression in the Study of Developmental Sciences

    ERIC Educational Resources Information Center

    Petscher, Yaacov; Logan, Jessica A. R.

    2014-01-01

    Linear regression analysis is one of the most common techniques applied in developmental research, but only allows for an estimate of the average relations between the predictor(s) and the outcome. This study describes quantile regression, which provides estimates of the relations between the predictor(s) and outcome, but across multiple points of…
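
    As a hedged illustration of the contrast drawn in this record, the Python sketch below fits ordinary least squares and several conditional quantiles to the same synthetic, heteroscedastic data using statsmodels; the data and the quantile choices are assumptions for demonstration only.

      import numpy as np
      import statsmodels.api as sm

      # Synthetic developmental-style data with heteroscedastic spread
      # (an assumption for illustration).
      rng = np.random.default_rng(1)
      x = rng.uniform(0, 10, 500)
      y = 2.0 + 0.8 * x + rng.normal(0, 0.3 + 0.2 * x)

      X = sm.add_constant(x)

      # OLS estimates only the average relation between predictor and outcome.
      ols_fit = sm.OLS(y, X).fit()
      print("OLS slope:", ols_fit.params[1])

      # Quantile regression estimates the relation at multiple points of the
      # conditional distribution (here the 10th, 50th, and 90th percentiles).
      for q in (0.1, 0.5, 0.9):
          qr_fit = sm.QuantReg(y, X).fit(q=q)
          print(f"slope at quantile {q}:", qr_fit.params[1])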

  13. Forest ecosystems of a Lower Gulf Coastal Plainlandscape: multifactor classification and analysis

    Treesearch

    P. Charles Goebel; Brian J. Palik; L. Katherine Kirkman; Mark B. Drew; Larry West; Dee C. Pederson

    2001-01-01

    The most common forestland classification techniques applied in the southeastern United States are vegetation-based. While not completely ignored, the application of multifactor, hierarchical ecosystem classifications are limited despite their widespread use in other regions of the eastern United States. We present one of the few truly integrated ecosystem...

  14. Multivariate Bias Correction Procedures for Improving Water Quality Predictions from the SWAT Model

    NASA Astrophysics Data System (ADS)

    Arumugam, S.; Libera, D.

    2017-12-01

    Water quality observations are usually not available on a continuous basis for longer than 1-2 years at a time over a decadal period, given the labor requirements, making calibrating and validating mechanistic models difficult. Further, any physical model predictions inherently have bias (i.e., under/over estimation) and require post-simulation techniques to preserve the long-term mean monthly attributes. This study suggests a multivariate bias-correction technique and compares it with a common technique in improving the performance of the SWAT model in predicting daily streamflow and TN loads across the southeast based on split-sample validation. The approach is a dimension reduction technique, canonical correlation analysis (CCA), that regresses the observed multivariate attributes with the SWAT model simulated values. The common approach is a regression-based technique that uses an ordinary least squares regression to adjust model values. The observed cross-correlation between loadings and streamflow is better preserved when using canonical correlation while simultaneously reducing individual biases. Additionally, canonical correlation analysis does a better job in preserving the observed joint likelihood of observed streamflow and loadings. These procedures were applied to 3 watersheds chosen from the Water Quality Network in the Southeast Region; specifically, watersheds with sufficiently large drainage areas and number of observed data points. The performance of these two approaches is compared for the observed period and over a multi-decadal period using loading estimates from the USGS LOADEST model. Lastly, the CCA technique is applied in a forecasting sense by using 1-month ahead forecasts of P & T from ECHAM4.5 as forcings in the SWAT model. Skill in using the SWAT model for forecasting loadings and streamflow at the monthly and seasonal timescale is also discussed.
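
    The multivariate correction can be sketched with scikit-learn's CCA: fit a canonical relation between simulated and observed streamflow/load pairs, then map new simulations through it. The data shapes, bias structure, and single correction step below are illustrative assumptions, not the authors' implementation.

      import numpy as np
      from sklearn.cross_decomposition import CCA

      # Hypothetical paired daily records; columns are [streamflow, TN load].
      rng = np.random.default_rng(2)
      obs = rng.lognormal(mean=[1.0, 0.2], sigma=0.4, size=(365, 2))
      sim = obs * [1.3, 0.7] + rng.normal(0, 0.1, size=(365, 2))  # biased model

      # Fit a canonical relation between simulated and observed attributes.
      cca = CCA(n_components=2)
      cca.fit(sim, obs)

      # Correct a new block of simulations by mapping through that relation;
      # correcting both variables jointly helps preserve their observed
      # cross-correlation, unlike variable-by-variable regression.
      obs_new = rng.lognormal(mean=[1.0, 0.2], sigma=0.4, size=(30, 2))
      sim_new = obs_new * [1.3, 0.7] + rng.normal(0, 0.1, size=(30, 2))
      corrected = cca.predict(sim_new)
      print(np.abs(corrected - obs_new).mean(axis=0))  # residual bias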

  15. On Evaluating Brain Tissue Classifiers without a Ground Truth

    PubMed Central

    Martin-Fernandez, Marcos; Ungar, Lida; Nakamura, Motoaki; Koo, Min-Seong; McCarley, Robert W.; Shenton, Martha E.

    2009-01-01

    In this paper, we present a set of techniques for the evaluation of brain tissue classifiers on a large data set of MR images of the head. Due to the difficulty of establishing a gold standard for this type of data, we focus our attention on methods which do not require a ground truth, but instead rely on a common agreement principle. Three different techniques are presented: the Williams’ index, a measure of common agreement; STAPLE, an Expectation Maximization algorithm which simultaneously estimates performance parameters and constructs an estimated reference standard; and Multidimensional Scaling, a visualization technique to explore similarity data. We apply these different evaluation methodologies to a set of eleven different segmentation algorithms on forty MR images. We then validate our evaluation pipeline by building a ground truth based on human expert tracings. The evaluations with and without a ground truth are compared. Our findings show that comparing classifiers without a gold standard can provide a lot of interesting information. In particular, outliers can be easily detected, strongly consistent or highly variable techniques can be readily discriminated, and the overall similarity between different techniques can be assessed. On the other hand, we also find that some information present in the expert segmentations is not captured by the automatic classifiers, suggesting that common agreement alone may not be sufficient for a precise performance evaluation of brain tissue classifiers. PMID:17532646
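
    Of the three methodologies, the Williams' index is simple enough to sketch directly: it is the mean agreement of a tested classifier with the group, divided by the mean pairwise agreement within the group. The Python sketch below uses the Dice overlap as the agreement measure and random toy segmentations; both choices are assumptions for illustration, not the paper's exact setup.

      import numpy as np
      from itertools import combinations

      # Dice overlap between two binary segmentations (assumed agreement measure).
      def dice(a, b):
          return 2.0 * np.sum(a & b) / (np.sum(a) + np.sum(b))

      # Williams' index for one classifier against a group: the mean agreement
      # of the tested classifier with the group, divided by the mean pairwise
      # agreement within the group. Values near (or above) 1 indicate the
      # tested classifier agrees with the group as well as the group agrees
      # internally.
      def williams_index(tested, group):
          num = np.mean([dice(tested, g) for g in group])
          den = np.mean([dice(g1, g2) for g1, g2 in combinations(group, 2)])
          return num / den

      # Toy example: five random segmentations of a 32 x 32 image (assumption).
      rng = np.random.default_rng(3)
      segs = [rng.random((32, 32)) > 0.5 for _ in range(5)]
      print(williams_index(segs[0], segs[1:]))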

  16. Risk management in the competitive electric power industry

    NASA Astrophysics Data System (ADS)

    Dahlgren, Robert William

    From 1990 to the present day, the electric power industry has experienced dramatic changes worldwide. This recent evolution of the power industry has included the creation and multiple iterations of competitive wholesale markets in many different forms. The creation of these competitive markets has resulted in increased short-term volatility of power prices. Vertically integrated utilities have emerged from years of regulatory controls and now face the need to perform risk assessment. The goal of this dissertation is to provide background and details of the evolution of market structures combined with examples of how to apply price risk assessment techniques such as Value-at-Risk (VaR). In Chapter 1, the history and evolution of three selected regional markets (PJM, California, and England and Wales) is presented. A summary of the commonalities and differences is presented to provide an overview of the rate of transformation of the industry in recent years. The broad area of risk management in the power industry is also explored through a state-of-the-art literature survey. In Chapter 2, an illustration of risk assessment applied to power trading is presented. The techniques of Value-at-Risk and Conditional Value-at-Risk are introduced and applied to a common scenario. The advantages and limitations of the techniques are compared by observing their results on the common example. Volatility in the California Power Markets is presented in Chapter 3. This analysis explores the California markets in the summer of 2000, including the application of VaR analysis to the extreme volatility observed during this period. In Chapter 4, CVaR is applied to the same California historical data used in Chapter 3. In addition, the unique application of minimizing the risk of a power portfolio by minimizing CVaR is presented. The application relies on recent research into CVaR whereby the portfolio optimization problem can be reduced to a linear programming problem.
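
    The basic historical-simulation versions of the two risk measures discussed in Chapter 2 can be sketched in a few lines of Python; the synthetic profit-and-loss series and the 95% confidence level below are assumptions for illustration, not the dissertation's data.

      import numpy as np

      # Historical-simulation VaR and CVaR at confidence level alpha.
      # The synthetic daily P&L series below is an assumption.
      rng = np.random.default_rng(4)
      pnl = rng.normal(0.0, 1.0e5, 1000)      # daily profit/loss in dollars

      alpha = 0.95
      # VaR: the loss threshold exceeded with probability 1 - alpha.
      var = -np.quantile(pnl, 1.0 - alpha)
      # CVaR: the expected loss given that the VaR threshold is exceeded.
      cvar = -pnl[pnl <= -var].mean()

      print(f"95% VaR:  {var:,.0f}")
      print(f"95% CVaR: {cvar:,.0f}")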

  17. Limit Cycle Analysis Applied to the Oscillations of Decelerating Blunt-Body Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; Queen, Eric M.

    2008-01-01

    Many blunt-body entry vehicles have nonlinear dynamic stability characteristics that produce self-limiting oscillations in flight. Several different test techniques can be used to extract dynamic aerodynamic coefficients to predict this oscillatory behavior for planetary entry mission design and analysis. Most of these test techniques impose boundary conditions that alter the oscillatory behavior from that seen in flight. Three sets of test conditions, representing three commonly used test techniques, are presented to highlight these effects. Analytical solutions to the constant-coefficient planar equations-of-motion for each case are developed to show how the same blunt body behaves differently depending on the imposed test conditions. The energy equation is applied to further illustrate the governing dynamics. Then, the mean value theorem is applied to the energy rate equation to find the effective damping for an example blunt body with nonlinear, self-limiting dynamic characteristics. This approach is used to predict constant-energy oscillatory behavior and the equilibrium oscillation amplitudes for the various test conditions. These predictions are verified with planar simulations. The analysis presented provides an overview of dynamic stability test techniques and illustrates the effects of dynamic stability, static aerodynamics and test conditions on observed dynamic motions. It is proposed that these effects may be leveraged to develop new test techniques and refine test matrices in future tests to better define the nonlinear functional forms of blunt body dynamic stability curves.

  18. A Synthesis of Star Calibration Techniques for Ground-Based Narrowband Electron-Multiplying Charge-Coupled Device Imagers Used in Auroral Photometry

    NASA Technical Reports Server (NTRS)

    Grubbs, Guy II; Michell, Robert; Samara, Marilia; Hampton, Don; Jahn, Jorg-Micha

    2016-01-01

    A technique is presented for the periodic and systematic calibration of ground-based optical imagers. It is important to have a common system of units (Rayleighs or photon flux) for cross comparison as well as self-comparison over time. With the advancement in technology, the sensitivity of these imagers has improved so that stars can be used for more precise calibration. Background subtraction, flat fielding, star mapping, and other common techniques are combined in deriving a calibration technique appropriate for a variety of ground-based imager installations. Spectral (4278, 5577, and 8446 Å) ground-based imager data with multiple fields of view (19, 47, and 180 deg) are processed and calibrated using the techniques developed. The calibration techniques applied result in intensity measurements in agreement between different imagers using identical spectral filtering, and the intensity at each wavelength observed is within the expected range of auroral measurements. The application of these star calibration techniques, which convert raw imager counts into units of photon flux, makes it possible to do quantitative photometry. The computed photon fluxes, in units of Rayleighs, can be used for the absolute photometry between instruments or as input parameters for auroral electron transport models.

  19. Applying Jlint to Space Exploration Software

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus

    2004-01-01

    Java is a very successful programming language which is also becoming widespread in embedded systems, where software correctness is critical. Jlint is a simple but highly efficient static analyzer that checks a Java program for several common errors, such as null pointer exceptions and overflow errors. It also includes checks for multi-threading problems, such as deadlocks and data races. The case study described here shows the effectiveness of Jlint in finding such errors; analyzing the false positives in the multi-threading warnings gives an insight into design patterns commonly used in multi-threaded code. The results show that a few analysis techniques are sufficient to avoid almost all false positives. These techniques include investigating all possible callers and a few code idioms. Verifying the correct application of these patterns is still crucial, because their correct usage is not trivial.

  20. Collecting data on potentially harmful events: a method for monitoring incidents in general practice.

    PubMed

    Britt, H; Miller, G C; Steven, I D; Howarth, G C; Nicholson, P A; Bhasale, A L; Norton, K J

    1997-04-01

    The prediction and subsequent prevention of errors, which are an integral element of human behaviour, require an understanding of their cause. The incident monitoring technique was developed in the study of aviation errors in the Second World War and has been applied more recently in the field of anaesthetics. This pilot study represents one of the first attempts to apply the incident monitoring technique in the general practice environment. A total of 297 GPs across Australia anonymously reported details of unintended events which harmed or could have harmed the patient. Reports were contemporaneously recorded on prepared forms which allowed a free-text description of the incident, and structured responses for contributing and mitigating factors, immediate and long-term outcomes, additional costs, etc. The first 500 reports were analysed using both qualitative and quantitative methods, and a brief overview of results is presented. The methodological issues arising in the application of this technique to such a large, widely spread profession, in which episodes of care are not necessarily confined to a single consultation, are discussed. This study demonstrated that the incident monitoring technique can be successfully applied in general practice and that the resulting information can facilitate the identification of common factors contributing to such events and allow the development of preventive interventions.

  1. Dragon Ears airborne acoustic array: CSP analysis applied to cross array to compute real-time 2D acoustic sound field

    NASA Astrophysics Data System (ADS)

    Cerwin, Steve; Barnes, Julie; Kell, Scott; Walters, Mark

    2003-09-01

    This paper describes development and application of a novel method to accomplish real-time solid angle acoustic direction finding using two 8-element orthogonal microphone arrays. The developed prototype system was intended for localization and signature recognition of ground-based sounds from a small UAV. Recent advances in computer speeds have enabled the implementation of microphone arrays in many audio applications. Still, the real-time presentation of a two-dimensional sound field for the purpose of audio target localization is computationally challenging. In order to overcome this challenge, a crosspower spectrum phase (CSP) technique was applied to each 8-element arm of a 16-element cross array to provide audio target localization. In this paper, we describe the technique and compare it with two other commonly used techniques, the Cross-Spectral Matrix and MUSIC. The results show that the CSP technique applied to two 8-element orthogonal arrays provides a computationally efficient solution with reasonable accuracy and tolerable artifacts, sufficient for real-time applications. Additional topics include development of a synchronized 16-channel transmitter and receiver to relay the airborne data to the ground-based processor and presentation of test data demonstrating both ground-mounted operation and airborne localization of ground-based gunshots and loud engine sounds.
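
    The CSP idea is, in essence, phase-transform cross-correlation: whiten the cross-spectrum of two microphone channels so that only phase (delay) information remains, then locate the correlation peak. A minimal numpy sketch for a single pair follows, with the simulated signals, delay, and sample rate as assumptions; the system described above applies CSP per 8-element arm, not per pair.

      import numpy as np

      def csp_delay(x1, x2, fs):
          """Estimate the delay of x2 relative to x1 (in seconds) via the
          crosspower spectrum phase technique: normalize the cross-spectrum
          to unit magnitude so only phase (delay) information remains."""
          n = len(x1) + len(x2)
          X1, X2 = np.fft.rfft(x1, n), np.fft.rfft(x2, n)
          cross = X2 * np.conj(X1)
          cross /= np.abs(cross) + 1e-12       # phase transform (whitening)
          cc = np.roll(np.fft.irfft(cross, n), n // 2)
          return (np.argmax(cc) - n // 2) / fs

      # Simulated example: the second microphone hears the same noise burst
      # 25 samples later (signals and sample rate are assumptions).
      fs = 48000
      rng = np.random.default_rng(5)
      sig = rng.standard_normal(4096)
      x1 = sig
      x2 = np.concatenate([np.zeros(25), sig[:-25]])
      print(csp_delay(x1, x2, fs) * fs)        # ~ 25.0 samples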

  2. The Potential of Sequential Extraction in the Characterisation and Management of Wastes from Steel Processing: A Prospective Review

    PubMed Central

    Rodgers, Kiri J.; Hursthouse, Andrew; Cuthbert, Simon

    2015-01-01

    As waste management regulations become more stringent, yet demand for resources continues to increase, there is a pressing need for innovative management techniques and more sophisticated supporting analysis techniques. Sequential extraction (SE) analysis, a technique previously applied to soils and sediments, offers the potential to gain a better understanding of the composition of solid wastes. SE attempts to classify potentially toxic elements (PTEs) by their associations with phases or fractions in waste, with the aim of improving resource use and reducing negative environmental impacts. In this review we explain how SE can be applied to steel wastes. These present challenges due to differences in sample characteristics compared with materials to which SE has been traditionally applied, specifically chemical composition, particle size and pH buffering capacity, which are critical when identifying a suitable SE method. We highlight the importance of delineating iron-rich phases, and find that the commonly applied BCR (the Community Bureau of Reference) extraction method is problematic due to difficulties with zinc speciation (a critical steel waste constituent), hence a substantially modified SEP is necessary to deal with particular characteristics of steel wastes. Successful development of SE for steel wastes could have wider implications, e.g., for the sustainable management of fly ash and mining wastes. PMID:26393631

  3. The Potential of Sequential Extraction in the Characterisation and Management of Wastes from Steel Processing: A Prospective Review.

    PubMed

    Rodgers, Kiri J; Hursthouse, Andrew; Cuthbert, Simon

    2015-09-18

    As waste management regulations become more stringent, yet demand for resources continues to increase, there is a pressing need for innovative management techniques and more sophisticated supporting analysis techniques. Sequential extraction (SE) analysis, a technique previously applied to soils and sediments, offers the potential to gain a better understanding of the composition of solid wastes. SE attempts to classify potentially toxic elements (PTEs) by their associations with phases or fractions in waste, with the aim of improving resource use and reducing negative environmental impacts. In this review we explain how SE can be applied to steel wastes. These present challenges due to differences in sample characteristics compared with materials to which SE has been traditionally applied, specifically chemical composition, particle size and pH buffering capacity, which are critical when identifying a suitable SE method. We highlight the importance of delineating iron-rich phases, and find that the commonly applied BCR (the Community Bureau of Reference) extraction method is problematic due to difficulties with zinc speciation (a critical steel waste constituent), hence a substantially modified SEP is necessary to deal with particular characteristics of steel wastes. Successful development of SE for steel wastes could have wider implications, e.g., for the sustainable management of fly ash and mining wastes.

  4. Percutaneous thermal ablation of lung tumors - Radiofrequency, microwave and cryotherapy: Where are we going?

    PubMed

    Palussière, J; Catena, V; Buy, X

    2017-09-01

    Main indications of percutaneous pulmonary thermal ablation are early-stage non-small cell lung carcinoma (NSCLC) for patients who are not amenable to surgery and slow-evolving localized metastatic disease, either spontaneous or following a general treatment. Radiofrequency ablation (RFA) is the most evaluated technique. This technique offers a local control rate ranging between 80 and 90% for tumors <3 cm in diameter. Other more recently used ablation techniques such as microwaves and cryotherapy could overcome some limitations of RFA. One common characteristic of these techniques is an excellent tolerance with very few complications. This article reviews the differences between these techniques when applied to lung tumors, indications, results and complications. Future potential associations with immunotherapy will be discussed. Copyright © 2017. Published by Elsevier Masson SAS.

  5. Evolutionary Based Techniques for Fault Tolerant Field Programmable Gate Arrays

    NASA Technical Reports Server (NTRS)

    Larchev, Gregory V.; Lohn, Jason D.

    2006-01-01

    The use of SRAM-based Field Programmable Gate Arrays (FPGAs) is becoming more and more prevalent in space applications. Commercial-grade FPGAs are potentially susceptible to permanently debilitating Single-Event Latchups (SELs). Repair methods based on Evolutionary Algorithms may be applied to FPGA circuits to enable successful fault recovery. This paper presents the experimental results of applying such methods to repair four commonly used circuits (quadrature decoder, 3-by-3-bit multiplier, 3-by-3-bit adder, 440-7 decoder) into which a number of simulated faults have been introduced. The results suggest that evolutionary repair techniques can improve the process of fault recovery when used instead of or as a supplement to Triple Modular Redundancy (TMR), which is currently the predominant method for mitigating FPGA faults.

  6. Paint Analysis Using Visible Reflectance Spectroscopy: An Undergraduate Forensic Lab

    ERIC Educational Resources Information Center

    Hoffman, Erin M.; Beussman, Douglas J.

    2007-01-01

    The study of forensic science is found throughout undergraduate programs in growing numbers, both as stand-alone courses as well as specific examples within existing courses. Part of the driving force for this trend is the ability to apply common chemistry techniques to everyday situations, all couched in the context of a mystery that must be…

  7. Information Fusion - Methods and Aggregation Operators

    NASA Astrophysics Data System (ADS)

    Torra, Vicenç

    Information fusion techniques are commonly applied in Data Mining and Knowledge Discovery. In this chapter, we will give an overview of such applications considering their three main uses. That is, we consider fusion methods for data preprocessing, model building and information extraction. Some aggregation operators (i.e., particular fusion methods) and their properties are briefly described as well.
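
    A representative aggregation operator is the ordered weighted averaging (OWA) operator: sort the inputs, then apply a fixed weighting vector to the sorted values. The Python sketch below, with assumed scores and weights, shows how the weight vector moves the operator between max-like and min-like behaviour.

      import numpy as np

      # Ordered weighted averaging (OWA), a classic aggregation operator:
      # sort the inputs in descending order, then take a weighted sum with a
      # fixed weighting vector. The weights below are assumptions that
      # interpolate between max (w = [1,0,...]) and min (w = [...,0,1]).
      def owa(values, weights):
          v = np.sort(values)[::-1]
          return float(np.dot(v, weights))

      scores = [0.9, 0.4, 0.7, 0.2]
      print(owa(scores, [0.4, 0.3, 0.2, 0.1]))      # optimistic aggregation
      print(owa(scores, [0.1, 0.2, 0.3, 0.4]))      # pessimistic aggregation
      print(owa(scores, [0.25, 0.25, 0.25, 0.25]))  # plain arithmetic mean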

  8. A Model for Minimizing Numeric Function Generator Complexity and Delay

    DTIC Science & Technology

    2007-12-01

    allow computation of difficult mathematical functions in less time and with less hardware than commonly employed methods. They compute piecewise...Programmable Gate Arrays (FPGAs). The algorithms and estimation techniques apply to various NFG architectures and mathematical functions. This...thesis compares hardware utilization and propagation delay for various NFG architectures, mathematical functions, word widths, and segmentation methods

  9. Discrete square root filtering - A survey of current techniques.

    NASA Technical Reports Server (NTRS)

    Kaminskii, P. G.; Bryson, A. E., Jr.; Schmidt, S. F.

    1971-01-01

    Current techniques in square root filtering are surveyed and related by applying a duality association. Four efficient square root implementations are suggested, and compared with three common conventional implementations in terms of computational complexity and precision. It is shown that the square root computational burden should not exceed the conventional by more than 50% in most practical problems. An examination of numerical conditioning predicts that the square root approach can yield twice the effective precision of the conventional filter in ill-conditioned problems. This prediction is verified in two examples.
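
    As background on what a square root implementation looks like, the sketch below propagates a Cholesky factor S of the covariance (P = S S^T) through the time update using a QR factorization, which avoids forming P directly and motivates the doubled effective precision the survey notes. The matrices are invented for illustration; this is not one of the four implementations compared in the paper.

      import numpy as np

      # Square-root covariance time update: given P = S @ S.T, propagate
      # P_pred = A @ P @ A.T + Q by QR-factoring a stacked matrix, which is
      # numerically better conditioned than forming P itself.
      def sqrt_time_update(S, A, Q_sqrt):
          # Stack so that M.T @ M = A P A.T + Q; QR gives M = Q_orth @ R,
          # hence P_pred = R.T @ R, i.e. S_pred = R.T.
          M = np.vstack([S.T @ A.T, Q_sqrt.T])
          _, R = np.linalg.qr(M)
          return R.T

      # Hypothetical 2-state system (all values are illustrative assumptions).
      A = np.array([[1.0, 0.1], [0.0, 1.0]])
      P = np.diag([1.0, 0.5])
      Q = np.diag([0.01, 0.01])
      S = np.linalg.cholesky(P)
      Q_sqrt = np.linalg.cholesky(Q)

      S_pred = sqrt_time_update(S, A, Q_sqrt)
      print(S_pred @ S_pred.T)          # equals A @ P @ A.T + Q
      print(A @ P @ A.T + Q)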

  10. Laser figuring for the generation of analog micro-optics and kineform surfaces

    NASA Technical Reports Server (NTRS)

    Gratrix, Edward J.

    1993-01-01

    To date, there have been many techniques used to generate micro-optic structures in glass or other materials. Using methods common to the lithographic industry, the manufacturing technique known as 'binary optics' has demonstrated the use of diffractive optics in a variety of micro-optic applications. It is well established that diffractive structures have limited capability when applied in a design more suited for a refractive element. For applications that demand fast, highly efficient, broadband designs, we have developed a technique which uses laser figuring to generate the refractive micro-optical surface. This paper describes the technique used to fabricate refractive micro-optics. Recent results of micro-optics in CdZnTe focal planes are shown.

  11. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    PubMed

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than that found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.

  12. Oximetry using multispectral imaging: theory and application

    NASA Astrophysics Data System (ADS)

    MacKenzie, Lewis E.; Harvey, Andrew R.

    2018-06-01

    Multispectral imaging (MSI) is a technique for measurement of blood oxygen saturation in vivo that can be applied using various imaging modalities to provide new insights into physiology and disease development. This tutorial aims to provide a thorough introduction to the theory and application of MSI oximetry for researchers new to the field, whilst also providing detailed information for more experienced researchers. The optical theory underlying two-wavelength oximetry, three-wavelength oximetry, pulse oximetry, and multispectral oximetry algorithms are described in detail. The varied challenges of applying MSI oximetry to in vivo applications are outlined and discussed, covering: the optical properties of blood and tissue, optical paths in blood vessels, tissue auto-fluorescence, oxygen diffusion, and common oximetry artefacts. Essential image processing techniques for MSI are discussed, in particular, image acquisition, image registration strategies, and blood vessel line profile fitting. Calibration and validation strategies for MSI are discussed, including comparison techniques, physiological interventions, and phantoms. The optical principles and unique imaging capabilities of various cutting-edge MSI oximetry techniques are discussed, including photoacoustic imaging, spectroscopic optical coherence tomography, and snapshot MSI.
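
    As textbook background for the two-wavelength case named in this record (standard Beer-Lambert material, offered as context rather than the tutorial's specific algorithms), the absorbance at wavelength \lambda over path length L is

      A(\lambda) = \left[ \varepsilon_{HbO_2}(\lambda)\, c_{HbO_2} + \varepsilon_{Hb}(\lambda)\, c_{Hb} \right] L,
      \qquad SO_2 = \frac{c_{HbO_2}}{c_{HbO_2} + c_{Hb}}

    Measuring A at two wavelengths gives two linear equations in the two concentrations; if \lambda_2 is chosen at an isosbestic point (\varepsilon_{HbO_2} = \varepsilon_{Hb}), the ratio R = A(\lambda_1)/A(\lambda_2) reduces to

      SO_2 = \frac{R\, \varepsilon_{Hb}(\lambda_2) - \varepsilon_{Hb}(\lambda_1)}{\varepsilon_{HbO_2}(\lambda_1) - \varepsilon_{Hb}(\lambda_1)}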

  13. A high-precision voltage source for EIT

    PubMed Central

    Saulnier, Gary J; Liu, Ning; Ross, Alexander S

    2006-01-01

    Electrical impedance tomography (EIT) utilizes electrodes placed on the surface of a body to determine the complex conductivity distribution within the body. EIT can be performed by applying currents through the electrodes and measuring the electrode voltages or by applying electrode voltages and measuring the currents. Techniques have also been developed for applying the desired currents using voltage sources. This paper describes a voltage source for use in applied-voltage EIT that includes the capability of measuring both the applied voltage and applied current. A calibration circuit and calibration algorithm are described which enables all voltage sources in an EIT system to be calibrated to a common standard. The calibration minimizes the impact of stray shunt impedance, passive component variability and active component non-ideality. Simulation data obtained using PSpice are used to demonstrate the effectiveness of the circuits and calibration algorithm. PMID:16636413

  14. Application of fuzzy AHP method to IOCG prospectivity mapping: A case study in Taherabad prospecting area, eastern Iran

    NASA Astrophysics Data System (ADS)

    Najafi, Ali; Karimpour, Mohammad Hassan; Ghaderi, Majid

    2014-12-01

    Using the fuzzy analytical hierarchy process (AHP) technique, we propose a method for mineral prospectivity mapping (MPM), which is commonly used for exploration of mineral deposits. The fuzzy AHP is a popular technique which has been applied to multi-criteria decision-making (MCDM) problems. In this paper we used fuzzy AHP and a geospatial information system (GIS) to generate a prospectivity model for Iron Oxide Copper-Gold (IOCG) mineralization on the basis of its conceptual model and geo-evidence layers derived from geological, geochemical, and geophysical data in the Taherabad area, eastern Iran. The fuzzy AHP was used to determine the weights belonging to each criterion. The knowledge of three geoscientists on exploration of IOCG-type mineralization was applied to assign weights to evidence layers in the fuzzy AHP MPM approach. After assigning normalized weights to all evidential layers, a fuzzy operator was applied to integrate the weighted evidence layers. Finally, to evaluate the ability of the applied approach to delineate reliable target areas, the locations of known mineral deposits in the study area were used. The results demonstrate acceptable outcomes for IOCG exploration.
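
    The eigenvector step at the heart of (crisp) AHP can be sketched briefly: criterion weights are the normalized principal eigenvector of a pairwise comparison matrix, checked with a consistency ratio. The comparison matrix below, for three hypothetical evidence layers, is an assumption; the paper's fuzzy AHP extends this idea with fuzzy comparison judgements.

      import numpy as np

      # Crisp AHP weight derivation: weights are the normalized principal
      # eigenvector of a pairwise comparison matrix. The matrix below,
      # comparing three hypothetical evidence layers, is an assumption.
      pairwise = np.array([
          [1.0, 3.0, 5.0],     # e.g., geology vs geochemistry vs geophysics
          [1/3, 1.0, 2.0],
          [1/5, 1/2, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(pairwise)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()
      print("criterion weights:", weights)

      # Consistency ratio: lambda_max against the matrix size, scaled by the
      # random index (RI = 0.58 for n = 3); values below 0.1 are
      # conventionally acceptable.
      n = pairwise.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)
      print("consistency ratio:", ci / 0.58)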

  15. Ar+ and CuBr laser-assisted chemical bleaching of teeth: estimation of whiteness degree

    NASA Astrophysics Data System (ADS)

    Dimitrov, S.; Todorovska, Roumyana; Gizbrecht, Alexander I.; Raychev, L.; Petrov, Lyubomir P.

    2003-11-01

    In this work we present the results of adapting impartial methods of color determination, aimed at developing techniques for estimating the degree of human teeth whiteness that are sufficiently handy for common use in clinical practice. To validate and illustrate the techniques, standards of teeth colors were used, as well as model and naturally discolored human teeth treated with two bleaching chemical compositions activated by three light sources each: Ar+ and CuBr lasers, and a standard halogen photopolymerization lamp. Typical reflection and fluorescence spectra of some samples are presented; the samples' colors were estimated by standard computer processing in RGB and B coordinates. The results of the applied spectral and colorimetric techniques are in good agreement with those of the standard computer processing of the corresponding digital photographs and comply with the visually estimated degree of teeth whiteness judged according to the standard reference scale commonly used in aesthetic dentistry.

  16. Genotyping the factor VIII intron 22 inversion locus using fluorescent in situ hybridization.

    PubMed

    Sheen, Campbell R; McDonald, Margaret A; George, Peter M; Smith, Mark P; Morris, Christine M

    2011-02-15

    The factor VIII intron 22 inversion is the most common cause of hemophilia A, accounting for approximately 40% of all severe cases of the disease. Southern hybridization and multiplex long distance PCR are the most commonly used techniques to detect the inversion in a diagnostic setting, although both have significant limitations. Here we describe our experience establishing a multicolor fluorescent in situ hybridization (FISH) based assay as an alternative to existing methods for genetic diagnosis of the inversion. Our assay was designed to apply three differentially labelled BAC DNA probes that when hybridized to interphase nuclei would exhibit signal patterns that are consistent with the normal or the inversion locus. When the FISH assay was applied to five normal and five inversion male samples, the correct genotype was assignable with p<0.001 for all samples. When applied to carrier female samples the assay could not assign a genotype to all female samples, probably due to a lower proportion of informative nuclei in female samples caused by the added complexity of a second X chromosome. Despite this complication, these pilot findings show that the assay performs favourably compared to the commonly used methods. Copyright © 2010 Elsevier Inc. All rights reserved.

  17. 2D and 3D optical diagnostic techniques applied to Madonna dei Fusi by Leonardo da Vinci

    NASA Astrophysics Data System (ADS)

    Fontana, R.; Gambino, M. C.; Greco, M.; Marras, L.; Materazzi, M.; Pampaloni, E.; Pelagotti, A.; Pezzati, L.; Poggi, P.; Sanapo, C.

    2005-06-01

    3D measurement and modelling have been traditionally applied to statues, buildings, archeological sites or similar large structures, but rarely to paintings. Recently, however, 3D measurements have also been performed successfully on easel paintings, making it possible to detect and document the painting's surface. We used 3D models to integrate the results of various 2D imaging techniques on a common reference frame. These applications show how the 3D shape information, complemented with 2D colour maps as well as with other types of sensory data, provides the most interesting information. The 3D data acquisition was carried out by means of two devices: a high-resolution laser micro-profilometer, composed of a commercial distance meter mounted on a scanning device, and a laser-line scanner. The 2D data acquisitions were carried out using a scanning device for simultaneous RGB colour imaging and IR reflectography, and a UV fluorescence multispectral image acquisition system. We present here the results of the techniques described, applied to the analysis of an important painting of the Italian Renaissance: `Madonna dei Fusi', attributed to Leonardo da Vinci.

  18. Divided and Sliding Superficial Temporal Artery Flap for Primary Donor-site Closure

    PubMed Central

    Sugio, Yuta; Seike, Shien; Hosokawa, Ko

    2016-01-01

    Summary: Superficial temporal artery (STA) flaps are often used for reconstruction of hair-bearing areas. However, primary closure of the donor site is not easy when the size of the necessary skin island is relatively large. In such cases, skin grafts are needed at the donor site, resulting in baldness. We have solved this issue by applying the divided and sliding flap technique, which was first reported for primary donor-site closure of a latissimus dorsi musculocutaneous flap. We applied this technique to the hair-bearing STA flap, where primary donor-site closure is extremely beneficial for preventing the baldness consequent to skin grafting. The STA flap was divided into three, making creation of a large flap possible. We therefore concluded that the divided and sliding STA flap can at least partially solve the donor-site problem. Although further investigation is necessary to validate the maximum possible flap size, this technique may be applicable to at least the small defects that are common after skin cancer ablation or trauma. PMID:27975020

  19. Structural characterization of thioether-bridged bacteriocins.

    PubMed

    Lohans, Christopher T; Vederas, John C

    2014-01-01

    Bacteriocins are a group of ribosomally synthesized antimicrobial peptides produced by bacteria, some of which are extensively post-translationally modified. Some bacteriocins, namely the lantibiotics and sactibiotics, contain one or more thioether bridges. However, these modifications complicate the structural elucidation of these bacteriocins using conventional techniques. This review will discuss the techniques and strategies that have been applied to determine the primary structures of lantibiotics and sactibiotics. A major challenge is to identify the topology of thioether bridges in these peptides (i.e., which amino-acid residues are involved in which bridges). Edman degradation, NMR spectroscopy and tandem MS have all been commonly applied to characterize these bacteriocins, but can be incompatible with the post-translational modifications present. Chemical modifications to the modified residues, such as desulfurization and reduction, make the treated bacteriocins more compatible to analysis by these standard peptide analytical techniques. Despite their differences in structure, similar strategies have proved useful to study the structures of both lantibiotics and sactibiotics.

  20. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.

  1. Regression: The Apple Does Not Fall Far From the Tree.

    PubMed

    Vetter, Thomas R; Schober, Patrick

    2018-05-15

    Researchers and clinicians are frequently interested in either: (1) assessing whether there is a relationship or association between 2 or more variables and quantifying this association; or (2) determining whether 1 or more variables can predict another variable. The strength of such an association is mainly described by the correlation. However, regression analysis and regression models can be used not only to identify whether there is a significant relationship or association between variables but also to generate estimations of such a predictive relationship between variables. This basic statistical tutorial discusses the fundamental concepts and techniques related to the most common types of regression analysis and modeling, including simple linear regression, multiple regression, logistic regression, ordinal regression, and Poisson regression, as well as the common yet often underrecognized phenomenon of regression toward the mean. The various types of regression analysis are powerful statistical techniques, which when appropriately applied, can allow for the valid interpretation of complex, multifactorial data. Regression analysis and models can assess whether there is a relationship or association between 2 or more observed variables and estimate the strength of this association, as well as determine whether 1 or more variables can predict another variable. Regression is thus being applied more commonly in anesthesia, perioperative, critical care, and pain research. However, it is crucial to note that regression can identify plausible risk factors; it does not prove causation (a definitive cause and effect relationship). The results of a regression analysis instead identify independent (predictor) variable(s) associated with the dependent (outcome) variable. As with other statistical methods, applying regression requires that certain assumptions be met, which can be tested with specific diagnostics.
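
    Three of the regression families named in this tutorial can be illustrated on one synthetic dataset with statsmodels; all data, variable names, and coefficients below are assumptions for demonstration, not clinical results.

      import numpy as np
      import statsmodels.api as sm

      # One synthetic dataset, three of the regression families named above.
      rng = np.random.default_rng(6)
      n = 200
      age = rng.uniform(20, 80, n)
      X = sm.add_constant(age)

      # Simple linear regression: continuous outcome.
      bp = 90 + 0.5 * age + rng.normal(0, 10, n)
      print(sm.OLS(bp, X).fit().params)

      # Logistic regression: binary outcome (e.g., complication yes/no).
      p = 1 / (1 + np.exp(-(-4 + 0.05 * age)))
      event = rng.binomial(1, p)
      print(sm.Logit(event, X).fit(disp=0).params)

      # Poisson regression: count outcome (e.g., number of episodes).
      lam = np.exp(-1 + 0.03 * age)
      count = rng.poisson(lam)
      print(sm.Poisson(count, X).fit(disp=0).params)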

  2. Probing plasmodesmata function with biochemical inhibitors.

    PubMed

    White, Rosemary G

    2015-01-01

    To investigate plasmodesmata (PD) function, a useful technique is to monitor the effect on cell-to-cell transport of applying an inhibitor of a physiological process, protein, or other cell component of interest. Changes in PD transport can then be monitored in one of several ways, most commonly by measuring the cell-to-cell movement of fluorescent tracer dyes or of free fluorescent proteins. Effects on PD structure can be detected in thin sections of embedded tissue observed using an electron microscope, most commonly a Transmission Electron Microscope (TEM). This chapter outlines commonly used inhibitors, methods for treating different tissues, how to detect altered cell-to-cell transport and PD structure, and important caveats.

  3. Common aero vehicle autonomous reentry trajectory optimization satisfying waypoint and no-fly zone constraints

    NASA Astrophysics Data System (ADS)

    Jorris, Timothy R.

    2007-12-01

    To support the Air Force's Global Reach concept, a Common Aero Vehicle is being designed to support the Global Strike mission. "Waypoints" are specified for reconnaissance or multiple payload deployments and "no-fly zones" are specified for geopolitical restrictions or threat avoidance. Due to time-critical targets and multiple scenario analysis, an autonomous solution is preferred over a time-intensive, manually iterative one. Thus, a real-time or near real-time autonomous trajectory optimization technique is presented to minimize the flight time, satisfy terminal and intermediate constraints, and remain within the specified vehicle heating and control limitations. This research uses the Hypersonic Cruise Vehicle (HCV) as a simplified two-dimensional platform to compare multiple solution techniques. The solution techniques include a unique geometric approach developed herein, a derived analytical dynamic optimization technique, and a rapidly emerging collocation numerical approach. This emerging numerical technique is a direct solution method involving discretization then dualization, with pseudospectral methods and nonlinear programming used to converge to the optimal solution. This numerical approach is applied to the Common Aero Vehicle (CAV) as the test platform for the full three-dimensional reentry trajectory optimization problem. The culmination of this research is the verification of the optimality of this proposed numerical technique, as shown for both the two-dimensional and three-dimensional models. Additionally, user implementation strategies are presented to improve accuracy and enhance solution convergence. Thus, the contributions of this research are the geometric approach, the user implementation strategies, and the determination and verification of a numerical solution technique for the optimal reentry trajectory problem that minimizes time to target while satisfying vehicle dynamics and control limitations, and heating, waypoint, and no-fly zone constraints.
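
    The direct transcription described here can be sketched on a toy problem: a double integrator driven to rest at a target in minimum time, discretized at N nodes with trapezoidal defect constraints and handed to an off-the-shelf NLP solver. Everything below (the plant, bounds, node count, solver) is an illustrative assumption, far simpler than the CAV problem.

      import numpy as np
      from scipy.optimize import minimize

      # Toy direct-collocation transcription: discretize states/controls at
      # N nodes, enforce trapezoidal "defect" constraints, and minimize the
      # (free) final time. The plant is a double integrator driven to rest
      # at x = 1 with |u| <= 1; the true minimum time is tf = 2.
      N = 20

      def unpack(z):
          return z[:N], z[N:2*N], z[2*N:3*N], z[-1]

      def defects(z):
          x, v, u, tf = unpack(z)
          h = tf / (N - 1)
          dx = x[1:] - x[:-1] - 0.5 * h * (v[1:] + v[:-1])
          dv = v[1:] - v[:-1] - 0.5 * h * (u[1:] + u[:-1])
          bc = [x[0], v[0], x[-1] - 1.0, v[-1]]   # boundary conditions
          return np.concatenate([dx, dv, bc])

      z0 = np.concatenate([np.linspace(0, 1, N), np.zeros(N),
                           np.zeros(N), [3.0]])
      bounds = [(None, None)] * (2 * N) + [(-1, 1)] * N + [(0.1, 10.0)]
      res = minimize(lambda z: z[-1], z0,
                     constraints={"type": "eq", "fun": defects},
                     bounds=bounds, method="SLSQP")
      print("minimum time ~", res.x[-1])          # should approach 2.0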

  4. Recent progress in the applications of layer-by-layer assembly to the preparation of nanostructured ion-rejecting water purification membranes.

    PubMed

    Sanyal, Oishi; Lee, Ilsoon

    2014-03-01

    Reverse osmosis (RO) and nanofiltration (NF) are the two dominant membrane separation processes responsible for ion rejection. While RO is highly efficient in the removal of ions, it needs a high operating pressure and offers very low selectivity between ions. Nanofiltration, on the other hand, has a comparatively low operating pressure, and most commercial membranes offer selectivity in terms of ion rejection. However, in many nanofiltration operations the rejection of monovalent ions is not appreciable. Therefore, a high-flux, high-rejection membrane is needed that can be applied to water purification systems. One such alternative is the use of polyelectrolyte multilayer membranes, which are prepared by the deposition of alternately charged polyelectrolytes via the layer-by-layer (LbL) assembly method. LbL is one of the most common self-assembly techniques and finds application in various areas. It has a number of tunable parameters, such as deposition conditions and the number of bilayers deposited, which can be manipulated according to the type of application. This technique can be applied to make a nanothin membrane skin which gives high rejection and at the same time allows a high water flux across it. Several research groups have applied this highly versatile technique to prepare membranes that can be employed for water purification. Some of these membranes have shown better performance than commercial nanofiltration and reverse osmosis membranes. These membranes have the potential to be applied to various aspects of water treatment, such as water softening, desalination and the recovery of certain ions. Besides the conventional LbL method, alternative methods have been suggested that can make the technique faster, more efficient and thereby more commercially acceptable.

  5. Methodological flaws introduce strong bias into molecular analysis of microbial populations.

    PubMed

    Krakat, N; Anjum, R; Demirel, B; Schröder, P

    2017-02-01

    In this study, we report how different cell disruption methods, PCR primers and in silico analyses can seriously bias results from microbial population studies, with consequences for the credibility and reproducibility of the findings. Our results emphasize the pitfalls of commonly used experimental methods that can seriously weaken the interpretation of results. Four different cell lysis methods, three commonly used primer pairs and various computer-based analyses were applied to investigate the microbial diversity of a fermentation sample composed of chicken dung. The fault-prone, but still frequently used, amplified rRNA gene restriction analysis was chosen to identify common weaknesses. In contrast to other studies, we focused on the complete analytical process, from cell disruption to in silico analysis, and identified potential error rates. The analysis revealed wide disagreement between the experimental approaches, with very different community structures resulting depending on the approach chosen. The interpretation of microbial diversity data remains a challenge. In order to accurately investigate the taxonomic diversity and structure of prokaryotic communities, we suggest a multi-level approach combining DNA-based and DNA-independent techniques. The identified weaknesses of commonly used methods to study microbial diversity can be overcome by such a multi-level approach, which produces more reliable data about the fate and behaviour of microbial communities in engineered habitats such as biogas plants, so that the best performance can be ensured. © 2016 The Society for Applied Microbiology.

  6. Influence of two different surgical techniques on the difficulty of impacted lower third molar extraction and their post-operative complications.

    PubMed

    Mavrodi, Alexandra; Ohanyan, Ani; Kechagias, Nikos; Tsekos, Antonis; Vahtsevanos, Konstantinos

    2015-09-01

    Post-operative complications of various degrees of severity are commonly observed in third molar impaction surgery. For this reason, a surgical procedure that decreases the trauma to bone and soft tissues should be a priority for surgeons. In the present study, we compare the efficacy and the post-operative complications of two different surgical techniques applied for impacted lower third molar extraction. Patients in the first group underwent the classical bur technique, while patients in the second group underwent an alternative technique, in which an elevator was placed on the buccal surface of the impacted molar in order to luxate it from the alveolar socket more easily. Comparing the two techniques, we observed a statistically significant decrease in the duration of the procedure and in the need for tooth sectioning when applying the second surgical technique, while the post-operative complications were similar in the two groups. We also found a statistically significantly lower incidence of lingual nerve lesions in the second group and only a slightly higher frequency of sharp mandibular bone irregularities, a difference that was not statistically significant. The results of our study indicate that the surgical technique using an elevator on the buccal surface of the tooth seems to be a reliable method to extract impacted third molars safely, easily, quickly and with minimum trauma to the surrounding tissues.

  7. Pheromone-assisted techniques to improve the efficacy of insecticide sprays against Linepithema humile (Hymenoptera: Formicidae).

    PubMed

    Choe, Dong-Hwan; Tsai, Kasumi; Lopez, Carlos M; Campbell, Kathleen

    2014-02-01

    Outdoor residual sprays are among the most common methods for targeting pestiferous ants in urban pest management programs. If impervious surfaces such as concrete are treated with these insecticides, the active ingredients can be washed from the surface by rain or irrigation. As a result, residual sprays with fipronil and pyrethroids are found in urban waterways and aquatic sediments. Given the amount of insecticides applied to urban settings for ant control and their possible impact on urban waterways, the development of alternative strategies is critical to decrease the overall amounts of insecticides applied, while still achieving effective control of target ant species. Herein we report a "pheromone-assisted technique" as an economically viable approach to maximize the efficacy of conventional sprays targeting the Argentine ant. By applying insecticide sprays supplemented with an attractive pheromone compound, (Z)-9-hexadecenal, Argentine ants were diverted from nearby trails and nest entrances and subsequently exposed to insecticide residues. Laboratory experiments with fipronil and bifenthrin sprays indicated that the overall kill of the insecticides against Argentine ant colonies was significantly improved (57-142% increase) by incorporating (Z)-9-hexadecenal in the insecticide sprays. This technique, once successfully implemented in practical pest management programs, has the potential to provide maximum control efficacy with reduced amounts of insecticides applied in the environment.

  8. Why do I need to know this? Optics/photonics problem-based learning in the math classroom

    NASA Astrophysics Data System (ADS)

    Donnelly, Matthew J.; Donnelly, Judith F.; Donnelly, Stephanie

    2017-08-01

    A common complaint of engineering managers is that new employees at all levels, technician through engineer, tend to have rote calculation ability but are unable to think critically and use structured problem solving techniques to apply mathematical concepts. Further, they often have poor written and oral communication skills and difficulty working in teams. Ironically, a common question of high school mathematics students is "Why do I need to know this?" In this paper we describe a project using optics/photonics and Problem Based Learning (PBL) to address these issues in a high school calculus classroom.

  9. Results From a Pressure Sensitive Paint Test Conducted at the National Transonic Facility on Test 197: The Common Research Model

    NASA Technical Reports Server (NTRS)

    Watkins, A. Neal; Lipford, William E.; Leighty, Bradley D.; Goodman, Kyle Z.; Goad, William K.; Goad, Linda R.

    2011-01-01

    This report will serve to present results of a test of the pressure sensitive paint (PSP) technique on the Common Research Model (CRM). This test was conducted at the National Transonic Facility (NTF) at NASA Langley Research Center. PSP data was collected on several surfaces with the tunnel operating in both cryogenic mode and standard air mode. This report will also outline lessons learned from the test as well as possible approaches to challenges faced in the test that can be applied to later entries.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Küchemann, Stefan; Mahn, Carsten; Samwer, Konrad

    The investigation of short-time dynamics using X-ray scattering techniques is commonly limited either by the read-out frequency of the detector or by a low intensity. In this paper, we present a chopper system which can increase the temporal resolution of 2D X-ray detectors by a factor of 13. This technique only applies to amorphous or polycrystalline samples due to their circular diffraction patterns. Using the chopper, we successfully increased the temporal resolution up to 5.1 ms during synchrotron experiments. For the construction, we provide a mathematical formalism which, in principle, allows an even higher increase of the temporal resolution.

  11. A comparative review of optical surface contamination assessment techniques

    NASA Technical Reports Server (NTRS)

    Heaney, James B.

    1987-01-01

    This paper reviews the relative sensitivities and practicalities of the common surface analytical methods used to detect and identify unwanted adsorbates on optical surfaces. The compared methods include visual inspection, simple reflectometry and transmissometry, ellipsometry, infrared absorption and attenuated total reflectance spectroscopy (ATR), Auger electron spectroscopy (AES), scanning electron microscopy (SEM), secondary ion mass spectrometry (SIMS), and mass accretion determined by quartz crystal microbalance (QCM). The discussion is biased toward those methods that apply optical thin-film analytical techniques to spacecraft optical contamination problems. Examples are cited from both ground-based and in-orbit experiments.

  12. Behavior driven testing in ALMA telescope calibration software

    NASA Astrophysics Data System (ADS)

    Gil, Juan P.; Garces, Mario; Broguiere, Dominique; Shen, Tzu-Chiang

    2016-07-01

    The ALMA software development cycle includes well-defined testing stages that involve developers, testers and scientists. We adapted Behavior Driven Development (BDD) to the testing activities applied to the Telescope Calibration (TELCAL) software. BDD is an agile technique that encourages communication between roles by defining test cases in natural language to specify features and scenarios, which allows participants to share a common language and provides a high-level set of automated tests. This work describes how we implemented and maintain BDD testing for TELCAL, the infrastructure needed to support it and proposals to expand this technique to other subsystems.
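
    To make the pattern concrete, here is a minimal BDD-style test in Python, assuming the pytest-bdd library (version 4+); the feature text, step names and calibration interface are hypothetical illustrations of the Given/When/Then structure, not ALMA's actual TELCAL suite:

    ```python
    # features/telcal.feature (hypothetical):
    #   Feature: Telescope calibration results
    #     Scenario: Computing an atmosphere calibration
    #       Given a set of simulated antenna measurements
    #       When the atmosphere calibration is computed
    #       Then a result is produced for every antenna

    from pytest_bdd import scenario, given, when, then

    @scenario("features/telcal.feature", "Computing an atmosphere calibration")
    def test_atmosphere_calibration():
        pass

    @given("a set of simulated antenna measurements", target_fixture="measurements")
    def measurements():
        # Made-up per-antenna values standing in for real data.
        return {"DA41": 1.02, "DA42": 0.98, "DV03": 1.01}

    @when("the atmosphere calibration is computed", target_fixture="result")
    def compute(measurements):
        # Stand-in for the real calibration computation.
        return {antenna: value * 0.5 for antenna, value in measurements.items()}

    @then("a result is produced for every antenna")
    def check(result, measurements):
        assert set(result) == set(measurements)
    ```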

  13. [Advancements of computer chemistry in separation of Chinese medicine].

    PubMed

    Li, Lingjuan; Hong, Hong; Xu, Xuesong; Guo, Liwei

    2011-12-01

    Separation techniques for Chinese medicine are not only a key technology in Chinese medicine research and development, but also a significant step in the modernization of Chinese medicinal preparations. Computer chemistry can build models and search for regularities in the complex data of Chinese medicine systems. This paper analyzes the applicability, key technologies, basic modes and common algorithms of computer chemistry as applied to the separation of Chinese medicine, introduces the mathematical models of extraction kinetics and the methods for setting their parameters, investigates several problems in membrane processing of traditional Chinese medicine, and forecasts the application prospects.

  14. Pilot self-coding applied in optical OFDM systems

    NASA Astrophysics Data System (ADS)

    Li, Changping; Yi, Ying; Lee, Kyesan

    2015-04-01

    This paper studies a frequency offset correction technique that can be applied in optical OFDM systems. Through theoretical analysis and computer simulations, we observe that our proposed scheme, named pilot self-coding (PSC), has a distinct effect in rectifying the frequency offset, which can mitigate the OFDM performance deterioration caused by inter-carrier interference and common phase error. The main approach is to assign a pilot subcarrier before the data subcarriers and copy this subcarrier sequence to the symmetric side. The simulation results verify that our proposed PSC is indeed effective against a high degree of frequency offset.
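
    The paper's exact PSC construction is not reproduced here; the toy numpy sketch below only illustrates the general idea of known pilots placed symmetrically about DC being used to estimate and remove a common phase rotation. All parameters (64 subcarriers, pilot indices ±21, QPSK data) are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_sc = 64                                    # subcarriers (hypothetical)
    pilot_idx = np.array([-21, 21]) % n_sc       # symmetric pilot pair
    data_idx = np.setdiff1d(np.arange(n_sc), pilot_idx)
    pilot_val = 1.0 + 0.0j

    # Build one OFDM symbol: QPSK data plus the known pilots.
    symbols = np.empty(n_sc, complex)
    symbols[data_idx] = rng.choice(
        np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2), data_idx.size)
    symbols[pilot_idx] = pilot_val

    phi = 0.3                                    # common phase error (rad)
    received = symbols * np.exp(1j * phi)        # idealized channel

    # Estimate the common phase from the pilots and de-rotate all subcarriers.
    phi_hat = np.angle(np.sum(received[pilot_idx] * np.conj(pilot_val)))
    corrected = received * np.exp(-1j * phi_hat)
    print(f"true {phi:.3f} rad, estimated {phi_hat:.3f} rad")
    ```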

  15. Response Manual for Combating Spills of Floating Hazardous CHRIS chemicals

    DTIC Science & Technology

    1989-01-01

    Indexed fragments from this manual include a table of CHRIS chemical floatability codes (PARAFORMALDEHYDE, PFA: No; PARALDEHYDE, PDH: No; PARATHION, PTO: No; PENTABORANE, PTB: No; PENTACHLOROETHANE ...), contents entries for removal techniques (5.2.1.3 Weir Skimmers, p. 76; 5.2.2 Chemical Removal Techniques, p. 77; 5.2.2.1 Sorption), and the note that sorption, being a surface process, is commonly applied in water treatment processes.

  16. Three Dimensional Positron Annihilation Momentum Measurement Technique Applied to Measure Oxygen-Atom Defects in 6H Silicon Carbide

    DTIC Science & Technology

    2010-03-01

    Indexed fragments: Stormer et al. [9] measured 6H SiC's positron work function (Φ+) as -3.0 ± 0.2 eV, which is the same value as for the most commonly used positron ... [reference 9: Stormer J, Goodyear A, Anwand W, Brauer G, Coleman P, et al., "... Subjected to Various Treatments", Materials Science Forum, Vols. 255-7, pp. 662-4].

  17. Applied Computational Electromagnetics Society Journal and Newsletter, Volume 14 No. 1

    DTIC Science & Technology

    1999-03-01

    Indexed fragments from the journal's scope statement: code validation, performance analysis, and input/output standardization; code or technique optimization and error minimization; innovations in ... (the remainder of the matched text is masthead and institutional-member address material).

  18. Nowcasting Cloud Fields for U.S. Air Force Special Operations

    DTIC Science & Technology

    2017-03-01

    Indexed fragments: the application of Bayes' Rule offers many advantages over Kernel Density Estimation (KDE) and other commonly used statistical post-processing methods ... reflectance and probability of cloud. A statistical post-processing technique is applied using Bayesian estimation to train the system from a set of past ... Keywords: nowcasting, low cloud forecasting, cloud reflectance, ISR, Bayesian estimation, statistical post-processing, machine learning.

  19. Nanoroughened plasmonic films for enhanced biosensing detection

    NASA Astrophysics Data System (ADS)

    LeMoal, Eric; Lévêque-Fort, Sandrine; Potier, Marie-Claude; Fort, Emmanuel

    2009-06-01

    Although fluorescence is the prevailing labeling technique in biosensing applications, improving sensitivity remains a demanding challenge. We show that coating standard microscope slides with nanoroughened silver films provides a strong fluorescence signal enhancement due to plasmonic interactions. As a proof of concept, we applied these films with tailored plasmonic properties to DNA microarrays. Using common optical scanning devices, we achieved signal amplifications of more than 40-fold.

  20. Trends in health sciences library and information science research: an analysis of research publications in the Bulletin of the Medical Library Association and Journal of the Medical Library Association from 1991 to 2007.

    PubMed

    Gore, Sally A; Nordberg, Judith M; Palmer, Lisa A; Piorun, Mary E

    2009-07-01

    This study analyzed trends in research activity as represented in the published research in the leading peer-reviewed professional journal for health sciences librarianship. Research articles were identified from the Bulletin of the Medical Library Association and Journal of the Medical Library Association (1991-2007). Using content analysis and bibliometric techniques, data were collected for each article on the (1) subject, (2) research method, (3) analytical technique used, (4) number of authors, (5) number of citations, (6) first author affiliation, and (7) funding source. The results were compared to a previous study, covering the period 1966 to 1990, to identify changes over time. Of the 930 articles examined, 474 (51%) were identified as research articles. Survey (n = 174, 37.1%) was the most common methodology employed, quantitative descriptive statistics (n = 298, 63.5%) the most used analytical technique, and applied topics (n = 332, 70%) the most common type of subject studied. The majority of first authors were associated with an academic health sciences library (n = 264, 55.7%). Only 27.4% (n = 130) of studies identified a funding source. This study's findings demonstrate that progress is being made in health sciences librarianship research. There is, however, room for improvement in terms of research methodologies used, proportion of applied versus theoretical research, and elimination of barriers to conducting research for practicing librarians.

  1. Robust watermark technique using masking and Hermite transform.

    PubMed

    Coronel, Sandra L Gomez; Ramírez, Boris Escalante; Mosqueda, Marco A Acevedo

    2016-01-01

    This paper evaluates a watermarking algorithm for digital images that uses a perceptive mask and a normalization process, preventing detection by the human eye and ensuring robustness against common processing and geometric attacks. The Hermite transform is employed because it allows a perfect reconstruction of the image while incorporating human visual system properties; moreover, it is based on derivatives of Gaussian functions. The applied watermark carries information identifying the proprietor of the digital image. The extraction process is blind, because it does not require the original image. The following metrics were used to evaluate the algorithm: peak signal-to-noise ratio, the structural similarity index average, the normalized cross-correlation, and the bit error rate. Several watermark extraction tests were performed against geometric and common processing attacks, which allowed us to identify how many bits of the watermark can be modified while still permitting adequate extraction.
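
    Two of the named evaluation metrics have simple standard definitions. A minimal numpy sketch of PSNR and bit error rate; the watermark bits and the simulated 10% attack below are made-up data:

    ```python
    import numpy as np

    def psnr(original, processed, peak=255.0):
        """Peak signal-to-noise ratio in dB between two images."""
        mse = np.mean((original.astype(float) - processed.astype(float)) ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

    def bit_error_rate(embedded_bits, extracted_bits):
        """Fraction of watermark bits changed by an attack."""
        embedded_bits = np.asarray(embedded_bits)
        extracted_bits = np.asarray(extracted_bits)
        return np.mean(embedded_bits != extracted_bits)

    # Hypothetical example: roughly 10% of 1000 watermark bits flipped.
    rng = np.random.default_rng(1)
    marks = rng.integers(0, 2, 1000)
    flips = rng.random(1000) < 0.10
    print(bit_error_rate(marks, marks ^ flips))
    ```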

  2. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; Russell, Samuel S.

    2012-01-01

    Objective: Develop a software application utilizing high-performance computing techniques, including general-purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general-purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods needed to work with them must grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism; that is, they apply the same computation to a large set of data, where the result for each element does not depend on the other elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.
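
    As an illustration of that data-parallel pattern (not the thermographic application itself), here is a minimal CUDA kernel written in Python, assuming the numba library and a CUDA-capable GPU; each thread computes one output element independently of all others:

    ```python
    import numpy as np
    from numba import cuda

    @cuda.jit
    def scale_offset(frame, gain, offset, out):
        # Each thread handles exactly one element; no element depends
        # on any other, which is what makes the computation data-parallel.
        i = cuda.grid(1)
        if i < frame.size:
            out[i] = frame[i] * gain + offset

    frame = np.random.rand(1_000_000).astype(np.float32)
    out = np.empty_like(frame)
    threads = 256
    blocks = (frame.size + threads - 1) // threads
    scale_offset[blocks, threads](frame, np.float32(2.0), np.float32(-1.0), out)
    ```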

  3. Craniosynostosis of the Lambdoid Suture

    PubMed Central

    Rhodes, Jennifer L.; Tye, Gary W.; Fearon, Jeffrey A.

    2014-01-01

    Craniosynostosis affecting the lambdoid suture is uncommon. The definition of lambdoid craniosynostosis solely applies to those cases demonstrating true suture obliteration, similar to other forms of craniosynostosis. In patients presenting with posterior plagiocephaly, true lambdoid craniosynostosis must be differentiated from the much more common positional molding. It can occur in a unilateral form, a bilateral form, or as part of a complex craniosynostosis. In children with craniofacial syndromes, synostosis of the lambdoid suture most often is seen within the context of a pansynostotic picture. Chiari malformations are commonly seen in multisutural and syndromic types of craniosynostosis that affect the lambdoid sutures. Posterior cranial vault remodeling is recommended to provide adequate intracranial volume to allow for brain growth and to normalize the skull shape. Although many techniques have been described for the correction of lambdoid synostosis, optimal outcomes may result from those techniques based on the concept of occipital advancement. PMID:25210507

  4. Craniosynostosis of the lambdoid suture.

    PubMed

    Rhodes, Jennifer L; Tye, Gary W; Fearon, Jeffrey A

    2014-08-01

    Craniosynostosis affecting the lambdoid suture is uncommon. The definition of lambdoid craniosynostosis solely applies to those cases demonstrating true suture obliteration, similar to other forms of craniosynostosis. In patients presenting with posterior plagiocephaly, true lambdoid craniosynostosis must be differentiated from the much more common positional molding. It can occur in a unilateral form, a bilateral form, or as part of a complex craniosynostosis. In children with craniofacial syndromes, synostosis of the lambdoid suture most often is seen within the context of a pansynostotic picture. Chiari malformations are commonly seen in multisutural and syndromic types of craniosynostosis that affect the lambdoid sutures. Posterior cranial vault remodeling is recommended to provide adequate intracranial volume to allow for brain growth and to normalize the skull shape. Although many techniques have been described for the correction of lambdoid synostosis, optimal outcomes may result from those techniques based on the concept of occipital advancement.

  5. Application of biofilm bioreactors in white biotechnology.

    PubMed

    Muffler, K; Lakatos, M; Schlegel, C; Strieth, D; Kuhne, S; Ulber, R

    2014-01-01

    The production of valuable compounds in industrial biotechnology is commonly carried out by cultivating suspended cells or using (immobilized) enzymes, rather than by using microorganisms in an immobilized state. Within the fields of wastewater and odor treatment, however, the application of immobilized cells is a proven technique. The cells are entrapped in a matrix of extracellular polymeric compounds produced by the cells themselves; this surface-associated agglomerate of encapsulated cells is termed a biofilm. In comparison with common immobilization techniques, the toxic effects of compounds used for cell entrapment can thus be avoided. Although the economic impact of biofilm processes used for the production of valuable compounds is still negligible, many prospective approaches have been examined in the laboratory and on a pilot scale. This review gives an overview of biofilm reactors applied to the production of valuable compounds. Moreover, the characteristics of the utilized materials are discussed with respect to their support of surface-attached microbial growth.

  6. Quality factors and local adaption (with applications in Eulerian hydrodynamics)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowley, W.P.

    1992-06-17

    Adapting the mesh to suit the solution is a technique commonly used for solving both ODEs and PDEs. For Lagrangian hydrodynamics, ALE and Free-Lagrange are examples of structured and unstructured adaptive methods. For Eulerian hydrodynamics the two basic approaches are the macro-unstructuring technique pioneered by Oliger and Berger and the micro-structuring technique due to Lohner and others. Here we describe a new micro-unstructuring technique, LAM (for Local Adaptive Mesh), as applied to Eulerian hydrodynamics. The LAM technique consists of two independent parts: (1) the time advance scheme is a variation on the artificial viscosity method; (2) the adaption scheme uses a micro-unstructured mesh with quadrilateral mesh elements. The adaption scheme makes use of quality factors, and the relation between these and truncation errors is discussed. The time advance scheme, the adaption strategy, and the effect of different adaption parameters on numerical solutions are described.

  7. Quality factors and local adaption (with applications in Eulerian hydrodynamics)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowley, W.P.

    1992-06-17

    Adapting the mesh to suit the solution is a technique commonly used for solving both ODEs and PDEs. For Lagrangian hydrodynamics, ALE and Free-Lagrange are examples of structured and unstructured adaptive methods. For Eulerian hydrodynamics the two basic approaches are the macro-unstructuring technique pioneered by Oliger and Berger and the micro-structuring technique due to Lohner and others. Here we describe a new micro-unstructuring technique, LAM (for Local Adaptive Mesh), as applied to Eulerian hydrodynamics. The LAM technique consists of two independent parts: (1) the time advance scheme is a variation on the artificial viscosity method; (2) the adaption scheme uses a micro-unstructured mesh with quadrilateral mesh elements. The adaption scheme makes use of quality factors, and the relation between these and truncation errors is discussed. The time advance scheme, the adaption strategy, and the effect of different adaption parameters on numerical solutions are described.

  8. Key-Node-Separated Graph Clustering and Layouts for Human Relationship Graph Visualization.

    PubMed

    Itoh, Takayuki; Klein, Karsten

    2015-01-01

    Many graph-drawing methods apply node-clustering techniques based on the density of edges to find tightly connected subgraphs and then hierarchically visualize the clustered graphs. However, users may want to focus on important nodes and their connections to groups of other nodes for some applications. For this purpose, it is effective to separately visualize the key nodes detected based on adjacency and attributes of the nodes. This article presents a graph visualization technique for attribute-embedded graphs that applies a graph-clustering algorithm that accounts for the combination of connections and attributes. The graph clustering step divides the nodes according to the commonality of connected nodes and similarity of feature value vectors. It then calculates the distances between arbitrary pairs of clusters according to the number of connecting edges and the similarity of feature value vectors and finally places the clusters based on the distances. Consequently, the technique separates important nodes that have connections to multiple large clusters and improves the visibility of such nodes' connections. To test this technique, this article presents examples with human relationship graph datasets, including a coauthorship and Twitter communication network dataset.
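
    One simple way to combine connection commonality with attribute similarity, in the spirit of (but not identical to) the clustering step described above, is a weighted blend of neighborhood Jaccard similarity and attribute cosine similarity. A sketch assuming networkx, with random stand-in feature vectors and an illustrative threshold:

    ```python
    import numpy as np
    import networkx as nx

    def combined_similarity(g, u, v, alpha=0.5):
        # Commonality of connected nodes (Jaccard over neighborhoods) ...
        nu, nv = set(g[u]), set(g[v])
        jaccard = len(nu & nv) / len(nu | nv) if nu | nv else 0.0
        # ... blended with similarity of feature value vectors (cosine).
        fu = np.asarray(g.nodes[u]["features"], float)
        fv = np.asarray(g.nodes[v]["features"], float)
        cosine = fu @ fv / (np.linalg.norm(fu) * np.linalg.norm(fv))
        return alpha * jaccard + (1 - alpha) * cosine

    g = nx.karate_club_graph()
    rng = np.random.default_rng(2)
    for n in g:
        g.nodes[n]["features"] = rng.random(4)   # hypothetical attributes

    # Cluster by linking pairs whose combined similarity passes a threshold,
    # then taking connected components of the resulting similarity graph.
    h = nx.Graph()
    h.add_nodes_from(g)
    h.add_edges_from((u, v) for u in g for v in g
                     if u < v and combined_similarity(g, u, v) > 0.6)
    clusters = list(nx.connected_components(h))
    ```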

  9. Usability evaluation techniques in mobile commerce applications: A systematic review

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.

    2016-08-01

    A number of studies concern the usability of mobile commerce (m-commerce) applications and related areas, but they do not adequately describe the usability techniques used in empirical usability evaluations of m-commerce applications. This paper therefore aims to identify the usability techniques most frequently used in usability evaluations of m-commerce applications. To achieve this objective, a systematic literature review was employed. Sixty-seven papers on usability evaluation for m-commerce and related areas were retrieved, and the twenty-one most relevant studies were selected for review in order to extract the appropriate information. The results show that heuristic evaluation, formal tests and think-aloud methods are the most commonly used methods for m-commerce applications, compared with cognitive walkthrough and informal test methods. Moreover, most of the studies applied controlled experiments (33.3% of the studies), while 14.28% applied case studies for usability evaluation. The results provide the usability practitioner and research community with additional knowledge of the current state and use of usability techniques in m-commerce applications.

  10. Differentiation of tea varieties using UV-Vis spectra and pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Palacios-Morillo, Ana; Alcázar, Ángela.; de Pablos, Fernando; Jurado, José Marcos

    2013-02-01

    Tea, one of the most consumed beverages all over the world, is of great importance to the economies of a number of countries. Several methods have been developed to classify tea varieties or origins based on pattern recognition techniques applied to chemical data, such as metal profiles, amino acids, catechins and volatile compounds. Some of these analytical methods are too tedious and expensive to be applied in routine work. The use of UV-Vis spectral data, which is highly influenced by the chemical composition, as discriminant variables can be an alternative to these methods. UV-Vis spectra of methanol-water extracts of tea were obtained in the interval 250-800 nm, and the absorbances were used as input variables. Principal component analysis was used to reduce the number of variables, and several pattern recognition methods, such as linear discriminant analysis, support vector machines and artificial neural networks, were applied in order to differentiate the most common tea varieties. A successful classification model was built by combining principal component analysis and multilayer perceptron artificial neural networks, allowing the differentiation between tea varieties. This rapid and simple methodology can be applied to solve classification problems in the food industry, saving economic resources.
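
    The PCA-plus-multilayer-perceptron chain is straightforward to express with scikit-learn. The sketch below uses random stand-in "spectra" (real inputs would be absorbances sampled over 250-800 nm), so the reported accuracy is near chance; all names and sizes are illustrative:

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    X = rng.random((120, 551))          # 120 extracts x 551 wavelengths
    y = rng.integers(0, 3, 120)         # 3 hypothetical tea varieties

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = make_pipeline(
        StandardScaler(),
        PCA(n_components=10),           # compress correlated absorbances
        MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
    )
    model.fit(X_tr, y_tr)
    print("held-out accuracy:", model.score(X_te, y_te))
    ```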

  11. What can we learn from corporate sustainability reporting? Deriving propositions for research and practice from over 9,500 corporate sustainability reports published between 1999 and 2015 using topic modelling technique

    PubMed Central

    vom Brocke, Jan

    2017-01-01

    Organizations are increasingly using sustainability reports to inform their stakeholders and the public about their sustainability practices. We apply topic modelling to 9,514 sustainability reports published between 1999 and 2015 in order to identify common topics and, thus, the most common practices described in these reports. In particular, we identify forty-two topics that reflect sustainability and focus on the coverage and trends of economic, environmental, and social sustainability topics. Among the first to analyse such a large amount of data on organizations’ sustainability reporting, the paper serves as an example of how to apply natural language processing as a strategy of inquiry in sustainability research. The paper also derives from the data analysis ten propositions for future research and practice that are of immediate value for organizations and researchers. PMID:28403158

  12. What can we learn from corporate sustainability reporting? Deriving propositions for research and practice from over 9,500 corporate sustainability reports published between 1999 and 2015 using topic modelling technique.

    PubMed

    Székely, Nadine; Vom Brocke, Jan

    2017-01-01

    Organizations are increasingly using sustainability reports to inform their stakeholders and the public about their sustainability practices. We apply topic modelling to 9,514 sustainability reports published between 1999 and 2015 in order to identify common topics and, thus, the most common practices described in these reports. In particular, we identify forty-two topics that reflect sustainability and focus on the coverage and trends of economic, environmental, and social sustainability topics. Among the first to analyse such a large amount of data on organizations' sustainability reporting, the paper serves as an example of how to apply natural language processing as a strategy of inquiry in sustainability research. The paper also derives from the data analysis ten propositions for future research and practice that are of immediate value for organizations and researchers.
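
    A miniature version of such a topic-model analysis can be written with scikit-learn's LatentDirichletAllocation; the four placeholder "reports" below stand in for the 9,514 full documents, and the topic count is illustrative:

    ```python
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    reports = [
        "emissions energy renewable carbon reduction climate",
        "employee safety training diversity community engagement",
        "water waste recycling supply chain environmental impact",
        "governance ethics compliance stakeholder transparency",
    ]
    vectorizer = CountVectorizer()
    counts = vectorizer.fit_transform(reports)   # document-term matrix

    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
    terms = vectorizer.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[-5:][::-1]]
        print(f"topic {k}: {', '.join(top)}")
    ```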

  13. Detection of genetically modified organisms (GMOs) using isothermal amplification of target DNA sequences.

    PubMed

    Lee, David; La Mura, Maurizio; Allnutt, Theo R; Powell, Wayne

    2009-02-02

    The most common method of GMO detection is based upon the amplification of GMO-specific DNA amplicons using the polymerase chain reaction (PCR). Here we have applied the loop-mediated isothermal amplification (LAMP) method to amplify GMO-related DNA sequences: 'internal' commonly used motifs for controlling transgene expression and event-specific (plant-transgene) junctions. We have tested the specificity and sensitivity of the technique for use in GMO studies. Results show that detection of 0.01% GMO in equivalent background DNA was possible, and dilutions of template suggest that detection from single copies of the template may be possible using LAMP. This work shows that GMO detection can be carried out using LAMP for routine screening as well as for event-specific detection. Moreover, the sensitivity and the ability to amplify targets even against a high background of DNA, demonstrated here, highlight the advantages of this isothermal amplification method when applied to GMO detection.

  14. International Seminar on Laser and Opto-Electronic Technology in Industry: State-of-the-Art Review, Xiamen, People's Republic of China, June 25-28, 1986, Proceedings

    NASA Astrophysics Data System (ADS)

    Ke, Jingtang; Pryputniewicz, Ryszard J.

    Various papers on the state of the art in laser and optoelectronic technology in industry are presented. Individual topics addressed include: wavelength compensation for holographic optical element, optoelectronic techniques for measurement and inspection, new optical measurement methods in Western Europe, applications of coherent optics at ISL, imaging techniques for gas turbine development, the Rolls-Royce experience with industrial holography, panoramic holocamera for tube and borehole inspection, optical characterization of electronic materials, optical strain measurement of rotating components, quantitative interpretation of holograms and specklegrams, laser speckle technique for hydraulic structural model test, study of holospeckle interferometry, common path shearing fringe scanning interferometer, and laser interferometry applied to nondestructive testing of tires.

  15. Computer image processing in marine resource exploration

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

    1976-01-01

    Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
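
    One of the named corrections has a simple closed form: for a flat seafloor, the horizontal ground range follows from the recorded slant range and the sensor altitude by Pythagoras. A sketch of this textbook slant-range correction (not the authors' exact code):

    ```python
    import numpy as np

    def ground_range(slant_range, altitude):
        """Flat-bottom slant-range correction for side-scan sonar.

        The sonar records the slant distance from the towfish to each
        seafloor point; the horizontal ground range is recovered from
        the towfish altitude above the bottom.
        """
        slant_range = np.asarray(slant_range, float)
        return np.sqrt(np.maximum(slant_range**2 - altitude**2, 0.0))

    # Hypothetical returns at 30, 50 and 100 m slant range, 25 m altitude.
    print(ground_range([30.0, 50.0, 100.0], altitude=25.0))
    ```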

  16. Visual mining geo-related data using pixel bar charts

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Keim, Daniel A.; Dayal, Umeshwar; Wright, Peter; Schneidewind, Joern

    2005-03-01

    A common approach to analyzing geo-related data is to use bar charts or x-y plots. They are intuitive and easy to use, but important information often gets lost. In this paper, we introduce a new interactive visualization technique called Geo Pixel Bar Charts, which combines the advantages of Pixel Bar Charts and interactive maps. This technique allows analysts to visualize large amounts of spatial data without aggregation while simultaneously showing the geographical regions corresponding to the spatial data attributes. We apply Geo Pixel Bar Charts to visually mine sales transactions and Internet usage from different locations. Our experimental results show the effectiveness of this technique in revealing the data distribution and exceptions on the map.

  17. Performance of Grey Wolf Optimizer on large scale problems

    NASA Astrophysics Data System (ADS)

    Gupta, Shubham; Deep, Kusum

    2017-01-01

    Numerous nature-inspired optimization techniques have been proposed in the literature for solving nonlinear continuous optimization problems, including real-life problems where conventional techniques cannot be applied. The Grey Wolf Optimizer is one such technique, and it has been gaining popularity over the last two years. The objective of this paper is to investigate the performance of the Grey Wolf Optimization Algorithm on large-scale optimization problems. The algorithm is implemented on five common scalable benchmark problems from the literature, namely the Sphere, Rosenbrock, Rastrigin, Ackley and Griewank functions, with dimensions varied from 50 to 1000. The results indicate that the Grey Wolf Optimizer is a powerful nature-inspired optimization algorithm for large-scale problems, except on Rosenbrock, which is a unimodal function.
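
    For reference, a compact Python implementation of the standard GWO update (three leaders alpha/beta/delta, coefficient a decreasing linearly from 2 to 0), shown on the Sphere function; the population size and iteration count are illustrative, not the paper's settings:

    ```python
    import numpy as np

    def sphere(x):
        return np.sum(x**2)

    def gwo(f, dim=50, n_wolves=30, iters=500, lb=-100.0, ub=100.0, seed=0):
        rng = np.random.default_rng(seed)
        X = rng.uniform(lb, ub, (n_wolves, dim))
        for t in range(iters):
            order = np.argsort([f(x) for x in X])
            alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
            a = 2.0 * (1.0 - t / iters)          # decreases linearly 2 -> 0
            pull = np.zeros_like(X)
            for leader in (alpha, beta, delta):
                A = 2.0 * a * rng.random(X.shape) - a
                C = 2.0 * rng.random(X.shape)
                pull += leader - A * np.abs(C * leader - X)
            X = np.clip(pull / 3.0, lb, ub)      # average of the three pulls
        return min(X, key=f)

    print("Sphere minimum found:", sphere(gwo(sphere)))
    ```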

  18. Scalable Methods for Eulerian-Lagrangian Simulation Applied to Compressible Multiphase Flows

    NASA Astrophysics Data System (ADS)

    Zwick, David; Hackl, Jason; Balachandar, S.

    2017-11-01

    Multiphase flows can be found in countless areas of physics and engineering. Many of these flows can be classified as dispersed two-phase flows, meaning that solid particles are dispersed in a continuous fluid phase. A common technique for simulating such flows is the Eulerian-Lagrangian method. While useful, this method can suffer from scaling issues at the larger problem sizes typical of many realistic geometries. Here we present scalable techniques for Eulerian-Lagrangian simulation and apply them to the simulation of a particle bed subjected to expansion waves in a shock tube. The results show that the methods presented here are viable for the simulation of larger problems on modern supercomputers. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. DGE-1315138. This work was supported in part by the U.S. Department of Energy under Contract No. DE-NA0002378.

  19. Industrial metrology as applied to large physics experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veal, D.

    1993-05-01

    A physics experiment is a large, complex 3-D object (typically 1200 m{sup 3}, 35000 tonnes) with sub-millimetric alignment requirements. Two generic survey alignment tasks can be identified: first, an iterative positioning of the apparatus subsystems in space and, second, a quantification of as-built parameters. The most convenient measurement technique is industrial triangulation, but the complexity of the measured object and the constraints of the measurement environment frequently require a more sophisticated approach. To enlarge the "survey alignment toolbox", measurement techniques commonly associated with other disciplines, such as geodesy, applied geodesy for accelerator alignment, and mechanical engineering, are also used. Disparate observables require a heavy reliance on least squares programs for campaign pre-analysis and calculation. This paper offers an introduction to the alignment of physics experiments and identifies trends for the next generation of SSC experiments.

  20. Null steering of adaptive beamforming using linear constraint minimum variance assisted by particle swarm optimization, dynamic mutated artificial immune system, and gravitational search algorithm.

    PubMed

    Darzi, Soodabeh; Kiong, Tiong Sieh; Islam, Mohammad Tariqul; Ismail, Mahamod; Kibria, Salehin; Salem, Balasem

    2014-01-01

    Linear constraint minimum variance (LCMV) is one of the adaptive beamforming techniques commonly applied to cancel interfering signals and steer or produce a strong beam towards the desired signal through its computed weight vectors. However, the weights computed by LCMV are usually unable to form the radiation beam towards the target user precisely and are not good enough to reduce the interference by placing nulls at the interference sources. It is difficult to improve and optimize the LCMV beamforming technique through a conventional empirical approach. To provide a solution to this problem, artificial intelligence (AI) techniques are explored in order to enhance the LCMV beamforming ability. In this paper, particle swarm optimization (PSO), dynamic mutated artificial immune system (DM-AIS), and gravitational search algorithm (GSA) are incorporated into the existing LCMV technique in order to improve the weights of LCMV. The simulation results demonstrate that the received signal to interference and noise ratio (SINR) of the target user can be significantly improved by the integration of PSO, DM-AIS, and GSA in LCMV through the suppression of interference in undesired directions. Furthermore, the proposed GSA can be applied as a more effective technique in LCMV beamforming optimization than the PSO technique. The algorithms were implemented in MATLAB.
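
    The baseline LCMV weights that such methods refine have the standard closed form w = R⁻¹C(CᴴR⁻¹C)⁻¹f. Below is a numpy sketch for a hypothetical 8-element uniform linear array with one desired direction and one interferer; the PSO/DM-AIS/GSA refinements themselves are not reproduced:

    ```python
    import numpy as np

    def steering(n, theta_deg, spacing=0.5):
        """Steering vector of an n-element ULA (spacing in wavelengths)."""
        theta = np.deg2rad(theta_deg)
        return np.exp(2j * np.pi * spacing * np.arange(n) * np.sin(theta))

    n = 8
    C = np.column_stack([steering(n, 0), steering(n, 40)])  # desired, interferer
    f = np.array([1.0, 0.0])                                # unit gain, null

    # Sample covariance from simulated interferer-plus-noise snapshots.
    rng = np.random.default_rng(4)
    snaps = (steering(n, 40)[:, None] * rng.standard_normal((1, 200))
             + 0.1 * (rng.standard_normal((n, 200))
                      + 1j * rng.standard_normal((n, 200))))
    R = snaps @ snaps.conj().T / 200

    # w = R^-1 C (C^H R^-1 C)^-1 f  (classic LCMV solution)
    Ri_C = np.linalg.solve(R, C)
    w = Ri_C @ np.linalg.solve(C.conj().T @ Ri_C, f)
    print("gain at  0 deg:", abs(w.conj() @ steering(n, 0)))   # ~1
    print("gain at 40 deg:", abs(w.conj() @ steering(n, 40)))  # ~0
    ```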

  1. Null Steering of Adaptive Beamforming Using Linear Constraint Minimum Variance Assisted by Particle Swarm Optimization, Dynamic Mutated Artificial Immune System, and Gravitational Search Algorithm

    PubMed Central

    Sieh Kiong, Tiong; Tariqul Islam, Mohammad; Ismail, Mahamod; Salem, Balasem

    2014-01-01

    Linear constraint minimum variance (LCMV) is one of the adaptive beamforming techniques commonly applied to cancel interfering signals and steer or produce a strong beam towards the desired signal through its computed weight vectors. However, the weights computed by LCMV are usually unable to form the radiation beam towards the target user precisely and are not good enough to reduce the interference by placing nulls at the interference sources. It is difficult to improve and optimize the LCMV beamforming technique through a conventional empirical approach. To provide a solution to this problem, artificial intelligence (AI) techniques are explored in order to enhance the LCMV beamforming ability. In this paper, particle swarm optimization (PSO), dynamic mutated artificial immune system (DM-AIS), and gravitational search algorithm (GSA) are incorporated into the existing LCMV technique in order to improve the weights of LCMV. The simulation results demonstrate that the received signal to interference and noise ratio (SINR) of the target user can be significantly improved by the integration of PSO, DM-AIS, and GSA in LCMV through the suppression of interference in undesired directions. Furthermore, the proposed GSA can be applied as a more effective technique in LCMV beamforming optimization than the PSO technique. The algorithms were implemented in MATLAB. PMID:25147859

  2. Signal Detection and Monitoring Based on Longitudinal Healthcare Data

    PubMed Central

    Suling, Marc; Pigeot, Iris

    2012-01-01

    Post-marketing detection and surveillance of potential safety hazards are crucial tasks in pharmacovigilance. To uncover such safety risks, a wide set of techniques has been developed for spontaneous reporting data and, more recently, for longitudinal data. This paper gives a broad overview of the signal detection process and introduces the types of data sources typically used. The most commonly applied signal detection algorithms are presented, covering simple frequentist methods such as the proportional reporting ratio or the reporting odds ratio, more advanced Bayesian techniques for spontaneous and longitudinal data, e.g., the Bayesian Confidence Propagation Neural Network or the Multi-item Gamma-Poisson Shrinker, and methods developed for longitudinal data only, such as IC temporal pattern detection. Additionally, the problem of adjusting for underlying confounding is discussed, and the most common strategies to automatically identify false-positive signals are addressed. A drug monitoring technique based on Wald's sequential probability ratio test is presented. For each method, a real-life application is given, and a wide set of literature for further reading is referenced. PMID:24300373
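
    The two frequentist measures mentioned have simple closed forms on a 2x2 table of spontaneous reports. A sketch using the standard definitions; the counts below are made up for illustration:

    ```python
    # 2x2 spontaneous-report table:
    #                        event of interest   all other events
    #   drug of interest            a                   b
    #   all other drugs             c                   d

    def prr(a, b, c, d):
        """Proportional reporting ratio."""
        return (a / (a + b)) / (c / (c + d))

    def ror(a, b, c, d):
        """Reporting odds ratio."""
        return (a * d) / (b * c)

    print(prr(25, 975, 120, 48880))   # ~10.2: the event is reported
    print(ror(25, 975, 120, 48880))   # ~10.4  disproportionately often
    ```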

  3. Comparison of extraction techniques of robenidine from poultry feed samples.

    PubMed

    Wilga, Joanna; Kot-Wasik, Agata; Namieśnik, Jacek

    2007-10-31

    In this paper, the effectiveness of six different commonly applied extraction techniques for the determination of robenidine in poultry feed is compared. The sample preparation techniques included shaking, Soxhlet, Soxtec, ultrasonically assisted extraction, microwave-assisted extraction and accelerated solvent extraction. The techniques were compared with respect to extraction recovery, temperature and time, reproducibility and solvent consumption. Every extract was subjected to clean-up on an aluminium oxide column (a Pasteur pipette filled with 1 g of aluminium oxide), from which robenidine was eluted with 10 ml of methanol. The eluate from the clean-up column was collected in a volumetric flask and finally analysed by HPLC-DAD-MS. In general, all the extraction techniques were capable of isolating robenidine from poultry feed, but the recoveries obtained using the modern extraction techniques were higher than those obtained using the conventional ones. In particular, accelerated solvent extraction was superior to the other techniques, which highlights the advantages of this sample preparation technique. In routine analysis, however, shaking and ultrasonically assisted extraction are still the preferred methods for the isolation of robenidine and other coccidiostats.

  4. Evaluation of immobilized metal membrane affinity chromatography for purification of an immunoglobulin G1 monoclonal antibody.

    PubMed

    Serpa, Gisele; Augusto, Elisabeth Fátima Pires; Tamashiro, Wirla Maria Silva Cunha; Ribeiro, Mariana Borçoe; Miranda, Everson Alves; Bueno, Sônia Maria Alves

    2005-02-25

    The large-scale production of monoclonal antibodies (McAbs) has gained increasing relevance with the development of hybridoma cell culture in bioreactors, creating a need for specific, efficient bioseparation techniques. The conventional fixed-bed affinity adsorption commonly applied for McAb purification has the drawbacks of low flow rates and column clogging. We developed and evaluated an immobilized metal affinity chromatography (IMAC) membrane for the purification of anti-TNP IgG(1) mouse McAbs. We immobilized metal ions on a poly(ethylene vinyl alcohol) hollow fiber membrane (Me(2+)-IDA-PEVA) and applied it to the purification of this McAb from cell culture supernatant after precipitation at 50% ammonium sulphate saturation. The purity of the IgG(1) in the eluate fractions was high when eluted from the Zn(2+) complex. The anti-TNP antibody could be eluted under conditions causing no loss of antigen binding capacity. The purification procedure can be considered an alternative to the biospecific adsorbent commonly applied for mouse IgG(1) purification, protein G-Sepharose.

  5. Analysis of the United States Marine Corps Continuous Process Improvement Program Applied to the Contracting Process at Marine Corps Regional Contracting Office - Southwest

    DTIC Science & Technology

    2007-12-01

    Indexed fragments: contents entries (3. Poka-yoke, p. 37; 4. Systems for ...); a list of lean tools (standard operating procedures; visual displays for workflow and communication; total productive maintenance; poka-yoke techniques to prevent ...); and the statement that improving a process step or eliminating non-value-added steps, and reducing the seven common wastes, will decrease the total time of a process.

  6. The Effect of Multispectral Image Fusion Enhancement on Human Efficiency

    DTIC Science & Technology

    2017-03-20

    Indexed fragments: ... human visual system by applying a technique commonly used in visual perception research: ideal observer analysis. Using this approach, we establish ... applications, analytic techniques, and procedural methods used across studies. This paper uses ideal observer analysis to establish a framework that allows ... augmented similarly to incorporate research involving more complex stimulus content. Additionally, the ideal observer can be adapted for a number of ...

  7. Force Enhancement Packages for Countering Nuclear Threats in the 2022-2027 Time Frame

    DTIC Science & Technology

    2015-09-01

    Indexed fragments: ... characterization methods. • Apply proper radioisotope identification techniques. c. A one-week CNT operations exercise at Fort Belvoir, Virginia. Team members ... on experiments to seek better methods, holding active teaching until later. The team expects that better methods would involve collection using ... conduct more effective wide-area searches than those commonly employed by civil law enforcement agencies. The IDA team suggests that better methods ...

  8. Valuing a Global Environmental Good: U.S. Residents' Willingness to Pay to Protect Tropical Rain Forests

    Treesearch

    Randall A. Kramer; D. Evan Mercer

    1997-01-01

    Although contingent valuation (CV) is the most common technique for valuing nonmarket environmental resources, rarely has it been applied to global environmental goods. This study uses CV in a national survey to assess the value U.S. residents place on tropical rain forest protection. On average, respondents were willing to make a one-time payment of approximately $21-31 per household to protect an...

  9. Aircraft Survivability: Rotorcraft Survivability. Summer 2010

    DTIC Science & Technology

    2010-01-01

    Indexed fragments: Loading of the shafts was conducted using two techniques. The first technique applied a torsion load up to the design limit load after the article ... show the ballistic impact and impact damage. Figure 11 shows a 45-degree shaft failure, a common failure type, when loaded to design limit after ... (remaining matched text is issue contents: Study on Rotorcraft Survivability; V-22 Integrated Survivability Design; CH-53K Heavy Lift Helicopter).

  10. Managing Variation in Services in a Software Product Line Context

    DTIC Science & Technology

    2010-05-01

    Indexed fragments: ... Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-021, ADA235785), Software Engineering Institute, Carnegie Mellon University, 1990 ... the systems in the product line, and a plan for building the systems. Product line scope and product line analysis define the boundaries and ... systems, as well as expected ways in which they may vary. Product line analysis applies established modeling techniques to engineer the common and ...

  11. Summer Research Program (1992). High School Apprenticeship Program (HSAP) Reports. Volume 14. Rome Laboratory.

    DTIC Science & Technology

    1992-12-28

    Indexed fragments: ... analysis. Marvin Minsky, carefully applying mathematical techniques, developed rigorous theorems regarding network operation. His research led to the ... electrical circuits but was later converted to computer simulation, which is still commonly used today. Early success by Marvin Minsky, Frank ... publication of the book Perceptrons (Minsky and Papert 1969), in which he and Seymour Papert proved that the single-layer networks then in use were ...

  12. Physical and Clinical Evaluation of Hip Spica Cast applied with Three-slab Technique using Fibreglass Material

    PubMed Central

    Bitar, KM; Ferdhany, ME; Saw, A

    2016-01-01

    Introduction: Hip spica casting is an important component of the treatment of developmental dysplasia of the hip (DDH) and a popular treatment method for femur fractures in children. Breakage at the hip region is a relatively common problem with this cast. We have developed a three-slab technique of hip spica application using fibreglass as the cast material. The purpose of this review was to evaluate the physical durability of the spica cast and the skin complications associated with its use. Methodology: We retrospectively reviewed children with various conditions requiring hip spica immobilisation applied using our method. The study duration was from 1st January 2014 until 31st December 2015. Our main outcomes were cast breakage and skin complications. For children with hip instability, the first cast was changed after one month and the second cast about two months later. Results: Twenty-one children were included, with an average age of 2.2 years. The most common indication for spica immobilisation was developmental dysplasia of the hip. One child had skin irritation after spica application. No spica breakage was noted. Conclusion: This study showed that the three-slab method of hip spica cast application using fibreglass material was durable and safe, with a low risk of skin complications. PMID:28553442

  13. On the performance of SART and ART algorithms for microwave imaging

    NASA Astrophysics Data System (ADS)

    Aprilliyani, Ria; Prabowo, Rian Gilang; Basari

    2018-02-01

    Advances in technology are changing human lifestyles, and degenerative diseases such as cancers and tumors, not just common infectious diseases, are an increasing burden. Every year the number of cancer and tumor victims grows significantly, making these diseases among the leading causes of death in the world. In its early stages, a cancer/tumor does not have definite symptoms, but it grows abnormally as tissue cells and damages normal tissue; hence, early cancer detection is required. Common diagnostic modalities such as MRI, CT and PET are difficult to operate in home or mobile environments such as an ambulance; they are also costly, unpleasant, complex, less safe and harder to move. This paper therefore considers a microwave imaging system, owing to its portability and low cost. In the current study, we address the performance of the simultaneous algebraic reconstruction technique (SART) algorithm applied to microwave imaging. In addition, the SART algorithm's performance is compared with our previous work on the algebraic reconstruction technique (ART), especially in terms of reconstructed image quality. The results show that by applying the SART algorithm to microwave imaging, a suspicious cancer/tumor can be detected with better image quality.
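
    Generic implementations of the two named algorithms for a linear imaging model Ax = b are short; the sketch below uses a random stand-in for the microwave-imaging system matrix rather than the authors' forward model, and the relaxation factors are illustrative:

    ```python
    import numpy as np

    def art(A, b, iters=50, relax=0.5):
        """ART (Kaczmarz): sweep the rows one at a time."""
        x = np.zeros(A.shape[1])
        row_norms = np.sum(A**2, axis=1)
        for _ in range(iters):
            for i in range(A.shape[0]):
                r = b[i] - A[i] @ x
                x += relax * r / row_norms[i] * A[i]
        return x

    def sart(A, b, iters=50, relax=0.5):
        """SART: update from all rows simultaneously each iteration."""
        x = np.zeros(A.shape[1])
        col_sums = np.abs(A).sum(axis=0)
        row_sums = np.abs(A).sum(axis=1)
        for _ in range(iters):
            r = (b - A @ x) / row_sums
            x += relax * (A.T @ r) / col_sums
        return x

    rng = np.random.default_rng(5)
    A = rng.random((80, 64))            # stand-in system matrix
    x_true = rng.random(64)
    b = A @ x_true
    for name, rec in (("ART", art(A, b)), ("SART", sart(A, b))):
        err = np.linalg.norm(rec - x_true) / np.linalg.norm(x_true)
        print(name, "relative error:", err)
    ```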

  14. The longitudinal offset technique for apodization of coupled resonator optical waveguide devices: concept and fabrication tolerance analysis.

    PubMed

    Doménech, José David; Muñoz, Pascual; Capmany, José

    2009-11-09

    In this paper, a novel technique to set the coupling constant between cells of a coupled resonator optical waveguide (CROW) device, in order to tailor the filter response, is presented. The technique is demonstrated by simulation assuming a racetrack ring resonator geometry. It consists of changing the effective length of the coupling section by applying a longitudinal offset between the resonators. In contrast, conventional techniques are based on transversally changing the distance between the ring resonators, in steps that are commonly below the current fabrication resolution step (nm scale), leading to strong restrictions on the designs. The proposed longitudinal offset technique allows a more precise control of the coupling and presents an increased robustness against fabrication limitations, since the required resolution step is two orders of magnitude larger. Both techniques are compared in terms of the transmission response of CROW devices under finite fabrication resolution steps.

  15. Distinguishing between debris flows and floods from field evidence in small watersheds

    USGS Publications Warehouse

    Pierson, Thomas C.

    2005-01-01

    Post-flood indirect measurement techniques to back-calculate flood magnitude are not valid for debris flows, which commonly occur in small steep watersheds during intense rainstorms. This is because debris flows can move much faster than floods in steep channel reaches and much slower than floods in low-gradient reaches. In addition, debris-flow deposition may drastically alter channel geometry in reaches where slope-area surveys are applied. Because high-discharge flows are seldom witnessed and automated samplers are commonly plugged or destroyed, determination of flow type often must be made on the basis of field evidence preserved at the site.

  16. Computer vision applications for coronagraphic optical alignment and image processing.

    PubMed

    Savransky, Dmitry; Thomas, Sandrine J; Poyneer, Lisa A; Macintosh, Bruce A

    2013-05-10

    Modern coronagraphic systems require very precise alignment between optical components and can benefit greatly from automated image processing. We discuss three techniques commonly employed in the fields of computer vision and image analysis as applied to the Gemini Planet Imager, a new facility instrument for the Gemini South Observatory. We describe how feature extraction and clustering methods can be used to aid in automated system alignment tasks, and also present a search algorithm for finding regular features in science images used for calibration and data processing. Along with discussions of each technique, we present our specific implementation and show results of each one in operation.

  17. OARSI Clinical Trials Recommendations for Hip Imaging in Osteoarthritis

    PubMed Central

    Gold, Garry E.; Cicuttini, Flavia; Crema, Michel D.; Eckstein, Felix; Guermazi, Ali; Kijowski, Richard; Link, Thomas M.; Maheu, Emmanuel; Martel-Pelletier, Johanne; Miller, Colin G.; Pelletier, Jean-Pierre; Peterfy, Charles G.; Potter, Hollis G.; Roemer, Frank W.; Hunter, David. J

    2015-01-01

    Imaging of the hip in osteoarthritis (OA) has seen considerable progress in the past decade, with the introduction of new techniques that may be more sensitive to structural disease changes. The purpose of this expert-opinion, consensus-driven recommendation is to detail how to apply hip imaging in disease-modifying clinical trials. It includes information on acquisition methods/techniques (including guidance on positioning for radiography and sequence/protocol recommendations and hardware for MRI); commonly encountered problems (including positioning, hardware and coil failures, and artifacts associated with various MRI sequences); quality assurance/control procedures; measurement methods; measurement performance (reliability, responsiveness, and validity); recommendations for trials; and research recommendations. PMID:25952344

  18. Using Single Drop Microextraction for Headspace Analysis with Gas Chromatography

    NASA Astrophysics Data System (ADS)

    Riccio, Daniel; Wood, Derrick C.; Miller, James M.

    2008-07-01

    Headspace (HS) gas chromatography (GC) is commonly used to analyze samples that contain non-volatiles. In 1996, a new sampling technique called single drop microextraction, SDME, was introduced, and in 2001 it was applied to HS analysis. It is a simple technique that uses equipment normally found in the undergraduate laboratory, making it ideal for instructional use, especially to illustrate HS analysis or as an alternative to solid-phase microextraction (SPME) to which it is very similar. The basic principles and practice of HS-GC using SDME are described, including a complete review of the literature. Some possible experiments are suggested using water and N-methylpyrrolidone (NMP) as solvents.

  19. Propagation of sound in turbulent media

    NASA Technical Reports Server (NTRS)

    Wenzel, A. R.

    1976-01-01

    Perturbation methods commonly used to study the propagation of acoustic waves in turbulent media are reviewed. Emphasis is on those techniques which are applicable to problems involving long-range propagation in the atmosphere and ocean. Characteristic features of the various methods are illustrated by applying them to particular problems. It is shown that conventional perturbation techniques, such as the Born approximation, yield solutions which contain secular terms, and which therefore have a relatively limited range of validity. In contrast, it is found that solutions obtained with the aid of the Rytov method or the smoothing method do not contain secular terms, and consequently have a much greater range of validity.
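
    A standard schematic of the contrast drawn above, with epsilon a small fluctuation-strength parameter; this is the textbook form, not reproduced from the article.

    ```latex
    % Born (additive) expansion: the first-order term grows with range x,
    % i.e. it is secular, limiting the validity of the truncated series:
    p = p_0 + \epsilon\, p_1 + \epsilon^2 p_2 + \cdots, \qquad p_1 \propto x .
    % Rytov (multiplicative) representation: the same first-order quantity
    % moves into the exponent, removing the first-order secular growth:
    p = p_0 \exp\!\bigl(\epsilon\,\psi_1 + \epsilon^2 \psi_2 + \cdots\bigr),
    \qquad \psi_1 = p_1 / p_0 .
    ```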

  20. Nonadditivity of van der Waals forces on liquid surfaces

    NASA Astrophysics Data System (ADS)

    Venkataram, Prashanth S.; Whitton, Jeremy D.; Rodriguez, Alejandro W.

    2016-09-01

    We present an approach for modeling nanoscale wetting and dewetting of textured solid surfaces that exploits recently developed, sophisticated techniques for computing exact long-range dispersive van der Waals (vdW) or (more generally) Casimir forces in arbitrary geometries. We apply these techniques to solve the variational formulation of the Young-Laplace equation and predict the equilibrium shapes of liquid-vacuum interfaces near solid gratings. We show that commonly employed methods of computing vdW interactions based on additive Hamaker or Derjaguin approximations, which neglect important electromagnetic boundary effects, can result in large discrepancies in the shapes and behaviors of liquid surfaces compared to exact methods.

  1. Reducing the worst case running times of a family of RNA and CFG problems, using Valiant's approach.

    PubMed

    Zakov, Shay; Tsur, Dekel; Ziv-Ukelson, Michal

    2011-08-18

    RNA secondary structure prediction is a mainstream bioinformatic domain, and is key to computational analysis of functional RNA. Over more than 30 years, much research has been devoted to defining different variants of RNA structure prediction problems, and to developing techniques for improving prediction quality. Nevertheless, most of the algorithms in this field follow a dynamic programming approach similar to that presented by Nussinov and Jacobson in the late '70s, which typically yields cubic worst-case running time algorithms. Recently, some algorithmic approaches were applied to improve the complexity of these algorithms, motivated by new discoveries in the RNA domain and by the need to efficiently analyze the increasing amount of accumulated genome-wide data. We study Valiant's classical algorithm for Context Free Grammar recognition in sub-cubic time, and extract features that are common to problems on which Valiant's approach can be applied. Based on this, we describe several problem templates, and formulate generic algorithms that use Valiant's technique and can be applied to all problems which abide by these templates, including many problems within the world of RNA Secondary Structures and Context Free Grammars. The algorithms presented in this paper improve the theoretical asymptotic worst-case running time bounds for a large family of important problems. It is also possible that the suggested techniques could be applied to yield a practical speedup for these problems. For some of the problems (such as computing the RNA partition function and base-pair binding probabilities), the presented techniques are the only ones currently known for reducing the asymptotic running time bounds of the standard algorithms.
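
    The cubic-time baseline referred to above is the Nussinov-style recursion; a minimal sketch (maximising base pairs with unit scores, without the Valiant-style speedup) is:

    ```python
    def nussinov_max_pairs(seq, min_loop=3):
        """O(n^3) Nussinov-style dynamic program: max number of base pairs.

        dp[i][j] holds the best score for the subsequence seq[i..j].
        """
        pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
                 ("G", "U"), ("U", "G")}
        n = len(seq)
        if n == 0:
            return 0
        dp = [[0] * n for _ in range(n)]
        for span in range(min_loop + 1, n):        # subsequence width
            for i in range(n - span):
                j = i + span
                best = dp[i][j - 1]                # case: j is unpaired
                for k in range(i, j - min_loop):   # case: j pairs with k
                    if (seq[k], seq[j]) in pairs:
                        left = dp[i][k - 1] if k > i else 0
                        best = max(best, left + 1 + dp[k + 1][j - 1])
                dp[i][j] = best
        return dp[0][n - 1]
    ```

    Valiant's reformulation recasts the inner work of such recursions as matrix products (boolean for CFG recognition, (min,+)-style for the scoring variants), which is where the sub-cubic bounds come from.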

  2. Reducing the worst case running times of a family of RNA and CFG problems, using Valiant's approach

    PubMed Central

    2011-01-01

    Background RNA secondary structure prediction is a mainstream bioinformatic domain, and is key to computational analysis of functional RNA. Over more than 30 years, much research has been devoted to defining different variants of RNA structure prediction problems, and to developing techniques for improving prediction quality. Nevertheless, most of the algorithms in this field follow a dynamic programming approach similar to that presented by Nussinov and Jacobson in the late '70s, which typically yields cubic worst-case running time algorithms. Recently, some algorithmic approaches were applied to improve the complexity of these algorithms, motivated by new discoveries in the RNA domain and by the need to efficiently analyze the increasing amount of accumulated genome-wide data. Results We study Valiant's classical algorithm for Context Free Grammar recognition in sub-cubic time, and extract features that are common to problems on which Valiant's approach can be applied. Based on this, we describe several problem templates, and formulate generic algorithms that use Valiant's technique and can be applied to all problems which abide by these templates, including many problems within the world of RNA Secondary Structures and Context Free Grammars. Conclusions The algorithms presented in this paper improve the theoretical asymptotic worst-case running time bounds for a large family of important problems. It is also possible that the suggested techniques could be applied to yield a practical speedup for these problems. For some of the problems (such as computing the RNA partition function and base-pair binding probabilities), the presented techniques are the only ones currently known for reducing the asymptotic running time bounds of the standard algorithms. PMID:21851589

  3. The use of applied software for the professional training of students studying humanities

    NASA Astrophysics Data System (ADS)

    Sadchikova, A. S.; Rodin, M. M.

    2017-01-01

    Research practice is an integral part of the training of humanities students. Accordingly, the training process should incorporate modern information technologies. This paper examines the most popular applied software products used for data processing in the social sciences. For testing purposes we selected the most commonly preferred professional packages: MS Excel, IBM SPSS Statistics, STATISTICA, and STADIA. The article also reports testing results for the specialized software Prikladnoy Sotsiolog, which is applicable to the preparation stage of research. The specialized software was tested during one term in groups of students studying humanities.

  4. Six stirrup-handled Moche ceramic vessels from pre-Columbian Peru: a technical study applying PIXE spectrometry

    NASA Astrophysics Data System (ADS)

    Swann, C. P.; Caspi, Sara; Carlson, Janice

    1999-04-01

    Much has been said recently concerning the Moche culture of Peru (100 B.C. to 700 A.D.), which preceded the more widely known Inca society. Since the Moche had no written language, what is known about them has come from the artifacts they produced, and from their design and craftsmanship. This study concerns the PIXE analysis of the matrix and the red and white surface patterns of six stirrup-handled earthenware vessels. The results suggest that different techniques were applied to produce the coloring and surface designs, and that a common source of clays was not used in their construction.

  5. Natural hazard metaphors for financial crises

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2001-02-01

    Linguistic metaphors drawn from natural hazards are commonly used at times of financial crisis. A brewing storm, a seismic shock, etc., evoke the abruptness and severity of a market collapse. If the language of windstorms, earthquakes and volcanic eruptions is helpful in illustrating a financial crisis, what about the mathematics of natural catastrophes? Already, earthquake prediction methods have been applied to economic recessions, and volcanic eruption forecasting techniques have been applied to market crashes. The purpose of this contribution is to survey broadly the mathematics of natural catastrophes, so as to convey the range of underlying principles, some of which may serve as mathematical metaphors for financial applications.

  6. Automatic classification of animal vocalizations

    NASA Astrophysics Data System (ADS)

    Clemins, Patrick J.

    2005-11-01

    Bioacoustics, the study of animal vocalizations, has begun to use increasingly sophisticated analysis techniques in recent years. Some common tasks in bioacoustics are repertoire determination, call detection, individual identification, stress detection, and behavior correlation. Each research study, however, uses a wide variety of different measured variables, called features, and classification systems to accomplish these tasks. The well-established field of human speech processing has developed a number of different techniques to perform many of the aforementioned bioacoustics tasks. Mel-frequency cepstral coefficients (MFCCs) and perceptual linear prediction (PLP) coefficients are two popular feature sets. The hidden Markov model (HMM), a statistical model similar to a finite automaton, is the most commonly used supervised classification model and is capable of modeling both temporal and spectral variations. This research designs a framework that applies models from human speech processing to bioacoustic analysis tasks. The development of the generalized perceptual linear prediction (gPLP) feature extraction model is one of the more important novel contributions of the framework. Perceptual information from the species under study can be incorporated into the gPLP feature extraction model to represent the vocalizations as the animals might perceive them. By including this perceptual information and modifying parameters of the HMM classification system, this framework can be applied to a wide range of species. The effectiveness of the framework is shown by analyzing African elephant and beluga whale vocalizations. The features extracted from the African elephant data are used as input to a supervised classification system and compared to results from traditional statistical tests. The gPLP features extracted from the beluga whale data are used in an unsupervised classification system and the results are compared to labels assigned by experts. The development of a framework from which to build animal vocalization classifiers will provide bioacoustics researchers with a consistent platform to analyze and classify vocalizations. A common framework will also allow studies to compare results across species and institutions. In addition, the use of automated classification techniques can speed analysis and uncover behavioral correlations not readily apparent using traditional techniques.
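
    As a rough sketch of the borrowed speech pipeline (standard MFCC features plus a Gaussian HMM, not the dissertation's gPLP model), assuming the librosa and hmmlearn packages:

    ```python
    import numpy as np
    import librosa
    from hmmlearn import hmm

    def train_call_model(wav_paths, n_states=5):
        """Fit one Gaussian HMM to MFCC frames pooled from several recordings."""
        feats, lengths = [], []
        for path in wav_paths:
            y, sr = librosa.load(path, sr=None)                   # native rate
            mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T  # frames as rows
            feats.append(mfcc)
            lengths.append(len(mfcc))
        X = np.concatenate(feats)
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag", n_iter=100)
        model.fit(X, lengths)                                     # Baum-Welch
        return model

    # Classification: train one model per call type, then assign a new
    # vocalization to the model with the highest log-likelihood (model.score).
    ```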

  7. geneCBR: a translational tool for multiple-microarray analysis and integrative information retrieval for aiding diagnosis in cancer research.

    PubMed

    Glez-Peña, Daniel; Díaz, Fernando; Hernández, Jesús M; Corchado, Juan M; Fdez-Riverola, Florentino

    2009-06-18

    Bioinformatics and medical informatics are two research fields that serve the needs of different but related communities. Both domains share the common goal of providing new algorithms, methods and technological solutions to biomedical research, and contributing to the treatment and cure of diseases. Although different microarray techniques have been successfully used to investigate useful information for cancer diagnosis at the gene expression level, the true integration of existing methods into day-to-day clinical practice is still a long way off. Within this context, case-based reasoning emerges as a suitable paradigm specially intended for the development of biomedical informatics applications and decision support systems, given the support and collaboration involved in such a translational development. With the goals of removing barriers against multi-disciplinary collaboration and facilitating the dissemination and transfer of knowledge to real practice, case-based reasoning systems have the potential to be applied to translational research mainly because their computational reasoning paradigm is similar to the way clinicians gather, analyze and process information in their own practice of clinical medicine. In addressing the issue of bridging the existing gap between biomedical researchers and clinicians who work in the domain of cancer diagnosis, prognosis and treatment, we have developed and made accessible a common interactive framework. Our geneCBR system implements a freely available software tool that allows the use of combined techniques that can be applied to gene selection, clustering, knowledge extraction and prediction for aiding diagnosis in cancer research. For biomedical researchers, geneCBR expert mode offers a core workbench for designing and testing new techniques and experiments. For pathologists or oncologists, geneCBR diagnostic mode implements an effective and reliable system that can diagnose cancer subtypes based on the analysis of microarray data using a CBR architecture. For programmers, geneCBR programming mode includes an advanced edition module for run-time modification of previously coded techniques. geneCBR is a new translational tool that can effectively support the integrative work of programmers, biomedical researchers and clinicians working together in a common framework. The code is freely available under the GPL license and can be obtained at http://www.genecbr.org.

  8. Critical Evaluation of Soil Pore Water Extraction Methods on a Natural Soil

    NASA Astrophysics Data System (ADS)

    Orlowski, Natalie; Pratt, Dyan; Breuer, Lutz; McDonnell, Jeffrey

    2017-04-01

    Soil pore water extraction is an important component in ecohydrological studies for the measurement of δ2H and δ18O. The effect of the pore water extraction technique on the resultant isotopic signature is poorly understood. Here we present results of an intercomparison of commonly applied lab-based soil water extraction techniques on a natural soil: high-pressure mechanical squeezing, centrifugation, direct vapor equilibration, microwave extraction, and two types of cryogenic extraction systems. We applied these extraction methods to a natural summer-dry (gravimetric water contents ranging from 8% to 15%), glacio-lacustrine, moderately fine-textured clayey soil, excavated in 10 cm sampling increments to a depth of 1 meter. Isotope results were analyzed via OA-ICOS and compared for each extraction technique that produced liquid water. From our previous intercomparison study of the same extraction techniques, but with standard soils, we discovered that extraction methods are not comparable. We therefore tested the null hypothesis that all extraction techniques would be able to replicate, in a comparable manner, the natural evaporation front occurring in a summer-dry soil. Our results showed that the extraction technique utilized had a significant effect on the soil water isotopic composition. High-pressure mechanical squeezing and vapor equilibration techniques produced similar results with similarly sloped evaporation lines. Due to the nature of the soil properties and dryness, centrifugation was unsuccessful in obtaining pore water for isotopic analysis. The two tested cryogenic extraction systems produced results similar to each other, on a similarly sloped evaporation line, but dissimilar with depth.

  9. Performance and cost characteristics of multi-electron transfer, common ion exchange non-aqueous redox flow batteries

    NASA Astrophysics Data System (ADS)

    Laramie, Sydney M.; Milshtein, Jarrod D.; Breault, Tanya M.; Brushett, Fikile R.; Thompson, Levi T.

    2016-09-01

    Non-aqueous redox flow batteries (NAqRFBs) have recently received considerable attention as promising high energy density, low cost grid-level energy storage technologies. Despite these attractive features, NAqRFBs are still at an early stage of development and innovative design techniques are necessary to improve performance and decrease costs. In this work, we investigate multi-electron transfer, common ion exchange NAqRFBs. Common ion systems decrease the supporting electrolyte requirement, which subsequently improves active material solubility and decreases electrolyte cost. Voltammetric and electrolytic techniques are used to study the electrochemical performance and chemical compatibility of model redox active materials, iron(II) tris(2,2′-bipyridine) tetrafluoroborate (Fe(bpy)3(BF4)2) and ferrocenylmethyl dimethyl ethyl ammonium tetrafluoroborate (Fc1N112-BF4). These results help disentangle complex cycling behavior observed in flow cell experiments. Further, a simple techno-economic model demonstrates the cost benefits of employing common ion exchange NAqRFBs, afforded by decreasing the salt and solvent contributions to total chemical cost. This study highlights two new concepts, common ion exchange and multi-electron transfer, for NAqRFBs through a demonstration flow cell employing model active species. In addition, the compatibility analysis developed for asymmetric chemistries can apply to other promising species, including organics, metal coordination complexes (MCCs) and mixed MCC/organic systems, enabling the design of low cost NAqRFBs.

  10. Integration of different data gap filling techniques to facilitate ...

    EPA Pesticide Factsheets

    Data gap filling techniques are commonly used to predict hazard in the absence of empirical data. The most established techniques are read-across, trend analysis and quantitative structure-activity relationships (QSARs). Toxic equivalency factors (TEFs) are less frequently used data gap filling techniques which are applied to estimate relative potencies for mixtures of chemicals that contribute to an adverse outcome through a common biological target. For example, the TEF approach has been used for dioxin-like effects, comparing individual chemical activity to that of the most toxic dioxin: 2,3,7,8-tetrachlorodibenzo-p-dioxin. The aim of this case study was to determine whether integration of two data gap filling techniques, QSARs and TEFs, improved the predictive outcome for the assessment of a set of polychlorinated biphenyl (PCB) congeners and their mixtures. PCBs are associated with many different adverse effects, including their potential for neurotoxicity, which is the endpoint of interest in this study. The dataset comprised 209 PCB congeners, of which 87 altered in vitro Ca(2+) homeostasis and from which neurotoxic equivalency values (NEQs) were derived. The preliminary objective of this case study was to develop a QSAR model to predict NEQ values for the 122 untested PCB congeners. A decision tree model was developed using the number of position-specific chlorine substitutions on the biphenyl scaffold as a fingerprint descriptor. Three different positional…
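
    A minimal sketch of the fingerprint-plus-decision-tree setup described above; the position encoding mirrors the abstract, but the congeners and NEQ values shown are invented for illustration.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    # One binary flag per substitutable biphenyl position (2-6 and 2'-6').
    POSITIONS = ["2", "3", "4", "5", "6", "2'", "3'", "4'", "5'", "6'"]

    def fingerprint(chlorines):
        """Encode a PCB congener as a 0/1 vector of chlorine positions."""
        return [1 if p in chlorines else 0 for p in POSITIONS]

    # Hypothetical training pairs: substitution patterns with invented NEQs.
    X = np.array([fingerprint({"2", "4", "4'"}),
                  fingerprint({"3", "4", "3'", "4'"}),
                  fingerprint({"2", "2'", "5", "5'"})])
    y = np.array([0.8, 0.1, 0.5])            # illustrative values only

    tree = DecisionTreeRegressor(max_depth=4).fit(X, y)
    unknown = np.array([fingerprint({"2", "4'", "5"})])
    neq_estimate = tree.predict(unknown)     # predicted NEQ for untested congener
    ```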

  11. Evaluation of solution procedures for material and/or geometrically nonlinear structural analysis by the direct stiffness method.

    NASA Technical Reports Server (NTRS)

    Stricklin, J. A.; Haisler, W. E.; Von Riesemann, W. A.

    1972-01-01

    This paper presents an assessment of the solution procedures available for the analysis of inelastic and/or large-deflection structural behavior. A literature survey is given which summarizes the contributions of other researchers to the analysis of structural problems exhibiting material nonlinearities and combined geometric-material nonlinearities. Attention is focused on evaluating the available computation and solution techniques. Each of the solution techniques is developed from a common equation of equilibrium in terms of pseudo forces. The solution procedures are applied to circular plates and shells of revolution in an attempt to compare and evaluate each with respect to computational accuracy, economy, and efficiency. Based on the numerical studies, observations and comments are made with regard to the accuracy and economy of each solution technique.
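
    The common equilibrium equation in pseudo-force form can be written as K u = F + F_p(u); a minimal sketch of the simplest solution procedure of this kind (successive substitution on a generic system, not the authors' code) is:

    ```python
    import numpy as np

    def solve_pseudo_force(K, F, pseudo_force, tol=1e-8, max_iter=200):
        """Successive substitution for K u = F + F_p(u).

        K            : (n, n) linear stiffness matrix
        F            : (n,) applied load vector
        pseudo_force : callable returning the nonlinear correction F_p(u)
        """
        u = np.linalg.solve(K, F)                        # linear starting guess
        for _ in range(max_iter):
            u_new = np.linalg.solve(K, F + pseudo_force(u))
            if np.linalg.norm(u_new - u) <= tol * max(1.0, np.linalg.norm(u_new)):
                return u_new
            u = u_new
        raise RuntimeError("pseudo-force iteration did not converge")
    ```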

  12. Evaluation of phase-diversity techniques for solar-image restoration

    NASA Technical Reports Server (NTRS)

    Paxman, Richard G.; Seldin, John H.; Lofdahl, Mats G.; Scharmer, Goran B.; Keller, Christoph U.

    1995-01-01

    Phase-diversity techniques provide a novel observational method for overcoming the effects of turbulence and instrument-induced aberrations in ground-based astronomy. Two implementations of phase-diversity techniques that differ with regard to noise model, estimator, optimization algorithm, method of regularization, and treatment of edge effects are described. Reconstructions of solar granulation derived by applying these two implementations to common data sets are shown to yield nearly identical images. For both implementations, reconstructions from phase-diverse speckle data (involving multiple realizations of turbulence) are shown to be superior to those derived from conventional phase-diversity data (involving a single realization). Phase-diverse speckle reconstructions are shown to achieve near diffraction-limited resolution and are validated by internal and external consistency tests, including a comparison with a reconstruction using a well-accepted speckle-imaging method.

  13. Automated quantitative micro-mineralogical characterization for environmental applications

    USGS Publications Warehouse

    Smith, Kathleen S.; Hoal, K.O.; Walton-Day, Katherine; Stammer, J.G.; Pietersen, K.

    2013-01-01

    Characterization of ore and waste-rock material using automated quantitative micro-mineralogical techniques (e.g., QEMSCAN® and MLA) has the potential to complement traditional acid-base accounting and humidity cell techniques when predicting acid generation and metal release. These characterization techniques, which most commonly are used for metallurgical, mineral-processing, and geometallurgical applications, can be broadly applied throughout the mine-life cycle to include numerous environmental applications. Critical insights into mineral liberation, mineral associations, particle size, particle texture, and mineralogical residence phase(s) of environmentally important elements can be used to anticipate potential environmental challenges. Resources spent on initial characterization result in lower uncertainties of potential environmental impacts and possible cost savings associated with remediation and closure. Examples illustrate mineralogical and textural characterization of fluvial tailings material from the upper Arkansas River in Colorado.

  14. An automated cross-correlation based event detection technique and its application to surface passive data set

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike

    2013-01-01

    In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days to months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
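
    A minimal numpy sketch of the two ingredients: the conventional STA/LTA energy ratio for one trace, and a zero-lag correlation of those ratios across traces as a simplified stand-in for the full cross-correlation; the window lengths are illustrative.

    ```python
    import numpy as np

    def sta_lta(trace, n_sta=50, n_lta=500):
        """Running short-term/long-term average energy ratio for one trace."""
        energy = np.asarray(trace, dtype=float) ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta
        lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta
        n = len(lta)                              # lta is the shorter series
        return sta[-n:] / np.maximum(lta, 1e-12)  # align end samples, no /0

    def ratio_coherence(traces):
        """Mean zero-lag correlation of STA/LTA ratios over all trace pairs."""
        ratios = [sta_lta(t) for t in traces]
        m = min(len(r) for r in ratios)
        R = np.array([(r[:m] - r[:m].mean()) / (r[:m].std() + 1e-12)
                      for r in ratios])
        C = (R @ R.T) / m                         # pairwise correlations
        iu = np.triu_indices(len(R), k=1)
        return C[iu].mean()   # near 1 at all stations suggests a regional event
    ```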

  15. Parameter Estimation in Atmospheric Data Sets

    NASA Technical Reports Server (NTRS)

    Wenig, Mark; Colarco, Peter

    2004-01-01

    In this study the structure tensor technique is used to estimate dynamical parameters in atmospheric data sets. The structure tensor is a common tool for estimating motion in image sequences. This technique can be extended to estimate other dynamical parameters such as diffusion constants or exponential decay rates. A general mathematical framework was developed for the direct estimation of the physical parameters that govern the underlying processes from image sequences. This estimation technique can be adapted to the specific physical problem under investigation, so it can be used in a variety of applications in trace gas, aerosol, and cloud remote sensing. As a test scenario this technique will be applied to modeled dust data. In this case vertically integrated dust concentrations were used to derive wind information. Those results can be compared to the wind vector fields which served as input to the model. Based on this analysis, a method to compute atmospheric data parameter fields will be presented.
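
    A minimal sketch of the structure-tensor idea for the motion case: Gaussian-smoothed products of image derivatives form the tensor, and local normal equations yield a velocity per pixel. This is a simplified instance, not the paper's general parameter-estimation framework.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def structure_tensor_flow(frame1, frame2, sigma=3.0):
        """Per-pixel velocity from the spatio-temporal structure tensor.

        Solves, at every pixel, the smoothed normal equations
            [Jxx Jxy] [u]    [Jxt]
            [Jxy Jyy] [v] = -[Jyt]
        where each J is a Gaussian-smoothed product of image derivatives.
        """
        f1 = np.asarray(frame1, dtype=float)
        f2 = np.asarray(frame2, dtype=float)
        Iy, Ix = np.gradient(f1)                 # spatial derivatives
        It = f2 - f1                             # temporal derivative
        Jxx = gaussian_filter(Ix * Ix, sigma)
        Jxy = gaussian_filter(Ix * Iy, sigma)
        Jyy = gaussian_filter(Iy * Iy, sigma)
        Jxt = gaussian_filter(Ix * It, sigma)
        Jyt = gaussian_filter(Iy * It, sigma)
        det = Jxx * Jyy - Jxy ** 2
        det = np.where(np.abs(det) < 1e-12, np.nan, det)  # degenerate pixels
        u = (-Jyy * Jxt + Jxy * Jyt) / det
        v = ( Jxy * Jxt - Jxx * Jyt) / det
        return u, v
    ```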

  16. Neural networks for dimensionality reduction of fluorescence spectra and prediction of drinking water disinfection by-products.

    PubMed

    Peleato, Nicolas M; Legge, Raymond L; Andrews, Robert C

    2018-06-01

    The use of fluorescence data coupled with neural networks for improved predictability of drinking water disinfection by-products (DBPs) was investigated. Novel application of autoencoders to process high-dimensional fluorescence data was related to common dimensionality reduction techniques of parallel factors analysis (PARAFAC) and principal component analysis (PCA). The proposed method was assessed based on component interpretability as well as for prediction of organic matter reactivity to formation of DBPs. Optimal prediction accuracies on a validation dataset were observed with an autoencoder-neural network approach or by utilizing the full spectrum without pre-processing. Latent representation by an autoencoder appeared to mitigate overfitting when compared to other methods. Although DBP prediction error was minimized by other pre-processing techniques, PARAFAC yielded interpretable components which resemble fluorescence expected from individual organic fluorophores. Through analysis of the network weights, fluorescence regions associated with DBP formation can be identified, representing a potential method to distinguish reactivity between fluorophore groupings. However, distinct results due to the applied dimensionality reduction approaches were observed, dictating a need for considering the role of data pre-processing in the interpretability of the results. In comparison to common organic measures currently used for DBP formation prediction, fluorescence was shown to improve prediction accuracies, with improvements to DBP prediction best realized when appropriate pre-processing and regression techniques were applied. The results of this study show promise for the potential application of neural networks to best utilize fluorescence EEM data for prediction of organic matter reactivity. Copyright © 2018 Elsevier Ltd. All rights reserved.
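
    A minimal sketch of the autoencoder step, assuming Keras; the layer widths, epochs, and latent size are illustrative, not the paper's configuration.

    ```python
    import numpy as np
    from tensorflow import keras

    def fit_encoder(X, n_latent=10):
        """Train an autoencoder on flattened EEM spectra X (samples x features)."""
        n_features = X.shape[1]
        encoder = keras.Sequential([
            keras.layers.Input(shape=(n_features,)),
            keras.layers.Dense(128, activation="relu"),
            keras.layers.Dense(n_latent, activation="relu"),   # latent code
        ])
        decoder = keras.Sequential([
            keras.layers.Dense(128, activation="relu"),
            keras.layers.Dense(n_features, activation="linear"),
        ])
        autoencoder = keras.Sequential([encoder, decoder])
        autoencoder.compile(optimizer="adam", loss="mse")
        autoencoder.fit(X, X, epochs=100, batch_size=32, verbose=0)
        return encoder    # encoder.predict(X) gives the reduced representation

    # The latent features would then feed a small regression network that
    # predicts DBP formation, in place of PARAFAC or PCA scores.
    ```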

  17. Trends in health sciences library and information science research: an analysis of research publications in the Bulletin of the Medical Library Association and Journal of the Medical Library Association from 1991 to 2007*

    PubMed Central

    Gore, Sally A.; Nordberg, Judith M.; Palmer, Lisa A.

    2009-01-01

    Objective: This study analyzed trends in research activity as represented in the published research in the leading peer-reviewed professional journal for health sciences librarianship. Methodology: Research articles were identified from the Bulletin of the Medical Library Association and Journal of the Medical Library Association (1991–2007). Using content analysis and bibliometric techniques, data were collected for each article on the (1) subject, (2) research method, (3) analytical technique used, (4) number of authors, (5) number of citations, (6) first author affiliation, and (7) funding source. The results were compared to a previous study, covering the period 1966 to 1990, to identify changes over time. Results: Of the 930 articles examined, 474 (51%) were identified as research articles. Survey (n = 174, 37.1%) was the most common methodology employed, quantitative descriptive statistics (n = 298, 63.5%) the most used analytical technique, and applied topics (n = 332, 70%) the most common type of subject studied. The majority of first authors were associated with an academic health sciences library (n = 264, 55.7%). Only 27.4% (n = 130) of studies identified a funding source. Conclusion: This study's findings demonstrate that progress is being made in health sciences librarianship research. There is, however, room for improvement in terms of research methodologies used, proportion of applied versus theoretical research, and elimination of barriers to conducting research for practicing librarians. PMID:19626146

  18. "TuNa-saving" endoscopic medial maxillectomy: a surgical technique for maxillary inverted papilloma.

    PubMed

    Pagella, Fabio; Pusateri, Alessandro; Matti, Elina; Avato, Irene; Zaccari, Dario; Emanuelli, Enzo; Volo, Tiziana; Cazzador, Diego; Citraro, Leonardo; Ricci, Giampiero; Tomacelli, Giovanni Leo

    2017-07-01

    The maxillary sinus is the most common site of sinonasal inverted papilloma. Endoscopic sinus surgery, in particular endoscopic medial maxillectomy, is currently the gold standard for treatment of maxillary sinus papilloma. Although a common technique, complications such as stenosis of the lacrimal pathway and consequent development of epiphora are still possible. To avoid these problems, we propose a modification of this surgical technique that preserves the head of the inferior turbinate and the nasolacrimal duct. A retrospective analysis was performed on patients treated for maxillary inverted papilloma in three tertiary medical centres between 2006 and 2014. Pedicle-oriented endoscopic surgery principles were applied and, in select cases where the tumour pedicle was located on the anterior wall, a modified endoscopic medial maxillectomy was carried out as described in this paper. From 2006 to 2014 a total of 84 patients were treated. A standard endoscopic medial maxillectomy was performed in 55 patients (65.4%), while the remaining 29 (34.6%) had a modified technique performed. Three recurrences (3/84; 3.6%) were observed after a minimum follow-up of 24 months. A new surgical approach for select cases of maxillary sinus inverted papilloma is proposed in this paper. In this technique, the endoscopic medial maxillectomy was performed while preserving the head of the inferior turbinate and the nasolacrimal duct ("TuNa-saving"). This technique allowed for good visualization of the maxillary sinus, good oncological control and a reduction in the rate of complications.

  19. Antegrade Dissection and Reentry as Part of the Hybrid Chronic Total Occlusion Revascularization Strategy: A Subanalysis of the RECHARGE Registry (Registry of CrossBoss and Hybrid Procedures in France, the Netherlands, Belgium and United Kingdom).

    PubMed

    Maeremans, Joren; Dens, Jo; Spratt, James C; Bagnall, Alan J; Stuijfzand, Wynand; Nap, Alexander; Agostoni, Pierfrancesco; Wilson, William; Hanratty, Colm G; Wilson, Simon; Faurie, Benjamin; Avran, Alexandre; Bressollette, Erwan; Egred, Mohaned; Knaapen, Paul; Walsh, Simon

    2017-06-01

    Development of the CrossBoss and Stingray devices for antegrade dissection and reentry (ADR) of chronic total occlusions has improved historically suboptimal outcomes. However, the outcomes, safety, and failure modes of the technique have to be studied in a larger patient cohort. This preplanned substudy of the RECHARGE registry (Registry of CrossBoss and Hybrid Procedures in France, the Netherlands, Belgium and United Kingdom) aims to evaluate the value and use of ADR and determine its future position in contemporary chronic total occlusion intervention. Patients were selected if an ADR strategy was applied. Outcomes, safety, and failure modes of the technique were assessed. The ADR technique was used in 23% (n=292/1253) of the RECHARGE registry and was mainly applied for complex lesions (Japanese chronic total occlusion score=2.7±1.1). ADR was the primary strategy in 30% (n=88/292), of which 67% were successful. Bail-out ADR strategies were successful in 63% (n=133/210). The Controlled ADR (ie, combined CrossBoss-Stingray) subtype was applied most frequently (32%; n=93/292) and successfully (81%; n=75/93). Overall per-lesion success rate was 78% (n=229/292), after use of additional bail-out strategies. The inability to reach the distal target zone (n=48/100) or to reenter (n=43/100) most commonly led to failure. ADR-associated major events occurred in 3.4% (n=10/292). Although mostly applied as a bail-out strategy for complex lesions, the frequency, outcomes, and low complication rate of the ADR technique and its subtypes confirm the benefit and value of the technique in hybrid chronic total occlusion percutaneous coronary intervention, especially when antegrade wiring or retrograde approaches are not feasible. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02075372. © 2017 American Heart Association, Inc.

  20. Synthetic Graphene Grown by Chemical Vapor Deposition on Copper Foils

    DTIC Science & Technology

    2013-04-11

    (b) Transparent PMMA/graphene membrane floating on copper etchant. (c) Three layers of stacked CVD graphene on a cover glass made by consecutively... insulating substrate is a critical step for fabricating electronic devices. PMMA-assisted transfer techniques are commonly applied because of their simplicity and repeatability. In a typical transfer, a graphene film on a Cu substrate was first coated with PMMA (950PMMA-A4, MicroChem) by spin coating.

  1. Ultrascalable Techniques Applied to the Global Intelligence Community Information Awareness Common Operating Picture (IA COP)

    DTIC Science & Technology

    2005-11-01

    ... more random. Autonomous systems can exchange entropy statistics for packet streams with no confidentiality concerns, potentially enabling timely and ... analysis began with simulation results, which were validated by analysis of actual data from an Autonomous System (AS). A scale-free network is one ... traffic (for example, time series of flux at given nodes and mean path length). Outputs the time series from any node queried. Calculates ...

  2. Application of separable parameter space techniques to multi-tracer PET compartment modeling.

    PubMed

    Zhang, Jeff L; Michael Morey, A; Kadrmas, Dan J

    2016-02-07

    Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg-Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models.
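
    The separable idea can be seen on a toy two-exponential model: only the nonlinear rates are searched, while the linear amplitudes are recovered exactly inside the objective. A minimal sketch assuming scipy (the paper's compartment-model equations are more involved):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def varpro_fit(t, y, k0=(0.1, 1.0)):
        """Variable-projection fit of y(t) = a1*exp(-k1*t) + a2*exp(-k2*t)."""
        def basis(k):
            return np.column_stack([np.exp(-k[0] * t), np.exp(-k[1] * t)])

        def residual_norm(k):
            B = basis(k)
            a, *_ = np.linalg.lstsq(B, y, rcond=None)   # exact linear sub-fit
            return np.sum((y - B @ a) ** 2)

        # Only the two nonlinear rates are searched; the 4-parameter fit is
        # thereby reduced to a 2-dimensional optimisation.
        res = minimize(residual_norm, x0=np.asarray(k0), method="Nelder-Mead")
        B = basis(res.x)
        a, *_ = np.linalg.lstsq(B, y, rcond=None)
        return a, res.x
    ```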

  3. Application of separable parameter space techniques to multi-tracer PET compartment modeling

    NASA Astrophysics Data System (ADS)

    Zhang, Jeff L.; Morey, A. Michael; Kadrmas, Dan J.

    2016-02-01

    Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg-Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models.

  4. Copy-move forgery detection through stationary wavelets and local binary pattern variance for forensic analysis in digital images.

    PubMed

    Mahmood, Toqeer; Irtaza, Aun; Mehmood, Zahid; Tariq Mahmood, Muhammad

    2017-10-01

    The most common image tampering, often for malicious purposes, is to copy a region of the same image and paste it to hide some other region. As both regions usually have the same texture properties, this artifact is invisible to viewers, and the credibility of the image becomes questionable in proof-centered applications. Hence, means are required to validate the integrity of the image and identify the tampered regions. Therefore, this study presents an efficient way of copy-move forgery detection (CMFD) through local binary pattern variance (LBPV) over the low approximation components of the stationary wavelets. The CMFD technique presented in this paper is applied over circular regions to better address possible post-processing operations. The proposed technique is evaluated on the CoMoFoD and Kodak lossless true color image (KLTCI) datasets in the presence of translation, flipping, blurring, rotation, scaling, color reduction, brightness change and multiple forged regions in an image. The evaluation reveals the prominence of the proposed technique compared to the state of the art. Consequently, the proposed technique can reliably be applied to detect modified regions, and the benefits can be obtained in journalism, law enforcement, the judiciary, and other proof-critical domains. Copyright © 2017 Elsevier B.V. All rights reserved.
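
    A minimal sketch of the two building blocks named above, assuming scikit-image and PyWavelets; circular-block handling, feature matching, and post-processing are omitted, and the parameters are illustrative.

    ```python
    import numpy as np
    import pywt
    from skimage.feature import local_binary_pattern

    def lbpv_descriptor(gray, P=8, R=1.0):
        """LBP-variance histogram of the stationary-wavelet approximation band.

        Image sides must be even for a level-1 SWT.
        """
        cA, _details = pywt.swt2(np.asarray(gray, dtype=float),
                                 "haar", level=1)[0]
        lbp = local_binary_pattern(cA, P, R, method="uniform")  # codes 0..P+1
        var = local_binary_pattern(cA, P, R, method="var")      # local variance
        n_bins = P + 2
        hist = np.zeros(n_bins)
        # LBPV: accumulate each pixel's variance into its LBP-code bin.
        np.add.at(hist, lbp.astype(int).ravel(), np.nan_to_num(var).ravel())
        total = hist.sum()
        return hist / total if total > 0 else hist
    ```

    Descriptors computed on overlapping circular blocks would then be matched, for example by a distance threshold, to flag duplicated regions.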

  5. Deriving Function-failure Similarity Information for Failure-free Rotorcraft Component Design

    NASA Technical Reports Server (NTRS)

    Roberts, Rory A.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Performance and safety are the top concerns of high-risk aerospace applications at NASA. Eliminating or reducing performance and safety problems can be achieved with a thorough understanding of potential failure modes in the design that lead to these problems. The majority of techniques use prior knowledge and experience, as well as Failure Modes and Effects Analyses, to determine the potential failure modes of aircraft. The aircraft design needs to be passed through a general technique to ensure that every potential failure mode is considered, while avoiding spending time on improbable failure modes. In this work, this is accomplished by mapping failure modes to certain components, which are described by their functionality. In turn, the failure modes are then linked to the basic functions that are carried out within the components of the aircraft. Using the technique proposed in this paper, designers can examine the basic functions and select appropriate analyses to eliminate or design out the potential failure modes. This method was previously applied to a simple rotating machine test rig with basic functions that are common to a rotorcraft. In this paper, the technique is applied to the engine and power train of a rotorcraft, using failures and functions obtained from accident reports and engineering drawings.

  6. Cervical spine mobilisation forces applied by physiotherapy students.

    PubMed

    Snodgrass, Suzanne J; Rivett, Darren A; Robertson, Val J; Stojanovski, Elizabeth

    2010-06-01

    Postero-anterior (PA) mobilisation is commonly used in cervical spine treatment and included in physiotherapy curricula. The manual forces that students apply while learning cervical mobilisation are not known. Quantifying these forces informs the development of strategies for learning to apply cervical mobilisation effectively and safely. This study describes the mechanical properties of cervical PA mobilisation techniques applied by students, and investigates factors associated with force application. Physiotherapy students (n=120) mobilised one of 32 asymptomatic subjects. Students applied Grades I to IV central and unilateral PA mobilisation to C2 and C7 of one asymptomatic subject. Manual forces were measured in three directions using an instrumented treatment table. Spinal stiffness of mobilised subjects was measured at C2 and C7 using a device that applied a standard oscillating force while measuring this force and its concurrent displacement. Analysis of variance was used to determine differences between techniques and grades, intraclass correlation coefficients (ICC) were used to calculate the inter- and intrastudent repeatability of forces, and linear regression was used to determine the associations between applied forces and characteristics of students and mobilised subjects. Mobilisation forces increased from Grades I to IV (highest mean peak force, Grade IV C7 central PA technique: 63.7N). Interstudent reliability was poor [ICC(2,1)=0.23, 95% confidence interval (CI) 0.14 to 0.43], but intrastudent repeatability of forces was somewhat better (0.83, 95% CI 0.81 to 0.86). Higher applied force was associated with greater C7 stiffness, increased frequency of thumb pain, male gender of the student or mobilised subject, and a student being earlier in their learning process. Lower forces were associated with greater C2 stiffness. This study describes the cervical mobilisation forces applied by students, and the characteristics of the student and mobilised subject associated with these forces. These results form a basis for the development of strategies to provide objective feedback to students learning to apply cervical mobilisation. Copyright 2009 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  7. A basic review on the inferior alveolar nerve block techniques.

    PubMed

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry and many modifications of the conventional nerve block have been recently described in the literature. Selecting the best technique by the dentist or surgeon depends on many factors including the success rate and complications related to the selected technique. Dentists should be aware of the available current modifications of the inferior alveolar nerve block techniques in order to effectively choose between these modifications. Some operators may encounter difficulty in identifying the anatomical landmarks which are useful in applying the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure and the failure rate of inferior alveolar nerve block has been reported to be 20-25% which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve will be given together with a description of its both conventional and modified blocking techniques; in addition, an overview of the complications which may result from the application of this important technique will be mentioned.

  8. A basic review on the inferior alveolar nerve block techniques

    PubMed Central

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry and many modifications of the conventional nerve block have been recently described in the literature. Selecting the best technique by the dentist or surgeon depends on many factors including the success rate and complications related to the selected technique. Dentists should be aware of the available current modifications of the inferior alveolar nerve block techniques in order to effectively choose between these modifications. Some operators may encounter difficulty in identifying the anatomical landmarks which are useful in applying the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure and the failure rate of inferior alveolar nerve block has been reported to be 20-25% which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve will be given together with a description of its both conventional and modified blocking techniques; in addition, an overview of the complications which may result from the application of this important technique will be mentioned. PMID:25886095

  9. Coupling Computer-Aided Process Simulation and ...

    EPA Pesticide Factsheets

    A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa™ process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable…

  10. Involuntary eye motion correction in retinal optical coherence tomography: Hardware or software solution?

    PubMed

    Baghaie, Ahmadreza; Yu, Zeyun; D'Souza, Roshan M

    2017-04-01

    In this paper, we review state-of-the-art techniques to correct eye motion artifacts in Optical Coherence Tomography (OCT) imaging. The methods for eye motion artifact reduction can be categorized into two major classes: (1) hardware-based techniques and (2) software-based techniques. In the first class, additional hardware is mounted onto the OCT scanner to gather information about the eye motion patterns during OCT data acquisition. This information is later processed and applied to the OCT data for creating an anatomically correct representation of the retina, either in an offline or online manner. In software based techniques, the motion patterns are approximated either by comparing the acquired data to a reference image, or by considering some prior assumptions about the nature of the eye motion. Careful investigations done on the most common methods in the field provides invaluable insight regarding future directions of the research in this area. The challenge in hardware-based techniques lies in the implementation aspects of particular devices. However, the results of these techniques are superior to those obtained from software-based techniques because they are capable of capturing secondary data related to eye motion during OCT acquisition. Software-based techniques on the other hand, achieve moderate success and their performance is highly dependent on the quality of the OCT data in terms of the amount of motion artifacts contained in them. However, they are still relevant to the field since they are the sole class of techniques with the ability to be applied to legacy data acquired using systems that do not have extra hardware to track eye motion. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    NASA Astrophysics Data System (ADS)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS generates a comparatively low financial and spatial footprint compared with common fluorescence based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNA concentrations commonly exist in relatively low concentrations, amplification methods (e.g. PCR) are therefore required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  12. 3D Image Analysis of Geomaterials using Confocal Microscopy

    NASA Astrophysics Data System (ADS)

    Mulukutla, G.; Proussevitch, A.; Sahagian, D.

    2009-05-01

    Confocal microscopy is one of the most significant advances in optical microscopy of the last century. It is widely used in the biological sciences, but its application to geomaterials lingers due to a number of technical problems. The technique can potentially perform non-invasive testing of a laser-illuminated, fluorescing sample, using a unique optical-sectioning capability that rejects out-of-focus light before it reaches the confocal aperture. Fluorescence in geomaterials is commonly induced using epoxy doped with a fluorochrome that is impregnated into the sample to enable discrimination of various features such as void space or material boundaries. However, for many geomaterials this method cannot be used, because they do not naturally fluoresce and because epoxy cannot be impregnated into inaccessible parts of the sample due to lack of permeability. As a result, confocal images of most geomaterials that have not undergone extensive sample preparation are of poor quality and lack the image and edge contrast necessary to apply commonly used segmentation techniques for quantitative study of features such as vesicularity and internal structure. In our present work, we are developing a methodology to conduct a quantitative 3D analysis of images of geomaterials collected using a confocal microscope with a minimal amount of prior sample preparation and no added fluorescence. Two sample geomaterials, a volcanic melt sample and a crystal chip containing fluid inclusions, are used to assess the feasibility of the method. A step-by-step process of image analysis includes the application of image filtration to enhance the edges or material interfaces, and is based on two segmentation techniques: geodesic active contours and region competition. Both techniques have been applied extensively to the analysis of medical MRI images to segment anatomical structures. Preliminary analysis suggests that there is distortion in the shapes of the segmented vesicles, vapor bubbles, and void spaces due to the optical measurements, so corrective actions are being explored. This will establish a practical and reliable framework for an adaptive 3D image processing technique for the analysis of geomaterials using confocal microscopy.
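
    Of the two segmentation techniques named, geodesic active contours has a readily available morphological implementation; a minimal sketch assuming a recent scikit-image (illustrative parameters, not the authors' pipeline):

    ```python
    import numpy as np
    from skimage.segmentation import (disk_level_set,
                                      inverse_gaussian_gradient,
                                      morphological_geodesic_active_contour)

    def segment_slice(slice2d, n_iter=200):
        """Evolve a shrinking contour toward boundaries in one confocal slice."""
        img = np.asarray(slice2d, dtype=float)
        gimage = inverse_gaussian_gradient(img)     # small near strong edges
        init = disk_level_set(img.shape)            # large initial disk
        mask = morphological_geodesic_active_contour(
            gimage, n_iter, init_level_set=init,
            smoothing=2, balloon=-1)                # balloon<0: contract to edges
        return mask.astype(bool)
    ```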

  13. Artificial intelligence in medicine.

    PubMed Central

    Ramesh, A. N.; Kambhampati, C.; Monson, J. R. T.; Drew, P. J.

    2004-01-01

    INTRODUCTION: Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment, and outcome prediction in many clinical scenarios. METHODS: Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of different artificial intelligence techniques is presented in this paper, along with a review of important clinical applications. RESULTS: The proficiency of artificial intelligence techniques has been explored in almost every field of medicine. The artificial neural network was the most commonly used analytical tool, whilst other artificial intelligence techniques such as fuzzy expert systems, evolutionary computation, and hybrid intelligent systems have all been used in different clinical settings. DISCUSSION: Artificial intelligence techniques have the potential to be applied in almost every field of medicine. There is a need for appropriately designed clinical trials before these emergent techniques find application in real clinical settings. PMID:15333167

  14. Errorless-based techniques can improve route finding in early Alzheimer's disease: a case study.

    PubMed

    Provencher, Véronique; Bier, Nathalie; Audet, Thérèse; Gagnon, Lise

    2008-01-01

    Topographical disorientation is a common and early manifestation of dementia of the Alzheimer type, which threatens independence in activities of daily living. Errorless-based techniques appear to be effective in helping patients with amnesia to learn routes, but little is known about their effectiveness in early dementia of the Alzheimer type. A 77-year-old woman with dementia of the Alzheimer type had difficulty finding her way around her seniors' residence, which reduced her social activities. This study used an ABA design (A is the baseline and B is the intervention) with multiple baselines across routes for going to the rosary (target), laundry, and game rooms (controls). The errorless-based intervention was applied to 2 of the 3 routes. Analyses showed significant improvement only for the routes learned with errorless-based techniques. Following the study, the participant increased her topographical knowledge of her surroundings. Route-learning interventions based on errorless techniques appear to be a promising approach for improving independence in early dementia of the Alzheimer type.

  15. Artificial intelligence in medicine.

    PubMed

    Ramesh, A N; Kambhampati, C; Monson, J R T; Drew, P J

    2004-09-01

    Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment, and outcome prediction in many clinical scenarios. Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of different artificial intelligence techniques is presented in this paper, along with a review of important clinical applications. The proficiency of artificial intelligence techniques has been explored in almost every field of medicine. The artificial neural network was the most commonly used analytical tool, whilst other artificial intelligence techniques such as fuzzy expert systems, evolutionary computation, and hybrid intelligent systems have all been used in different clinical settings. Artificial intelligence techniques have the potential to be applied in almost every field of medicine. There is a need for appropriately designed clinical trials before these emergent techniques find application in real clinical settings.

  16. Non-invasive method for quantitative evaluation of exogenous compound deposition on skin.

    PubMed

    Stamatas, Georgios N; Wu, Jeff; Kollias, Nikiforos

    2002-02-01

    Topical application of active compounds on skin is common to both pharmaceutical and cosmetic industries. Quantification of the concentration of a compound deposited on the skin is important in determining the optimum formulation to deliver the pharmaceutical or cosmetic benefit. The most commonly used techniques to date are either invasive or not easily reproducible. In this study, we have developed a noninvasive alternative to these techniques based on spectrofluorimetry. A mathematical model based on diffusion approximation theory is utilized to correct fluorescence measurements for the attenuation caused by endogenous skin chromophore absorption. The limitation is that the compound of interest has to be either fluorescent itself or fluorescently labeled. We used the method to detect topically applied salicylic acid. Based on the mathematical model a calibration curve was constructed that is independent of endogenous chromophore concentration. We utilized the method to localize salicylic acid in epidermis and to follow its dynamics over a period of 3 d.

  17. Effects on Diagnostic Parameters After Removing Additional Synchronous Gear Meshes

    NASA Technical Reports Server (NTRS)

    Decker, Harry J.

    2003-01-01

    Gear cracks are typically difficult to diagnose with sufficient lead time before catastrophic damage occurs; significant damage must be present before detection algorithms respond. Frequently there are multiple gear meshes on a single shaft. Since they are all synchronous with the shaft frequency, the commonly used synchronous averaging technique is ineffective at removing the effects of the other gear meshes. Carefully applying a filter to these extraneous gear mesh frequencies can reduce the overall vibration signal and increase the accuracy of commonly used vibration metrics. The vibration signals from three seeded-fault tests were analyzed using this filtering procedure. Both the filtered and unfiltered vibration signals were then analyzed using common fault detection metrics and compared. The tests were conducted on aerospace-quality spur gears in a test rig, at speeds ranging from 2500 to 5000 revolutions per minute and torques from 184 to 228 percent of design load. The inability to detect these cracks with high confidence results from the high loading, which causes fast fracture as opposed to stable crack growth. The results indicate that these techniques do not currently produce an indication of damage that significantly exceeds experimental scatter.
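
    The abstract does not give the filter design; one plausible realization of "filtering out an extraneous synchronous gear mesh" is a cascade of IIR notch filters at that mesh frequency and its first harmonics. The sketch below is a hypothetical Python example; the sample rate, tooth counts, and Q factor are invented for illustration.

      import numpy as np
      from scipy.signal import iirnotch, filtfilt

      fs = 50000.0            # sample rate, Hz (assumed)
      shaft_hz = 5000 / 60.0  # shaft frequency at 5000 rpm
      teeth_other = 40        # tooth count of the extraneous mesh (assumed)

      def remove_mesh(signal, mesh_hz, n_harmonics=3, q=30.0):
          # Notch out the mesh frequency and its first few harmonics.
          for k in range(1, n_harmonics + 1):
              b, a = iirnotch(k * mesh_hz, q, fs=fs)
              signal = filtfilt(b, a, signal)
          return signal

      # Synthetic stand-in for a synchronous average with two mesh tones.
      t = np.arange(0, 1.0, 1 / fs)
      sig = (np.sin(2*np.pi*shaft_hz*teeth_other*t)   # extraneous mesh
             + 0.3*np.sin(2*np.pi*shaft_hz*25*t))     # mesh of interest
      filtered = remove_mesh(sig, shaft_hz * teeth_other)
      # Fault metrics (e.g., FM4) would then be computed on `filtered`.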

  18. A visual analysis of multi-attribute data using pixel matrix displays

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel; Schreck, Tobias

    2007-01-01

    Charts and tables are commonly used to visually analyze data. These graphics are simple and easy to understand, but charts show only highly aggregated data and present only a limited number of data values, while tables often show too many. As a consequence, these graphics may either lose or obscure important information, so different techniques are required to monitor complex datasets. Users need more powerful visualization techniques to digest and compare detailed multi-attribute data when analyzing the health of their business. This paper proposes an innovative solution based on the use of pixel-matrix displays to represent transaction-level information. With pixel matrices, users can visualize areas of importance at a glance, a capability not provided by common charting techniques. We present solutions that use colored pixel matrices in (1) charts for visualizing data patterns and discovering exceptions, (2) tables for visualizing correlations and finding root causes, and (3) time series for visualizing the evolution of long-running transactions. The solutions have been applied with success to product sales, Internet network performance analysis, and service contract applications, demonstrating the benefits of our method over conventional graphics. The method is especially useful when detailed information is a key part of the analysis.

  19. Kalman filter techniques for accelerated Cartesian dynamic cardiac imaging.

    PubMed

    Feng, Xue; Salerno, Michael; Kramer, Christopher M; Meyer, Craig H

    2013-05-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome, and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and signal-to-noise ratio. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view-sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. Copyright © 2012 Wiley Periodicals, Inc.
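
    For readers unfamiliar with the recursion, a minimal linear Kalman predict/update step in Python/NumPy is sketched below; the state and measurement operators are generic placeholders, not the paper's Cartesian dynamic-MRI model.

      import numpy as np

      def kalman_step(x, P, z, F, H, Q, R):
          # Predict: propagate the state estimate and covariance one frame.
          x = F @ x
          P = F @ P @ F.T + Q
          # Update: correct with the new measurement z.
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ (z - H @ x)
          P = (np.eye(len(x)) - K @ H) @ P
          return x, P

      # Toy usage: an 8-element state observed through a 4 x 8 operator.
      rng = np.random.default_rng(1)
      n, m = 8, 4
      F, H = np.eye(n), rng.standard_normal((m, n))
      x, P = np.zeros(n), np.eye(n)
      for _ in range(10):
          z = H @ rng.standard_normal(n)   # stand-in measurement
          x, P = kalman_step(x, P, z, F, H, 0.01*np.eye(n), 0.1*np.eye(m))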

  20. Kalman Filter Techniques for Accelerated Cartesian Dynamic Cardiac Imaging

    PubMed Central

    Feng, Xue; Salerno, Michael; Kramer, Christopher M.; Meyer, Craig H.

    2012-01-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories, because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and SNR. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. PMID:22926804

  1. Estimating Spectra from Photometry

    NASA Astrophysics Data System (ADS)

    Kalmbach, J. Bryce; Connolly, Andrew J.

    2017-12-01

    Measuring the physical properties of galaxies, such as redshift, frequently requires the use of spectral energy distributions (SEDs). SED template sets are, however, often small in number and cover limited portions of photometric color space. Here we present a new method to estimate SEDs as a function of color from a small training set of template SEDs. We first cover the mathematical background behind the technique before demonstrating our ability to reconstruct spectra based upon colors, and we then compare our results to other common interpolation and extrapolation methods. When the photometric filters and spectra overlap, we show that the error in the estimated spectra is reduced by more than 65% compared to the more commonly used techniques. We also show an expansion of the method to wavelengths beyond the range of the photometric filters. Finally, we demonstrate the usefulness of our technique by generating 50 additional SED templates from an original set of 10 and by applying the new set to photometric redshift estimation. We are able to reduce the photometric redshift standard deviation by at least 22.0% and the outlier-rejected bias by over 86.2% compared to the original set for z ≤ 3.
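
    One plausible realization of "estimating an SED as a smooth function of color" is a multi-output Gaussian process over color space, sketched below in Python with scikit-learn; the synthetic templates, kernel, and length scale are assumptions, not the paper's actual training set or model.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(2)
      colors = rng.uniform(-0.5, 2.0, size=(10, 2))   # 10 templates, 2 colors
      wave = np.linspace(300, 1100, 200)              # wavelength grid, nm
      # Stand-in templates: smooth SEDs whose shape drifts with color.
      spectra = np.exp(-((wave[None, :] / 500 - 1 - 0.3 * colors[:, :1])**2)
                       / (0.2 + 0.05 * colors[:, 1:])**2)

      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                    normalize_y=True)
      gp.fit(colors, spectra)            # one GP output per wavelength bin
      new_sed = gp.predict(np.array([[0.3, 1.1]]))   # SED at unseen colors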

  2. Drivers of nutritional change in four South Asian countries: a dynamic observational analysis.

    PubMed

    Headey, Derek; Hoddinott, John; Park, Seollee

    2016-05-01

    This paper quantifies the factors explaining long-term improvements in child height for age z-scores in Bangladesh (1996/1997-2011), India (1992/1993-2005/2006), Nepal (1997-2011) and Pakistan (1991-2013). We apply the same statistical techniques to data from a common data source from which we have extracted a set of common explanatory variables that capture 'nutrition-sensitive' factors. Three are particularly important in explaining height for age z-score changes over these timeframes: improvements in material well-being; increases in female education; and improvements in sanitation. These factors have comparable associations across all four countries. © 2016 The Authors. Maternal & Child Nutrition published by John Wiley & Sons Ltd.

  3. Drivers of nutritional change in four South Asian countries: a dynamic observational analysis

    PubMed Central

    Hoddinott, John; Park, Seollee

    2016-01-01

    This paper quantifies the factors explaining long‐term improvements in child height for age z‐scores in Bangladesh (1996/1997–2011), India (1992/1993–2005/2006), Nepal (1997–2011) and Pakistan (1991–2013). We apply the same statistical techniques to data from a common data source from which we have extracted a set of common explanatory variables that capture ‘nutrition‐sensitive’ factors. Three are particularly important in explaining height for age z‐score changes over these timeframes: improvements in material well‐being; increases in female education; and improvements in sanitation. These factors have comparable associations across all four countries. PMID:27187917

  4. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2012-01-01

    Objective: To develop a software application utilizing general-purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general-purpose fashion is allowing for supercomputer-level results at individual workstations. As data sets grow, the methods to work with them must grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques. Technical Methodology/Approach: Apply massively parallel algorithms and data structures to the specific analysis requirements presented when working with thermographic data sets.

  5. Determination of the authenticity of plastron-derived functional foods based on amino acid profiles analysed by MEKC.

    PubMed

    Li, Lin-Qiu; Baibado, Joewel T; Shen, Qing; Cheung, Hon-Yeung

    2017-12-01

    Plastron is a nutritive and prized functional food. Because of its limited supply yet enormous demand, functional foods supposed to contain plastron may be adulterated with substitutes. This paper reports a novel and simple method for determining the authenticity of plastron-derived functional foods based on comparison of the amino acid (AA) profiles of plastron and its possible substitutes. By applying micellar electrokinetic chromatography (MEKC), 18 common AAs, along with 2 special AAs, hydroxyproline (Hyp) and hydroxylysine (Hyl), were detected in all plastron samples. Since chicken, egg, fish, milk, pork, nail, and hair lacked Hyp and Hyl, plastron could be easily distinguished from them. For substitutes that do contain collagen, a statistical technique, principal component analysis (PCA), was adopted, and plastron was again successfully distinguished. When the proposed method was applied to authenticate turtle shell glue on the market, fake products were commonly found. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. A simple algorithm for quantifying DNA methylation levels on multiple independent CpG sites in bisulfite genomic sequencing electropherograms.

    PubMed

    Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A

    2008-06-01

    DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are either laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
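
    The abstract does not spell the algorithm out, but the peak-height idea it rests on can be illustrated in a few lines: bisulfite conversion leaves methylated cytosines reading as C and converts unmethylated ones to T, so the methylation fraction at a CpG follows from the relative C and T peak heights in the electropherogram. The function and numbers below are hypothetical.

      def methylation_level(c_peak, t_peak):
          # Fraction methylated at one CpG from electropherogram peak heights.
          return c_peak / (c_peak + t_peak)

      # Example peak heights (arbitrary fluorescence units) at three CpG sites.
      sites = [(820.0, 410.0), (300.0, 900.0), (550.0, 0.0)]
      levels = [methylation_level(c, t) for c, t in sites]
      print([f"{100*m:.0f}%" for m in levels])   # ['67%', '25%', '100%']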

  7. Technical support for creating an artificial intelligence system for feature extraction and experimental design

    NASA Technical Reports Server (NTRS)

    Glick, B. J.

    1985-01-01

    Techniques for classifying objects into groups or classes go under many different names including, most commonly, cluster analysis. Mathematically, the general problem is to find a best mapping of objects into an index set consisting of class identifiers. When an a priori grouping of objects exists, the process of deriving classification rules from samples of classified objects is known as discrimination. When such rules are applied to objects of unknown class, the process is denoted classification. The specific problem addressed involves the group classification of a set of objects that are each associated with a series of measurements (ratio, interval, ordinal, or nominal levels of measurement). Each measurement produces one variable in a multidimensional variable space. Cluster analysis techniques are reviewed, and methods for including geographic location, distance measures, and spatial pattern (distribution) as parameters in clustering are examined. For the case of patterning, measures of spatial autocorrelation are discussed in terms of the kind of data (nominal, ordinal, or interval scaled) to which they may be applied.
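
    As a concrete example of the spatial autocorrelation measures mentioned above for interval-scaled data, a minimal Python implementation of Moran's I is sketched below; the toy weight matrix is an assumption for illustration.

      import numpy as np

      def morans_i(x, w):
          # Moran's I: n/W * sum_ij w_ij z_i z_j / sum_i z_i^2, z = x - mean.
          x = np.asarray(x, dtype=float)
          z = x - x.mean()
          return len(x) / w.sum() * (w * np.outer(z, z)).sum() / (z @ z)

      # Four sites on a line; adjacent sites get weight 1 (zero diagonal).
      w = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
      print(morans_i([1.0, 2.0, 3.0, 4.0], w))   # 0.33: values cluster in space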

  8. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects, as well as for applying a variety of statistical and geostatistical techniques to the analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  9. Development of an oximeter for neurology

    NASA Astrophysics Data System (ADS)

    Aleinik, A.; Serikbekova, Z.; Zhukova, N.; Zhukova, I.; Nikitina, M.

    2016-06-01

    Cerebral desaturation can occur during surgical manipulation while other parameters vary insignificantly, and prolonged intervals of cerebral anoxia can cause serious damage to the nervous system. The commonly used method for measuring cerebral blood flow relies on invasive catheters. Other techniques include single photon emission computed tomography (SPECT), positron emission tomography (PET), and magnetic resonance imaging (MRI). Tomographic methods frequently require isotope administration, which may result in anaphylactic reactions to the contrast media and associated nerve diseases. Moreover, the high cost and the need for continuous monitoring make it difficult to apply these techniques in clinical practice. Cerebral oximetry is a method for measuring oxygen saturation using infrared spectrometry, and reflection pulse oximetry can additionally detect sudden changes in sympathetic tone. For this purpose, a reflectance pulse oximeter for use in neurology was developed. A reflectance oximeter has a definite advantage in that it can be used to measure oxygen saturation on any part of the body. Preliminary results indicate that the device has good resolution and high reliability, and that its circuit design improves on the characteristics of existing devices.

  10. Speckle noise reduction of 1-look SAR imagery

    NASA Technical Reports Server (NTRS)

    Nathan, Krishna S.; Curlander, John C.

    1987-01-01

    Speckle noise is inherent to synthetic aperture radar (SAR) imagery. Since the degradation of the image due to this noise results in uncertainties in the interpretation of the scene and in a loss of apparent resolution, it is desirable to filter the image to reduce this noise. In this paper, an adaptive algorithm based on the calculation of the local statistics around a pixel is applied to 1-look SAR imagery. The filter adapts to the nonstationarity of the image statistics since the size of the blocks is very small compared to that of the image. The performance of the filter is measured in terms of the equivalent number of looks (ENL) of the filtered image and the resulting resolution degradation. The results are compared to those obtained from different techniques applied to similar data. The local adaptive filter (LAF) significantly increases the ENL of the final image. The associated loss of resolution is also lower than that for other commonly used speckle reduction techniques.
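
    The paper's local adaptive filter is not given in full in this abstract, but the classic Lee filter is built on the same local-statistics idea and makes a reasonable stand-in sketch; the window size and the crude noise estimate below are assumptions.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def lee_filter(img, size=7, noise_var=None):
          # Local mean and variance in a size x size window around each pixel.
          mean = uniform_filter(img, size)
          sq_mean = uniform_filter(img * img, size)
          var = np.maximum(sq_mean - mean * mean, 0.0)
          if noise_var is None:
              noise_var = var.mean()        # crude global speckle estimate
          gain = np.clip((var - noise_var) / np.maximum(var, 1e-12), 0.0, 1.0)
          return mean + gain * (img - mean)  # flat areas -> mean, edges kept

      # Toy 1-look image: a two-level scene with multiplicative speckle.
      rng = np.random.default_rng(3)
      scene = np.ones((128, 128)); scene[:, 64:] = 4.0
      noisy = scene * rng.exponential(1.0, scene.shape)
      despeckled = lee_filter(noisy)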

  11. Acute effects of triazolam on false recognition.

    PubMed

    Mintzer, M Z; Griffiths, R R

    2000-12-01

    Neuropsychological, neuroimaging, and electrophysiological techniques have been applied to the study of false recognition; however, psychopharmacological techniques have not been applied. Benzodiazepine sedative/anxiolytic drugs produce memory deficits similar to those observed in organic amnesia and may be useful tools for studying normal and abnormal memory mechanisms. The present double-blind, placebo-controlled repeated measures study examined the acute effects of orally administered triazolam (Halcion; 0.125 and 0.25 mg/70 kg), a benzodiazepine hypnotic, on performance in the Deese (1959)/Roediger-McDermott (1995) false recognition paradigm in 24 healthy volunteers. Paralleling previous demonstrations in amnesic patients, triazolam produced significant dose-related reductions in false recognition rates to nonstudied words associatively related to studied words, suggesting that false recognition relies on normal memory mechanisms impaired in benzodiazepine-induced amnesia. The results also suggested that relative to placebo, triazolam reduced participants' reliance on memory for item-specific versus list-common semantic information and reduced participants' use of remember versus know responses.

  12. Estimating Interaction Effects With Incomplete Predictor Variables

    PubMed Central

    Enders, Craig K.; Baraldi, Amanda N.; Cham, Heining

    2014-01-01

    The existing missing data literature does not provide a clear prescription for estimating interaction effects with missing data, particularly when the interaction involves a pair of continuous variables. In this article, we describe maximum likelihood and multiple imputation procedures for this common analysis problem. We outline 3 latent variable model specifications for interaction analyses with missing data. These models apply procedures from the latent variable interaction literature to analyses with a single indicator per construct (e.g., a regression analysis with scale scores). We also discuss multiple imputation for interaction effects, emphasizing an approach that applies standard imputation procedures to the product of 2 raw score predictors. We thoroughly describe the process of probing interaction effects with maximum likelihood and multiple imputation. For both missing data handling techniques, we outline centering and transformation strategies that researchers can implement in popular software packages, and we use a series of real data analyses to illustrate these methods. Finally, we use computer simulations to evaluate the performance of the proposed techniques. PMID:24707955
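
    The "product as just another variable" imputation strategy described above can be sketched in a few lines of Python; the toy data and the use of scikit-learn's IterativeImputer (a single stochastic imputation rather than a full multiple-imputation run) are assumptions for illustration.

      import numpy as np
      import pandas as pd
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      # Hypothetical data: predictors X and Z with missing values, outcome Y.
      df = pd.DataFrame({"X": [1.0, np.nan, 3.0, 4.0, 2.0],
                         "Z": [2.0, 1.0, np.nan, 5.0, 2.5],
                         "Y": [3.1, 2.0, 7.3, 21.0, 6.0]})
      df["XZ"] = df["X"] * df["Z"]   # product term imputed as its own variable

      imputed = IterativeImputer(sample_posterior=True,
                                 random_state=0).fit_transform(df)
      # For multiple imputation, repeat with different random states and
      # pool the regression estimates of Y on X, Z, and XZ across draws.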

  13. Application of Fourier-wavelet regularized deconvolution for improving image quality of free space propagation x-ray phase contrast imaging.

    PubMed

    Zhou, Zhongxing; Gao, Feng; Zhao, Huijuan; Zhang, Lixin

    2012-11-21

    New x-ray phase contrast imaging techniques that do not use synchrotron radiation confront a common problem: the negative effects of finite source size and limited spatial resolution. These effects swamp the fine phase contrast fringes and make them almost undetectable. To alleviate this problem, deconvolution procedures should be applied to the blurred x-ray phase contrast images. In this study, three different deconvolution techniques, Wiener filtering, Tikhonov regularization, and Fourier-wavelet regularized deconvolution (ForWaRD), were applied to simulated and experimental free-space-propagation x-ray phase contrast images of simple geometric phantoms. The algorithms were evaluated in terms of phase contrast improvement and signal-to-noise ratio. The results demonstrate that the ForWaRD algorithm is the most appropriate for phase contrast image restoration among the above-mentioned methods; it can effectively restore the lost information in the phase contrast fringes while reducing the noise amplified during Fourier regularization.
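
    Of the three methods, the Wiener step is the easiest to sketch; the NumPy fragment below applies a frequency-domain Wiener filter, with the noise-to-signal ratio treated as a single tuning constant (an assumption; ForWaRD would follow the regularized Fourier step with wavelet-domain shrinkage).

      import numpy as np

      def wiener_deconvolve(blurred, psf, nsr=1e-2):
          # Transfer function of the blur kernel, zero-padded to image size.
          H = np.fft.fft2(psf, s=blurred.shape)
          G = np.fft.fft2(blurred)
          F_hat = np.conj(H) / (np.abs(H)**2 + nsr) * G   # Wiener filter
          return np.real(np.fft.ifft2(F_hat))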

  14. Surface inspection of flat products by means of texture analysis: on-line implementation using neural networks

    NASA Astrophysics Data System (ADS)

    Fernandez, Carlos; Platero, Carlos; Campoy, Pascual; Aracil, Rafael

    1994-11-01

    This paper describes texture-based techniques that can be applied to quality assessment of continuously produced flat products (metal strips, wooden surfaces, cork, textile products, ...). Since the most difficult task is that of inspecting product appearance, human-like inspection ability is required. A feature common to all these products is the presence of non-deterministic texture on their surfaces. Two main subjects are discussed: statistical techniques for both surface finish determination and surface defect analysis, and real-time implementation for on-line inspection in high-speed applications. For surface finish determination, a Gray Level Difference technique is presented that operates on low-resolution, that is, non-zoomed, images. Defect analysis is performed by means of statistical texture analysis over defective portions of the surface. On-line implementation is accomplished by means of neural networks. When a defect arises, textural analysis is applied, which results in a data vector acting as the input of a neural net previously trained in a supervised way. This approach aims at on-line performance in automated visual inspection applications where texture is present on flat product surfaces.
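
    A minimal version of the Gray Level Difference idea, sketched in Python below, histograms absolute gray-level differences along one displacement and condenses them into a small feature vector for the neural net; the displacement, the chosen features, and the toy texture are assumptions.

      import numpy as np

      def gld_features(img, dx=1, dy=0, levels=256):
          # Histogram of |I(x, y) - I(x+dx, y+dy)| over the image.
          h, w = img.shape
          diff = np.abs(img[dy:, dx:].astype(int) - img[:h-dy, :w-dx].astype(int))
          hist = np.bincount(diff.ravel(), minlength=levels) / diff.size
          d = np.arange(levels)
          mean = float((d * hist).sum())
          contrast = float((d**2 * hist).sum())
          nz = hist[hist > 0]
          entropy = float(-(nz * np.log2(nz)).sum())
          return np.array([mean, contrast, entropy])   # neural net input

      rng = np.random.default_rng(4)
      tex = rng.integers(0, 256, (64, 64))
      print(gld_features(tex))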

  15. An investigation of matched index of refraction technique and its application in optical measurements of fluid flow

    NASA Astrophysics Data System (ADS)

    Amini, Noushin; Hassan, Yassin A.

    2012-12-01

    Optical distortions caused by non-uniformities of the refractive index within the measurement volume are a major impediment for all laser diagnostic imaging techniques applied in experimental fluid dynamics studies. Matching the refractive indices of the working fluid and the test section walls and interfaces provides an effective solution to this problem. The experimental set-ups designed to be used with laser imaging techniques are typically constructed of transparent solid materials. In this investigation, different types of aqueous salt solutions and various organic fluids are studied for refractive index matching with acrylic and fused quartz, which are commonly used in the construction of test sections. An aqueous CaCl2·2H2O solution (63% by weight) is suggested for refractive index matching with fused quartz, and two organic fluids, dibutyl phthalate and p-cymene, for matching with acrylic. Moreover, the temperature dependence of the refractive indices of these fluids is investigated, and the thermo-optic constant is calculated for each fluid. Finally, the fluid viscosity at different shear rates is measured as a function of temperature and is used to characterize the physical behavior of the proposed fluids.

  16. Direct torsional actuation of microcantilevers using magnetic excitation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gosvami, Nitya Nand; Nalam, Prathima C.; Tam, Qizhan

    2014-09-01

    Torsional-mode dynamic force microscopy can be used for a wide range of studies including mapping lateral contact stiffness, torsional frequency or amplitude modulation imaging, and dynamic friction measurements of various materials. Piezo-actuation of the cantilever is commonly used, but it introduces spurious resonances, limiting the frequency range that can be sampled and rendering the technique particularly difficult to apply in liquid medium, where the cantilever oscillations are significantly damped. Here, we demonstrate a method that enables direct torsional actuation of cantilevers with high uniformity over wide frequency ranges by attaching a micrometer-scale magnetic bead on the back side of the cantilever. We show that when the beads are magnetized along the width of the cantilever, efficient torsional actuation can be achieved using a magnetic field produced by a solenoid placed underneath the sample. We demonstrate the capability of this technique by imaging atomic steps on graphite surfaces in tapping mode near the first torsional resonance of the cantilever in dodecane. The technique is also applied to map the variations in lateral contact stiffness on the surface of graphite and polydiacetylene monolayers.

  17. Design approach of an aquaculture cage system for deployment in the constructed channel flow environments of a power plant

    PubMed Central

    Lee, Jihoon; Fredriksson, David W.; DeCew, Judson; Drach, Andrew; Yim, Solomon C.

    2018-01-01

    This study provides an engineering approach for designing an aquaculture cage system for use in constructed channel flow environments. As sustainable aquaculture has grown globally, many novel techniques have been introduced, such as those implemented in the global Atlantic salmon industry. The advent of several highly sophisticated analysis software systems enables the development of such novel engineering techniques; these systems commonly include three-dimensional (3D) drafting, computational fluid dynamics, and finite element analysis. In this study, a combination of these analysis tools is applied to evaluate a conceptual aquaculture system for potential deployment in a power plant effluent channel. The channel water is relatively clean but has elevated temperatures and strong currents. The first portion of the analysis covers the design of a fish cage system with specific net solidities using 3D drafting techniques. Computational fluid dynamics is then applied to the solid models to evaluate the flow reduction through the system. Using the same solid models, a finite element analysis is performed on the critical components to assess the material stresses produced by the drag loads calculated from the fluid velocities. PMID:29897954

  18. Surgical practices in total knee arthroplasty in Turkey.

    PubMed

    Erduran, Mehmet; Akseki, Devrim; Araç, Sükrü

    2012-01-01

    The aim of this study was to determine current practices in total knee arthroplasty (TKA) and the differences in practice among orthopedic surgeons in Turkey. Data in this cross-sectional, descriptive study were collected through a questionnaire from 76 orthopaedic surgeons performing TKA. The questionnaire contained 57 questions under four main headings: the professional background of the surgeon; the pre-surgery approach; the surgical technique applied for TKA, including technique-specific details and the solutions applied in complication scenarios; and the postoperative approach. It was determined that 39.7% of TKA procedures were performed in operating theatres without laminar airflow or HEPA filters. Nearly one-fifth of the surgeons used more than one antibiotic for prophylaxis, and more than 85% continued prophylaxis for over 3 days. Low-molecular-weight heparin was the most commonly used method of thromboprophylaxis. Of the surgeons, 94.67% used only the cemented technique in primary TKA, 44% indicated that they performed simultaneous bilateral arthroplasty, 89% did not use any scoring system, and 72.37% preferred fixed-bearing, posterior-cruciate-retaining prostheses. The results showed no standardization of TKA surgery among surgeons in Turkey, and important educational deficiencies were noted.

  19. Bridging the gap between high and low acceleration for planetary escape

    NASA Astrophysics Data System (ADS)

    Indrikis, Janis; Preble, Jeffrey C.

    With the exception of the often time-consuming analysis by numerical optimization, no single orbit transfer analysis technique exists that can be applied over a wide range of accelerations. Using the simple planetary escape (parabolic trajectory) mission, some of the more common techniques are considered as the limiting bastions of the high and the extremely low acceleration regimes. The brachistochrone, the minimum-time-of-flight path, is proposed as the technique to bridge the gap between the high and low acceleration regions, providing a smooth bridge over the entire acceleration spectrum. A smooth and continuous velocity requirement is established for the planetary escape mission. With these results, it becomes possible to determine the effect of finite accelerations on mission performance and to target propulsion and power system designs that are consistent with a desired mission objective.

  20. Pyeloplasty techniques using minimally invasive surgery (MIS) in pediatric patients.

    PubMed

    Turrà, Francesco; Escolino, Maria; Farina, Alessandra; Settimi, Alessandro; Esposito, Ciro; Varlet, François

    2016-10-01

    Hydronephrosis is the most common presentation of ureteropelvic junction (UPJ) obstruction. We reviewed the literature, collecting data from Medline, to evaluate the current status of the minimally invasive surgery (MIS) approach to pyeloplasty. Since the first pyeloplasty was described in 1939, several techniques have been applied to correct UPJ obstruction, but Anderson-Hynes dismembered pyeloplasty is established as the gold standard, to date also as an MIS technique. Several studies in the literature underline the safety and effectiveness of this approach by both the trans- and retro-peritoneal routes, with a success rate of 81-100% and an operative time of 90-228 min. These studies have demonstrated the safety and efficacy of the procedure in the management of UPJ obstruction in children. Whether the transperitoneal approach is better than the retroperitoneal one is still debated. A long learning curve is needed, especially for suturing and knotting.

  1. Looking for Common Fingerprints in Leonardo's Pupils Using Nondestructive Pigment Characterization.

    PubMed

    Bonizzoni, Letizia; Gargano, Marco; Ludwig, Nicola; Martini, Marco; Galli, Anna

    2017-08-01

    Non-invasive, portable analytical techniques are becoming increasingly widespread for study and conservation in the field of cultural heritage, proving that good data handling, supported by deep knowledge of the techniques themselves, and the right synergy between them can give surprisingly substantial results with portable but reliable instrumentation. In this work, pigment characterization was carried out on 21 Leonardesque paintings by applying in situ X-ray fluorescence (XRF) and fiber optic reflection spectroscopy (FORS) analyses. In-depth data evaluation yielded information on the color palette and the painting technique of the different artists and workshops. Particular attention was paid to green pigments (for which a deeper study of possible pigments and alterations was performed with FORS analyses), flesh tones (for which a comparison with available data from cross-sections was made), and ground preparation.

  2. DISCO-SCA and Properly Applied GSVD as Swinging Methods to Find Common and Distinctive Processes

    PubMed Central

    Van Deun, Katrijn; Van Mechelen, Iven; Thorrez, Lieven; Schouteden, Martijn; De Moor, Bart; van der Werf, Mariët J.; De Lathauwer, Lieven; Smilde, Age K.; Kiers, Henk A. L.

    2012-01-01

    Background: In systems biology it is common to obtain for the same set of biological entities information from multiple sources. Examples include expression data for the same set of orthologous genes screened in different organisms and data on the same set of culture samples obtained with different high-throughput techniques. A major challenge is to find the important biological processes underlying the data and to disentangle therein processes common to all data sources and processes distinctive for a specific source. Recently, two promising simultaneous data integration methods have been proposed to attain this goal, namely generalized singular value decomposition (GSVD) and simultaneous component analysis with rotation to common and distinctive components (DISCO-SCA). Results: Both theoretical analyses and applications to biologically relevant data show that: (1) straightforward applications of GSVD yield unsatisfactory results, (2) DISCO-SCA performs well, (3) provided proper pre-processing and algorithmic adaptations, GSVD reaches a performance level similar to that of DISCO-SCA, and (4) DISCO-SCA is directly generalizable to more than two data sources. The biological relevance of DISCO-SCA is illustrated with two applications. First, in a setting of comparative genomics, it is shown that DISCO-SCA recovers a common theme of cell cycle progression and a yeast-specific response to pheromones. The biological annotation was obtained by applying Gene Set Enrichment Analysis in an appropriate way. Second, in an application of DISCO-SCA to metabolomics data for Escherichia coli obtained with two different chemical analysis platforms, it is illustrated that the metabolites involved in some of the biological processes underlying the data are detected by only one of the two platforms; therefore, platforms for microbial metabolomics should be tailored to the biological question. Conclusions: Both DISCO-SCA and properly applied GSVD are promising integrative methods for finding common and distinctive processes in multisource data. Open source code for both methods is provided. PMID:22693578

  3. Refinement of Methods for Evaluation of Near-Hypersingular Integrals in BEM Formulations

    NASA Technical Reports Server (NTRS)

    Fink, Patricia W.; Khayat, Michael A.; Wilton, Donald R.

    2006-01-01

    In this paper, we present advances in singularity cancellation techniques applied to integrals in BEM formulations that are nearly hypersingular. Significant advances have been made recently in singularity cancellation techniques applied to 1/R-type kernels [M. Khayat, D. Wilton, IEEE Trans. Antennas and Prop., 53, pp. 3180-3190, 2005], as well as to the gradients of these kernels [P. Fink, D. Wilton, and M. Khayat, Proc. ICEAA, pp. 861-864, Torino, Italy, 2005] on curved subdomains. In these approaches, the source triangle is divided into three tangent subtriangles with a common vertex at the normal projection of the observation point onto the source element or the extended surface containing it. The geometry of a typical tangent subtriangle and its local rectangular coordinate system with origin at the projected observation point is shown in Fig. 1. Whereas singularity cancellation techniques for 1/R-type kernels are now nearing maturity, the efficient handling of near-hypersingular kernels still needs attention. For example, in the gradient reference above, techniques are presented for computing the normal component of the gradient relative to the plane containing the tangent subtriangle. These techniques, summarized in the transformations in Table 1, are applied at the subtriangle level and correspond particularly to the case in which the normal projection of the observation point lies within the boundary of the source element. They are found to be highly efficient as z approaches zero. Here, we extend the approach to cover two instances not previously addressed. First, we consider the case in which the normal projection of the observation point lies external to the source element. For such cases, we find that simple modifications to the transformations of Table 1 permit significant savings in computational cost. Second, we present techniques that permit accurate computation of the tangential components of the gradient; i.e., tangent to the plane containing the source element.

  4. Comparison of ITRF2014 station coordinate input time series of DORIS, VLBI and GNSS

    NASA Astrophysics Data System (ADS)

    Tornatore, Vincenza; Tanır Kayıkçı, Emine; Roggero, Marco

    2016-12-01

    In this paper, station coordinate time series from three space geodetic techniques that contributed to the realization of the International Terrestrial Reference Frame 2014 (ITRF2014) are compared. In particular, the height component time series extracted from the official combined intra-technique solutions submitted for ITRF2014 by the DORIS, VLBI and GNSS Combination Centers have been investigated. The main goal of this study is to assess the level of agreement among these three space geodetic techniques. A novel analytic method, modeling time series as discrete-time Markov processes, is presented and applied to the compared time series. The analysis method has proven particularly suited to obtaining quasi-cyclostationary residuals, an important property for carrying out a reliable harmonic analysis. We looked for common signatures among the three techniques; frequencies and amplitudes of the detected signals are reported along with their percentage of incidence. Our comparison shows that two of the estimated signals, with one-year and 14-day periods, are common to all the techniques. Different hypotheses on the nature of the 14-day signal are presented. As a final check, we compared the estimated velocities and their standard deviations (STD) for the sites with co-located VLBI, GNSS and DORIS stations, obtaining good agreement among the three techniques in both the horizontal (1.0 mm/yr mean STD) and the vertical (0.7 mm/yr mean STD) components, although some sites show larger STDs, mainly due to lack of data, different data spans or noisy observations.
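
    Since such residual series are unevenly sampled, a Lomb-Scargle periodogram is a natural way to look for the one-year and 14-day lines; the Python sketch below runs on a synthetic height series (the signal amplitudes, noise level, and sampling are assumptions, not the ITRF2014 data).

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(5)
      t = np.sort(rng.uniform(0, 3000, 900))        # epochs in days, uneven
      h = (2.0 * np.sin(2*np.pi*t/365.25)           # annual term, mm
           + 0.8 * np.sin(2*np.pi*t/14.0)           # ~14-day term
           + 0.5 * rng.standard_normal(t.size))     # noise

      periods = np.linspace(5.0, 500.0, 20000)      # trial periods, days
      power = lombscargle(t, h - h.mean(), 2*np.pi/periods, normalize=True)
      print(periods[np.argmax(power)])              # ~365 days dominates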

  5. Self organising hypothesis networks: a new approach for representing and structuring SAR knowledge

    PubMed Central

    2014-01-01

    Background: Combining different sources of knowledge to build improved structure-activity relationship models is not easy owing to the variety of knowledge formats and the absence of a common framework for interoperating between learning techniques. Most current approaches address this problem by using consensus models that operate at the prediction level. We explore the possibility of directly combining these sources at the knowledge level, with the aim of harvesting potentially increased synergy at an earlier stage. Our goal is to design a general methodology to facilitate knowledge discovery and produce accurate and interpretable models. Results: To combine models at the knowledge level, we propose to decouple the learning phase from the knowledge application phase using a pivot representation (lingua franca) based on the concept of hypothesis. A hypothesis is a simple and interpretable knowledge unit. Regardless of its origin, knowledge is broken down into a collection of hypotheses. These hypotheses are subsequently organised into a hierarchical network. This unification makes it possible to combine different sources of knowledge within a common formalised framework. The approach allows us to create a synergistic system between different forms of knowledge, and new algorithms can be applied to leverage this unified model. This first article focuses on the general principle of the Self Organising Hypothesis Network (SOHN) approach in the context of binary classification problems, along with an illustrative application to the prediction of mutagenicity. Conclusion: It is possible to represent knowledge in the unified form of a hypothesis network, allowing interpretable predictions with performance comparable to mainstream machine learning techniques. This new approach offers the potential to combine knowledge from different sources into a common framework in which high-level reasoning and meta-learning can be applied; these latter perspectives will be explored in future work. PMID:24959206

  6. The Design-To-Cost Manifold

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1990-01-01

    Design-to-cost is a popular technique for controlling costs. Although qualitative techniques exist for implementing design-to-cost, quantitative methods are sparse. In the launch vehicle and spacecraft engineering process, the question of whether to minimize mass is usually an issue, and the lack of quantification leads to arguments on both sides. This paper presents a mathematical technique that quantifies both the design-to-cost process and the mass/complexity issue. Parametric cost analysis generates and applies mathematical formulas called cost estimating relationships. In their most common forms, they are continuous and differentiable. This property permits the application of the mathematics of differentiable manifolds. Although the terminology sounds formidable, applying the techniques requires only a knowledge of linear algebra and ordinary differential equations, common subjects in undergraduate scientific and engineering curricula. When the cost c is expressed as a differentiable function of n system metrics, setting the cost c to a constant generates an (n-1)-dimensional subspace of the space of system metrics such that any set of metric values in that space satisfies the constant design-to-cost criterion. This space is a differentiable manifold upon which all mathematical properties of differentiable manifolds may be applied. One important property is that an easily implemented system of ordinary differential equations exists that permits optimization of any function of the system metrics, mass for example, over the design-to-cost manifold. A dual set of equations defines the directions of maximum and minimum cost change. A simplified approximation of the PRICE H(TM) production cost model is used to generate this set of differential equations over [mass, complexity] space. The equations are solved in closed form to obtain the one-dimensional design-to-cost trade and design-for-cost spaces. Preliminary results indicate that cost is relatively insensitive to changes in mass and that the reduction of complexity, both of the manufacturing process and of the spacecraft, is dominant in reducing cost.
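
    One way to formalize the construction described above (the notation is ours, not the paper's): the constant-cost constraint defines the manifold, optimizing a metric such as mass m over it amounts to flowing along the gradient of m projected onto the manifold's tangent space, and the dual directions of fastest cost change are parallel to the cost gradient.

      \[
        M_{c_0} = \{\, x \in \mathbb{R}^n : c(x) = c_0 \,\}, \qquad
        \dot{x} = -\Bigl( I - \frac{\nabla c\,\nabla c^{\mathsf T}}
                               {\lVert \nabla c \rVert^{2}} \Bigr) \nabla m(x),
        \qquad
        \text{fastest cost change along } \pm \frac{\nabla c}{\lVert \nabla c \rVert}.
      \]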

  7. Computerized mass detection in whole breast ultrasound images: reduction of false positives using bilateral subtraction technique

    NASA Astrophysics Data System (ADS)

    Ikedo, Yuji; Fukuoka, Daisuke; Hara, Takeshi; Fujita, Hiroshi; Takada, Etsuo; Endo, Tokiko; Morita, Takako

    2007-03-01

    The comparison of left and right mammograms is a common technique used by radiologists for the detection and diagnosis of masses. In mammography, computer-aided detection (CAD) schemes using the bilateral subtraction technique have been reported. However, in breast ultrasonography, there are no reports of CAD schemes based on comparison of the left and right breasts. In this study, we propose a false positive reduction scheme based on the bilateral subtraction technique in whole breast ultrasound images. Mass candidate regions are detected using edge direction information. Bilateral breast images are registered with reference to the nipple positions and skin lines. A false positive region is identified by comparing the average gray value of a mass candidate region with that of the region at the same position and of the same size in the contralateral breast. In evaluating the effectiveness of the method, three normal and three abnormal bilateral pairs of whole breast images were employed; the abnormal breasts included six masses larger than 5 mm in diameter. The sensitivity was 83% (5/6) with 13.8 (165/12) false positives per breast before applying the proposed reduction method. By applying the method, false positives were reduced to 4.5 (54/12) per breast without removing a true positive region. This preliminary study indicates that the bilateral subtraction technique is effective for improving the performance of a CAD scheme in whole breast ultrasound images.
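
    The contralateral comparison reduces to a few lines once the images are registered; the Python sketch below assumes the contralateral image has already been mirrored into the candidate image's coordinates, and the relative threshold is an invented placeholder.

      import numpy as np

      def is_false_positive(img, contra, bbox, rel_tol=0.10):
          # Reject a candidate when the same-position region in the
          # registered contralateral image has a similar mean gray value.
          r, c, h, w = bbox
          m_cand = img[r:r+h, c:c+w].mean()
          m_contra = contra[r:r+h, c:c+w].mean()
          return abs(m_cand - m_contra) < rel_tol * max(m_cand, 1e-9)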

  8. Application of separable parameter space techniques to multi-tracer PET compartment modeling

    PubMed Central

    Zhang, Jeff L; Morey, A Michael; Kadrmas, Dan J

    2016-01-01

    Multi-tracer positron emission tomography (PET) can image two or more tracers in a single scan, characterizing multiple aspects of biological functions to provide new insights into many diseases. The technique uses dynamic imaging, resulting in time-activity curves that contain contributions from each tracer present. The process of separating and recovering separate images and/or imaging measures for each tracer requires the application of kinetic constraints, which are most commonly applied by fitting parallel compartment models for all tracers. Such multi-tracer compartment modeling presents challenging nonlinear fits in multiple dimensions. This work extends separable parameter space kinetic modeling techniques, previously developed for fitting single-tracer compartment models, to fitting multi-tracer compartment models. The multi-tracer compartment model solution equations were reformulated to maximally separate the linear and nonlinear aspects of the fitting problem, and separable least-squares techniques were applied to effectively reduce the dimensionality of the nonlinear fit. The benefits of the approach are then explored through a number of illustrative examples, including characterization of separable parameter space multi-tracer objective functions and demonstration of exhaustive search fits which guarantee the true global minimum to within arbitrary search precision. Iterative gradient-descent algorithms using Levenberg–Marquardt were also tested, demonstrating improved fitting speed and robustness as compared to corresponding fits using conventional model formulations. The proposed technique overcomes many of the challenges in fitting simultaneous multi-tracer PET compartment models. PMID:26788888
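
    The core trick, solving the linear amplitudes in closed form inside an outer search over the nonlinear rate constants (variable projection), can be sketched in a few lines of Python; the two-exponential model, synthetic curve, and optimizer settings are assumptions for illustration.

      import numpy as np
      from scipy.optimize import minimize

      t = np.linspace(0, 60, 121)                      # minutes
      rng = np.random.default_rng(6)
      y = (np.exp(-0.3*t) + 0.5*np.exp(-0.02*t)        # synthetic curve
           + 0.01*rng.standard_normal(t.size))

      def basis(k):
          # Columns: model time courses for the nonlinear parameters k.
          return np.column_stack([np.exp(-k[0]*t), np.exp(-k[1]*t)])

      def projected_residual(k):
          A = basis(k)
          amps, *_ = np.linalg.lstsq(A, y, rcond=None)  # linear sub-fit
          r = y - A @ amps
          return r @ r

      res = minimize(projected_residual, x0=[0.1, 0.01],
                     bounds=[(1e-4, 1.0)] * 2, method="L-BFGS-B")
      print(res.x)   # recovered rates, near (0.3, 0.02)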

  9. Laparoscopic completion cholecystectomy and common bile duct exploration for retained gallbladder after single-incision cholecystectomy.

    PubMed

    Kroh, Matthew; Chalikonda, Sricharan; Chand, Bipan; Walsh, R Matthew

    2013-01-01

    Recent enthusiasm in the surgical community for less invasive surgical approaches has resulted in widespread application of single-incision techniques. This has been most commonly applied in laparoscopic cholecystectomy in general surgery. Cosmesis appears to be improved, but other advantages remain to be seen. Feasibility has been demonstrated, but there is little description in the current literature regarding complications. We report the case of a patient who previously underwent single-incision laparoscopic cholecystectomy for symptomatic gallstone disease. After a brief symptom-free interval, she developed acute pancreatitis. At evaluation, imaging results of ultrasonography and magnetic resonance cholangiopancreatography demonstrated a retained gallbladder with cholelithiasis. The patient was subsequently referred to our hospital, where she underwent further evaluation and surgical intervention. Our patient underwent 4-port laparoscopic remnant cholecystectomy with transcystic common bile duct exploration. Operative exploration demonstrated a large remnant gallbladder and a partially obstructed cystic duct with many stones. Transcystic exploration with balloon extraction resulted in duct clearance. The procedure took 75 minutes, with minimal blood loss. The patient's postoperative course was uneventful. Final pathology results demonstrated a remnant gallbladder with cholelithiasis and cholecystitis. This report is the first in the literature to describe successful laparoscopic remnant cholecystectomy and transcystic common bile duct exploration after previous single-port cholecystectomy. Although inadvertent partial cholecystectomy is not unique to this technique, single-port laparoscopic procedures may result in different and significant complications.

  10. Comparison of Attenuated Total Reflectance Mid-Infrared, Near Infrared, and 1H-Nuclear Magnetic Resonance Spectroscopies for the Determination of Coffee's Geographical Origin.

    PubMed

    Medina, Jessica; Caro Rodríguez, Diana; Arana, Victoria A; Bernal, Andrés; Esseiva, Pierre; Wist, Julien

    2017-01-01

    The sensorial properties of Colombian coffee are renowned worldwide, which is reflected in its market value. This raises the threat of fraud by adulteration using coffee grains from other countries, thus creating a demand for robust and cost-effective methods for the determination of the geographical origin of coffee samples. Spectroscopic techniques such as nuclear magnetic resonance (NMR), near infrared (NIR), and mid-infrared (mIR) have arisen as strong candidates for the task. Although a body of work exists that reports on their individual performances, a faithful comparison has not been established yet. We evaluated the performance of 1H-NMR, attenuated total reflectance mIR (ATR-mIR), and NIR applied to fraud detection in Colombian coffee. For each technique, we built classification models for discrimination by species (C. arabica versus C. canephora (or robusta)) and by origin (Colombia versus other C. arabica) using a common set of coffee samples. All techniques successfully discriminated samples by species, as expected. Regarding origin determination, ATR-mIR and 1H-NMR showed comparable capacity to discriminate Colombian coffee samples, while NIR fell short by comparison. In conclusion, ATR-mIR, a less common technique in the field of coffee adulteration and fraud detection, emerges as a strong candidate: faster and lower in cost than 1H-NMR and more discriminating than NIR.

  11. Rapid bonding of polydimethylsiloxane (PDMS) to various stereolithographically (STL) structurable epoxy resins using photochemically cross-linked intermediary siloxane layers

    NASA Astrophysics Data System (ADS)

    Wilhelm, Elisabeth; Neumann, Christiane; Sachsenheimer, Kai; Länge, Kerstin; Rapp, Bastian E.

    2014-03-01

    In this paper we present a fast, low-cost bonding technology for combining rigid epoxy components with soft membranes made of polydimethylsiloxane (PDMS). Both materials are commonly used for microfluidic prototyping. Epoxy resins are often applied when rigid channels that will not deform under high pressure are required. PDMS, on the other hand, is a flexible material that allows integration of membrane valves on the chip. However, the integration of pressure-driven components, such as membrane valves and pumps, into a completely flexible device leads to pressure losses. In order to build pressure-driven components with maximum energy efficiency, a combination of rigid guiding channels and flexible membranes is advisable. Stereolithographic (STL) structuring would be an ideal fabrication technique for this purpose, because complex 3D channel structures can easily be fabricated with this technology. Unfortunately, the STL epoxies cannot be bonded using common bonding techniques. For this reason we propose two UV-light-based silanization techniques that enable plasma-induced bonding of epoxy components. The entire process, including silanization and corona discharge bonding, can be carried out within half an hour. Average bond strengths of up to 350 kPa (depending on the silane) were determined in ISO-conform tensile testing. The applicability of both techniques for microfluidic applications was proven by hydrolytic stability testing lasting more than 40 hours.

  12. BAYESIAN SEMI-BLIND COMPONENT SEPARATION FOR FOREGROUND REMOVAL IN INTERFEROMETRIC 21 cm OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Le; Timbie, Peter T.; Bunn, Emory F.

    In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H I Expectation–Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
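
    As a reference point for the PCA comparison mentioned above, the sketch below shows the usual principal-component foreground cleaning step under simplified assumptions: foregrounds are spectrally smooth and bright, so they dominate the leading eigenmodes of the frequency-frequency covariance. Array shapes and the number of removed modes are illustrative, not taken from the paper.

```python
# Minimal PCA foreground subtraction for a (frequency, pixel) data cube.
import numpy as np

def pca_foreground_clean(cube, n_modes=3):
    """cube: (n_freq, n_pix) maps; returns cube with n_modes removed."""
    x = cube - cube.mean(axis=1, keepdims=True)
    cov = x @ x.T / x.shape[1]            # (n_freq, n_freq) covariance
    evals, evecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    fg_modes = evecs[:, -n_modes:]        # brightest spectral modes
    fg = fg_modes @ (fg_modes.T @ x)      # projection onto foreground modes
    return x - fg                         # residual ~ 21 cm signal + noise

cube = np.random.default_rng(1).normal(size=(64, 4096))
cleaned = pca_foreground_clean(cube, n_modes=3)
```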

  13. Computerized Dead-Space Volume Measurement of Face Masks Applied to Simulated Faces.

    PubMed

    Amirav, Israel; Luder, Anthony S; Halamish, Asaf; Marzuk, Chatib; Daitzchman, Marcelo; Newhouse, Michael T

    2015-09-01

    The dead-space volume (VD) of face masks for metered-dose inhaler treatments is particularly important in infants and young children with asthma, who have relatively low tidal volumes. Data about VD have traditionally been obtained from water displacement measurements, in which masks are held against a flat surface. Because, in real life, masks are placed against the face, VD is likely to differ considerably between masks depending upon their contour and fit. The aim of this study was to develop an accurate and reliable way to measure VD electronically and to apply this technique by comparing the electronic VD of commonly available face masks. Average digital faces were obtained from 3-dimensional images of 270 infants and children. Commonly used face masks (small and medium) from various manufacturers (Monaghan Medical, Pari Respiratory Equipment, Philips Respironics, and InspiRx) were scanned and digitized by means of computed tomography. Each mask was electronically applied to its respective digital face, and the enclosed VD (mL) was computed precisely. VD varied between 22.6 mL (SootherMask, InspiRx) and 43.1 mL (Vortex, Pari) for small masks and between 41.7 mL (SootherMask) and 71.5 mL (AeroChamber, Monaghan Medical) for medium masks. These values were significantly lower and less variable than measurements obtained by water displacement. Computerized techniques provide an innovative and relatively simple way of accurately measuring the VD of face masks applied to digital faces. As determined by computerized measurement using average-size virtual faces, the InspiRx masks had a significantly smaller VD, for both small and medium sizes, than the other masks. This is of considerable importance with respect to aerosol dose and delivery time, particularly in young children. (ClinicalTrials.gov registration NCT01274299).
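
    Once a mask has been digitally seated on the average face and the enclosed cavity segmented, the volume computation itself reduces to voxel counting. The sketch below illustrates only that final step, under the assumption of an already-segmented boolean cavity mask; the upstream CT scanning and surface registration are not shown.

```python
# Dead-space volume from a segmented voxel mask (toy cavity as placeholder).
import numpy as np

def dead_space_ml(cavity_mask: np.ndarray, voxel_mm: float) -> float:
    """cavity_mask: boolean 3-D array of voxels inside the mask-face cavity."""
    return cavity_mask.sum() * voxel_mm**3 / 1000.0   # mm^3 -> mL

mask = np.zeros((100, 100, 100), dtype=bool)
mask[30:70, 30:70, 30:50] = True                      # toy cavity
print(f"VD = {dead_space_ml(mask, voxel_mm=0.5):.1f} mL")
```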

  14. A high temperature testing system for ceramic composites

    NASA Technical Reports Server (NTRS)

    Hemann, John

    1994-01-01

    Ceramic composites are presently being developed for high temperature use in heat engine and space power system applications. The operating temperature range is expected to be 1090 to 1650 C (2000 to 3000 F). Very little material data is available at these temperatures, and it is therefore desirable to thoroughly characterize the basic unidirectional fiber reinforced ceramic composite. This includes testing mainly for mechanical material properties at high temperatures. The proper conduct of such characterization tests requires the development of a tensile testing system that includes unique gripping, heating, and strain measuring devices, each requiring special consideration. The system also requires an optimized specimen shape. The purpose of this paper is to review various techniques for measuring displacements or strains, preferably at elevated temperatures. Due to current equipment limitations it is assumed that the specimen is to be tested at a temperature of 1430 C (2600 F) in an oxidizing atmosphere. For the most part, previous high temperature material characterization tests, such as flexure and tensile tests, have been performed in inert atmospheres. Due to the harsh environment in which the ceramic specimen is to be tested, many conventional strain measuring techniques cannot be applied. Initially, a brief description of the more commonly used mechanical strain measuring techniques is given. Major advantages and disadvantages of their application to high temperature tensile testing of ceramic composites are discussed. Next, a general overview is given for various optical techniques. Advantages and disadvantages which are common to these techniques are noted. The optical methods for measuring strain or displacement are categorized into two sections, including real-time techniques. Finally, an optical technique which offers optimum performance for the high temperature tensile testing of ceramic composites is recommended.

  15. Novel Multidimensional Cross-Correlation Data Comparison Techniques for Spectroscopic Discernment in a Volumetrically Sensitive, Moderating Type Neutron Spectrometer

    NASA Astrophysics Data System (ADS)

    Hoshor, Cory; Young, Stephan; Rogers, Brent; Currie, James; Oakes, Thomas; Scott, Paul; Miller, William; Caruso, Anthony

    2014-03-01

    A novel application of the Pearson Cross-Correlation to neutron spectral discernment in a moderating type neutron spectrometer is introduced. This cross-correlation analysis will be applied to spectral response data collected through both MCNP simulation and empirical measurement by the volumetrically sensitive spectrometer for comparison in 1, 2, and 3 spatial dimensions. The spectroscopic analysis methods discussed will be demonstrated to discern various common spectral and monoenergetic neutron sources.
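
    The core of the comparison described above is a Pearson correlation between a measured response and a library of candidate source responses. A toy sketch follows; the source names, template shapes, and noise level are invented for illustration, and 2- or 3-dimensional response data would simply be flattened (e.g. with .ravel()) before scoring.

```python
# Score a measured response against a library of source templates.
import numpy as np

def pearson(a, b):
    a, b = a - a.mean(), b - b.mean()
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

rng = np.random.default_rng(2)
library = {"Cf-252": rng.random(64), "AmBe": rng.random(64),
           "2.5 MeV mono": rng.random(64)}
measured = library["AmBe"] + 0.05 * rng.normal(size=64)  # noisy measurement

best = max(library, key=lambda k: pearson(measured, library[k]))
print("best-matching source:", best)
```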

  16. Systems Engineering in NASA's R&TD Programs

    NASA Technical Reports Server (NTRS)

    Jones, Harry

    2005-01-01

    Systems engineering is largely the analysis and planning that support the design, development, and operation of systems. The most common application of systems engineering is in guiding systems development projects that use a phased process of requirements, specifications, design, and development. This paper investigates how systems engineering techniques should be applied in research and technology development programs for advanced space systems. These programs should include anticipatory engineering of future space flight systems and a project portfolio selection process, as well as systems engineering for multiple development projects.

  17. Roughness Measurement of Dental Materials

    NASA Astrophysics Data System (ADS)

    Shulev, Assen; Roussev, Ilia; Karpuzov, Simeon; Stoilov, Georgi; Ignatova, Detelina; See, Constantin von; Mitov, Gergo

    2016-06-01

    This paper presents a roughness measurement of zirconia ceramics, widely used for dental applications. Surface roughness variations caused by the most commonly used dental instruments for intraoral grinding and polishing are estimated. The applied technique is simple and utilizes the speckle properties of scattered laser light. It could easily be implemented even in a dental clinic environment. The main criterion for roughness estimation is the average speckle size, which varies with the roughness of the zirconia. The algorithm used for speckle size estimation is based on the normalized autocorrelation approach.
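
    A minimal sketch of one such normalized-autocorrelation estimator is shown below (our assumption of the implementation, not the authors' code): the average speckle size is read off as the full width at half maximum of the autocorrelation peak, computed via the Wiener-Khinchin theorem.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def speckle_size_px(img):
    """Average speckle size as the FWHM of the normalized autocorrelation."""
    x = img - img.mean()
    acf = np.fft.ifft2(np.abs(np.fft.fft2(x)) ** 2).real  # Wiener-Khinchin
    acf = np.fft.fftshift(acf) / acf.max()                # peak -> 1 at center
    row = acf[acf.shape[0] // 2]                          # central horizontal cut
    above = np.where(row >= 0.5)[0]
    return float(above[-1] - above[0] + 1)                # full width at half max

# Correlated noise as a stand-in for a recorded speckle image.
img = gaussian_filter(np.random.default_rng(3).random((256, 256)), sigma=2)
print(f"average speckle size: {speckle_size_px(img)} px")
```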

  18. Instrumental biosensors: new perspectives for the analysis of biomolecular interactions.

    PubMed

    Nice, E C; Catimel, B

    1999-04-01

    The use of instrumental biosensors in basic research to measure biomolecular interactions in real time is increasing exponentially. Applications include protein-protein, protein-peptide, DNA-protein, DNA-DNA, and lipid-protein interactions. Such techniques have been applied to, for example, antibody-antigen, receptor-ligand, signal transduction, and nuclear receptor studies. This review outlines the principles of two of the most commonly used instruments and highlights specific operating parameters that will assist in optimising experimental design, data generation, and analysis.

  19. Custom blending of lamp phosphors

    NASA Technical Reports Server (NTRS)

    Klemm, R. E.

    1978-01-01

    The spectral output of fluorescent lamps can be precisely adjusted by using computer-assisted analysis for custom blending of lamp phosphors. With this technique, the spectrum of the main bank of lamps is measured and stored in computer memory along with the emission characteristics of commonly available phosphors. The computer then calculates the ratio of green and blue intensities for each phosphor according to the manufacturer's specifications and plots them as coordinates on a graph. The same ratios are calculated for the measured spectrum. Once the proper mix is determined, it is applied as a coating to the fluorescent tubing.
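
    The record describes a graphical ratio-matching procedure. A compact modern analogue (an assumption, not the original program) is to solve directly for nonnegative blend weights by least squares against the stock phosphor emission spectra:

```python
# Nonnegative least-squares blend of stock phosphors to match a target spectrum.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)
n_bands, n_phosphors = 40, 5
emission = rng.random((n_bands, n_phosphors))   # columns: stock phosphor spectra
true_mix = np.array([0.5, 0.0, 0.3, 0.2, 0.0])
target = emission @ true_mix                    # measured lamp spectrum

weights, residual = nnls(emission, target)      # nonnegative blend weights
print(np.round(weights, 2), f"residual={residual:.2e}")
```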

  20. A simple apparatus for controlling nucleation and size in protein crystal growth

    NASA Technical Reports Server (NTRS)

    Gernert, Kim M.; Smith, Robert; Carter, Daniel C.

    1988-01-01

    A simple device is described for controlling vapor equilibrium in macromolecular crystallization as applied to the protein crystal growth technique commonly referred to as the 'hanging drop' method. Crystal growth experiments with hen egg white lysozyme have demonstrated control of the nucleation rate. Nucleation rate and final crystal size have been found to be highly dependent upon the rate at which critical supersaturation is approached. Slower approaches show a marked decrease in the nucleation rate and an increase in crystal size.

  1. Higher-order neural networks, Pólya polynomials, and Fermi cluster diagrams

    NASA Astrophysics Data System (ADS)

    Kürten, K. E.; Clark, J. W.

    2003-09-01

    The problem of controlling higher-order interactions in neural networks is addressed with techniques commonly applied in the cluster analysis of quantum many-particle systems. For multineuron synaptic weights chosen according to a straightforward extension of the standard Hebbian learning rule, we show that higher-order contributions to the stimulus felt by a given neuron can be readily evaluated via Pólya's combinatoric group-theoretical approach or equivalently by exploiting a precise formal analogy with fermion diagrammatics.
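
    As a concrete illustration of the "straightforward extension of the standard Hebbian learning rule" to multineuron weights, the sketch below builds third-order weights from stored patterns and evaluates the higher-order stimulus on each neuron. The normalization and pattern count are illustrative assumptions, not the paper's.

```python
# Third-order Hebbian weights and the resulting higher-order stimulus.
import numpy as np

rng = np.random.default_rng(5)
N, P = 20, 3
xi = rng.choice([-1, 1], size=(P, N))            # stored +/-1 patterns

# w_ijk = (1/N^2) sum_mu xi_i xi_j xi_k  (self-couplings kept for brevity)
w3 = np.einsum("mi,mj,mk->ijk", xi, xi, xi) / N**2

s = xi[0].copy()
s[:3] *= -1                                      # corrupt a stored pattern
h = np.einsum("ijk,j,k->i", w3, s, s)            # higher-order stimulus
s_next = np.sign(h)                              # one synchronous update
print("overlap with stored pattern:", s_next @ xi[0] / N)
```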

  2. Coordinate Conversion Technique for OTH Backscatter Radar

    DTIC Science & Technology

    1977-05-01

    obliquity of the earth's equator (≈23.45°), A is the mean longitude of the sun measured in the ecliptic counterclockwise from the first point of... MODEL FOR F2-LAYER CORRECTION FACTORS - VERTICAL IONOGRAM 11. MODEL FOR F2-LAYER CORRECTION FACTORS - OBLIQUE IONOGRAM 12. ELEMENTS OF COMMON BLOCK... simulation in (1) to a given oblique ionogram generate range gradient factors to apply to foF2 and M(3000)F2 to force agreement; (3) from the

  3. Development and First Results of the Width-Tapered Beam Method for Adhesion Testing of Photovoltaic Material Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosco, Nick; Tracy, Jared; Dauskardt, Reinhold

    2016-11-21

    A fracture-mechanics-based approach for quantifying adhesion at every interface within the PV module laminate is presented. The common requirements of monitoring crack length and specimen compliance are circumvented through development of a width-tapered cantilever beam method. This technique may be applied at both the module and coupon level to yield a similar, quantitative measurement. Details of module and sample preparation are described, and first results on field-exposed modules deployed for over 27 years are presented.

  4. Preferred Pricing Technique Used in Tourism Small and Medium Enterprises in Badung, Bali, Indonesia

    NASA Astrophysics Data System (ADS)

    Armoni, N. L. E.; Nadra, N. M.; Suarta, I. K.; Widia, I. W.

    2018-01-01

    This research aims to examine the pricing techniques used in 3 types of tourism small and medium enterprises (SMEs) and to identify the dominant techniques applied in support of sustainable business and tourism. A qualitative method was used, interviewing pricing decision makers in tourism SMEs in Badung regency, Bali. The results showed that tourism SMEs in Badung regency use 5 pricing techniques, of which 2 are dominant: accommodation and transportation businesses mostly use competitor-based pricing, whilst restaurants generally use cost-plus pricing. Except for the motor/car rental business, which has methodically assessed its financial sustainability by devising a breakeven-point pricing technique, the others still rely on a traditional way of common sense and gut feeling to assure the sustainability of the business and the return on investment. Competitor-based pricing is the most preferred technique among tourism SMEs; however, entrepreneurs should also apply long-term-oriented pricing that ensures the profitability and sustainability of the business, as well as delivering value for customers. Value-based pricing is practiced by one business respondent, an accommodation provider. This could serve as a reference for other tourism SMEs, as it shows customer orientation and uses quality as an advantage to compete and survive. To operate in this domain, tourism SMEs need to offer quality products, have a customer service orientation, and use these as a competitive advantage. The research results could become a reference for tourism SMEs in setting prices for sustainable enterprises. Academically, this work will enrich knowledge about tourism related to product costing, particularly in tourism SMEs.

  5. Proceedings: Fourth Workshop on Mining Scientific Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, C

    Commercial applications of data mining in areas such as e-commerce, market-basket analysis, text-mining, and web-mining have taken on a central focus in the KDD community. However, there is a significant amount of innovative data mining work taking place in the context of scientific and engineering applications that is not well represented in the mainstream KDD conferences. For example, scientific data mining techniques are being developed and applied to diverse fields such as remote sensing, physics, chemistry, biology, astronomy, structural mechanics, computational fluid dynamics, etc. In these areas, data mining frequently complements and enhances existing analysis methods based on statistics, exploratory data analysis, and domain-specific approaches. On the surface, it may appear that data from one scientific field, say genomics, is very different from another field, such as physics. However, despite their diversity, there is much that is common across the mining of scientific and engineering data. For example, techniques used to identify objects in images are very similar, regardless of whether the images came from a remote sensing application, a physics experiment, an astronomy observation, or a medical study. Further, with data mining being applied to new types of data, such as mesh data from scientific simulations, there is the opportunity to apply and extend data mining to new scientific domains. This one-day workshop brings together data miners analyzing science data and scientists from diverse fields to share their experiences, learn how techniques developed in one field can be applied in another, and better understand some of the newer techniques being developed in the KDD community. This is the fourth workshop on the topic of mining scientific data sets; for information on earlier workshops, see http://www.ahpcrc.org/conferences/. This workshop continues the tradition of addressing challenging problems in a field where the diversity of applications is matched only by the opportunities that await a practitioner.

  6. Application of zonal model on indoor air sensor network design

    NASA Astrophysics Data System (ADS)

    Chen, Y. Lisa; Wen, Jin

    2007-04-01

    Growing concerns over the safety of the indoor environment have made the use of sensors ubiquitous. Sensors that detect chemical and biological warfare agents can offer early warning of dangerous contaminants. However, current sensor system design is informed more by intuition and experience than by systematic design. To develop a sensor system design methodology, a proper indoor airflow modeling approach is needed. Various indoor airflow modeling techniques, from complicated computational fluid dynamics approaches to simplified multi-zone approaches, exist in the literature. In this study, the effects of two airflow modeling techniques, the multi-zone and the zonal modeling technique, on indoor air protection sensor system design are discussed. Common building attack scenarios, using a typical CBW agent, are simulated. Both multi-zone and zonal models are used to predict airflows and contaminant dispersion. A genetic algorithm is then applied to optimize the sensor locations and quantity. Differences in the sensor system design resulting from the two airflow models are discussed for a typical office environment and a large hall environment.

  7. Condensation enhancement by means of electrohydrodynamic techniques

    NASA Astrophysics Data System (ADS)

    Butrymowicz, Dariusz; Karwacki, Jarosław; Trela, Marian

    2014-12-01

    A short state of the art on techniques for enhancing condensation heat transfer by means of condensate drainage is presented in this paper. The electrohydrodynamic (EHD) technique is suitable for the dielectric media used in refrigeration, organic Rankine cycles and heat pump devices. For horizontal tubes, the electric field is commonly generated by means of a rod-type electrode or mesh electrodes. The authors propose two geometries in the experimental investigations presented here. The first is an electrode placed just beneath the tube bottom; the second consists of a horizontal finned tube with a double electrode placed beneath the tube. Experimental investigations of these two configurations for condensation of refrigerant R-123 have been accomplished. The obtained results confirmed that applying the EHD technique to the investigated tube and electrode arrangement caused a significant increase in the heat transfer coefficient. The condensation enhancement depends both on the geometry of the electrode system and on the applied voltage.

  8. Study of different concentric rings inside gallstones with LIBS.

    PubMed

    Pathak, Ashok Kumar; Singh, Vivek Kumar; Rai, Nilesh Kumar; Rai, Awadhesh Kumar; Rai, Pradeep Kumar; Rai, Pramod Kumar; Rai, Suman; Baruah, G D

    2011-07-01

    Gallstones obtained from patients from the north-east region of India (Assam) were studied using the laser-induced breakdown spectroscopy (LIBS) technique. LIBS spectra of the different layers (in cross-section) of the gallstones were recorded in the spectral region 200-900 nm. Several elements, including calcium, magnesium, manganese, copper, silicon, phosphorus, iron, sodium and potassium, were detected in the gallstones. Lighter elements, including carbon, hydrogen, nitrogen and oxygen, were also detected, which demonstrates the superiority of the LIBS technique over other existing analytical techniques. The LIBS technique was applied to investigate the evolution of C2 Swan bands and CN violet bands in the LIBS spectra of the gallstones in air and in an argon atmosphere. The different layers (dark and light) of the gallstones were discriminated on the basis of the presence and intensities of the spectral lines for carbon, hydrogen, nitrogen, oxygen and copper. An attempt was also made to correlate the presence of major and minor elements in the gallstones with the common diet of the population of Assam.

  9. Minimally invasive knee arthroplasty: An overview

    PubMed Central

    Tria, Alfred J; Scuderi, Giles R

    2015-01-01

    Minimally invasive surgery (MIS) for arthroplasty of the knee began with surgery for unicondylar knee arthroplasty (UKA). Partial knee replacements were designed in the 1970s and were amenable to a more limited exposure. In the 1990s, Repicci popularized MIS for UKA. Surgeons began to apply his concepts to total knee arthroplasty. Four MIS surgical techniques were developed: quadriceps sparing, mini-midvastus, mini-subvastus, and mini-medial parapatellar. The quadriceps sparing technique is the most limited one and is also the most difficult. However, it is the least invasive and allows rapid recovery. The mini-midvastus is the most common technique because it affords slightly better exposure and can be extended. The mini-subvastus technique entirely avoids incising the quadriceps extensor mechanism but is time consuming and difficult in the obese and in the muscular male patient. The mini-parapatellar technique is most familiar to surgeons and represents a good starting point for surgeons who are learning the techniques. The surgeries are easier with smaller instruments but can be performed with standard ones. The techniques are accurate and do lead to a more rapid recovery, with less pain, less blood loss, and greater motion if they are appropriately performed. PMID:26601062

  10. [Clinical and radiographic evaluation of a new percutaneous technique for moderate to severe hallux valgus deformity].

    PubMed

    Vélez-de Lachica, J C; Valdez-Jiménez, L A; Inzunza-Sánchez, J M

    2017-01-01

    Hallux valgus is considered the most common musculoskeletal deformity, with a prevalence of 88%. There are more than 130 surgical techniques for its treatment; currently, percutaneous ones are popular; however, they do not take into account the metatarsal-phalangeal correction angle. The aim of this study is to propose a modified technique for percutaneous correction of the metatarsal-phalangeal and inter-metatarsal angles and to evaluate its clinical and radiological results. An experimental, prospective, longitudinal study was conducted in 10 patients with moderate to severe hallux valgus according to the classification of Coughlin and Mann; the results were evaluated with the AOFAS scale at 15, 30, 60 and 90 days. The McBride technique and the percutaneous anchor technique with the proposed modification were performed. The AOFAS scale was applied as described, finding a progressive increase in the rating; the average correction of the inter-metatarsal angle was 8.8 degrees and of the metatarsal-phalangeal angle, 9.12 degrees. The modified percutaneous anchor technique showed clear clinical and radiographic improvement in the short term. Our modified technique is proposed for future projects, including a large sample with long-term follow-up.

  11. Multisubject Learning for Common Spatial Patterns in Motor-Imagery BCI

    PubMed Central

    Devlaminck, Dieter; Wyns, Bart; Grosse-Wentrup, Moritz; Otte, Georges; Santens, Patrick

    2011-01-01

    Motor-imagery-based brain-computer interfaces (BCIs) commonly use the common spatial pattern filter (CSP) as preprocessing step before feature extraction and classification. The CSP method is a supervised algorithm and therefore needs subject-specific training data for calibration, which is very time consuming to collect. In order to reduce the amount of calibration data that is needed for a new subject, one can apply multitask (from now on called multisubject) machine learning techniques to the preprocessing phase. Here, the goal of multisubject learning is to learn a spatial filter for a new subject based on its own data and that of other subjects. This paper outlines the details of the multitask CSP algorithm and shows results on two data sets. In certain subjects a clear improvement can be seen, especially when the number of training trials is relatively low. PMID:22007194
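
    For context, the sketch below shows the standard single-subject CSP computation that the multisubject method extends: spatial filters come from a generalized eigendecomposition of the two class-conditional covariance matrices, and trials are reduced to log-variance features. Shapes and the number of retained filter pairs are illustrative assumptions.

```python
# Single-subject CSP: generalized eigendecomposition of class covariances.
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=2):
    """trials_*: (n_trials, n_channels, n_samples) band-passed EEG."""
    cov = lambda t: sum(x @ x.T / np.trace(x @ x.T) for x in t) / len(t)
    Ca, Cb = cov(trials_a), cov(trials_b)
    evals, evecs = eigh(Ca, Ca + Cb)             # generalized eigenproblem
    idx = np.concatenate([np.arange(n_pairs),    # most discriminative ends
                          np.arange(len(evals) - n_pairs, len(evals))])
    return evecs[:, idx].T                       # (2*n_pairs, n_channels)

rng = np.random.default_rng(6)
a = rng.normal(size=(30, 8, 256))                # placeholder class-A trials
b = rng.normal(size=(30, 8, 256))                # placeholder class-B trials
W = csp_filters(a, b)
features = np.log(np.var(W @ a[0], axis=1))      # log-variance features
```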

  12. Common Eigenvectors of N Particles' Compatible Observables and its Squeezing Operator

    NASA Astrophysics Data System (ADS)

    Xu, Shi-Min; Xu, Xing-Lei; Li, Hong-Qi

    We construct 2N operators for an N-particle system, namely one center-of-mass coordinate operator, N-1 relative coordinate operators, one total momentum operator and N-1 mass-weighted relative momentum operators, and give common eigenvectors of the N compatible observables $\{\sum_{i=1}^{N}\hat{p}_i,\ \hat{x}_1-\hat{x}_2,\ \hat{x}_2-\hat{x}_3,\ \hat{x}_3-\hat{x}_4,\ \ldots,\ \hat{x}_{N-1}-\hat{x}_N\}$, which are composed of the N particles' coordinates $\hat{x}_i$ and momenta $\hat{p}_i$. By compatible, we mean such observables can be simultaneously determined. Using the technique of integration within an ordered product of operators (IWOP), we prove that the common eigenvectors are complete and orthonormal, and hereby qualified for making up a representation. This new representation can be applied to solving some dynamic problems in quantum mechanics.

  13. Neurobehavioral Development of Common Marmoset Monkeys

    PubMed Central

    Schultz-Darken, Nancy; Braun, Katarina M.; Emborg, Marina E.

    2016-01-01

    Common marmoset (Callithrix jacchus) monkeys are a resource for biomedical research, and their use is predicted to increase due to the suitability of this species for transgenic approaches. Identification of abnormal neurodevelopment due to genetic modification relies upon comparison with validated patterns of normal behavior defined by unbiased methods. As scientists unfamiliar with nonhuman primate development are interested in applying genomic editing techniques in marmosets, it would benefit the field for investigators to use validated methods of postnatal evaluation that are age and species appropriate. This review aims to analyze currently available data on marmoset physical and behavioral postnatal development, describe the methods used, and discuss next steps to better understand and evaluate normal and abnormal marmoset postnatal neurodevelopment. PMID:26502294

  14. Comparing the Identification of Recommendations by Different Accident Investigators Using a Common Methodology

    NASA Technical Reports Server (NTRS)

    Johnson, Chris W.; Oltedal, H. A.; Holloway, C. M.

    2012-01-01

    Accident reports play a key role in the safety of complex systems. These reports present the recommendations that are intended to help avoid any recurrence of past failures. However, the value of these findings depends upon the causal analysis that helps to identify the reasons why an accident occurred. Various techniques have been developed to help investigators distinguish root causes from contributory factors and contextual information. This paper presents the results from a study into the individual differences that can arise when a group of investigators independently apply the same technique to identify the causes of an accident. This work is important if we are to increase the consistency and coherence of investigations following major accidents.

  15. Assembly and microscopic characterization of DNA origami structures.

    PubMed

    Scheible, Max; Jungmann, Ralf; Simmel, Friedrich C

    2012-01-01

    DNA origami is a revolutionary method for the assembly of molecular nanostructures from DNA with precisely defined dimensions and with an unprecedented yield. This can be utilized to arrange nanoscale components such as proteins or nanoparticles into pre-defined patterns. For applications it will now be of interest to arrange such components into functional complexes and study their geometry-dependent interactions. While DNA nanostructures are commonly characterized by atomic force microscopy or electron microscopy, these techniques often lack the time resolution to study dynamic processes. It is therefore of considerable interest to also apply fluorescence microscopy techniques to DNA nanostructures. Of particular importance here is the utilization of novel super-resolved microscopy methods that enable imaging beyond the classical diffraction limit.

  16. Local Positioning Systems in (Game) Sports

    PubMed Central

    Leser, Roland; Baca, Arnold; Ogris, Georg

    2011-01-01

    Position data of players and athletes are widely used in sports performance analysis for measuring the amount of physical activity as well as for tactical assessments in game sports. However, position sensing systems are applied in sports as tools to gain objective information on sports behavior rather than as components of intelligent spaces (IS). The paper outlines the idea of IS for the sports context, with special focus on game sports, and shows how intelligent sports feedback systems can benefit from IS. The most common location sensing techniques used in sports and their practical application are then reviewed, as location is among the most important enabling techniques for IS. Furthermore, the article exemplifies the idea of IS in sports with two applications. PMID:22163725

  17. Living specimen tomography by digital holographic microscopy: morphometry of testate amoeba

    NASA Astrophysics Data System (ADS)

    Charrière, Florian; Pavillon, Nicolas; Colomb, Tristan; Depeursinge, Christian; Heger, Thierry J.; Mitchell, Edward A. D.; Marquet, Pierre; Rappaz, Benjamin

    2006-08-01

    This paper presents an optical diffraction tomography technique based on digital holographic microscopy. Quantitative 2-dimensional phase images are acquired for regularly spaced angular positions of the specimen covering a total angle of π, allowing 3-dimensional quantitative refractive index distributions to be built by an inverse Radon transform. A 20x magnification allows a resolution better than 3 μm in all three dimensions, with accuracy better than 0.01 for the refractive index measurements. This technique is, for the first time to our knowledge, applied to a living specimen (testate amoeba, Protista). Morphometric measurements are extracted from the tomographic reconstructions, showing that the commonly used method for testate amoeba biovolume evaluation leads to systematic underestimation by about 50%.
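
    The reconstruction step named above is the classical filtered back-projection inverse of the Radon transform. A hedged sketch using scikit-image follows; the phantom stands in for a refractive-index slice, and the half-turn angular coverage mirrors the π range of acquisitions described in the abstract.

```python
# Forward-project a phantom and invert with the inverse Radon transform.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

phantom = shepp_logan_phantom()                        # stand-in for a slice
theta = np.linspace(0.0, 180.0, 180, endpoint=False)   # pi angular coverage
sinogram = radon(phantom, theta=theta)                 # forward projections
reconstruction = iradon(sinogram, theta=theta, filter_name="ramp")
err = np.abs(reconstruction - phantom).mean()
print(f"mean absolute reconstruction error: {err:.4f}")
```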

  18. Fabrication of polyurethane and polyurethane based composite fibres by the electrospinning technique for soft tissue engineering of cardiovascular system.

    PubMed

    Kucinska-Lipka, J; Gubanska, I; Janik, H; Sienkiewicz, M

    2015-01-01

    Electrospinning is a unique technique for forming polymeric scaffolds for soft tissue engineering, including tissue scaffolds for soft tissues of the cardiovascular system. Such artificial soft tissues of the cardiovascular system may possess mechanical properties comparable to native vascular tissues. The electrospinning technique gives the opportunity to form fibres from nm- to μm-scale in diameter. The arrangement of the obtained fibres and their surface determine the biocompatibility of the scaffolds. Polyurethanes (PUs) are commonly used as prostheses for cardiovascular soft tissues due to their excellent biocompatibility, non-toxicity, elasticity and mechanical properties. PUs also possess fine spinning properties. Combining the variety of PU properties with an electrospinning technique, conducted under well-tailored conditions, gives unlimited possibilities for forming novel polyurethane materials suitable for soft tissue scaffolds applied in cardiovascular tissue engineering. This paper can help researchers gain a more widespread and deeper understanding of designing electrospinnable PU materials for use as cardiovascular soft tissue scaffolds. We focus on reagents used in PU synthesis designed to increase PU biocompatibility (polyols) and biodegradability (isocyanates). We also describe suggested surface modifications of electrospun PUs, and the direct influence of surface wettability on providing enhanced biocompatibility of scaffolds. We indicate the great influence of electrospinning parameters (voltage, flow rate, working distance) and of the solvents used (mostly DMF, THF and HFIP) on fibre alignment and diameter, which impacts the biocompatibility and hemocompatibility of such electrospun PU scaffolds. Moreover, we present PU modifications with natural polymers with a novel approach applied in electrospinning of PU scaffolds. This work may contribute to further development of novel electrospun PUs for application as soft tissue scaffolds of the cardiovascular system.

  19. Standoff laser-based spectroscopy for explosives detection

    NASA Astrophysics Data System (ADS)

    Gaft, M.; Nagli, L.

    2007-10-01

    Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called Improvised Explosive Devices (IEDs). It is recognized that the only technique potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. LDS activity is based on a combination of laser-based spectroscopic methods with orthogonal capabilities. Our technique belongs to trace detection, specifically its micro-particle variety, and is based on the commonly held belief that surface contamination is very difficult to avoid and can be exploited for standoff detection. We have applied optical techniques, including gated Raman and time-resolved luminescence spectroscopy, to the detection of the main explosive materials, both factory-made and homemade. We developed and tested a Raman system for the remote field detection and identification of minimal amounts of explosives on relevant surfaces at a distance of up to 30 meters.

  20. High-precision buffer circuit for suppression of regenerative oscillation

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Hare, David A.; Tcheng, Ping

    1995-01-01

    Precision analog signal conditioning electronics have been developed for wind tunnel model attitude inertial sensors. This application requires low-noise, stable, microvolt-level DC performance and a high-precision buffered output. Capacitive loading of the operational amplifier output stages due to the wind tunnel analog signal distribution facilities caused regenerative oscillation and consequent rectification bias errors. Oscillation suppression techniques commonly used in audio applications were inadequate to maintain the performance requirements for the measurement of attitude for wind tunnel models. Feedback control theory is applied to develop a suppression technique based on a known compensation (snubber) circuit, which provides superior oscillation suppression with high output isolation and preserves the low-noise low-offset performance of the signal conditioning electronics. A practical design technique is developed to select the parameters for the compensation circuit to suppress regenerative oscillation occurring when typical shielded cable loads are driven.

  1. Visual analytics techniques for large multi-attribute time series data

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.

    2008-01-01

    Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often business people want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data loading job performance. DBAs need to detect the outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: The color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in a long time series data and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications to mine enterprise data warehouse and customer credit card fraud data to illustrate the wide applicability and usefulness of these techniques.

  2. Thermophysical Properties Measurements of Zr62Cu20Al10Ni8

    NASA Technical Reports Server (NTRS)

    Bradshaw, Richard C.; Waren, Mary; Rogers, Jan R.; Rathz, Thomas J.; Gangopadhyay, Anup K.; Kelton, Ken F.; Hyers, Robert W.

    2006-01-01

    Thermophysical property studies performed at high temperature can prove challenging because of reactivity problems brought on by the elevated temperatures. Contaminants from measuring devices and container walls can cause changes in properties. To prevent this, containerless processing techniques can be employed to isolate a sample during study. A common method for this is levitation. Typical levitation methods used for containerless processing are aerodynamically, electromagnetically and electrostatically based. All levitation methods reduce heterogeneous nucleation sites, which in turn provides access to metastable undercooled phases. In particular, electrostatic levitation is appealing because sample motion and stirring are minimized; and by combining it with optically based non-contact measuring techniques, many thermophysical properties can be measured. Applying some of these techniques, the surface tension, viscosity and density of the glass-forming alloy Zr62Cu20Al10Ni8 have been measured and will be presented with a brief overview of the non-contact measuring method used.

  3. Applying and advancing behavior change theories and techniques in the context of a digital health revolution: Proposals for more effectively realizing untapped potential

    PubMed Central

    Moller, Arlen C.; Merchant, Gina; Conroy, David E.; West, Robert; Hekler, Eric B.; Kugler, Kari C.; Michie, Susan

    2017-01-01

    As more behavioral health interventions move from traditional to digital platforms, the application of evidence-based theories and techniques may be doubly advantageous. First, it can expedite digital health intervention development, improving efficacy, and increasing reach. Second, moving behavioral health interventions to digital platforms presents researchers with novel (potentially paradigm shifting) opportunities for advancing theories and techniques. In particular, the potential for technology to revolutionize theory refinement is made possible by leveraging the proliferation of “real-time” objective measurement and “big data” commonly generated and stored by digital platforms. Much more could be done to realize this potential. This paper offers proposals for better leveraging the potential advantages of digital health platforms, and reviews three of the cutting edge methods for doing so: optimization designs, dynamic systems modeling, and social network analysis. PMID:28058516

  4. When a Text Is Translated Does the Complexity of Its Vocabulary Change? Translations and Target Readerships

    PubMed Central

    Rêgo, Hênio Henrique Aragão; Braunstein, Lidia A.; D′Agostino, Gregorio; Stanley, H. Eugene; Miyazima, Sasuke

    2014-01-01

    In linguistic studies, the academic level of the vocabulary in a text can be described in terms of statistical physics by using a “temperature” concept related to the text's word-frequency distribution. We propose a “comparative thermo-linguistic” technique to analyze the vocabulary of a text to determine its academic level and its target readership in any given language. We apply this technique to a large number of books by several authors and examine how the vocabulary of a text changes when it is translated from one language to another. Unlike the uniform results produced using the Zipf law, using our “word energy” distribution technique we find variations in the power-law behavior. We also examine some common features that span across languages and identify some intriguing questions concerning how to determine when a text is suitable for its intended readership. PMID:25353343

  5. When a text is translated does the complexity of its vocabulary change? Translations and target readerships.

    PubMed

    Rêgo, Hênio Henrique Aragão; Braunstein, Lidia A; D'Agostino, Gregorio; Stanley, H Eugene; Miyazima, Sasuke

    2014-01-01

    In linguistic studies, the academic level of the vocabulary in a text can be described in terms of statistical physics by using a "temperature" concept related to the text's word-frequency distribution. We propose a "comparative thermo-linguistic" technique to analyze the vocabulary of a text to determine its academic level and its target readership in any given language. We apply this technique to a large number of books by several authors and examine how the vocabulary of a text changes when it is translated from one language to another. Unlike the uniform results produced using the Zipf law, using our "word energy" distribution technique we find variations in the power-law behavior. We also examine some common features that span across languages and identify some intriguing questions concerning how to determine when a text is suitable for its intended readership.
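
    Both records above contrast their "word energy" distribution with the uniform picture given by the Zipf law. For readers unfamiliar with that baseline, the sketch below fits the Zipf rank-frequency exponent on log-log axes; the toy corpus is a placeholder, and real analyses would use entire books.

```python
# Fit the Zipf rank-frequency power law on log-log axes.
import re
import numpy as np
from collections import Counter

text = "the quick brown fox jumps over the lazy dog the fox " * 50
words = re.findall(r"[a-z']+", text.lower())
freqs = np.array(sorted(Counter(words).values(), reverse=True), float)
ranks = np.arange(1, len(freqs) + 1)

slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
print(f"fitted Zipf exponent: {-slope:.2f}")     # ~1 for natural text
```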

  6. Solvent-free melting techniques for the preparation of lipid-based solid oral formulations.

    PubMed

    Becker, Karin; Salar-Behzadi, Sharareh; Zimmer, Andreas

    2015-05-01

    Lipid excipients are applied for numerous purposes such as taste masking, controlled release, improvement of swallowability and moisture protection. Several melting techniques have evolved in the last decades. Common examples are melt coating, melt granulation and melt extrusion. The required equipment ranges from ordinary glass beakers for lab scale up to large machines such as fluid bed coaters, spray dryers or extruders. This allows for upscaling to pilot or production scale. Solvent free melt processing provides a cost-effective, time-saving and eco-friendly method for the food and pharmaceutical industries. This review intends to give a critical overview of the published literature on experiences, formulations and challenges and to show possibilities for future developments in this promising field. Moreover, it should serve as a guide for selecting the best excipients and manufacturing techniques for the development of a product with specific properties using solvent free melt processing.

  7. Applying projective techniques to formative research in health communication development.

    PubMed

    Wiehagen, Theresa; Caito, Nicole M; Thompson, Vetta Sanders; Casey, Christopher M; Weaver, Nancy L; Jupka, Keri; Kreuter, Matthew W

    2007-04-01

    This article describes a new approach to formative research in which projective techniques commonly used in psychological assessment were adapted for use in focus groups to help design colorectal-cancer screening materials for African American men and women. Participants (N = 20) were divided into six "design teams." Each team was given a selection of design supplies and asked to create and discuss a visual layout for screening materials. Participants chose design elements that reflected visual preferences that they felt would connect meaningfully with other African Americans. The dynamics within the design teams were different than in traditional focus groups, with participants having more control over the group's direction. Using projective techniques helped draw out unique information from participants by allowing them to "project" their opinions onto objects. This approach may be a valuable tool for health-promotion and health-communication practitioners seeking insight on the implicit values of a priority population.

  8. [Self-relaxation techniques for glaucoma patients. Significance of autogenic training, hypnosis and music therapy].

    PubMed

    Bertelmann, T; Strempel, I

    2016-02-01

    Glaucoma is currently the second most common cause of severe visual impairment and blindness worldwide. Standard pharmaceutical and surgical interventions often fail to prevent progression of glaucomatous optic neuropathy. To evaluate whether adjuvantly applied self-relaxation techniques can significantly impact intraocular pressure, ocular perfusion and the overall mental state of affected patients, a search of the literature was carried out, and a comprehensive overview of currently available data is presented. Autogenic training, hypnosis and music therapy can significantly impact intraocular pressure, ocular perfusion and the overall mental state of patients suffering from glaucoma. As all of these adjuvant therapeutic options are cost-effective, available almost everywhere and at any time, and without any known side effects, they can be useful additional techniques in the overall concept for treating glaucoma patients. Regular ocular examinations by an ophthalmologist remain mandatory, however.

  9. Application of the intelligent techniques in transplantation databases: a review of articles published in 2009 and 2010.

    PubMed

    Sousa, F S; Hummel, A D; Maciel, R F; Cohrs, F M; Falcão, A E J; Teixeira, F; Baptista, R; Mancini, F; da Costa, T M; Alves, D; Pisa, I T

    2011-05-01

    The replacement of defective organs with healthy ones is an old problem, but only in recent years has it been put into practice. Improvements in the whole transplantation process have become increasingly important in clinical practice. In this context are clinical decision support systems (CDSSs), which reflect a significant amount of work using mathematical and intelligent techniques. The aim of this article was to review the intelligent techniques used in recent years (2009 and 2010) to analyze organ transplant databases. To this end, we searched the PubMed and Institute for Scientific Information (ISI) Web of Knowledge databases for articles published in 2009 and 2010 about intelligent techniques applied to transplantation databases. From 69 retrieved articles, we selected those meeting inclusion and exclusion criteria. The main techniques were: Artificial Neural Networks (ANN), Logistic Regression (LR), Decision Trees (DT), Markov Models (MM), and Bayesian Networks (BN). Most articles used ANN. Some publications described comparisons between techniques or the use of several techniques together. The use of intelligent techniques to extract knowledge from healthcare databases is increasingly common. Although authors preferred ANN, statistical techniques were equally effective for this enterprise.

  10. Electromagnetic Launch Vehicle Fairing and Acoustic Blanket Model of Received Power Using FEKO

    NASA Technical Reports Server (NTRS)

    Trout, Dawn H.; Stanley, James E.; Wahid, Parveen F.

    2011-01-01

    Evaluating the impact of radio frequency transmission in vehicle fairings is important for electromagnetically sensitive spacecraft. This study employs the multilevel fast multipole method (MLFMM) from a commercial electromagnetic tool, FEKO, to model the fairing electromagnetic environment in the presence of an internal transmitter with improved accuracy over industry-applied techniques. The fairing model includes material properties representative of the acoustic blanketing commonly used in vehicles. Equivalent surface material models within FEKO were successfully applied to simulate the test case. Finally, a simplified model is presented using Nicholson-Ross-Weir derived blanket material properties. These properties are implemented with the coated-metal option to reduce the model to one layer within the accuracy of the original three-layer simulation.
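
    The Nicholson-Ross-Weir extraction named above converts measured S-parameters of a material slab into complex permittivity and permeability. A simplified sketch for a free-space/TEM fixture (no waveguide cutoff) follows; the complex logarithm is taken on its principal branch, which holds only for electrically thin samples, and the S-parameter values are placeholders, not measured data.

```python
# Simplified Nicholson-Ross-Weir extraction (TEM, principal log branch).
import numpy as np

def nrw(s11, s21, thickness_m, freq_hz):
    lam0 = 3e8 / freq_hz                           # free-space wavelength
    x = (s11**2 - s21**2 + 1) / (2 * s11)
    gamma = x + np.sqrt(x**2 - 1 + 0j)
    if abs(gamma) > 1:                             # physical root: |gamma| <= 1
        gamma = x - np.sqrt(x**2 - 1 + 0j)
    t = (s11 + s21 - gamma) / (1 - (s11 + s21) * gamma)
    inv_lam = 1j * np.log(1 / t) / (2 * np.pi * thickness_m)
    mu_r = lam0 * inv_lam * (1 + gamma) / (1 - gamma)
    eps_r = lam0**2 * inv_lam**2 / mu_r
    return eps_r, mu_r

eps_r, mu_r = nrw(0.3 + 0.4j, 0.5 - 0.2j, 3e-3, 10e9)
print(f"eps_r ~ {eps_r:.2f}, mu_r ~ {mu_r:.2f}")
```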

  11. Applications of Two-Dimensional Electrophoresis Technology to the Study of Atherosclerosis

    PubMed Central

    Lepedda, Antonio J.

    2008-01-01

    Atherosclerosis is a multifactorial disease in which hypertension, diabetes, hyperlipidemia and other risk factors are thought to play a role. However, the molecular processes underlying plaque formation and progression are not yet completely known. In recent years some researchers have applied proteomics technologies to understand the biochemical pathways of atherogenesis and to search for new cardiovascular biomarkers to be utilized either as early diagnostic traits or as targets for new drug therapies. Due to its intrinsic complexity, the problem has been approached by different strategies, all of which have some limitations. In this review, we summarize the most common critical experimental variables in two-dimensional electrophoresis-based techniques and recent data obtained by applying proteomic approaches to the study of atherosclerosis. PMID:27683313

  12. OARSI Clinical Trials Recommendations: Hand imaging in clinical trials in osteoarthritis.

    PubMed

    Hunter, D J; Arden, N; Cicuttini, F; Crema, M D; Dardzinski, B; Duryea, J; Guermazi, A; Haugen, I K; Kloppenburg, M; Maheu, E; Miller, C G; Martel-Pelletier, J; Ochoa-Albíztegui, R E; Pelletier, J-P; Peterfy, C; Roemer, F; Gold, G E

    2015-05-01

    Tremendous advances have occurred in our understanding of the pathogenesis of hand osteoarthritis (OA) and these are beginning to be applied to trials targeted at modification of the disease course. The purpose of this expert opinion, consensus-driven exercise is to provide detail on how one might use and apply hand imaging assessments in disease modifying clinical trials. It includes information on acquisition methods/techniques (including guidance on positioning for radiography, and sequence/protocol recommendations/hardware for MRI); commonly encountered problems (including positioning, hardware and coil failures, and sequence artifacts); quality assurance/control procedures; measurement methods; measurement performance (reliability, responsiveness, validity); recommendations for trials; and research recommendations.

  13. Pressure Autoregulation Measurement Techniques in Adult Traumatic Brain Injury, Part I: A Scoping Review of Intermittent/Semi-Intermittent Methods.

    PubMed

    Zeiler, Frederick A; Donnelly, Joseph; Calviello, Leanne; Menon, David K; Smielewski, Peter; Czosnyka, Marek

    2017-12-01

    The purpose of this study was to perform a systematic, scoping review of commonly described intermittent/semi-intermittent autoregulation measurement techniques in adult traumatic brain injury (TBI). Nine separate systematic reviews were conducted, one for each intermittent technique: computed tomographic perfusion (CTP)/Xenon-CT (Xe-CT), positron emission tomography (PET), magnetic resonance imaging (MRI), the arteriovenous difference in oxygen (AVDO2) technique, the thigh cuff deflation technique (TCDT), the transient hyperemic response test (THRT), the orthostatic hypotension test (OHT), the mean flow index (Mx), and the transfer function autoregulation index (TF-ARI). MEDLINE, BIOSIS, EMBASE, Global Health, Scopus, the Cochrane Library (inception to December 2016), and reference lists of relevant articles were searched. A two-tier filter of references was applied. The total numbers of articles utilizing each of the nine searched intermittent/semi-intermittent autoregulation techniques in adult TBI were: CTP/Xe-CT (10), PET (6), MRI (0), AVDO2 (10), ARI-based TCDT (9), THRT (6), OHT (3), Mx (17), and TF-ARI (6). The premise behind all of the intermittent techniques is manipulation of systemic blood pressure/blood volume via either chemical (such as vasopressors) or mechanical (such as thigh cuffs or carotid compression) means. Exceptionally, Mx and TF-ARI are based on spontaneous fluctuations of cerebral perfusion pressure (CPP) or mean arterial pressure (MAP). The method for assessing the cerebral circulation during these manipulations varies, with both imaging-based techniques and TCD utilized. Despite the limited literature for intermittent/semi-intermittent techniques in adult TBI (Mx excepted), it is important to acknowledge the availability of such tests. They have provided fundamental insight into human autoregulatory capacity, leading to the development of continuous and more commonly applied techniques in the intensive care unit (ICU). Numerous methods of intermittent/semi-intermittent pressure autoregulation assessment in adult TBI exist, including: CTP/Xe-CT, PET, the AVDO2 technique, TCDT-based ARI, THRT, OHT, Mx, and TF-ARI. MRI-based techniques in adult TBI are yet to be described, with the main focus of MRI techniques on metabolic-based cerebrovascular reactivity (CVR) rather than pressure-based autoregulation.
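
    Of the indices above, Mx is the one computed from spontaneous slow waves, as a moving Pearson correlation between averaged cerebral perfusion pressure and transcranial Doppler flow velocity. The sketch below is schematic: the window lengths follow common practice (10-s means, 30 consecutive means) but are assumptions here, and the signals are synthetic placeholders.

```python
# Schematic mean flow index (Mx) from CPP and flow velocity recordings.
import numpy as np

def mx_index(cpp, fv, fs=50, mean_s=10, n_means=30):
    n = int(fs * mean_s)
    k = min(len(cpp), len(fv)) // n
    cpp10 = cpp[:k * n].reshape(k, n).mean(axis=1)   # 10-s averages
    fv10 = fv[:k * n].reshape(k, n).mean(axis=1)
    cors = [np.corrcoef(cpp10[i:i + n_means], fv10[i:i + n_means])[0, 1]
            for i in range(k - n_means + 1)]
    return float(np.mean(cors))                      # ~0 intact, ~1 impaired

rng = np.random.default_rng(7)
cpp = 70 + rng.normal(size=50 * 3600)                # 1 h at 50 Hz (synthetic)
fv = 60 + 0.2 * (cpp - 70) + rng.normal(size=cpp.size)
print(f"Mx = {mx_index(cpp, fv):.2f}")
```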

  14. Resonance line transfer calculations by doubling thin layers. I - Comparison with other techniques. II - The use of the R-parallel redistribution function. [planetary atmospheres

    NASA Technical Reports Server (NTRS)

    Yelle, Roger V.; Wallace, Lloyd

    1989-01-01

    A versatile and efficient technique for the solution of the resonance line scattering problem with frequency redistribution in planetary atmospheres is introduced. Similar to the doubling approach commonly used in monochromatic scattering problems, the technique has been extended to include the frequency dependence of the radiation field. Methods for solving problems with external or internal sources and coupled spectral lines are presented, along with comparison of some sample calculations with results from Monte Carlo and Feautrier techniques. The doubling technique has also been applied to the solution of resonance line scattering problems where the R-parallel redistribution function is appropriate, both neglecting and including polarization as developed by Yelle and Wallace (1989). With the constraint that the atmosphere is illuminated from the zenith, the only difficulty of consequence is that of performing precise frequency integrations over the line profiles. With that problem solved, it is no longer necessary to use the Monte Carlo method to solve this class of problem.

  15. Resting functional imaging tools (MRS, SPECT, PET and PCT).

    PubMed

    Van Der Naalt, J

    2015-01-01

    Functional imaging includes imaging techniques that provide information about the metabolic and hemodynamic status of the brain. The most commonly applied functional imaging techniques in patients with traumatic brain injury (TBI) are magnetic resonance spectroscopy (MRS), single photon emission computed tomography (SPECT), positron emission tomography (PET) and perfusion CT (PCT). These imaging modalities are used to determine the extent of injury, to provide information for the prediction of outcome, and to assess evidence of cerebral ischemia. In TBI, secondary brain damage mainly comprises ischemia and is present in more than 80% of fatal cases of traumatic brain injury (Graham et al., 1989; Bouma et al., 1991; Coles et al., 2004). While SPECT measures cerebral perfusion and MRS determines metabolism, PET is able to assess both perfusion and cerebral metabolism. This chapter describes the application of these techniques in traumatic brain injury separately for the major severity groups, comprising the mild and the moderate-to-severe group. The application in TBI and the potential difficulties of each technique are described. The use of imaging techniques in children is outlined separately. © 2015 Elsevier B.V. All rights reserved.

  16. Non-invasive imaging methods applied to neo- and paleo-ontological cephalopod research

    NASA Astrophysics Data System (ADS)

    Hoffmann, R.; Schultz, J. A.; Schellhorn, R.; Rybacki, E.; Keupp, H.; Gerden, S. R.; Lemanis, R.; Zachow, S.

    2014-05-01

    Several non-invasive methods are common practice in the natural sciences today. Here we present how they can be applied to, and contribute to, current topics in cephalopod (paleo-)biology. Different methods are compared in terms of the time necessary to acquire the data, the amount of data, accuracy/resolution, the minimum/maximum size of objects that can be studied, the degree of post-processing needed, and availability. The main application of the methods is seen in morphometry and volumetry of cephalopod shells. In particular, we present a method for precise buoyancy calculation. To this end, cephalopod shells were scanned together with reference bodies, an approach adopted from the medical sciences. The volumes of the reference bodies must be known, and their absorption properties should be similar to those of the object of interest; exact volumes can be obtained from surface scanning. Depending on the dimensions of the study object, different computed tomography techniques were applied.
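
    At its simplest, the reference-body approach reduces to calibrating the effective voxel volume of a scan against an object of known volume. A sketch of that arithmetic with hypothetical voxel counts (all numbers are placeholders, not the study's data):

    ```python
    # Calibrate a CT-derived shell volume against a reference body of known
    # volume scanned in the same acquisition (all values hypothetical).
    ref_volume_cm3 = 12.50        # known from a surface scan (assumed)
    ref_voxels = 1_250_000        # voxels segmented for the reference body
    shell_voxels = 3_741_000      # voxels segmented for the cephalopod shell

    voxel_volume = ref_volume_cm3 / ref_voxels
    shell_volume_cm3 = shell_voxels * voxel_volume
    print(f"shell volume = {shell_volume_cm3:.2f} cm^3")   # about 37.41 cm^3
    ```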

  17. Effects of scale of movement, detection probability, and true population density on common methods of estimating population density

    DOE PAGES

    Keiter, David A.; Davis, Amy J.; Rhodes, Olin E.; ...

    2017-08-25

    Knowledge of population density is necessary for effective management and conservation of wildlife, yet rarely are estimators compared in their robustness to effects of ecological and observational processes, which can greatly influence accuracy and precision of density estimates. For this study, we simulate biological and observational processes using empirical data to assess effects of animal scale of movement, true population density, and probability of detection on common density estimators. We also apply common data collection and analytical techniques in the field and evaluate their ability to estimate density of a globally widespread species. We find that animal scale of movement had the greatest impact on accuracy of estimators, although all estimators suffered reduced performance when detection probability was low, and we provide recommendations as to when each field and analytical technique is most appropriately employed. The large influence of scale of movement on estimator accuracy emphasizes the importance of effective post-hoc calculation of area sampled or use of methods that implicitly account for spatial variation. In particular, scale of movement impacted estimators substantially, such that area covered and spacing of detectors (e.g. cameras, traps, etc.) must reflect movement characteristics of the focal species to reduce bias in estimates of movement and thus density.

  19. Procedures for Obtaining and Analyzing Writing Samples of School-Age Children and Adolescents.

    PubMed

    Price, Johanna R; Jackson, Sandra C

    2015-10-01

    Many students' writing skills are below grade-level expectations, and students with oral language difficulties are at particular risk for writing difficulties. Speech-language pathologists' (SLPs') expertise in language applies to both the oral and written modalities, yet evidence suggests that SLPs' confidence regarding writing assessment is low. Writing samples are a clinically useful, criterion-referenced assessment technique that is relevant to helping students satisfy writing-related requirements of the Common Core State Standards (National Governors Association Center for Best Practices and Council of Chief State School Officers, 2010a). This article provides recommendations for obtaining and analyzing students' writing samples. In this tutorial, the authors provide a comprehensive literature review of methods regarding (a) collection of writing samples from narrative, expository (informational/explanatory), and persuasive (argument) genres; (b) variables of writing performance that are useful to assess; and (c) manual and computer-aided techniques for analyzing writing samples. The authors relate their findings to expectations for writing skills expressed in the Common Core State Standards (National Governors Association Center for Best Practices & Council of Chief State School Officers, 2010a). SLPs can readily implement many techniques for obtaining and analyzing writing samples. The information in this article provides SLPs with recommendations for the use of writing samples and may help increase SLPs' confidence regarding written language assessment.

  20. MR Imaging Applications in Mild Traumatic Brain Injury: An Imaging Update

    PubMed Central

    Wu, Xin; Kirov, Ivan I.; Gonen, Oded; Ge, Yulin; Grossman, Robert I.

    2016-01-01

    Mild traumatic brain injury (mTBI), also commonly referred to as concussion, affects millions of Americans annually. Although computed tomography is the first-line imaging technique for all traumatic brain injury, it is incapable of providing long-term prognostic information in mTBI. In the past decade, the amount of research related to magnetic resonance (MR) imaging of mTBI has grown exponentially, partly due to development of novel analytical methods, which are applied to a variety of MR techniques. Here, evidence of subtle brain changes in mTBI as revealed by these techniques, which are not demonstrable by conventional imaging, will be reviewed. These changes can be considered in three main categories of brain structure, function, and metabolism. Macrostructural and microstructural changes have been revealed with three-dimensional MR imaging, susceptibility-weighted imaging, diffusion-weighted imaging, and higher order diffusion imaging. Functional abnormalities have been described with both task-mediated and resting-state blood oxygen level–dependent functional MR imaging. Metabolic changes suggesting neuronal injury have been demonstrated with MR spectroscopy. These findings improve understanding of the true impact of mTBI and its pathogenesis. Further investigation may eventually lead to improved diagnosis, prognosis, and management of this common and costly condition. © RSNA, 2016 PMID:27183405

  1. Comparing the landscapes of common retroviral insertion sites across tumor models

    NASA Astrophysics Data System (ADS)

    Weishaupt, Holger; Čančer, Matko; Engström, Cristopher; Silvestrov, Sergei; Swartling, Fredrik J.

    2017-01-01

    Retroviral tagging represents an important technique, which allows researchers to screen for candidate cancer genes. The technique is based on the integration of retroviral sequences into the genome of a host organism, which might then lead to the artificial inhibition or expression of proximal genetic elements. The identification of potential cancer genes in this framework involves the detection of genomic regions (common insertion sites; CIS) which contain a number of such viral integration sites that is greater than expected by chance. During the last two decades, a number of different methods have been discussed for the identification of such loci, and the respective techniques have been applied to a variety of different retroviruses and/or tumor models. We have previously established a retrovirus-driven brain tumor model and reported the CISs which were found based on a Monte Carlo statistics-derived detection paradigm. In this study, we consider a recently proposed alternative graph-theory-based method for identifying CISs and compare the resulting CIS landscape in our brain tumor dataset to that obtained when using the Monte Carlo approach. Finally, we also employ the graph-based method to compare the CIS landscape in our brain tumor model with those of other published retroviral tumor models.
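
    A minimal version of a Monte Carlo CIS test (not the authors' exact paradigm) simulates uniformly random insertions and builds a null distribution for the maximum number of insertions falling in any window of fixed width; genome size, window width and counts below are placeholders.

    ```python
    import numpy as np

    def max_window_count(sites, window):
        """Maximum number of insertion sites in any window of given width."""
        sites = np.sort(sites)
        counts = np.searchsorted(sites, sites + window) - np.arange(len(sites))
        return counts.max()

    rng = np.random.default_rng(0)
    genome_len, n_ins, window, n_sim = 100_000_000, 500, 30_000, 1000
    null = [max_window_count(rng.integers(0, genome_len, n_ins), window)
            for _ in range(n_sim)]
    threshold = np.quantile(null, 0.95)   # observed windows exceeding this count
    print(threshold)                      # are candidate CISs at ~5% genome-wide error
    ```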

  2. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    NASA Astrophysics Data System (ADS)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

    The limitations of existing methods for determining which technologically interlinked construction and installation processes can be combined are considered under modern construction conditions for various types of facilities. The need to identify common parameters that characterize the nature of the interaction of all technologically related construction and installation processes and activities is shown. The technologies of construction and installation processes for buildings and structures were studied with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research is a quantitative measure of the interaction of construction and installation processes and activities: the minimum technologically necessary volume of a preceding process that allows a subsequent, technologically interconnected process to be planned and organized. This quantitative measure is used as the basis for calculating the optimal range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key to its wide use in forming organizational decisions. The practical significance of the developed technique is described.

  3. Joint inversion of multiple geophysical and petrophysical data using generalized fuzzy clustering algorithms

    NASA Astrophysics Data System (ADS)

    Sun, Jiajia; Li, Yaoguo

    2017-02-01

    Joint inversion that simultaneously inverts multiple geophysical data sets to recover a common Earth model is increasingly being applied to exploration problems. Petrophysical data can serve as an effective constraint to link different physical property models in such inversions. There are two challenges, among others, associated with the petrophysical approach to joint inversion. One is related to the multimodality of petrophysical data because there often exist more than one relationship between different physical properties in a region of study. The other challenge arises from the fact that petrophysical relationships have different characteristics and can exhibit point, linear, quadratic, or exponential forms in a crossplot. The fuzzy c-means (FCM) clustering technique is effective in tackling the first challenge and has been applied successfully. We focus on the second challenge in this paper and develop a joint inversion method based on variations of the FCM clustering technique. To account for the specific shapes of petrophysical relationships, we introduce several different fuzzy clustering algorithms that are capable of handling different shapes of petrophysical relationships. We present two synthetic and one field data examples and demonstrate that, by choosing appropriate distance measures for the clustering component in the joint inversion algorithm, the proposed joint inversion method provides an effective means of handling common petrophysical situations we encounter in practice. The jointly inverted models have both enhanced structural similarity and increased petrophysical correlation, and better represent the subsurface in the spatial domain and the parameter domain of physical properties.
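
    For reference, standard FCM alternates membership and center updates; the shape-aware variants the authors introduce effectively replace the Euclidean distance `d` below with measures matched to point, linear, quadratic or exponential petrophysical relationships. A minimal textbook sketch, not the paper's implementation:

    ```python
    import numpy as np

    def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, tol=1e-6, seed=0):
        """Textbook FCM on rows of X; returns cluster centers and memberships."""
        rng = np.random.default_rng(seed)
        U = rng.random((X.shape[0], n_clusters))
        U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            # Euclidean distances; swapping this measure yields the variants
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U_new = 1.0 / (d ** (2 / (m - 1)) *
                           np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
            if np.abs(U_new - U).max() < tol:
                return centers, U_new
            U = U_new
        return centers, U
    ```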

  4. An expert system shell for inferring vegetation characteristics: Interface for the addition of techniques (Task H)

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann

    1993-01-01

    All the NASA VEGetation Workbench (VEG) goals except the Learning System provide the scientist with several different techniques. When VEG is run, rules assist the scientist in selecting the best of the available techniques to apply to the sample of cover-type data being studied. The techniques are stored in the VEG knowledge base. An interface that enables the scientist to add new techniques to VEG without assistance from the developer was designed and implemented. This interface does not require the scientist to have a thorough knowledge of the Knowledge Engineering Environment (KEE) by Intellicorp or a detailed knowledge of the structure of VEG. The interface prompts the scientist to enter the required information about the new technique, including the Common Lisp functions for executing the technique and the left-hand side of the rule that causes the technique to be selected. A template for each function and rule is displayed, along with detailed instructions about the arguments of the functions, the values they should return, and the format of the rule. Checks are made to ensure that the required data were entered, the functions compiled correctly, and the rule parsed correctly before the new technique is stored. The additional techniques are stored separately from the VEG knowledge base and are not normally loaded when the knowledge base is loaded; the interface gives the scientist the option of adding all previously defined new techniques before running VEG. When the techniques are added, the units required to store them are created automatically in the correct places in the VEG knowledge base, the methods file containing the functions required by the additional techniques is loaded, and new rule units are created to store the new rules. The interface that allows the scientist to select which techniques to use is updated automatically to include the new techniques. Task H was completed: the interface that allows the scientist to add techniques to VEG was implemented and comprehensively tested. The Common Lisp code for the Add Techniques system is listed in Appendix A.

  5. Application of Tissue Culture and Transformation Techniques in Model Species Brachypodium distachyon.

    PubMed

    Sogutmaz Ozdemir, Bahar; Budak, Hikmet

    2018-01-01

    Brachypodium distachyon has recently emerged as a model plant species for the grass family (Poaceae) that includes major cereal crops and forage grasses. One of the important traits of a model species is its capacity to be transformed and its ease of growth both in tissue culture and under greenhouse conditions. Hence, plant transformation technology is crucial for improvements in agricultural studies, both for the study of new genes and for the production of new transgenic plant species. In this chapter, we review an efficient tissue culture protocol and two transformation systems for Brachypodium based on the two most commonly preferred gene transfer techniques in plant species: the microprojectile bombardment method (biolistics) and Agrobacterium-mediated transformation. In plant transformation studies, immature embryos are the most frequently used explant material because of their higher transformation efficiency and regeneration capacity; mature embryos, however, are available throughout the year. We describe a tissue culture protocol for Brachypodium that uses mature embryos of selected inbred lines from our collection. Embryogenic calluses obtained from the mature embryos are used to transform Brachypodium with both transformation techniques, revised from protocols previously applied in the grasses by, for example, applying vacuum infiltration, varying the wounding treatment, modifying the inoculation and cocultivation steps, or optimizing the bombardment parameters.

  6. Variational Bayesian Parameter Estimation Techniques for the General Linear Model

    PubMed Central

    Starke, Ludger; Ostwald, Dirk

    2017-01-01

    Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
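
    As a concrete anchor for the GLM setting: when the error covariance V is known, the ML estimate of the coefficients is the generalized least-squares estimator beta = (X'V^-1 X)^-1 X'V^-1 y. In VB/VML/ReML the parameters of V are themselves estimated, which the toy sketch below sidesteps by assuming a known AR(1) covariance (rho and the true betas are illustrative).

    ```python
    import numpy as np

    def gls_beta(X, y, V):
        """ML/GLS coefficient estimate for a GLM with known error covariance V."""
        Vi = np.linalg.inv(V)
        return np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)

    # toy fMRI-like example with AR(1) serial correlation (rho assumed known)
    rng = np.random.default_rng(1)
    n, rho = 200, 0.4
    X = np.column_stack([np.ones(n), rng.standard_normal(n)])
    V = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    y = X @ np.array([1.0, 0.5]) + np.linalg.cholesky(V) @ rng.standard_normal(n)
    print(gls_beta(X, y, V))   # close to the true coefficients [1.0, 0.5]
    ```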

  7. Label-free in vivo analysis of intracellular lipid droplets in the oleaginous microalga Monoraphidium neglectum by coherent Raman scattering microscopy.

    PubMed

    Jaeger, Daniel; Pilger, Christian; Hachmeister, Henning; Oberländer, Elina; Wördenweber, Robin; Wichmann, Julian; Mussgnug, Jan H; Huser, Thomas; Kruse, Olaf

    2016-10-21

    Oleaginous photosynthetic microalgae hold great promise as non-food feedstocks for the sustainable production of bio-commodities. The algal lipid quality can be analysed by Raman micro-spectroscopy, and the lipid content can be imaged in vivo in a label-free and non-destructive manner by coherent anti-Stokes Raman scattering (CARS) microscopy. In this study, both techniques were applied to the oleaginous microalga Monoraphidium neglectum, a biotechnologically promising microalga resistant to commonly applied lipid staining techniques. The lipid-specific CARS signal was successfully separated from the interfering two-photon excited fluorescence of chlorophyll and, for the first time, lipid droplet formation during nitrogen starvation could be analysed directly. We found that the neutral lipid content deduced from CARS image analysis strongly correlated with the neutral lipid content measured gravimetrically and, furthermore, that the relative degree of unsaturation of fatty acids stored in lipid droplets remained similar. Interestingly, the lipid profile during cellular adaption to nitrogen starvation showed a two-phase characteristic, with initial fatty acid recycling and subsequent de novo lipid synthesis. This work demonstrates the potential of quantitative CARS microscopy as a label-free lipid analysis technique for any microalgal species, which is highly relevant for future biotechnological applications and for elucidating the process of microalgal lipid accumulation.

  8. Applying under-sampling techniques and cost-sensitive learning methods on risk assessment of breast cancer.

    PubMed

    Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho

    2015-04-01

    Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume large amounts of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment; it can serve as a pre-screening program in low-cost settings. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital between January 1 and December 31, 2008. Based on this dataset, we first apply sampling techniques and a dimension reduction method to preprocess the data. Then, we construct various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict the risk. The cost-sensitive method with a random forest classifier achieves a recall (sensitivity) of 100%. At this recall, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with the random forest classifier were 2.9% and 14.87%, respectively. In this study, we build a breast cancer risk assessment model using data mining techniques; the model has the potential to serve as an assisting tool in breast cancer screening.
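
    A hedged sketch of the two ingredients named in the title, random under-sampling and cost-sensitive learning, using scikit-learn; the features, labels and the 20:1 cost ratio are placeholders, not values from the study.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def undersample(X, y, seed=0):
        """Randomly drop majority-class rows until the classes are balanced."""
        rng = np.random.default_rng(seed)
        pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
        keep = rng.choice(neg, size=len(pos), replace=False)
        idx = np.concatenate([pos, keep])
        return X[idx], y[idx]

    # X, y: personal-health-information features and labels (placeholders)
    X = np.random.rand(1000, 8)
    y = (np.random.rand(1000) < 0.05).astype(int)
    Xb, yb = undersample(X, y)
    clf = RandomForestClassifier(
        n_estimators=200,
        class_weight={0: 1, 1: 20},   # penalize missed positives 20x (assumed)
    ).fit(Xb, yb)
    ```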

  9. The in vitro use of the hair follicle closure technique to study the follicular and percutaneous permeation of topically applied drugs.

    PubMed

    Stahl, Jessica; Niedorf, Frank; Wohlert, Mareike; Kietzmann, Manfred

    2012-03-01

    Recent studies on follicular permeation emphasise the importance of hair follicles as diffusion pathways, but only a limited amount of data are available about the follicular permeation of topically applied drugs. This study examines the use of a hair follicle closure technique in vitro, to determine the participation of hair follicles in transdermal drug penetration. Various substances, with different lipophilicities, were tested: caffeine, diclofenac, flufenamic acid, ibuprofen, paracetamol, salicylic acid and testosterone. Diffusion experiments were conducted with porcine skin, the most common replacement material for human skin, in Franz-type diffusion cells over 28 hours. Different experimental settings allowed the differentiation between interfollicular and follicular permeation after topical application of the test compounds. A comparison of the apparent permeability coefficients of the drugs demonstrates that the percutaneous permeations of caffeine and flufenamic acid were significantly higher along the hair follicles. In the cases of paracetamol and testosterone, the follicular pathway appears to be of importance, while no difference was found between interfollicular and follicular permeation for diclofenac, ibuprofen and salicylic acid. Thus, the hair follicle closure technique represents an adequate in vitro method for gaining information about follicular or percutaneous permeation, and can replace in vivo testing in animals or humans. 2012 FRAME.
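
    For context, Franz-cell data of this kind are typically summarized by the apparent permeability coefficient, Papp = steady-state flux / (A * C0), with the flux taken as the slope of the cumulative permeated amount versus time; the numbers below are illustrative, not the study's data.

    ```python
    import numpy as np

    # cumulative permeated amount (ug) over time (h); illustrative values
    t = np.array([4, 8, 12, 16, 20, 24, 28], dtype=float)
    q = np.array([1.1, 3.0, 5.2, 7.3, 9.4, 11.6, 13.7])   # steady-state region

    area_cm2 = 1.77          # diffusion area of the Franz cell (assumed)
    c0_ug_per_ml = 1000.0    # donor concentration (assumed); ug/mL = ug/cm^3

    flux = np.polyfit(t, q, 1)[0] / area_cm2     # ug / (cm^2 * h)
    p_app_cm_per_h = flux / c0_ug_per_ml
    print(f"Papp = {p_app_cm_per_h:.2e} cm/h")
    ```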

  10. Short-term response of Holcus lanatus L. (Common Velvetgrass) to chemical and manual control at Yosemite National Park, USA

    USGS Publications Warehouse

    Jones, Laura J.; Ostoja, Steven M.; Brooks, Matthew L.; Hutten, Martin

    2015-01-01

    One of the highest priority invasive species at both Yosemite and Sequoia and Kings Canyon national parks is Holcus lanatus L. (common velvetgrass), a perennial bunchgrass that invades mid-elevation montane meadows. Despite velvetgrass being a high-priority species, there is little information available on control techniques. The goal of this project was to evaluate the short-term response to a single application of common chemical and manual velvetgrass control techniques. The study was conducted at three montane sites in Yosemite National Park. Glyphosate spot-spray treatments were applied at 0.5, 1.0, 1.5, and 2.0% concentrations and compared with hand pulling to evaluate effects on cover of common velvetgrass, cover of other plant species, and community species richness. Posttreatment year 1 cover of common velvetgrass was 12.1% ± 1.6 in control plots, 6.3% ± 1.5 averaged over the four chemical treatments (all chemical treatments performed similarly), and 13.6% ± 1.7 for hand-pulled plots. This represents an approximately 50% reduction in common velvetgrass cover in chemically treated plots recorded in posttreatment year 1, and no statistically significant reduction in hand-pulled plots compared with controls. However, there was no treatment effect in posttreatment year 2, and all herbicide application rates performed similarly. In addition, there were no significant treatment effects on nontarget species or species richness. These results suggest that for this level of infestation and habitat type, (1) one year of hand pulling is not an effective control method and (2) glyphosate provides some level of control in the short term without impact to nontarget plant species, but the effect is temporary, as a single year of glyphosate treatment is ineffective over a two-year period.

  11. Induction motor inter turn fault detection using infrared thermographic analysis

    NASA Astrophysics Data System (ADS)

    Singh, Gurmeet; Anil Kumar, T. Ch.; Naikan, V. N. A.

    2016-07-01

    Induction motors are the most commonly used prime movers in industries. They are subjected to various environmental, thermal and load stresses that ultimately reduce motor efficiency and later lead to failure. The inter-turn fault is the second most commonly observed fault in motors and is considered the most severe: it can lead to the failure of a complete phase and can even cause accidents if left undetected or untreated. This paper proposes an online, non-invasive technique that uses infrared thermography to detect the presence of an inter-turn fault in an induction motor drive. Two methods are proposed that detect the fault and estimate its severity. One uses transient thermal monitoring during motor start-up; the other applies a pseudo-coloring technique to an infrared image of the motor after it reaches thermal steady state. The designed template for pseudo-coloring is in accordance with the InterNational Electrical Testing Association (NETA) thermographic standard. An index is proposed to assess the severity of the fault present in the motor.

  12. Fluorescent in situ hybridization: an effective and less costly technique for genetic evaluation of products of conception in pregnancy losses.

    PubMed

    Fejgin, Moshe D; Pomeranz, Meir; Liberman, Meytal; Fishman, Ami; Amiel, Aliza

    2005-09-01

    In this study, we applied the fluorescent in situ hybridization (FISH) technique to compare the frequencies of common numerical abnormalities of chromosomes 13, 16, 18, 21, X, and Y between spontaneous and elective abortions. These chromosomes account for about 75% of the common aneuploidies in spontaneous abortion. Placentas were taken from 59 patients with a first-trimester spontaneous abortion and 61 patients who underwent an elective first-trimester pregnancy termination; gestational ages ranged from 5 to 12 weeks. Placentas were processed by direct chorionic villus preparation, and direct dual-color FISH was performed according to the Vysis protocol with probes for chromosomes 13, 16, 18, 21, X, and Y. The aneuploidy rate was 55.9% in spontaneous abortions and 8.2% in elective abortions, a significant difference between the two groups (P = 6 × 10⁻⁹). FISH is a rapid, efficient, and relatively inexpensive tool for detecting aneuploidy in placentas from cases of spontaneous abortion. Our rate of detected aneuploidy is compatible with other reports in which conventional cytogenetics was utilized.
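
    The reported rates imply roughly 33 of 59 aneuploid spontaneous abortions versus 5 of 61 elective ones. A quick significance check on that reconstructed 2×2 table (Fisher's exact test here, which need not match the authors' test exactly):

    ```python
    from scipy.stats import fisher_exact

    # aneuploid vs euploid counts reconstructed from the reported rates:
    # spontaneous: 55.9% of 59 is about 33; elective: 8.2% of 61 is about 5
    table = [[33, 59 - 33],
             [5, 61 - 5]]
    odds_ratio, p = fisher_exact(table)
    print(odds_ratio, p)   # p far below 0.05, consistent with the reported difference
    ```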

  13. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from a model of the system under test. However, many types of models are used in MBT, and if the model type changes, all functions of a search technique must be reimplemented, because the model types differ even when the same search technique is applied. Implementing the same algorithm over and over again takes too much time and effort. We propose a model-independent software framework for SBST that can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. It not only presents design patterns for finding test cases for a target model but also reduces development time through the common functions provided by the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of model. PMID:25302314
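
    The framework's central idea, writing a search technique once against an abstract model interface, can be sketched as follows; the `Model` protocol and `hill_climb` function are hypothetical illustrations, not the framework's actual API.

    ```python
    from typing import List, Protocol, TypeVar

    S = TypeVar("S")

    class Model(Protocol[S]):
        """Anything searchable: supplies candidate solutions and scores them."""
        def random_solution(self) -> S: ...
        def neighbors(self, s: S) -> List[S]: ...
        def fitness(self, s: S) -> float: ...   # e.g. coverage of a test target

    def hill_climb(model: Model[S], steps: int = 1000) -> S:
        """Search technique written once, reusable across model types."""
        best = model.random_solution()
        for _ in range(steps):
            cand = max(model.neighbors(best), key=model.fitness, default=best)
            if model.fitness(cand) <= model.fitness(best):
                break                            # local optimum reached
            best = cand
        return best
    ```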

  14. Laser tracker orientation in confined space using on-board targets

    NASA Astrophysics Data System (ADS)

    Gao, Yang; Kyle, Stephen; Lin, Jiarui; Yang, Linghui; Ren, Yu; Zhu, Jigui

    2016-08-01

    This paper presents a novel orientation method for two laser trackers using on-board targets attached to the tracker head and rotating with it. The technique extends an existing method developed for theodolite intersection systems which are now rarely used. This method requires only a very narrow space along the baseline between the instrument heads, in order to establish the orientation relationship. This has potential application in environments where space is restricted. The orientation parameters can be calculated by means of two-face reciprocal measurements to the on-board targets, and measurements to a common point close to the baseline. An accurate model is then applied which can be solved through nonlinear optimization. Experimental comparison has been made with the conventional orientation method, which is based on measurements to common intersection points located off the baseline. This requires more space and the comparison has demonstrated the feasibility of the more compact technique presented here. Physical setup and testing suggest that the method is practical. Uncertainties estimated by simulation indicate good performance in terms of measurement quality.

  15. Integrated geophysical investigations in a fault zone located on southwestern part of İzmir city, Western Anatolia, Turkey

    NASA Astrophysics Data System (ADS)

    Drahor, Mahmut G.; Berge, Meriç A.

    2017-01-01

    Integrated geophysical investigations, consisting of the joint application of various geophysical techniques, have become a major tool of active tectonic investigations. The choice of techniques depends on the geological features, tectonic and fault characteristics of the study area, the required resolution and penetration depth, and also financial support. Fault geometry and offsets, sediment thickness and properties, features of folded strata and tectonic characteristics of near-surface sections of the subsurface can therefore be thoroughly determined using integrated geophysical approaches. Although Ground Penetrating Radar (GPR), Electrical Resistivity Tomography (ERT) and Seismic Refraction Tomography (SRT) are the methods most commonly used in active tectonic investigations, other geophysical techniques can also contribute by recovering different properties in the complex geological environments of tectonically active sites. In this study, six different geophysical methods were used to define faulting locations and characteristics around the study area: GPR, ERT, SRT, very low frequency electromagnetics (VLF), magnetics and self-potential (SP). Overall, the integrated geophysical approach used in this study gave important results about the near-surface geological properties and faulting characteristics of the investigation area. After integrated interpretation of the geophysical surveys, we determined an optimal trench location for paleoseismological studies. The main geological properties associated with the faulting process were obtained from the subsequent trenching studies. In addition, the geophysical results pointed to some indications concerning the active faulting mechanism in the investigated area. Consequently, the trenching studies indicate that the integrated geophysical approach applied to the fault problem yields very useful and interpretable results describing various properties of the faulting zone at the investigated site.

  16. Using G-Theory to Enhance Evidence of Reliability and Validity for Common Uses of the Paulhus Deception Scales.

    PubMed

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-01-01

    We applied a new approach to Generalizability theory (G-theory) involving parallel splits and repeated measures to evaluate common uses of the Paulhus Deception Scales based on polytomous and four types of dichotomous scoring. G-theory indices of reliability and validity accounting for specific-factor, transient, and random-response measurement error supported use of polytomous over dichotomous scores as contamination checks; as control, explanatory, and outcome variables; as aspects of construct validation; and as indexes of environmental effects on socially desirable responding. Polytomous scoring also provided results for flagging faking as dependable as those when using dichotomous scoring methods. These findings argue strongly against the nearly exclusive use of dichotomous scoring for the Paulhus Deception Scales in practice and underscore the value of G-theory in demonstrating this. We provide guidelines for applying our G-theory techniques to other objectively scored clinical assessments, for using G-theory to estimate how changes to a measure might improve reliability, and for obtaining software to conduct G-theory analyses free of charge.

  17. VALIDATION OF MICROSATELLITE MARKERS FOR USE IN GENOTYPING POLYCLONAL PLASMODIUM FALCIPARUM INFECTIONS

    PubMed Central

    GREENHOUSE, BRYAN; MYRICK, ALISSA; DOKOMAJILAR, CHRISTIAN; WOO, JONATHAN M.; CARLSON, ELAINE J.; ROSENTHAL, PHILIP J.; DORSEY, GRANT

    2006-01-01

    Genotyping methods for Plasmodium falciparum drug efficacy trials have not been standardized and may fail to accurately distinguish recrudescence from new infection, especially in high transmission areas where polyclonal infections are common. We developed a simple method for genotyping using previously identified microsatellites and capillary electrophoresis, validated this method using mixtures of laboratory clones, and applied the method to field samples. Two microsatellite markers produced accurate results for single-clone but not polyclonal samples. Four other microsatellite markers were as sensitive as, and more specific than, commonly used genotyping techniques based on merozoite surface proteins 1 and 2. When applied to samples from 15 patients in Burkina Faso with recurrent parasitemia after treatment with sulphadoxine-pyrimethamine, the addition of these four microsatellite markers to msp1 and msp2 genotyping resulted in a reclassification of outcomes that strengthened the association between dhfr 59R, an anti-folate resistance mutation, and recrudescence (P = 0.31 versus P = 0.03). Four microsatellite markers performed well on polyclonal samples and may provide a valuable addition to genotyping for clinical drug efficacy studies in high transmission areas. PMID:17123974

  18. Modeling EEG Waveforms with Semi-Supervised Deep Belief Nets: Fast Classification and Anomaly Measurement

    PubMed Central

    Wulsin, D. F.; Gupta, J. R.; Mani, R.; Blanco, J. A.; Litt, B.

    2011-01-01

    Clinical electroencephalography (EEG) records vast amounts of complex human data yet is still reviewed primarily by human readers. Deep Belief Nets (DBNs) are a relatively new type of multi-layer neural network commonly tested on two-dimensional image data but rarely applied to time-series data such as EEG. We apply DBNs in a semi-supervised paradigm to model EEG waveforms for classification and anomaly detection. DBN performance was comparable to standard classifiers on our EEG dataset, and classification time was found to be 1.7 to 103.7 times faster than the other high-performing classifiers. We demonstrate how the unsupervised step of DBN learning produces an autoencoder that can naturally be used in anomaly measurement. We compare the use of raw, unprocessed data—a rarity in automated physiological waveform analysis—to hand-chosen features and find that raw data produces comparable classification and better anomaly measurement performance. These results indicate that DBNs and raw data inputs may be more effective for online automated EEG waveform recognition than other common techniques. PMID:21525569

  19. Quantifying time-varying cellular secretions with local linear models.

    PubMed

    Byers, Jeff M; Christodoulides, Joseph A; Delehanty, James B; Raghu, Deepa; Raphael, Marc P

    2017-07-01

    Extracellular protein concentrations and gradients initiate a wide range of cellular responses, such as cell motility, growth, proliferation and death. Understanding inter-cellular communication requires spatio-temporal knowledge of these secreted factors and their causal relationship with cell phenotype. Techniques which can detect cellular secretions in real time are becoming more common but generalizable data analysis methodologies which can quantify concentration from these measurements are still lacking. Here we introduce a probabilistic approach in which local-linear models and the law of mass action are applied to obtain time-varying secreted concentrations from affinity-based biosensor data. We first highlight the general features of this approach using simulated data which contains both static and time-varying concentration profiles. Next we apply the technique to determine concentration of secreted antibodies from 9E10 hybridoma cells as detected using nanoplasmonic biosensors. A broad range of time-dependent concentrations was observed: from steady-state secretions of 230 pM near the cell surface to large transients which reached as high as 56 nM over several minutes and then dissipated.
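
    A deterministic caricature of the inversion step (the paper's approach is probabilistic): under 1:1 Langmuir binding, fractional occupancy obeys b' = k_on·C·(1−b) − k_off·b, so a local-linear slope estimate of b(t) can be solved for C(t). Window size and rate constants below are assumptions for illustration.

    ```python
    import numpy as np

    def concentration_from_binding(t, b, k_on, k_off, half=5):
        """Invert fractional sensor occupancy b(t) for analyte concentration C(t)
        using b' = k_on*C*(1-b) - k_off*b and local linear slope estimates."""
        c = np.full(len(t), np.nan)
        for i in range(half, len(t) - half):
            # slope of b over a short window centered on sample i
            slope = np.polyfit(t[i - half:i + half + 1],
                               b[i - half:i + half + 1], 1)[0]
            c[i] = (slope + k_off * b[i]) / (k_on * (1.0 - b[i]))
        return c
    ```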

  20. ComDim for explorative multi-block data analysis of Cantal-type cheeses: Effects of salts, gentle heating and ripening.

    PubMed

    Loudiyi, M; Rutledge, D N; Aït-Kaddour, A

    2018-10-30

    The Common Dimension (ComDim) chemometric method for multi-block data analysis was employed to evaluate the impact of different added salts and ripening times on the physicochemical, color, dynamic low-amplitude oscillatory rheology, texture profile, and molecular structure (fluorescence and MIR spectroscopy) properties of five Cantal-type cheeses. First, Independent Component Analysis (ICA) was applied separately to the fluorescence and MIR spectra in order to extract the relevant signal sources and the associated proportions related to molecular structure characteristics. ComDim was then applied to the 31 data tables corresponding to the proportions of the ICA signals obtained for the spectral methods and to the global analysis of the cheeses by the other techniques. The ComDim results indicated that cheeses made with 50% NaCl or with 75:25% NaCl/KCl generally exhibited equivalent structural, textural, meltability and color properties. The proposed methodology demonstrates the applicability of ComDim for the characterization of samples when different techniques describe the same samples. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Thermal lensing compensation optics for high power lasers

    NASA Astrophysics Data System (ADS)

    Scaggs, Michael; Haas, Gil

    2011-03-01

    Athermalization of focusing objectives is a common technique for optimizing imaging systems in the infrared, where thermal effects are a major concern. Athermalization is generally done over the spectrum of interest and is not generally applied to a single wavelength. The predominant glass used with high-power near-infrared lasers at one micron, such as Nd:YAG and fiber lasers, is fused silica, which has excellent thermal properties. All glasses, however, have a temperature coefficient of the index of refraction (dn/dT): as the glass heats up, its index of refraction changes. Most glasses, fused silica included, have a positive dn/dT, which causes the focal length of a lens to decrease as the temperature rises. Many of the fluoride glasses, such as CaF2, BaF2 and LiF, have a negative dn/dT. By applying athermalization techniques of glass selection and optical design, the thermal lensing in a laser objective of a high-power laser system can be substantially mitigated. We describe a passive method for minimizing thermal lensing of high-power laser optics.
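
    The sign argument can be made concrete: holding the surface radii fixed, 1/f is proportional to (n − 1), so df/dT ≈ −f·(dn/dT)/(n − 1) when thermal expansion is neglected. The dn/dT and index values below are rough room-temperature figures near 1 µm, used only for illustration.

    ```python
    # Focal-length drift from dn/dT alone (thermal expansion neglected):
    # 1/f = (n - 1) * K  =>  df/dT = -f * (dn/dT) / (n - 1)
    def df_dT(f_mm, n, dn_dT):
        return -f_mm * dn_dT / (n - 1.0)

    # approximate values near 1 um, for illustration only
    print(df_dT(100.0, 1.45, +1.0e-5))   # fused silica: about -2.2e-3 mm/K
    print(df_dT(100.0, 1.43, -1.0e-5))   # CaF2: opposite sign, can offset the drift
    ```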

  2. Synchroton and Simulations Techniques Applied to Problems in Materials Science: Catalysts and Azul Maya Pigments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chianelli, R.

    2005-01-12

    Development of synchrotron techniques for the determination of the structure of disordered, amorphous and surface materials has exploded over the past twenty years due to the increasing availability of high flux synchrotron radiation and the continuing development of increasingly powerful synchrotron techniques. These techniques are available to materials scientists who are not necessarily synchrotron scientists through interaction with effective user communities that exist at synchrotrons such as the Stanford Synchrotron Radiation Laboratory (SSRL). In this article we review the application of multiple synchrotron characterization techniques to two classes of materials defined as "surface compounds." One class of surface compounds comprises materials like MoS2-xCx, widely used petroleum catalysts that improve the environmental properties of transportation fuels; these compounds may be viewed as "sulfide-supported carbides" in their catalytically active states. The second class of "surface compounds" is the "Maya Blue" pigments, which are based on technology created by the ancient Maya. These compounds are organic/inorganic "surface complexes" consisting of the dye indigo and palygorskite, a common clay. The identification of both surface compounds relies on the application of synchrotron techniques as described in this report.

  3. The role of printing techniques for large-area dye sensitized solar cells

    NASA Astrophysics Data System (ADS)

    Mariani, Paolo; Vesce, Luigi; Di Carlo, Aldo

    2015-10-01

    The versatility of printing technologies and their intrinsic ability to outperform other techniques in large-area deposition gives scope to revolutionize the photovoltaic (PV) manufacturing field. Printing methods are commonly used in conventional silicon-based PVs to cover part of the production process. Screen printing techniques, for example, are applied to deposit electrical contacts on the silicon wafer. However, it is with the advent of third generation PVs that printing/coating techniques have been extensively used in almost all of the manufacturing processes. Among all the third generation PVs, dye sensitized solar cell (DSSC) technology has been developed up to commercialization levels. DSSCs and modules can be fabricated by adopting all of the main printing techniques on both rigid and flexible substrates. This allows an easy tuning of cell/module characteristics to the desired application. Transparency, colour, shape, layout and other DSSC’s features can be easily varied by changing the printing parameters and paste/ink formulations used in the printing process. This review focuses on large-area printing/coating technologies for the fabrication of DSSCs devices. The most used and promising techniques are presented underlining the process parameters and applications.

  4. International Space Station Powered Bolt Nut Anomaly and Failure Analysis Summary

    NASA Technical Reports Server (NTRS)

    Sievers, Daniel E.; Warden, Harry K.

    2010-01-01

    A key mechanism used in the on-orbit assembly of the International Space Station (ISS) pressurized elements is the Common Berthing Mechanism. The mechanism that effects the structural connection of the Common Berthing Mechanism halves is the Powered Bolt Assembly; there are sixteen Powered Bolt Assemblies per Common Berthing Mechanism. The Common Berthing Mechanism has a bolt which engages a self-aligning Powered Bolt Nut (PBN) on the mating interface (Figure 1). The Powered Bolt Assemblies are preloaded to approximately 84.5 kN (19000 lb) prior to pressurization of the CBM. The PBNs mentioned below, manufactured in 2009, will be used on future ISS missions. An on-orbit functional failure of this hardware would be unacceptable and in some instances catastrophic, due to the failure of modules to mate and seal the atmosphere, risking loss of crew and ISS functions. The manufacturing processes that create the PBNs need to be strictly controlled: functional (torque vs. tension) acceptance test failures will be the result of processes not being strictly followed. Without proper knowledge of thread tolerances, fabrication techniques, and dry film lubricant application processes, PBNs will be, and have been, manufactured improperly. The knowledge gained from acceptance test failures and their resolution, thread fabrication techniques, and thread dry film lubrication processes can be applied to many aerospace mechanisms to enhance their performance. Test data and manufactured PBN thread geometry are discussed for both failed and successfully accepted PBNs.

  5. Comparison of Attenuated Total Reflectance Mid-Infrared, Near Infrared, and 1H-Nuclear Magnetic Resonance Spectroscopies for the Determination of Coffee's Geographical Origin

    PubMed Central

    Caro Rodríguez, Diana; Arana, Victoria A.; Bernal, Andrés; Esseiva, Pierre

    2017-01-01

    The sensorial properties of Colombian coffee are renowned worldwide, which is reflected in its market value. This raises the threat of fraud by adulteration using coffee grains from other countries, thus creating a demand for robust and cost-effective methods for the determination of geographical origin of coffee samples. Spectroscopic techniques such as Nuclear Magnetic Resonance (NMR), near infrared (NIR), and mid-infrared (mIR) have arisen as strong candidates for the task. Although a body of work exists that reports on their individual performances, a faithful comparison has not been established yet. We evaluated the performance of 1H-NMR, Attenuated Total Reflectance mIR (ATR-mIR), and NIR applied to fraud detection in Colombian coffee. For each technique, we built classification models for discrimination by species (C. arabica versus C. canephora (or robusta)) and by origin (Colombia versus other C. arabica) using a common set of coffee samples. All techniques successfully discriminated samples by species, as expected. Regarding origin determination, ATR-mIR and 1H-NMR showed comparable capacity to discriminate Colombian coffee samples, while NIR fell short by comparison. In conclusion, ATR-mIR, a less common technique in the field of coffee adulteration and fraud detection, emerges as a strong candidate, faster and with lower cost compared to 1H-NMR and more discriminating compared to NIR. PMID:29201055

  6. The Comparison of Matching Methods Using Different Measures of Balance: Benefits and Risks Exemplified within a Study to Evaluate the Effects of German Disease Management Programs on Long-Term Outcomes of Patients with Type 2 Diabetes.

    PubMed

    Fullerton, Birgit; Pöhlmann, Boris; Krohn, Robert; Adams, John L; Gerlach, Ferdinand M; Erler, Antje

    2016-10-01

    To present a case study on how to compare various matching methods applying different measures of balance and to point out some pitfalls involved in relying on such measures. Administrative claims data from a German statutory health insurance fund covering the years 2004-2008. We applied three different covariance balance diagnostics to a choice of 12 different matching methods used to evaluate the effectiveness of the German disease management program for type 2 diabetes (DMPDM2). We further compared the effect estimates resulting from applying these different matching techniques in the evaluation of the DMPDM2. The choice of balance measure leads to different results on the performance of the applied matching methods. Exact matching methods performed well across all measures of balance, but resulted in the exclusion of many observations, leading to a change of the baseline characteristics of the study sample and also the effect estimate of the DMPDM2. All PS-based methods showed similar effect estimates. Applying a higher matching ratio and using a larger variable set generally resulted in better balance. Using a generalized boosted instead of a logistic regression model showed slightly better performance for balance diagnostics taking into account imbalances at higher moments. Best practice should include the application of several matching methods and thorough balance diagnostics. Applying matching techniques can provide a useful preprocessing step to reveal areas of the data that lack common support. The use of different balance diagnostics can be helpful for the interpretation of different effect estimates found with different matching methods. © Health Research and Educational Trust.
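
    One widely used covariate balance diagnostic (not necessarily one of the three applied in the study) is the standardized mean difference between treated and control groups; a minimal sketch:

    ```python
    import numpy as np

    def smd(x_treat, x_ctrl):
        """Standardized mean difference for one covariate;
        |SMD| < 0.1 is a common rule of thumb for acceptable balance."""
        pooled_sd = np.sqrt((x_treat.var(ddof=1) + x_ctrl.var(ddof=1)) / 2.0)
        return (x_treat.mean() - x_ctrl.mean()) / pooled_sd
    ```

    Computing the SMD for every covariate before and after matching, and comparing how far each matching method pushes the values toward zero, is one simple way to reproduce the kind of comparison the study describes.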

  7. Synchronization of world economic activity

    NASA Astrophysics Data System (ADS)

    Groth, Andreas; Ghil, Michael

    2017-12-01

    Common dynamical properties of business cycle fluctuations are studied in a sample of more than 100 countries that represent economic regions from all around the world. We apply the methodology of multivariate singular spectrum analysis (M-SSA) to identify oscillatory modes and to detect whether these modes are shared by clusters of phase- and frequency-locked oscillators. An extension of the M-SSA approach is introduced to help analyze structural changes in the cluster configuration of synchronization. With this novel technique, we are able to identify a common mode of business cycle activity across our sample, and thus point to the existence of a world business cycle. Superimposed on this mode, we further identify several major events that have markedly influenced the landscape of world economic activity in the postwar era.
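
    The single-channel SSA core that M-SSA generalizes (by stacking several channels into a grand trajectory matrix) can be sketched in a few lines; the window length M = 40 is an arbitrary illustrative choice. Oscillatory modes typically appear as pairs of near-equal eigenvalues.

    ```python
    import numpy as np

    def ssa_modes(x, M=40):
        """Single-channel SSA: eigenvalues and principal components of the
        M-lag trajectory matrix (M-SSA stacks several channels)."""
        x = np.asarray(x, dtype=float)
        x = x - x.mean()
        N = len(x)
        # trajectory matrix: rows are length-M lagged windows of the series
        X = np.column_stack([x[i:N - M + 1 + i] for i in range(M)])
        C = X.T @ X / X.shape[0]                  # lag-covariance estimate
        evals, evecs = np.linalg.eigh(C)
        order = np.argsort(evals)[::-1]
        return evals[order], X @ evecs[:, order]  # variance spectrum, PCs
    ```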

  9. Signal-Noise Identification of Magnetotelluric Signals Using Fractal-Entropy and Clustering Algorithm for Targeted De-Noising

    NASA Astrophysics Data System (ADS)

    Li, Jin; Zhang, Xian; Gong, Jinzhe; Tang, Jingtian; Ren, Zhengyong; Li, Guang; Deng, Yanli; Cai, Jin

    A new technique is proposed for signal-noise identification and targeted de-noising of magnetotelluric (MT) signals. The method is based on fractal entropy and a clustering algorithm; it automatically identifies signal sections corrupted by common interference (square, triangle and pulse waves), enabling targeted de-noising and preventing the loss of useful information in filtering. To implement the technique, four characteristic parameters — fractal box dimension (FBD), Higuchi fractal dimension (HFD), fuzzy entropy (FuEn) and approximate entropy (ApEn) — are extracted from the MT time-series. The fuzzy c-means (FCM) clustering technique is used to analyze the characteristic parameters and automatically distinguish signals with strong interference from the rest. The wavelet threshold (WT) de-noising method is used only to suppress the identified strong interference in the selected signal sections. The technique is validated on signal samples with known interference before being applied to a set of field-measured MT/audio-magnetotelluric (AMT) data. Compared with the conventional de-noising strategy, which blindly applies the filter to the overall dataset, the proposed method can automatically identify and purposefully suppress intermittent interference in the MT/AMT signal. The resulting apparent resistivity-phase curve is more continuous and smooth, and the slow-change trend in the low-frequency range is more precisely preserved. Moreover, the character of the target-filtered MT/AMT signal is close to the essential character of the natural field, so the result more accurately reflects the inherent electrical structure of the measured site.
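
    Of the four characteristic parameters, the Higuchi fractal dimension is perhaps the easiest to sketch; below is a minimal implementation of the standard Higuchi algorithm (kmax is a tuning choice, and this is an illustration rather than the authors' code).

    ```python
    import numpy as np

    def higuchi_fd(x, kmax=10):
        """Higuchi fractal dimension of a 1-D signal."""
        x = np.asarray(x, dtype=float)
        N = len(x)
        L = []
        for k in range(1, kmax + 1):
            Lk = []
            for m in range(k):
                idx = np.arange(m, N, k)            # subsampled series x[m::k]
                if len(idx) < 2:
                    continue
                length = np.abs(np.diff(x[idx])).sum()
                norm = (N - 1) / ((len(idx) - 1) * k)   # curve-length normalization
                Lk.append(length * norm / k)
            L.append(np.mean(Lk))
        k_arr = np.arange(1, kmax + 1)
        slope, _ = np.polyfit(np.log(1.0 / k_arr), np.log(L), 1)
        return slope   # between 1 (smooth) and 2 (very irregular)
    ```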

  10. Survey of intravitreal injection techniques among retina specialists in Israel

    PubMed Central

    Segal, Ori; Segal-Trivitz, Yael; Nemet, Arie Y; Geffen, Noa; Nesher, Ronit; Mimouni, Michael

    2016-01-01

    Purpose The purpose of this study was to describe antivascular endothelial growth factor intravitreal injection techniques of retinal specialists in order to establish a cornerstone for future practice guidelines. Methods All members of the Israeli Retina Society were contacted by email to complete an anonymous, 19-question, Internet-based survey regarding their intravitreal injection techniques. Results Overall, 66% (52/79) completed the survey. Most (98%) do not instruct patients to discontinue anticoagulant therapy and 92% prescribe treatment for patients in the waiting room. Three quarters wear sterile gloves and prepare the patient in the supine position. A majority (71%) use sterile surgical draping. All respondents apply topical analgesics and a majority (69%) measure the distance from the limbus to the injection site. A minority (21%) displace the conjunctiva prior to injection. A majority of the survey participants use a 30-gauge needle and the most common quadrant for injection is superotemporal (33%). Less than half routinely assess postinjection optic nerve perfusion (44%). A majority (92%) apply prophylactic antibiotics immediately after the injection. Conclusion The majority of retina specialists perform intravitreal injections similarly. However, a relatively large minority performs this procedure differently. Due to the extremely low percentage of complications, it seems as though such differences do not increase the risk. However, more evidence-based medicine, a cornerstone for practice guidelines, is required in order to identify the intravitreal injection techniques that combine safety and efficacy while causing as little discomfort to the patients as possible. PMID:27366050

  11. Magnetometry of single ferromagnetic nanoparticles using magneto-optical indicator films with spatial amplification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balk, Andrew L., E-mail: andrew.balk@nist.gov; Maryland NanoCenter, University of Maryland, College Park, Maryland 20742; Hangarter, Carlos

    2015-03-16

    We present a magneto-optical technique to spatially amplify and image fringe fields from single ferromagnetic nanorods. The fringe fields nucleate magnetic domains in a low-coercivity, perpendicularly magnetized indicator film, which are expanded by an applied out-of-plane field from the nanoscale to the microscale for measurement with polar Kerr microscopy. The nucleation location, and therefore the magnetic orientation of the sample nanorod, is detected as a spatially dependent field bias in locally measured hysteresis loops of the indicator film. We first discuss our method to fabricate the high-sensitivity indicator film with low-energy argon ion irradiation. We then present a map of the amplified signal produced from a single nanorod as measured by the indicator film and compare it with a simultaneously obtained, unamplified fringe field map. The comparison demonstrates the advantage of the amplification mechanism and the capability of the technique to be performed with single-spot magneto-optical Kerr effect magnetometers. Our signal-to-noise ratio implies a minimum measurable particle diameter of tens of nanometers for typical transition metals. We finally use our method to obtain hysteresis loops from multiple nanorods in parallel. Our technique is unperturbed by applied in-plane fields for magnetic manipulation of nanoparticles, is robust against many common noise sources, and is applicable in a variety of test environments. We conclude with a discussion of the future optimization and application of our indicator film technique.

  12. UV spectroscopy determination of aqueous lead and copper ions in water

    NASA Astrophysics Data System (ADS)

    Tan, C. H.; Moo, Y. C.; Mat Jafri, M. Z.; Lim, H. S.

    2014-05-01

    Lead (Pb2+) and copper (Cu2+) ions are very common pollutants in water and can cause serious disease and health problems in humans. The aim of this paper is to determine lead and copper ions in aqueous solution using direct UV detection, without generating chemical reagent waste. The technique allows the determination of lead and copper ions over the range 0.2 mg/L to 10 mg/L using UV wavelengths from 205 nm to 225 nm. The method was successfully applied to synthetic samples with high performance.
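
    The quantification step described here amounts to a linear Beer-Lambert calibration against standards of known concentration. A minimal sketch, with invented absorbance values standing in for real measurements:

      import numpy as np

      # hypothetical calibration standards at a fixed UV wavelength
      conc = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0])              # mg/L
      absorbance = np.array([0.021, 0.052, 0.105, 0.209, 0.524, 1.043])

      # Beer-Lambert: A = eps*b*c + A0, so a straight-line fit calibrates the method
      slope, intercept = np.polyfit(conc, absorbance, 1)

      def concentration(a_sample):
          # invert the calibration line for an unknown sample
          return (a_sample - intercept) / slope

      print(concentration(0.30))   # mg/L estimate for a measured absorbance of 0.30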

  13. Storyline Visualizations of Eye Tracking of Movie Viewing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balint, John T.; Arendt, Dustin L.; Blaha, Leslie M.

    Storyline visualizations offer an approach that promises to capture the spatio-temporal characteristics of individual observers and simultaneously illustrate emerging group behaviors. We develop a visual analytics approach to parsing, aligning, and clustering fixation sequences from eye tracking data. Visualization of the results captures the similarities and differences across a group of observers performing a common task. We apply our storyline approach to visualize gaze patterns of people watching dynamic movie clips. Storylines mitigate some of the shortcomings of existing spatio-temporal visualization techniques and, importantly, continue to highlight individual observer behavioral dynamics.

  14. Gene-network inference by message passing

    NASA Astrophysics Data System (ADS)

    Braunstein, A.; Pagnani, A.; Weigt, M.; Zecchina, R.

    2008-01-01

    The inference of gene-regulatory processes from gene-expression data belongs to the major challenges of computational systems biology. Here we address the problem from a statistical-physics perspective and develop a message-passing algorithm which is able to infer sparse, directed and combinatorial regulatory mechanisms. Using the replica technique, the algorithmic performance can be characterized analytically for artificially generated data. The algorithm is applied to genome-wide expression data of baker's yeast under various environmental conditions. We find clear cases of combinatorial control, and enrichment in common functional annotations of regulated genes and their regulators.

  15. Biophysical EPR Studies Applied to Membrane Proteins

    PubMed Central

    Sahu, Indra D; Lorigan, Gary A

    2015-01-01

    Membrane proteins are very important in controlling bioenergetics, functional activity, and the initiation of signal pathways in a wide variety of complicated biological systems. They also represent approximately 50% of potential drug targets. EPR spectroscopy is a very popular and powerful biophysical tool that is used to study the structural and dynamic properties of membrane proteins. In this article, a basic overview of the most commonly used EPR techniques and examples of recent applications to answer pertinent structure- and dynamics-related questions on membrane protein systems will be presented. PMID:26855825

  16. Diagnosis and management of acute complications in patients with colon cancer: bleeding, obstruction, and perforation

    PubMed Central

    Yang, Xue-Fei

    2014-01-01

    Among colorectal cancers, the incidence of colon cancer has increased markedly. As a result, the actual incidence of colon cancer now exceeds that of rectal cancer, dramatically changing the long-standing epidemiological profile. The acute complications of colon cancer include bleeding, obstruction, and perforation, which are among the common acute abdominal surgical conditions. Rapid and accurate diagnosis of these acute complications is very important, and laparoscopic techniques can be applied in abdominal surgery for management of the complications. PMID:25035661

  17. Feedback shift register sequences versus uniformly distributed random sequences for correlation chromatography

    NASA Technical Reports Server (NTRS)

    Kaljurand, M.; Valentin, J. R.; Shao, M.

    1996-01-01

    Two alternative input sequences are commonly employed in correlation chromatography (CC): sequences derived according to the feedback shift register algorithm (i.e., pseudo-random binary sequences, PRBS) and uniform random binary sequences (URBS). These two kinds of sequences are compared. By applying the "cleaning" data-processing technique to the correlograms that result from these sequences, we show that the S/N of the correlogram is much higher when the PRBS is used than when URBS are used.
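
    For readers unfamiliar with the two input types, the sketch below generates one period of a maximal-length PRBS with a Fibonacci linear feedback shift register, alongside a uniform random binary sequence of the same length; the tap choice (7, 6), i.e. x^7 + x^6 + 1, is a standard maximal-length polynomial and is not specified in the record itself.

      import numpy as np

      def lfsr_prbs(taps=(7, 6), seed=0b1111111):
          # One period of a maximal-length PRBS from a Fibonacci LFSR;
          # the seed must be nonzero or the register stays stuck at zero.
          nbits = max(taps)
          state = seed
          out = []
          for _ in range((1 << nbits) - 1):        # period 2**n - 1 = 127
              out.append(state & 1)
              fb = 0
              for t in taps:
                  fb ^= (state >> (t - 1)) & 1     # XOR the tapped stages
              state = (state >> 1) | (fb << (nbits - 1))
          return np.array(out)

      prbs = lfsr_prbs()                                           # deterministic
      urbs = np.random.default_rng(0).integers(0, 2, prbs.size)    # uniform random

    The deterministic, nearly two-valued autocorrelation of the PRBS is what underlies the higher correlogram S/N reported here, whereas a URBS only approaches that property on average.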

  18. Shockless spalling damage of alumina ceramic

    NASA Astrophysics Data System (ADS)

    Erzar, B.; Buzaud, E.

    2012-05-01

    Ceramic materials are commonly used to build multi-layer armour. However, reliable test data are needed to correctly identify models and to perform accurate numerical simulations of the dynamic response of armour systems. In this work, isentropic loading waves were applied to alumina samples to induce spalling damage. The technique employed allows careful assessment of the strain rate at failure and of the dynamic strength. Moreover, specimens were recovered and analysed using SEM. In a damaged but unbroken specimen, interactions between cracks have been highlighted, illustrating the fragmentation process.

  19. Clinical and diagnostic aspects of lymphedema.

    PubMed

    Keo, Hong H; Gretener, Silvia B; Staub, Daniel

    2017-07-01

    Lymphedema is a chronic, progressive, and common but often unrecognized condition. The diagnosis of lymphatic disease on clinical grounds alone remains a challenge. Without proper diagnosis, therapy is often delayed, allowing disease progression. There is a need for a practical diagnostic algorithm and associated imaging techniques to guide clinical decision-making. The aim of this topical review is to provide a practical approach for assessing patients with suspected lymphedema and to give a critical appraisal of currently available imaging modalities that are applied in clinical practice to diagnose and map lymphatic disease.

  20. Measuring Seebeck Coefficient

    NASA Technical Reports Server (NTRS)

    Snyder, G. Jeffrey (Inventor)

    2015-01-01

    A high temperature Seebeck coefficient measurement apparatus and method with various features to minimize typical sources of error is described. Common sources of temperature and voltage measurement errors which may impact accurate measurement are identified and reduced. Applying the identified principles, a high temperature Seebeck measurement apparatus and method employing a uniaxial, four-point geometry is described that operates from room temperature up to 1300 K. These techniques for non-destructive Seebeck coefficient measurement are simple to operate and suitable for bulk samples with a broad range of physical types and shapes.
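
    A standard way to reject one of the error sources named here, a constant offset voltage, is to fit several (dT, dV) pairs rather than take a single ratio; the intercept absorbs the offset and the slope gives the coefficient. A minimal sketch with invented readings:

      import numpy as np

      def seebeck_coefficient(delta_T, delta_V):
          # Slope-based estimate: fitting dV against dT rejects any constant
          # offset voltage; for the sample, S = -dV/dT
          slope, _offset = np.polyfit(delta_T, delta_V, 1)
          return -slope

      # hypothetical readings around a mean temperature (K, V)
      dT = np.array([1.0, 2.0, 3.0, 4.0])
      dV = np.array([-2.1e-4, -4.0e-4, -6.1e-4, -8.0e-4])
      print(seebeck_coefficient(dT, dV))           # about 2.0e-4 V/K, i.e. 200 uV/K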

  1. NASA/ESA CV-990 spacelab simulation

    NASA Technical Reports Server (NTRS)

    Reller, J. O., Jr.

    1976-01-01

    Simplified techniques were applied to conduct an extensive spacelab simulation using the airborne laboratory. The scientific payload was selected to perform studies in upper atmospheric physics and infrared astronomy. The mission was successful and provided extensive data relevant to spacelab objectives on overall management of a complex international payload; experiment preparation, testing, and integration; training for proxy operation in space; data handling; multiexperimenter use of common experimenter facilities (telescopes); multiexperiment operation by experiment operators; selection criteria for spacelab experiment operators; and schedule requirements to prepare for such a spacelab mission.

  2. A unified development of several techniques for the representation of random vectors and data sets

    NASA Technical Reports Server (NTRS)

    Bundick, W. T.

    1973-01-01

    Linear vector space theory is used to develop a general representation of a set of data vectors or random vectors by linear combinations of orthonormal vectors such that the mean squared error of the representation is minimized. The orthonormal vectors are shown to be the eigenvectors of an operator. The general representation is applied to several specific problems involving the use of the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions; and the common properties of these representations are developed.
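
    In matrix form, the minimizing orthonormal vectors are the leading eigenvectors of the sample covariance operator. A compact numpy sketch (function and variable names are illustrative, not from the report):

      import numpy as np

      def kl_basis(X, r):
          # Orthonormal basis of r vectors minimizing the mean-squared error of
          # representing the rows of X (Karhunen-Loeve / principal components)
          Xc = X - X.mean(axis=0)                  # center the data vectors
          C = Xc.T @ Xc / (X.shape[0] - 1)         # sample covariance operator
          w, V = np.linalg.eigh(C)                 # eigenvectors of the operator
          order = np.argsort(w)[::-1]              # largest eigenvalues first
          return V[:, order[:r]], w[order]

      rng = np.random.default_rng(1)
      X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))
      basis, eigvals = kl_basis(X, r=2)
      print(eigvals[2:].sum())   # expected squared error of the 2-term representation

    The minimized mean-squared error equals the sum of the discarded eigenvalues, which is the common property tying together the Karhunen-Loeve expansion, principal component analysis, and empirical orthogonal functions.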

  3. Realizing situation awareness within a cyber environment

    NASA Astrophysics Data System (ADS)

    Tadda, George; Salerno, John J.; Boulware, Douglas; Hinman, Michael; Gorton, Samuel

    2006-04-01

    Situation Awareness (SA) problems all require an understanding of current activities, an ability to anticipate what may happen next, and techniques to analyze the threat or impact of current activities and predictions. These processes of SA are common regardless of the domain and can be applied to the detection of cyber attacks. This paper will describe the application of a SA framework to implementing Cyber SA, describe some metrics for measuring and evaluating systems implementing Cyber SA, and discuss ongoing work in this area. We conclude with some ideas for future activities.

  4. Using Classification and Regression Trees (CART) and random forests to analyze attrition: Results from two simulations.

    PubMed

    Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J

    2015-12-01

    In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers. (c) 2015 APA, all rights reserved.
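
    One plausible reading of the weighting scheme, sketched with scikit-learn; the shallow max_depth stands in for the pruning step, and all names are hypothetical rather than the authors' own:

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      def attrition_weights(X_wave1, retained, max_depth=3):
          # Fit a small (effectively pruned) tree predicting retention from
          # first-wave covariates, then weight completers by 1 / P(retained)
          tree = DecisionTreeClassifier(max_depth=max_depth, random_state=0)
          tree.fit(X_wave1, retained)
          p = tree.predict_proba(X_wave1)[:, list(tree.classes_).index(1)]
          p = np.clip(p, 0.05, 1.0)                # guard against extreme weights
          return 1.0 / p[retained == 1]

    Completers who resemble dropouts get large weights, so the reweighted complete cases stand in for the full sample.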

  5. Logistic regression applied to natural hazards: rare event logistic regression with replications

    NASA Astrophysics Data System (ADS)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce concepts from Monte Carlo simulation into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods and overcomes some of the limitations of previous developments through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
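
    A minimal sketch of one way such replications can be implemented, assuming each replicate keeps every rare event and redraws an equal number of non-events (the paper's exact resampling scheme may differ); names are illustrative:

      import numpy as np
      import statsmodels.api as sm

      def replicated_rare_event_logit(X, y, n_rep=500, alpha=0.05, seed=0):
          # Refit a logistic regression on many balanced replicates and
          # report how often each candidate factor comes out significant
          rng = np.random.default_rng(seed)
          events = np.flatnonzero(y == 1)          # assumes far fewer events
          non_events = np.flatnonzero(y == 0)      # than non-events
          hits = np.zeros(X.shape[1])
          for _ in range(n_rep):
              rows = np.concatenate(
                  [events, rng.choice(non_events, size=events.size, replace=False)])
              fit = sm.Logit(y[rows], sm.add_constant(X[rows])).fit(disp=0)
              hits += np.asarray(fit.pvalues)[1:] < alpha   # skip the intercept
          return hits / n_rep                      # selection frequency per factor

    Factors that are significant in only a small fraction of replicates are the sample-dependent ones the authors warn about.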

  6. Using Classification and Regression Trees (CART) and Random Forests to Analyze Attrition: Results From Two Simulations

    PubMed Central

    Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J.

    2016-01-01

    In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers. PMID:26389526

  7. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: Critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable. Cluster analysis, which was an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses. Pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  8. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field-flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving into the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles, using ICP-MS but also coulometry, are on their way to gaining a foothold. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Quantifying the heritability of testicular germ cell tumour using both population-based and genomic approaches.

    PubMed

    Litchfield, Kevin; Thomsen, Hauke; Mitchell, Jonathan S; Sundquist, Jan; Houlston, Richard S; Hemminki, Kari; Turnbull, Clare

    2015-09-09

    A sizable fraction of testicular germ cell tumour (TGCT) risk is expected to be explained by heritable factors. Recent genome-wide association studies (GWAS) have successfully identified a number of common SNPs associated with TGCT. It is, however, unclear how much common variation is left to be accounted for by other, yet to be identified, common SNPs and what contribution common genetic variation makes to the heritable risk of TGCT. We approached this question using two complementary analytical techniques. We undertook a population-based analysis of the Swedish family-cancer database, through which we estimated the heritability of TGCT at 48.9% (CI: 47.2%-52.3%). We also applied Genome-wide Complex Trait Analysis to 922 cases and 4,842 controls to estimate the heritability of TGCT. The heritability explained by known common risk SNPs identified by GWAS was 9.1%, whereas the heritability explained by all common SNPs was 37.4% (CI: 27.6%-47.2%). These complementary findings indicate that the known TGCT SNPs explain only a small proportion of the heritability and that many additional common SNPs remain to be identified. The data also suggest that a fraction of the heritability of TGCT is likely to be explained by other classes of genetic variation, such as rare disease-causing alleles.

  10. Does Choice of Multicriteria Method Matter? An Experiment in Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Hobbs, Benjamin F.; Chankong, Vira; Hamadeh, Wael; Stakhiv, Eugene Z.

    1992-07-01

    Many multiple criteria decision making methods have been proposed and applied to water planning. Their purpose is to provide information on tradeoffs among objectives and to help users articulate value judgments in a systematic, coherent, and documentable manner. The wide variety of available techniques confuses potential users, causing inappropriate matching of methods with problems. Experiments in which water planners apply more than one multicriteria procedure to realistic problems can help dispel this confusion by testing method appropriateness, ease of use, and validity. We summarize one such experiment where U.S. Army Corps of Engineers personnel used several methods to screen urban water supply plans. The methods evaluated include goal programming, ELECTRE I, additive value functions, multiplicative utility functions, and three techniques for choosing weights (direct rating, indifference tradeoff, and the analytic hierarchy process). Among the conclusions we reach are the following. First, experienced planners generally prefer simpler, more transparent methods. Additive value functions are favored. Yet none of the methods are endorsed by a majority of the participants; many preferred to use no formal method at all. Second, there is strong evidence that rating, the most commonly applied weight selection method, is likely to lead to weights that fail to represent the trade-offs that users are willing to make among criteria. Finally, we show that decisions can be as sensitive, or more sensitive, to the method used as to the person who applies it. Therefore, if who chooses is important, then so too is how the choice is made.
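
    As a toy illustration of the additive value function favored by the participants, with invented single-criterion scores and weights:

      import numpy as np

      # hypothetical plans scored on three criteria, each rescaled to [0, 1]
      plan_values = np.array([[0.8, 0.3, 0.6],     # plan A: cost, yield, risk
                              [0.5, 0.9, 0.4],     # plan B
                              [0.6, 0.6, 0.7]])    # plan C
      weights = np.array([0.5, 0.3, 0.2])          # e.g. from indifference tradeoffs

      overall = plan_values @ weights              # additive value function
      print(overall, overall.argmax())             # highest overall value wins

    With these numbers the ranking is close (0.61, 0.60, 0.62), so a modest change in the weights flips the decision, which is exactly why the weight-selection method matters as much as the scoring model.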

  11. Centrifuges in gravitational physiology research

    NASA Technical Reports Server (NTRS)

    Ballard, Rodney W.; Davies, Phil; Fuller, Charles A.

    1993-01-01

    Data from space flight and ground-based experiments have clearly demonstrated the importance of Earth gravity for normal physiological function in man and animals. Gravitational physiology is concerned with the role and influence of gravity on physiological systems. Research in this field examines how we perceive and respond to gravity and the mechanisms underlying these responses. Inherent in the search for answers to these questions is the ability to alter gravity, which is not physically possible without leaving Earth. However, a useful experimental paradigm has been to modify the perceived force of gravity by changing either the orientation of subjects relative to the gravity vector (i.e., postural changes) or by applying inertial forces to augment the magnitude of the gravity vector. The latter technique has commonly been implemented by applying centripetal force via centrifugation.

  12. Application of filtering techniques in preprocessing magnetic data

    NASA Astrophysics Data System (ADS)

    Liu, Haijun; Yi, Yongping; Yang, Hongxia; Hu, Guochuang; Liu, Guoming

    2010-08-01

    High-precision magnetic exploration is a popular geophysical technique owing to its simplicity and effectiveness. Interpretation in high-precision magnetic exploration is always difficult because of noise and other disturbance factors, so an effective preprocessing method is needed to remove the effects of interference before further processing. The common way to do this is by filtering, and many kinds of filtering methods exist. In this paper we describe in detail three popular filtering techniques: the regularized filtering technique, the sliding-average filtering technique, and the compensation smoothing filtering technique. We then designed the workflow of a filtering program based on these techniques and implemented it in DELPHI. To check it, we applied it to preprocess magnetic data from a site in China. Comparing the initial contour map with the filtered contour map clearly shows the effect of our program: the filtered contour map is very smooth, and the high-frequency parts of the data are removed. After filtering, we separated useful signals from noisy signals, minor anomalies from major anomalies, and local anomalies from regional anomalies, making it easy to focus on the useful information. Our program can be used to preprocess magnetic data, and the results demonstrate its effectiveness.
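
    Of the three techniques, the sliding-average filter is the simplest to state. The original program was written in DELPHI; the following is an illustrative numpy translation of the idea, not the authors' code:

      import numpy as np

      def sliding_average(grid, win=5):
          # Smooth a gridded magnetic map with a win x win moving average;
          # edge padding keeps the output the same size as the input
          pad = win // 2
          padded = np.pad(grid, pad, mode='edge')
          out = np.empty_like(grid, dtype=float)
          for i in range(grid.shape[0]):
              for j in range(grid.shape[1]):
                  out[i, j] = padded[i:i + win, j:j + win].mean()
          return out

    Subtracting the smoothed grid from the original then separates the high-frequency (local) part from the regional trend, which is the anomaly separation described in the abstract.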

  13. Evaluation of the recorded ground motions for the unusual earthquake of 13 August 2006 ( M w 5.3) in Michoacán México

    NASA Astrophysics Data System (ADS)

    Ramírez-Gaytán, Alejandro; Jaimes, Miguel A.; Bandy, William L.; Huerfano, Victor M.; Salido-Ruiz, Ricardo A.

    2015-10-01

    The focal mechanism of the moderate earthquake of 13 August 2006 M w = 5.3, which occurred in the border coastal area between Michoacán and Colima, México, is unusual. As shown by the Global Centroid Moment Tensor (CMT) project and the Servicio Sismológico Nacional de Mexico (SSN), the thrust mechanism is striking almost perpendicularly to the majority of earthquakes occurring along the subduction zone of the Mexican Pacific continental margin which commonly strike nearly parallel to the trench. The purpose of this study is to analyze the observed ground motions of this particular event relative to those of the common events. First, we apply the H/V technique to verify that the stations involved in this study are nearly free of site effects. Then, we compare the observed ground motions with (i) three empirical ground motion prediction equations (GMPEs) appropriate for the region, (ii) ground motions of four real earthquakes with the common mechanism, and (iii) the Fourier spectrum of a selected common event.
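
    The H/V check mentioned here takes the ratio of the horizontal to the vertical amplitude spectrum at each station; a flat H/V curve without a strong peak suggests weak site effects. A bare-bones sketch (real implementations smooth the spectra and average over many time windows):

      import numpy as np

      def hv_ratio(ns, ew, v, dt):
          # Horizontal-to-vertical spectral ratio from three-component records
          freqs = np.fft.rfftfreq(len(v), dt)
          h_ns = np.abs(np.fft.rfft(ns))
          h_ew = np.abs(np.fft.rfft(ew))
          h = np.sqrt(h_ns * h_ew)                 # geometric mean of horizontals
          return freqs, h / np.maximum(np.abs(np.fft.rfft(v)), 1e-12)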

  14. Wideband Single-Crystal Transducer for Bone Characterization

    NASA Technical Reports Server (NTRS)

    Liang, Yu; Snook, Kevin

    2012-01-01

    The microgravity conditions of space travel result in unique physiological demands on the human body. In particular, the absence of the continual mechanical stresses on the skeletal system that are present on Earth cause the bones to decalcify. Trabecular structure decreases in thickness and increases in spacing, resulting in decreased bone strength and increased risk of injury. Thus, monitoring bone health is a high priority for long-term space travel. A single probe covering all frequency bands of interest would be ideal for such measurements, and this would also minimize storage space and eliminate the complexity of integrating multiple probes. This invention is an ultrasound transducer for the structural characterization of bone. Such characterization measures features of reflected and transmitted ultrasound signals, and correlates these signals with bone structure metrics such as bone mineral density, trabecular spacing, and thickness, etc. The techniques used to determine these various metrics require measurements over a broad range of ultrasound frequencies, and therefore, complete characterization requires the use of several narrowband transducers. This is a single transducer capable of making these measurements in all the required frequency bands. The device achieves this capability through a unique combination of a broadband piezoelectric material; a design incorporating multiple resonator sizes with distinct, overlapping frequency spectra; and a micromachining process for producing the multiple-resonator pattern with common electrode surfaces between the resonators. This device consists of a pattern of resonator bars with common electrodes that is wrapped around a central mandrel such that the radiating faces of the resonators are coplanar and can be simultaneously applied to the sample to be measured. The device operates as both a source and receiver of acoustic energy. It is operated by connection to an electronic system capable of both providing an excitation signal to the transducer and amplifying the signal received from the transducer. The excitation signal may be either a wide-bandwidth signal to excite the transducer across its entire operational spectrum, or a narrow-bandwidth signal optimized for a particular measurement technique. The transducer face is applied to the skin covering the bone to be characterized, and may be operated in through-transmission mode using two transducers, or in pulse-echo mode. The transducer is a unique combination of material, design, and fabrication technique. It is based on single-crystal lead magnesium niobate lead titanate (PMN-PT) piezoelectric material. As compared to the commonly used piezoceramics, this piezocrystal has superior piezoelectric and elastic properties, which results in devices with superior bandwidth, source level, and power requirements. This design necessitates a single resonant frequency. However, by operating in a transverse length-extensional mode, with the electric field applied orthogonally to the extensional direction, resonators of different sizes can share common electrodes, resulting in a multiply-resonant structure. With carefully sized resonators, and the superior bandwidth of piezocrystal, the resonances can be made to overlap to form a smooth, wide-bandwidth characteristic.

  15. The development of laser speckle velocimetry for the study of vortical flows

    NASA Technical Reports Server (NTRS)

    Krothapalli, A.

    1991-01-01

    A research program was undertaken to develop a new experimental technique, commonly known as particle image displacement velocimetry (PIDV), to measure an instantaneous two-dimensional velocity field in a selected plane of a flow field. This technique was successfully developed and applied to the study of several aerodynamic problems. A detailed description of the technique and a broad review of all the research activity carried out in this field are reported. A list of technical publications is also provided. The application of PIDV to unsteady flows with large-scale structures is demonstrated in a study of the temporal evolution of the flow past an impulsively started circular cylinder. The instantaneous two-dimensional flow in the transition region of a rectangular air jet was measured using PIDV and the details are presented. This experiment clearly demonstrates the PIDV capability in the measurement of turbulent flows. Preliminary experiments were also conducted to measure the instantaneous flow over a circular bump in a transonic flow. Several other experiments now routinely use PIDV as a non-intrusive measurement technique to obtain instantaneous two-dimensional velocity fields.
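
    The core operation behind PIDV-style processing is locating the peak of the cross-correlation between two interrogation windows taken from successive exposures; a minimal scipy sketch (names illustrative):

      import numpy as np
      from scipy.signal import correlate2d

      def piv_displacement(win_a, win_b):
          # Mean particle displacement between two interrogation windows,
          # read off the peak of their 2-D cross-correlation
          a = win_a - win_a.mean()
          b = win_b - win_b.mean()
          corr = correlate2d(b, a, mode='full')
          di, dj = np.unravel_index(np.argmax(corr), corr.shape)
          return di - (win_a.shape[0] - 1), dj - (win_a.shape[1] - 1)

      # a window whose pattern moved by (3, -2) pixels should be recovered exactly
      rng = np.random.default_rng(2)
      img = rng.random((64, 64))
      print(piv_displacement(img[16:48, 16:48], img[13:45, 18:50]))   # (3, -2)

    Repeating this over a grid of windows yields the instantaneous two-dimensional velocity field once divided by the time between exposures.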

  16. An automated method of quantifying ferrite microstructures using electron backscatter diffraction (EBSD) data.

    PubMed

    Shrestha, Sachin L; Breen, Andrew J; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P; Cairney, Julie M

    2014-02-01

    The identification and quantification of the different ferrite microconstituents in steels has long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientation, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. © 2013 Published by Elsevier B.V.

  17. [Demand for and the Development of Detection Techniques for Source of Schistosome Infection in China].

    PubMed

    Wang, Shi-ping; He, Xin; Zhou, Yun-fei

    2015-12-01

    Schistosomiasis is a type of zoonotic parasitosis that severely impairs human health. Rapid detection of infection sources is key to the control of schistosomiasis. With the effective control of schistosomiasis in China, detection techniques for infection sources have also developed. The rate and intensity of infection among humans and livestock have decreased significantly in China, as the control program has entered the transmission-control stage in most of the endemic areas. Under this situation, traditional etiological diagnostic techniques and common immunological methods cannot provide rapid detection of schistosomiasis infection sources. Instead, we are calling for detection methods with higher sensitivity, specificity and stability that are less time-consuming, more convenient and less costly. In recent years, many improved or novel detection methods have been applied to the epidemiological surveillance of schistosomiasis, such as the automatic scanning microscopic image acquisition system, PCR-ELISA, immunosensors, and loop-mediated isothermal amplification. The development of new monitoring techniques can facilitate rapid detection of schistosome infection sources in endemic areas.

  18. Discriminating Induced-Microearthquakes Using New Seismic Features

    NASA Astrophysics Data System (ADS)

    Mousavi, S. M.; Horton, S.

    2016-12-01

    We studied the characteristics of induced microearthquakes on the basis of waveforms recorded on a limited number of surface receivers, using machine-learning techniques. Forty features in the time, frequency, and time-frequency domains were measured on each waveform, and several techniques such as correlation-based feature selection, Artificial Neural Networks (ANNs), Logistic Regression (LR) and X-means clustering were used as research tools to explore the relationship between these seismic features and source parameters. The results show that spectral features have the highest correlation to source depth. Two new measurements developed as seismic features for this study, spectral centroids and 2D cross-correlations in the time-frequency domain, performed better than the common seismic measurements. These features can be used by machine-learning techniques for efficient automatic classification of low-energy signals recorded at one or more seismic stations. We applied the technique to 440 induced microearthquakes. Reference: Mousavi, S.M., S.P. Horton, C.A. Langston, B. Samei (2016), Seismic features and automatic discrimination of deep and shallow induced-microearthquakes using neural network and logistic regression, Geophys. J. Int., doi: 10.1093/gji/ggw258.

  19. Overview of Serological Techniques for Influenza Vaccine Evaluation: Past, Present and Future

    PubMed Central

    Trombetta, Claudia Maria; Perini, Daniele; Mather, Stuart; Temperton, Nigel; Montomoli, Emanuele

    2014-01-01

    Serological techniques commonly used to quantify influenza-specific antibodies include the Haemagglutination Inhibition (HI), Single Radial Haemolysis (SRH) and Virus Neutralization (VN) assays. HI and SRH are established and reproducible techniques, whereas VN is more demanding. Every new influenza vaccine needs to fulfil the strict criteria issued by the European Medicines Agency (EMA) in order to be licensed. These criteria currently apply exclusively to SRH and HI assays and refer to two different target groups—healthy adults and the elderly, but other vaccine recipient age groups have not been considered (i.e., children). The purpose of this timely review is to highlight the current scenario on correlates of protection concerning influenza vaccines and underline the need to revise the criteria and assays currently in use. In addition to SRH and HI assays, the technical advantages provided by other techniques such as the VN assay, pseudotype-based neutralization assay, neuraminidase and cell-mediated immunity assays need to be considered and regulated via EMA criteria, considering the many significant advantages that they could offer for the development of effective vaccines. PMID:26344888

  20. Teaching communication skills: using action methods to enhance role-play in problem-based learning.

    PubMed

    Baile, Walter F; Blatner, Adam

    2014-08-01

    Role-play is a method of simulation used commonly to teach communication skills. Role-play methods can be enhanced by techniques that are not widely used in medical teaching, including warm-ups, role-creation, doubling, and role reversal. The purposes of these techniques are to prepare learners to take on the role of others in a role-play; to develop an insight into unspoken attitudes, thoughts, and feelings, which often determine the behavior of others; and to enhance communication skills through the participation of learners in enactments of communication challenges generated by them. In this article, we describe a hypothetical teaching session in which an instructor applies each of these techniques in teaching medical students how to break bad news using a method called SPIKES [Setting, Perception, Invitation, Knowledge, Emotions, Strategy, and Summary]. We illustrate how these techniques track contemporary adult learning theory through a learner-centered, case-based, experiential approach to selecting challenging scenarios in giving bad news, by attending to underlying emotion and by using reflection to anchor new learning.

  1. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. Analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
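
    For orientation, the sketch below implements plain (monofractal) detrended fluctuation analysis; MFDFA repeats the same steps with q-order moments of the segment fluctuations, so this is a simplified stand-in for, not a copy of, the study's tool:

      import numpy as np

      def dfa_alpha(x, scales=(16, 32, 64, 128, 256)):
          # Monofractal DFA: integrate, detrend in segments, and read the
          # scaling exponent from log fluctuation versus log scale
          y = np.cumsum(x - np.mean(x))            # integrated profile
          F = []
          for s in scales:
              segs = len(y) // s
              ms = []
              t = np.arange(s)
              for i in range(segs):
                  seg = y[i * s:(i + 1) * s]
                  trend = np.polyval(np.polyfit(t, seg, 1), t)   # local detrend
                  ms.append(np.mean((seg - trend) ** 2))
              F.append(np.sqrt(np.mean(ms)))
          alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
          return alpha                             # ~0.5 for uncorrelated noise

      print(dfa_alpha(np.random.default_rng(3).standard_normal(4096)))

    Departures of the scaling exponent from its baseline value over time are the kind of precursor artifact the abstract describes.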

  2. Fluctuations in alliance and use of techniques over time: A bidirectional relation between use of "common factors" techniques and the development of the working alliance.

    PubMed

    Solomonov, Nili; McCarthy, Kevin S; Keefe, John R; Gorman, Bernard S; Blanchard, Mark; Barber, Jacques P

    2018-01-01

    The aim of this study was twofold: (a) Investigate whether therapists are consistent in their use of therapeutic techniques throughout supportive-expressive therapy (SET) and (b) Examine the bi-directional relation between therapists' use of therapeutic techniques and the working alliance over the course of SET. Thirty-seven depressed patients were assigned to 16 weeks of SET as part of a larger randomized clinical trial (Barber, Barrett, Gallop, Rynn, & Rickels, ). The Working Alliance Inventory-Short Form (WAI-SF) was collected at Weeks 2, 4, and 8. Use of therapeutic interventions was rated by independent observers using the Multitheoretical List of Therapeutic Interventions (MULTI). Intraclass correlation coefficients assessed therapists' consistency in use of techniques. A cross-lagged path analysis estimated the bidirectional relation between the WAI-SF and the MULTI across time. Therapists were moderately consistent in their use of prescribed techniques (psychodynamic, process-experiential, and person-centred). However, they were inconsistent, or more flexible, in their use of "common factors" techniques (e.g., empathy, active listening, hope, and encouragements). A positive bidirectional relation was found between use of common factors techniques and the working alliance, such that initial high levels of common factors (but not prescribed) techniques predicted higher alliance later on and vice versa. Therapists tend to modulate their use of common factors techniques across treatment. Additionally, when a strong working alliance is developed early in treatment, therapists tend to use more common factors later on. Moreover, high use of common factors techniques is predictive of later improvement in the alliance. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Intercomparison of Lab-Based Soil Water Extraction Methods for Stable Water Isotope Analysis

    NASA Astrophysics Data System (ADS)

    Pratt, D.; Orlowski, N.; McDonnell, J.

    2016-12-01

    The effect of pore water extraction technique on the resultant isotopic signature is poorly understood. Here we present results of an intercomparison of five common lab-based soil water extraction techniques: high pressure mechanical squeezing, centrifugation, direct vapor equilibration, microwave extraction, and cryogenic extraction. We applied the five extraction methods to two physicochemically different standard soil types (silty sand and clayey loam) that were oven-dried and rewetted with water of known isotopic composition at three different gravimetric water contents (8, 20, and 30%). We tested the null hypothesis that all extraction techniques would provide the same isotopic result, independent of soil type and water content. Our results showed that the extraction technique had a significant effect on the soil water isotopic composition. Each method exhibited deviations from the spiked reference water, with soil type and water content having a secondary effect. Cryogenic extraction showed the largest deviations from the reference water, whereas mechanical squeezing and centrifugation provided the closest match to the reference water for both soil types. We also compared results for each extraction technique that produced liquid water on both an OA-ICOS and an IRMS; differences between them were negligible.

  4. Whole body MRI: Improved Lesion Detection and Characterization With Diffusion Weighted Techniques

    PubMed Central

    Attariwala, Rajpaul; Picker, Wayne

    2013-01-01

    Diffusion-weighted imaging (DWI) is an established functional imaging technique that interrogates the delicate balance of water movement at the cellular level. Technological advances enable this technique to be applied to whole-body MRI. Theory, b-value selection, common artifacts and target to background for optimized viewing will be reviewed for applications in the neck, chest, abdomen, and pelvis. Whole-body imaging with DWI allows novel applications of MRI to aid in evaluation of conditions such as multiple myeloma, lymphoma, and skeletal metastases, while the quantitative nature of this technique permits evaluation of response to therapy. Persisting signal at high b-values from restricted hypercellular tissue and viscous fluid also permits applications of DWI beyond oncologic imaging. DWI, when used in conjunction with routine imaging, can assist in detecting hemorrhagic degradation products, infection/abscess, and inflammation in colitis, while aiding with discrimination of free fluid and empyema, while limiting the need for intravenous contrast. DWI in conjunction with routine anatomic images provides a platform to improve lesion detection and characterization with findings rivaling other combined anatomic and functional imaging techniques, with the added benefit of no ionizing radiation. PMID:23960006

  5. Measurement Techniques for Hypervelocity Impact Test Fragments

    NASA Technical Reports Server (NTRS)

    Hill, Nicole E.

    2008-01-01

    The ability to classify the size and shape of individual orbital debris fragments provides a better understanding of the orbital debris environment as a whole. The characterization of breakup fragmentation debris has gradually evolved from a simplistic, spherical assumption towards describing debris in terms of size, material, and shape parameters. One of the goals of the NASA Orbital Debris Program Office is to develop high-accuracy techniques to measure these parameters and apply them to orbital debris observations. Measurement of the physical characteristics of debris resulting from ground-based, hypervelocity impact testing provides insight into the shapes and sizes of debris produced from potential impacts in orbit. Current techniques for measuring these ground-test fragments require determination of dimensions based upon visual judgment. This leads to reduced accuracy and provides little or no repeatability for the measurements. With the common goal of mitigating these error sources, allaying any misunderstandings, and moving forward in fragment shape determination, the NASA Orbital Debris Program Office recently began using a computerized measurement system. The goal of using these new techniques is to improve knowledge of the relation between commonly used dimensions and overall shape. The immediate objective is to scan a single fragment, measure its size and shape properties, and import the fragment into a program that renders a 3D model that adequately demonstrates how the object could appear in orbit. This information would then be used to aid optical methods in orbital debris shape determination. This paper provides a description of the measurement techniques used in this initiative and shows results of this work. The tradeoffs of the computerized methods are discussed, as well as the means of repeatability in the measurements of these fragments. This paper serves as a general description of methods for the measurement and shape analysis of orbital debris.

  6. Ground roll attenuation using polarization analysis in the t-f-k domain

    NASA Astrophysics Data System (ADS)

    Wang, C.; Wang, Y.

    2017-07-01

    S waves travel slower than P waves and have a lower dominant frequency. Therefore, applying common techniques such as time-frequency filtering and f-k filtering to separate S waves from ground roll is difficult because ground roll is also characterized by slow velocity and low frequency. In this study, we present a method for attenuating ground roll using a polarization filtering method based on the t-f-k transform. We describe the particle motion of the waves by complex vector signals. Each pair of frequency components, whose frequencies have the same absolute value but different signs, of the complex signal indicate an elliptical or linear motion. The polarization parameters of the elliptical or linear motion are explicitly related to the two Fourier coefficients. We then extend these concepts to the t-f-k domain and propose a polarization filtering method for ground roll attenuation based on the t-f-k transform. The proposed approach can define automatically the time-varying reject zones on the f-k panel at different times as a function of the reciprocal ellipticity. Four attributes, time, frequency, apparent velocity and polarization are used to identify and extract the ground roll simultaneously. Thus, the ground roll and body waves can be separated as long as they are dissimilar in one of these attributes. We compare our method with commonly used filtering techniques by applying the methods to synthetic and real seismic data. The results indicate that our method can attenuate ground roll while preserving body waves more effectively than the other methods.

  7. Molecular modeling: An open invitation for applied mathematics

    NASA Astrophysics Data System (ADS)

    Mezey, Paul G.

    2013-10-01

    Molecular modeling methods provide a very wide range of challenges for innovative mathematical and computational techniques, where often high dimensionality, large sets of data, and complicated interrelations imply a multitude of iterative approximations. The physical and chemical basis of these methodologies involves quantum mechanics with several non-intuitive aspects, where classical interpretation and classical analogies are often misleading or outright wrong. Hence, instead of the everyday, common sense approaches which work so well in engineering, in molecular modeling one often needs to rely on rather abstract mathematical constraints and conditions, again emphasizing the high level of reliance on applied mathematics. Yet, the interdisciplinary aspects of the field of molecular modeling also generates some inertia and perhaps too conservative reliance on tried and tested methodologies, that is at least partially caused by the less than up-to-date involvement in the newest developments in applied mathematics. It is expected that as more applied mathematicians take up the challenge of employing the latest advances of their field in molecular modeling, important breakthroughs may follow. In this presentation some of the current challenges of molecular modeling are discussed.

  8. [Cost of therapy for neurodegenerative diseases. Applying an activity-based costing system].

    PubMed

    Sánchez-Rebull, María-Victoria; Terceño Gómez, Antonio; Travé Bautista, Angeles

    2013-01-01

    To apply the activity based costing (ABC) model to calculate the cost of therapy for neurodegenerative disorders in order to improve hospital management and allocate resources more efficiently. We used the case study method in the Francolí long-term care day center. We applied all phases of an ABC system to quantify the cost of the activities developed in the center. We identified 60 activities; the information was collected in June 2009. The ABC system allowed us to calculate the average cost per patient with respect to the therapies received. The most costly and commonly applied technique was psycho-stimulation therapy. Focusing on this therapy and on others related to the admissions process could lead to significant cost savings. ABC costing is a viable method for costing activities and therapies in long-term day care centers because it can be adapted to their structure and standard practice. This type of costing allows the costs of each activity and therapy, or combination of therapies, to be determined and aids measures to improve management. Copyright © 2012 SESPAS. Published by Elsevier Espana. All rights reserved.
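
    The ABC mechanics reduce to tracing resource costs to activities through driver shares and then dividing by the patients served. A toy sketch with invented figures (the study identified 60 activities, far more than shown here):

      # resource costs are traced to activities through driver shares,
      # then averaged over the patients treated; all numbers are invented
      resources = {"staff_hours": 12000.0, "materials": 3000.0}   # monthly cost
      driver_share = {                     # share of each resource consumed
          "psycho_stimulation": {"staff_hours": 0.40, "materials": 0.25},
          "admissions":         {"staff_hours": 0.15, "materials": 0.10},
      }
      activity_cost = {a: sum(resources[r] * s for r, s in shares.items())
                       for a, shares in driver_share.items()}
      patients = 30
      cost_per_patient = {a: c / patients for a, c in activity_cost.items()}
      print(activity_cost, cost_per_patient)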

  9. Visual Occlusion During Minimally Invasive Surgery: A Contemporary Review of Methods to Reduce Laparoscopic and Robotic Lens Fogging and Other Sources of Optical Loss.

    PubMed

    Manning, Todd G; Perera, Marlon; Christidis, Daniel; Kinnear, Ned; McGrath, Shannon; O'Beirne, Richard; Zotov, Paul; Bolton, Damien; Lawrentschuk, Nathan

    2017-04-01

    Maintenance of optimal vision during minimally invasive surgery is crucial to maintaining operative awareness, efficiency, and safety. Hampered vision is commonly caused by laparoscopic lens fogging (LLF), which has prompted the development of various antifogging fluids and warming devices. However, limited comparative evidence exists in contemporary literature. Despite technologic advancements there remains no consensus as to superior methods to prevent LLF or restore visual acuity once LLF has occurred. We performed a review of literature to present the current body of evidence supporting the use of numerous techniques. A standardized Preferred Reporting Items for Systematic Reviews and Meta-Analysis review was performed, and PubMed, Embase, Web of Science, and Google Scholar were searched. Articles pertaining to mechanisms and prevention of LLF were reviewed. We applied no limit to year of publication or publication type and all articles encountered were included in final review. Limited original research and heterogenous outcome measures precluded meta-analytical assessment. Vision loss has a multitude of causes and although scientific theory can be applied to in vivo environments, no authors have completely characterized this complex problem. No method to prevent or correct LLF was identified as superior to others and comparative evidence is minimal. Robotic LLF was poorly investigated and aside from a single analysis has not been directly compared to standard laparoscopic fogging in any capacity. Obscured vision during surgery is hazardous and typically caused by LLF. The etiology of LLF despite application of scientific theory is yet to be definitively proven in the in vivo environment. Common methods of prevention of LLF or restoration of vision due to LLF have little evidence-based data to support their use. A multiarm comparative in vivo analysis is required to formally assess these commonly used techniques in both standard and robotic laparoscopes.

  10. Retrospective analysis of two hundred thirty-five pediatric mandibular fracture cases.

    PubMed

    Eskitascioglu, Teoman; Ozyazgan, Irfan; Coruh, Atilla; Gunay, Galip K; Yuksel, Esabil

    2009-11-01

    Maxillofacial fractures are encountered less commonly during childhood due to anatomic, social, cultural, and environmental factors. Although the incidence of all maxillofacial fractures is 1% to 15% among pediatric and adolescent patients, this rate drops to less than 1% in children below 5 years of age. Two hundred thirty-five cases (

  11. Analysis of bacteria on steel surfaces using reflectance micro-Fourier transform infrared spectroscopy.

    PubMed

    Ojeda, Jesús J; Romero-González, María E; Banwart, Steven A

    2009-08-01

    Reflectance micro-Fourier transform infrared (FT-IR) analysis has been applied to characterize biofilm formation by Aquabacterium commune, a common microorganism in drinking water distribution systems, on the increasingly popular pipe material stainless steel EN1.4307. The applicability of the reflectance micro-FT-IR technique for analyzing the bacterial functional groups is discussed, and the results are compared to spectra obtained using more conventional FT-IR techniques: transmission micro-FT-IR, attenuated total reflectance (ATR), and KBr pellets. The differences between the infrared spectra of wet and dried bacteria, as well as free versus attached bacteria, are also discussed. The spectra obtained using reflectance micro-FT-IR spectroscopy were comparable to those obtained using other FT-IR techniques. The absence of sample preparation, the potential to analyze intact samples, and the ability to characterize opaque and thick samples without the need to transfer the bacterial samples to an infrared-transparent medium or produce a pure culture were the main advantages of reflectance micro-FT-IR spectroscopy.

  12. Application of the FICTION technique for the simultaneous detection of immunophenotype and chromosomal abnormalities in routinely fixed, paraffin wax embedded bone marrow trephines

    PubMed Central

    Korać, P; Jones, M; Dominis, M; Kušec, R; Mason, D Y; Banham, A H; Ventura, R A

    2005-01-01

    The use of interphase fluorescence in situ hybridisation (FISH) to study cytogenetic abnormalities in routinely fixed paraffin wax embedded tissue has become commonplace over the past decade. However, very few studies have applied FISH to routinely fixed bone marrow trephines (BMTs). This may be because of the acid based decalcification methods that are commonly used during the processing of BMTs, which may adversely affect the suitability of the sample for FISH analysis. For the first time, this report describes the simultaneous application of FISH and immunofluorescent staining (the FICTION technique) to formalin fixed, EDTA decalcified and paraffin wax embedded BMTs. This technique allows the direct correlation of genetic abnormalities to immunophenotype, and therefore will be particularly useful for the identification of genetic abnormalities in specific tumour cells present in BMTs. The application of this to routine clinical practice will assist diagnosis and the detection of minimal residual disease. PMID:16311361

  13. µ-XRF Studies on the Colour Brilliance in Ancient Wool Carpets

    PubMed Central

    Meyer, Markus; Borca, Camelia N.; Huthwelker, Thomas; Bieber, Manfred; Meßlinger, Karl; Fink, Rainer H.

    2017-01-01

    Many handmade ancient and recent oriental wool carpets show an outstanding brilliance and persistence of colour that is not achieved by common industrial dyeing procedures. Anthropologists have suggested the influence of wool fermentation prior to dyeing as a key technique for achieving this high dyeing quality. By means of μ-XRF elemental mapping of mordant metals, we corroborate this view and show a deep and homogeneous penetration of colourants into fermented wool fibres. Furthermore, we are able to apply this technique to show that the fermentation process in ancient specimens cannot be investigated by standard methods owing to the lack of intact cuticle layers. This finding suggests a broad range of further investigations that will contribute to a deeper understanding of the development of traditional dyeing techniques. Spectroscopic studies add information on the oxidation states of the metal ions within the respective mordant-dye complexes and suggest a partial charge transfer as the basis for the significant colour change observed when Fe mordants are used. PMID:29109824

  14. Three solutions to a single problem: alternative casting frames for treating infantile idiopathic scoliosis.

    PubMed

    Halanski, Matthew A; Harper, Benjamin L; Cassidy, Jeffry A; Crawford, Haemish A

    2013-07-01

    This is a technique article discussing 3 alternative frames for casting children with infantile scoliosis, intended to provide surgeons with alternatives to expensive specialized casting tables and to allow local treatment of these children using materials readily available at most institutions. Casting for infantile scoliosis has become more popular as reports have shown promising results with this technique without the morbidity and complications associated with more invasive procedures. However, without a specialized casting table, treatment of these patients has been limited to a few centers throughout the country, often requiring patients to travel large distances to receive care. Three different alternatives to commercially available casting frames are presented, and their requirements, setup, and techniques are discussed. Each surgeon has had success with each of these frames. These provide adequate support and traction while allowing enough access to the trunk to apply a well-molded cast. Cotrel/Mehta casting for infantile scoliosis can be accomplished without a specialized table using commonly available equipment.

  15. Genome scale enzyme–metabolite and drug–target interaction predictions using the signature molecular descriptor

    DOE PAGES

    Faulon, Jean-Loup; Misra, Milind; Martin, Shawn; ...

    2007-11-23

    Motivation: Identifying protein enzymatic and pharmacological activities is an important area of research in biology and chemistry. Biological and chemical databases are increasingly being populated with linkages between protein sequences and chemical structures, and there is now sufficient information to apply machine-learning techniques to predict interactions between chemicals and proteins at a genome scale. Current machine-learning techniques use as input either protein sequences and structures or chemical information. We propose here a method to infer protein–chemical interactions using heterogeneous input consisting of both protein sequence and chemical information. Results: Our method relies on expressing proteins and chemicals with a common cheminformatics representation. We demonstrate our approach by predicting whether proteins can catalyze reactions not present in training sets. We also predict whether a given drug can bind a target, in the absence of prior binding information for that drug and target. Such predictions cannot be made with current machine-learning techniques, which require binding information for individual reactions or individual targets.

  16. On the Active and Passive Flow Separation Control Techniques over Airfoils

    NASA Astrophysics Data System (ADS)

    Moghaddam, Tohid; Banazadeh Neishabouri, Nafiseh

    2017-10-01

    In the present work, recent advances in the field of active and passive flow separation control, particularly blowing and suction flow control techniques applied to common airfoils, are briefly reviewed. This broad research area has remained a point of interest for many years because it is relevant to a wide range of applications. Among the available methods, suction and blowing are the most technically feasible and market-ready techniques. It is well established that uniform and/or oscillatory blowing and suction flow control mechanisms significantly improve the lift-to-drag ratio and, further, postpone boundary layer separation as well as stall. The oscillatory form of blowing and suction control, however, is more efficient than the uniform one. A wide range of parameters is involved in controlling the behavior of a blowing and/or suction flow control scheme, including the location, length, and angle of the jet slots; the oscillation range of the jet slot is another substantial parameter.

  17. Endothelial effects of hemostatic devices for continuous cardioplegia or minimally invasive operations.

    PubMed

    Perrault, L P; Menasché, P; Wassef, M; Bidouard, J P; Janiak, P; Villeneuve, N; Jacquemin, C; Bloch, G; Vilaine, J P; Vanhoutte, P M

    1996-10-01

    Improvements in myocardial protection may include the continuous delivery of normothermic blood cardioplegia. Technical aids are required for optimal visualization of the operative field during the performance of coronary anastomoses if cardioplegia is to be given continuously or during minimally invasive operations. However, the effects of the different hemostatic devices on coronary endothelial function are unknown. We compared the effects on endothelial function of two commonly used hemostatic techniques, coronary clamping and gas jet insufflation, with those of a technique using extravascular balloon occlusion to mimic systolic luminal closure by the surrounding myocardium. The three techniques were applied for 15 minutes on porcine epicardial coronary arteries from explanted hearts. For coronary clamping, standard bulldog clamps were used. Gas jet insufflation was applied by blowing oxygen (12 L/min) tangentially at a 45-degree angle 1 cm away from a 3-mm arteriotomy. Extravascular balloon occlusion was achieved with a needle-tipped silicone loop, the midportion of which, once positioned beneath the coronary artery, was inflated to push a myocardial "cushion" against the back of the vessel until its occlusion. Control rings were taken from the same coronary artery. The endothelial function of control and instrumented arterial rings was then studied in organ chambers filled with modified Krebs-Ringer bicarbonate solution. Contractions to potassium chloride and prostaglandin F2 alpha and endothelium-independent relaxation to SIN-1, a nitric oxide donor, were unaffected in all groups. Endothelium-dependent relaxation to serotonin was impaired after clamping and preserved after gas jet insufflation and extravascular balloon occlusion. Maximal endothelium-dependent relaxation to serotonin was as follows: for coronary clamping, 63% +/- 6% versus 87% +/- 3% in controls; for gas jet insufflation, 67% +/- 12% versus 88% +/- 7%; and for extravascular balloon occlusion, 79% +/- 6% versus 85% +/- 5%. Whereas commonly used hemostatic devices may impair endothelial function, extravascular balloon occlusion appears to achieve effective hemostasis while preserving endothelial integrity.

  18. In vitro stent lumen visualisation of various common and newly developed femoral artery stents using MR angiography at 1.5 and 3 tesla.

    PubMed

    Syha, R; Ketelsen, D; Kaempf, M; Mangold, S; Sixt, S; Zeller, T; Springer, F; Schick, F; Claussen, C D; Brechtel, K

    2013-02-01

    To evaluate stent lumen assessment of various commonly used and newly developed stents for the superficial femoral artery (SFA) using MR angiography (MRA) at 1.5 and 3 T. Eleven nitinol stents and one cobalt-chromium stent were compared regarding stent lumen visualisation using a common three-dimensional MRA sequence. Maximum visible stent lumen width and contrast ratio were analysed in three representative slices for each stent type, and a scoring system for lumen visualisation was applied. Nitinol stents showed significantly better performance than the cobalt-chromium stent (P < 0.05) at 1.5 and 3 T. Maximum visible stent lumen ranged between 43.4% and 95.5%, and contrast ratio between 7.2% and 110.6%. At both field strengths, seven of the nitinol stents were classified as "suitable", three nitinol stents were "limited", and one nitinol stent and the cobalt-chromium stent were "not suitable". Intraluminal loss of signal and artefacts of most of the SFA stents do not markedly limit assessment of the stent lumen by MRA at 1.5 and 3 T. MRA can thus be considered a valid technique for detection of relevant in-stent restenosis. The applied field strength does not strongly influence stent lumen assessment in general, but a proper choice of field strength might be helpful.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saha, K; Barbarits, J; Humenik, R

    Purpose: Chang's mathematical formulation is a common method of attenuation correction applied to reconstructed Jaszczak phantom images. Though Chang's attenuation correction method has been used for 360° acquisition, its applicability to 180° acquisition remains a question, with one vendor's camera software producing artifacts. The objective of this work is to ensure that Chang's attenuation correction technique can be applied to reconstructed Jaszczak phantom images acquired in both 360° and 180° modes. Methods: The Jaszczak phantom, filled with 20 mCi of diluted Tc-99m, was placed on the patient table of Siemens e.cam™ (n = 2) and Siemens Symbia™ (n = 1) dual-head gamma cameras, centered in both the lateral and axial directions. A total of 3 scans were done in the 180° and 2 scans in the 360° orbit acquisition mode. Thirty-two million counts were acquired in both modes. Reconstruction of the projection data was performed using filtered back projection smoothed with a pre-reconstruction Butterworth filter (order: 6, cutoff: 0.55). Reconstructed transaxial slices were attenuation corrected by Chang's attenuation correction technique as implemented in the camera software. Corrections were also done using a modified technique in which the photon path lengths for all possible attenuation paths through a pixel in the image space were added to estimate the corresponding attenuation factor; the inverse of the attenuation factor was used to correct the attenuated pixel counts. Results: Comparable uniformity and noise were observed for 360° acquired phantom images attenuation corrected by the vendor technique (28.3% and 7.9%) and the proposed technique (26.8% and 8.4%). The difference in uniformity for 180° acquisition between the proposed technique (22.6% and 6.8%) and the vendor technique (57.6% and 30.1%) was more substantial. Conclusion: Assessment of attenuation correction performance by phantom uniformity analysis illustrated improved uniformity with the proposed algorithm compared to the camera software.
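
    As a rough illustration of the first-order Chang formulation discussed above, the following sketch computes per-pixel correction factors for a uniform circular attenuator by averaging the transmission over many path directions and inverting it. The phantom radius, the attenuation coefficient of 0.15 cm^-1, and the pixel-equals-one-centimeter scaling are illustrative assumptions rather than parameters of the study, and neither the vendor implementation nor the modified technique is reproduced here; restricting the averaged directions to a 180° arc merely mimics correction over a 180° orbit.

        import numpy as np

        def chang_correction_map(shape, radius, mu=0.15, n_angles=64, arc=360.0):
            """First-order Chang correction factors for a uniform circular
            attenuator. Each in-phantom pixel gets the inverse of its mean
            transmission exp(-mu * L) over n_angles directions spanning
            'arc' degrees; pixel size is taken as 1 cm (illustrative)."""
            ny, nx = shape
            cy, cx = (ny - 1) / 2.0, (nx - 1) / 2.0
            thetas = np.deg2rad(np.linspace(0.0, arc, n_angles, endpoint=False))
            corr = np.ones(shape)
            for iy in range(ny):
                for ix in range(nx):
                    r2 = (ix - cx) ** 2 + (iy - cy) ** 2
                    if r2 > radius ** 2:
                        continue  # pixel lies outside the attenuator
                    # distance from the pixel to the circular boundary per direction
                    b = (ix - cx) * np.cos(thetas) + (iy - cy) * np.sin(thetas)
                    L = -b + np.sqrt(b * b - (r2 - radius ** 2))
                    corr[iy, ix] = 1.0 / np.mean(np.exp(-mu * L))
            return corr

        # corrected = reconstructed_slice * chang_correction_map(slice_shape, radius_px)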

  20. SVPWM Technique with Varying DC-Link Voltage for Common Mode Voltage Reduction in a Matrix Converter and Analytical Estimation of its Output Voltage Distortion

    NASA Astrophysics Data System (ADS)

    Padhee, Varsha

    Common Mode Voltage (CMV) in any power converter has been a major contributor to premature motor failures, bearing deterioration, shaft voltage build-up, and electromagnetic interference. Intelligent control methods such as Space Vector Pulse Width Modulation (SVPWM) techniques provide immense potential and flexibility to reduce CMV, thereby targeting all the aforementioned problems. Other solutions, such as passive filters, shielded cables, and EMI filters, add to the volume and cost of the entire system; smart SVPWM techniques therefore come with the important advantage of being an economical solution. This thesis discusses a modified space vector technique applied to an Indirect Matrix Converter (IMC) which results in the reduction of common mode voltages along with other advanced features. The conventional indirect space vector pulse-width modulation (SVPWM) method of controlling matrix converters involves the use of two adjacent active vectors and one zero vector for both the rectifying and inverting stages of the converter. By suitable selection of space vectors, the rectifying stage of the matrix converter can generate different levels of virtual DC-link voltage. This capability can be exploited to operate the converter in different ranges of modulation indices for varying machine speeds, which results in lower common mode voltage and improves the harmonic spectrum of the output voltage without increasing the number of switching transitions relative to conventional modulation. In summary, the responsibility for formulating output voltages of a particular magnitude and frequency is transferred solely to the rectifying stage of the IMC. Estimation of the degree of distortion in the three-phase output voltage is another facet discussed in this thesis. A detailed understanding of the SVPWM technique and the switching sequence of the space vectors makes it possible to estimate the RMS value of the switched output voltage of any converter, which conceivably aids the sizing and design of output passive filters. An analytical estimation method is presented to achieve this purpose for an IMC. Knowledge of the fundamental component of the output voltage can be used to calculate its Total Harmonic Distortion (THD). The effectiveness of the proposed SVPWM algorithms and the analytical estimation technique is substantiated by simulations in MATLAB/Simulink and experiments on a laboratory prototype of the IMC. Comparison plots are provided to contrast the performance of the proposed methods with the conventional SVPWM method. The behavior of output voltage distortion and CMV with variation in operating parameters such as modulation index and output frequency is also analyzed.
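
    The THD figure mentioned in the final paragraph can be checked numerically with a short sketch. The code below is a generic illustration rather than the thesis's analytical estimation method: it recovers the fundamental from an FFT bin and applies the standard definition THD = sqrt(V_rms^2 - V1_rms^2) / V1_rms; the sampling rate, fundamental frequency, and square-wave test signal are assumptions chosen for the example.

        import numpy as np

        def thd_from_samples(v, fs, f1):
            """Estimate total harmonic distortion of a sampled waveform.
            v: samples spanning an integer number of fundamental periods,
            fs: sampling rate in Hz, f1: fundamental frequency in Hz."""
            v = np.asarray(v, dtype=float)
            n = len(v)
            k1 = int(round(f1 * n / fs))              # FFT bin of the fundamental
            X = np.fft.rfft(v) / n
            v1_rms = np.sqrt(2.0) * np.abs(X[k1])     # RMS of the fundamental
            vac = v - v.mean()                        # drop any DC offset
            v_rms = np.sqrt(np.mean(vac ** 2))        # total RMS of the AC part
            return np.sqrt(max(v_rms ** 2 - v1_rms ** 2, 0.0)) / v1_rms

        # Sanity check: an ideal square wave (a crude switched output) has THD ~ 0.483
        fs, f1 = 100_000, 50
        t = np.arange(0, 0.2, 1.0 / fs)               # exactly 10 fundamental periods
        square = np.sign(np.sin(2 * np.pi * f1 * t))
        print(thd_from_samples(square, fs, f1))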

  1. Reconstruction of reflectance data using an interpolation technique.

    PubMed

    Abed, Farhad Moghareh; Amirshahi, Seyed Hossein; Abed, Mohammad Reza Moghareh

    2009-03-01

    A linear interpolation method is applied for reconstruction of the reflectance spectra of Munsell as well as ColorChecker SG color chips from the corresponding colorimetric values under a given set of viewing conditions. Different types of lookup tables (LUTs) have been created to connect the colorimetric and spectrophotometric data as the source and destination spaces in this approach. To optimize the algorithm, different color spaces and light sources have been used to build different types of LUTs, and the effects of the applied color datasets as well as the employed color spaces are investigated. Results of recovery are evaluated by the mean and the maximum color difference values under other sets of standard light sources. The mean and the maximum values of the root mean square (RMS) error between the reconstructed and the actual spectra are also calculated. Since the speed of reflectance reconstruction is a key point in the LUT algorithm, the processing time spent interpolating the spectral data has also been measured for each model. Finally, the performance of the suggested interpolation technique is compared with that of the common principal component analysis method. According to the results, using the CIEXYZ tristimulus values as the source space is preferable to the CIELAB color space. Besides, the colorimetric position of a desired sample is a key factor in the success of the approach: because of the nature of the interpolation technique, the colorimetric position of the desired samples should be located inside the color gamut of the available samples in the dataset. The spectra reconstructed by this technique show considerable improvement in terms of RMS error between the actual and the reconstructed reflectance spectra, as well as CIELAB color differences under other light sources, in comparison with those obtained from the standard PCA technique.
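
    To make the LUT idea concrete, the sketch below builds a linear interpolator from tristimulus values to reflectance spectra with SciPy. The synthetic training reflectances and colour-matching weights are stand-ins, not the Munsell or ColorChecker SG data, and SciPy's LinearNDInterpolator is used as a generic substitute for the paper's LUT construction. Consistent with the gamut caveat above, queries outside the convex hull of the training colours return NaN.

        import numpy as np
        from scipy.interpolate import LinearNDInterpolator

        rng = np.random.default_rng(0)
        refl_train = rng.uniform(0.05, 0.95, size=(500, 31))  # stand-in spectra, 400-700 nm
        cmf = rng.uniform(0.0, 1.0, size=(31, 3))             # stand-in colour-matching weights
        xyz_train = refl_train @ cmf                          # tristimulus values (source space)

        lut = LinearNDInterpolator(xyz_train, refl_train)     # the XYZ -> reflectance lookup table

        # Query colours nudged toward the gamut centre so they stay inside the hull
        xyz_query = 0.9 * xyz_train[:5] + 0.1 * xyz_train.mean(axis=0)
        refl_hat = lut(xyz_query)                             # NaN for out-of-gamut queries
        rms = np.sqrt(np.nanmean((refl_hat - refl_train[:5]) ** 2, axis=1))
        print(rms)                                            # rough recovery error per sample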

  2. [Ligament-controlled positioning of the knee prosthesis components].

    PubMed

    Widmer, K-H; Zich, A

    2015-04-01

    There are at least two predominant goals in total knee replacement: first, the surgeon aims to achieve an optimal postoperative kinematic motion close to the patient's physiological range, and second, he aims for concurrently high ligament stability to establish pain-free movement over the entire range of motion. A number of prosthetic designs and surgical techniques have been developed in recent years to achieve both of these targets. This study presents another modified surgical procedure for total knee implantation. As in common practice, the osteotomies are planned preoperatively by referencing well-defined bony landmarks, but their placement and orientation are also controlled intraoperatively, in a stepwise sequence, via ligamentous linkages. This method is open to all surgical approaches and can be applied with PCL-conserving or -sacrificing techniques. The anterior femoral osteotomy is carried out first, followed by the distal femoral osteotomy. Then the extension gap is finalized by tensioning the ligaments and "top-down" referencing at the level of the tibial osteotomy, followed by finishing the flexion gap in the same way, except that the osteotomy of the posterior condyles is referenced in a "bottom-up" fashion. Hence, this technique relies on both bony and ligament-controlled procedures; it respects the modified ligamentous framework and drives the prosthetic components into the new ligamentous envelope. Further improvement may be achieved by additional control of the kinematics during surgery using modern computer navigation technology.

  3. Analysis and modification of blue sapphires from Rwanda by ion beam techniques

    NASA Astrophysics Data System (ADS)

    Bootkul, D.; Chaiwai, C.; Tippawan, U.; Wanthanachaisaeng, B.; Intarasiri, S.

    2015-12-01

    Blue sapphire belongs to the corundum (Al2O3) group. Gems of this group have always amazed with their beauty and thus command high value. In this study, blue sapphires from Rwanda, which recently entered the Thai gemstone industry, were chosen for investigation. On the one hand, we applied Particle Induced X-ray Emission (PIXE), a highly sensitive and precise analytical technique that can identify and quantify trace elements, for chemical analysis of the sapphires. We found that the major element of blue sapphires from Rwanda is Al, with trace elements such as Fe, Ti, Cr, Ga and Mg, as are commonly found in normal blue sapphire. On the other hand, we applied low- and medium-energy ion implantation for color improvement of the sapphires. It appears that the large amount of energy transferred during cascade collisions altered the gems' properties: the blue color of the sapphires was clearly intensified after nitrogen ion bombardment, and the gems also gained transparency and luster. UV-Vis-NIR measurements detected the modification of their absorption properties, consistent with the deepening of the blue color. The mechanism of these modifications is postulated and reported. From every point of view, bombardment with a nitrogen ion beam is a promising technique for quality improvement of blue sapphire from Rwanda.

  4. Label-free in vivo analysis of intracellular lipid droplets in the oleaginous microalga Monoraphidium neglectum by coherent Raman scattering microscopy

    PubMed Central

    Jaeger, Daniel; Pilger, Christian; Hachmeister, Henning; Oberländer, Elina; Wördenweber, Robin; Wichmann, Julian; Mussgnug, Jan H.; Huser, Thomas; Kruse, Olaf

    2016-01-01

    Oleaginous photosynthetic microalgae hold great promise as non-food feedstocks for the sustainable production of bio-commodities. The algal lipid quality can be analysed by Raman micro-spectroscopy, and the lipid content can be imaged in vivo in a label-free and non-destructive manner by coherent anti-Stokes Raman scattering (CARS) microscopy. In this study, both techniques were applied to the oleaginous microalga Monoraphidium neglectum, a biotechnologically promising microalga resistant to commonly applied lipid staining techniques. The lipid-specific CARS signal was successfully separated from the interfering two-photon excited fluorescence of chlorophyll and, for the first time, lipid droplet formation during nitrogen starvation could be directly analysed. We found that the neutral lipid content deduced from CARS image analysis strongly correlated with the neutral lipid content measured gravimetrically and, furthermore, that the relative degree of unsaturation of fatty acids stored in lipid droplets remained similar. Interestingly, the lipid profile during cellular adaptation to nitrogen starvation showed a two-phase characteristic, with initial fatty acid recycling and subsequent de novo lipid synthesis. This work demonstrates the potential of quantitative CARS microscopy as a label-free lipid analysis technique for any microalgal species, which is highly relevant for future biotechnological applications and for elucidating the process of microalgal lipid accumulation. PMID:27767024

  5. Esophageal testing: What we have so far

    PubMed Central

    de Bortoli, Nicola; Martinucci, Irene; Bertani, Lorenzo; Russo, Salvatore; Franchi, Riccardo; Furnari, Manuele; Tolone, Salvatore; Bodini, Giorgia; Bolognesi, Valeria; Bellini, Massimo; Savarino, Vincenzo; Marchi, Santino; Savarino, Edoardo Vincenzo

    2016-01-01

    Gastroesophageal reflux disease (GERD) is a common disorder of the gastrointestinal tract. In the last few decades, new technologies have evolved and been applied to the functional study of the esophagus, improving our knowledge of the pathophysiology of GERD. High-resolution manometry (HRM) permits greater understanding of the function of the esophagogastric junction and the risks associated with hiatal hernia. Moreover, HRM has been found to be more reproducible and sensitive than conventional water-perfused manometry for detecting transient lower esophageal sphincter relaxation. Esophageal 24-h pH-metry, with or without combined impedance, is usually performed in patients with negative endoscopy and reflux symptoms who have a poor response to anti-reflux medical therapy, to assess esophageal acid exposure and symptom-reflux correlations. In particular, esophageal 24-h impedance and pH monitoring can detect both acid and non-acid reflux events. EndoFLIP is a recent technique, still rarely applied in clinical practice, that provides a large amount of information about the esophagogastric junction. In the coming years, laryngopharyngeal symptoms could be evaluated with emerging non-invasive or minimally invasive techniques, such as pepsin detection in saliva or pharyngeal pH-metry. Future studies are required to evaluate the diagnostic accuracy and usefulness of these techniques, although the available data are promising. PMID:26909230

  6. Ultrasonic Fingerprinting of Structural Materials: Spent Nuclear Fuel Containers Case-Study

    NASA Astrophysics Data System (ADS)

    Sednev, D.; Lider, A.; Demyanuk, D.; Kroening, M.; Salchak, Y.

    Nowadays, NDT is mainly focused on safety purposes, but it also seems possible to apply its methods in support of national and IAEA safeguards. The containment of spent fuel in storage casks could be dramatically improved by the development of so-called "smart" spent fuel storage and transfer casks, which would have tamper-indicating and monitoring/tracking features integrated directly into the cask design. The microstructure of the container material, as well as of the dedicated weld seam joining the lid and the cask body, provides a unique fingerprint of the full container that can be reproducibly scanned using an appropriate technique. The echo-sounding technique, the most commonly used method for material inspection, was chosen for this project; the measured parameter is the acoustic noise reflected from artefacts in the material, from which a structural fingerprint is obtained. Reference measurements were compared with additional measurement results, and the comparison verified the applicability of structural fingerprinting and the chosen inspection method. Successful authentication was demonstrated by levels of feature-point agreement exceeding a given threshold, which differed considerably from the agreement obtained when authenticating against other containers. Since reproduction or duplication of the proposed unique identification characteristics is impossible at the current state of science and technology, this technique is considered capable of identifying interference with nuclear materials with high accuracy.

  7. Data in support of the detection of genetically modified organisms (GMOs) in food and feed samples.

    PubMed

    Alasaad, Noor; Alzubi, Hussein; Kader, Ahmad Abdul

    2016-06-01

    Food and feed samples were randomly collected from different sources, including local and imported materials from the Syrian local market. These included maize, barley, soybean, fresh food samples and raw material. GMO detection was conducted by PCR and nested PCR-based techniques, using specific primers for foreign DNA sequences commonly used in genetic transformation procedures, i.e., the 35S promoter, T-nos, epsps, the cryIA(b) gene and the nptII gene. The results revealed, for the first time in Syria, the presence of GM foods and feeds carrying the glyphosate-resistance trait, with the P35S promoter and NOS terminator detected in the imported soybean samples at high frequency (5 of the 6 imported soybean samples), while tests were negative for the local samples. Tests also revealed GMOs in two imported maize samples, detecting the presence of the 35S promoter and nos terminator. Nested PCR results using two sets of primers confirmed these data. The methods applied in this data brief are based on DNA analysis by Polymerase Chain Reaction (PCR). This technique is specific, practical, reproducible and sensitive enough to detect as little as 0.1% GMO in food and/or feedstuffs. Furthermore, all of the techniques mentioned are economical and can be applied in Syria and other developing countries. For all these reasons, the DNA-based analysis methods were chosen and preferred over protein-based analysis.

  8. Application of noninvasive brain stimulation for post-stroke dysphagia rehabilitation.

    PubMed

    Wang, Zhuo; Song, Wei-Qun; Wang, Liang

    2017-02-01

    Noninvasive brain stimulation (NIBS), commonly comprising transcranial magnetic stimulation (TMS), transcranial direct-current stimulation (tDCS), and paired associative stimulation (PAS), has attracted increasing interest and has been applied experimentally in the treatment of post-stroke dysphagia (PSD). This review presents a synopsis of current research on the application of NIBS to PSD. The intention is to convey the current progress and limitations in this field and to stimulate potential research questions not yet investigated for the application of NIBS in patients with PSD. We successively review advances in repetitive TMS (rTMS), tDCS, and PAS techniques in both healthy participants and PSD patients in three respects: scientific research on dysphagia mechanisms, applied studies on stimulation parameters, and clinical trials of their therapeutic effects. The techniques of NIBS, especially rTMS, have been used to explore the differing mechanisms of swallowing recovery and extremity rehabilitation. The key findings include the important role of intact-hemisphere reorganization in PSD recovery and the therapeutic potential of NIBS applied to the contra-lesional side for dysphagia rehabilitation. Although significant results were achieved in most studies using NIBS for swallowing rehabilitation, it is still difficult to draw conclusions about the efficacy of these neurostimulation techniques, considering the great disparities between studies.

  9. Using Just in Time Teaching in a Global Climate Change Course to Address Misconceptions

    NASA Astrophysics Data System (ADS)

    Schuenemann, K. C.

    2013-12-01

    Just in Time Teaching (JiTT) is employed in an introductory Global Climate Change college course with the intention of addressing common misconceptions and climate myths. Students enter the course with a variety of prior knowledge and opinions on global warming, and JiTT can be used as a constructivist pedagogical approach to make use of this prior knowledge. Students are asked to watch a short video or complete a reading: for example, a screen-capture video created by the professor to review material from the previous class, a web video from NASA or NOAA, or a reading from an online article or their textbook. Afterwards, students answer a question carefully designed to probe a common misconception, or are simply asked for the 'muddiest point' that remains on the concept. The assignment is completed the night before class using a web program, which aggregates the answers in an organized way so that the professor can design the day's lesson around the misconceptions or concerns the students displayed, as well as quickly assign participation credit to students who completed the assignment. Conversely, if students show that they have already mastered the material, the professor can confidently move on to the next concept. The JiTT method personalizes each lecture period to the students in that particular class for maximum efficiency while catching and fixing misconceptions in a timely manner. The technique requires students to spend time with the material outside of class, acts as a review of important concepts, and increases engagement in class due to the personalization of the course. Evaluation results from the use of this technique will be presented, along with examples of successful JiTT videos, questions, student answers, and techniques for addressing misconceptions during lecture, with the intention that instructors can easily apply this technique to their next course.

  10. CBCT-based bone quality assessment: are Hounsfield units applicable?

    PubMed Central

    Jacobs, R; Singer, S R; Mupparapu, M

    2015-01-01

    CBCT is a widely applied imaging modality in dentistry. It enables the visualization of high-contrast structures of the oral region (bone, teeth, air cavities) at a high resolution. CBCT is now commonly used for the assessment of bone quality, primarily for pre-operative implant planning. Traditionally, bone quality parameters and classifications were primarily based on bone density, which could be estimated through the use of Hounsfield units derived from multidetector CT (MDCT) data sets. However, there are crucial differences between MDCT and CBCT, which complicates the use of quantitative gray values (GVs) for the latter. From experimental as well as clinical research, it can be seen that great variability of GVs can exist on CBCT images owing to various reasons that are inherently associated with this technique (i.e. the limited field size, relatively high amount of scattered radiation and limitations of currently applied reconstruction algorithms). Although attempts have been made to correct for GV variability, it can be postulated that the quantitative use of GVs in CBCT should be generally avoided at this time. In addition, recent research and clinical findings have shifted the paradigm of bone quality from a density-based analysis to a structural evaluation of the bone. The ever-improving image quality of CBCT allows it to display trabecular bone patterns, indicating that it may be possible to apply structural analysis methods that are commonly used in micro-CT and histology. PMID:25315442

  11. Decision curve analysis: a novel method for evaluating prediction models.

    PubMed

    Vickers, Andrew J; Elkin, Elena B

    2006-01-01

    Diagnostic and prognostic models are typically evaluated with measures of accuracy that do not address clinical consequences. Decision-analytic techniques allow assessment of clinical outcomes but often require collection of additional information and may be cumbersome to apply to models that yield a continuous result. The authors sought a method for evaluating and comparing prediction models that incorporates clinical consequences, requires only the data set on which the models are tested, and can be applied to models that have either continuous or dichotomous results. The authors describe decision curve analysis, a simple, novel method of evaluating predictive models. They start by assuming that the threshold probability of a disease or event at which a patient would opt for treatment is informative of how the patient weighs the relative harms of a false-positive and a false-negative prediction. This theoretical relationship is then used to derive the net benefit of the model across different threshold probabilities. Plotting net benefit against threshold probability yields the "decision curve." The authors apply the method to models for the prediction of seminal vesicle invasion in prostate cancer patients. Decision curve analysis identified the range of threshold probabilities in which a model was of value, the magnitude of benefit, and which of several models was optimal. Decision curve analysis is a suitable method for evaluating alternative diagnostic and prognostic strategies that has advantages over other commonly used measures and techniques.
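
    The net benefit calculation at the heart of the method is simple enough to state directly: at threshold probability pt, net benefit = TP/n - (FP/n) * pt/(1 - pt). The sketch below applies it to simulated predictions, comparing a model against the treat-all and treat-none policies as the method prescribes; the data and model are invented for illustration, and the paper's seminal vesicle invasion models are not reproduced.

        import numpy as np

        def net_benefit(y, p, pt):
            """Net benefit of treating patients with predicted risk >= pt.
            y: 0/1 outcomes, p: predicted probabilities, pt: threshold probability."""
            treat = p >= pt
            n = len(y)
            tp = np.sum(treat & (y == 1)) / n
            fp = np.sum(treat & (y == 0)) / n
            return tp - fp * pt / (1.0 - pt)

        rng = np.random.default_rng(1)
        p = rng.uniform(size=1000)                      # a toy model's risk predictions
        y = (rng.uniform(size=1000) < p).astype(int)    # outcomes consistent with those risks
        for pt in (0.1, 0.2, 0.4):
            # model vs. 'treat all' (p == 1 for everyone) vs. 'treat none' (always 0)
            print(pt, net_benefit(y, p, pt), net_benefit(y, np.ones_like(p), pt), 0.0)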

  12. A numerical study of different projection-based model reduction techniques applied to computational homogenisation

    NASA Astrophysics Data System (ADS)

    Soldner, Dominic; Brands, Benjamin; Zabihyan, Reza; Steinmann, Paul; Mergheim, Julia

    2017-10-01

    Computing the macroscopic material response of a continuum body commonly involves the formulation of a phenomenological constitutive model. However, the response is mainly influenced by the heterogeneous microstructure. Computational homogenisation can be used to determine the constitutive behaviour on the macro-scale by solving a boundary value problem at the micro-scale for every so-called macroscopic material point within a nested solution scheme. Hence, this procedure requires the repeated solution of similar microscopic boundary value problems. To reduce the computational cost, model order reduction techniques can be applied; an important aspect thereby is the robustness of the obtained reduced model. Within this study, reduced-order modelling (ROM) for the geometrically nonlinear case using hyperelastic materials is applied to the boundary value problem on the micro-scale. This involves the Proper Orthogonal Decomposition (POD) for the primary unknown and hyper-reduction methods for the arising nonlinearity. Three methods for hyper-reduction, differing in how the nonlinearity is approximated and in the subsequent projection, are compared in terms of accuracy and robustness. Introducing interpolation or Gappy-POD based approximations may not preserve the symmetry of the system tangent, rendering the widely used Galerkin projection sub-optimal. Hence, a different projection related to a Gauss-Newton scheme (Gauss-Newton with Approximated Tensors, GNAT) is favoured to obtain an optimal projection and a robust reduced model.
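
    As a minimal sketch of the POD step described above (assuming a generic snapshot matrix rather than the study's hyperelastic micro-scale solutions, and omitting the hyper-reduction of the nonlinearity), the code below extracts a reduced basis from the singular value decomposition and indicates the Galerkin projection of a linearised system onto it.

        import numpy as np

        def pod_basis(snapshots, tol=1e-6):
            """Proper Orthogonal Decomposition of a snapshot matrix.
            snapshots: (n_dof, n_snapshots) solutions of the micro-scale problem.
            Returns the basis V whose modes capture a 1 - tol energy fraction."""
            U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
            energy = np.cumsum(s ** 2) / np.sum(s ** 2)
            k = int(np.searchsorted(energy, 1.0 - tol)) + 1
            return U[:, :k]

        # Galerkin projection of a linear(ised) system K u = f onto the POD basis:
        #   u ~ V q   with   (V^T K V) q = V^T f
        rng = np.random.default_rng(2)
        S = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 40))  # rank-5 snapshots
        V = pod_basis(S)
        print(V.shape)   # (200, 5): the low-rank structure is recovered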

  13. A short review on a complication of lumbar spine surgery: CSF leak.

    PubMed

    Menon, Sajesh K; Onyia, Chiazor U

    2015-12-01

    Cerebrospinal fluid (CSF) leak is a common complication of surgery involving the lumbar spine. Over the past decades, there has been significant advancement in understanding the basis, management, and techniques of treatment for post-operative CSF leak following lumbar spine surgery. In this article, we review previous work on the various factors and technical errors during or after lumbar spine surgery that may lead to this feared complication, the available management options with a focus on the techniques employed, and the outcomes, and we highlight current trends. We also discuss the presentation, the contributing factors, and the basic concepts and practical aspects of management, with emphasis on the different techniques of treatment. Different outcomes following various techniques of managing post-operative CSF leak after lumbar spine surgery have been well described in the literature. However, there is currently no single ideal technique among the available options. The choice of technique in each case depends on the surgeon's cumulative experience as well as on a clear understanding of the underlying contributory factors in each patient, the nature and site of the leak, and the available facilities and equipment.

  14. Balancing research and funding using value of information and portfolio tools for nanomaterial risk classification

    NASA Astrophysics Data System (ADS)

    Bates, Matthew E.; Keisler, Jeffrey M.; Zussblatt, Niels P.; Plourde, Kenton J.; Wender, Ben A.; Linkov, Igor

    2016-02-01

    Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis—methods commonly applied in financial and operations management—to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. Literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios—combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility and surface reactivity were most frequently identified within efficient portfolios in our results.
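
    A stylised version of the probabilistic classification loop can be sketched as follows. The weights, score ranges, two-band classification, and the value-of-information proxy (the expected reduction in the chance of mis-banding when one property is measured exactly) are all invented for illustration; they are not the CB Nanotool factors, the literature-derived priors, or the elicited experiment costs used in the study.

        import numpy as np

        rng = np.random.default_rng(3)

        weights = np.array([0.4, 0.3, 0.2, 0.1])      # illustrative scoring weights
        priors = [(2, 8), (0, 10), (4, 6), (1, 9)]    # uniform prior range per uncertain property

        def misband_chance(fixed=None, n=20000):
            """Monte Carlo chance that the majority hazard-band call is wrong.
            'fixed' pins one property at a value, mimicking a completed experiment."""
            draws = np.column_stack([rng.uniform(lo, hi, n) for lo, hi in priors])
            if fixed is not None:
                i, value = fixed
                draws[:, i] = value
            p_high = np.mean(draws @ weights > 5.0)   # toy two-band cutoff at a score of 5
            return min(p_high, 1.0 - p_high)

        base = misband_chance()
        for i, (lo, hi) in enumerate(priors):
            # expected residual uncertainty if property i were measured exactly,
            # averaged over its prior: a crude value-of-information proxy
            residual = np.mean([misband_chance(fixed=(i, v))
                                for v in np.linspace(lo, hi, 9)])
            print(f"property {i}: VOI proxy = {base - residual:.3f}")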

  15. Towards the estimation of effect measures in studies using respondent-driven sampling.

    PubMed

    Rotondi, Michael A

    2014-06-01

    Respondent-driven sampling (RDS) is an increasingly common sampling technique for recruiting hidden populations. Statistical methods for RDS are not straightforward because of the correlation between individual outcomes and subject weighting; thus, analyses are typically limited to estimation of population proportions. This manuscript applies the method of variance estimates recovery (MOVER) to construct confidence intervals for effect measures such as the risk difference (difference of proportions) or relative risk in studies using RDS. To illustrate the approach, MOVER is used to construct confidence intervals for differences in the prevalence of demographic characteristics between an RDS study and a convenience study of injection drug users. MOVER is then applied to obtain confidence intervals for the relative risk between education levels and HIV seropositivity and current infection with syphilis, respectively. This approach provides a simple method for constructing confidence intervals for effect measures in RDS studies. Since it relies only on a proportion and appropriate confidence limits, it can also be applied to previously published manuscripts.
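
    The MOVER construction itself is compact: given a point estimate and confidence limits for each proportion, the limits are recombined around the difference. The sketch below implements the standard difference-of-proportions form; the limits fed in are invented numbers, whereas in the application above they would come from RDS-adjusted estimators.

        import numpy as np

        def mover_diff_ci(p1, l1, u1, p2, l2, u2):
            """MOVER confidence interval for a difference of proportions p1 - p2,
            given point estimates and confidence limits (l, u) for each proportion."""
            d = p1 - p2
            lower = d - np.sqrt((p1 - l1) ** 2 + (u2 - p2) ** 2)
            upper = d + np.sqrt((u1 - p1) ** 2 + (p2 - l2) ** 2)
            return lower, upper

        # Toy example with confidence limits supplied from elsewhere:
        print(mover_diff_ci(0.40, 0.33, 0.47, 0.25, 0.19, 0.32))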

  16. Investigation of smoothness-increasing accuracy-conserving filters for improving streamline integration through discontinuous fields.

    PubMed

    Steffen, Michael; Curtis, Sean; Kirby, Robert M; Ryan, Jennifer K

    2008-01-01

    Streamline integration of fields produced by computational fluid mechanics simulations is a commonly used tool for the investigation and analysis of fluid flow phenomena. Integration is often accomplished through the application of ordinary differential equation (ODE) integrators, whose error characteristics are predicated on the smoothness of the field through which the streamline is being integrated; that smoothness is not available at the inter-element level of finite volume and finite element data. Adaptive error control techniques are often used to ameliorate the challenge posed by inter-element discontinuities. As the root of the difficulties is the discontinuous nature of the data, we present a complementary approach of applying smoothness-increasing accuracy-conserving filters to the data prior to streamline integration. We investigate whether such an approach applied to uniform quadrilateral discontinuous Galerkin (high-order finite volume) data can be used to augment current adaptive error control approaches. We discuss and demonstrate through numerical example the computational trade-offs exhibited when one applies such a strategy.

  17. Balancing research and funding using value of information and portfolio tools for nanomaterial risk classification.

    PubMed

    Bates, Matthew E; Keisler, Jeffrey M; Zussblatt, Niels P; Plourde, Kenton J; Wender, Ben A; Linkov, Igor

    2016-02-01

    Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis-methods commonly applied in financial and operations management-to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. Literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios-combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility and surface reactivity were most frequently identified within efficient portfolios in our results.

  18. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    PubMed Central

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  19. [Osteosynthesis in the Surgical Treatment of Prognathism: State of The Art].

    PubMed

    Durão, Nuno; Amarante, José

    2017-03-31

    Prognathism is a common skeletal facial abnormality associated with class III malocclusion, often with repercussions on quality of life. In addition to orthodontic treatment, sagittal split ramus osteotomy is the most common technique for its correction, and segment osteosynthesis is an important element of the post-surgical outcome. A search of the PubMed/MEDLINE database and other relevant sources was conducted. The stability of different fixation methods, their repercussions on inferior alveolar nerve lesions, and the type of material are among the most researched subjects. Recent research on the type of osteosynthesis applied in sagittal split ramus osteotomy for mandibular setback is discussed. Miniplates appear to be the better option for fixation of the sagittal split osteotomy for mandibular setback, and bioabsorbable osteosynthesis may be an acceptable alternative to titanium.

  20. The Levy sections theorem revisited

    NASA Astrophysics Data System (ADS)

    Figueiredo, Annibal; Gleria, Iram; Matsushita, Raul; Da Silva, Sergio

    2007-06-01

    This paper revisits the Levy sections theorem. We extend the scope of the theorem to time series and apply it to historical daily returns of selected dollar exchange rates. The elevated kurtosis usually observed in such series is then explained by their volatility patterns, and the duration of exchange rate pegs explains the extra elevated kurtosis in the exchange rates of emerging markets. In the end, our extension of the theorem provides an approach that is simpler than the more common explicit modelling of fat tails and dependence. Our main purpose is to build a technique, based on the sections, that allows one to artificially remove the fat tails and dependence present in a data set. By analysing data through the lens of the Levy sections theorem, one can find common patterns in otherwise very different data sets.
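
    A toy numerical illustration of the section idea (not the authors' exact construction) is given below: returns with volatility clustering show excess kurtosis, but sums taken over variable-length sections, each cut when the accumulated squared return reaches a fixed budget, come out close to Gaussian. All parameters are invented for the demonstration.

        import numpy as np

        rng = np.random.default_rng(4)

        def kurtosis(x):
            x = x - x.mean()
            return np.mean(x ** 4) / np.mean(x ** 2) ** 2   # 3.0 for a Gaussian

        # Toy returns with volatility regimes: Gaussian noise with a block-wise scale
        n = 100_000
        sigma = np.repeat(rng.choice([0.5, 2.0], size=n // 100), 100)
        r = sigma * rng.standard_normal(n)
        print(kurtosis(r))          # well above 3: fat tails from volatility mixing

        def section_sums(r, budget):
            """Cut the series whenever the accumulated squared return reaches
            'budget'; return the sum of the returns within each section."""
            sums, acc, s = [], 0.0, 0.0
            for x in r:
                s += x
                acc += x * x
                if acc >= budget:
                    sums.append(s)
                    acc, s = 0.0, 0.0
            return np.array(sums)

        print(kurtosis(section_sums(r, budget=100.0)))   # close to 3: near-Gaussian sections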

  1. Nanomedicine – challenge and perspectives

    PubMed Central

    Riehemann, Kristina; Schneider, Stefan W.; Luger, Thomas A.; Godin, Biana; Ferrari, Mauro; Fuchs, Harald

    2014-01-01

    Nanomedicine introduces nanotechnology concepts into medicine and thus joins two large cross-disciplinary fields with an unprecedented societal and economic potential arising from the natural combination of specific achievements in the respective fields. The common basis evolves from the molecular-scale properties relevant to both fields. Nanoanalytical tools, such as local probes and molecular imaging techniques, allow us to characterize surface and interface properties on a nanometer scale at predefined locations, while elaborate chemical approaches offer the opportunity to control and address surfaces, e.g., for targeted drug delivery, enhanced biocompatibility, and neuroprosthetic purposes. This commonality opens a wide variety of economic fields of both industrial and clinical interest. However, concerns arise in this cross-disciplinary area about toxicological aspects and ethical implications. This review gives an overview of selected recent developments in nanotechnology applied to medical objectives. PMID:19142939

  2. Identification of Low Order Equivalent System Models From Flight Test Data

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2000-01-01

    Identification of low order equivalent system dynamic models from flight test data was studied. Inputs were pilot control deflections, and outputs were aircraft responses, so the models characterized the total aircraft response including bare airframe and flight control system. Theoretical investigations were conducted and related to results found in the literature. Low order equivalent system modeling techniques using output error and equation error parameter estimation in the frequency domain were developed and validated on simulation data. It was found that some common difficulties encountered in identifying closed loop low order equivalent system models from flight test data could be overcome using the developed techniques. Implications for data requirements and experiment design were discussed. The developed methods were demonstrated using realistic simulation cases, then applied to closed loop flight test data from the NASA F-18 High Alpha Research Vehicle.
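
    The frequency-domain output-error idea can be sketched generically: postulate a low-order transfer function with a pure time delay, evaluate it at the measured frequencies, and minimise the complex mismatch with the frequency response extracted from flight data. The model structure, parameter values, and noise below are invented for illustration and are not the report's specific models or estimation code.

        import numpy as np
        from scipy.optimize import least_squares

        def loes(params, w):
            """Low-order equivalent system: a second-order response with one zero
            and a pure delay, G(jw) = K (jw + z) exp(-jw tau) /
            ((jw)^2 + 2 zeta wn jw + wn^2)."""
            K, z, zeta, wn, tau = params
            jw = 1j * w
            return K * (jw + z) * np.exp(-jw * tau) / (jw ** 2 + 2 * zeta * wn * jw + wn ** 2)

        def residuals(params, w, G_meas):
            e = loes(params, w) - G_meas
            return np.concatenate([e.real, e.imag])   # real residual vector for the solver

        # Synthetic 'flight test' frequency response with multiplicative noise
        w = np.logspace(-1, 1.3, 60)
        true = np.array([8.0, 1.2, 0.6, 3.0, 0.08])
        rng = np.random.default_rng(5)
        G_meas = loes(true, w) * (1 + 0.02 * rng.standard_normal(len(w)))

        fit = least_squares(residuals, x0=[1.0, 1.0, 0.5, 2.0, 0.0], args=(w, G_meas),
                            bounds=([0, 0, 0.05, 0.1, 0], [100, 10, 2, 20, 0.5]))
        print(fit.x)   # should approach the 'true' parameter vector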

  3. Statistical Mechanics of Coherent Ising Machine — The Case of Ferromagnetic and Finite-Loading Hopfield Models —

    NASA Astrophysics Data System (ADS)

    Aonishi, Toru; Mimura, Kazushi; Utsunomiya, Shoko; Okada, Masato; Yamamoto, Yoshihisa

    2017-10-01

    The coherent Ising machine (CIM) has attracted attention as one of the most effective Ising computing architectures for solving large scale optimization problems because of its scalability and high-speed computational ability. However, it is difficult to implement the Ising computation in the CIM because the theories and techniques of classical thermodynamic equilibrium Ising spin systems cannot be directly applied to the CIM. This means we have to adapt these theories and techniques to the CIM. Here we focus on a ferromagnetic model and a finite loading Hopfield model, which are canonical models sharing a common mathematical structure with almost all other Ising models. We derive macroscopic equations to capture nonequilibrium phase transitions in these models. The statistical mechanical methods developed here constitute a basis for constructing evaluation methods for other Ising computation models.

  4. Detecting Spatial Patterns in Biological Array Experiments

    PubMed Central

    ROOT, DAVID E.; KELLEY, BRIAN P.; STOCKWELL, BRENT R.

    2005-01-01

    Chemical genetic screening and DNA and protein microarrays are among a number of increasingly important and widely used biological research tools that involve large numbers of parallel experiments arranged in a spatial array. It is often difficult to ensure that uniform experimental conditions are present throughout the entire array, and as a result, one often observes systematic spatially correlated errors, especially when array experiments are performed using robots. Here, the authors apply techniques based on the discrete Fourier transform to identify and quantify spatially correlated errors superimposed on a spatially random background. They demonstrate that these techniques are effective in identifying common spatially systematic errors in high-throughput 384-well microplate assay data. In addition, the authors employ a statistical test to allow for automatic detection of such errors. Software tools for using this approach are provided. PMID:14567791
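
    A minimal version of the Fourier screen can be written directly: subtract the plate mean, take the 2-D DFT, and flag spatial frequencies whose power stands far above the background. The 384-well layout, the injected 8-column periodic bias, and the 4-sigma cutoff are assumptions for the demonstration; the authors' statistical test and software tools are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(6)

        # A 16 x 24 (384-well) plate: random assay signal plus a systematic
        # column-periodic error, e.g. from a multi-tip dispenser
        plate = rng.normal(100.0, 5.0, size=(16, 24))
        plate += 10.0 * (np.arange(24) % 8 == 0)     # every 8th column is biased

        F = np.fft.fft2(plate - plate.mean())
        power = np.abs(F) ** 2
        power[0, 0] = 0.0                            # ignore the mean (DC) term

        # Flag spatial frequencies carrying an outsized share of the variance
        threshold = power.mean() + 4.0 * power.std()
        for ky, kx in np.argwhere(power > threshold):
            print(f"systematic component at row frequency {ky}, column frequency {kx}")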

  5. A novel functional electrical stimulation-control system for restoring motor function of post-stroke hemiplegic patients

    PubMed Central

    Huang, Zonghao; Wang, Zhigong; Lv, Xiaoying; Zhou, Yuxuan; Wang, Haipeng; Zong, Sihao

    2014-01-01

    Hemiparesis is one of the most common consequences of stroke. Advanced rehabilitation techniques are essential for restoring motor function in hemiplegic patients. Functional electrical stimulation applied to the affected limb, driven by the myoelectric signal from the unaffected limb, is a promising therapy for hemiplegia. In this study, we developed a prototype system for evaluating this novel functional electrical stimulation control strategy. Based on surface electromyography and a support vector machine model, a self-administered, multi-movement, force-modulated functional electrical stimulation prototype system for hemiplegia was implemented. This paper discusses the hardware design, the algorithms of the system, and key design points for this self-oscillation-prone system. The experimental results demonstrate the feasibility of the prototype system for further clinical trials, which are being conducted to evaluate the efficacy of the proposed rehabilitation technique. PMID:25657728

  6. Determination of the ruminant origin of bone particles using fluorescence in situ hybridization (FISH)

    PubMed Central

    Lecrenier, M. C.; Ledoux, Q.; Berben, G.; Fumière, O.; Saegerman, C.; Baeten, V.; Veys, P.

    2014-01-01

    Molecular biology techniques such as PCR constitute powerful tools for the determination of the taxonomic origin of bones. DNA degradation and contamination by exogenous DNA, however, jeopardise bone identification. Despite the vast array of techniques used to decontaminate bone fragments, the isolation and determination of bone DNA content are still problematic. Within the framework of the eradication of transmissible spongiform encephalopathies (including BSE, commonly known as “mad cow disease”), a fluorescence in situ hybridization (FISH) protocol was developed. Results from the described study showed that this method can be applied directly to bones without a demineralisation step and that it allows the identification of bovine and ruminant bones even after severe processing. The results also showed that the method is independent of exogenous contamination and that it is therefore entirely appropriate for this application. PMID:25034259

  7. Determination of the ruminant origin of bone particles using fluorescence in situ hybridization (FISH).

    PubMed

    Lecrenier, M C; Ledoux, Q; Berben, G; Fumière, O; Saegerman, C; Baeten, V; Veys, P

    2014-07-17

    Molecular biology techniques such as PCR constitute powerful tools for the determination of the taxonomic origin of bones. DNA degradation and contamination by exogenous DNA, however, jeopardise bone identification. Despite the vast array of techniques used to decontaminate bone fragments, the isolation and determination of bone DNA content are still problematic. Within the framework of the eradication of transmissible spongiform encephalopathies (including BSE, commonly known as "mad cow disease"), a fluorescence in situ hybridization (FISH) protocol was developed. Results from the described study showed that this method can be applied directly to bones without a demineralisation step and that it allows the identification of bovine and ruminant bones even after severe processing. The results also showed that the method is independent of exogenous contamination and that it is therefore entirely appropriate for this application.

  8. Combined empirical mode decomposition and texture features for skin lesion classification using quadratic support vector machine.

    PubMed

    Wahba, Maram A; Ashour, Amira S; Napoleon, Sameh A; Abd Elnaby, Mustafa M; Guo, Yanhui

    2017-12-01

    Basal cell carcinoma is one of the most common malignant skin lesions. Automated lesion identification and classification using image processing techniques is highly desirable to reduce diagnostic errors. In this study, a novel technique is applied to classify skin lesion images into two classes, namely malignant basal cell carcinoma and benign nevus. A hybrid combination of bi-dimensional empirical mode decomposition and gray-level difference method features, extracted after hair removal, is proposed. The combined features are then classified using a quadratic support vector machine (Q-SVM). The proposed system achieved an outstanding performance of 100% accuracy, sensitivity, and specificity compared with other support vector machine procedures as well as with different extracted features. Basal cell carcinoma is effectively classified using Q-SVM with the proposed combined features.
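
    The classification stage can be sketched with scikit-learn, where a quadratic SVM is an SVC with a degree-2 polynomial kernel. The feature matrix below is a random stand-in for the combined bi-dimensional EMD and gray-level difference features, and the labels are synthetic, so the printed accuracy is meaningful only as a smoke test of the pipeline.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(7)

        # Rows are lesion images, columns are combined texture/EMD features;
        # labels: 1 = basal cell carcinoma, 0 = nevus (all synthetic)
        X = rng.standard_normal((200, 24))
        y = (X[:, :4].sum(axis=1) + 0.5 * rng.standard_normal(200) > 0).astype(int)

        # Quadratic SVM = SVC with a degree-2 polynomial kernel
        qsvm = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=2, C=1.0))
        print(cross_val_score(qsvm, X, y, cv=5).mean())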

  9. Advances in simultaneous DSC-FTIR microspectroscopy for rapid solid-state chemical stability studies: some dipeptide drugs as examples.

    PubMed

    Lin, Shan-Yang; Wang, Shun-Li

    2012-04-01

    The solid-state chemistry of drugs has grown in importance in the pharmaceutical industry for the development of useful active pharmaceutical ingredients (APIs) and stable dosage forms. The stability of drugs in various solid dosage forms is an important issue because solid dosage forms are the most common pharmaceutical formulation in clinical use. In solid-state stability studies of drugs, an ideal accelerated method must not only be selected from among various complicated methods, but must also detect the formation of degradation products. In this review article, an analytical technique combining differential scanning calorimetry and Fourier-transform infrared (DSC-FTIR) microspectroscopy simulates the accelerated stability test and simultaneously detects the decomposition products in real time. The pharmaceutical dipeptides aspartame hemihydrate, lisinopril dihydrate, and enalapril maleate, either with or without Eudragit E, were used as test examples. This one-step simultaneous DSC-FTIR technique for real-time detection of diketopiperazine (DKP) directly evidenced the dehydration process and the formation of DKP, an impurity common in pharmaceutical dipeptides. Data on DKP formation in various dipeptides, determined by different analytical methods, have been collected and compiled. Although many analytical methods have been applied, the combined DSC-FTIR technique is a simple and fast analytical method that not only simulates accelerated drug stability testing but also simultaneously explores phase transformation as well as degradation due to thermally induced reactions. The technique offers quick and proper interpretation.

  10. Helminth ova control in sludge: a review.

    PubMed

    Jiménez, B

    2007-01-01

    Sludge reuse for agricultural production or soil reclamation is a common practice in several countries, but it entails risks if not properly performed. One such risk is the dissemination of helminthiases. As a consequence, international criteria and national standards set limits on helminth ova content in biosolids. However, little information is available on how to inactivate helminth ova in sludge, particularly when a high content is involved, as is the case in the developing world. Moreover, treatment criteria are based on a limited number of studies dealing with local characteristics that, when applied to the conditions in developing countries, produce poor results. This is because design criteria were developed for Ascaris (one helminth genus) while sludge contains a variety of genera. In addition, much information on helminth ova was produced a long time ago using inaccurate analytical techniques. This paper summarizes research and recent technical information from the literature concerning: (a) the general characteristics of helminth ova; (b) the common helminth ova genera found in sludge; (c) the main removal and inactivation mechanisms; (d) the processes that have proven effective at inactivating helminth ova under practical conditions; and (e) the analytical techniques used to enumerate these pathogens.

  11. Studying Health Outcomes in Farmworker Populations Exposed to Pesticides

    PubMed Central

    McCauley, Linda A.; Anger, W. Kent; Keifer, Matthew; Langley, Rick; Robson, Mark G.; Rohlman, Diane

    2006-01-01

    A major goal of studying farmworkers is to better understand how their work environment, including exposure to pesticides, affects their health. Although a number of health conditions have been associated with pesticide exposure, clear linkages have yet to be made between exposure and health effects except in cases of acute pesticide exposure. In this article, we review the most common health end points that have been studied and describe the epidemiologic challenges encountered in studying these health effects of pesticides among farmworkers, including the difficulties in accessing the population and challenges associated with obtaining health end point data. The assessment of neurobehavioral health effects serves as one of the most common and best examples of an approach used to study health outcomes in farmworkers and other populations exposed to pesticides. We review the current limitations in neurobehavioral assessment and strategies to improve these analytical methods. Emerging techniques to improve our assessment of health effects associated with pesticide exposure are reviewed. These techniques, which in most cases have not been applied to farmworker populations, hold promise in our ability to study and understand the relationship between pesticide exposure and a variety of health effects in this population. PMID:16760000

  12. Improved superficial brain hemorrhage visualization in susceptibility weighted images by constrained minimum intensity projection

    NASA Astrophysics Data System (ADS)

    Castro, Marcelo A.; Pham, Dzung L.; Butman, John

    2016-03-01

    Minimum intensity projection is a technique commonly used to display magnetic resonance susceptibility weighted images, allowing the observer to better visualize hemorrhages and vasculature. The technique displays the minimum intensity in a given projection within a thick slab, allowing different connectivity patterns to be easily revealed. Unfortunately, the low signal intensity of the skull within the thick slab can mask superficial tissues near the skull base and other regions. Because superficial microhemorrhages are a common feature of traumatic brain injury, this effect limits the ability to properly diagnose and follow up patients. In order to overcome this limitation, we developed a method that allows minimum intensity projection to properly display superficial tissues adjacent to the skull. Our approach is based on two brain masks, the larger of which includes extracerebral voxels. The analysis of the rind within both masks containing the actual brain boundary allows reclassification of those voxels initially missed in the smaller mask. Morphological operations are applied to guarantee accuracy and topological correctness, and the mean intensity within the mask is assigned to all outer voxels. This prevents bone from dominating superficial regions in the projection, enabling superior visualization of cortical hemorrhages and vessels.
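
    The core trick — replacing out-of-mask voxels with the mean intra-mask intensity before projecting — can be illustrated with a short numpy sketch. This assumes a 3-D magnitude volume and a precomputed boolean brain mask (the mask refinement and morphological steps of the paper are not reproduced):

      import numpy as np

      def constrained_minip(vol, brain_mask, slab=10):
          """Minimum intensity projection restricted to a brain mask.
          vol: 3-D array (z, y, x); brain_mask: boolean array, same shape.
          Outer voxels get the mean intra-mask intensity so low-signal bone
          cannot dominate the slab (sketch of the described idea)."""
          filled = vol.copy()
          filled[~brain_mask] = vol[brain_mask].mean()
          # project the minimum over consecutive thick slabs along z
          nz = (vol.shape[0] // slab) * slab
          slabs = filled[:nz].reshape(-1, slab, *vol.shape[1:])
          return slabs.min(axis=1)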

  13. The moss Physcomitrella patens: methods and tools from cultivation to targeted analysis of gene function.

    PubMed

    Strotbek, Christoph; Krinninger, Stefan; Frank, Wolfgang

    2013-01-01

    To comprehensively understand the major processes in plant biology, it is necessary to study a diverse set of species that represent the complexity of plants. This research will help to comprehend common conserved mechanisms and principles, as well as to elucidate those mechanisms that are specific to a particular plant clade. Thereby, we will gain knowledge about the invention and loss of mechanisms and their biological impact causing the distinct specifications throughout the plant kingdom. Since the establishment of transgenic plants, these studies concentrate on the elucidation of gene functions applying an increasing repertoire of molecular techniques. In the last two decades, the moss Physcomitrella patens joined the established set of plant models based on its evolutionary position bridging unicellular algae and vascular plants and a number of specific features alleviating gene function analysis. Here, we want to provide an overview of the specific features of P. patens making it an interesting model for many research fields in plant biology, to present the major achievements in P. patens genetic engineering, and to introduce common techniques to scientists who intend to use P. patens as a model in their research activities.

  14. Preprocessing of 2-Dimensional Gel Electrophoresis Images Applied to Proteomic Analysis: A Review.

    PubMed

    Goez, Manuel Mauricio; Torres-Madroñero, Maria Constanza; Röthlisberger, Sarah; Delgado-Trejos, Edilson

    2018-02-01

    Various methods and specialized software programs are available for processing two-dimensional gel electrophoresis (2-DGE) images. However, due to the anomalies present in these images, a reliable, automated, and highly reproducible system for 2-DGE image analysis has still not been achieved. The most common anomalies found in 2-DGE images include vertical and horizontal streaking, fuzzy spots, and background noise, which greatly complicate computational analysis. In this paper, we review the preprocessing techniques applied to 2-DGE images for noise reduction, intensity normalization, and background correction. We also present a quantitative comparison of non-linear filtering techniques applied to synthetic gel images, analyzing the performance of the filters under specific conditions. Synthetic proteins were modeled as two-dimensional Gaussian distributions with adjustable parameters for changing the size, intensity, and degradation. Three types of noise were added to the images: Gaussian, Rayleigh, and exponential, with signal-to-noise ratios (SNRs) ranging from 8 to 20 decibels (dB). We compared the performance of wavelet, contourlet, total variation (TV), and wavelet-total variation (WTTV) techniques using the parameters SNR and spot efficiency. In terms of spot efficiency, contourlet and TV were more sensitive to noise than wavelet and WTTV. Wavelet worked best for images with SNR ranging from 10 to 20 dB, whereas WTTV performed better at high noise levels. Wavelet also presented the best performance in terms of SNR with any level of Gaussian noise and low levels (14-20 dB) of Rayleigh and exponential noise. Finally, the performance of the non-linear filtering techniques was evaluated using a real 2-DGE image with previously identified proteins marked. Wavelet achieved the best detection rate for the real image. Copyright © 2018 Beijing Institute of Genomics, Chinese Academy of Sciences and Genetics Society of China. Production and hosting by Elsevier B.V. All rights reserved.
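
    The wavelet branch of such a comparison can be sketched with PyWavelets: a synthetic Gaussian spot is corrupted with Gaussian noise, soft-thresholded in the wavelet domain, and scored by output SNR. The wavelet family, level, and threshold below are illustrative assumptions, not the study's settings:

      import numpy as np
      import pywt

      y, x = np.mgrid[0:128, 0:128]
      spot = np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / (2 * 6.0 ** 2))
      noisy = spot + 0.05 * np.random.default_rng(1).normal(size=spot.shape)

      coeffs = pywt.wavedec2(noisy, "db2", level=3)
      thr = 0.1
      den = [coeffs[0]] + [
          tuple(pywt.threshold(c, thr, mode="soft") for c in lvl)
          for lvl in coeffs[1:]
      ]
      restored = pywt.waverec2(den, "db2")
      snr = 10 * np.log10(spot.var() / ((restored[:128, :128] - spot) ** 2).mean())
      print("output SNR: %.1f dB" % snr)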

  15. Taxonomy of segmental myocardial systolic dysfunction

    PubMed Central

    McDiarmid, Adam K.; Pellicori, Pierpaolo; Cleland, John G.; Plein, Sven

    2017-01-01

    The terms used to describe different states of myocardial health and disease are poorly defined. Imprecision and inconsistency in nomenclature can lead to difficulty in interpreting and applying trial outcomes to clinical practice. In particular, the terms ‘viable’ and ‘hibernating’ are commonly applied interchangeably and incorrectly to myocardium that exhibits chronic contractile dysfunction in patients with ischaemic heart disease. The range of inherent differences amongst imaging modalities used to define myocardial health and disease add further challenges to consistent definitions. The results of several large trials have led to renewed discussion about the classification of dysfunctional myocardial segments. This article aims to describe the diverse myocardial pathologies that may affect the myocardium in ischaemic heart disease and cardiomyopathy, and how they may be assessed with non-invasive imaging techniques in order to provide a taxonomy of myocardial dysfunction. PMID:27147609

  16. Signal processing methods for in-situ creep specimen monitoring

    NASA Astrophysics Data System (ADS)

    Guers, Manton J.; Tittmann, Bernhard R.

    2018-04-01

    Previous work investigated using guided waves for monitoring creep deformation during accelerated life testing. The basic objective was to relate observed changes in the time-of-flight to changes in the environmental temperature and specimen gage length. The work presented in this paper investigated several signal processing strategies for possible application in the in-situ monitoring system. Signal processing methods for both group velocity (wave-packet envelope) and phase velocity (peak tracking) time-of-flight were considered. Although the Analytic Envelope found via the Hilbert transform is commonly applied for group velocity measurements, erratic behavior in the indicated time-of-flight was observed when this technique was applied to the in-situ data. The peak tracking strategies tested had generally linear trends, and tracking local minima in the raw waveform ultimately showed the most consistent results.
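
    The two time-of-flight strategies contrasted in the abstract can be sketched on a synthetic wave packet: the analytic envelope via the Hilbert transform (group estimate) versus tracking a local minimum of the raw waveform (phase estimate). The sampling rate, packet shape, and frequencies below are assumed, purely for illustration:

      import numpy as np
      from scipy.signal import hilbert

      fs = 1e7                                  # assumed 10 MHz sampling
      t = np.arange(0, 100e-6, 1 / fs)
      packet = np.exp(-((t - 40e-6) / 5e-6) ** 2) * np.sin(2 * np.pi * 5e5 * t)

      envelope = np.abs(hilbert(packet))        # analytic envelope (group)
      tof_group = t[np.argmax(envelope)]
      tof_phase = t[np.argmin(packet)]          # track a raw-waveform minimum (phase)
      print(tof_group, tof_phase)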

  17. Microbial sequencing methods for monitoring of anaerobic treatment of antibiotics to optimize performance and prevent system failure.

    PubMed

    Aydin, Sevcan

    2016-06-01

    As a result of developments in molecular technologies and the use of sequencing technologies, analyses of the anaerobic microbial community in biological treatment processes have become increasingly prevalent. This review examines the ways in which microbial sequencing methods can be applied to achieve an extensive understanding of the phylogenetic and functional characteristics of microbial assemblages in anaerobic reactors whose substrate is contaminated by antibiotics, which are among the most important toxic compounds. It discusses some of the advantages and disadvantages associated with the more commonly employed microbial sequencing techniques and assesses how a combination of the existing methods may be applied to develop a more comprehensive understanding of microbial communities and improve the validity and depth of the results for enhancing the stability of anaerobic reactors.

  18. Quantitative analysis of the local phase transitions induced by the laser heating

    DOE PAGES

    Levlev, Anton V.; Susner, Michael A.; McGuire, Michael A.; ...

    2015-11-04

    Functional imaging enabled by scanning probe microscopy (SPM) allows investigations of nanoscale material properties under a wide range of external conditions, including temperature. However, a number of shortcomings preclude the use of the most common material heating techniques, thereby limiting precise temperature measurements. Here we discuss an approach to local laser heating on the micron scale and its applicability for SPM. We applied local heating coupled with piezoresponse force microscopy and confocal Raman spectroscopy for nanoscale investigations of a ferroelectric-paraelectric phase transition in the copper indium thiophosphate layered ferroelectric. Bayesian linear unmixing applied to the experimental results allowed extraction of the Raman spectra of different material phases and enabled temperature calibration in the heated region. Lastly, the obtained results enable a systematic approach for studying temperature-dependent material functionalities in heretofore unavailable temperature regimes.

  19. Patient Characteristics by Type of Hypersexuality Referral: A Quantitative Chart Review of 115 Consecutive Male Cases.

    PubMed

    Sutton, Katherine S; Stratton, Natalie; Pytyck, Jennifer; Kolla, Nathan J; Cantor, James M

    2015-01-01

    Hypersexuality remains an increasingly common but poorly understood patient complaint. Despite diversity in clinical presentations of patients referred for hypersexuality, the literature has maintained treatment approaches that are assumed to apply to the entire phenomenon. This approach has proven ineffective, despite its application over several decades. The present study used quantitative methods to examine demographic, mental health, and sexological correlates of common clinical subtypes of hypersexuality referrals. Findings support the existence of subtypes, each with distinct clusters of features. Paraphilic hypersexuals reported greater numbers of sexual partners, more substance abuse, initiation to sexual activity at an earlier age, and novelty as a driving force behind their sexual behavior. Avoidant masturbators reported greater levels of anxiety, delayed ejaculation, and use of sex as an avoidance strategy. Chronic adulterers reported premature ejaculation and later onset of puberty. Designated patients were less likely to report substance abuse, employment, or finance problems. Although quantitative, this article nonetheless presents a descriptive study in which the underlying typology emerged from features most salient in routine sexological assessment. Future studies might apply purely empirical statistical techniques, such as cluster analyses, to ascertain to what extent similar typologies emerge when examined prospectively.

  20. Digital fringe projection for hand surface coordinate variation analysis caused by osteoarthritis

    NASA Astrophysics Data System (ADS)

    Nor Haimi, Wan Mokhdzani Wan; Hau Tan, Cheek; Retnasamy, Vithyacharan; Vairavan, Rajendaran; Sauli, Zaliman; Roshidah Yusof, Nor; Hambali, Nor Azura Malini Ahmad; Aziz, Muhammad Hafiz Ab; Bakhit, Ahmad Syahir Ahmad

    2017-11-01

    Hand osteoarthritis is one of the most common forms of arthritis, impacting millions of people worldwide. The disabling problem occurs when the protective cartilage at the boundaries of bones wears off over time. Currently, hand osteoarthritis is identified with special instruments, namely X-ray scanning and MRI, but these have limitations such as radiation exposure and can be quite costly. In this work, an optical metrology system based on digital fringe projection, which comprises an LCD projector, a CCD camera and a personal computer, has been developed to anticipate abnormal growth or deformation on the joints of the hand, which are common symptoms of osteoarthritis. The main concept of this optical metrology system is to apply structured light as the imaging source for surface change detection. The imaging source utilizes fringe patterns generated by C++ programming and shifted by 3 phase shifts based on the 3 steps 2 shifts method. Phase wrapping techniques and analysis were applied in order to detect the deformation of live subjects. The result demonstrates a successful method of hand deformation detection based on the pixel tracking differences between a normal and a deformed state.
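
    For reference, wrapped-phase recovery from three phase-shifted fringe images is commonly computed with the standard three-step, 120-degree algorithm; the authors' "3 steps 2 shifts" variant may differ in detail, so the sketch below is a generic stand-in:

      # Sketch: wrapped phase from three fringe images shifted by 120 degrees
      # (standard three-step algorithm; assumed, not the paper's exact variant).
      import numpy as np

      def wrapped_phase(i1, i2, i3):
          return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

      # Deformation appears as the difference of unwrapped phase maps;
      # phase unwrapping (e.g. np.unwrap along rows) is omitted for brevity.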

  1. Rapid method for controlling the correct labeling of products containing common octopus (Octopus vulgaris) and main substitute species (Eledone cirrhosa and Dosidicus gigas) by fast real-time PCR.

    PubMed

    Espiñeira, Montserrat; Vieites, Juan M

    2012-12-15

    The TaqMan real-time PCR has the highest potential for automation, therefore representing the currently most suitable method for screening, allowing the detection of fraudulent or unintentional mislabeling of species. This work describes the development of a real-time polymerase chain reaction (RT-PCR) system for the detection and identification of common octopus (Octopus vulgaris) and its main substitute species (Eledone cirrhosa and Dosidicus gigas). This technique is notable for its combination of simplicity, speed, sensitivity and specificity in a homogeneous assay. The method can be applied to all kinds of products: fresh, frozen and processed, including those undergoing intensive processes of transformation. The methodology was validated to check how the degree of food processing affects the method and the detection of each species. Moreover, it was applied to 34 commercial samples to evaluate the labeling of products made from them. The methodology herein developed is useful to check the fulfillment of labeling regulations for seafood products and to verify traceability in commercial trade and for fisheries control. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. The use of the nominal group technique as an evaluative tool in medical undergraduate education.

    PubMed

    Lloyd-Jones, G; Fowell, S; Bligh, J G

    1999-01-01

    In the present state of flux affecting UK medical undergraduate education, there is a pressing need for evaluative methods which will identify relevant outcomes, both expected and unanticipated. The student perspective is now legitimately accepted to form part of any evaluative exercise, but the qualitative methods commonly used for this purpose are expensive in time and analytical skills. The nominal group technique (NGT) has been used for various purposes, including course evaluation, and appears well suited to this application. It combines qualitative and quantitative components in a structured interaction which minimizes the influences of the researcher and of group dynamics. The sequence and mechanics of the NGT process are described as applied to an end-of-first-year evaluation in a novel undergraduate course. Doubts have been raised as to whether the results of NGT can be generalized to the larger group. In this paper, this problem is overcome by compiling a questionnaire based on the NGT items, which was distributed throughout the class. Design: nominal group technique with questionnaire development. Setting: the medical school at The University of Liverpool. Participants: medical students. Previous claims made on behalf of the NGT, such as the focus on the student voice, the minimizing of leadership influence and the richness of the data, are upheld in this report. Broad agreement was found with the NGT items, but two items (10%) did not display any consensus. The questionnaire extension of the NGT provides back-up evidence of the reliability of the data derived from the technique and enables it to be applied to the larger groups typical of undergraduate medicine.

  3. New perineal injection technique for pudendal nerve infiltration in diagnostic and therapeutic procedures.

    PubMed

    Weinschenk, Stefan; Hollmann, Markus W; Strowitzki, Thomas

    2016-04-01

    Pudendal nerve injection is used as a diagnostic procedure in the vulvar region and for therapeutic purposes, such as in vulvodynia. Here, we provide a new, easy-to-perform perineal injection technique. We analyzed 105 perineal injections into the pudendal nerve with a local anesthetic (LA), procaine, in 20 patients. A 0.4 × 40 mm needle was handled using a stop-and-go technique while monitoring the patient's discomfort. The needle was placed 1-2 cm laterally to the dorsal introitus. After aspiration, a small amount of LA was applied. After subcutaneous anesthesia, the needle was advanced further step by step. Thus, 5 ml could be applied with little discomfort to the patient. Anesthesia in the pudendal target region was the primary endpoint of our analysis. In 93 of 105 injections (88.6%), complete perineal anesthesia was achieved with a single injection. Twelve injections were repeated; these were excluded from the analysis. Severity of injection pain, on a visual analog scale (VAS) from 0 to 100, was 26.8 (95% CI 7.2-46.4). Age (β = 0.33, p < 0.01) and the number of previous injections (β = 0.35, p < 0.01) inversely correlated with injection pain. Injection pain and anesthesia were not affected by BMI, the number and side of previous injections, or order of injection. A reversible vasovagal reaction was common, but no serious adverse effects occurred. Perineal pudendal injection is an effective and safe technique for anesthesia in diagnostic (vulva biopsy) and therapeutic (pudendal neuralgia) indications, and for regional anesthesia in perinatal settings.

  4. CSM solutions of rotating blade dynamics using integrating matrices

    NASA Technical Reports Server (NTRS)

    Lakin, William D.

    1992-01-01

    The dynamic behavior of flexible rotating beams continues to receive considerable research attention, as it constitutes a fundamental problem in applied mechanics. Further, beams comprise parts of many rotating structures of engineering significance. A topic of particular interest at the present time involves the development of techniques for obtaining the behavior in both space and time of a rotor acted upon by a simple airload loading. Most current work on problems of this type uses solution techniques based on normal modes. It is certainly true that normal modes cannot be disregarded, as knowledge of natural blade frequencies is always important. However, the present work has considered a computational structural mechanics (CSM) approach to rotor blade dynamics problems in which the physical properties of the rotor blade provide input for a direct numerical solution of the relevant boundary-and-initial-value problem. Analysis of the dynamics of a given rotor system may require solution of the governing equations over a long time interval corresponding to many revolutions of the loaded flexible blade. For this reason, most of the common techniques in computational mechanics, which treat the space-time behavior concurrently, cannot be applied to the rotor dynamics problem without a large expenditure of computational resources. By contrast, the integrating matrix technique of computational mechanics has the ability to consistently incorporate boundary conditions and 'remove' dependence on a space variable. For problems involving both space and time, this feature of the integrating matrix approach can thus generate a 'splitting' which forms the basis of an efficient CSM method for the numerical solution of rotor dynamics problems.
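
    The integrating-matrix idea — turning integration over a spatial grid into multiplication by a constant matrix — can be illustrated with a trapezoidal-rule operator. This is a generic sketch of the concept, not the paper's specific formulation:

      import numpy as np

      def trapezoidal_integrating_matrix(x):
          """Matrix L such that (L @ f)[k] approximates the integral of f
          from x[0] to x[k] by the trapezoidal rule (illustrative only)."""
          n = len(x)
          L = np.zeros((n, n))
          for k in range(1, n):
              h = np.diff(x[: k + 1])
              w = np.zeros(k + 1)
              w[:-1] += h / 2.0      # left endpoint weights
              w[1:] += h / 2.0       # right endpoint weights
              L[k, : k + 1] = w
          return L

      x = np.linspace(0.0, 1.0, 101)
      L = trapezoidal_integrating_matrix(x)
      print(np.max(np.abs(L @ (2 * x) - x ** 2)))  # integral of 2x is x^2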

  5. The reliability and accuracy of estimating heart-rates from RGB video recorded on a consumer grade camera

    NASA Astrophysics Data System (ADS)

    Eaton, Adam; Vincely, Vinoin; Lloyd, Paige; Hugenberg, Kurt; Vishwanath, Karthik

    2017-03-01

    Video photoplethysmography (VPPG) is a numerical technique to process standard RGB video data of exposed human skin and extract the heart rate (HR) from the skin areas. Being a non-contact technique, VPPG has the potential to provide estimates of a subject's heart rate, respiratory rate, and even heart rate variability, with potential applications ranging from infant monitors and remote healthcare to psychological experiments, particularly given the non-contact and sensor-free nature of the technique. Though several previous studies have reported successful correlations between HR obtained using VPPG algorithms and HR measured using the gold-standard electrocardiograph, others have reported that these correlations are dependent on controlling for the duration of the video data analyzed, subject motion, and ambient lighting. Here, we investigate the ability of two commonly used VPPG algorithms to extract human heart rates under three different laboratory conditions. We compare the VPPG HR values extracted across these three sets of experiments to the gold-standard values acquired using an electrocardiogram or a commercially available pulse oximeter. The two VPPG algorithms were applied with and without KLT facial feature tracking and detection algorithms from the Computer Vision MATLAB® toolbox. Results indicate that VPPG-based numerical approaches have the ability to provide robust estimates of subject HR values and are relatively insensitive to the devices used to record the video data. However, they are highly sensitive to the conditions of video acquisition, including subject motion, the location, size and averaging techniques applied to regions-of-interest, as well as the number of video frames used for data processing.
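
    A basic VPPG pipeline can be sketched in a few lines: average the green channel over a skin region of interest, band-pass to the physiological range, and take the dominant FFT peak as the HR estimate. The frame array below is a random placeholder, and face tracking/ROI selection are omitted; this is an assumed generic pipeline, not either of the specific algorithms compared in the study:

      import numpy as np
      from scipy.signal import butter, filtfilt

      fps = 30.0
      frames = np.random.default_rng(2).random((300, 64, 64, 3))  # t, y, x, RGB
      green = frames[..., 1].mean(axis=(1, 2))       # spatial mean per frame

      # band-pass 0.7-4.0 Hz (~42-240 bpm)
      b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
      sig = filtfilt(b, a, green - green.mean())

      freqs = np.fft.rfftfreq(sig.size, d=1.0 / fps)
      spectrum = np.abs(np.fft.rfft(sig))
      hr_bpm = 60.0 * freqs[np.argmax(spectrum)]
      print("estimated HR: %.0f bpm" % hr_bpm)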

  6. Pipette aspiration applied to the characterization of nonhomogeneous, transversely isotropic materials used for vocal fold modeling.

    PubMed

    Weiß, S; Thomson, S L; Lerch, R; Döllinger, M; Sutor, A

    2013-01-01

    The etiology and treatment of voice disorders are still not completely understood. Since the vibratory characteristics of the vocal folds are strongly influenced by both anatomy and mechanical material properties, measurement methods to analyze the material behavior of vocal fold tissue are required. Due to the limited lifetime of real tissue in the laboratory, synthetic models are often used to study vocal fold vibrations. In this paper we focus on two topics related to synthetic and real vocal fold materials. First, because certain tissues within the human vocal folds are transversely isotropic, a fabrication process for introducing this characteristic into commonly used vocal fold modeling materials is presented. Second, the pipette aspiration technique is applied to the characterization of these materials. By measuring the displacement profiles of stretched specimens that exhibit varying degrees of transverse isotropy, it is shown that local anisotropy can be quantified using a parameter describing the deviation from an axisymmetric profile. The potential for this technique to characterize homogeneous, anisotropic materials, including soft biological tissues such as those found in the human vocal folds, is supplemented by a computational study. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Multimodal label-free ex vivo imaging using a dual-wavelength microscope with axial chromatic aberration compensation.

    PubMed

    Filippi, Andrea; Dal Sasso, Eleonora; Iop, Laura; Armani, Andrea; Gintoli, Michele; Sandri, Marco; Gerosa, Gino; Romanato, Filippo; Borile, Giulia

    2018-03-01

    Label-free microscopy is a very powerful technique that can be applied to study samples with no need for exogenous fluorescent probes, keeping the main benefits of multiphoton microscopy, such as longer penetration depths and intrinsic optical sectioning, while enabling serial multitechnique examinations of the same specimen. Among the many label-free microscopy methods, harmonic generation (HG) is one of the most intriguing due to its generally low photo-toxicity and relative ease of implementation. Today, HG and common two-photon microscopy (TPM) are well-established techniques and are routinely used in several research fields. However, they require a significant amount of fine-tuning to be fully exploited, making them quite difficult to perform in parallel. Here, we present our multimodal microscope, capable of performing TPM and HG simultaneously without any kind of compromise thanks to two separate, individually optimized laser sources with axial chromatic aberration compensation. We also apply our setup to the examination of a plethora of ex vivo samples to prove its capabilities and the significant advantages of a multimodal approach. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  8. Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.

    PubMed

    Latha, Indu; Reichenbach, Stephen E; Tao, Qingping

    2011-09-23

    Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.
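
    The watershed side of such a comparison can be sketched with scikit-image on a synthetic two-peak chromatogram: local maxima seed the markers, and the watershed of the inverted image partitions the data points into peak regions. The image, thresholds, and distances below are illustrative assumptions, and retention-time shift correction is not shown:

      import numpy as np
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      y, x = np.mgrid[0:100, 0:100]
      img = (np.exp(-((x - 30) ** 2 + (y - 40) ** 2) / 50.0)
             + np.exp(-((x - 60) ** 2 + (y - 55) ** 2) / 80.0))

      coords = peak_local_max(img, min_distance=5, threshold_abs=0.1)
      markers = np.zeros(img.shape, dtype=int)
      markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
      labels = watershed(-img, markers, mask=img > 0.05)  # flood from the peaks
      print("detected peaks:", labels.max())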

  9. Solid immersion lenses for enhancing the optical resolution of thermal and electroluminescence mapping of GaN-on-SiC transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pomeroy, J. W., E-mail: James.Pomeroy@Bristol.ac.uk; Kuball, M.

    2015-10-14

    Solid immersion lenses (SILs) are shown to greatly enhance optical spatial resolution when measuring AlGaN/GaN High Electron Mobility Transistors (HEMTs), taking advantage of the high refractive index of the SiC substrates commonly used for these devices. Solid immersion lenses can be applied to techniques such as electroluminescence emission microscopy and Raman thermography, aiding the development device physics models. Focused ion beam milling is used to fabricate solid immersion lenses in SiC substrates with a numerical aperture of 1.3. A lateral spatial resolution of 300 nm is demonstrated at an emission wavelength of 700 nm, and an axial spatial resolution of 1.7 ± 0.3 μm atmore » a laser wavelength of 532 nm is demonstrated; this is an improvement of 2.5× and 5×, respectively, when compared with a conventional 0.5 numerical aperture objective lens without a SIL. These results highlight the benefit of applying the solid immersion lenses technique to the optical characterization of GaN HEMTs. Further improvements may be gained through aberration compensation and increasing the SIL numerical aperture.« less

  10. Vibration based condition monitoring of a multistage epicyclic gearbox in lifting cranes

    NASA Astrophysics Data System (ADS)

    Assaad, Bassel; Eltabach, Mario; Antoni, Jérôme

    2014-01-01

    This paper proposes a model-based technique for detecting wear in a multistage planetary gearbox used by lifting cranes. The proposed method establishes a vibration signal model which deals with cyclostationary and autoregressive models. First-order cyclostationarity is addressed by the analysis of the time synchronous average (TSA) of the angularly resampled vibration signal. Then an autoregressive (AR) model is applied to the TSA part in order to extract a residual signal containing pertinent fault signatures. The paper also explores a number of methods commonly used in vibration monitoring of planetary gearboxes in order to make comparisons. In the experimental part of this study, these techniques are applied to accelerated lifetime test bench data for the lifting winch. After processing raw signals recorded with an accelerometer mounted on the outside of the gearbox, a number of condition indicators (CIs) are derived from the TSA signal, the residual autoregressive signal and other signals derived using standard signal processing methods. The goal is to check the evolution of the CIs during the accelerated lifetime test (ALT). The clarity and fluctuation level of the historical trends are finally considered as criteria for comparing the extracted CIs.
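
    The TSA-plus-AR-residual chain can be sketched in numpy: average the angularly resampled signal over revolutions, fit an AR(p) model by least squares, and keep the prediction residual as the input for condition indicators. The data are synthetic and the model order is an assumption; angular resampling is taken as already done:

      import numpy as np

      samples_per_rev, n_rev, p = 512, 60, 12
      x = np.random.default_rng(3).normal(size=samples_per_rev * n_rev)

      tsa = x.reshape(n_rev, samples_per_rev).mean(axis=0)  # synchronous average

      # least-squares AR(p) fit: tsa[k] ~ sum_i a[i] * tsa[k-1-i]
      A = np.column_stack([tsa[p - 1 - i:-1 - i] for i in range(p)])
      a, *_ = np.linalg.lstsq(A, tsa[p:], rcond=None)
      residual = tsa[p:] - A @ a          # fault-signature carrier for the CIs
      print("residual RMS:", np.sqrt((residual ** 2).mean()))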

  11. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese-oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  12. Structural Health Monitoring Using Textile Reinforcement Structures with Integrated Optical Fiber Sensors

    PubMed Central

    Bremer, Kort; Weigand, Frank; Zheng, Yulong; Alwis, Lourdes Shanika; Helbig, Reinhard; Roth, Bernhard

    2017-01-01

    Optical fiber-based sensors “embedded” in functionalized carbon structures (FCSs) and textile net structures (TNSs) based on alkaline-resistant glass are introduced for the purpose of structural health monitoring (SHM) of concrete-based structures. The design aims to monitor common SHM parameters such as strain and cracks while at the same time acting as a structural strengthening mechanism. The sensor performances of the two systems are characterized in situ using Mach-Zehnder interferometric (MZI) and optical attenuation measurement techniques, respectively. For this purpose, different FCS samples were subjected to varying elongation using a tensile testing machine by carefully incrementing the applied force, and good correlation between the applied force and measured length change was observed. For crack detection, the functionalized TNSs were embedded into a concrete block which was then exposed to varying load using the three-point flexural test until destruction. Promising results were observed, identifying that the location of the crack can be determined using the conventional optical time domain reflectometry (OTDR) technique. The embedded sensors thus evaluated show the value of the dual achievement of the schemes proposed in obtaining strain/crack measurement while being utilized as strengthening agents as well. PMID:28208636

  13. Review of Bioassays for Monitoring Fate and Transport ofEstrogenic Endocrine Disrupting Compounds in Water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CGCampbell@lbl.gov

    Endocrine disrupting compounds (EDCs) are recognized contaminants threatening water quality. Despite efforts in source identification, few strategies exist for characterization or treatment of this environmental pollution. Given that there are numerous EDCs that can negatively affect humans and wildlife, general screening techniques like bioassays and biosensors provide an essential rapid and intensive analysis capacity. Commonly applied bioassays include the ELISA and YES assays, but promising technologies include ER-CALUX, ELRA, Endotect, RIANA, and IR-bioamplification. Two biosensors, Endotect and RIANA, are field portable using non-cellular biological detection strategies. Environmental management of EDCs in water requires integration of biosensors and bioassays for monitoring and assessment.

  14. Conflicts in developing countries: a case study from Rio de Janeiro

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bredariol, Celso Simoes; Magrini, Alessandra

    In developing countries, environmental conflicts are resolved mainly in the political arena. In the developed nations, approaches favoring structured negotiation support techniques are more common, with methodologies and studies designed especially for this purpose, deriving from Group Communications and Decision Theory. This paper analyzes an environmental dispute in the City of Rio de Janeiro, applying conflict analysis methods and simulating its settlement. It concludes that the use of these methodologies in the developing countries may be undertaken with adaptations, designed to train community groups in negotiating while fostering the democratization of the settlement of these disputes.

  15. Investigation of photodynamic effect caused by MPPa-PDT on breast cancer

    NASA Astrophysics Data System (ADS)

    Tian, Y. Y.; Hu, X. Y.; Leung, W. N.; Yuan, H. Q.; Zhang, L. Y.; Cui, F. A.; Tian, X.

    2012-10-01

    Breast cancer is a common malignant tumor, and its incidence increases with age. Photodynamic therapy (PDT) is a new technique applied to tumors, which involves the administration of a tumor-localizing photosensitizer followed by its activation with light of a specific wavelength. Pyropheophorbide-a methyl ester (MPPa), a derivative of chlorophyll, is a novel potent photosensitizer. We explored the photodynamic effect caused by MPPa-PDT on breast cancer. The in vitro and in vivo experiments indicate that MPPa is a comparatively ideal photosensitizer which can induce apoptosis in breast cancer.

  16. Piecewise uniform conduction-like flow channels and method therefor

    DOEpatents

    Cummings, Eric B [Livermore, CA; Fiechtner, Gregory J [Livermore, CA

    2006-02-28

    A low-dispersion methodology for designing microfabricated conduction channels for on-chip electrokinetic-based systems is presented. The technique relies on trigonometric relations that apply for ideal electrokinetic flows, allowing faceted channels to be designed on chips using common drafting software and a hand calculator. Flows are rotated and stretched along the abrupt interface between adjacent regions with differing permeability. Regions bounded by interfaces form flow "prisms" that can be combined with other designed prisms to obtain a wide range of turning angles and expansion ratios while minimizing dispersion. Designs are demonstrated using two-dimensional numerical solutions of the Laplace equation.
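
    In the literature on such faceted designs, streamlines of ideal electrokinetic flow refract at an abrupt permeability interface according to a tangent law (analogous to the refraction of current lines at a conductivity jump). A hedged sketch of that relation, with all values purely illustrative and not drawn from the patent itself:

      # Sketch: tangent-law refraction of ideal electrokinetic streamlines at an
      # abrupt interface between regions of permeability mu1 and mu2 (assumed
      # analogy to current-line refraction; angles measured from the normal).
      import numpy as np

      def refracted_angle(theta1_deg, mu1, mu2):
          """Angle of the transmitted streamline for incidence theta1_deg."""
          t1 = np.radians(theta1_deg)
          return np.degrees(np.arctan(np.tan(t1) * mu2 / mu1))

      # a 2:1 permeability step turns a 30-degree streamline to about 49 degrees
      print(refracted_angle(30.0, 1.0, 2.0))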

  17. Introduction to cell culture.

    PubMed

    Philippeos, Christina; Hughes, Robin D; Dhawan, Anil; Mitry, Ragai R

    2012-01-01

    The basics of cell culture as applied to human cells are discussed. Biosafety when working with human tissue, which is often pathogenic, is important. The requirements for a tissue culture laboratory are described, particularly the range of equipment needed to carry out cell isolation, purification, and culture. Steps must be taken to maintain aseptic conditions to prevent contamination of cultures with micro-organisms. Basic cell-handling techniques are discussed, including choice of media, primary culture, and cryopreservation of cells so they can be stored for future use. Common assays which are used to determine cell viability and activity are considered.

  18. Characteristics of health plans that treat psychiatric patients.

    PubMed

    Zarin, D A; West, J C; Pincus, H A; Tanielian, T L

    1999-01-01

    Nationally representative data regarding the organizational, financial, and procedural features of health plans in which psychiatric patients receive treatment indicate that fewer privately insured, Medicaid, and Medicare managed care enrollees receive care from a psychiatrist than is true for "nonmanaged" enrollees. Financial considerations were reported to adversely affect treatment for one-third of all patients. Although utilization management techniques and financial/resource constraints commonly applied to patients in both managed and nonmanaged plans, performance-based incentives were rare in nonmanaged plans. The traditional health plan categories provide limited information to identify salient plan characteristics and guide policy decisions regarding the provision of care.

  19. On the Interconnection of Incompatible Solid Finite Element Meshes Using Multipoint Constraints

    NASA Technical Reports Server (NTRS)

    Fox, G. L.

    1985-01-01

    Incompatible meshes, i.e., meshes that physically must have a common boundary, but do not necessarily have coincident grid points, can arise in the course of a finite element analysis. For example, two substructures may have been developed at different times for different purposes and it becomes necessary to interconnect the two models. A technique that uses only multipoint constraints, i.e., MPC cards (or MPCS cards in substructuring), is presented. Since the method uses only MPC's, the procedure may apply at any stage in an analysis; no prior planning or special data is necessary.

  20. Doppler flow imaging of cytoplasmic streaming using spectral domain phase microscopy

    NASA Astrophysics Data System (ADS)

    Choma, Michael A.; Ellerbee, Audrey K.; Yazdanfar, Siavash; Izatt, Joseph A.

    2006-03-01

    Spectral domain phase microscopy (SDPM) is a functional extension of spectral domain optical coherence tomography. SDPM achieves exquisite levels of phase stability by employing common-path interferometry. We discuss the theory and limitations of Doppler flow imaging using SDPM, demonstrate monitoring of the thermal contraction of a glass sample with nanometer-per-second velocity sensitivity, and apply this technique to the measurement of cytoplasmic streaming in an Amoeba proteus pseudopod. We observe reversal of cytoplasmic flow induced by extracellular CaCl2, and report results that suggest parabolic flow of cytoplasm in the A. proteus pseudopod.
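
    Phase-resolved Doppler velocimetry of this kind conventionally converts the frame-to-frame phase change into axial velocity via v = λ0·Δφ/(4π·n·Δt). A minimal sketch with generic, assumed parameter values (not the instrument settings of this study):

      # Sketch: phase-resolved Doppler velocity, v = lambda0 * dphi / (4*pi*n*dt).
      import numpy as np

      lam0, n_medium, dt = 840e-9, 1.38, 1e-4      # assumed wavelength, index, A-line period
      phase = np.cumsum(np.full(100, 0.02))        # synthetic slowly winding phase (rad)

      dphi = np.angle(np.exp(1j * np.diff(phase))) # wrapped frame-to-frame change
      velocity = lam0 * dphi / (4 * np.pi * n_medium * dt)
      print("mean axial velocity: %.1f nm/s" % (velocity.mean() * 1e9))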

  1. Arabinogalactan-proteins and the research challenges for these enigmatic plant cell surface proteoglycans

    PubMed Central

    Tan, Li; Showalter, Allan M.; Egelund, Jack; Hernandez-Sanchez, Arianna; Doblin, Monika S.; Bacic, Antony

    2012-01-01

    Arabinogalactan-proteins (AGPs) are complex glycoconjugates that are commonly found at the cell surface and in secretions of plants. Their location and diversity of structures have made them attractive targets as modulators of plant development but definitive proof of their direct role(s) in biological processes remains elusive. Here we overview the current state of knowledge on AGPs, identify key challenges impeding progress in the field and propose approaches using modern bioinformatic, (bio)chemical, cell biological, molecular and genetic techniques that could be applied to redress these gaps in our knowledge. PMID:22754559

  2. Common Bean: A Legume Model on the Rise for Unraveling Responses and Adaptations to Iron, Zinc, and Phosphate Deficiencies.

    PubMed

    Castro-Guerrero, Norma A; Isidra-Arellano, Mariel C; Mendoza-Cozatl, David G; Valdés-López, Oswaldo

    2016-01-01

    Common bean (Phaseolus vulgaris) was domesticated ∼8000 years ago in the Americas and today is a staple food worldwide. Besides caloric intake, common bean is also an important source of protein and micronutrients, and it is widely appreciated in developing countries for its affordability (compared to animal protein) and its long storage life. As a legume, common bean also has the economic and environmental benefit of associating with nitrogen-fixing bacteria, thus reducing the use of synthetic fertilizers, which is key for sustainable agriculture. Despite significant advances in the plant nutrition field, the mechanisms underlying the adaptation of common bean to low nutrient input remain largely unknown. The recent release of the common bean genome offers, for the first time, the possibility of applying techniques and approaches that have been exclusive to model plants to study the adaptive responses of common bean to challenging environments. In this review, we discuss the hallmarks of common bean domestication and subsequent distribution around the globe. We also discuss recent advances in phosphate, iron, and zinc homeostasis, as these nutrients often limit plant growth, development, and yield. In addition, iron and zinc are major targets of crop biofortification to improve human nutrition. Developing common bean varieties able to thrive under nutrient-limiting conditions will have a major impact on human nutrition, particularly in countries where dry beans are the main source of carbohydrates, protein and minerals.

  3. Behavior Change Techniques in Apps for Medication Adherence: A Content Analysis.

    PubMed

    Morrissey, Eimear C; Corbett, Teresa K; Walsh, Jane C; Molloy, Gerard J

    2016-05-01

    There are a vast number of smartphone applications (apps) aimed at promoting medication adherence on the market; however, the theory and evidence base in terms of applying established health behavior change techniques underpinning these apps remains unclear. This study aimed to code these apps using the Behavior Change Technique Taxonomy (v1) for the presence or absence of established behavior change techniques. The sample of apps was identified through systematic searches in both the Google Play Store and Apple App Store in February 2015. All apps that fell into the search categories were downloaded for analysis. The downloaded apps were screened with exclusion criteria, and suitable apps were reviewed and coded for behavior change techniques in March 2015. Two researchers performed coding independently. In total, 166 medication adherence apps were identified and coded. The number of behavior change techniques contained in an app ranged from zero to seven (mean=2.77). A total of 12 of a possible 96 behavior change techniques were found to be present across apps. The most commonly included behavior change techniques were "action planning" and "prompt/cues," which were included in 96% of apps, followed by "self-monitoring" (37%) and "feedback on behavior" (36%). The current extent to which established behavior change techniques are used in medication adherence apps is limited. The development of medication adherence apps may not have benefited from advances in the theory and practice of health behavior change. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  4. Comparison of Sequential and Variational Data Assimilation

    NASA Astrophysics Data System (ADS)

    Alvarado Montero, Rodolfo; Schwanenberg, Dirk; Weerts, Albrecht

    2017-04-01

    Data assimilation is a valuable tool to improve model state estimates by combining measured observations with model simulations. It has recently gained significant attention due to its potential for using remote sensing products to improve operational hydrological forecasts and for reanalysis purposes. This has been supported by the application of sequential techniques such as the Ensemble Kalman Filter, which require no additional features within the modeling process, i.e. they can use arbitrary black-box models. Alternatively, variational techniques rely on optimization algorithms to minimize a pre-defined objective function. This function describes the trade-off between the amount of noise introduced into the system and the mismatch between simulated and observed variables. While sequential techniques have been commonly applied to hydrological processes, variational techniques are seldom used. In our view, this is mainly attributable to the required computation of first-order sensitivities by algorithmic differentiation techniques and related model enhancements, but also to a lack of comparisons between the two techniques. We contribute to filling this gap and present the results from the assimilation of streamflow data in two basins located in Germany and Canada. The assimilation introduces noise to precipitation and temperature to produce better initial estimates of an HBV model. The results are computed for a hindcast period and assessed using lead-time performance metrics. The study concludes with a discussion of the main features of each technique and their advantages and disadvantages in hydrological applications.
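
    The sequential branch can be illustrated with the analysis step of a stochastic (perturbed-observation) Ensemble Kalman Filter; all dimensions and values below are toy assumptions, not the study's HBV configuration:

      # Sketch: stochastic EnKF analysis step with perturbed observations.
      import numpy as np

      def enkf_update(X, y, H, R, rng):
          """X: n x N state ensemble, y: m observations, H: m x n operator,
          R: m x m observation-error covariance. Returns updated ensemble."""
          n, N = X.shape
          Xp = X - X.mean(axis=1, keepdims=True)     # ensemble anomalies
          Yp = H @ Xp
          K = (Xp @ Yp.T) @ np.linalg.inv(Yp @ Yp.T + (N - 1) * R)  # Kalman gain
          Y_obs = y[:, None] + rng.multivariate_normal(
              np.zeros(len(y)), R, size=N).T         # perturbed observations
          return X + K @ (Y_obs - H @ X)

      rng = np.random.default_rng(4)
      X = rng.normal(size=(3, 50))                   # 3 states, 50 members
      H = np.array([[1.0, 0.0, 0.0]])                # observe the first state
      Xa = enkf_update(X, np.array([0.5]), H, np.eye(1) * 0.1, rng)
      print(Xa.mean(axis=1))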

  5. The design of common aperture and multi-band optical system based on day light telescope

    NASA Astrophysics Data System (ADS)

    Chen, Jiao; Wang, Ling; Zhang, Bo; Teng, Guoqi; Wang, Meng

    2017-02-01

    With the development of electro-optical weapon systems, common-path and multi-sensor techniques have become popular and are becoming a trend. According to the requirements of miniaturization and light weight for electro-optical stabilized sighting systems, a day-light telescope/television viewing-aim system/laser ranger with a common aperture has been designed in this work. Thus an integration scheme of multi-band and common aperture has been adopted. A day-light telescope has been presented, whose magnification is 8, field of view is 6°, and exit pupil distance is more than 20 mm. For a 1/3" CCD, a television viewing-aim system with a 156 mm focal length has been completed. In addition, a laser ranging system has been designed, with a 10 km ranging distance. This paper outlines the principle of using the day-light telescope as the optical reference for correcting the optical axis. Besides, by means of a shared objective, image reversal with an inverting prism and coating a beam-splitting film on the inclined plane of the cube prism, the system has been applied to an electro-optical weapon system, with high-resolution imaging and high-precision ranging.

  6. External carotid compression: a novel technique to improve cerebral perfusion during selective antegrade cerebral perfusion for aortic arch surgery.

    PubMed

    Grocott, Hilary P; Ambrose, Emma; Moon, Mike

    2016-10-01

    Selective antegrade cerebral perfusion (SACP) involving cannulation of either the axillary or innominate artery is a commonly used technique for maintaining cerebral blood flow (CBF) during the use of hypothermic cardiac arrest (HCA) for operations on the aortic arch. Nevertheless, asymmetrical CBF with hypoperfusion of the left cerebral hemisphere is a common occurrence during SACP. The purpose of this report is to describe an adjunctive maneuver to improve left hemispheric CBF during SACP by applying extrinsic compression to the left carotid artery. A 77-yr-old male patient with a history of aortic valve replacement presented for emergent surgical repair of an acute type A aortic dissection of a previously known ascending aortic aneurysm. His intraoperative course included cannulation of the right axillary artery, which was used as the aortic inflow during cardiopulmonary bypass and also allowed for subsequent SACP during HCA. After the onset of HCA, the innominate artery was clamped at its origin to allow for SACP. Shortly thereafter, however, the left-sided cerebral oxygen saturation (SrO2) began to decrease. Augmenting the PaO2, PaCO2 and both SACP pressure and flow failed to increase left hemispheric SrO2. Following the use of ultrasound guidance to confirm the absence of atherosclerotic disease in the carotid artery, external pressure was applied partially compressing the artery. With the carotid compression, the left cerebral saturation abruptly increased, suggesting pressurization of the left cerebral hemispheric circulation and augmentation of CBF. Direct ultrasound visualization and cautious partial compression of the left carotid artery may address asymmetrical CBF that occurs with SACP during HCA for aortic arch surgery. This strategy may lead to improved symmetry of CBF and corresponding cerebral oximetry measurements during aortic arch surgery.

  7. Refraction traveltime tomography based on damped wave equation for irregular topographic model

    NASA Astrophysics Data System (ADS)

    Park, Yunhui; Pyun, Sukjoon

    2018-03-01

    Land seismic data generally have time-static issues due to irregular topography and weathered layers at shallow depths. Unless the time static is handled appropriately, interpretation of the subsurface structures can be easily distorted. Therefore, static corrections are commonly applied to land seismic data. The near-surface velocity, which is required for static corrections, can be inferred from first-arrival traveltime tomography, which must consider the irregular topography in which land seismic data are generally acquired. This paper proposes a refraction traveltime tomography technique that is applicable to an irregular topographic model. The technique uses unstructured meshes to express an irregular topography, and traveltimes calculated from frequency-domain damped wavefields using the finite element method. The diagonal elements of the approximate Hessian matrix were adopted for preconditioning, and the principle of reciprocity was introduced to efficiently calculate the Fréchet derivative. We also included regularization to resolve the ill-posed inverse problem, and used the nonlinear conjugate gradient method to solve the inverse problem. As damped wavefields were used, there were no issues associated with artificial reflections caused by unstructured meshes. In addition, the shadow zone problem could be circumvented because this method is based on the exact wave equation, which does not require a high-frequency assumption. Furthermore, the proposed method was both robust to the initial velocity model and efficient compared to full wavefield inversions. Through synthetic and field data examples, our method was shown to successfully reconstruct shallow velocity structures. To verify our method, static corrections were roughly applied to the field data using the estimated near-surface velocity. By comparing common shot gathers and stack sections with and without static corrections, we confirmed that the proposed tomography algorithm can be used to correct the statics of land seismic data.

  8. Low-rank Atlas Image Analyses in the Presence of Pathologies

    PubMed Central

    Liu, Xiaoxiao; Niethammer, Marc; Kwitt, Roland; Singh, Nikhil; McCormick, Matt; Aylward, Stephen

    2015-01-01

    We present a common framework, for registering images to an atlas and for forming an unbiased atlas, that tolerates the presence of pathologies such as tumors and traumatic brain injury lesions. This common framework is particularly useful when a sufficient number of protocol-matched scans from healthy subjects cannot be easily acquired for atlas formation and when the pathologies in a patient cause large appearance changes. Our framework combines a low-rank-plus-sparse image decomposition technique with an iterative, diffeomorphic, group-wise image registration method. At each iteration of image registration, the decomposition technique estimates a “healthy” version of each image as its low-rank component and estimates the pathologies in each image as its sparse component. The healthy version of each image is used for the next iteration of image registration. The low-rank and sparse estimates are refined as the image registrations iteratively improve. When that framework is applied to image-to-atlas registration, the low-rank image is registered to a pre-defined atlas, to establish correspondence that is independent of the pathologies in the sparse component of each image. Ultimately, image-to-atlas registrations can be used to define spatial priors for tissue segmentation and to map information across subjects. When that framework is applied to unbiased atlas formation, at each iteration, the average of the low-rank images from the patients is used as the atlas image for the next iteration, until convergence. Since each iteration’s atlas is comprised of low-rank components, it provides a population-consistent, pathology-free appearance. Evaluations of the proposed methodology are presented using synthetic data as well as simulated and clinical tumor MRI images from the brain tumor segmentation (BRATS) challenge from MICCAI 2012. PMID:26111390
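
    The low-rank-plus-sparse split at the heart of the framework can be sketched with a simple alternating scheme: singular-value thresholding produces the "healthy" low-rank estimate, and elementwise soft thresholding captures the sparse pathology term. This is a simplified stand-in under assumed parameters, not the authors' actual solver or registration loop:

      # Sketch: alternating low-rank-plus-sparse decomposition (robust PCA flavor).
      import numpy as np

      def low_rank_plus_sparse(D, lam=0.1, tau=1.0, n_iter=50):
          L = np.zeros_like(D)
          S = np.zeros_like(D)
          for _ in range(n_iter):
              U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
              L = (U * np.maximum(s - tau, 0.0)) @ Vt     # shrink singular values
              S = np.sign(D - L) * np.maximum(np.abs(D - L) - lam, 0.0)
          return L, S

      rng = np.random.default_rng(5)
      D = np.outer(rng.normal(size=40), rng.normal(size=30))  # rank-1 "healthy" part
      D[5, 7] += 5.0                                          # a sparse "lesion"
      L, S = low_rank_plus_sparse(D)
      print(np.linalg.matrix_rank(np.round(L, 6)), np.count_nonzero(np.abs(S) > 1.0))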

  9. [Preliminary processing, processing and usage of Dendrobii Caulis in history].

    PubMed

    Yang, Wen-yu; Tang, Sheng; Shi, Dong-jun; Chen, Xiang-gui; Li, Ming-yuan; Tang, Xian-fu; Yuan, Chang-jiang

    2015-07-01

    On account of the dense cuticles of the fresh stem and the light, hard and pliable texture of the dried stem, Dendrobii Caulis is difficult to dry or pulverize. It was therefore very important to ancient doctors that Dendrobii Caulis be properly processed and applied so as to preserve or evoke its medicinal effects. Current textual research on the preliminary processing, processing and usage of Dendrobii Caulis shows that: (1) Historically, clinical use of fresh or processed Dendrobii Caulis in teas and tinctures was very common. (2) Its roots and rhizomes were removed before use. (3) Some ancillary approaches were applied to shorten drying times, such as rinsing with boiling mulberry-ash soup, washing or soaking with liquor, and mixing with rice pulp before basking. (4) According to the ancients' knowledge, sufficient pulverization, by means of slicing, rasping, hitting or pestling, was necessary for Dendrobii Caulis to exert its effects. (5) The heat processing methods for Dendrobii Caulis included stir-baking, stir-frying, steaming, decocting and stewing, usually with liquor as an auxiliary material. Among these, steaming after pretreating with liquor was most commonly used, a scheme colorfully illustrated in Bu Yi Lei Gong Pao Zhi Bian Lan (Ming Dynasty, 1591 CE); moreover, decocting in advance or long-time simmering to prepare paste products was recommended in the Qing Dynasty. (6) Several processing schemes that appeared in modern times, involving stir-baking with grit, air-tight baking on an ondol (kang), and fumigating with sulfur, gave the drug an attractive outward appearance but went against the ancients' original intention of ensuring drug efficacy.

  10. The impact of Kinesio taping technique on children with cerebral palsy

    PubMed Central

    Shamsoddini, Alireza; Rasti, Zabihallah; Kalantari, Minoo; Hollisaz, Mohammad Taghi; Sobhani, Vahid; Dalvand, Hamid; Bakhshandeh-Bali, Mohammad Kazem

    2016-01-01

    Cerebral palsy (CP) is the most common movement disorder in children and is associated with life-long disability and multiple impairments. The clinical manifestations of CP vary among children, and the condition is accompanied by a wide range of problems across a broad spectrum. Children with CP demonstrate poor fine and gross motor function due to psychomotor disturbances. Early rehabilitation programs are essential for children with CP and should be appropriate for the age and functional condition of the patient. The Kinesio taping (KT) technique is a relatively new technique applied in rehabilitation programs for CP. This article reviews the effects of KT techniques on improving motor skills in children with CP. In this study, we searched the national and international electronic databases between 1999 and 2016 using the keywords "cerebral palsy, Kinesio Tape, KT and Taping". Out of the 43 articles obtained, 21 studies met the inclusion criteria. There are several different applications of the KT technique in children with CP. Review of the literature demonstrated that the impact of this technique on gross and fine motor function and dynamic activities is greater than on postural and static activities. The technique is also more effective in children at higher developmental and motor stages. The majority of consistent findings showed that the KT technique, as part of a multimodal therapy program, can be effective in the rehabilitation of children with CP to improve motor function and dynamic activities, especially at higher developmental and motor stages. PMID:28435631

  11. Use and effectiveness of commercial flit-spray insecticides in control of mosquito population in Sagamu, Southwest Nigeria.

    PubMed

    Adedeji, A A; Ahmed, I A; Akinwunmi, M; Aina, S A; Tikare, O; Adeboye, A F; Badmos, S O; Adedeji, K A; Fehintola, F A; Amoo, A O J

    2012-06-01

    Control of the mosquito vector is crucial to reducing the burden of malaria in endemic regions. In the present study, we investigated the use of commercial insecticides in families and their effectiveness in controlling the mosquito population in Sagamu, southwest Nigeria. A pretested structured questionnaire was used to determine the mosquito adulticide techniques employed in the community, and the most commonly used adulticides were evaluated for effectiveness by exposing adult mosquitoes to varying concentrations of the insecticides and monitoring the responses. Families differed in the methods adopted to prevent mosquito bites, and use of flit-spray insecticides was the most common. Although parents constituted 64% of those applying the insecticide, 22.2% were children. The household pyrethroid insecticide products Baygon (Imiprothrin, Prallethrin plus Cyfluthrin), Mobil (Neopynamin, Prallethrin plus Cyphenothrin) and Raid (Pynamin forte, Neopynamin plus Deltamethrin) were the three most commonly used in the community. The exposure time interval for death of mosquitoes was shorter with Raid (100% at 8 minutes) when compared with Mobil (80%) and Baygon (85%) at 10 minutes (p = 0.005). The Kaplan-Meier survival curve of the cumulative probability of surviving exposure to insecticide was lowest with Raid (log-rank χ² = 14.56, P = 0.001). Although flit-spray insecticides are affordable with a simple application tool, inexplicit use instructions on labels may cause discrepancies in application. Monitoring responses of mosquitoes to commercial flit-spray insecticides may support effective control techniques and prevention of vector resistance in resource-poor communities.
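
    As a hedged illustration of the survival analysis used above, this sketch fits Kaplan-Meier curves and runs a log-rank comparison with the lifelines package on invented knockdown-time data; the numbers are mock values, not the study's measurements.

        import numpy as np
        from lifelines import KaplanMeierFitter
        from lifelines.statistics import logrank_test

        rng = np.random.default_rng(1)
        # Hypothetical knockdown times (minutes) under two products.
        t_a = rng.exponential(4.0, 50)       # faster-acting in this mock data
        t_b = rng.exponential(7.0, 50)
        seen = np.ones(50)                   # 1 = knockdown observed in assay

        kmf = KaplanMeierFitter()
        kmf.fit(t_a, event_observed=seen, label="product A")
        print(kmf.median_survival_time_)

        res = logrank_test(t_a, t_b, event_observed_A=seen, event_observed_B=seen)
        print("log-rank chi2 = %.2f, p = %.4f" % (res.test_statistic, res.p_value))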

  12. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2013-01-01

    Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer-level results at individual workstations. As data sets grow, the methods used to process them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism; that is, the same computation is applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques.
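
    A minimal sketch of the kind of data-parallel computation described, with CuPy as an optional GPU backend (assumed installed alongside a CUDA device; NumPy otherwise). The per-pixel peak-contrast reduction is an illustrative stand-in for a thermographic compositing step, not the authors' application.

        import numpy as np
        try:
            import cupy as xp        # GPU path if CuPy and a CUDA device exist
        except ImportError:
            xp = np                  # CPU fallback keeps the sketch runnable

        # Stack of thermographic frames: (n_frames, height, width).
        frames = xp.asarray(np.random.rand(128, 256, 256).astype(np.float32))

        # Every pixel is processed independently of all others, so the GPU
        # can map the computation across thousands of threads.
        baseline = frames[0]
        peak_contrast = (frames - baseline).max(axis=0)
        print(peak_contrast.shape)   # (256, 256)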

  13. A survey of camera error sources in machine vision systems

    NASA Astrophysics Data System (ADS)

    Jatko, W. B.

    In machine vision applications, such as an automated inspection line, television cameras are commonly used to record scene intensity in a computer memory or frame buffer. Scene data from the image sensor can then be analyzed with a wide variety of feature-detection techniques. Many algorithms found in textbooks on image processing make the implicit simplifying assumption of an ideal input image with clearly defined edges and uniform illumination. The ideal image model is helpful to aid the student in understanding the principles of operation, but when these algorithms are blindly applied to real-world images the results can be unsatisfactory. This paper examines some common measurement errors found in camera sensors, their underlying causes, and possible methods of error compensation. The role of the camera in a typical image-processing system is discussed, with emphasis on the origination of signal distortions. The effects of lighting, optics, and sensor characteristics are considered.

  14. Multifractal analysis of charged particle distributions using horizontal visibility graph and sandbox algorithm

    NASA Astrophysics Data System (ADS)

    Mali, P.; Mukhopadhyay, A.; Manna, S. K.; Haldar, P. K.; Singh, G.

    2017-03-01

    Horizontal visibility graphs (HVGs) and the sandbox (SB) algorithm, usually applied for the multifractal characterization of complex networks converted from time series measurements, are used here to characterize the fluctuations in pseudorapidity densities of singly charged particles produced in high-energy nucleus-nucleus collisions. Besides obtaining the degree distribution associated with event-wise pseudorapidity distributions, the common set of observables typical of any multifractality measurement are studied in 16O-Ag/Br and 32S-Ag/Br interactions, each at an incident laboratory energy of 200 GeV/nucleon. For a better understanding, we systematically compare the experiment with a Monte Carlo model simulation based on Ultra-relativistic Quantum Molecular Dynamics (UrQMD). Our results suggest that the HVG-SB technique is an efficient tool for characterizing multifractality in multiparticle emission data and, in some cases, is even superior to other methods more commonly used in this regard.
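
    For concreteness, a minimal sketch of HVG construction from its standard definition (not the authors' code): two samples are linked when every sample strictly between them lies below both, and the resulting degree distribution feeds the multifractal analysis.

        import numpy as np

        def horizontal_visibility_edges(x):
            """Edge list of the HVG: i and j (i < j) are linked iff every
            sample strictly between them is lower than min(x[i], x[j])."""
            edges = []
            for i in range(len(x) - 1):
                ceiling = -np.inf                # running max of intermediates
                for j in range(i + 1, len(x)):
                    if ceiling < min(x[i], x[j]):
                        edges.append((i, j))
                    ceiling = max(ceiling, x[j])
                    if ceiling >= x[i]:          # no later node can see i
                        break
            return edges

        x = np.random.default_rng(2).random(200)
        edges = horizontal_visibility_edges(x)
        degree = np.bincount(np.array(edges).ravel(), minlength=len(x))
        print("mean degree:", degree.mean())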

  15. Representing Mutually Exclusive Knowledge in a Property Hierarchy for a Reasoning System in Clinical Gynecology

    PubMed Central

    Small, Steven L.; Muechler, Eberhard K.

    1985-01-01

    The education and practice of clinical medicine can benefit significantly from the use of computational assistants. This article describes the development of a prototype system called SURGES (Strong/University of Rochester Gynecological Expert System) for representing medical knowledge and then applying this knowledge to suggest diagnostic procedures in medical gynecology. The paper focuses on the representation technique of property inheritance, which facilitates the simple common sense reasoning required to enable execution of the more complex medical inferences. Such common sense can be viewed as a collection of mundane inferences, the simple conclusions drawn from knowledge that an exclusive or (XOR) relation (i.e., mutual exclusion) holds among a number of facts. The paper discusses the use of a property hierarchy for this purpose and shows how it simplifies knowledge representation in medical artificial intelligence (AIM) computer systems.
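
    A toy sketch of the XOR mechanism (hypothetical property names, nothing from SURGES): asserting one member of a mutually exclusive property set lets the system infer the negation of all the others.

        # Properties grouped into XOR sets; a property hierarchy would let
        # subclasses inherit these sets.
        XOR_SETS = {
            "pregnancy_status": {"pregnant", "not_pregnant"},
            "menstrual_phase": {"follicular", "luteal", "menstruating"},
        }

        def assert_fact(known, fact):
            """Record a fact and draw the mundane XOR inferences."""
            known[fact] = True
            for group in XOR_SETS.values():
                if fact in group:
                    for other in group - {fact}:
                        known[other] = False     # mutual exclusion
            return known

        print(assert_fact({}, "pregnant"))
        # {'pregnant': True, 'not_pregnant': False}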

  16. Bypass laparoscopic procedure for palliation of esophageal cancer.

    PubMed

    Siosaki, Marcos Duarte; Lacerda, Croider Franco; Bertulucci, Paulo Anderson; da Costa Filho, José Orlando; de Oliveira, Antônio Talvane Torres

    2013-03-26

    Esophageal cancer is a devastating disease with rapidly increasing incidence in Western countries. Dysphagia is its most common complication, causing severe malnutrition and reduced quality of life. A 69-year-old male with persistent esophageal cancer after radiation therapy underwent palliative bypass surgery using a laparoscopic approach. Due to the advanced stage at diagnosis, palliative treatment was the more realistic option. Because dysphagia is the most distressing symptom of this disease, the goal of palliation is to improve swallowing. The most common methods applied are endoscopic stenting, radiation therapy (external or brachytherapy), chemotherapy, yttrium-aluminum-garnet laser rechanneling and endoscopic dilatation. Palliative surgery is rarely proposed due to morbidity and complications. This paper presents an update of the technique proposed by Postlethwait in 1979 for palliation of esophageal cancer. Published by Oxford University Press and JSCR Publishing Ltd. All rights reserved. © The Author 2013.

  17. Developing a common language for using social marketing: an analysis of Public Health literature.

    PubMed

    Quinn, Gwendolyn P; Ellery, Jane; Thomas, Kamilah B; Marshall, Robert

    2010-10-01

    The term social marketing has been used to describe a multitude of interventions that incorporate traditional marketing techniques to promote a behavior that will improve the health or well-being of a target audience or of society as a whole. However, there is wide variation in the way social marketing is defined and used. This systematic review examines how social marketing was defined and applied to social problems within the public health literature from 2001 to 2006, by adapting a grading system borrowed from evidence-based medicine and utilizing Kotler and Zaltman's definition of social marketing. Additionally, definitions of social marketing were identified in the reviewed articles. Identifying a common language in the description and design of social marketing interventions will benefit researchers and practitioners interested in social marketing as a behavior change approach.

  18. Ensuring Confidentiality of Geocoded Health Data: Assessing Geographic Masking Strategies for Individual-Level Data.

    PubMed

    Zandbergen, Paul A

    2014-01-01

    Public health datasets increasingly use geographic identifiers such as an individual's address. Geocoding these addresses often provides new insights, since it becomes possible to examine spatial patterns and associations. Address information is typically considered confidential and is therefore not released or shared with others. Publishing maps with the locations of individuals, however, may also breach confidentiality, since addresses and associated identities can be discovered through reverse geocoding. One commonly used technique to protect confidentiality when releasing individual-level geocoded data is geographic masking. This typically consists of applying a certain amount of random perturbation in a systematic manner to reduce the risk of reidentification. A number of geographic masking techniques have been developed, as well as methods to quantify the risk of reidentification associated with a particular masking method. This paper presents a review of the current state of the art in geographic masking, summarizing the various methods and their strengths and weaknesses. Despite recent progress, no universally accepted or endorsed geographic masking technique has emerged. Researchers, on the other hand, are publishing maps using geographic masking of confidential locations. Any researcher publishing such maps is advised to become familiar with the different masking techniques available and their associated reidentification risks.
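
    As a concrete illustration, a minimal sketch of one widely described masking strategy, "donut" perturbation, which displaces each point by a random bearing and a bounded random distance. The radii and the meter-to-degree conversion are illustrative assumptions; a real implementation would work in a projected coordinate system.

        import numpy as np

        def donut_mask(lon, lat, r_min=100.0, r_max=500.0, seed=None):
            """Displace (lon, lat) by r_min..r_max meters in a random
            direction; sampling the radius from the sqrt of a uniform
            keeps point density uniform over the annulus."""
            rng = np.random.default_rng(seed)
            theta = rng.uniform(0, 2 * np.pi)
            r = np.sqrt(rng.uniform(r_min**2, r_max**2))
            dlat = r * np.cos(theta) / 111_320.0
            dlon = r * np.sin(theta) / (111_320.0 * np.cos(np.radians(lat)))
            return lon + dlon, lat + dlat

        print(donut_mask(-106.65, 35.08, seed=42))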

  19. A novel background field removal method for MRI using projection onto dipole fields (PDF).

    PubMed

    Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi

    2011-11-01

    For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
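
    A simplified numerical sketch of the projection idea (illustrative grid sizes and random stand-in data, not the published implementation): the field inside the ROI is fitted with dipole sources restricted to the outside, and the fit is subtracted to leave the local field.

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, lsmr

        n = 32
        shape = (n, n, n)
        kz, ky, kx = np.meshgrid(*(np.fft.fftfreq(n),) * 3, indexing="ij")
        k2 = kx**2 + ky**2 + kz**2
        D = np.where(k2 > 0, 1 / 3 - kz**2 / np.maximum(k2, 1e-12), 0.0)

        roi = np.zeros(shape, bool)
        roi[8:24, 8:24, 8:24] = True        # e.g., a brain mask
        outside = ~roi

        def fwd(chi_out):                   # field in ROI from outside sources
            chi = np.zeros(shape); chi[outside] = chi_out
            return np.real(np.fft.ifftn(D * np.fft.fftn(chi)))[roi]

        def adj(f_roi):                     # adjoint (dipole kernel is even)
            f = np.zeros(shape); f[roi] = f_roi
            return np.real(np.fft.ifftn(D * np.fft.fftn(f)))[outside]

        A = LinearOperator((int(roi.sum()), int(outside.sum())),
                           matvec=fwd, rmatvec=adj)

        total_field = np.random.default_rng(3).normal(size=shape)  # stand-in
        chi_bg = lsmr(A, total_field[roi], maxiter=200)[0]
        local_field = total_field[roi] - fwd(chi_bg)               # PDF output
        print(local_field.shape)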

  20. Advanced neuroimaging techniques for the term newborn with encephalopathy.

    PubMed

    Chau, Vann; Poskitt, Kenneth John; Miller, Steven Paul

    2009-03-01

    Neonatal encephalopathy is associated with a high risk of morbidity and mortality in the neonatal period and of long-term neurodevelopmental disability in survivors. Advanced magnetic resonance techniques now play a major role in the clinical care of newborns with encephalopathy and in research addressing this important condition. From conventional magnetic resonance imaging, typical patterns of injury have been defined in neonatal encephalopathy. When applied in contemporary cohorts of newborns with encephalopathy, the patterns of brain injury on magnetic resonance imaging distinguish risk factors, clinical presentation, and risk of abnormal outcome. Advanced magnetic resonance techniques such as magnetic resonance spectroscopy, diffusion-weighted imaging, and diffusion tensor imaging provide novel perspectives on neonatal brain metabolism, microstructure, and connectivity. With the application of these imaging tools, it is increasingly apparent that brain injury commonly occurs at or near the time of birth and evolves over the first weeks of life. These observations have complemented findings from trials of emerging strategies of brain protection, such as hypothermia. Application of these advanced magnetic resonance techniques may enable the earliest possible identification of newborns at risk of neurodevelopmental impairment, thereby ensuring appropriate follow-up with rehabilitation and psychoeducational resources.

  1. Virtual reality system for treatment of the fear of public speaking using image-based rendering and moving pictures.

    PubMed

    Lee, Jae M; Ku, Jeong H; Jang, Dong P; Kim, Dong H; Choi, Young H; Kim, In Y; Kim, Sun I

    2002-06-01

    The fear of speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled the use of virtual reality (VR) for the treatment of the fear of public speaking. Two techniques have been used to construct virtual environments for this treatment: model-based and movie-based. Virtual audiences and virtual environments made by the model-based technique are unrealistic and unnatural. The movie-based technique has the disadvantage that the virtual audience members cannot be controlled individually, because all of them are included in one moving-picture file. To address this disadvantage, this paper presents a virtual environment made by using image-based rendering (IBR) and chroma keying simultaneously. IBR makes the virtual environment realistic because the images are stitched panoramically from photos taken with a digital camera, and the use of chroma keying allows each virtual audience member to be controlled individually. In addition, a real-time capture technique was applied in constructing the virtual environment to give the subjects more interaction, in that they can talk with a therapist or another subject.
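
    A bare-bones sketch of the chroma keying step (illustrative key color and threshold; a production keyer would also work in a chroma plane and feather the mask edges): pixels near the key color are replaced by the panoramic background.

        import numpy as np

        def chroma_key(fg, bg, key=(0, 255, 0), tol=80.0):
            """Composite fg over bg wherever fg is far from the key color.
            fg, bg: HxWx3 uint8 arrays; tol: RGB distance threshold."""
            dist = np.linalg.norm(fg.astype(float) - np.array(key, float), axis=-1)
            mask = (dist > tol)[..., None]       # True = keep foreground pixel
            return np.where(mask, fg, bg).astype(np.uint8)

        fg = np.zeros((4, 4, 3), np.uint8); fg[..., 1] = 255   # green screen
        fg[1, 1] = (200, 30, 30)                 # one "audience" pixel
        bg = np.full((4, 4, 3), 90, np.uint8)    # stand-in IBR backdrop
        out = chroma_key(fg, bg)
        print(out[1, 1], out[0, 0])              # kept pixel vs. replaced green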

  2. Do not blame the driver: a systems analysis of the causes of road freight crashes.

    PubMed

    Newnam, Sharon; Goode, Natassia

    2015-03-01

    Although many have advocated a systems approach in road transportation, this view has not meaningfully penetrated road safety research, practice or policy. In this study, a systems-theory-based approach, Rasmussen's (1997) risk management framework and the associated Accimap technique, is applied to the analysis of road freight transportation crashes. Twenty-seven highway crash investigation reports were downloaded from the National Transport Safety Bureau website. Thematic analysis was used to identify the complex system of contributory factors, and relationships, identified within the reports. The Accimap technique was then used to represent the linkages and dependencies within and across system levels in the road freight transportation industry and to identify common factors and interactions across multiple crashes. The results demonstrate how a systems approach can increase knowledge in this safety-critical domain, while the findings can be used to guide prevention efforts and the development of system-based investigation processes for the heavy vehicle industry. A research agenda for developing an investigation technique to better support the application of the Accimap technique by practitioners in the road freight transportation industry is proposed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Application of Metamorphic Testing to Supervised Classifiers

    PubMed Central

    Xie, Xiaoyuan; Ho, Joshua; Kaiser, Gail; Xu, Baowen; Chen, Tsong Yueh

    2010-01-01

    Many applications in the field of scientific computing - such as computational biology, computational linguistics, and others - depend on Machine Learning algorithms to provide important core functionality to support solutions in the particular problem domains. However, it is difficult to test such applications because often there is no “test oracle” to indicate what the correct output should be for arbitrary input. To help address the quality of such software, in this paper we present a technique for testing the implementations of supervised machine learning classification algorithms on which such scientific computing software depends. Our technique is based on an approach called “metamorphic testing”, which has been shown to be effective in such cases. More importantly, we demonstrate that our technique not only serves the purpose of verification, but also can be applied in validation. In addition to presenting our technique, we describe a case study we performed on a real-world machine learning application framework, and discuss how programmers implementing machine learning algorithms can avoid the common pitfalls discovered in our study. We also discuss how our findings can be of use to other areas outside scientific computing, as well. PMID:21243103
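
    A small self-contained example of one metamorphic relation for a supervised classifier (illustrative, not from the paper's case study): permuting the order of the training samples must leave a k-nearest-neighbors prediction unchanged, and this can be checked without any test oracle.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(4)
        X = rng.normal(size=(200, 5))
        y = (X[:, 0] + X[:, 1] > 0).astype(int)
        X_test = rng.normal(size=(50, 5))

        baseline = KNeighborsClassifier(n_neighbors=5).fit(X, y).predict(X_test)

        perm = rng.permutation(len(X))           # metamorphic transformation
        follow_up = (KNeighborsClassifier(n_neighbors=5)
                     .fit(X[perm], y[perm]).predict(X_test))

        assert np.array_equal(baseline, follow_up), "metamorphic relation violated"
        print("permutation relation holds on", len(X_test), "test points")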

  4. Multishot Targeted PROPELLER Magnetic Resonance Imaging: Description of the Technique and Initial Applications

    PubMed Central

    Deng, Jie; Larson, Andrew C.

    2010-01-01

    Objectives To test the feasibility of combining inner-volume imaging (IVI) techniques with conventional multishot periodically rotated overlapping parallel lines with enhanced reconstruction (PROPELLER) techniques for targeted-PROPELLER magnetic resonance imaging. Materials and Methods Perpendicular section-selective gradients for spatially selective excitation and refocusing RF pulses were applied to limit the refocused field-of-view (FOV) along the phase-encoding direction for each rectangular blade image. We performed comparison studies in phantoms and normal volunteers by using targeted-PROPELLER methods for a wide range of imaging applications that commonly use turbo-spin-echo (TSE) approaches (brain, abdominal, vessel wall, cardiac). Results In these initial studies, we demonstrated the feasibility of using targeted-PROPELLER approaches to limit the imaging FOV thereby reducing the number of blades or permitting increased spatial resolution without commensurate increases in scan time. Both phantom and in vivo motion studies demonstrated the potential for more robust regional self-navigated motion correction compared with conventional full FOV PROPELLER methods. Conclusion We demonstrated that the reduced FOV targeted-PROPELLER technique offers the potential for reducing imaging time, increasing spatial resolution, and targeting specific areas for robust regional motion correction. PMID:19465860

  5. Identification of transformer fault based on dissolved gas analysis using hybrid support vector machine-modified evolutionary particle swarm optimisation

    PubMed Central

    2018-01-01

    Early detection of power transformer faults is important because it can reduce the maintenance cost of the transformer and ensure continuous electricity supply in power systems. The Dissolved Gas Analysis (DGA) technique is commonly used to identify the fault type of oil-filled power transformers, but utilisation of artificial intelligence methods combined with optimisation has shown convincing results. In this work, a hybrid support vector machine (SVM) with a modified evolutionary particle swarm optimisation (EPSO) algorithm was proposed to determine the transformer fault type. The superiority of the modified PSO technique with SVM was evaluated by comparing the results with the actual fault diagnosis, unoptimised SVM and previously reported works. Data reduction was also applied using stepwise regression prior to the training process of SVM to reduce the training time. It was found that the proposed hybrid SVM-Modified EPSO (MEPSO)-Time Varying Acceleration Coefficient (TVAC) technique results in the highest percentage of correctly identified faults in a power transformer compared to other PSO algorithms. Thus, the proposed technique can be one of the potential solutions to identify the transformer fault type based on DGA data on site. PMID:29370230

  6. Identification of transformer fault based on dissolved gas analysis using hybrid support vector machine-modified evolutionary particle swarm optimisation.

    PubMed

    Illias, Hazlee Azil; Zhao Liang, Wee

    2018-01-01

    Early detection of power transformer faults is important because it can reduce the maintenance cost of the transformer and ensure continuous electricity supply in power systems. The Dissolved Gas Analysis (DGA) technique is commonly used to identify the fault type of oil-filled power transformers, but utilisation of artificial intelligence methods combined with optimisation has shown convincing results. In this work, a hybrid support vector machine (SVM) with a modified evolutionary particle swarm optimisation (EPSO) algorithm was proposed to determine the transformer fault type. The superiority of the modified PSO technique with SVM was evaluated by comparing the results with the actual fault diagnosis, unoptimised SVM and previously reported works. Data reduction was also applied using stepwise regression prior to the training process of SVM to reduce the training time. It was found that the proposed hybrid SVM-Modified EPSO (MEPSO)-Time Varying Acceleration Coefficient (TVAC) technique results in the highest percentage of correctly identified faults in a power transformer compared to other PSO algorithms. Thus, the proposed technique can be one of the potential solutions to identify the transformer fault type based on DGA data on site.
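
    As a rough sketch of the hybrid scheme described in these two records (synthetic stand-in data and a plain PSO with a time-varying inertia weight, not the authors' MEPSO-TVAC), particles search log-scaled SVM hyperparameters and fitness is cross-validated accuracy.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score
        from sklearn.datasets import make_classification

        # Stand-in for DGA records: features ~ gas levels, label ~ fault type.
        X, y = make_classification(n_samples=300, n_features=7, n_classes=3,
                                   n_informative=4, random_state=0)

        def fitness(p):
            C, gamma = 10.0 ** p                 # search in log10 space
            return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

        rng = np.random.default_rng(5)
        n_part, n_iter = 12, 20
        pos = rng.uniform(-3, 3, (n_part, 2))    # (log10 C, log10 gamma)
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_val = np.array([fitness(p) for p in pos])
        gbest = pbest[pbest_val.argmax()]

        for t in range(n_iter):
            w = 0.9 - 0.5 * t / n_iter           # time-varying inertia weight
            r1, r2 = rng.random((2, n_part, 1))
            vel = w * vel + 2 * r1 * (pbest - pos) + 2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, -3, 3)
            vals = np.array([fitness(p) for p in pos])
            better = vals > pbest_val
            pbest[better], pbest_val[better] = pos[better], vals[better]
            gbest = pbest[pbest_val.argmax()]

        print("best CV accuracy %.3f at C=%.3g, gamma=%.3g"
              % (pbest_val.max(), *(10.0 ** gbest)))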

  7. Advances in Photopletysmography Signal Analysis for Biomedical Applications.

    PubMed

    Moraes, Jermana L; Rocha, Matheus X; Vasconcelos, Glauber G; Vasconcelos Filho, José E; de Albuquerque, Victor Hugo C; Alexandria, Auzuir R

    2018-06-09

    Heart Rate Variability (HRV) is an important tool for the analysis of a patient's physiological condition, as well as a method aiding the diagnosis of cardiopathies. Photoplethysmography (PPG) is an optical technique applied in the monitoring of HRV, and its adoption has been growing significantly compared to the method most commonly used in medicine, Electrocardiography (ECG). In this survey, definitions of these techniques are presented, the different types of sensors used are explained, and the methods for the study and analysis of the PPG signal (linear and nonlinear methods) are described. Moreover, the progress and the clinical and practical applicability of the PPG technique in the diagnosis of cardiovascular diseases are evaluated. In addition, the latest technologies utilized in the development of new tools for medical diagnosis are presented, such as the Internet of Things, the Internet of Health Things, genetic algorithms, artificial intelligence and biosensors, which result in personalized advances in e-health and health care. After the study of these technologies, it can be noted that PPG associated with them is an important tool for the diagnosis of some diseases, due to its simplicity, its cost-benefit ratio, the ease of signal acquisition, and especially because it is a non-invasive technique.
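
    A minimal sketch of the basic PPG-to-HRV pipeline discussed (synthetic waveform; the peak-detection thresholds are illustrative): detect systolic peaks, form inter-beat intervals, and compute common time-domain indices.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 100.0                                  # sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(6)
        ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.normal(size=t.size)

        peaks, _ = find_peaks(ppg, distance=0.4 * fs, prominence=0.5)
        ibi = np.diff(peaks) / fs * 1000.0          # inter-beat intervals (ms)
        sdnn = ibi.std(ddof=1)                      # overall variability
        rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2)) # short-term variability
        print("mean HR %.1f bpm, SDNN %.2f ms, RMSSD %.2f ms"
              % (60000.0 / ibi.mean(), sdnn, rmssd))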

  8. Optical Microscopy Techniques to Inspect for Metallic Whiskers

    NASA Technical Reports Server (NTRS)

    Brusse, Jay A.

    2006-01-01

    Metal surface finishes of tin, zinc and cadmium are often applied to electronic components, mechanical hardware and other structures. These finishes may sometimes unpredictably form metal whiskers over periods ranging from hours to months or even many years. The metal whiskers are crystalline structures commonly having a uniform cross-sectional area along their entire length. Typical whisker dimensions are nominally on the order of only a few microns (um) across, while their lengths can extend from a few microns to several millimeters. Metal whiskers pose a reliability risk to electronic systems, primarily as an electrical shorting hazard. The extremely narrow dimensions of metal whiskers can make observation with optical techniques very challenging. The videos herein were compiled to demonstrate the complexities associated with optical microscope inspection of electronic and mechanical components and assemblies for the presence or absence of metal whiskers. Magnification, light source and angle of illumination play critical roles in the ability to detect metal whiskers when present. Furthermore, it is demonstrated how improper techniques can easily obscure detection. It is hoped that these videos will improve the probability of detecting metal whiskers with optical inspection techniques.

  9. Application of Function-Failure Similarity Method to Rotorcraft Component Design

    NASA Technical Reports Server (NTRS)

    Roberts, Rory A.; Stone, Robert E.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Performance and safety are the top concerns of high-risk aerospace applications at NASA. Eliminating or reducing performance and safety problems can be achieved with a thorough understanding of the potential failure modes in the designs that lead to these problems. The majority of techniques use prior knowledge and experience, as well as Failure Modes and Effects Analysis, to determine the potential failure modes of aircraft. During the design of aircraft, a general technique is needed to ensure that every potential failure mode is considered, while avoiding spending time on improbable failure modes. In this work, this is accomplished by mapping failure modes to specific components, which are described by their functionality. The failure modes are then linked to the basic functions that are carried out within the components of the aircraft. Using this technique, designers can examine the basic functions and select appropriate analyses to eliminate or design out the potential failure modes. The fundamentals of this method were previously introduced for a simple rotating machine test rig with basic functions that are common to a rotorcraft. In this paper, the technique is applied to the engine and power train of a rotorcraft, using failures and functions obtained from accident reports and engineering drawings.

  10. Treatment of systematic errors in land data assimilation systems

    NASA Astrophysics Data System (ADS)

    Crow, W. T.; Yilmaz, M.

    2012-12-01

    Data assimilation systems are generally designed to minimize the influence of random error on the estimation of system states. Yet, experience with land data assimilation systems has also revealed the presence of large systematic differences between model-derived and remotely-sensed estimates of land surface states. Such differences are commonly resolved prior to data assimilation through implementation of a pre-processing rescaling step whereby observations are scaled (or non-linearly transformed) to somehow "match" comparable predictions made by an assimilation model. While the rationale for removing systematic differences in means (i.e., bias) between models and observations is well-established, relatively little theoretical guidance is currently available to determine the appropriate treatment of higher-order moments during rescaling. This talk presents a simple analytical argument to define an optimal linear-rescaling strategy for observations prior to their assimilation into a land surface model. While a technique based on triple collocation theory is shown to replicate this optimal strategy, commonly-applied rescaling techniques (e.g., so-called "least-squares regression" and "variance matching" approaches) are shown to represent only sub-optimal approximations to it. Since the triple collocation approach is likely infeasible in many real-world circumstances, general advice for deciding between various feasible (yet sub-optimal) rescaling approaches will be presented, with an emphasis on the implications of this work for the case of directly assimilating satellite radiances. While the bulk of the analysis will deal with linear rescaling techniques, its extension to nonlinear cases will also be discussed.
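
    For concreteness, minimal sketches of the two commonly applied linear rescaling strategies named above, run on invented soil-moisture-like series (the coefficients and noise levels are arbitrary).

        import numpy as np

        def variance_match(obs, model):
            """Rescale obs to the model's mean and standard deviation."""
            return (obs - obs.mean()) * (model.std() / obs.std()) + model.mean()

        def regression_match(obs, model):
            """Rescale obs using the least-squares slope of model on obs."""
            slope = np.cov(obs, model, ddof=0)[0, 1] / obs.var()
            return model.mean() + slope * (obs - obs.mean())

        rng = np.random.default_rng(7)
        truth = rng.normal(size=1000)
        model = 0.8 * truth + 0.3 * rng.normal(size=1000)      # model states
        obs = 1.5 * truth + 0.6 * rng.normal(size=1000) + 2.0  # biased retrieval

        for f in (variance_match, regression_match):
            scaled = f(obs, model)
            print(f.__name__, "-> mean %.2f, std %.2f"
                  % (scaled.mean(), scaled.std()))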

  11. Monitoring of Total Type II Pyrethroid Pesticides in Citrus Oils and Water by Converting to a Common Product 3-Phenoxybenzoic Acid

    PubMed Central

    McCoy, Mark R.; Yang, Zheng; Fu, Xun; Ahn, Ki Chang; Gee, Shirley J.; Bom, David C.; Zhong, Ping; Chang, Dan; Hammock, Bruce D.

    2012-01-01

    Pyrethroids are a class of insecticides that are becoming increasingly popular in agricultural and home use applications. Sensitive assays for pyrethroid insecticides in complex matrices are difficult with both instrumental and immunochemical methods. Environmental analysis of the pyrethroids by immunoassay requires either knowing which pyrethroids contaminate the source or using non-specific antibodies with cross-reactivities to a class of compounds. We describe an alternative method that converts the type II pyrethroids to a common chemical product, 3-phenoxybenzoic acid (3-PBA), prior to analysis. This method is much more sensitive than detecting the parent compound, and it is much easier to detect a single compound than an entire class of compounds. This is useful in screening for pyrethroids as a class or in situations where a single type of pyrethroid is used. We demonstrated this technique in both citrus oils and environmental water samples, with conversion rates of the pyrethroids to 3-PBA ranging from 45% to 75% and methods that require no extraction steps for either the immunoassay or LC-MS/MS techniques. Limits of detection for this technique applied to orange oil are 5 nM, 2 μM, and 0.8 μM when detected by LC-MS/MS, GC-MS, and immunoassay, respectively. The limit of detection for pyrethroids in water when detected by immunoassay was 2 nM. PMID:22486225

  12. A surgical technique to expand the operative corridor for supracerebellar infratentorial approaches: technical note.

    PubMed

    Rey-Dios, Roberto; Cohen-Gadol, Aaron A

    2013-10-01

    The supracerebellar infratentorial approach is a commonly used route in neurosurgery. It provides a narrow and deep corridor to the dorsal midbrain and pineal region. The authors describe a surgical technique to expand the operative corridor and the surgeon's working angles during this approach. Thirteen cases of patients who underwent resection of their lesions using this extended approach were reviewed. During the suboccipital craniotomy, additional bone over the transverse sinus (paramedian approach) and the confluence of the sinuses (midline approach) was removed. Two sutures (tentorial stay sutures) were anchored to the tentorium anterior to the transverse sinus and tension was applied. A video narrated by the senior author describes the details of the technique. This additional bone removal and the tentorial stay sutures led to gentle elevation of the tentorium and partial mobilization of the dural venous sinuses superiorly. The technique enhanced operative viewing through improved illumination and expanded working angles for microsurgical instruments, while minimizing the need for fixed retractors and extensive cerebellar retraction. All patients underwent satisfactory removal of their lesions. No patient suffered any related complication. The use of stay sutures anchored on the tentorium is a simple and effective technique that expands the surgical corridor during supracerebellar infratentorial approaches.

  13. Quality evaluation of fish and other seafood by traditional and nondestructive instrumental methods: Advantages and limitations.

    PubMed

    Hassoun, Abdo; Karoui, Romdhane

    2017-06-13

    Although fish and other seafoods are among the most vulnerable and perishable products, they provide a wide range of health-promoting compounds. Recently, the growing interest of consumers in food quality and safety issues has contributed to the increasing demand for sensitive and rapid analytical technologies. Several traditional physicochemical, textural, sensory, and electrical methods have been used to evaluate the freshness and authentication of fish and other seafood products. Despite the importance of these standard methods, they are expensive and time-consuming, and often susceptible to large sources of variation. Recently, spectroscopic methods and other emerging techniques have shown great potential due to speed of analysis, minimal sample preparation, high repeatability, low cost, and, most of all, the fact that these techniques are noninvasive and nondestructive and, therefore, could be applied to any online monitoring system. This review first briefly describes the basic principles of multivariate data analysis, followed by the most common traditional methods used for the determination of the freshness and authenticity of fish and other seafood products. A special focus is put on the use of rapid and nondestructive techniques (spectroscopic techniques and instrumental sensors) to address several issues related to the quality of these products. Moreover, the advantages and limitations of each technique are reviewed and some perspectives are given.

  14. A predictive modeling approach to increasing the economic effectiveness of disease management programs.

    PubMed

    Bayerstadler, Andreas; Benstetter, Franz; Heumann, Christian; Winter, Fabian

    2014-09-01

    Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation and medical management. This article illustrates a PM approach that enables the economic potential of (cost-)effective disease management programs (DMPs) to be fully exploited through optimized candidate selection, as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy for health insurance companies to apply. By means of a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques in spite of the difficult data environment. Additionally, we demonstrate, for this example setting, that our model can compete with the expensive solutions offered by professional PM vendors and outperforms the non-predictive standard approaches for DMP selection commonly used in the market.
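
    A hedged sketch of GLM-based candidate scoring of the kind described (the covariates, coefficients, and enrollment cutoff are hypothetical), using the statsmodels package.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(8)
        n = 500
        X = np.column_stack([rng.integers(30, 80, n).astype(float),   # age
                             rng.poisson(2, n).astype(float),  # hospitalizations
                             rng.normal(1000, 300, n)])        # baseline cost
        logit = -6 + 0.03 * X[:, 0] + 0.5 * X[:, 1] + 0.002 * X[:, 2]
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

        model = sm.GLM(y, sm.add_constant(X), family=sm.families.Binomial()).fit()
        scores = model.predict(sm.add_constant(X))
        enroll = np.argsort(scores)[::-1][:50]   # top-scored DMP candidates
        print(model.params.round(4), "first selected:", enroll[:5])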

  15. Detonation Properties Measurements for Inorganic Explosives

    NASA Astrophysics Data System (ADS)

    Morgan, Brent A.; Lopez, Angel

    2005-03-01

    Many commonly available explosive materials have never been quantitatively or theoretically characterized in a manner suitable for use in analytical models. This includes inorganic explosive materials used in spacecraft ordnance, such as zirconium potassium perchlorate (ZPP). The lack of empirical information about these materials impedes the development of computational techniques. We have applied high-fidelity measurement techniques to experimentally determine the pressure and velocity characteristics of ZPP, a previously uncharacterized explosive material. Advances in measurement technology now permit the use of very small quantities of material, yielding a significant reduction in the cost of conducting these experiments. Empirical determination of the explosive behavior of ZPP yielded a Hugoniot with an approximate particle velocity (u0) of 1.0 km/s. This result compares favorably with numerical calculations from the CHEETAH thermochemical code, which predicts a u0 of approximately 1.2 km/s under ideal conditions.

  16. Real-Time and High-Resolution 3D Face Measurement via a Smart Active Optical Sensor.

    PubMed

    You, Yong; Shen, Yang; Zhang, Guocai; Xing, Xiuwen

    2017-03-31

    The 3D measuring range and accuracy in traditional active optical sensing, such as Fourier transform profilometry, are influenced by the zero frequency of the captured patterns. The phase-shifting technique is commonly applied to remove the zero component. However, this phase-shifting method must capture several fringe patterns with phase differences, thereby degrading the real-time performance. This study introduces a smart active optical sensor in which a composite pattern is utilized. The composite pattern efficiently combines several phase-shifting fringes and carrier frequencies. The method can remove the zero frequency by using only one pattern. Model face reconstruction and human face measurement were employed to study the validity and feasibility of this method. Results show no distinct decrease in the precision of the novel method compared with the traditional phase-shifting method. The texture mapping technique was utilized to reconstruct a natural-appearance 3D digital face.

  17. Real-Time and High-Resolution 3D Face Measurement via a Smart Active Optical Sensor

    PubMed Central

    You, Yong; Shen, Yang; Zhang, Guocai; Xing, Xiuwen

    2017-01-01

    The 3D measuring range and accuracy in traditional active optical sensing, such as Fourier transform profilometry, are influenced by the zero frequency of the captured patterns. The phase-shifting technique is commonly applied to remove the zero component. However, this phase-shifting method must capture several fringe patterns with phase differences, thereby degrading the real-time performance. This study introduces a smart active optical sensor in which a composite pattern is utilized. The composite pattern efficiently combines several phase-shifting fringes and carrier frequencies. The method can remove the zero frequency by using only one pattern. Model face reconstruction and human face measurement were employed to study the validity and feasibility of this method. Results show no distinct decrease in the precision of the novel method compared with the traditional phase-shifting method. The texture mapping technique was utilized to reconstruct a natural-appearance 3D digital face. PMID:28362349
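
    For readers unfamiliar with the phase-shifting baseline that the composite pattern emulates in a single shot, a minimal four-step sketch on synthetic data: the arctangent combination below cancels the zero-frequency term A.

        import numpy as np

        # Four patterns I_n = A + B*cos(phi + n*pi/2).
        rng = np.random.default_rng(9)
        phi_true = rng.uniform(-np.pi, np.pi, (64, 64))
        A, B = 0.5, 0.4
        I = [A + B * np.cos(phi_true + n * np.pi / 2) for n in range(4)]

        phi = np.arctan2(I[3] - I[1], I[0] - I[2])     # wrapped phase estimate
        err = np.angle(np.exp(1j * (phi - phi_true)))  # compare modulo 2*pi
        print("max wrapped-phase error:", np.abs(err).max())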

  18. Ultrasonographic percutaneous anatomy of the atlanto-occipital region and indirect ultrasound-guided cisternal puncture in the dog and the cat.

    PubMed

    Etienne, A-L; Audigié, F; Peeters, D; Gabriel, A; Busoni, V

    2015-04-01

    Cisternal puncture in dogs and cats is commonly carried out. This article describes the percutaneous ultrasound anatomy of the cisternal region in the dog and the cat and an indirect technique for ultrasound-guided cisternal puncture. Ultrasound images obtained ex vivo and in vivo were compared with anatomic sections and used to identify the landmarks for ultrasound-guided cisternal puncture. The ultrasound-guided procedure was established in cadavers and then applied in vivo in seven dogs and two cats. The anatomic landmarks for the ultrasound-guided puncture are the cisterna magna, the spinal cord, the two occipital condyles on transverse images, the external occipital crest and the dorsal arch of the first cervical vertebra on longitudinal images. Using these ultrasound anatomic landmarks, an indirect ultrasound-guided technique for cisternal puncture is applicable in the dog and the cat. © 2014 Blackwell Verlag GmbH.

  19. Controllable dissociations of PH3 molecules on Si(001)

    NASA Astrophysics Data System (ADS)

    Liu, Qin; Lei, Yanhua; Shao, Xiji; Ming, Fangfei; Xu, Hu; Wang, Kedong; Xiao, Xudong

    2016-04-01

    We demonstrate for the first time, to our knowledge, that controllable dissociation of the PH3 adsorption products PHx (x = 2, 1) can be realized by STM (scanning tunneling microscope) manipulation techniques at room temperature. Five dissociative products and their geometric structures are identified by combining STM experiments with first-principles calculations and simulations. In total we realize nine kinds of controllable dissociations by applying a voltage pulse among the PH3-related structures on Si(001). The dissociation rates of the five most common reactions are measured as a function of voltage by the I-t spectrum method. The sudden increase in the dissociation rate at 3.3 V indicates a transition from multivibrational excitation to single-step excitation induced by inelastic tunneling electrons. Our studies prove that selectively breaking the chemical bonds of a single molecule on a semiconductor surface by the STM manipulation technique is feasible.

  20. Overview of selected surrogate technologies for continuous suspended-sediment monitoring

    USGS Publications Warehouse

    Gray, J.R.; Gartner, J.W.

    2006-01-01

    Surrogate technologies for inferring selected characteristics of suspended sediments in surface waters are being tested by the U.S. Geological Survey and several partners with the ultimate goal of augmenting or replacing traditional monitoring methods. Optical properties of water such as turbidity and optical backscatter are the most commonly used surrogates for suspended-sediment concentration, but use of other techniques such as those based on acoustic backscatter, laser diffraction, digital photo-optic, and pressure-difference principles is increasing for concentration and, in some cases, particle-size distribution and flux determinations. The potential benefits of these technologies include acquisition of automated, continuous, quantifiably accurate data obtained with increased safety and at less expense. When suspended-sediment surrogate data meet consensus accuracy criteria and appropriate sediment-record computation techniques are applied, these technologies have the potential to revolutionize the way fluvial-sediment data are collected, analyzed, and disseminated.

  1. A modified active appearance model based on an adaptive artificial bee colony.

    PubMed

    Abdulameer, Mohammed Hasan; Sheikh Abdullah, Siti Norul Huda; Othman, Zulaiha Ali

    2014-01-01

    Active appearance model (AAM) is one of the most popular model-based approaches that have been extensively used to extract features by highly accurate modeling of human faces under various physical and environmental circumstances. However, in such an active appearance model, fitting the model to the original image is a challenging task. The state of the art shows that optimization methods can resolve this problem, although applying optimization introduces difficulties of its own. Hence, in this paper we propose an AAM-based face recognition technique that is capable of resolving the fitting problem of AAM by introducing a new adaptive artificial bee colony (ABC) algorithm. The adaptation increases the efficiency of fitting compared with the conventional ABC algorithm. We have used three datasets in our experiments: the CASIA dataset, the property 2.5D face dataset, and the UBIRIS v1 images dataset. The results reveal that the proposed face recognition technique performs effectively in terms of recognition accuracy.
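
    A bare-bones artificial bee colony sketch minimizing a toy objective (the standard employed/onlooker/scout scheme; the adaptive step-size control that the paper adds is omitted here).

        import numpy as np

        def abc_minimize(f, lo, hi, n_food=20, limit=30, n_iter=200, seed=0):
            rng = np.random.default_rng(seed)
            dim = lo.size
            X = rng.uniform(lo, hi, (n_food, dim))        # food sources
            fit = np.array([f(x) for x in X])
            trials = np.zeros(n_food, int)
            for _ in range(n_iter):
                for phase in ("employed", "onlooker"):
                    if phase == "employed":
                        idx = np.arange(n_food)
                    else:                          # onlookers favor good sources
                        p = fit.max() - fit + 1e-12
                        idx = rng.choice(n_food, n_food, p=p / p.sum())
                    for i in idx:
                        k = rng.integers(n_food - 1); k += k >= i  # partner != i
                        j = rng.integers(dim)
                        cand = X[i].copy()
                        cand[j] += rng.uniform(-1, 1) * (X[i, j] - X[k, j])
                        cand = np.clip(cand, lo, hi)
                        fc = f(cand)
                        if fc < fit[i]:
                            X[i], fit[i], trials[i] = cand, fc, 0
                        else:
                            trials[i] += 1
                scouts = trials > limit            # abandon stale sources
                X[scouts] = rng.uniform(lo, hi, (int(scouts.sum()), dim))
                fit[scouts] = [f(x) for x in X[scouts]]
                trials[scouts] = 0
            return X[fit.argmin()], fit.min()

        best, val = abc_minimize(lambda x: np.sum(x**2),
                                 np.full(4, -5.0), np.full(4, 5.0))
        print(best.round(3), "f =", val)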

  2. Numerical tilting compensation in microscopy based on wavefront sensing using transport of intensity equation method

    NASA Astrophysics Data System (ADS)

    Hu, Junbao; Meng, Xin; Wei, Qi; Kong, Yan; Jiang, Zhilong; Xue, Liang; Liu, Fei; Liu, Cheng; Wang, Shouyu

    2018-03-01

    Wide-field microscopy is commonly used for sample observations in biological research and medical diagnosis. However, the tilting error induced by the oblique location of the image recorder or the sample, as well as by the inclination of the optical path, often degrades the imaging quality. In order to eliminate tilting in microscopy, a numerical tilting compensation technique based on wavefront sensing using the transport of intensity equation method is proposed in this paper. Both numerical simulations and practical experiments prove that the proposed technique not only accurately determines the tilting angle with a simple setup and procedure, but also compensates the tilting error to improve imaging quality even in cases of large tilt. Considering its simple system and operation, as well as its capability to improve image quality, it is believed the proposed method can be applied for tilting compensation in optical microscopy.
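
    A simplified sketch of the kind of TIE solver such methods build on (uniform-intensity assumption, FFT-based Poisson inversion; not the authors' full pipeline). Once the phase is recovered, a tilt appears as a linear ramp that the numerical compensation step can fit and subtract.

        import numpy as np

        def tie_phase(I_minus, I_plus, dz, wavelength, pixel):
            """Phase from two defocused intensities via the transport of
            intensity equation, assuming nearly uniform intensity."""
            k = 2 * np.pi / wavelength
            dIdz = (I_plus - I_minus) / (2 * dz)
            rhs = -k * dIdz / (0.5 * (I_plus + I_minus)).mean()
            ny, nx = rhs.shape
            fx = np.fft.fftfreq(nx, pixel); fy = np.fft.fftfreq(ny, pixel)
            f2 = (2 * np.pi) ** 2 * (fx[None, :] ** 2 + fy[:, None] ** 2)
            f2[0, 0] = 1.0                     # placeholder; DC zeroed below
            F = np.fft.fft2(rhs) / (-f2)
            F[0, 0] = 0.0                      # constant offset is undetermined
            return np.real(np.fft.ifft2(F))

        rng = np.random.default_rng(10)
        I0 = np.ones((64, 64))
        phi = tie_phase(I0 - 0.01 * rng.random((64, 64)),
                        I0 + 0.01 * rng.random((64, 64)),
                        dz=2e-6, wavelength=532e-9, pixel=0.5e-6)
        print(phi.shape)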

  3. Application of parallel distributed Lagrange multiplier technique to simulate coupled Fluid-Granular flows in pipes with varying Cross-Sectional area

    DOE PAGES

    Kanarska, Yuliya; Walton, Otis

    2015-11-30

    Fluid-granular flows are common phenomena in nature and industry. Here, an efficient computational technique based on the distributed Lagrange multiplier method is utilized to simulate complex fluid-granular flows. Each particle is explicitly resolved on an Eulerian grid as a separate domain, using solid volume fractions. The fluid equations are solved through the entire computational domain; however, Lagrange multiplier constraints are applied inside the particle domain such that the fluid within any volume associated with a solid particle moves as an incompressible rigid body. The particle-particle interactions are implemented using explicit force-displacement interactions for frictional inelastic particles, similar to the DEM method with some modifications, using the volume of the overlapping region as an input to the contact forces. A parallel implementation of the method is based on the SAMRAI (Structured Adaptive Mesh Refinement Application Infrastructure) library.

  4. Tissue artifact removal from respiratory signals based on empirical mode decomposition.

    PubMed

    Liu, Shaopeng; Gao, Robert X; John, Dinesh; Staudenmayer, John; Freedson, Patty

    2013-05-01

    On-line measurement of respiration plays an important role in monitoring human physical activities. Such measurement commonly employs sensing belts secured around the rib cage and abdomen of the test subject. Affected by the movement of body tissues, respiratory signals typically have a low signal-to-noise ratio. Removing tissue artifacts is therefore critical to ensuring effective respiration analysis. This paper presents a signal decomposition technique for tissue artifact removal from respiratory signals, based on empirical mode decomposition (EMD). An algorithm based on mutual information and power criteria was devised to automatically select appropriate intrinsic mode functions for tissue artifact removal and respiratory signal reconstruction. Performance of the EMD-based algorithm was evaluated through simulations and real-life experiments (N = 105). Comparison with the low-pass filtering that has conventionally been applied confirmed the effectiveness of the technique in tissue artifact removal.
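
    A hedged sketch of EMD-based artifact removal (the PyEMD package, pip name "EMD-signal", is an assumed dependency, and a crude dominant-frequency rule stands in for the paper's mutual-information and power criteria).

        import numpy as np
        from PyEMD import EMD

        fs = 25.0
        t = np.arange(0, 40, 1 / fs)
        breath = np.sin(2 * np.pi * 0.3 * t)                    # ~18 breaths/min
        artifact = 0.6 * np.sin(2 * np.pi * 2.5 * t) * (t > 20) # movement burst
        x = breath + artifact + 0.05 * np.random.default_rng(11).normal(size=t.size)

        imfs = EMD().emd(x)

        # Keep IMFs whose dominant frequency is in a plausible breathing band.
        keep = []
        for imf in imfs:
            spec = np.abs(np.fft.rfft(imf))
            f_dom = np.fft.rfftfreq(imf.size, 1 / fs)[spec.argmax()]
            if 0.1 <= f_dom <= 0.7:
                keep.append(imf)
        clean = np.sum(keep, axis=0)
        print("%d of %d IMFs retained" % (len(keep), len(imfs)))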

  5. Agent-based modeling: a new approach for theory building in social psychology.

    PubMed

    Smith, Eliot R; Conrey, Frederica R

    2007-02-01

    Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.
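
    A minimal ABM sketch with invented rules (not from the article): agents on a ring repeatedly adopt their local majority opinion, and large-scale consensus emerges from the repeated interactions rather than from any global rule.

        import numpy as np

        rng = np.random.default_rng(12)
        n, radius, steps = 200, 3, 20000
        opinion = rng.integers(0, 2, n)          # binary opinions on a ring

        for _ in range(steps):
            i = rng.integers(n)
            nbrs = [(i + d) % n for d in range(-radius, radius + 1) if d != 0]
            local = opinion[nbrs].mean()
            if local != 0.5:                     # ties leave the agent unchanged
                opinion[i] = int(local > 0.5)

        blocks = np.count_nonzero(np.diff(opinion)) + 1
        print("opinion blocks remaining:", blocks)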

  6. Automatic limb identification and sleeping parameters assessment for pressure ulcer prevention.

    PubMed

    Baran Pouyan, Maziyar; Birjandtalab, Javad; Nourani, Mehrdad; Matthew Pompeo, M D

    2016-08-01

    Pressure ulcers (PUs) are common among vulnerable patients such as the elderly, the bedridden and diabetics. PUs are very painful for patients and costly for hospitals and nursing homes. Assessment of sleeping parameters on at-risk limbs is critical for ulcer prevention. An effective assessment depends on automatic identification and tracking of at-risk limbs. An accurate limb identification can be used to analyze the pressure distribution and assess the risk for each limb. In this paper, we propose a graph-based clustering approach to extract the body limbs from pressure data collected by a commercial pressure map system. A robust signature-based technique is employed to automatically label each limb. Finally, an assessment technique is applied to evaluate the stress experienced by each limb over time. The experimental results indicate high performance, with more than 94% average accuracy for the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
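
    A simplified sketch of the segmentation step only, with connected-component labeling standing in for the paper's graph-based clustering and signature-based labeling; the pressure map is synthetic.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(13)
        frame = rng.random((64, 32)) * 0.2        # mostly empty mattress
        frame[10:20, 5:12] += 1.0                 # e.g., a heel
        frame[40:55, 18:28] += 1.5                # e.g., the sacrum

        labels, n_regions = ndimage.label(frame > 0.8)
        for r in range(1, n_regions + 1):
            cells = labels == r
            print("region %d: area %d cells, total load %.1f"
                  % (r, cells.sum(), frame[cells].sum()))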

  7. Theta-burst microstimulation in the human entorhinal area improves memory specificity.

    PubMed

    Titiz, Ali S; Hill, Michael R H; Mankin, Emily A; M Aghajan, Zahra; Eliashiv, Dawn; Tchemodanov, Natalia; Maoz, Uri; Stern, John; Tran, Michelle E; Schuette, Peter; Behnke, Eric; Suthana, Nanthia A; Fried, Itzhak

    2017-10-24

    The hippocampus is critical for episodic memory, and synaptic changes induced by long-term potentiation (LTP) are thought to underlie memory formation. In rodents, hippocampal LTP may be induced through electrical stimulation of the perforant path. To test whether similar techniques could improve episodic memory in humans, we implemented a microstimulation technique that allowed delivery of low-current electrical stimulation via 100-μm-diameter microelectrodes. As thirteen neurosurgical patients performed a person recognition task, microstimulation was applied in a theta-burst pattern, shown to optimally induce LTP. Microstimulation in the right entorhinal area during learning significantly improved subsequent memory specificity for novel portraits; participants were able both to recognize previously viewed photos and to reject similar lures. These results suggest that microstimulation with physiologic-level currents, a radical departure from commonly used deep brain stimulation protocols, is sufficient to modulate human behavior and provides an avenue for refined interrogation of the circuits involved in human memory.

  8. 3D FISH to analyse gene domain-specific chromatin re-modeling in human cancer cell lines.

    PubMed

    Kocanova, Silvia; Goiffon, Isabelle; Bystricky, Kerstin

    2018-06-01

    Fluorescence in situ hybridization (FISH) is a common technique used to label DNA and/or RNA for detection of a genomic region of interest. However, the technique can be challenging, in particular when applied to single genes in human cancer cells. Here, we provide a step-by-step protocol for analysis of short (35 kb-300 kb) genomic regions in three dimensions (3D). We discuss the experimental design and provide practical considerations for 3D imaging and data analysis to determine chromatin folding. We demonstrate that 3D FISH using BACs (Bacterial Artificial Chromosomes) or fosmids can provide detailed information of the architecture of gene domains. More specifically, we show that mapping of specific chromatin landscapes informs on changes associated with estrogen stimulated gene activity in human breast cancer cell lines. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Metagenomic applications in environmental monitoring and bioremediation.

    PubMed

    Techtmann, Stephen M; Hazen, Terry C

    2016-10-01

    With the rapid advances in sequencing technology, the cost of sequencing has dropped dramatically and the scale of sequencing projects has increased accordingly. This has provided the opportunity for the routine use of sequencing techniques in the monitoring of environmental microbes. While metagenomic applications have been routinely applied to better understand the ecology and diversity of microbes, their use in environmental monitoring and bioremediation is increasingly common. In this review we seek to provide an overview of some of the metagenomic techniques used in environmental systems biology, addressing their applications and limitations. We also provide several recent examples of the application of metagenomics to bioremediation. We discuss examples where microbial communities have been used to predict the presence and extent of contamination, examples of how metagenomics can be used to characterize the process of natural attenuation by unculturable microbes, as well as examples detailing the use of metagenomics to understand the impact of biostimulation on microbial communities.

  10. Microstructural and Defect Characterization in Ceramic Composites Using an Ultrasonic Guided Wave Scan System

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Cosgriff, L. M.; Martin, R. E.; Verrilli, M. J.; Bhatt, R. T.

    2003-01-01

    In this study, an ultrasonic guided wave scan system was used to characterize various microstructural and flaw conditions in two types of ceramic matrix composites, SiC/SiC and C/SiC. Rather than attempting to isolate specific Lamb wave modes to use for characterization (as is desired for many types of guided wave inspection problems), the guided wave scan system utilizes the total (multi-mode) ultrasonic response in its inspection analysis. Several time- and frequency-domain parameters are calculated from the ultrasonic guided wave signal at each scan location to form images. Microstructural and defect conditions examined include delamination, density variation, cracking, and pre- and post-infiltration states. Results are compared with thermographic imaging methods. Although the guided wave technique is commonly used so that scanning can be eliminated, applying the technique in scanning mode allows a more precise characterization of defect conditions.
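
    The scan-mode approach lends itself to a short illustration. Below is a minimal sketch, in Python, of computing one time-domain and one frequency-domain parameter per recorded waveform; the parameter choices and names are assumptions for illustration, not the paper's actual parameter set.

        import numpy as np

        def waveform_parameters(signal, fs):
            # Two illustrative parameters per scan location; the system
            # described above computes several such time- and
            # frequency-domain values to form one image per parameter.
            rms = np.sqrt(np.mean(signal ** 2))               # time domain
            spectrum = np.abs(np.fft.rfft(signal))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            centroid = np.sum(freqs * spectrum) / np.sum(spectrum)  # freq. domain
            return rms, centroid

        # Evaluating these at every (x, y) scan position produces parameter
        # images in which delaminations or density variations show as contrast.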

  11. Cerebral Microbleeds: A Field Guide to their Detection and Interpretation

    PubMed Central

    Greenberg, Steven M.; Vernooij, Meike W.; Cordonnier, Charlotte; Viswanathan, Anand; Salman, Rustam Al-Shahi; Warach, Steven; Launer, Lenore J.; Van Buchem, Mark A.; Breteler, Monique M.B.

    2012-01-01

    Cerebral microbleeds (CMB) are increasingly recognized neuroimaging findings, occurring with cerebrovascular disease, dementia, and normal aging. Recent years have seen substantial progress, particularly in developing newer MRI methodologies for CMB detection and applying them to population-based elderly samples. This review focuses on these recent developments and their impact on two major questions: how CMB are detected, and how they should be interpreted. There is now ample evidence that the prevalence and number of detected CMB vary with MRI characteristics such as pulse sequence, sequence parameters, spatial resolution, magnetic field strength, and post-processing, underlining the importance of MRI technique in interpreting studies. Recent investigations using sensitive techniques find the prevalence of CMB detected in community-dwelling elderly to be surprisingly high. We propose procedural guidelines for identifying CMB and suggest possible future approaches for elucidating the role of these common lesions as markers for, and potential contributors to, small vessel brain disease. PMID:19161908

  12. Use of New Techniques in Addition to IHC Applied to the Diagnosis of Melanocytic Lesions, With Emphasis on CGH, FISH, and Mass Spectrometry.

    PubMed

    Nagarajan, P; Tetzlaff, M T; Curry, J L; Prieto, V G

    Melanoma remains one of the most aggressive forms of cutaneous malignancy. While its diagnosis based on histologic parameters is straightforward in most cases, distinguishing a melanoma from a melanocytic nevus can be challenging in some instances, especially when there are overlapping clinical and histopathologic features. Occasionally, melanomas can histologically mimic other tumors, and even demonstration of melanocytic origin can be challenging. Thus, several ancillary tests may be employed to arrive at the correct diagnosis. The objective of this review is to summarize these tests, including well-established and commonly used ones such as immunohistochemistry, with specific emphasis on emerging techniques such as comparative genomic hybridization, fluorescence in situ hybridization, and imaging mass spectrometry. Copyright © 2016 AEDV. Publicado por Elsevier España, S.L.U. All rights reserved.

  13. Preimplantation development of somatic cell cloned embryos in the common marmoset (Callithrix jacchus).

    PubMed

    Sotomaru, Yusuke; Hirakawa, Reiko; Shimada, Akiko; Shiozawa, Seiji; Sugawara, Ayako; Oiwa, Ryo; Nobukiyo, Asako; Okano, Hideyuki; Tamaoki, Norikazu; Nomura, Tatsuji; Hiyama, Eiso; Sasaki, Erika

    2009-12-01

    The somatic cell nuclear transfer technique has been applied to various mammals to produce cloned animals; however, a standardized method is not applicable to all species. We aimed here to develop optimum procedures for somatic cell cloning in nonhuman primates, using common marmosets. First, we confirmed that parthenogenetic activation of in vitro matured oocytes was successfully induced by electrical stimulation (three cycles of 150 V/mm, 50 microsec x 2, 20 min intervals), and this condition was applied to the egg activation procedure in the subsequent experiments. Next, nuclear transfer to recipient enucleated oocytes was performed 1 h before, immediately after, or 1 h after egg activation treatment. The highest developmental rate was observed when nuclear transfer was performed 1 h before activation, but none of the cloned embryos developed beyond the eight-cell stage. To investigate the causes of the low developmental potential of cloned embryos, a study was performed to determine whether the presence of metaphase II (MII) chromosome in recipient ooplasm has an effect on developmental potential. As a result, only tetraploid cloned embryos produced by transferring a donor cell into a recipient bearing the MII chromosome developed into blastocysts (66.7%). In contrast, neither parthenogenetic embryos nor cloned embryos (whether diploid or tetraploid) produced using enucleated oocytes developed past the eight-cell stage. These results suggest that MII chromosome, or cytoplasm proximal to the MII chromosome, plays a major role in the development of cloned embryos in common marmosets.

  14. Stress-Constrained Structural Topology Optimization with Design-Dependent Loads

    NASA Astrophysics Data System (ADS)

    Lee, Edmund

    Topology optimization is commonly used to distribute a given amount of material to obtain the stiffest structure under predefined fixed loads. The present work investigates the result of applying stress constraints to topology optimization for problems with design-dependent loading, such as self-weight and pressure. In order to apply pressure loading, a material boundary identification scheme is proposed, iteratively connecting points of equal density. In previous research, design-dependent loading problems have been limited to compliance minimization. The present study employs a more practical approach by minimizing mass subject to failure constraints, and uses a stress relaxation technique to avoid stress constraint singularities. The results show that these design-dependent loading problems may converge to a local minimum when stress constraints are enforced. Comparisons between compliance minimization solutions and stress-constrained solutions are also given. The resulting topologies of these two solutions are usually vastly different, demonstrating the need for stress-constrained topology optimization.
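
    The stress relaxation step admits a brief sketch. The function below shows one common relaxation form (a 'qp'-style scaling of element stress by a fractional power of density; the exponent and all names are illustrative assumptions, not taken from this work): the constraint fades smoothly as density goes to zero, so the optimizer can remove material without encountering singular optima.

        import numpy as np

        def relaxed_stress_constraint(rho, sigma_vm, sigma_limit, q=0.5):
            # Feasible where the returned value is <= 0. Scaling the element
            # von Mises stress by rho**q (0 < q < 1) relaxes the constraint
            # in low-density elements.
            return (rho ** q) * sigma_vm / sigma_limit - 1.0

        rho = np.array([1.0, 0.5, 0.01])            # element densities
        sigma_vm = np.array([200.0, 180.0, 400.0])  # element stresses, MPa
        print(relaxed_stress_constraint(rho, sigma_vm, sigma_limit=250.0))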

  15. Metabolomics for laboratory diagnostics.

    PubMed

    Bujak, Renata; Struck-Lewicka, Wiktoria; Markuszewski, Michał J; Kaliszan, Roman

    2015-09-10

    Metabolomics is an emerging approach in the systems biology field. Due to continuous development in advanced analytical techniques and in bioinformatics, metabolomics has been extensively applied as a novel, holistic diagnostic tool in clinical and biomedical studies. Measurement of the metabolome, a chemical reflection of the current phenotype of a particular biological system, is now frequently used to understand pathophysiological processes involved in disease progression and to search for new diagnostic or prognostic biomarkers of various disorders. In this review, we discuss the research strategies and analytical platforms commonly applied in metabolomics studies. The applications of metabolomics in laboratory diagnostics in the last 5 years are also reviewed according to the type of biological sample used in metabolome analysis. We also discuss some limitations and further improvements that should be considered, bearing in mind the potential applications of metabolomic research and practice. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Bisphenols: Application, occurrence, safety, and biodegradation mediated by bacterial communities in wastewater treatment plants and rivers.

    PubMed

    Noszczyńska, Magdalena; Piotrowska-Seget, Zofia

    2018-06-01

    Numerous data indicate that most bisphenols (BPs) are endocrine disruptors and exhibit cytotoxicity, neurotoxicity, genotoxicity and reproductive toxicity in vertebrates. Nevertheless, they are widely used in materials production, which results in their ubiquitous occurrence in ecosystems. While BPA is the bisphenol most frequently detected in the environment, BPAF, BPF and BPS are also often found. Industrial and municipal wastewater is particularly exposed to BP pollution and is a common source of BPA in river waters. Different techniques to remove BPs from these ecosystems have been applied, among which biodegradation appears to be the most effective. This review presents the current state of knowledge on BP applications, distribution in the environment, effects on animal and human health, and biodegradation mediated by bacterial populations in wastewater treatment plants and rivers. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Mass spectrometry in plant metabolomics strategies: from analytical platforms to data acquisition and processing.

    PubMed

    Ernst, Madeleine; Silva, Denise Brentan; Silva, Ricardo Roberto; Vêncio, Ricardo Z N; Lopes, Norberto Peporine

    2014-06-01

    Covering: up to 2013. Plant metabolomics is a relatively recent research field that has gained increasing interest in the past few years. To date, numerous review articles and guide books on the subject have been published. This review article focuses on the current applications and limitations of modern mass spectrometry techniques, especially in combination with electrospray ionisation (ESI), the ionisation method most commonly applied in metabolomics studies. As a possible alternative to ESI, perspectives on matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) in metabolomics studies are introduced, a method which is still not widespread in the field. In metabolomics studies the results must always be interpreted in the context of the applied sampling procedures as well as the data analysis. Different sampling strategies are introduced, and the importance of data analysis is illustrated using the example of metabolic network modelling.

  18. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data

    PubMed Central

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.

    2015-01-01

    Modern statistical methods for incomplete data have been increasingly applied to a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used for evaluating diagnostic tests or biomarkers in medical research, has seen increasing development and application. While missing-data methods have been applied in ROC analysis, the impact of mis-specification of the imputation model and/or the assumptions underlying the missing data (e.g. missing at random) has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
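
    The MI-then-ROC workflow studied here can be sketched in a few lines. The sketch below assumes scikit-learn's IterativeImputer as the imputation engine and synthetic data with covariate-dependent (missing-at-random) missingness; neither is the authors' actual software or data.

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 500
        biomarker = rng.normal(size=n)
        disease = (biomarker + rng.normal(size=n) > 0).astype(int)
        covariate = biomarker + rng.normal(scale=0.5, size=n)

        # Impose missingness that depends on the observed covariate (MAR).
        x = biomarker.copy()
        x[rng.random(n) < 1 / (1 + np.exp(-covariate))] = np.nan

        # Multiple imputation: m completed data sets, AUC estimated in each,
        # then combined (Rubin's rules would also pool the variances).
        aucs = []
        for m in range(20):
            imputer = IterativeImputer(sample_posterior=True, random_state=m)
            completed = imputer.fit_transform(np.column_stack([x, covariate]))
            aucs.append(roc_auc_score(disease, completed[:, 0]))
        print(f"MI estimate of AUC: {np.mean(aucs):.3f} +/- {np.std(aucs):.3f}")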

  19. Characterisation of a cryostat using simultaneous, single-beam multiple-surface laser vibrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kissinger, Thomas; Charrett, Thomas O. H.; James, Stephen W.

    2016-06-28

    A novel range-resolved interferometric signal processing technique that uses sinusoidal optical frequency modulation is applied to multi-surface vibrometry, demonstrating simultaneous optical measurements of vibrations on two surfaces using a single, collimated laser beam, with a minimum permissible distance of 3.5 cm between surfaces. The current system, using a cost-effective laser diode and a fibre-coupled, downlead-insensitive setup, allows an interferometric fringe rate of up to 180 kHz to be resolved with typical displacement noise levels of 8 pm·Hz^(-1/2). In this paper, the system is applied to vibrometry measurements of a table-top cryostat, with concurrent measurements of the optical window and the sample holder target inside. This allows the separation of common-mode vibrations of the whole cryostat from differential vibrations between the window and the target, allowing any resonances to be identified.

  20. Taxonomy of segmental myocardial systolic dysfunction.

    PubMed

    McDiarmid, Adam K; Pellicori, Pierpaolo; Cleland, John G; Plein, Sven

    2017-04-01

    The terms used to describe different states of myocardial health and disease are poorly defined. Imprecision and inconsistency in nomenclature can lead to difficulty in interpreting and applying trial outcomes to clinical practice. In particular, the terms 'viable' and 'hibernating' are commonly applied interchangeably and incorrectly to myocardium that exhibits chronic contractile dysfunction in patients with ischaemic heart disease. The range of inherent differences amongst imaging modalities used to define myocardial health and disease adds further challenges to consistent definitions. The results of several large trials have led to renewed discussion about the classification of dysfunctional myocardial segments. This article aims to describe the diverse pathologies that may affect the myocardium in ischaemic heart disease and cardiomyopathy, and how they may be assessed with non-invasive imaging techniques, in order to provide a taxonomy of myocardial dysfunction. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Cardiology.

  1. Monte Carlo estimation of total variation distance of Markov chains on large spaces, with application to phylogenetics.

    PubMed

    Herbei, Radu; Kubatko, Laura

    2013-03-26

    Markov chains are widely used for modeling in many areas of molecular biology and genetics. As the complexity of such models advances, it becomes increasingly important to assess the rate at which a Markov chain converges to its stationary distribution in order to carry out accurate inference. A common measure of convergence to the stationary distribution is the total variation distance, but this measure can be difficult to compute when the state space of the chain is large. We propose a Monte Carlo method to estimate the total variation distance that can be applied in this situation, and we demonstrate how the method can be efficiently implemented by taking advantage of GPU computing techniques. We apply the method to two Markov chains on the space of phylogenetic trees, and discuss the implications of our findings for the development of algorithms for phylogenetic inference.
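
    The quantity being estimated can be illustrated on a toy chain. The naive plug-in estimator sketched below (simulate many independent chains, compare the empirical law at time t with the stationary distribution) is exactly what becomes impractical on large state spaces such as phylogenetic tree space, which motivates the paper's method; the chain and all settings here are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy 3-state chain; pi is its stationary distribution (left
        # eigenvector of P for eigenvalue 1, normalized to sum to one).
        P = np.array([[0.5, 0.4, 0.1],
                      [0.2, 0.6, 0.2],
                      [0.1, 0.4, 0.5]])
        evals, evecs = np.linalg.eig(P.T)
        pi = np.real(evecs[:, np.argmax(np.real(evals))])
        pi /= pi.sum()

        def mc_tv_distance(P, pi, x0, t, n_chains=100_000):
            # Run n_chains copies for t steps, then compute
            # 0.5 * sum |empirical - pi| over all states.
            states = np.full(n_chains, x0)
            for _ in range(t):
                u = rng.random(n_chains)
                cdf = np.cumsum(P[states], axis=1)
                states = (u[:, None] > cdf).sum(axis=1)
            empirical = np.bincount(states, minlength=len(pi)) / n_chains
            return 0.5 * np.abs(empirical - pi).sum()

        for t in (1, 5, 10):
            print(t, mc_tv_distance(P, pi, x0=0, t=t))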

  2. Extractive-spectrophotometric determination of disopyramide and irbesartan in their pharmaceutical formulation

    NASA Astrophysics Data System (ADS)

    Abdellatef, Hisham E.

    2007-04-01

    Picric acid, bromocresol green, bromothymol blue, cobalt thiocyanate and molybdenum(V) thiocyanate have been tested as spectrophotometric reagents for the determination of disopyramide and irbesartan. Reaction conditions have been optimized to obtain coloured complexes with higher sensitivity and longer stability. The absorbance of the ion-pair complexes formed was found to increase linearly with increasing concentrations of disopyramide and irbesartan, as corroborated by the correlation coefficient values. The developed methods have been successfully applied to the determination of disopyramide and irbesartan in bulk drugs and pharmaceutical formulations. Common excipients and additives did not interfere in the determination. The results obtained by the proposed methods have been statistically compared by means of Student's t-test and the variance-ratio F-test. The validity was assessed by applying the standard addition technique. The results were compared statistically with the official or reference methods, showing good agreement with high precision and accuracy.
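
    The standard addition technique used for validation admits a simple worked example: absorbance is regressed on the added analyte concentration, and because A = slope*(C_added + C_x), the unknown concentration follows as C_x = intercept/slope. All readings below are hypothetical, not the paper's data.

        import numpy as np

        # Hypothetical absorbances for aliquots spiked with known amounts
        # of analyte (concentrations in ug/mL).
        added = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
        absorbance = np.array([0.210, 0.342, 0.471, 0.598, 0.729])

        slope, intercept = np.polyfit(added, absorbance, 1)
        c_unknown = intercept / slope   # magnitude of the x-intercept
        print(f"estimated sample concentration: {c_unknown:.2f} ug/mL")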

  3. Computer-based objective quantitative assessment of pulmonary parenchyma via x-ray CT

    NASA Astrophysics Data System (ADS)

    Uppaluri, Renuka; McLennan, Geoffrey; Sonka, Milan; Hoffman, Eric A.

    1998-07-01

    This paper is a review of our recent studies using a texture-based tissue characterization method called the Adaptive Multiple Feature Method (AMFM). This computerized method is automated and performs tissue classification based upon training acquired on a set of representative examples. The AMFM has been applied to several different discrimination tasks involving normal subjects, subjects with interstitial lung disease, smokers, asbestos-exposed subjects, and subjects with cystic fibrosis. The AMFM has also been applied to data acquired using different scanners and scanning protocols. The AMFM has been shown to be successful and better than other existing techniques in discriminating the tissues under consideration. We demonstrate that the AMFM is considerably more sensitive and specific in characterizing the lung, especially in the presence of mixed pathology, as compared with more commonly used methods. Evidence is presented suggesting that the AMFM is highly sensitive to some of the earliest disease processes.

  4. Evolutionary perspectives on wildlife disease: concepts and applications

    PubMed Central

    Vander Wal, Eric; Garant, Dany; Pelletier, Fanie

    2014-01-01

    Wildlife disease has the potential to cause significant ecological, socioeconomic, and health impacts. As a result, all available tools need to be employed when host–pathogen dynamics merit conservation or management interventions. Evolutionary principles, such as evolutionary history, phenotypic and genetic variation, and selection, have the potential to unravel many of the complex ecological realities of infectious disease in the wild. Despite this, their application to wildlife disease ecology and management remains in its infancy. In this article, we outline the impetus behind applying evolutionary principles to disease ecology and management issues in the wild. We then introduce the articles from this special issue on Evolutionary Perspectives on Wildlife Disease: Concepts and Applications, outlining how each exemplifies a practical wildlife disease challenge that can be enlightened by applied evolution. Ultimately, we aim to bring new insights to wildlife disease ecology and its management using tools and techniques commonly employed in evolutionary ecology. PMID:25469154

  5. Adaptive suppression of power line interference in ultra-low field magnetic resonance imaging in an unshielded environment

    NASA Astrophysics Data System (ADS)

    Huang, Xiaolei; Dong, Hui; Qiu, Yang; Li, Bo; Tao, Quan; Zhang, Yi; Krause, Hans-Joachim; Offenhäusser, Andreas; Xie, Xiaoming

    2018-01-01

    Power-line harmonic interference and fixed-frequency noise peaks may cause stripe artifacts in ultra-low field (ULF) magnetic resonance imaging (MRI) in an unshielded environment and in a conductively shielded room. In this paper we describe an adaptive suppression method to eliminate these artifacts in MRI images. The technique utilizes the spatial correlation of the interference at different positions, and is realized by subtracting the outputs of the reference channel(s) from those of the signal channel(s) using wavelet analysis and the least squares method. The adaptive suppression method is first implemented to remove the image artifacts in simulation. We then experimentally demonstrate the feasibility of this technique by adding three orthogonal superconducting quantum interference device (SQUID) magnetometers as reference channels to compensate the output of one 2nd-order gradiometer. The experimental results show great improvement in imaging quality in both 1D and 2D MRI images at two common imaging frequencies, 1.3 kHz and 4.8 kHz. At both frequencies, the effective compensation bandwidth is as high as 2 kHz. Furthermore, we examine the longitudinal relaxation times of the same sample before and after compensation, and show that the MRI properties of the sample did not change after applying adaptive suppression. This technique can effectively increase the imaging bandwidth and can be applied to ULF MRI detected by either SQUIDs or a Faraday coil, in both an unshielded environment and a conductively shielded room.
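
    The compensation step can be sketched as a least-squares fit of the signal channel onto the reference channels, with the residual kept as the cleaned signal. This shows only the broadband core of the method; the published technique applies the fit within wavelet sub-bands, and the array shapes and names below are assumptions.

        import numpy as np

        def subtract_references(signal, references):
            # signal:     shape (T,)   -- e.g. the gradiometer output
            # references: shape (R, T) -- e.g. the magnetometer channels
            # Fit signal ~ references by least squares, keep the residual.
            coeffs, *_ = np.linalg.lstsq(references.T, signal, rcond=None)
            return signal - references.T @ coeffs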

  6. Improved recovery of Listeria monocytogenes from stainless steel and polytetrafluoroethylene surfaces using air/water ablation.

    PubMed

    Gião, M S; Blanc, S; Porta, S; Belenguer, J; Keevil, C W

    2015-07-01

    To develop a gentle ablation technique to recover Listeria monocytogenes biofilms from stainless steel (SS) and polytetrafluoroethylene (PTFE) surfaces using compressed air and water injection. Biofilms were grown for 4, 24 and 48 h or 7 days, and a compressed air and water flow at 2, 3 and 4 bars was applied for cell removal. Collected cells were quantified for total and dead cells by SYTO 9/PI double staining, and cultivable populations were determined by plating onto brain heart infusion (BHI) agar, while coupon surfaces were also stained with DAPI to quantify the remaining cells in situ. The recovery efficiency was compared to that of conventional swabbing. Results showed that air/water ablation is able to collect up to 98.6% of cells from SS surfaces, while swabbing only recovered 11.2% of the biofilm. Moreover, air/water ablation recovered 99.9% of cells from PTFE surfaces. The high recovery rate achieved by this technique, along with the fact that cells were able to retain membrane integrity and cultivability, indicates that this device is suitable for the gentle recovery of viable L. monocytogenes biofilm cells. This work presents a highly efficient technique to remove, collect and quantify L. monocytogenes from surfaces commonly used in the food industry, which can thus serve as an important aid in verifying cleaning and sanitation as well as in reducing the likelihood of cross-contamination events. © 2015 The Society for Applied Microbiology.

  7. Non-invasive therapeutic use of High-Intensity Focused Ultrasound (HIFU) with 3 Tesla Magnetic Resonance Imaging in women with symptomatic uterine fibroids.

    PubMed

    Łoziński, Tomasz; Filipowska, Justyna; Gurynowicz, Grzegorz; Gabriel, Iwona; Czekierdowski, Artur

    2017-01-01

    Benign uterine fibroids are common female genital tract tumors that, if symptomatic, often require extensive surgery. When tumors are multiple and large, or unusually located, operative treatment may lead to significant morbidity and compromise quality of life. The recovery period after surgical treatment may be complicated by the patient's medical condition and wound-healing problems. Other currently used non-surgical treatment modalities usually provide only temporary symptom relief and may not be effective in all affected women. In the last decade, a minimally invasive treatment of uterine fibroids called Magnetic Resonance guided High-Intensity Focused Ultrasound (MRI HIFU) was introduced. This technique uses thermal ablation simultaneously with MRI imaging of the mass and tissue temperature measurements during the procedure, in which a focused ultrasound beam is applied externally to destroy tumors located in the human body. Successful application of MRI HIFU has recently been described in patients with various malignancies, such as breast, prostate and hepatocellular cancers as well as soft tissue and bone tumors. This technique is innovative and has been proven to be safe and effective, but there are several limitations to treatment. The article highlights the relative advantages and disadvantages of MRI guided HIFU in women with uterine fibroids. The authors also describe the high-resolution 3T MRI technique, along with the approach to interpretation of HIFU results applied to uterine fibroids, as experienced at one institution.

  8. Negative electrospray ionization on porous supporting tips for mass spectrometric analysis: electrostatic charging effect on detection sensitivity and its application to explosive detection.

    PubMed

    Wong, Melody Yee-Man; Man, Sin-Heng; Che, Chi-Ming; Lau, Kai-Chung; Ng, Kwan-Ming

    2014-03-21

    The simplicity and easy manipulation of the porous substrate-based ESI-MS technique have been widely applied to the direct analysis of different types of samples in positive ion mode. However, studies and applications of this technique in negative ion mode are sparse. A key challenge could be the ease of electrical discharge on supporting tips upon the application of negative voltage. The aim of this study is to investigate the effect of supporting materials, including polyester, polyethylene and wood, on the detection sensitivity of a porous substrate-based negative ESI-MS technique. Using nitrobenzene derivatives and nitrophenol derivatives as the target analytes, it was found that the hydrophobic materials (i.e., polyethylene and polyester), with a higher tendency to accumulate negative charge, could enhance the detection sensitivity towards nitrobenzene derivatives via electron-capture ionization, whereas compounds with electron affinities lower than the cut-off value (1.13 eV) were not detected. Nitrophenol derivatives with pKa smaller than 9.0 could be detected in the form of deprotonated ions, whereas polar materials (i.e., wood), which might undergo competitive deprotonation with the analytes, could suppress the detection sensitivity. Building on this investigation of material effects on detection sensitivity, the porous substrate-based negative ESI-MS method was developed and applied to the direct detection of two commonly encountered explosives in complex samples.

  9. An international survey on noninvasive ventilation use for acute respiratory failure in general non-monitored wards.

    PubMed

    Cabrini, Luca; Esquinas, Antonio; Pasin, Laura; Nardelli, Pasquale; Frati, Elena; Pintaudi, Margherita; Matos, Paulo; Landoni, Giovanni; Zangrillo, Alberto

    2015-04-01

    Use of noninvasive ventilation (NIV) for the treatment of patients with acute respiratory failure (ARF) has greatly increased in the last decades. The increasing knowledge of its effectiveness and physician confidence in managing this technique have been accompanied, however, by a declining number of available ICU beds. As a consequence, the application of NIV outside the ICU has been reported as a growing phenomenon. Previously published surveys highlighted great heterogeneity in NIV use, clinical indications, settings, and efficacy. Moreover, they revealed marked heterogeneity with regard to staff training and technical and organizational aspects. We performed the first worldwide web-based survey focused on NIV use in general wards for ARF. A questionnaire was developed to obtain data regarding hospital and ICU characteristics, settings and modalities of NIV application and monitoring, estimated outcomes, technical and organizational aspects, and observed complications. The multiple-choice anonymous questionnaire, filled out online, was distributed worldwide by mail, LinkedIn, and Facebook professional groups. One hundred fifty-seven questionnaires were filled out and analyzed. Respondents were from 51 countries across all 5 continents. NIV application in general wards was reported by 66% of respondents. Treatments were reported as increasing in 57% of cases. Limited training and human resources were the most common reasons for not using NIV in general wards. Overall, most respondents perceived that NIV avoids tracheal intubation in most cases; worsening of ARF, intolerance, and inability to manage secretions were the most commonly reported causes of NIV failure. Use of NIV in general wards was reported as effective, common, and gradually increasing. Improvement in staff training and introduction of protocols could help to make this technique safer and more common in the general ward setting. Copyright © 2015 by Daedalus Enterprises.

  10. Prospective performance evaluation of selected common virtual screening tools. Case study: Cyclooxygenase (COX) 1 and 2.

    PubMed

    Kaserer, Teresa; Temml, Veronika; Kutil, Zsofia; Vanek, Tomas; Landa, Premysl; Schuster, Daniela

    2015-01-01

    Computational methods can be applied in drug development for the identification of novel lead candidates, but also for the prediction of pharmacokinetic properties and potential adverse effects, thereby helping to prioritize and identify the most promising compounds. In principle, several techniques are available for this purpose; however, which one is the most suitable for a specific research objective still requires further investigation. Within this study, the performance of several programs, representing common virtual screening methods, was compared in a prospective manner. First, we selected top-ranked virtual screening hits from the three methods pharmacophore modeling, shape-based modeling, and docking. For comparison, these hits were then additionally predicted by external pharmacophore- and 2D similarity-based bioactivity profiling tools. Subsequently, the biological activities of the selected hits were assessed in vitro, which allowed for evaluating and comparing the prospective performance of the applied tools. Although all methods performed well, considerable differences were observed concerning hit rates, true positive and true negative hits, and hitlist composition. Our results suggest that a rational selection of the applied method, tightly linked to the aims of a research project, represents a powerful strategy to maximize its success. We employed cyclooxygenase as an application example; however, the focus of this study lay on highlighting the differences in virtual screening tool performance, not on the identification of novel COX inhibitors. Copyright © 2015 The Authors. Published by Elsevier Masson SAS. All rights reserved.

  11. Common neural structures activated by epidural and transcutaneous lumbar spinal cord stimulation: Elicitation of posterior root-muscle reflexes

    PubMed Central

    Freundl, Brigitta; Binder, Heinrich; Minassian, Karen

    2018-01-01

    Epidural electrical stimulation of the lumbar spinal cord is currently regaining momentum as a neuromodulation intervention in spinal cord injury (SCI) to modify dysregulated sensorimotor functions and augment residual motor capacity. There is ample evidence that it engages spinal circuits through the electrical stimulation of large-to-medium diameter afferent fibers within lumbar and upper sacral posterior roots. Recent pilot studies suggested that the surface electrode-based method of transcutaneous spinal cord stimulation (SCS) may produce similar neuromodulatory effects to those caused by epidural SCS. Neurophysiological and computer modeling studies proposed that this noninvasive technique stimulates posterior-root fibers as well, likely activating similar input structures to the spinal cord as epidural stimulation. Here, we add a missing piece of evidence substantiating this assumption. We conducted in-depth analyses and direct comparisons of the electromyographic (EMG) characteristics of short-latency responses in multiple leg muscles to both stimulation techniques, derived from ten individuals with SCI per technique. Post-activation depression of responses evoked by paired pulses applied either epidurally or transcutaneously confirmed the reflex nature of the responses. The muscle responses to both techniques had the same latencies, EMG peak-to-peak amplitudes, and waveforms, except for smaller responses with shorter onset latencies in the triceps surae muscle group and shorter offsets of the responses in the biceps femoris muscle during epidural stimulation. Responses obtained in three subjects tested with both methods at different time points had near-identical waveforms per muscle group as well as the same onset latencies. The present results strongly corroborate the activation of common neural input structures to the lumbar spinal cord (predominantly primary afferent fibers within multiple posterior roots) by both techniques, and add to unraveling the basic mechanisms underlying electrical SCS. PMID:29381748

  12. Laparoscopic totally extraperitoneal repair of inguinal hernia using two-hand approach--a gold standard alternative to open repair.

    PubMed

    Rajapandian, S; Senthilnathan, P; Gupta, Atul; Gupta, Pinak Das; Praveenraj, P; Vaitheeswaran, V; Palanivelu, C

    2010-10-01

    As laparoscopy gained popularity, the minimally invasive approach was also applied to hernia surgery. Unfortunately, the initial efforts were disappointing due to a high early recurrence rate. Experience led to refinement of the technique, with acceptable recurrence rates. This, combined with the advantages of minimally invasive surgery, resulted in a gradual rise in the worldwide acceptance of this technique. Our preferred approach for inguinal hernia repair is laparoscopic totally extraperitoneal (TEP) repair; only in complicated hernias (sliding or incarcerated inguinal hernias) do we use the transabdominal preperitoneal (TAPP) technique. Records of all patients who underwent TEP repair for inguinal hernia at our centre in the last 15 years were retrospectively analysed. We have repaired 8659 hernias in 7023 patients using the TEP approach and have developed minor modifications to the TEP repair over the years. Of the 8659 hernias, 5262 were right-sided and 3397 left-sided; 5387 hernias were unilateral and the remainder were bilateral, and 324 cases of recurrent hernias following open repair underwent TEP. Most of the patients were males, with a mean age of 46 years. Indirect hernias were the most common, followed by direct hernias, and right-sided hernias were more common than left-sided hernias. In 39 cases, conversion to TAPP was needed. There were intra-operative problems in 250 patients (3.56%). Postoperative complications were seen in 192 patients (2.73%), the majority of which were minor. There was no mortality. The recurrence rate was 0.39%. The TEP technique is comfortable and highly effective. Our port placement maintains the triangular orientation that is considered vital to the ergonomics of laparoscopy. Nearly 98-99% of inguinal hernias can be treated by the TEP approach with excellent results.

  13. An automated technique for potential differentiation of ovarian mature teratomas from other benign tumours using neural networks classification of 2D ultrasound static images: a pilot study

    NASA Astrophysics Data System (ADS)

    Al-karawi, Dhurgham; Sayasneh, A.; Al-Assam, Hisham; Jassim, Sabah; Page, N.; Timmerman, D.; Bourne, T.; Du, Hongbo

    2017-05-01

    Ovarian cysts are a common pathology in women of all age groups. It is estimated that 5-10% of women have a surgical intervention to remove an ovarian cyst in their lifetime. Given this frequency, characterization of ovarian masses is essential for optimal management of patients. Patients with benign ovarian masses can be managed conservatively if they are asymptomatic. Mature teratomas are common benign ovarian cysts that occur, in most cases, in premenopausal women. These ovarian cysts can contain different types of human tissue, including bone, cartilage, fat, hair, and other tissue. If they are causing no symptoms, they can be harmless and may not require surgery. Subjective assessment by ultrasound examiners has a high diagnostic accuracy in distinguishing mature teratomas from other types of tumours. The aim of this study is to develop a computerised technique with the potential to characterise mature teratomas and distinguish them from other types of benign ovarian tumours. The Local Binary Pattern (LBP) operator was applied to extract texture features that are specific in distinguishing teratomas. A neural network (NN) was then used as a classifier for recognising mature teratomas. A pilot sample set of 130 B-mode static ovarian ultrasound images (41 mature teratomas and 89 other types of benign tumours) was used to test the effectiveness of the proposed technique. Test results show an average accuracy rate of 99.4%, with a sensitivity of 100%, specificity of 98.8% and positive predictive value of 98.9%. This study demonstrates that the NN and LBP techniques can accurately classify static 2D B-mode ultrasound images of benign ovarian masses into mature teratomas and other types of benign tumours.
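
    A minimal sketch of an LBP-plus-neural-network pipeline of this kind, assuming scikit-image and scikit-learn, with random placeholder arrays standing in for the 130 clinical images; the LBP settings and network architecture are illustrative, not the study's.

        import numpy as np
        from skimage.feature import local_binary_pattern
        from sklearn.neural_network import MLPClassifier

        def lbp_histogram(image, p=8, r=1.0):
            # Uniform LBP codes take values 0..p+1, giving a fixed-length
            # texture descriptor for each image.
            codes = local_binary_pattern(image, P=p, R=r, method="uniform")
            hist, _ = np.histogram(codes, bins=p + 2, range=(0, p + 2),
                                   density=True)
            return hist

        rng = np.random.default_rng(0)
        images = [(rng.random((64, 64)) * 255).astype(np.uint8)
                  for _ in range(40)]                  # placeholder scans
        labels = rng.integers(0, 2, size=40)           # 1 = mature teratoma

        features = np.array([lbp_histogram(im) for im in images])
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                            random_state=0)
        clf.fit(features, labels)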

  14. Estimating the settling velocity of bioclastic sediment using common grain-size analysis techniques

    USGS Publications Warehouse

    Cuttler, Michael V. W.; Lowe, Ryan J.; Falter, James L.; Buscombe, Daniel D.

    2017-01-01

    Most techniques for estimating settling velocities of natural particles have been developed for siliciclastic sediments. Therefore, to understand how these techniques apply to bioclastic environments, measured settling velocities of bioclastic sedimentary deposits sampled from a nearshore fringing reef in Western Australia were compared with settling velocities calculated using results from several common grain-size analysis techniques (sieve, laser diffraction and image analysis) and established models. The effects of sediment density and shape were also examined using a range of density values and three different models of settling velocity. Sediment density was found to have a significant effect on calculated settling velocity, causing a range in normalized root-mean-square error of up to 28%, depending upon settling velocity model and grain-size method. Accounting for particle shape reduced errors in predicted settling velocity by 3% to 6% and removed any velocity-dependent bias, which is particularly important for the fastest settling fractions. When shape was accounted for and measured density was used, normalized root-mean-square errors were 4%, 10% and 18% for laser diffraction, sieve and image analysis, respectively. The results of this study show that established models of settling velocity that account for particle shape can be used to estimate settling velocity of irregularly shaped, sand-sized bioclastic sediments from sieve, laser diffraction, or image analysis-derived measures of grain size with a limited amount of error. Collectively, these findings will allow for grain-size data measured with different methods to be accurately converted to settling velocity for comparison. This will facilitate greater understanding of the hydraulic properties of bioclastic sediment which can help to increase our general knowledge of sediment dynamics in these environments.
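
    One widely used explicit settling-velocity model that accounts for particle shape through two adjustable constants is that of Ferguson and Church (2004), sketched below; the density and viscosity defaults are illustrative seawater and carbonate values, not the study's measurements.

        import numpy as np

        def settling_velocity(d, rho_s=2750.0, rho_f=1025.0, nu=1.05e-6,
                              g=9.81, c1=18.0, c2=1.0):
            # Ferguson & Church (2004):
            #   w = R*g*d**2 / (c1*nu + sqrt(0.75*c2*R*g*d**3))
            # d in metres; R is the submerged specific gravity; shape and
            # roundness enter via c1 and c2 (c1=18, c2=1 suits typical
            # natural grains).
            R = (rho_s - rho_f) / rho_f
            return R * g * d**2 / (c1 * nu + np.sqrt(0.75 * c2 * R * g * d**3))

        print(settling_velocity(np.array([125e-6, 250e-6, 500e-6])))  # m/s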

  15. Which kind of aromatic structures are produced during biomass charring? New insights provided by modern solid-state NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Knicker, Heike; Paneque-Carmona, Marina; Velasco-Molina, Marta; de la Rosa, José Maria; León-Ovelar, Laura Regina; Fernandez-Boy, Elena

    2017-04-01

    Intense research on biochar and charcoal in recent years has revealed that, depending on the production conditions, the chemical and physical characteristics of their aromatic network can vary greatly. Since such variations determine the behavior and stability of charred material in soils, a better understanding of the structural changes occurring during heating, and of the impact of those changes on function, is needed. One method for characterizing pyrogenic organic matter (PyOM) is solid-state 13C NMR spectroscopy applying the cross-polarization (CP) magic angle spinning (MAS) technique. A drawback of this technique is that quantification of NMR spectra of samples with highly condensed and proton-depleted structures is assumed to be biased. Typical samples with such attributes are charcoals produced at temperatures above 700°C under pyrolytic conditions. Commonly, their high degree of condensation leads to graphenic structures that not only reduce the CP efficiency but also create a conductive lattice which acts as a shield and prevents the excitation pulse from entering the sample during the NMR experiments. Since the latter can damage the NMR probe, and since in most cases the obtained NMR spectra show only one broad signal assignable to aromatic C, this technique is rarely applied to characterizing high-temperature chars or soot. As a consequence, a more detailed knowledge of the nature of the aromatic ring systems is still missing. The same is true for the aromatic domains of PyOM produced at lower temperatures, since older NMR instruments operating at low magnetic fields deliver solid-state 13C NMR spectra with low resolution, which turns a more detailed analysis of the aromatic chemical shift region into a challenging task. In order to overcome these disadvantages, modern NMR spectroscopy offers not only instruments with greatly improved resolution but also special pulse sequences which allow a more detailed chemical shift assignment. Applying the latter to various charcoals and biochars, we intended to test their usefulness for a better characterization of PyOM and to elucidate how specific aromatic features can affect its behavior in soils. We could demonstrate that furans represent the major compound class of low-temperature chars produced from woody material. As indicated by 2D techniques, residual alkyl C in such chars has minor covalent binding to the aromatic network. Reducing the electrical conductivity of high-temperature chars by addition of aluminum oxide permitted the application of the CP technique. Determination of the relaxation and CP dynamics confirmed the high rigidity of their aromatic domains, which were dominated by coronene-type moieties. In contrast to the common view, we could demonstrate that quantifiable CP NMR spectra can be obtained from high-temperature chars with contact times of 3 to 5 ms and pulse delays > 3 s.

  16. Efficient methods and readily customizable libraries for managing complexity of large networks.

    PubMed

    Dogrusoz, Ugur; Karacelik, Alper; Safarli, Ilkin; Balci, Hasan; Dervishi, Leonard; Siper, Metin Can

    2018-01-01

    One common problem in visualizing real-life networks, including biological pathways, is the large size of these networks. Oftentimes, users find themselves facing slow, non-scaling operations due to network size, if not a "hairball" network, hindering effective analysis. One extremely useful method for reducing the complexity of large networks is the use of hierarchical clustering and nesting, with expand-collapse operations applied on demand during analysis. Another such method is hiding currently unnecessary details, to be revealed gradually later on demand. Major challenges when applying complexity reduction operations to large networks include efficiency and maintaining the user's mental map of the drawing. We developed specialized incremental layout methods for preserving a user's mental map while managing the complexity of large networks through expand-collapse and hide-show operations. We also developed open-source JavaScript libraries as plug-ins to the web-based graph visualization library Cytoscape.js to implement these methods as complexity management operations. Through efficient specialized algorithms provided by these extensions, one can collapse or hide desired parts of a network, yielding potentially much smaller networks and making them more suitable for interactive visual analysis. This work fills an important gap by making efficient implementations of some already known complexity management techniques freely available to tool developers through a couple of open-source, customizable software libraries, and by introducing some heuristics which can be applied with such complexity management techniques to ensure that the user's mental map is preserved.
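
    The core of an expand-collapse operation can be sketched in a few lines; Python and networkx are used here purely for illustration, whereas the actual libraries are JavaScript plug-ins operating on Cytoscape.js compound graphs with incremental layout. A cluster is collapsed into a meta-node while boundary edges are rerouted so inter-cluster connectivity is preserved.

        import networkx as nx

        def collapse_cluster(g, cluster, meta_node):
            # Replace the cluster's nodes with one meta-node; edges crossing
            # the cluster boundary are rerouted to keep connectivity.
            g.add_node(meta_node, collapsed=sorted(cluster))
            for u in cluster:
                for v in list(g.neighbors(u)):
                    if v not in cluster:
                        g.add_edge(meta_node, v)
            g.remove_nodes_from(cluster)
            return g

        g = nx.path_graph(6)              # 0-1-2-3-4-5
        collapse_cluster(g, {2, 3}, "m")  # becomes 0-1-m-4-5
        print(list(g.edges()))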

  17. Psychological Strategies Included by Strength and Conditioning Coaches in Applied Strength and Conditioning.

    PubMed

    Radcliffe, Jon N; Comfort, Paul; Fawcett, Tom

    2015-09-01

    This study provides a basis by which professional development needs can be addressed and adds to the applied sport psychology literature from an under-researched sport domain. The study used qualitative methods to explore the specific techniques applied by strength and conditioning professionals. Eighteen participants, drawn from a previously obtained sample, were recruited for interview through convenience sampling. Included in the study were 10 participants working within the United Kingdom, 3 within the United States, and 5 within Australia, offering a cross-section of experience from a range of sport disciplines and educational backgrounds. Participants were interviewed using semistructured interviews. Thematic clustering within interpretative phenomenological analysis was used to identify common themes. The practitioners referred to a wealth of psychological skills and strategies that are used within strength and conditioning. Through thematic clustering, it was evident that a significant emphasis is placed on the development or maintenance of athlete self-confidence, specifically with a large focus on goal setting. Similarly, albeit to a lesser extent, there was notable attention to skill acquisition and arousal management strategies. The strategies used by the practitioners consisted of a combination of cognitive and behavioral strategies. It is important to highlight the main psychological strategies suggested by strength and conditioning coaches themselves to guide professional development toward specific areas. Such development should strive to develop coaches' awareness of strategies to develop confidence, regulate arousal, and facilitate skill and technique development.

  18. The effect of high-power ultrasound and gas phase plasma treatment on Aspergillus spp. and Penicillium spp. count in pure culture.

    PubMed

    Herceg, Z; Režek Jambrak, A; Vukušić, T; Stulić, V; Stanzer, D; Milošević, S

    2015-01-01

    The aim of this study was to investigate and compare two nonthermal techniques for the inactivation of moulds. High-power ultrasound (20 kHz) and nonthermal gas-phase plasma treatments were studied for the inactivation of selected moulds. Aspergillus spp. and Penicillium spp. were chosen as among the most common moulds present in or on food. An experimental design was introduced to establish and optimize the working variables. For high-power ultrasound, the greatest reduction of moulds (indicated by the total removal of viable cells) was obtained after ultrasound treatments at 60°C (thermosonication) for 6 and 9 min (applied power, 20-39 W). For plasma treatment, the greatest inactivation of moulds was observed for the longest treatment time (5 min) and lowest sample volume (2 ml) (AP12, AP13, PP12 and PP13). The large amount of applied energy required to achieve a partial log reduction in viable cells is the limiting factor for using high-power ultrasound. However, both treatment methods could be combined in the future to produce beneficial outcomes. This study deals with nonthermal food processing techniques, and the results and findings presented in this study form the basis for further prospective studies. The food industry is looking for nonthermal methods that will enable food preservation, reduce deterioration of food compounds and structure, and prolong food shelf life. © 2014 The Society for Applied Microbiology.

  19. Investigation of interpolation techniques for the reconstruction of the first dimension of comprehensive two-dimensional liquid chromatography-diode array detector data.

    PubMed

    Allen, Robert C; Rutan, Sarah C

    2011-10-31

    Simulated and experimental data were used to measure the effectiveness of common interpolation techniques during chromatographic alignment of comprehensive two-dimensional liquid chromatography-diode array detector (LC×LC-DAD) data. Interpolation was used to generate a sufficient number of data points in the sampled first chromatographic dimension to allow for alignment of retention times from different injections. Five different interpolation methods, linear interpolation followed by cross correlation, piecewise cubic Hermite interpolating polynomial, cubic spline, Fourier zero-filling, and Gaussian fitting, were investigated. The fully aligned chromatograms, in both the first and second chromatographic dimensions, were analyzed by parallel factor analysis to determine the relative area for each peak in each injection. A calibration curve was generated for the simulated data set. The standard error of prediction and percent relative standard deviation were calculated for the simulated peak for each technique. The Gaussian fitting interpolation technique resulted in the lowest standard error of prediction and average relative standard deviation for the simulated data. However, upon applying the interpolation techniques to the experimental data, most of the interpolation methods were not found to produce statistically different relative peak areas from each other. While most of the techniques were not statistically different, the performance was improved relative to the PARAFAC results obtained when analyzing the unaligned data. Copyright © 2011 Elsevier B.V. All rights reserved.
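
    Four of the five interpolation approaches compared above can be sketched directly with NumPy and SciPy on a hypothetical undersampled first-dimension peak; linear interpolation is shown without the subsequent cross-correlation alignment step, and all retention times are invented.

        import numpy as np
        from scipy.interpolate import CubicSpline, PchipInterpolator
        from scipy.optimize import curve_fit

        t = np.arange(0.0, 10.0, 1.0)                 # sampled retention grid
        peak = np.exp(-0.5 * ((t - 4.3) / 1.2) ** 2)  # undersampled peak
        t_fine = np.linspace(0.0, 9.0, 181)

        linear = np.interp(t_fine, t, peak)           # linear interpolation
        pchip = PchipInterpolator(t, peak)(t_fine)    # piecewise cubic Hermite
        spline = CubicSpline(t, peak)(t_fine)         # cubic spline

        # Gaussian fitting: model the sampled peak, then evaluate densely.
        gauss = lambda x, a, mu, sig: a * np.exp(-0.5 * ((x - mu) / sig) ** 2)
        (a, mu, sig), _ = curve_fit(gauss, t, peak, p0=(1.0, 4.0, 1.0))
        fitted = gauss(t_fine, a, mu, sig)

        # Fourier zero-filling: pad the spectrum to up-sample on a uniform grid.
        n_up = len(t) * 20
        zerofilled = np.fft.irfft(np.fft.rfft(peak), n=n_up) * (n_up / len(t))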

  20. Prospective Study of the Surgical Techniques Used in Primary Rhinoplasty on the Caucasian Nose and Comparison of the Preoperative and Postoperative Anthropometric Nose Measurements

    PubMed Central

    Berger, Cezar Augusto Sarraf; Freitas, Renato da Silva; Malafaia, Osvaldo; Pinto, José Simão de Paula; Macedo Filho, Evaldo Dacheux; Mocellin, Marcos; Fagundes, Marina Serrato Coelho

    2014-01-01

    Introduction: The knowledge and study of surgical techniques and anthropometric measurements of the nose make possible a qualitative and quantitative analysis of surgical results. Objective: To study the main techniques used in rhinoplasty on Caucasian noses and compare preoperative and postoperative anthropometric measurements of the nose. Methods: A prospective study with 170 patients was performed at a private hospital. Data were collected using the Electronic System Integrated of Protocols software (Sistema Integrado de Protocolos Eletrônicos, SINPE©). The surgical techniques used on the nasal dorsum and tip were evaluated. Preoperative and 12-month follow-up photos, as well as the measurements compared with the ideal aesthetic standard of a Caucasian nose, were analyzed objectively. Student's t test and the standard deviation test were applied. Results: There was a predominance of endonasal access (94.4%). The most common dorsum technique was hump removal (33.33%), and a predominance of sutures (24.76%) was observed on the nasal tip, with the lateral intercrural suture the most frequent (32.39%). Comparison between preoperative and postoperative photos found statistically significant alterations in the anthropometric measurements of the noses. Conclusion: The main surgical techniques on Caucasian noses were evaluated, and a great variety was found. The evaluation of anthropometric measurements of the nose proved the efficiency of the performed procedures. PMID:25992149
