Sample records for analysis technique called

  1. Time-frequency and advanced frequency estimation techniques for the investigation of bat echolocation calls.

    PubMed

    Kopsinis, Yannis; Aboutanios, Elias; Waters, Dean A; McLaughlin, Steve

    2010-02-01

    In this paper, techniques for time-frequency analysis and investigation of bat echolocation calls are studied. Particularly, enhanced resolution techniques are developed and/or used in this specific context for the first time. When compared to traditional time-frequency representation methods, the proposed techniques are more capable of showing previously unseen features in the structure of bat echolocation calls. It should be emphasized that although the study is focused on bat echolocation recordings, the results are more general and applicable to many other types of signal.
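
    For illustration only, a minimal baseline sketch (not the paper's enhanced-resolution estimators): a conventional spectrogram of a synthetic FM sweep standing in for a bat call, with a crude peak-frequency track read off per frame. The sampling rate and sweep parameters are assumed values chosen for the example.

    ```python
    # Conventional time-frequency baseline for a synthetic bat-like call.
    import numpy as np
    from scipy.signal import chirp, spectrogram

    fs = 250_000                                   # assumed sampling rate (Hz)
    t = np.arange(0, 0.005, 1 / fs)                # a 5 ms call
    call = chirp(t, f0=80_000, t1=t[-1], f1=25_000, method="hyperbolic")

    f, frames, Sxx = spectrogram(call, fs=fs, nperseg=256, noverlap=224)
    freq_track = f[Sxx.argmax(axis=0)]             # peak frequency per analysis frame
    print(np.round(freq_track[:5], 1))
    ```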

  2. Static analysis of class invariants in Java programs

    NASA Astrophysics Data System (ADS)

    Bonilla-Quintero, Lidia Dionisia

    2011-12-01

    This paper presents a technique for the automatic inference of class invariants from Java bytecode. Class invariants are very important both for compiler optimization and as an aid to programmers in their efforts to reduce the number of software defects. We present the original DC-invariant analysis from Adam Webber, discuss its shortcomings, and suggest several different ways to improve it. To apply the DC-invariant analysis to identify DC-invariant assertions, all that one needs is a monotonic method analysis function and a suitable assertion domain. The DC-invariant algorithm is very general; however, the method analysis can be highly tuned to the problem at hand. For example, one could choose shape analysis as the method analysis function and use the DC-invariant analysis to simply extend it to an analysis that would yield class-wide invariants describing the shapes of linked data structures. We have a prototype implementation: a system we refer to as "the analyzer" that infers DC-invariant unary and binary relations and provides them to the user in a human-readable format. The analyzer uses those relations to identify unnecessary array bounds checks in Java programs and perform null-reference analysis. It uses Adam Webber's relational constraint technique for the class-invariant binary relations. Early results with the analyzer were very imprecise in the presence of "dirty-called" methods. A dirty-called method is one that is called, either directly or transitively, from any constructor of the class, or from any method of the class at a point at which a disciplined field has been altered. This result was unexpected and forced an extensive search for improved techniques. An important contribution of this paper is the suggestion of several ways to improve the results by changing the way dirty-called methods are handled. The new techniques expand the set of class invariants that can be inferred over Webber's original results. The technique that produces better results uses in-line analysis. Final results are promising: we can infer sound class invariants for full-scale applications, not just toy programs.

  3. PARENT Quick Blind Round-Robin Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.

    The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques to find flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments: one is called Quick-Blind and the other is called Blind. The Quick-Blind testing and destructive analysis of the test blocks has been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.

  4. Recent advances of liquid chromatography-(tandem) mass spectrometry in clinical and forensic toxicology - An update.

    PubMed

    Remane, Daniela; Wissenbach, Dirk K; Peters, Frank T

    2016-09-01

    Liquid chromatography (LC) coupled to mass spectrometry (MS) or tandem mass spectrometry (MS/MS) is a well-established and widely used technique in clinical and forensic toxicology as well as in doping control, especially for quantitative analysis. In recent years, many applications for so-called multi-target screening and/or quantification of drugs, poisons, and/or their metabolites in biological matrices have been developed. Such methods have proven particularly useful for analysis of so-called new psychoactive substances that have appeared on recreational drug markets throughout the world. Moreover, the evolvement of high-resolution MS techniques and the development of data-independent detection modes have opened new possibilities for applications of LC-(MS/MS) in systematic toxicological screening analysis in the so-called "general unknown" setting. The present paper will provide an overview and discuss these recent developments, focusing on the literature published after 2010. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  5. Using Single Drop Microextraction for Headspace Analysis with Gas Chromatography

    ERIC Educational Resources Information Center

    Riccio, Daniel; Wood, Derrick C.; Miller, James M.

    2008-01-01

    Headspace (HS) gas chromatography (GC) is commonly used to analyze samples that contain non-volatiles. In 1996, a new sampling technique called single drop microextraction, SDME, was introduced, and in 2001 it was applied to HS analysis. It is a simple technique that uses equipment normally found in the undergraduate laboratory, making it ideal…

  6. Inverse Function: Pre-Service Teachers' Techniques and Meanings

    ERIC Educational Resources Information Center

    Paoletti, Teo; Stevens, Irma E.; Hobson, Natalie L. F.; Moore, Kevin C.; LaForest, Kevin R.

    2018-01-01

    Researchers have argued teachers and students are not developing connected meanings for function inverse, thus calling for a closer examination of teachers' and students' inverse function meanings. Responding to this call, we characterize 25 pre-service teachers' inverse function meanings as inferred from our analysis of clinical interviews. After…

  7. Computer Code for Transportation Network Design and Analysis

    DOT National Transportation Integrated Search

    1977-01-01

    This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...

  8. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  9. Dynamic malware analysis using IntroVirt: a modified hypervisor-based system

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Pape, Stephen R.; Meily, Adam T.; Gloo, Richard M.

    2013-05-01

    In this paper, we present a system for Dynamic Malware Analysis which incorporates the use of IntroVirt™. IntroVirt is an introspective hypervisor architecture and infrastructure that supports advanced analysis techniques for stealth-malware analysis. This system allows for complete guest monitoring and interaction, including the manipulation and blocking of system calls. IntroVirt is capable of bypassing the virtual machine detection capabilities of even the most sophisticated malware by spoofing returns to system call responses. Additional fuzzing capabilities can be employed to detect both malware vulnerabilities and polymorphism.

  10. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  11. Synchronous Stroboscopic Electronic Speckle Pattern Interferometry

    NASA Astrophysics Data System (ADS)

    Soares, Oliverio D. D.

    1986-10-01

    Electronic Speckle Pattern Interferometry (ESPI), often called Electronic Holography, is a practical, powerful technique in non-destructive testing. Practical capabilities of the technique have been improved by fringe betterment and the control of analysis in the time domain, in particular the scanning of the vibration cycle, with the introduction of synchronized amplitude- and phase-modulated pulse illumination, microcomputer control, fibre-optic design, and moiré evaluation techniques.

  12. Group decision-making techniques for natural resource management applications

    USGS Publications Warehouse

    Coughlan, Beth A.K.; Armour, Carl L.

    1992-01-01

    This report is an introduction to decision analysis and problem-solving techniques for professionals in natural resource management. Although these managers are often called upon to make complex decisions, their training in the natural sciences seldom provides exposure to the decision-making tools developed in management science. Our purpose is to begin to fill this gap. We present a general analysis of the pitfalls of group problem solving and suggestions for improved interactions, followed by the specific techniques. Selected techniques are illustrated. The material is easy to understand and apply without previous training or excessive study and is applicable to natural resource management issues.

  13. Nonlinear acoustics in cicada mating calls enhance sound propagation.

    PubMed

    Hughes, Derke R; Nuttall, Albert H; Katz, Richard A; Carter, G Clifford

    2009-02-01

    An analysis of cicada mating calls, measured in field experiments, indicates that the very high levels of acoustic energy radiated by this relatively small insect are mainly attributed to the nonlinear characteristics of the signal. The cicada emits one of the loudest sounds in all of the insect population with a sound production system occupying a physical space typically less than 3 cc. The sounds made by tymbals are amplified by the hollow abdomen, functioning as a tuned resonator, but models of the signal based solely on linear techniques do not fully account for a sound radiation capability that is so disproportionate to the insect's size. The nonlinear behavior of the cicada signal is demonstrated by combining the mutual information and surrogate data techniques; the results obtained indicate decorrelation when the phase-randomized and non-phase-randomized data separate. The Volterra expansion technique is used to fit the nonlinearity in the insect's call. The second-order Volterra estimate provides further evidence that the cicada mating calls are dominated by nonlinear characteristics and also suggests that the medium contributes to the cicada's efficient sound propagation. Application of the same principles has the potential to improve radiated sound levels for sonar applications.
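
    As a rough sketch of the surrogate-data idea invoked above (not the paper's mutual-information analysis or Volterra fit): a phase-randomized surrogate preserves a signal's power spectrum while destroying nonlinear structure, so comparing a nonlinear statistic between the original and many surrogates hints at nonlinearity. The signal and the simple higher-order statistic below are stand-ins chosen only for illustration.

    ```python
    # Phase-randomized surrogate test for nonlinearity (toy example).
    import numpy as np

    rng = np.random.default_rng(0)

    def phase_randomized_surrogate(x):
        X = np.fft.rfft(x)
        phases = rng.uniform(0, 2 * np.pi, len(X))
        phases[0] = 0.0                              # keep the DC term real
        return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

    def nonlinear_stat(x, lag=5):
        x = (x - x.mean()) / x.std()
        return np.mean(x[:-lag] ** 2 * x[lag:] ** 2)  # crude higher-order correlation

    x = np.sin(np.linspace(0, 200, 4000)) ** 3        # toy "call" with harmonic distortion
    surrogates = [nonlinear_stat(phase_randomized_surrogate(x)) for _ in range(99)]
    print(nonlinear_stat(x), np.percentile(surrogates, [2.5, 97.5]))
    ```

    If the statistic for the original falls well outside the surrogate range, linear (phase-randomizable) structure alone cannot explain it.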

  14. Analysis Techniques for Microwave Dosimetric Data.

    DTIC Science & Technology

    1985-10-01

    Fragment of the report's Fortran listing: comment lines describing the starting frequency, the step size, and the number of steps in the frequency list, followed by a CALL FILE2() statement.

  15. Calling depths of baleen whales from single sensor data: development of an autocorrelation method using multipath localization.

    PubMed

    Valtierra, Robert D; Glynn Holt, R; Cholewiak, Danielle; Van Parijs, Sofie M

    2013-09-01

    Multipath localization techniques have not previously been applied to baleen whale vocalizations due to difficulties in application to tonal vocalizations. Here it is shown that an autocorrelation method coupled with the direct reflected time difference of arrival localization technique can successfully resolve location information. A derivation was made to model the autocorrelation of a direct signal and its overlapping reflections to illustrate that an autocorrelation may be used to extract reflection information from longer duration signals containing a frequency sweep, such as some calls produced by baleen whales. An analysis was performed to characterize the difference in behavior of the autocorrelation when applied to call types with varying parameters (sweep rate, call duration). The method's feasibility was tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The method was then used to estimate the depth and range of a single North Atlantic right whale (Eubalaena glacialis) and humpback whale (Megaptera novaeangliae) from two separate experiments.
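
    A minimal sketch of the delay-extraction step only: the autocorrelation of a received frequency sweep overlapped with its surface reflection shows a secondary peak at the direct-reflected time difference of arrival. Converting that delay into a calling depth additionally needs the receiver depth and range geometry, which is not modelled here; all signal parameters below are assumed toy values.

    ```python
    # Extract a direct-reflected delay from the autocorrelation of a toy sweep.
    import numpy as np
    from scipy.signal import chirp, correlate

    fs = 2000
    t = np.arange(0, 1.0, 1 / fs)
    sweep = chirp(t, f0=200, t1=t[-1], f1=50)          # toy downswept call
    delay_s = 0.120                                    # assumed surface-reflection delay
    d = int(delay_s * fs)
    received = sweep - 0.6 * np.pad(sweep, (d, 0))[: len(sweep)]  # inverted reflection

    ac = correlate(received, received, mode="full")[len(received) - 1 :]
    lags = np.arange(len(ac)) / fs
    search = lags > 0.02                               # ignore the zero-lag main peak
    print("estimated delay:", lags[search][np.argmax(np.abs(ac[search]))], "s")
    ```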

  16. Mastering Overdetection and Underdetection in Learner-Answer Processing: Simple Techniques for Analysis and Diagnosis

    ERIC Educational Resources Information Center

    Blanchard, Alexia; Kraif, Olivier; Ponton, Claude

    2009-01-01

    This paper presents a "didactic triangulation" strategy to cope with the problem of reliability of NLP applications for computer-assisted language learning (CALL) systems. It is based on the implementation of basic but well mastered NLP techniques and puts the emphasis on an adapted gearing between computable linguistic clues and didactic features…

  17. Setting technical standards for visual assessment procedures

    Treesearch

    Kenneth H. Craik; Nickolaus R. Feimer

    1979-01-01

    Under the impetus of recent legislative and administrative mandates concerning analysis and management of the landscape, governmental agencies are being called upon to adopt or develop visual resource and impact assessment (VRIA) systems. A variety of techniques that combine methods of psychological assessment and landscape analysis to serve these purposes is being...

  18. Conversation Analysis in Computer-Assisted Language Learning

    ERIC Educational Resources Information Center

    González-Lloret, Marta

    2015-01-01

    The use of Conversation Analysis (CA) in the study of technology-mediated interactions is a recent methodological addition to qualitative research in the field of Computer-assisted Language Learning (CALL). The expansion of CA in Second Language Acquisition research, coupled with the need for qualitative techniques to explore how people interact…

  19. The Association for Behavior Analysis International Position Statement on Restraint and Seclusion

    ERIC Educational Resources Information Center

    Vollmer, Timothy R.; Hagopian, Louis P.; Bailey, Jon S.; Dorsey, Michael F.; Hanley, Gregory P.; Lennox, David; Riordan, Mary M.; Spreat, Scott

    2011-01-01

    A task force authorized by the Executive Council of the Association for Behavior Analysis International (ABAI) generated the statement below concerning the techniques called "restraint" and "seclusion." Members of the task force independently reviewed the scientific literature concerning restraint and seclusion and agreed unanimously to the…

  20. Moving beyond Univariate Post-Hoc Testing in Exercise Science: A Primer on Descriptive Discriminate Analysis

    ERIC Educational Resources Information Center

    Barton, Mitch; Yeatts, Paul E.; Henson, Robin K.; Martin, Scott B.

    2016-01-01

    There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent…

  1. A pilot modeling technique for handling-qualities research

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1980-01-01

    A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.

  2. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    DTIC Science & Technology

    2016-06-28

    harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC)…

  3. Generic Modeling of a Life Support System for Process Technology Comparison

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.

  4. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.

  5. Application of Local Linear Embedding to Nonlinear Exploratory Latent Structure Analysis

    ERIC Educational Resources Information Center

    Wang, Haonan; Iyer, Hari

    2007-01-01

    In this paper we discuss the use of a recent dimension reduction technique called Locally Linear Embedding, introduced by Roweis and Saul, for performing an exploratory latent structure analysis. The coordinate variables from the locally linear embedding describing the manifold on which the data reside serve as the latent variable scores. We…

  6. Risk Benefit Analysis of Health Promotion: Opportunities and Threats for Physical Education.

    ERIC Educational Resources Information Center

    Vertinsky, Patricia

    1985-01-01

    The increasing popularity of health promotion and lifestyle management techniques call for a careful look at the misuse and costs of suasion, imposition of values as science, social inequities and individual consequences, and biases in communication of health risk information. The application of more systematic cost-benefit analysis is…

  7. Comparison of VRX CT scanners geometries

    NASA Astrophysics Data System (ADS)

    DiBianca, Frank A.; Melnyk, Roman; Duckworth, Christopher N.; Russ, Stephan; Jordan, Lawrence M.; Laughter, Joseph S.

    2001-06-01

    A technique called Variable-Resolution X-ray (VRX) detection greatly increases the spatial resolution in computed tomography (CT) and digital radiography (DR) as the field size decreases. The technique is based on a principle called "projective compression" that allows both the resolution element and the sampling distance of a CT detector to scale with the subject or field size. For very large (40-50 cm) field sizes, resolution exceeding 2 cy/mm is possible and for very small fields, microscopy is attainable with resolution exceeding 100 cy/mm. This paper compares the benefits obtainable with two different VRX detector geometries: the single-arm geometry and the dual-arm geometry. The analysis is based on Monte Carlo simulations and direct calculations. The results of this study indicate that the dual-arm system appears to have more advantages than the single-arm technique.

  8. [Techniques for rapid production of monoclonal antibodies for use with antibody technology].

    PubMed

    Kamada, Haruhiko

    2012-01-01

    A monoclonal antibody (Mab), due to its specific binding ability to a target protein, can potentially be one of the most useful tools for the functional analysis of proteins in recent proteomics-based research. However, the production of Mab is a very time-consuming and laborious process (i.e., preparation of recombinant antigens, immunization of animals, preparation of hybridomas), making it the rate-limiting step in using Mabs in high-throughput proteomics research, which heavily relies on comprehensive and rapid methods. Therefore, there is a great demand for new methods to efficiently generate Mabs against a group of proteins identified by proteome analysis. Here, we describe a useful method called "Antibody proteomic technique" for the rapid generations of Mabs to pharmaceutical target, which were identified by proteomic analyses of disease samples (ex. tumor tissue, etc.). We also introduce another method to find profitable targets on vasculature, which is called "Vascular proteomic technique". Our results suggest that this method for the rapid generation of Mabs to proteins may be very useful in proteomics-based research as well as in clinical applications.

  9. Non-song social call bouts of migrating humpback whales

    PubMed Central

    Rekdahl, Melinda L.; Dunlop, Rebecca A.; Goldizen, Anne W.; Garland, Ellen C.; Biassoni, Nicoletta; Miller, Patrick; Noad, Michael J.

    2015-01-01

    The use of stereotyped calls within structured bouts has been described for a number of species and may increase the information potential of call repertoires. Humpback whales produce a repertoire of social calls, although little is known about the complexity or function of these calls. In this study, digital acoustic tag recordings were used to investigate social call use within bouts, the use of bouts across different social contexts, and whether particular call type combinations were favored. Call order within bouts was investigated using call transition frequencies and information theory techniques. Call bouts were defined through analysis of inter-call intervals, as any calls within 3.9 s of each other. Bouts were produced significantly more when new whales joined a group compared to groups that did not change membership, and in groups containing multiple adults escorting a female and calf compared to adult only groups. Although social calls tended to be produced in bouts, there were few repeated bout types. However, the order in which most call types were produced within bouts was non-random and dependent on the preceding call type. These bouts appear to be at least partially governed by rules for how individual components are combined. PMID:26093396

  10. Non-song social call bouts of migrating humpback whales.

    PubMed

    Rekdahl, Melinda L; Dunlop, Rebecca A; Goldizen, Anne W; Garland, Ellen C; Biassoni, Nicoletta; Miller, Patrick; Noad, Michael J

    2015-06-01

    The use of stereotyped calls within structured bouts has been described for a number of species and may increase the information potential of call repertoires. Humpback whales produce a repertoire of social calls, although little is known about the complexity or function of these calls. In this study, digital acoustic tag recordings were used to investigate social call use within bouts, the use of bouts across different social contexts, and whether particular call type combinations were favored. Call order within bouts was investigated using call transition frequencies and information theory techniques. Call bouts were defined through analysis of inter-call intervals, as any calls within 3.9 s of each other. Bouts were produced significantly more when new whales joined a group compared to groups that did not change membership, and in groups containing multiple adults escorting a female and calf compared to adult only groups. Although social calls tended to be produced in bouts, there were few repeated bout types. However, the order in which most call types were produced within bouts was non-random and dependent on the preceding call type. These bouts appear to be at least partially governed by rules for how individual components are combined.
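
    A minimal sketch of the bout definition used above: calls whose inter-call interval is 3.9 s or less are grouped into the same bout. The call times below are invented; the tag-data processing and transition analysis are not shown.

    ```python
    # Group call times (seconds) into bouts using the 3.9 s inter-call threshold.
    from typing import List

    def segment_bouts(call_times_s: List[float], max_gap_s: float = 3.9) -> List[List[float]]:
        if not call_times_s:
            return []
        bouts = [[call_times_s[0]]]
        for t_prev, t_next in zip(call_times_s, call_times_s[1:]):
            if t_next - t_prev <= max_gap_s:
                bouts[-1].append(t_next)      # same bout: gap within threshold
            else:
                bouts.append([t_next])        # gap too long: start a new bout
        return bouts

    print(segment_bouts([0.0, 1.2, 2.0, 9.5, 10.1, 30.0]))
    # -> [[0.0, 1.2, 2.0], [9.5, 10.1], [30.0]]
    ```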

  11. Addressee Errors in ATC Communications: The Call Sign Problem

    NASA Technical Reports Server (NTRS)

    Monan, W. P.

    1983-01-01

    Communication errors involving aircraft call signs were portrayed in reports of 462 hazardous incidents voluntarily submitted to the ASRS during an approximate four-year period. These errors resulted in confusion, disorder, and uncoordinated traffic conditions and produced the following types of operational anomalies: altitude deviations, wrong-way headings, aborted takeoffs, go-arounds, runway incursions, missed crossing altitude restrictions, descents toward high terrain, and traffic conflicts in flight and on the ground. Analysis of the report set resulted in identification of five categories of errors involving call signs: (1) faulty radio usage techniques, (2) call sign loss or smearing due to frequency congestion, (3) confusion resulting from similar sounding call signs, (4) airmen misses of call signs leading to failures to acknowledge or readback, and (5) controller failures regarding confirmation of acknowledgements or readbacks. These error categories are described in detail and several associated hazard-mitigating measures that might be taken are considered.

  12. Mathematical Idea Analysis: What Embodied Cognitive Science Can Say about the Human Nature of Mathematics.

    ERIC Educational Resources Information Center

    Nunez, Rafael E.

    This paper gives a brief introduction to a discipline called the cognitive science of mathematics. The theoretical background of the arguments is based on embodied cognition and findings in cognitive linguistics. It discusses Mathematical Idea Analysis, a set of techniques for studying implicit structures in mathematics. Particular attention is…

  13. Incorporating principal component analysis into air quality model evaluation

    EPA Science Inventory

    The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Princi...

  14. Visualising nursing data using correspondence analysis.

    PubMed

    Kokol, Peter; Blažun Vošner, Helena; Železnik, Danica

    2016-09-01

    Digitally stored, large healthcare datasets enable nurses to use 'big data' techniques and tools in nursing research. Big data is complex and multi-dimensional, so visualisation may be a preferable approach to analyse and understand it. The aim here is to demonstrate the visualisation of big data using a technique called correspondence analysis. In the authors' study, relations among data in a nursing dataset were shown visually in graphs using correspondence analysis. The case presented demonstrates that correspondence analysis is easy to use, shows relations between data visually in a form that is simple to interpret, and can reveal hidden associations between data. Correspondence analysis supports the discovery of new knowledge. Implications for practice: knowledge obtained using correspondence analysis can be transferred immediately into practice or used to foster further research.
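
    A rough sketch of correspondence analysis on a small contingency table (the counts below are invented, standing in for a nursing dataset): row and column coordinates obtained from the SVD of the standardized residuals can be plotted together to visualise associations, which is the kind of graph the article describes.

    ```python
    # Correspondence analysis of a toy contingency table via SVD.
    import numpy as np

    counts = np.array([[30, 10,  5],
                       [ 8, 25, 12],
                       [ 4,  9, 27]], dtype=float)   # invented ward-by-category counts

    P = counts / counts.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)
    S = np.diag(r ** -0.5) @ (P - np.outer(r, c)) @ np.diag(c ** -0.5)
    U, sing, Vt = np.linalg.svd(S)

    row_coords = np.diag(r ** -0.5) @ U[:, :2] * sing[:2]      # principal row coordinates
    col_coords = np.diag(c ** -0.5) @ Vt.T[:, :2] * sing[:2]   # principal column coordinates
    print(np.round(row_coords, 3), np.round(col_coords, 3), sep="\n")
    ```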

  15. Digression and Value Concatenation to Enable Privacy-Preserving Regression.

    PubMed

    Li, Xiao-Bai; Sarkar, Sumit

    2014-09-01

    Regression techniques can be used not only for legitimate data analysis, but also to infer private information about individuals. In this paper, we demonstrate that regression trees, a popular data-analysis and data-mining technique, can be used to effectively reveal individuals' sensitive data. This problem, which we call a "regression attack," has not been addressed in the data privacy literature, and existing privacy-preserving techniques are not appropriate in coping with this problem. We propose a new approach to counter regression attacks. To protect against privacy disclosure, our approach introduces a novel measure, called digression, which assesses the sensitive value disclosure risk in the process of building a regression tree model. Specifically, we develop an algorithm that uses the measure for pruning the tree to limit disclosure of sensitive data. We also propose a dynamic value-concatenation method for anonymizing data, which better preserves data utility than a user-defined generalization scheme commonly used in existing approaches. Our approach can be used for anonymizing both numeric and categorical data. An experimental study is conducted using real-world financial, economic and healthcare data. The results of the experiments demonstrate that the proposed approach is very effective in protecting data privacy while preserving data quality for research and analysis.

  16. In-situ Isotopic Analysis at Nanoscale using Parallel Ion Electron Spectrometry: A Powerful New Paradigm for Correlative Microscopy

    NASA Astrophysics Data System (ADS)

    Yedra, Lluís; Eswara, Santhana; Dowsett, David; Wirtz, Tom

    2016-06-01

    Isotopic analysis is of paramount importance across the entire gamut of scientific research. To advance the frontiers of knowledge, a technique for nanoscale isotopic analysis is indispensable. Secondary Ion Mass Spectrometry (SIMS) is a well-established technique for analyzing isotopes, but its spatial resolution is fundamentally limited. Transmission Electron Microscopy (TEM) is a well-known method for high-resolution imaging down to the atomic scale. However, isotopic analysis in TEM is not possible. Here, we introduce a powerful new paradigm for in-situ correlative microscopy called Parallel Ion Electron Spectrometry by synergizing SIMS with TEM. We demonstrate this technique by distinguishing lithium carbonate nanoparticles according to the isotopic label of lithium, viz. ⁶Li and ⁷Li, and imaging them at high resolution by TEM, adding a new dimension to correlative microscopy.

  17. Impact of genotyping errors on statistical power of association tests in genomic analyses: A case study

    PubMed Central

    Hou, Lin; Sun, Ning; Mane, Shrikant; Sayward, Fred; Rajeevan, Nallakkandi; Cheung, Kei-Hoi; Cho, Kelly; Pyarajan, Saiju; Aslan, Mihaela; Miller, Perry; Harvey, Philip D.; Gaziano, J. Michael; Concato, John; Zhao, Hongyu

    2017-01-01

    A key step in genomic studies is to assess high throughput measurements across millions of markers for each participant’s DNA, either using microarrays or sequencing techniques. Accurate genotype calling is essential for downstream statistical analysis of genotype-phenotype associations, and next generation sequencing (NGS) has recently become a more common approach in genomic studies. How the accuracy of variant calling in NGS-based studies affects downstream association analysis has not, however, been studied using empirical data in which both microarrays and NGS were available. In this article, we investigate the impact of variant calling errors on the statistical power to identify associations between single nucleotides and disease, and on associations between multiple rare variants and disease. Both differential and nondifferential genotyping errors are considered. Our results show that the power of burden tests for rare variants is strongly influenced by the specificity in variant calling, but is rather robust with regard to sensitivity. By using the variant calling accuracies estimated from a substudy of a Cooperative Studies Program project conducted by the Department of Veterans Affairs, we show that the power of association tests is mostly retained with commonly adopted variant calling pipelines. An R package, GWAS.PC, is provided to accommodate power analysis that takes account of genotyping errors (http://zhaocenter.org/software/). PMID:28019059

  18. Using Single Drop Microextraction for Headspace Analysis with Gas Chromatography

    NASA Astrophysics Data System (ADS)

    Riccio, Daniel; Wood, Derrick C.; Miller, James M.

    2008-07-01

    Headspace (HS) gas chromatography (GC) is commonly used to analyze samples that contain non-volatiles. In 1996, a new sampling technique called single drop microextraction, SDME, was introduced, and in 2001 it was applied to HS analysis. It is a simple technique that uses equipment normally found in the undergraduate laboratory, making it ideal for instructional use, especially to illustrate HS analysis or as an alternative to solid-phase microextraction (SPME) to which it is very similar. The basic principles and practice of HS-GC using SDME are described, including a complete review of the literature. Some possible experiments are suggested using water and N -methylpyrrolidone (NMP) as solvents.

  19. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist, but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task

  20. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  1. Mathematical Practice in Textbooks Analysis: Praxeological Reference Models, the Case of Proportion

    ERIC Educational Resources Information Center

    Wijayanti, Dyana; Winsløw, Carl

    2017-01-01

    We present a new method in textbook analysis, based on so-called praxeological reference models focused on specific content at task level. This method implies that the mathematical contents of a textbook (or textbook part) is analyzed in terms of the tasks and techniques which are exposed to or demanded from readers; this can then be interpreted…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genest-Beaulieu, C.; Bergeron, P., E-mail: genest@astro.umontreal.ca, E-mail: bergeron@astro.umontreal.ca

    We present a comparative analysis of atmospheric parameters obtained with the so-called photometric and spectroscopic techniques. Photometric and spectroscopic data for 1360 DA white dwarfs from the Sloan Digital Sky Survey (SDSS) are used, as well as spectroscopic data from the Villanova White Dwarf Catalog. We first test the calibration of the ugriz photometric system by using model atmosphere fits to observed data. Our photometric analysis indicates that the ugriz photometry appears well calibrated when the SDSS to AB95 zeropoint corrections are applied. The spectroscopic analysis of the same data set reveals that the so-called high-log g problem can be solved by applying published correction functions that take into account three-dimensional hydrodynamical effects. However, a comparison between the SDSS and the White Dwarf Catalog spectra also suggests that the SDSS spectra still suffer from a small calibration problem. We then compare the atmospheric parameters obtained from both fitting techniques and show that the photometric temperatures are systematically lower than those obtained from spectroscopic data. This systematic offset may be linked to the hydrogen line profiles used in the model atmospheres. We finally present the results of an analysis aimed at measuring surface gravities using photometric data only.

  3. Interoperability Policy Roadmap

    DTIC Science & Technology

    2010-01-01

    Retrieval – SMART. The technique developed by Dr. Gerard Salton for automated information retrieval and text analysis is called the vector-space… References cited include: Salton, G., Wong, A., Yang, C.S., "A Vector Space Model for Automatic Indexing," Communications of the ACM, 18, 613-620; [10] Salton, G., McGill…
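
    A toy illustration of the vector-space idea credited to Salton above: documents and a query become term-weight vectors (here a simple tf-idf scheme) and are ranked by cosine similarity. This is a generic sketch with invented documents, not the SMART system itself.

    ```python
    # Rank toy documents against a query using tf-idf vectors and cosine similarity.
    import math
    from collections import Counter

    docs = ["interoperability policy for networked systems",
            "automated information retrieval and text analysis",
            "a vector space model for automatic indexing"]
    query = "information retrieval model"

    tokenized = [d.split() for d in docs]
    idf = {term: math.log(len(docs) / sum(term in toks for toks in tokenized))
           for toks in tokenized for term in toks}

    def vectorize(tokens):
        return {t: tf * idf.get(t, 0.0) for t, tf in Counter(tokens).items()}

    def cosine(a, b):
        dot = sum(w * b.get(t, 0.0) for t, w in a.items())
        norm = math.sqrt(sum(w * w for w in a.values())) * math.sqrt(sum(w * w for w in b.values()))
        return dot / norm if norm else 0.0

    doc_vecs = [vectorize(toks) for toks in tokenized]
    query_vec = vectorize(query.split())
    ranking = sorted(range(len(docs)), key=lambda i: cosine(query_vec, doc_vecs[i]), reverse=True)
    print(ranking)   # document indices ordered by similarity to the query
    ```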

  4. Application of a sensitivity analysis technique to high-order digital flight control systems

    NASA Technical Reports Server (NTRS)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. The SVA is a fully public domain program, running on the NASA/Dryden Elxsi computer.
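
    A minimal numerical sketch of the quantity discussed above: the singular values of a return-difference matrix I + L(jω) for a small two-loop system, with a finite-difference estimate of their sensitivity to one controller gain. The plant and gains are invented, and the SVA program's analytic singular-value gradient equations are not reproduced here.

    ```python
    # Minimum singular value of I + plant*controller at one frequency, and its
    # finite-difference sensitivity to a controller gain.
    import numpy as np

    def return_difference(gain, w=2.0):
        s = 1j * w
        plant = np.array([[1 / (s + 1), 0.2 / (s + 2)],
                          [0.0,         1 / (s + 3)]])   # invented 2x2 plant
        controller = np.diag([gain, 1.5])                # invented diagonal controller
        return np.eye(2) + plant @ controller

    def min_singular_value(gain):
        return np.linalg.svd(return_difference(gain), compute_uv=False).min()

    g, dg = 2.0, 1e-6
    sensitivity = (min_singular_value(g + dg) - min_singular_value(g - dg)) / (2 * dg)
    print(round(min_singular_value(g), 4), round(sensitivity, 4))
    ```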

  5. Loran-C Signal Stability Study: St. Lawrence Seaway

    DTIC Science & Technology

    1982-07-01

    call this the "porno movie syndrome.") If, several years from now, we think of another analysis technique, we can immediately apply it. There will…

  6. Peptidomics: the integrated approach of MS, hyphenated techniques and bioinformatics for neuropeptide analysis.

    PubMed

    Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane

    2008-02-01

    MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims at identifying all the present peptides in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.

  7. Understanding and Optimizing Asynchronous Low-Precision Stochastic Gradient Descent

    PubMed Central

    De Sa, Christopher; Feldman, Matthew; Ré, Christopher; Olukotun, Kunle

    2018-01-01

    Stochastic gradient descent (SGD) is one of the most popular numerical algorithms used in machine learning and other domains. Since this is likely to continue for the foreseeable future, it is important to study techniques that can make it run fast on parallel hardware. In this paper, we provide the first analysis of a technique called Buckwild! that uses both asynchronous execution and low-precision computation. We introduce the DMGC model, the first conceptualization of the parameter space that exists when implementing low-precision SGD, and show that it provides a way to both classify these algorithms and model their performance. We leverage this insight to propose and analyze techniques to improve the speed of low-precision SGD. First, we propose software optimizations that can increase throughput on existing CPUs by up to 11×. Second, we propose architectural changes, including a new cache technique we call an obstinate cache, that increase throughput beyond the limits of current-generation hardware. We also implement and analyze low-precision SGD on the FPGA, which is a promising alternative to the CPU for future SGD systems. PMID:29391770
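
    A sketch of the low-precision idea analysed above, under stated simplifications: the model is stored in 16-bit fixed point and updates are applied with stochastic rounding, so the expected update matches the full-precision one. Asynchrony, the Buckwild! implementation, and the DMGC parameter space are not modelled; the least-squares problem and constants are invented.

    ```python
    # Serial SGD on a toy least-squares problem with a fixed-point model and
    # stochastic rounding of the updates.
    import numpy as np

    rng = np.random.default_rng(1)
    SCALE = 1 / 128                                    # assumed fixed-point resolution

    def stochastic_round(x):
        lo = np.floor(x)
        return lo + (rng.random(x.shape) < (x - lo))   # rounds up with prob. frac(x)

    A = rng.normal(size=(200, 4))
    w_true = np.array([1.0, -2.0, 0.5, 3.0])
    b = A @ w_true + 0.01 * rng.normal(size=200)

    w_q = np.zeros(4, dtype=np.int16)                  # model kept in 16-bit fixed point
    lr = 0.05
    for i in rng.integers(0, 200, size=5000):          # SGD over randomly drawn samples
        w = w_q * SCALE
        grad = (A[i] @ w - b[i]) * A[i]
        w_q = (w_q - stochastic_round(lr * grad / SCALE)).astype(np.int16)

    print(np.round(w_q * SCALE, 2))                    # should land close to w_true
    ```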

  8. Fringe pattern demodulation with a two-frame digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-frame digital phase-locked loop for fringe pattern demodulation is presented. In this scheme, two fringe patterns with different spatial carrier frequencies are grabbed for an object. A digital phase-locked loop algorithm tracks and demodulates the phase difference between both fringe patterns by employing the wrapped phase components of one of the fringe patterns as a reference to demodulate the second fringe pattern. The desired phase information can be extracted from the demodulated phase difference. We tested the algorithm experimentally using real fringe patterns. The technique is shown to be suitable for noncontact measurement of objects with rapid surface variations, and it outperforms the Fourier fringe analysis technique in this aspect. Phase maps produced with this algorithm are noisy in comparison with phase maps generated with the Fourier fringe analysis technique.

  9. Avoiding escalation from play to aggression in adult male rats: The role of ultrasonic calls.

    PubMed

    Burke, Candace J; Kisko, Theresa M; Pellis, Sergio M; Euston, David R

    2017-11-01

    Play fighting is most commonly associated with juvenile animals, but in some species, including rats, it can continue into adulthood. Post-pubertal engagement in play fighting is often rougher and has an increased chance of escalation to aggression, making the use of play signals to regulate the encounter more critical. During play, both juvenile and adult rats emit many 50-kHz calls, and some of these may function as play-facilitating signals. In the present study, unfamiliar adult male rats were introduced in a neutral enclosure and their social interactions were recorded. While all pairs escalated their playful encounters to become rougher, only the pairs in which one member was devocalized escalated to serious biting. A Monte Carlo shuffling technique was used for the analysis of the correlations between the overt playful and aggressive actions performed and the types and frequencies of various 50-kHz calls that were emitted. The analysis revealed that lower-frequency (20-30 kHz) calls with a flat component may be particularly critical for de-escalating encounters and so allowing play to continue. Moreover, coordinating calls reciprocally, with either the same call mimicked in close temporal association or with complementary calls emitted by participants as they engage in complementary actions (e.g., attacking the nape, being attacked on the nape), appeared to be a way in which calls could potentially be used to avoid escalation to aggression and so sustain playful interactions. Copyright © 2017 Elsevier B.V. All rights reserved.
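
    A sketch of the shuffling (permutation) idea used above: an observed association between a call type and a behaviour is compared against a null distribution built by repeatedly shuffling one of the sequences. The 0/1 indicators and the correlation statistic below are invented stand-ins for the study's coded call and behaviour data.

    ```python
    # Permutation (Monte Carlo shuffling) test of a call-behaviour correlation.
    import numpy as np

    rng = np.random.default_rng(3)
    flat_call = rng.integers(0, 2, 80)                     # 1 = flat 20-30 kHz call emitted
    stayed_playful = flat_call | rng.integers(0, 2, 80)    # toy outcome correlated with calls

    def assoc(x, y):
        return np.corrcoef(x, y)[0, 1]

    observed = assoc(flat_call, stayed_playful)
    null = [assoc(rng.permutation(flat_call), stayed_playful) for _ in range(2000)]
    p_value = np.mean(np.abs(null) >= abs(observed))
    print(round(observed, 3), round(p_value, 4))
    ```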

  10. Memory Forensics: Review of Acquisition and Analysis Techniques

    DTIC Science & Technology

    2013-11-01

    Management Overview: Processes running on modern multitasking operating systems operate on an abstraction of RAM, called virtual memory [7]. In these systems… information such as user names, email addresses and passwords [7]. Analysts also use tools such as WinHex to identify headers or other suspicious data within…

  11. Discovering Authorities and Hubs in Different Topological Web Graph Structures.

    ERIC Educational Resources Information Center

    Meghabghab, George

    2002-01-01

    Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)
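
    A small sketch of the HITS iteration mentioned above: authority scores accumulate hub scores over in-links, hub scores accumulate authority scores over out-links, and both are renormalised until they settle. The tiny link matrix is invented for illustration.

    ```python
    # HITS hub/authority iteration on a 4-page toy web graph.
    import numpy as np

    A = np.array([[0, 1, 1, 0],     # adjacency: A[i, j] = 1 if page i links to page j
                  [0, 0, 1, 0],
                  [1, 0, 0, 0],
                  [0, 0, 1, 0]], dtype=float)

    hubs = np.ones(4)
    auths = np.ones(4)
    for _ in range(50):
        auths = A.T @ hubs          # a page is a good authority if good hubs link to it
        hubs = A @ auths            # a page is a good hub if it links to good authorities
        auths /= np.linalg.norm(auths)
        hubs /= np.linalg.norm(hubs)

    print("authorities:", np.round(auths, 3), "hubs:", np.round(hubs, 3))
    ```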

  12. Qualitative Analysis Techniques for the Review of the Literature

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Leech, Nancy L.; Collins, Kathleen M. T.

    2012-01-01

    In this article, we provide a framework for analyzing and interpreting sources that inform a literature review or, as it is more aptly called, a research synthesis. Specifically, using Leech and Onwuegbuzie's (2007, 2008) frameworks, we delineate how the following four major source types inform research syntheses: talk, observations,…

  13. Designing a more efficient, effective and safe Medical Emergency Team (MET) service using data analysis

    PubMed Central

    Bilgrami, Irma; Bain, Christopher; Webb, Geoffrey I.; Orosz, Judit; Pilcher, David

    2017-01-01

    Introduction: Hospitals have seen a rise in Medical Emergency Team (MET) reviews. We hypothesised that the commonest MET calls result in similar treatments. Our aim was to design a pre-emptive management algorithm that allowed direct institution of treatment to patients without having to wait for attendance of the MET team, and to model its potential impact on MET call incidence and patient outcomes. Methods: Data were extracted for all MET calls from the hospital database. Association rule data mining techniques were used to identify the most common combinations of MET call causes, outcomes and therapies. Results: There were 13,656 MET calls during the 34-month study period in 7936 patients. The most common MET call was for hypotension [31%, (2459/7936)]. These MET calls were strongly associated with the immediate administration of intravenous fluid (70% [1714/2459] v 13% [739/5477], p<0.001), unless the patient was located on a respiratory ward (adjusted OR 0.41 [95%CI 0.25-0.67], p<0.001), had a cardiac cause for admission (adjusted OR 0.61 [95%CI 0.50-0.75], p<0.001) or was under the care of the heart failure team (adjusted OR 0.29 [95%CI 0.19-0.42], p<0.001). Modelling the effect of a pre-emptive management algorithm for immediate fluid administration without MET activation, on data from a test period of 24 months following the study period, suggested it would lead to a 68.7% (2541/3697) reduction in MET calls for hypotension and a 19.6% (2541/12938) reduction in total METs without adverse effects on patients. Conclusion: Routinely collected data and analytic techniques can be used to develop a pre-emptive management algorithm to administer intravenous fluid therapy to a specific group of hypotensive patients without the need to initiate a MET call. This could lead both to earlier treatment for the patient and to fewer total MET calls. PMID:29281665
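
    An illustrative sketch of the association-rule step: for each (MET cause → therapy) pair, compute support and confidence from a table of MET call records. The records below are invented; the study mined the hospital's MET database with established association-rule techniques.

    ```python
    # Support and confidence of cause -> therapy rules from toy MET call records.
    from collections import Counter

    calls = [
        {"cause": "hypotension",  "therapy": "iv_fluid"},
        {"cause": "hypotension",  "therapy": "iv_fluid"},
        {"cause": "hypotension",  "therapy": "review_only"},
        {"cause": "tachycardia",  "therapy": "review_only"},
        {"cause": "desaturation", "therapy": "oxygen"},
    ]

    n = len(calls)
    cause_counts = Counter(c["cause"] for c in calls)
    pair_counts = Counter((c["cause"], c["therapy"]) for c in calls)

    for (cause, therapy), k in pair_counts.items():
        support = k / n                        # how often the pair occurs overall
        confidence = k / cause_counts[cause]   # how often the therapy follows this cause
        print(f"{cause} -> {therapy}: support={support:.2f}, confidence={confidence:.2f}")
    ```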

  14. Real-time surgical simulation for deformable soft-tissue objects with a tumour using Boundary Element techniques

    NASA Astrophysics Data System (ADS)

    Wang, P.; Becker, A. A.; Jones, I. A.; Glover, A. T.; Benford, S. D.; Vloeberghs, M.

    2009-08-01

    A virtual-reality real-time simulation of surgical operations that incorporates the inclusion of a hard tumour is presented. The software is based on the Boundary Element (BE) technique. A review of the BE formulation for real-time analysis of two-domain deformable objects, using the pre-solution technique, is presented. The two-domain BE software is incorporated into a surgical simulation system called VIRS to simulate the initiation of a cut on the surface of the soft tissue and the extension of the cut deeper until the tumour is reached.

  15. Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.

    DTIC Science & Technology

    1996-08-12

    Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique. We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and…

  16. Horses and cows might teach us about human knees

    NASA Astrophysics Data System (ADS)

    Holland, C.; Vollrath, F.; Gill, H. S.

    2014-04-01

    Our comparative study of the knees of horses and cows (paraphrased as highly evolved joggers and as domesticated couch-potatoes, respectively) demonstrates significant differences in the posterior sections of bovine and equine tibial cartilage, which are consistent with specialisation for gait. These insights were possible using a novel analytical measuring technique based on the shearing of small biopsy samples, called dynamic shear analysis. We assert that this technique could provide a powerful new tool to precisely quantify the pathology of osteoarthritis for the medical field.

  17. Diffraction analysis of customized illumination technique

    NASA Astrophysics Data System (ADS)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each individual diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of image in each domain, it was possible to achieve the customized shape of illumination. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.

  18. Analysis of Questionnaires Applied in the Evaluation Process of Academicians in Higher Education Institutes

    ERIC Educational Resources Information Center

    Kalayci, Nurdan; Cimen, Orhan

    2012-01-01

    The aim of this study is to examine the questionnaires used to evaluate teaching performance in higher education institutes and called "Instructor and Course Evaluation Questionnaires (ICEQ)" in terms of questionnaire preparation techniques and components of curriculum. Obtaining at least one ICEQ belonging to any state and private…

  19. Multiscale Embedded Gene Co-expression Network Analysis

    PubMed Central

    Song, Won-Min; Zhang, Bin

    2015-01-01

    Gene co-expression network analysis has been shown effective in identifying functional co-expressed gene modules associated with complex human diseases. However, existing techniques to construct co-expression networks require some critical prior information such as predefined number of clusters, numerical thresholds for defining co-expression/interaction, or do not naturally reproduce the hallmarks of complex systems such as the scale-free degree distribution of small-worldness. Previously, a graph filtering technique called Planar Maximally Filtered Graph (PMFG) has been applied to many real-world data sets such as financial stock prices and gene expression to extract meaningful and relevant interactions. However, PMFG is not suitable for large-scale genomic data due to several drawbacks, such as the high computation complexity O(|V|³), the presence of false-positives due to the maximal planarity constraint, and the inadequacy of the clustering framework. Here, we developed a new co-expression network analysis framework called Multiscale Embedded Gene Co-expression Network Analysis (MEGENA) by: i) introducing quality control of co-expression similarities, ii) parallelizing embedded network construction, and iii) developing a novel clustering technique to identify multi-scale clustering structures in Planar Filtered Networks (PFNs). We applied MEGENA to a series of simulated data and the gene expression data in breast carcinoma and lung adenocarcinoma from The Cancer Genome Atlas (TCGA). MEGENA showed improved performance over well-established clustering methods and co-expression network construction approaches. MEGENA revealed not only meaningful multi-scale organizations of co-expressed gene clusters but also novel targets in breast carcinoma and lung adenocarcinoma. PMID:26618778

  20. Multiscale Embedded Gene Co-expression Network Analysis.

    PubMed

    Song, Won-Min; Zhang, Bin

    2015-11-01

    Gene co-expression network analysis has been shown effective in identifying functional co-expressed gene modules associated with complex human diseases. However, existing techniques to construct co-expression networks require some critical prior information such as predefined number of clusters, numerical thresholds for defining co-expression/interaction, or do not naturally reproduce the hallmarks of complex systems such as the scale-free degree distribution or small-worldness. Previously, a graph filtering technique called Planar Maximally Filtered Graph (PMFG) has been applied to many real-world data sets such as financial stock prices and gene expression to extract meaningful and relevant interactions. However, PMFG is not suitable for large-scale genomic data due to several drawbacks, such as the high computation complexity O(|V|^3), the presence of false-positives due to the maximal planarity constraint, and the inadequacy of the clustering framework. Here, we developed a new co-expression network analysis framework called Multiscale Embedded Gene Co-expression Network Analysis (MEGENA) by: i) introducing quality control of co-expression similarities, ii) parallelizing embedded network construction, and iii) developing a novel clustering technique to identify multi-scale clustering structures in Planar Filtered Networks (PFNs). We applied MEGENA to a series of simulated data and the gene expression data in breast carcinoma and lung adenocarcinoma from The Cancer Genome Atlas (TCGA). MEGENA showed improved performance over well-established clustering methods and co-expression network construction approaches. MEGENA revealed not only meaningful multi-scale organizations of co-expressed gene clusters but also novel targets in breast carcinoma and lung adenocarcinoma.
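
    To make the planarity-constrained filtering idea concrete, the sketch below builds a toy correlation network and greedily inserts the strongest edges while keeping the graph planar, in the spirit of PMFG/PFN construction. It is not the MEGENA implementation; the random data, correlation threshold, and library choice are illustrative assumptions.

```python
# Hedged sketch: correlation-based co-expression network with a planarity
# constraint, in the spirit of PMFG-style filtering (not the MEGENA code).
import numpy as np
import networkx as nx

def planar_filtered_network(expr, min_abs_corr=0.3):
    """expr: (genes x samples) array. Returns a planar graph of strong co-expression links."""
    corr = np.corrcoef(expr)                     # pairwise Pearson correlations
    n = corr.shape[0]
    # rank candidate edges by |correlation|, strongest first (crude quality control)
    edges = [(abs(corr[i, j]), i, j)
             for i in range(n) for j in range(i + 1, n)
             if abs(corr[i, j]) >= min_abs_corr]
    edges.sort(reverse=True)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for w, i, j in edges:
        G.add_edge(i, j, weight=w)
        is_planar, _ = nx.check_planarity(G)
        if not is_planar:                        # keep the graph planar, as in PMFG/PFN
            G.remove_edge(i, j)
    return G

# toy usage with random "expression" data
rng = np.random.default_rng(0)
net = planar_filtered_network(rng.normal(size=(30, 20)))
print(net.number_of_edges())
```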

  1. Mitigating Handoff Call Dropping in Wireless Cellular Networks: A Call Admission Control Technique

    NASA Astrophysics Data System (ADS)

    Ekpenyong, Moses Effiong; Udoh, Victoria Idia; Bassey, Udoma James

    2016-06-01

    Handoff management has been an important but challenging issue in the field of wireless communication. It seeks to maintain seamless connectivity of mobile users changing their points of attachment from one base station to another. This paper derives a call admission control model and establishes an optimal step-size coefficient (k) that regulates the admission probability of handoff calls. An operational CDMA network carrier was investigated through the analysis of empirical data collected over a period of 1 month, to verify the performance of the network. Our findings revealed that approximately 23 % of calls in the existing system were lost, while 40 % of the calls (on average) were successfully admitted. A simulation of the proposed model was then carried out under ideal network conditions to study the relationship between the various network parameters and validate our claim. Simulation results showed that increasing the step-size coefficient degrades the network performance. Even at the optimum step-size (k), the network could still be compromised in the presence of severe network crises, but our model was able to recover from these problems and still function normally.
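
    The paper's model is only summarised above, so the sketch below is a generic guard-channel style simulation in which a coefficient k scales how aggressively handoff calls are admitted as the cell fills up. The admission rule, capacities, and traffic rates are illustrative assumptions, not the authors' derivation.

```python
# Hedged sketch: toy cell with C channels; new calls are blocked above a guard
# threshold, while handoff calls are admitted with a probability shaped by a
# step-size-like coefficient k. Purely illustrative, not the paper's model.
import random

def simulate(k, C=20, guard=2, arrivals=20000, p_handoff=0.3, seed=0):
    random.seed(seed)
    busy, dropped_handoffs, handoffs = 0, 0, 0
    for _ in range(arrivals):
        if busy and random.random() < 0.5:       # crude call departures
            busy -= 1
        if random.random() < p_handoff:          # a handoff call arrives
            handoffs += 1
            p_admit = max(0.0, 1.0 - k * busy / C)
            if busy < C and random.random() < p_admit:
                busy += 1
            else:
                dropped_handoffs += 1
        else:                                    # a new call arrives
            if busy < C - guard:
                busy += 1
    return dropped_handoffs / handoffs

for k in (0.2, 0.6, 1.0):
    print(f"k={k}: handoff dropping ratio ~ {simulate(k):.3f}")
```

    In this toy setup, raising k lowers the admission probability at high occupancy and so increases handoff dropping, which is consistent in direction with the reported effect of increasing the step-size coefficient.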

  2. Texton-based analysis of paintings

    NASA Astrophysics Data System (ADS)

    van der Maaten, Laurens J. P.; Postma, Eric O.

    2010-08-01

    The visual examination of paintings is traditionally performed by skilled art historians using their eyes. Recent advances in intelligent systems may support art historians in determining the authenticity or date of creation of paintings. In this paper, we propose a technique for the examination of brushstroke structure that views the wildly overlapping brushstrokes as texture. The analysis of the painting texture is performed with the help of a texton codebook, i.e., a codebook of small prototypical textural patches. The texton codebook can be learned from a collection of paintings. Our textural analysis technique represents paintings in terms of histograms that measure the frequency by which the textons in the codebook occur in the painting (so-called texton histograms). We present experiments that show the validity and effectiveness of our technique for textural analysis on a collection of digitized high-resolution reproductions of paintings by Van Gogh and his contemporaries. As texton histograms cannot easily be interpreted by art experts, the paper proposes two approaches to visualize the results of the textural analysis. The first approach visualizes the similarities between the histogram representations of paintings by employing a recently proposed dimensionality reduction technique, called t-SNE. We show that t-SNE reveals a clear separation of paintings created by Van Gogh and those created by other painters. In addition, the period of creation is faithfully reflected in the t-SNE visualizations. The second approach visualizes the similarities and differences between paintings by highlighting regions in a painting in which the textural structure of the painting is unusual. We illustrate the validity of this approach by means of an experiment in which we highlight regions in a painting by Monet that are not very "Van Gogh-like". Taken together, we believe the tools developed in this study are well capable of assisting art historians in their study of paintings.
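
    As a generic illustration of the texton-histogram idea (not the authors' pipeline; the patch size, codebook size, and random stand-in images are assumptions), a codebook can be learned with k-means and each painting then represented by the frequency of its nearest textons:

```python
# Hedged sketch: k-means texton codebook and texton histograms for images.
import numpy as np
from sklearn.cluster import KMeans

def extract_patches(img, size=5, step=5):
    h, w = img.shape
    return np.array([img[r:r+size, c:c+size].ravel()
                     for r in range(0, h - size + 1, step)
                     for c in range(0, w - size + 1, step)])

rng = np.random.default_rng(1)
training_patches = extract_patches(rng.random((100, 100)))   # stand-in for painting scans
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(training_patches)

def texton_histogram(img, codebook):
    labels = codebook.predict(extract_patches(img))           # nearest texton per patch
    hist = np.bincount(labels, minlength=codebook.n_clusters)
    return hist / hist.sum()                                  # relative texton frequencies

print(texton_histogram(rng.random((100, 100)), codebook))
```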

  3. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    The nuclear reaction analysis technique is mainly based on the relative method or the use of activation cross sections. In order to validate nuclear data for the calculated cross sections evaluated from systematic studies, we used the neutron activation analysis technique (NAA) to determine the concentrations of the various constituents of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. Called the absolute method, it allows a measurement as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
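
    For orientation, a commonly used form of the activation equation underlying the absolute method is sketched below; the symbols and the simple irradiation-decay timing are textbook assumptions, not details taken from this particular study.

```latex
A \;=\; \frac{m\,N_A\,\theta}{M}\,\sigma\,\varphi\,\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)\,e^{-\lambda t_{\mathrm{d}}}
```

    Here A is the measured activity, m the mass of the element, N_A Avogadro's number, theta the isotopic abundance, M the atomic mass, sigma the activation cross section, varphi the neutron flux, lambda the decay constant of the product nuclide, and t_irr and t_d the irradiation and decay times; solving for m yields the concentration without a standard sample.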

  4. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions, variance homogeneity and normality, that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions is often caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the deprecation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
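
    For orientation, the Box-Cox correction mentioned above can be applied with standard statistical libraries; the synthetic data below are an assumption for illustration, not a toxicity data set from the study.

```python
# Hedged sketch: Box-Cox transformation to stabilise variance before/within a
# regression fit; data are synthetic and skewed on purpose.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
response = rng.lognormal(mean=1.0, sigma=0.6, size=50)   # skewed, heteroscedastic-looking data

transformed, lam = stats.boxcox(response)                # lambda estimated by maximum likelihood
print(f"estimated Box-Cox lambda: {lam:.3f}")
print("variance before:", response.var(), "after:", transformed.var())
```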

  5. Feasibility of MOS Task Analysis and Redesign to Reduce Physical Demands in the U.S. Army

    DTIC Science & Technology

    1997-12-01

    developed to study perchery workers (Scott & Lamb, 1996). Another posture analysis technique is called postural targeting (Corlett et al., 1979). A...method which had been successfully applied to a variety of situations (Lee & Chiou, 1995; Scott & Lamb, 1996). Some modifications were made in the...Scott, G.B., & Lamb, N.R. (1996). Working practices in a perchery system, using the Ovako Working Posture Analyzing System (OWAS). Applied Ergonomics

  6. Single-molecule fluorescence microscopy review: shedding new light on old problems

    PubMed Central

    Shashkova, Sviatlana

    2017-01-01

    Fluorescence microscopy is an invaluable tool in the biosciences, a genuine workhorse technique offering exceptional contrast in conjunction with high specificity of labelling with relatively minimal perturbation to biological samples compared with many competing biophysical techniques. Improvements in detector and dye technologies coupled to advances in image analysis methods have fuelled recent development towards single-molecule fluorescence microscopy, which can utilize light microscopy tools to enable the faithful detection and analysis of single fluorescent molecules used as reporter tags in biological samples. For example, the discovery of GFP, initiating the so-called ‘green revolution’, has pushed experimental tools in the biosciences to a completely new level of functional imaging of living samples, culminating in single fluorescent protein molecule detection. Today, fluorescence microscopy is an indispensable tool in single-molecule investigations, providing a high signal-to-noise ratio for visualization while still retaining the key features in the physiological context of native biological systems. In this review, we discuss some of the recent discoveries in the life sciences which have been enabled using single-molecule fluorescence microscopy, paying particular attention to the so-called ‘super-resolution’ fluorescence microscopy techniques in live cells, which are at the cutting-edge of these methods. In particular, how these tools can reveal new insights into long-standing puzzles in biology: old problems, which have been impossible to tackle using other more traditional tools until the emergence of new single-molecule fluorescence microscopy techniques. PMID:28694303

  7. Diffraction enhanced x-ray imaging for quantitative phase contrast studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, A. K.; Singh, B., E-mail: balwants@rrcat.gov.in; Kashyap, Y. S.

    2016-05-23

    Conventional X-ray imaging based on absorption contrast permits limited visibility of features having small density and thickness variations. For imaging of weakly absorbing materials or materials possessing similar densities, a novel phase contrast imaging technique called diffraction enhanced imaging has been designed and developed at the imaging beamline of Indus-2, RRCAT Indore. The technique provides improved visibility of interfaces and shows high contrast in the image for small density or thickness gradients in the bulk. This paper presents the basic principle, instrumentation and analysis methods for this technique. Initial results of quantitative phase retrieval carried out on various samples are also presented.

  8. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  9. A simple algorithm for quantifying DNA methylation levels on multiple independent CpG sites in bisulfite genomic sequencing electropherograms.

    PubMed

    Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A

    2008-06-01

    DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are either laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
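
    As a generic illustration of quantifying methylation from a four-dye electropherogram (unmethylated cytosines read as thymine after bisulfite conversion), one can take the C-peak fraction at each CpG position. The peak heights below are invented, and this is a sketch of the general idea rather than the published Mquant algorithm.

```python
# Hedged sketch: per-CpG methylation estimated as C / (C + T) peak height.
def methylation_level(c_peak, t_peak):
    """Fraction methylated at one CpG site from C and T peak heights."""
    total = c_peak + t_peak
    return c_peak / total if total > 0 else float("nan")

# toy electropherogram peak heights at three CpG sites
sites = [(820, 190), (410, 600), (55, 930)]
for c, t in sites:
    print(f"C={c:4d} T={t:4d} -> methylation ~ {methylation_level(c, t):.2f}")
```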

  10. Visualization of Concurrent Program Executions

    NASA Technical Reports Server (NTRS)

    Artho, Cyrille; Havelund, Klaus; Honiden, Shinichi

    2007-01-01

    Various program analysis techniques are efficient at discovering failures and properties. However, it is often difficult to evaluate results, such as program traces. This calls for abstraction and visualization tools. We propose an approach based on UML sequence diagrams, addressing shortcomings of such diagrams for concurrency. The resulting visualization is expressive and provides all the necessary information at a glance.

  11. "Whoa! We're Going Deep in the Trees!": Patterns of Collaboration around an Interactive Information Visualization Exhibit

    ERIC Educational Resources Information Center

    Davis, Pryce; Horn, Michael; Block, Florian; Phillips, Brenda; Evans, E. Margaret; Diamond, Judy; Shen, Chia

    2015-01-01

    In this paper we present a qualitative analysis of natural history museum visitor interaction around a multi-touch tabletop exhibit called "DeepTree" that we designed around concepts of evolution and common descent. DeepTree combines several large scientific datasets and an innovative visualization technique to display a phylogenetic…

  12. What does nonforest land contribute to the global carbon balance?

    Treesearch

    Jennifer C. Jenkins; Rachel Riemann

    2002-01-01

    An inventory of land traditionally called "nonforest" and therefore not sampled by the Forest Inventory and Analysis (FIA) program was implemented by the FIA unit at the Northeastern Station in 1999 for five counties in Maryland. Biomass and biomass increment were estimated from the nonforest inventory data using techniques developed for application to large-...

  13. Ultrasonic inspection and analysis techniques in green and dried lumber

    Treesearch

    Mark E. Schafer; Robert J. Ross; Brian K. Brashaw; Roy D. Adams

    1999-01-01

    Ultrasonic inspection of lumber has been under investigation for over 20 years, with little commercial impact. Recently, the USDA Forest Products Laboratory (FPL) developed ultrasound-based scanning technology to examine both green and dried lumber. In green lumber, the bacterial infection called wetwood (a significant source of degradation in oak at the kiln-drying...

  14. White Paper: A Defect Prioritization Method Based on the Risk Priority Number

    DTIC Science & Technology

    2013-11-01

    adapted... The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called Risk Priority Number (RPN) to quantify the... [fragment of Table 1 - Time Scaling Factors: Up to an hour, 16-60, 1.5; Brief Interrupt, 0-15, 1] ...In the FMEA formulation, RPN is a product of the three categories
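
    The excerpt notes that RPN is a product of three categories; in the standard FMEA formulation (stated here only for orientation, since the white paper's adapted scaling factors are only partially visible above) this is:

```latex
\mathrm{RPN} \;=\; S \times O \times D
```

    where S is the severity rating, O the occurrence (likelihood) rating, and D the detection rating of a failure mode.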

  15. Taming the Wild: A Unified Analysis of Hogwild!-Style Algorithms.

    PubMed

    De Sa, Christopher; Zhang, Ce; Olukotun, Kunle; Ré, Christopher

    2015-12-01

    Stochastic gradient descent (SGD) is a ubiquitous algorithm for a variety of machine learning problems. Researchers and industry have developed several techniques to optimize SGD's runtime performance, including asynchronous execution and reduced precision. Our main result is a martingale-based analysis that enables us to capture the rich noise models that may arise from such techniques. Specifically, we use our new analysis in three ways: (1) we derive convergence rates for the convex case (Hogwild!) with relaxed assumptions on the sparsity of the problem; (2) we analyze asynchronous SGD algorithms for non-convex matrix problems including matrix completion; and (3) we design and analyze an asynchronous SGD algorithm, called Buckwild!, that uses lower-precision arithmetic. We show experimentally that our algorithms run efficiently for a variety of problems on modern hardware.
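
    As a toy illustration of lock-free asynchronous SGD in the Hogwild! style (not the authors' implementation of Hogwild! or Buckwild!; the problem size, step size, and thread count are assumptions):

```python
# Hedged sketch: threads update a shared parameter vector without locks while
# running SGD on a least-squares problem.
import numpy as np
import threading

rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.01 * rng.normal(size=1000)

w = np.zeros(10)                  # shared parameters, updated lock-free
step = 0.01

def worker(seed, n_steps):
    local_rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        i = local_rng.integers(len(y))
        grad = (X[i] @ w - y[i]) * X[i]   # stochastic gradient of 0.5*(x_i.w - y_i)^2
        w[:] = w - step * grad            # unsynchronised in-place update

threads = [threading.Thread(target=worker, args=(s, 2000)) for s in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print("parameter error:", np.linalg.norm(w - w_true))
```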

  16. Transmission ultrasonography. [time delay spectrometry for soft tissue transmission imaging

    NASA Technical Reports Server (NTRS)

    Heyser, R. C.; Le Croissette, D. H.

    1973-01-01

    Review of the results of the application of an advanced signal-processing technique, called time delay spectrometry, in obtaining soft tissue transmission images by transmission ultrasonography, both in vivo and in vitro. The presented results include amplitude ultrasound pictures and phase ultrasound pictures obtained by this technique. While amplitude ultrasonographs of tissue are closely analogous to X-ray pictures in that differential absorption is imaged, phase ultrasonographs represent an entirely new source of information based on differential time of propagation. Thus, a new source of information is made available for detailed analysis.

  17. Fringe pattern demodulation with a two-dimensional digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-dimensional digital phase-locked loop (DPLL) for fringe pattern demodulation is presented. This algorithm is more suitable for demodulation of fringe patterns with varying phase in two directions than the existing DPLL techniques that assume that the phase of the fringe patterns varies only in one direction. The two-dimensional DPLL technique assumes that the phase of a fringe pattern is continuous in both directions and takes advantage of the phase continuity; consequently, the algorithm has better noise performance than the existing DPLL schemes. The two-dimensional DPLL algorithm is also suitable for demodulation of fringe patterns with low sampling rates, and it outperforms the Fourier fringe analysis technique in this aspect.

  18. Applying traditional signal processing techniques to social media exploitation for situational understanding

    NASA Astrophysics Data System (ADS)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.

  19. Correlating Detergent Fiber Analysis and Dietary Fiber Analysis Data for Corn Stover

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfrum, E. J.; Lorenz, A. J.; deLeon, N.

    There exist large amounts of detergent fiber analysis data [neutral detergent fiber (NDF), acid detergent fiber (ADF), acid detergent lignin (ADL)] for many different potential cellulosic ethanol feedstocks, since these techniques are widely used for the analysis of forages. Researchers working in the area of cellulosic ethanol are interested in the structural carbohydrates in a feedstock (principally glucan and xylan), which are typically determined by acid hydrolysis of the structural fraction after multiple extractions of the biomass. These so-called dietary fiber analysis methods are significantly more involved than detergent fiber analysis methods. The purpose of this study was to determine whether it is feasible to correlate detergent fiber analysis values to glucan and xylan content determined by dietary fiber analysis methods for corn stover. In the detergent fiber analysis literature cellulose is often estimated as the difference between ADF and ADL, while hemicellulose is often estimated as the difference between NDF and ADF. Examination of a corn stover dataset containing both detergent fiber analysis data and dietary fiber analysis data predicted using near infrared spectroscopy shows that correlations between structural glucan measured using dietary fiber techniques and cellulose estimated using detergent techniques, and between structural xylan measured using dietary fiber techniques and hemicellulose estimated using detergent techniques are high, but are driven largely by the underlying correlation between total extractives measured by fiber analysis and NDF/ADF. That is, detergent analysis data is correlated to dietary fiber analysis data for structural carbohydrates, but only indirectly; the main correlation is between detergent analysis data and solvent extraction data produced during the dietary fiber analysis procedure.
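
    As a compact restatement of the conventional detergent-fiber approximations mentioned above (these are the standard estimates from that literature, not new results of this study):

```latex
\text{cellulose} \;\approx\; \mathrm{ADF} - \mathrm{ADL},
\qquad
\text{hemicellulose} \;\approx\; \mathrm{NDF} - \mathrm{ADF}
```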

  20. Distributed Contour Trees

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Weber, Gunther H.

    2014-03-31

    Topological techniques provide robust tools for data analysis. They are used, for example, for feature extraction, for data de-noising, and for comparison of data sets. This chapter concerns contour trees, a topological descriptor that records the connectivity of the isosurfaces of scalar functions. These trees are fundamental to analysis and visualization of physical phenomena modeled by real-valued measurements. We study the parallel analysis of contour trees. After describing a particular representation of a contour tree, called local-global representation, we illustrate how different problems that rely on contour trees can be solved in parallel with minimal communication.

  1. Biostatistics Series Module 10: Brief Overview of Multivariate Methods.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2017-01-01

    Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, that make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count type of data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA), in which an additional independent variable of interest, the covariate, is brought into the analysis. It tries to examine whether a difference persists after "controlling" for the effect of the covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied to psychometrics, social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract from a larger number of metric variables, a smaller number of composite factors or components, which are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. The calculation intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with wider availability, and increasing sophistication of statistical software and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
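
    As a minimal illustration of the interdependence techniques described above, the sketch below runs a principal component analysis on synthetic data; the data set, component count, and library choice are illustrative assumptions, not part of the module.

```python
# Hedged sketch: PCA reducing six correlated observed variables to two
# composite components that are linear combinations of the originals.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
latent = rng.normal(size=(200, 2))                           # two underlying factors
loadings = rng.normal(size=(2, 6))
data = latent @ loadings + 0.1 * rng.normal(size=(200, 6))   # six observed variables

pca = PCA(n_components=2)
scores = pca.fit_transform(data)
print("variance explained by two components:", pca.explained_variance_ratio_.sum())
```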

  2. Numerical solution of stiff systems of ordinary differential equations with applications to electronic circuits

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. S.

    1971-01-01

    Systems of ordinary differential equations in which the magnitudes of the eigenvalues (or time constants) vary greatly are commonly called stiff. Such systems of equations arise in nuclear reactor kinetics, the flow of chemically reacting gas, dynamics, control theory, circuit analysis and other fields. The research reported develops an A-stable numerical integration technique for solving stiff systems of ordinary differential equations. The method, which is called the generalized trapezoidal rule, is a modification of the trapezoidal rule. However, the method is computationally more efficient than the trapezoidal rule when the solution of the almost-discontinuous segments is being calculated.
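
    Since the generalized trapezoidal rule itself is not spelled out in this abstract, the sketch below shows the classical A-stable trapezoidal rule on a stiff scalar test equation, as a point of reference only; the test problem and step count are illustrative assumptions, and this is not the author's modified scheme.

```python
# Hedged sketch: implicit trapezoidal rule on the stiff test problem
# y' = -1000*(y - cos(t)); the linearity of f lets the implicit step be solved exactly.
import numpy as np

lam = 1000.0
f = lambda t, y: -lam * (y - np.cos(t))

def trapezoidal(y0, t0, t1, n):
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        # y_{k+1} = y_k + h/2 * (f(t_k, y_k) + f(t_{k+1}, y_{k+1}))
        y = (y + 0.5 * h * (f(t, y) + lam * np.cos(t + h))) / (1.0 + 0.5 * h * lam)
        t += h
    return y

print(trapezoidal(y0=0.0, t0=0.0, t1=1.0, n=50), "vs quasi-steady value", np.cos(1.0))
```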

  3. Tracking urban human activity from mobile phone calling patterns

    PubMed Central

    Ghosh, Asim; Bhattacharya, Kunal; Dunbar, Robin I. M.; Kaski, Kimmo

    2017-01-01

    Timings of human activities are marked by circadian clocks which in turn are entrained to different environmental signals. In an urban environment the presence of artificial lighting and various social cues tend to disrupt the natural entrainment with the sunlight. However, it is not completely understood to what extent this is the case. Here we exploit the large-scale data analysis techniques to study the mobile phone calling activity of people in large cities to infer the dynamics of urban daily rhythms. From the calling patterns of about 1,000,000 users spread over different cities but lying inside the same time-zone, we show that the onset and termination of the calling activity synchronizes with the east-west progression of the sun. We also find that the onset and termination of the calling activity of users follows a yearly dynamics, varying across seasons, and that its timings are entrained to solar midnight. Furthermore, we show that the average mid-sleep time of people living in urban areas depends on the age and gender of each cohort as a result of biological and social factors. PMID:29161270

  4. Tracking urban human activity from mobile phone calling patterns.

    PubMed

    Monsivais, Daniel; Ghosh, Asim; Bhattacharya, Kunal; Dunbar, Robin I M; Kaski, Kimmo

    2017-11-01

    Timings of human activities are marked by circadian clocks which in turn are entrained to different environmental signals. In an urban environment the presence of artificial lighting and various social cues tend to disrupt the natural entrainment with the sunlight. However, it is not completely understood to what extent this is the case. Here we exploit the large-scale data analysis techniques to study the mobile phone calling activity of people in large cities to infer the dynamics of urban daily rhythms. From the calling patterns of about 1,000,000 users spread over different cities but lying inside the same time-zone, we show that the onset and termination of the calling activity synchronizes with the east-west progression of the sun. We also find that the onset and termination of the calling activity of users follows a yearly dynamics, varying across seasons, and that its timings are entrained to solar midnight. Furthermore, we show that the average mid-sleep time of people living in urban areas depends on the age and gender of each cohort as a result of biological and social factors.

  5. Interactive visual optimization and analysis for RFID benchmarking.

    PubMed

    Wu, Yingcai; Chung, Ka-Kei; Qu, Huamin; Yuan, Xiaoru; Cheung, S C

    2009-01-01

    Radio frequency identification (RFID) is a powerful automatic remote identification technique that has wide applications. To facilitate RFID deployment, an RFID benchmarking instrument called aGate has been invented to identify the strengths and weaknesses of different RFID technologies in various environments. However, the data acquired by aGate are usually complex time varying multidimensional 3D volumetric data, which are extremely challenging for engineers to analyze. In this paper, we introduce a set of visualization techniques, namely, parallel coordinate plots, orientation plots, a visual history mechanism, and a 3D spatial viewer, to help RFID engineers analyze benchmark data visually and intuitively. With the techniques, we further introduce two workflow procedures (a visual optimization procedure for finding the optimum reader antenna configuration and a visual analysis procedure for comparing the performance and identifying the flaws of RFID devices) for the RFID benchmarking, with focus on the performance analysis of the aGate system. The usefulness and usability of the system are demonstrated in the user evaluation.

  6. A generalized baleen whale call detection and classification system.

    PubMed

    Baumgartner, Mark F; Mussoline, Sarah E

    2011-05-01

    Passive acoustic monitoring allows the assessment of marine mammal occurrence and distribution at greater temporal and spatial scales than is now possible with traditional visual surveys. However, the large volume of acoustic data and the lengthy and laborious task of manually analyzing these data have hindered broad application of this technique. To overcome these limitations, a generalized automated detection and classification system (DCS) was developed to efficiently and accurately identify low-frequency baleen whale calls. The DCS (1) accounts for persistent narrowband and transient broadband noise, (2) characterizes temporal variation of dominant call frequencies via pitch-tracking, and (3) classifies calls based on attributes of the resulting pitch tracks using quadratic discriminant function analysis (QDFA). Automated detections of sei whale (Balaenoptera borealis) downsweep calls and North Atlantic right whale (Eubalaena glacialis) upcalls were evaluated using recordings collected in the southwestern Gulf of Maine during the spring seasons of 2006 and 2007. The accuracy of the DCS was similar to that of a human analyst: variability in differences between the DCS and an analyst was similar to that between independent analysts, and temporal variability in call rates was similar among the DCS and several analysts.

  7. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the systems transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provides a way of handling structural identifiability in mixed-effects models previously not possible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. MPATHav: A software prototype for multiobjective routing in transportation risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganter, J.H.; Smith, J.D.

    Most routing problems depend on several important variables: transport distance, population exposure, accident rate, mandated roads (e.g., HM-164 regulations), and proximity to emergency response resources are typical. These variables may need to be minimized or maximized, and often are weighted. "Objectives" to be satisfied by the analysis are thus created. The resulting problems can be approached by combining spatial analysis techniques from geographic information systems (GIS) with multiobjective analysis techniques from the field of operations research (OR); we call this hybrid "multiobjective spatial analysis" (MOSA). MOSA can be used to discover, display, and compare a range of solutions that satisfy a set of objectives to varying degrees. For instance, a suite of solutions may include: one solution that provides short transport distances, but at a cost of high exposure; another solution that provides low exposure, but long distances; and a range of solutions between these two extremes.
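
    As a generic illustration of trading off objectives on a network (not the MPATHav prototype; the toy graph, objective weights, and library are assumptions), one common approach is a weighted-sum edge cost whose weights are swept to expose the range of compromise routes:

```python
# Hedged sketch: weighted-sum compromise between distance and population
# exposure on a toy road graph.
import networkx as nx

G = nx.Graph()
# edges: (u, v, distance_km, exposed_population)
edges = [("A", "B", 10, 500), ("B", "D", 12, 50), ("A", "C", 15, 20),
         ("C", "D", 14, 30), ("B", "C", 4, 400)]
for u, v, dist, pop in edges:
    G.add_edge(u, v, dist=dist, pop=pop)

def route_for_weights(w_dist, w_pop):
    cost = lambda u, v, d: w_dist * d["dist"] + w_pop * d["pop"]
    return nx.shortest_path(G, "A", "D", weight=cost)

# sweep the weights to expose the range of compromise solutions
for w in (0.0, 0.5, 1.0):
    print(f"pop weight {w}: route {route_for_weights(1.0 - w, w)}")
```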

  9. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    PubMed Central

    Jabłońska-Czapla, Magdalena

    2015-01-01

    Chemical speciation is a very important subject in the environmental protection, toxicology, and chemical analytics due to the fact that toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question why the speciation analytics is so important. The paper also provides numerous examples of the hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962

  10. Principles, Techniques, and Applications of Tissue Microfluidics

    NASA Technical Reports Server (NTRS)

    Wade, Lawrence A.; Kartalov, Emil P.; Shibata, Darryl; Taylor, Clive

    2011-01-01

    The principle of tissue microfluidics and its resultant techniques have been applied to cell analysis. Building microfluidics to suit a particular tissue sample would allow the rapid, reliable, inexpensive, highly parallelized, selective extraction of chosen regions of tissue for purposes of further biochemical analysis. Furthermore, the applicability of the techniques ranges beyond the described pathology application. For example, they would also allow the posing and successful answering of new sets of questions in many areas of fundamental research. The proposed integration of microfluidic techniques and tissue slice samples is called "tissue microfluidics" because it molds the microfluidic architectures in accordance with each particular structure of each specific tissue sample. Thus, microfluidics can be built around the tissues, following the tissue structure, or alternatively, the microfluidics can be adapted to the specific geometry of particular tissues. By contrast, the traditional approach is that microfluidic devices are structured in accordance with engineering considerations, while the biological components in applied devices are forced to comply with these engineering presets.

  11. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  12. A Laboratory Course for Teaching Laboratory Techniques, Experimental Design, Statistical Analysis, and Peer Review Process to Undergraduate Science Students

    ERIC Educational Resources Information Center

    Gliddon, C. M.; Rosengren, R. J.

    2012-01-01

    This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. This course used a guided inquiry based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and…

  13. A Study in Critical Listening Using Eight to Ten Year Olds in an Analysis of Commercial Propaganda Emanating from Television.

    ERIC Educational Resources Information Center

    Cook, Jimmie Ellis

    Selected eight to ten year old Maryland children were used in this study measuring the effect of lessons in becoming aware of propaganda employed by commercial advertisers in television programs. Sixteen 45-minute lessons directed to the propaganda techniques of Band Wagon, Card Stacking, Glittering Generalities, Name Calling, Plain Folks,…

  14. Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Brune, Ryan Carl

    Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved with respect to output pressure events in both magnitude and distribution. In order to address this need, a novel pressure measurement has been developed called the Profile Indentation Pressure Evaluation (PIPE) method that systematically analyzes indentation patterns created with impulse events. Correlation with quasi-static test data and use of software-assisted analysis techniques allows for colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided introduction of a design method for electromagnetic path actuator systems, where key geometrical variables are considered using a newly developed analysis method, which is called the Path Actuator Proximal Array (PAPA) pressure model. This model considers key current distribution and proximity effects and interprets generated pressure by considering the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow for pressure requirements to be calculated for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, geometry effect is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram. Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Overall, these studies comprehensively explore the effects of geometrical parameters on magnitude and distribution of impulse manufacturing generated pressure, establishing key guidelines and models for continued development and implementation in commercial applications.

  15. Computer codes for thermal analysis of a solid rocket motor nozzle

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1988-01-01

    A number of computer codes are available for performing thermal analysis of solid rocket motor nozzles. The Aerotherm Chemical Equilibrium (ACE) computer program can be used to perform one-dimensional gas expansion to determine the state of the gas at each location of a nozzle. The ACE outputs can be used as input to a computer program called Momentum/Energy Integral Technique (MEIT) for predicting boundary layer development, shear, and heating on the surface of the nozzle. The output from MEIT can be used as input to another computer program called Aerotherm Charring Material Thermal Response and Ablation Program (CMA). This program is used to calculate the ablation or decomposition response of the nozzle material. A code called Failure Analysis Nonlinear Thermal and Structural Integrated Code (FANTASTIC) is also likely to be used for performing thermal analysis of solid rocket motor nozzles after the program is duly verified. Part of the verification work on FANTASTIC was done by using one- and two-dimensional heat transfer examples with known answers. An attempt was made to prepare input for performing thermal analysis of the CCT nozzle using the FANTASTIC computer code. The CCT nozzle problem will first be solved by using ACE, MEIT, and CMA. The same problem will then be solved using FANTASTIC. These results will then be compared for verification of FANTASTIC.

  16. Scaling range of power laws that originate from fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Grech, Dariusz; Mazur, Zygmunt

    2013-05-01

    We extend our previous study of scaling range properties performed for detrended fluctuation analysis (DFA) [Physica A 392, 2384 (2013); doi:10.1016/j.physa.2013.01.049] to other techniques of fluctuation analysis (FA). The new technique, called modified detrended moving average analysis (MDMA), is introduced, and its scaling range properties are examined and compared with those of detrended moving average analysis (DMA) and DFA. It is shown that contrary to DFA, DMA and MDMA techniques exhibit power law dependence of the scaling range with respect to the length of the searched signal and with respect to the accuracy R^2 of the fit to the considered scaling law imposed by DMA or MDMA methods. This power law dependence is satisfied for both uncorrelated and autocorrelated data. We find also a simple generalization of this power law relation for series with a different level of autocorrelations measured in terms of the Hurst exponent. Basic relations between scaling ranges for different techniques are also discussed. Our findings should be particularly useful for local FA in, e.g., econophysics, finances, or physiology, where the huge number of short time series has to be examined at once and wherever the preliminary check of the scaling range regime for each of the series separately is neither effective nor possible.
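
    For reference, a minimal detrended fluctuation analysis is sketched below, showing the fluctuation function F(s) whose scaling range the paper studies; the window sizes, first-order detrending, and white-noise test signal are illustrative choices rather than the paper's setup.

```python
# Hedged sketch: DFA1 of uncorrelated noise; the slope of log F(s) vs log s
# over the scaling range estimates the Hurst exponent (about 0.5 here).
import numpy as np

def dfa(signal, scales):
    profile = np.cumsum(signal - signal.mean())           # integrated series
    F = []
    for s in scales:
        n_seg = len(profile) // s
        rms = []
        for k in range(n_seg):
            seg = profile[k * s:(k + 1) * s]
            x = np.arange(s)
            trend = np.polyval(np.polyfit(x, seg, 1), x)  # local linear detrending
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

rng = np.random.default_rng(4)
noise = rng.normal(size=10_000)
scales = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(noise, scales)
H = np.polyfit(np.log(scales), np.log(F), 1)[0]           # slope over the scaling range
print(f"estimated Hurst exponent: {H:.2f}")
```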

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatt, A.

    The 60th anniversary of the discovery of neutron activation analysis (NAA) by Hevesy and Levi is being celebrated in 1996. With the availability of nuclear reactors capable of producing fluxes of the order of 10^12 to 10^14 n/cm^2·s, the development of high-resolution and high-efficiency conventional and anticoincidence gamma-ray detectors, multichannel pulse-height analyzers, and personal computer-based softwares, NAA has become an extremely valuable analytical technique, especially for the simultaneous determinations of multielement concentrations. This technique can be used in a number of ways, depending on the nature of the matrix, the major elements in the sample, and on the elements of interest. In most cases, several elements can be determined without any chemical pretreatment of the sample; the technique is then called instrumental NAA (INAA). In other cases, an element can be concentrated from an interfering matrix prior to irradiation; the technique is then termed preconcentration NAA (PNAA). In opposite instances, the irradiation is followed by a chemical separation of the desired element; the technique is then called radiochemical NAA (RNAA). All three forms of NAA can provide elemental concentrations of high accuracy and precision with excellent sensitivity. The number of research reactors in developing countries has increased steadily from 17 in 1955 through 71 in 1975 to 89 in 1995. Low flux reactors such as SLOWPOKE and the Chinese MNSR are primarily used for NAA.

  18. Three-dimensional segmentation of luminal and adventitial borders in serial intravascular ultrasound images

    NASA Technical Reports Server (NTRS)

    Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.

    1999-01-01

    Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.

  19. Performance Analysis of Diversity-Controlled Multi-User Superposition Transmission for 5G Wireless Networks

    PubMed Central

    Yeom, Jeong Seon; Jung, Bang Chul; Jin, Hu

    2018-01-01

    In this paper, we propose a novel low-complexity multi-user superposition transmission (MUST) technique for 5G downlink networks, which allows multiple cell-edge users to be multiplexed with a single cell-center user. We call the proposed technique diversity-controlled MUST technique since the cell-center user enjoys the frequency diversity effect via signal repetition over multiple orthogonal frequency division multiplexing (OFDM) sub-carriers. We assume that a base station is equipped with a single antenna but users are equipped with multiple antennas. In addition, we assume that the quadrature phase shift keying (QPSK) modulation is used for users. We mathematically analyze the bit error rate (BER) of both cell-edge users and cell-center users, which is the first theoretical result in the literature to the best of our knowledge. The mathematical analysis is validated through extensive link-level simulations. PMID:29439413

  20. Restoration of out-of-focus images based on circle of confusion estimate

    NASA Astrophysics Data System (ADS)

    Vivirito, Paolo; Battiato, Sebastiano; Curti, Salvatore; La Cascia, M.; Pirrone, Roberto

    2002-11-01

    In this paper a new method for fast out-of-focus blur estimation and restoration is proposed. It is suitable for CFA (Color Filter Array) images acquired by typical CCD/CMOS sensors. The method is based on the analysis of a single image and consists of two steps: 1) out-of-focus blur estimation via Bayer pattern analysis; 2) image restoration. Blur estimation is based on a block-wise edge detection technique. This edge detection is carried out on the green pixels of the CFA sensor image, also called the Bayer pattern. Once the blur level has been estimated, the image is restored through the application of a new inverse filtering technique. This algorithm gives sharp images, reducing ringing and crisping artifacts over a wider range of frequencies. Experimental results show the effectiveness of the method, in both subjective and numerical terms, by comparison with other techniques found in the literature.

  1. Performance Analysis of Diversity-Controlled Multi-User Superposition Transmission for 5G Wireless Networks.

    PubMed

    Yeom, Jeong Seon; Chu, Eunmi; Jung, Bang Chul; Jin, Hu

    2018-02-10

    In this paper, we propose a novel low-complexity multi-user superposition transmission (MUST) technique for 5G downlink networks, which allows multiple cell-edge users to be multiplexed with a single cell-center user. We call the proposed technique diversity-controlled MUST technique since the cell-center user enjoys the frequency diversity effect via signal repetition over multiple orthogonal frequency division multiplexing (OFDM) sub-carriers. We assume that a base station is equipped with a single antenna but users are equipped with multiple antennas. In addition, we assume that the quadrature phase shift keying (QPSK) modulation is used for users. We mathematically analyze the bit error rate (BER) of both cell-edge users and cell-center users, which is the first theoretical result in the literature to the best of our knowledge. The mathematical analysis is validated through extensive link-level simulations.

  2. Input-output relationship in social communications characterized by spike train analysis

    NASA Astrophysics Data System (ADS)

    Aoki, Takaaki; Takaguchi, Taro; Kobayashi, Ryota; Lambiotte, Renaud

    2016-10-01

    We study the dynamical properties of human communication through different channels, i.e., short messages, phone calls, and emails, adopting techniques from neuronal spike train analysis in order to characterize the temporal fluctuations of successive interevent times. We first measure the so-called local variation (LV) of incoming and outgoing event sequences of users and find that these in- and out-LV values are positively correlated for short messages and uncorrelated for phone calls and emails. Second, we analyze the response-time distribution after receiving a message to focus on the input-output relationship in each of these channels. We find that the time scales and amplitudes of response differ between the three channels. To understand the effects of the response-time distribution on the correlations between the LV values, we develop a point process model whose activity rate is modulated by incoming and outgoing events. Numerical simulations of the model indicate that a quick response to incoming events and a refractory effect after outgoing events are key factors to reproduce the positive LV correlations.
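
    The local variation statistic referred to above is commonly computed from successive interevent intervals as sketched below; the synthetic event sequences are assumptions for illustration, not the mobile-phone data of the paper.

```python
# Hedged sketch: local variation LV = 3/(n-1) * sum(((t_{i+1}-t_i)/(t_{i+1}+t_i))^2)
# of an interevent-time sequence; LV ~ 1 for Poisson-like timing, ~ 0 for regular timing.
import numpy as np

def local_variation(intervals):
    tau = np.asarray(intervals, dtype=float)
    num = (tau[1:] - tau[:-1]) ** 2
    den = (tau[1:] + tau[:-1]) ** 2
    return 3.0 * np.mean(num / den)

rng = np.random.default_rng(6)
poisson_like = rng.exponential(scale=10.0, size=5000)
regular = np.full(5000, 10.0)
print(local_variation(poisson_like), local_variation(regular))
```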

  3. Three Reading Comprehension Strategies: TELLS, Story Mapping, and QARs.

    ERIC Educational Resources Information Center

    Sorrell, Adrian L.

    1990-01-01

    Three reading comprehension strategies are presented to assist learning-disabled students: an advance organizer technique called "TELLS Fact or Fiction" used before reading a passage, a schema-based technique called "Story Mapping" used while reading, and a postreading method of categorizing questions called…

  4. Managing a work-life balance: the experiences of midwives working in a group practice setting.

    PubMed

    Fereday, Jennifer; Oster, Candice

    2010-06-01

    To explore how a group of midwives achieved a work-life balance working within a caseload model of care with flexible work hours and on-call work. in-depth interviews were conducted and the data were analysed using a data-driven thematic analysis technique. Children, Youth and Women's Health Service (CYWHS) (previously Women's and Children's Hospital), Adelaide, where a midwifery service known as Midwifery Group Practice (MGP) offers a caseload model of care to women within a midwife-managed unit. 17 midwives who were currently working, or had previously worked, in MGP. analysis of the midwives' individual experiences provided insight into how midwives managed the flexible hours and on-call work to achieve a sustainable work-life balance within a caseload model of care. it is important for midwives working in MGP to actively manage the flexibility of their role with time on call. Organisational, team and individual structure influenced how flexibility of hours was managed; however, a period of adjustment was required to achieve this balance. the study findings offer a description of effective, sustainable strategies to manage flexible hours and on-call work that may assist other midwives working in a similar role or considering this type of work setting. Copyright 2008 Elsevier Ltd. All rights reserved.

  5. LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS

    PubMed Central

    Einstein, Daniel R.; Dyedov, Vladimir

    2010-01-01

    Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546

  6. A data analysis expert system for large established distributed databases

    NASA Technical Reports Server (NTRS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

    A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  7. Self-consistent analysis of high drift velocity measurements with the STARE system

    NASA Technical Reports Server (NTRS)

    Reinleitner, L. A.; Nielsen, E.

    1985-01-01

    The use of the STARE and SABRE coherent radar systems as valuable tools for geophysical research has been enhanced by a new technique called the Superimposed-Grid-Point method. This method permits an analysis of E-layer plasma irregularity phase velocity versus flow angle utilizing only STARE or SABRE data. As previous work with STARE has indicated, this analysis has clearly shown that the cosine law assumption breaks down for velocities near and exceeding the local ion acoustic velocities. Use of this method is improving understanding of naturally-occurring plasma irregularities in the E-layer.

  8. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function

    PubMed Central

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.

    2009-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575

  9. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function.

    PubMed

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D

    2008-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings.

  10. ALIF: A New Promising Technique for the Decomposition and Analysis of Nonlinear and Nonstationary Signals

    NASA Astrophysics Data System (ADS)

    Cicone, A.; Zhou, H.; Piersanti, M.; Materassi, M.; Spogli, L.

    2017-12-01

    Nonlinear and nonstationary signals are ubiquitous in real life. Their decomposition and analysis is of crucial importance in many research fields. Traditional techniques, like the Fourier and wavelet transforms, have proved to be limited in this context. In the last two decades, new kinds of nonlinear methods have been developed which are able to unravel hidden features of these kinds of signals. In this poster we present a new method, called Adaptive Local Iterative Filtering (ALIF). This technique, originally developed to study mono-dimensional signals, can, unlike any other algorithm proposed so far, be easily generalized to study two- or higher-dimensional signals. Furthermore, unlike most similar methods, it does not require any a priori assumption on the signal itself, so the technique can be applied as is to any kind of signal. Applications of the ALIF algorithm to real-life signal analysis will be presented, for instance the behavior of the water level near the coastline in the presence of a tsunami, the length-of-day signal, pressure measured at ground level on a global grid, and radio power scintillation from GNSS signals.
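
    The full ALIF algorithm (with its adaptive, a priori-free filters) is not reproduced here; the toy sketch below only illustrates the basic iterative-filtering idea of repeatedly subtracting a local moving average to isolate a fluctuation component from a 1D signal. The window length and tolerance are arbitrary illustrative choices.

```python
import numpy as np

def moving_average(x, window):
    """Centered moving average with reflected edges."""
    pad = window // 2
    xp = np.pad(x, pad, mode="reflect")
    kernel = np.ones(window) / window
    return np.convolve(xp, kernel, mode="same")[pad:len(xp) - pad]

def iterative_filtering(signal, window=51, max_iter=50, tol=1e-6):
    """Extract one fluctuation component by repeatedly subtracting a
    local moving average (a crude stand-in for ALIF's adaptive filter)."""
    component = np.asarray(signal, dtype=float).copy()
    for _ in range(max_iter):
        trend = moving_average(component, window)
        new_component = component - trend
        # Stop when the remaining local trend is negligible.
        if np.linalg.norm(trend) < tol * np.linalg.norm(component):
            component = new_component
            break
        component = new_component
    residual = np.asarray(signal, dtype=float) - component
    return component, residual

# Toy signal: a fast oscillation riding on a slow trend.
t = np.linspace(0, 10, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * t
oscillation, slow_part = iterative_filtering(x, window=101)
```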

  11. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    PubMed

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA gel electrophoresis images, called GELect, which was written in Java and made available through the imageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix, intelligently extract distorted and even doublet bands that are difficult to identify by existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically allowing users to efficiently conduct large scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
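
    GELect itself is written in Java on top of the ImageJ framework; purely as an illustration of the first step of such a workflow (lane segmentation from the column-wise intensity profile of a gel image), here is a hypothetical NumPy/SciPy sketch. The array `gel`, the smoothing scale and the peak-spacing parameter are assumptions for illustration, not GELect's actual algorithm.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.ndimage import gaussian_filter1d

def segment_lanes(gel, min_lane_width=20, smooth_sigma=5):
    """Return approximate lane centers and column ranges of a gel image.

    `gel` is a 2D grayscale array with dark bands on a light background,
    so lanes show up as minima of the column-wise mean intensity.  The
    profile is inverted and peaks separated by at least `min_lane_width`
    columns are taken as lane centers.
    """
    profile = gel.mean(axis=0)                      # one value per image column
    inverted = profile.max() - profile              # lanes become peaks
    smoothed = gaussian_filter1d(inverted, smooth_sigma)
    centers, _ = find_peaks(smoothed, distance=min_lane_width)
    # Lane boundaries: midpoints between neighbouring centers.
    bounds = (centers[:-1] + centers[1:]) // 2
    edges = np.concatenate(([0], bounds, [gel.shape[1]]))
    return centers, [(edges[i], edges[i + 1]) for i in range(len(centers))]

# Usage with a synthetic gel-like image containing four lanes.
rng = np.random.default_rng(0)
gel = np.full((400, 300), 200.0) + rng.normal(0, 5, (400, 300))
for c in (50, 110, 170, 230):
    gel[:, c - 8:c + 8] -= 80                       # darker lane stripes
centers, lane_ranges = segment_lanes(gel)
print(centers)
```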

  12. Influence of Synaptic Depression on Memory Storage Capacity

    NASA Astrophysics Data System (ADS)

    Otsubo, Yosuke; Nagata, Kenji; Oizumi, Masafumi; Okada, Masato

    2011-08-01

    Synaptic efficacy between neurons is known to change dynamically on a short time scale. Neurophysiological experiments show that high-frequency presynaptic inputs decrease synaptic efficacy between neurons. This phenomenon is called synaptic depression, a form of short-term synaptic plasticity. Many researchers have investigated how synaptic depression affects the memory storage capacity. However, noise has not been taken into consideration in their analyses. By introducing "temperature", which controls the level of the noise, into an update rule of neurons, we investigate the effects of synaptic depression on the memory storage capacity in the presence of noise. We analytically compute the storage capacity by using a statistical mechanics technique called Self-Consistent Signal-to-Noise Analysis (SCSNA). We find that synaptic depression decreases the storage capacity in the case of finite temperature, in contrast to the case of the low-temperature limit, where the storage capacity does not change.

  13. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    NASA Astrophysics Data System (ADS)

    Sergio, de los Santos-Villalobos; Claudio, Bravo-Linares; dos Anjos Roberto, Meigikos; Renan, Cardoso; Max, Gibbs; Andrew, Swales; Lionel, Mabit; Gerd, Dercon

    Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques used is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track down sediments and specify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed only recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open-source tool, called the CSSIAR v.1.00 Software, for working with data sets generated by the use of the CSSI technique to assess soil apportionment.

  14. Estimation of Dynamical Parameters in Atmospheric Data Sets

    NASA Technical Reports Server (NTRS)

    Wenig, Mark O.

    2004-01-01

    In this study a new technique is used to derive dynamical parameters from atmospheric data sets. This technique, called the structure tensor technique, can be used to estimate dynamical parameters such as motion, source strengths, diffusion constants or exponential decay rates. A general mathematical framework was developed for the direct estimation of the physical parameters that govern the underlying processes from image sequences. This estimation technique can be adapted to the specific physical problem under investigation, so it can be used in a variety of applications in trace gas, aerosol, and cloud remote sensing. The fundamental algorithm will be extended to the analysis of multi-channel (e.g., multiple trace gas) image sequences and to provide solutions to the extended aperture problem. Sensitivity studies have been performed to determine the usability of this technique for data sets with different resolutions in time and space and different dimensions.
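
    As a minimal illustration of the structure tensor idea (not the study's full estimator for sources, diffusion or decay rates), the sketch below builds the space-time structure tensor of an image sequence under a brightness-constancy assumption and reads the apparent motion off the eigenvector associated with the smallest eigenvalue; the smoothing scale and the toy test sequence are arbitrary.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def structure_tensor_motion(seq, sigma=3.0, eps=1e-8):
    """Estimate per-pixel apparent motion (vx, vy) from an image sequence.

    `seq` has shape (t, y, x).  Under brightness constancy the space-time
    gradient g = (Ix, Iy, It) is orthogonal to (vx, vy, 1); the eigenvector
    of the locally averaged tensor J = <g g^T> with the smallest eigenvalue
    points along (vx, vy, 1) (total least squares).
    """
    It, Iy, Ix = np.gradient(seq.astype(float))      # gradients along t, y, x
    grads = np.stack([Ix, Iy, It], axis=-1)          # shape (t, y, x, 3)

    # Locally averaged outer products -> structure tensor per pixel.
    J = np.einsum("...i,...j->...ij", grads, grads)
    for i in range(3):
        for j in range(3):
            J[..., i, j] = gaussian_filter(J[..., i, j], sigma=(0, sigma, sigma))

    eigvals, eigvecs = np.linalg.eigh(J)             # eigenvalues ascending
    e = eigvecs[..., :, 0]                           # smallest-eigenvalue eigenvector
    vx = e[..., 0] / (e[..., 2] + eps)
    vy = e[..., 1] / (e[..., 2] + eps)
    return vx, vy

# Toy example: a Gaussian blob translating one pixel per frame in x.
y, x = np.mgrid[0:64, 0:64]
frames = np.array([np.exp(-((x - 20 - t) ** 2 + (y - 32) ** 2) / 20.0)
                   for t in range(5)])
vx, vy = structure_tensor_motion(frames)
print(round(float(vx[2, 32, 21]), 2))   # roughly +1 pixel/frame near the blob
```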

  15. Automatic welding detection by an intelligent tool pipe inspection

    NASA Astrophysics Data System (ADS)

    Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.

    2015-07-01

    This work provides a model based on machine learning techniques for weld recognition, using signals obtained with an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
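
    The pre-processed "smart pig" signals and the exact feature set are not available here, so the sketch below only shows the generic train/validate pattern the abstract describes, using scikit-learn: a support vector machine with standardization, k-fold cross-validation, and ROC AUC scoring on placeholder arrays X and y.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, StratifiedKFold

# Placeholder feature matrix / labels standing in for the pre-processed
# inspection-tool signals (1 = weld, 0 = no weld).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# SVM with RBF kernel; probability=True so ROC AUC can be scored.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, probability=True))

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
acc = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
auc = cross_val_score(model, X, y, cv=cv, scoring="roc_auc")
print(f"accuracy {acc.mean():.3f} +/- {acc.std():.3f}, ROC AUC {auc.mean():.3f}")
```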

  16. [Utilization of Big Data in Medicine and Future Outlook].

    PubMed

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.

  17. Decorrelation correction for nanoparticle tracking analysis of dilute polydisperse suspensions in bulk flow

    NASA Astrophysics Data System (ADS)

    Hartman, John; Kirby, Brian

    2017-03-01

    Nanoparticle tracking analysis, a multiprobe single particle tracking technique, is a widely used method to quickly determine the concentration and size distribution of colloidal particle suspensions. Many popular tools remove non-Brownian components of particle motion by subtracting the ensemble-average displacement at each time step, which is termed dedrifting. Though critical for accurate size measurements, dedrifting is shown here to introduce significant biasing error and can fundamentally limit the dynamic range of particle size that can be measured for dilute heterogeneous suspensions such as biological extracellular vesicles. We report a more accurate estimate of particle mean-square displacement, which we call decorrelation analysis, that accounts for correlations between individual and ensemble particle motion, which are spuriously introduced by dedrifting. Particle tracking simulation and experimental results show that this approach more accurately determines particle diameters for low-concentration polydisperse suspensions when compared with standard dedrifting techniques.
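
    The authors' decorrelation estimator itself is not reproduced here; the sketch below implements only the conventional dedrifting step and a lag-1 mean-square-displacement estimate that the paper identifies as the source of bias, applied to simulated 2D Brownian tracks with a common drift.

```python
import numpy as np

def dedrifted_msd(tracks, dt=1.0):
    """Conventional NTA-style analysis: subtract the ensemble-average
    displacement at each time step ("dedrifting"), then estimate each
    particle's mean-square displacement at lag 1.

    `tracks` has shape (n_particles, n_frames, 2).  The paper argues that
    this very subtraction correlates individual and ensemble motion and
    biases the MSD for dilute suspensions.
    """
    disp = np.diff(tracks, axis=1)                 # per-step displacements
    drift = disp.mean(axis=0, keepdims=True)       # ensemble-average step
    corrected = disp - drift                       # dedrifted displacements
    msd_lag1 = np.mean(np.sum(corrected ** 2, axis=2), axis=1)
    return msd_lag1 / dt                           # ~ 4*D for 2D Brownian motion

# Simulated Brownian tracks plus a common bulk flow ("drift").
rng = np.random.default_rng(3)
n, frames, D, dt = 50, 200, 0.5, 1.0
steps = rng.normal(scale=np.sqrt(2 * D * dt), size=(n, frames, 2))
steps += np.array([0.3, 0.1])
tracks = np.cumsum(steps, axis=1)
print(dedrifted_msd(tracks, dt).mean())   # close to 4*D = 2.0, slightly biased
```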

  18. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.

  19. On the equivalence of the RTI and SVM approaches to time correlated analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, S.; Favalli, A.; Henzlova, D.

    2014-11-21

    Recently two papers on how to perform passive neutron auto-correlation analysis on time gated histograms formed from pulse train data, generically called time correlation analysis (TCA), have appeared in this journal [1,2]. For those of us working in international nuclear safeguards these treatments are of particular interest because passive neutron multiplicity counting is a widely deployed technique for the quantification of plutonium. The purpose of this letter is to show that the skewness-variance-mean (SVM) approach developed in [1] is equivalent in terms of assay capability to the random trigger interval (RTI) analysis laid out in [2]. Mathematically we could also use other numerical ways to extract the time correlated information from the histogram data, including for example what we might call the mean, mean square, and mean cube approach. The important feature however, from the perspective of real world applications, is that the correlated information extracted is the same, and subsequently gets interpreted in the same way based on the same underlying physics model.

  20. A Change Impact Analysis to Characterize Evolving Program Behaviors

    NASA Technical Reports Server (NTRS)

    Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua

    2012-01-01

    Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing effort used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression-testing-related tasks. We also discuss how DiSE and iDiSE can be configured for debugging, i.e., finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.

  1. Digital correlation detector for low-cost Omega navigation

    NASA Technical Reports Server (NTRS)

    Chamberlin, K. A.

    1976-01-01

    Techniques to lower the cost of using the Omega global navigation network with phase-locked loops (PLL) were developed. The technique that was accepted as being "optimal" is called the memory-aided phase-locked loop (MAPLL), since it allows operation on all eight Omega time slots with one PLL through the implementation of a random access memory. The receiver front-end and the signals that it transmits to the PLL were first described. A brief statistical analysis of these signals was then made to allow a rough comparison between the front-end presented in this work and a commercially available front-end. The hardware and theory of application of the MAPLL were described, ending with an analysis of data taken with the MAPLL. Some conclusions and recommendations were also given.

  2. Performance Analysis of Garbage Collection and Dynamic Reordering in a Lisp System. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Llames, Rene Lim

    1991-01-01

    Generation-based garbage collection and dynamic reordering of objects are two techniques for improving the efficiency of memory management in Lisp and similar dynamic language systems. An analysis of the effect of generation configuration is presented, focusing on the effects of the number of generations and their capacities. Analytic timing and survival models are used to represent garbage collection runtime and to derive structural results on its behavior. The survival model provides bounds on the age of objects surviving a garbage collection at a particular level. Empirical results show that execution time is most sensitive to the capacity of the youngest generation. A technique called scanning for transport statistics, for evaluating the effectiveness of reordering independent of main memory size, is presented.

  3. CHROMATOGRAPHIC TECHNIQUES IN PHARMACEUTICAL ANALYSIS IN POLAND: HISTORY AND THE PRESENCE ON THE BASIS OF PAPERS PUBLISHED IN SELECTED POLISH PHARMACEUTICAL JOURNALS IN XX CENTURY.

    PubMed

    Bilek, Maciej; Namieśnik, Jacek

    2016-01-01

    For a long time, chromatographic techniques and techniques related to them have stimulated the development of new procedures in the field of pharmaceutical analysis. The newly developed methods, characterized by improved metrological parameters, allow for more accurate testing of, among others, the composition of raw materials, intermediates and final products. The chromatographic techniques also enable studies on waste generated in research laboratories and factories producing pharmaceuticals and parapharmaceuticals. Based on the review of reports published in Polish pharmaceutical journals, we assessed the impact of chromatographic techniques on the development of pharmaceutical analysis. The first chromatographic technique used in pharmaceutical analysis was a so-called capillary analysis. It was applied in the 1930s to control the identity of pharmaceutical formulations. In the 1940s and 1950s, the chromatographic techniques were mostly a subject of review publications, while their use in experimental work was rare. Paper chromatography and thin layer chromatography were introduced in the 1960s and 1970s, respectively. These new analytical tools have contributed to the intensive development of research in the field of phytochemistry and the analysis of herbal medicines. The development of column chromatography-based techniques, i.e., gas chromatography and high performance liquid chromatography, took place at the end of the 20th century. Both aforementioned techniques were widely applied in pharmaceutical analysis, for example, to assess the stability of drugs, test for impurities and degradation products, as well as in pharmacokinetics studies. The first decade of the 21st century was the time of new detection methods in gas and liquid chromatography. The information sources used to write this article were Polish pharmaceutical journals, both professional and scientific, originating from the interwar and post-war period, i.e., "Kronika Farmaceutyczna", "Farmacja Współczesna", "Wiadomości Farmaceutyczne", "Acta Poloniae Pharmaceutica", "Farmacja Polska", "Dissertationes Pharmaceuticae", "Annales UMCS sectio DDD Pharmacia". The number of published works using various chromatography techniques was assessed based on the content description of individual issues of the journal "Acta Poloniae Pharmaceutica".

  4. Analysis of an Unusual Mirror in a 16th-Century Painting: A Museum Exercise for Physics Students

    ERIC Educational Resources Information Center

    Swaminathan, Sudha; Lamelas, Frank

    2017-01-01

    Physics students at Worcester State University visit the Worcester Art Museum (WAM) at the end of a special 100- level course called Physics in Art. The students have studied geometrical optics, and they have been introduced to concepts in atomic physics. The purpose of the museum tour is to show how physics-based techniques can be used in a…

  5. Antibiotic and Modulation of Microbiota: A New Paradigm?

    PubMed

    Rizzatti, Gianenrico; Ianiro, Gianluca; Gasbarrini, Antonio

    2018-06-16

    Recently, new insights on the gut microbiota have revolutionized many concepts of modern medicine. The alteration of microbiota, which is called dysbiosis, has been associated with an expanding list of diseases and conditions. The development of next-generation sequencing techniques has allowed comprehensive analysis of gut microbiota composition without the limitations of classic culture methods. Furthermore, the introduction of functional techniques such as metabolomics and proteomics has allowed integrated analysis, thus obtaining more robust insights on microbiota functions in health and disease. These tools make it possible to address the role of factors able to modify the gut microbiota, the so-called "microbiota influencers." These data are useful to explain the physiopathology of several diseases and thus to identify new potential therapeutic targets. Among microbiota influencers, many studies have focused on the impact of antibiotic administration on the gut microbiota, because of their widespread use. Notably, besides the known beneficial effect of antibiotics in treating infectious diseases, these drugs have shown detrimental effects on the gut microbiota which, in turn, might have long-term consequences on the host. Finally, therapeutic modulation of the gut microbiota, by means of selected antibiotics with eubiotic effects, probiotics, and fecal microbiota transplantation, seems of great interest as it might be able to prevent or even revert antibiotic-induced dysbiosis.

  6. Response spectra analysis of the modal summation technique verified by observed seismometer and accelerometer waveform data of the M6.5 Pidie Jaya Earthquake

    NASA Astrophysics Data System (ADS)

    Irwandi; Rusydy, Ibnu; Muksin, Umar; Rudyanto, Ariska; Daryono

    2018-05-01

    Wave vibration confined by a boundary produces stationary wave solutions in discrete states called modes. Many physics applications are related to modal solutions, such as air column resonance, string vibration, and the emission spectrum of atomic hydrogen. Naturally, energy is distributed over several modes, so the complete calculation is obtained from the sum over all modes, called modal summation. The modal summation technique was applied to simulate surface wave propagation above the crustal structure of the Earth. The method is computationally efficient because it uses a 1D structural model, so it is not necessary to calculate the overall wave propagation. The simulation results for the magnitude 6.5 Pidie Jaya earthquake show that the response spectra from the modal summation technique correlate well with the observed seismometer and accelerometer waveform data, especially at the KCSI (Kotacane) station. On the other hand, at the LASI (Langsa) station the simulated response is relatively lower than the observation. The lower spectral estimate is obtained because the station is located in a thick sedimentary basin, which causes an amplification effect. This is a limitation of the modal summation technique, and it should therefore be combined with a different finite simulation on a 2D local structural model of the basin.

  7. E-GRASP/Eratosthenes: a mission proposal for millimetric TRF realization

    NASA Astrophysics Data System (ADS)

    Biancale, Richard; Pollet, Arnaud; Coulot, David; Mandea, Mioara

    2017-04-01

    The ITRF is currently worked out by independent concatenation of space-technique information. GNSS, DORIS, SLR and VLBI data are processed independently by analysis centers before combination centers form mono-technique sets, which are then combined to produce the official ITRF solutions. This approach performs quite well, although systematic differences between techniques remain visible, for instance in the origin or scale parameters of the underlying terrestrial frames. Improvement and homogenization of the TRF are expected in the future, provided that dedicated multi-technique platforms are used to their full potential. The goal set by GGOS of realizing the terrestrial reference system with an accuracy of 1 mm and a long-term stability of 0.1 mm/yr could next be achieved with the E-GRASP/Eratosthenes scenario. This mission, proposed to ESA in response to the 2017 Earth Explorer-9 call, was already scientifically well assessed in the 2016 EE9 call. It co-locates all of the fundamental space-based geodetic instruments, GNSS and DORIS receivers, laser retro-reflectors, and a VLBI transmitter, on the same satellite platform in a highly eccentric orbit, with particular attention paid to the time and space metrology on board. Different kinds of simulations were performed, both for discriminating the best orbital scenario according to many geometric/technical/physical criteria and for assessing the expected performance on the TRF according to the GGOS goals. The presentation will focus on the mission scenario and simulation results.

  8. Efficient Ada multitasking on a RISC register window architecture

    NASA Technical Reports Server (NTRS)

    Kearns, J. P.; Quammen, D.

    1987-01-01

    This work addresses the problem of reducing context switch overhead on a processor which supports a large register file - a register file much like that which is part of the Berkeley RISC processors and several other emerging architectures (which are not necessarily reduced instruction set machines in the purest sense). Such a reduction in overhead is particularly desirable in a real-time embedded application, in which task-to-task context switch overhead may result in failure to meet crucial deadlines. A storage management technique by which a context switch may be implemented as cheaply as a procedure call is presented. The essence of this technique is the avoidance of the save/restore of registers on the context switch. This is achieved through analysis of the static source text of an Ada tasking program. Information gained during that analysis directs the optimized storage management strategy for that program at run time. A formal verification of the technique in terms of an operational control model and an evaluation of the technique's performance via simulations driven by synthetic Ada program traces are presented.

  9. Optimization technique for problems with an inequality constraint

    NASA Technical Reports Server (NTRS)

    Russell, K. J.

    1972-01-01

    A general technique uses a modified version of an existing method termed the pattern search technique. A new procedure called the parallel move strategy permits the pattern search technique to be used with problems involving a constraint.

  10. Computational techniques for design optimization of thermal protection systems for the space shuttle vehicle. Volume 1: Final report

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Computational techniques were developed and assimilated for the design optimization. The resulting computer program was then used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in Fortran IV for the CDC 6400 but was subsequently converted to the Fortran V language to be used on the Univac 1108. The program allows for improvement and update of the performance prediction techniques. The program logic involves subroutines which handle the following basic functions: (1) a driver which calls for input, output, and communication between program and user and between the subroutines themselves; (2) thermodynamic analysis; (3) thermal stress analysis; (4) acoustic fatigue analysis; and (5) weights/cost analysis. In addition, a system total cost is predicted based on system weight and historical cost data of similar systems. Two basic types of input are provided, both of which are based on trajectory data. These are vehicle attitude (altitude, velocity, and angles of attack and sideslip), for external heat and pressure loads calculation, and heating rates and pressure loads as a function of time.

  11. dDocent: a RADseq, variant-calling pipeline designed for population genomics of non-model organisms.

    PubMed

    Puritz, Jonathan B; Hollenbeck, Christopher M; Gold, John R

    2014-01-01

    Restriction-site associated DNA sequencing (RADseq) has become a powerful and useful approach for population genomics. Currently, no software exists that utilizes both paired-end reads from RADseq data to efficiently produce population-informative variant calls, especially for non-model organisms with large effective population sizes and high levels of genetic polymorphism. dDocent is an analysis pipeline with a user-friendly, command-line interface designed to process individually barcoded RADseq data (with double cut sites) into informative SNPs/Indels for population-level analyses. The pipeline, written in BASH, uses data reduction techniques and other stand-alone software packages to perform quality trimming and adapter removal, de novo assembly of RAD loci, read mapping, SNP and Indel calling, and baseline data filtering. Double-digest RAD data from population pairings of three different marine fishes were used to compare dDocent with Stacks, the first generally available, widely used pipeline for analysis of RADseq data. dDocent consistently identified more SNPs shared across greater numbers of individuals and with higher levels of coverage. This is because dDocent quality trims instead of filtering and incorporates both forward and reverse reads (including reads with INDEL polymorphisms) in assembly, mapping, and SNP calling. The pipeline and a comprehensive user guide can be found at http://dDocent.wordpress.com.

  12. dDocent: a RADseq, variant-calling pipeline designed for population genomics of non-model organisms

    PubMed Central

    Hollenbeck, Christopher M.; Gold, John R.

    2014-01-01

    Restriction-site associated DNA sequencing (RADseq) has become a powerful and useful approach for population genomics. Currently, no software exists that utilizes both paired-end reads from RADseq data to efficiently produce population-informative variant calls, especially for non-model organisms with large effective population sizes and high levels of genetic polymorphism. dDocent is an analysis pipeline with a user-friendly, command-line interface designed to process individually barcoded RADseq data (with double cut sites) into informative SNPs/Indels for population-level analyses. The pipeline, written in BASH, uses data reduction techniques and other stand-alone software packages to perform quality trimming and adapter removal, de novo assembly of RAD loci, read mapping, SNP and Indel calling, and baseline data filtering. Double-digest RAD data from population pairings of three different marine fishes were used to compare dDocent with Stacks, the first generally available, widely used pipeline for analysis of RADseq data. dDocent consistently identified more SNPs shared across greater numbers of individuals and with higher levels of coverage. This is because dDocent quality trims instead of filtering and incorporates both forward and reverse reads (including reads with INDEL polymorphisms) in assembly, mapping, and SNP calling. The pipeline and a comprehensive user guide can be found at http://dDocent.wordpress.com. PMID:24949246

  13. Development of a special-purpose test surface guided by uncertainty analysis - Introduction of a new uncertainty analysis step

    NASA Technical Reports Server (NTRS)

    Wang, T.; Simon, T. W.

    1988-01-01

    Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development, and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, herein called the pre-test analysis, would aid the program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.

  14. Topometry of technical and biological objects by fringe projection

    NASA Astrophysics Data System (ADS)

    Windecker, R.; Tiziani, H. J.

    1995-07-01

    Fringe projection is a fast and accurate technique for obtaining the topometry of a wide range of surfaces. Here some features of the principle are described, together with the possibilities of adapting this technique for the measurement of vaulted surfaces. We discuss various methods of phase evaluation and compare them with simulated computer data to obtain the resolution limits. Under certain restrictions a semispatial algorithm, called the modified Fourier analysis algorithm, gives the best results. One special subject of interest is the application of fringe projection for the measurement of the three-dimensional surface of the cornea. First results of in vivo measurements are presented.

  15. Safer Liquid Natural Gas

    NASA Technical Reports Server (NTRS)

    1976-01-01

    After the 1973 Staten Island disaster, in which 40 people were killed while repairing a liquid natural gas storage tank, the New York Fire Commissioner requested NASA's help in drawing up a comprehensive plan to cover the design, construction, and operation of liquid natural gas facilities. Two programs are underway. The first transfers comprehensive risk management techniques and procedures in the form of an instruction document that includes determining liquid-gas risks through engineering analysis and tests, controlling these risks by setting up redundant fail-safe techniques, and establishing criteria calling for decisions that eliminate or accept certain risks. The second program prepares a liquid gas safety manual (the first of its kind).

  16. Multi-Spacecraft 3D differential emission measure tomography of the solar corona: STEREO results.

    NASA Astrophysics Data System (ADS)

    Vásquez, A. M.; Frazin, R. A.

    We have recently developed a novel technique (called DEMT) for the empirical determination of the three-dimensional (3D) distribution of the solar corona differential emission measure through multi-spacecraft solar rotational tomography of extreme-ultraviolet (EUV) image time series (like those provided by EIT/SOHO and EUVI/STEREO). The technique allows, for the first time, the development of global 3D empirical maps of the coronal electron temperature and density in the height range 1.0 to 1.25 R_S. DEMT constitutes a simple and powerful 3D analysis tool that obviates the need for structure-specific modeling.

  17. Machine vision for real time orbital operations

    NASA Technical Reports Server (NTRS)

    Vinz, Frank L.

    1988-01-01

    Machine vision for automation and robotic operation of Space Station era systems has the potential for increasing the efficiency of orbital servicing, repair, assembly and docking tasks. A machine vision research project is described in which a TV camera is used for inputting visual data to a computer so that image processing may be achieved for real-time control of these orbital operations. A technique has resulted from this research which reduces computer memory requirements and greatly increases typical computational speed, such that it has the potential for development into a real-time orbital machine vision system. This technique is called AI BOSS (Analysis of Images by Box Scan and Syntax).

  18. Wavelength modulated surface enhanced (resonance) Raman scattering for background-free detection.

    PubMed

    Praveen, Bavishna B; Steuwe, Christian; Mazilu, Michael; Dholakia, Kishan; Mahajan, Sumeet

    2013-05-21

    Spectra in surface-enhanced Raman scattering (SERS) are always accompanied by a continuum emission called the 'background' which complicates analysis and is especially problematic for quantification and automation. Here, we implement a wavelength modulation technique to eliminate the background in SERS and its resonant version, surface-enhanced resonance Raman scattering (SERRS). This is demonstrated on various nanostructured substrates used for SER(R)S. An enhancement in the signal to noise ratio for the Raman bands of the probe molecules is also observed. This technique helps to improve the analytical ability of SERS by alleviating the problem due to the accompanying background and thus making observations substrate independent.

  19. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.

    PubMed

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to tune this mathematical tool for high sensitivity and reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.
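
    The clinical pipeline and the SCP index are specific to the authors' tool set, but the underlying building block, sliding-window magnitude-squared coherence between ABP and ICP, can be sketched with SciPy as below; the window length, step, and nperseg values are exactly the kind of parameters the study proposes to optimize against outcome, and the synthetic signals are placeholders.

```python
import numpy as np
from scipy.signal import coherence

def windowed_coherence(abp, icp, fs, win_s=300, step_s=60, nperseg=256):
    """Magnitude-squared coherence between ABP and ICP in sliding windows.

    Returns the frequency axis and an array (n_windows, n_freqs); the
    window length, step and nperseg are the sort of parameters that
    would be tuned for sensitivity and reliability.
    """
    win, step = int(win_s * fs), int(step_s * fs)
    rows = []
    for s in range(0, len(abp) - win + 1, step):
        f, cxy = coherence(abp[s:s + win], icp[s:s + win], fs=fs, nperseg=nperseg)
        rows.append(cxy)
    return f, np.array(rows)

# Synthetic 1 Hz-sampled signals sharing a slow oscillation.
rng = np.random.default_rng(7)
fs, n = 1.0, 3600
t = np.arange(n) / fs
shared = np.sin(2 * np.pi * 0.01 * t)
abp = 80 + 5 * shared + rng.normal(0, 1, n)
icp = 12 + 2 * shared + rng.normal(0, 1, n)
f, C = windowed_coherence(abp, icp, fs)
print(C[:, np.argmin(np.abs(f - 0.01))].mean())   # high coherence near 0.01 Hz
```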

  20. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology

    PubMed Central

    Faltermeier, Rupert; Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to tune this mathematical tool for high sensitivity and reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses. PMID:26693250

  1. Noncontact, Electrode-free Capacitance/Voltage Measurement Based on General Theory of Metal-Oxide-Semiconductor (MOS) Structure

    NASA Astrophysics Data System (ADS)

    Sakai, Takamasa; Kohno, Motohiro; Hirae, Sadao; Nakatani, Ikuyoshi; Kusuda, Tatsufumi

    1993-09-01

    In this paper, we discussed a novel approach to semiconductor surface inspection, namely analysis using the C-V curve measured in a noncontact manner by the metal-air-semiconductor (MAIS) technique. A new gap-sensing method using the so-called Goos-Haenchen effect was developed to achieve the noncontact C-V measurement. The MAIS technique exhibited sensitivity and repeatability comparable to those of conventional C-V measurement and, hence, good reproducibility and resolution for quantifying electrically active impurities on the order of 1×10^9/cm^2, which is better than most spectrometric techniques, such as secondary ion mass spectroscopy (SIMS), electron spectroscopy for chemical analysis (ESCA) and Auger electron spectroscopy (AES), which are time-consuming and destructive. This measurement, made without preparation of any electrical contact metal electrode, suggested, for the first time, the possibility of measuring an intrinsic characteristic of the semiconductor surface, using a concrete examination as an example.

  2. Preliminary Analysis of Photoreading

    NASA Technical Reports Server (NTRS)

    McNamara, Danielle S.

    2000-01-01

    The purpose of this project was to provide a preliminary analysis of a reading strategy called PhotoReading. PhotoReading is a technique developed by Paul Scheele that claims to increase reading rate to 25,000 words per minute (Scheele, 1993). PhotoReading itself involves entering a "relaxed state" and looking at, but not reading, each page of a text for a brief moment (about 1 to 2 seconds). While this technique has received attention in the popular press, there had been no objective examinations of the technique's validity. To examine the effectiveness of PhotoReading, the principal investigator (i.e., trainee) participated in a PhotoReading workshop to learn the technique. Parallel versions of two standardized and three experimenter-created reading comprehension tests were administered to the trainee and an expert user of the PhotoReading technique to compare the use of normal reading strategies and the PhotoReading technique by both readers. The results for all measures yielded no benefits of using the PhotoReading technique. The extremely rapid reading rates claimed by PhotoReaders were not observed; indeed, the reading rates were generally comparable to those for normal reading. Moreover, the PhotoReading expert generally showed an increase in reading time when using the PhotoReading technique in comparison to when using normal reading strategies to process text. This increase in reading time when PhotoReading was accompanied by a decrease in text comprehension.

  3. Exploratory graph analysis: A new approach for estimating the number of dimensions in psychological research

    PubMed Central

    Golino, Hudson F.; Epskamp, Sacha

    2017-01-01

    The estimation of the correct number of dimensions is a long-standing problem in psychometrics. Several methods have been proposed, such as parallel analysis (PA), Kaiser-Guttman's eigenvalue-greater-than-one rule, the multiple average partial procedure (MAP), maximum-likelihood approaches that use fit indexes such as BIC and EBIC, and the less used and studied approach called very simple structure (VSS). In the present paper a new approach to estimating the number of dimensions is introduced and compared via simulation to the traditional techniques listed above. The approach proposed in the current paper is called exploratory graph analysis (EGA), since it is based on the graphical lasso with the regularization parameter specified using EBIC. The number of dimensions is verified using walktrap, a random-walk algorithm used to identify communities in networks. In total, 32,000 data sets were simulated to fit known factor structures, with the data sets varying across different criteria: number of factors (2 and 4), number of items (5 and 10), sample size (100, 500, 1000 and 5000) and correlation between factors (orthogonal, .20, .50 and .70), resulting in 64 different conditions. For each condition, 500 data sets were simulated using lavaan. The results show that EGA performs comparably to parallel analysis, EBIC, eBIC and the Kaiser-Guttman rule in a number of situations, especially when the number of factors was two. However, EGA was the only technique able to correctly estimate the number of dimensions in the four-factor structure when the correlation between factors was .70, showing an accuracy of 100% for a sample size of 5,000 observations. Finally, EGA was used to estimate the number of factors in a real dataset, in order to compare its performance with the other six techniques tested in the simulation study. PMID:28594839

  4. Exploratory graph analysis: A new approach for estimating the number of dimensions in psychological research.

    PubMed

    Golino, Hudson F; Epskamp, Sacha

    2017-01-01

    The estimation of the correct number of dimensions is a long-standing problem in psychometrics. Several methods have been proposed, such as parallel analysis (PA), Kaiser-Guttman's eigenvalue-greater-than-one rule, the multiple average partial procedure (MAP), maximum-likelihood approaches that use fit indexes such as BIC and EBIC, and the less used and studied approach called very simple structure (VSS). In the present paper a new approach to estimating the number of dimensions is introduced and compared via simulation to the traditional techniques listed above. The approach proposed in the current paper is called exploratory graph analysis (EGA), since it is based on the graphical lasso with the regularization parameter specified using EBIC. The number of dimensions is verified using walktrap, a random-walk algorithm used to identify communities in networks. In total, 32,000 data sets were simulated to fit known factor structures, with the data sets varying across different criteria: number of factors (2 and 4), number of items (5 and 10), sample size (100, 500, 1000 and 5000) and correlation between factors (orthogonal, .20, .50 and .70), resulting in 64 different conditions. For each condition, 500 data sets were simulated using lavaan. The results show that EGA performs comparably to parallel analysis, EBIC, eBIC and the Kaiser-Guttman rule in a number of situations, especially when the number of factors was two. However, EGA was the only technique able to correctly estimate the number of dimensions in the four-factor structure when the correlation between factors was .70, showing an accuracy of 100% for a sample size of 5,000 observations. Finally, EGA was used to estimate the number of factors in a real dataset, in order to compare its performance with the other six techniques tested in the simulation study.
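
    As a rough, non-authoritative sketch of the EGA idea, the code below fits a sparse Gaussian graphical model with scikit-learn's GraphicalLassoCV (cross-validated regularization rather than the EBIC selection the authors use), converts the precision matrix to partial correlations, and counts walktrap communities with python-igraph; the toy two-factor data set is fabricated for illustration only.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV
import igraph as ig

def ega_like_dimensions(data, min_weight=1e-6):
    """Estimate the number of dimensions a la EGA: fit a sparse Gaussian
    graphical model, turn partial correlations into a weighted graph,
    and count walktrap communities.  (EGA proper selects the penalty via
    EBIC; GraphicalLassoCV's cross-validation is used here instead.)"""
    X = (data - data.mean(0)) / data.std(0)
    prec = GraphicalLassoCV().fit(X).precision_

    d = np.sqrt(np.diag(prec))
    pcor = -prec / np.outer(d, d)                  # partial correlations
    np.fill_diagonal(pcor, 0.0)
    weights = np.abs(pcor)

    n_items = weights.shape[0]
    edges, w = [], []
    for i in range(n_items):
        for j in range(i + 1, n_items):
            if weights[i, j] > min_weight:
                edges.append((i, j))
                w.append(float(weights[i, j]))
    g = ig.Graph(n=n_items, edges=edges)
    clusters = g.community_walktrap(weights=w).as_clustering()
    return len(clusters), clusters.membership

# Two-factor toy data: 10 items, 5 loading on each factor.
rng = np.random.default_rng(0)
factors = rng.normal(size=(1000, 2))
loadings = np.zeros((10, 2)); loadings[:5, 0] = 0.8; loadings[5:, 1] = 0.8
items = factors @ loadings.T + 0.6 * rng.normal(size=(1000, 10))
k, membership = ega_like_dimensions(items)
print(k, membership)
```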

  5. Progressive Sampling Technique for Efficient and Robust Uncertainty and Sensitivity Analysis of Environmental Systems Models: Stability and Convergence

    NASA Astrophysics Data System (ADS)

    Sheikholeslami, R.; Hosseini, N.; Razavi, S.

    2016-12-01

    Modern earth and environmental models are usually characterized by a large parameter space and high computational cost. These two features prevent effective implementation of sampling-based analyses such as sensitivity and uncertainty analysis, which require running these computationally expensive models several times to adequately explore the parameter/problem space. Therefore, developing efficient sampling techniques that scale with the size of the problem, the computational budget, and users' needs is essential. In this presentation, we propose an efficient sequential sampling strategy, called Progressive Latin Hypercube Sampling (PLHS), which provides increasingly improved coverage of the parameter space while satisfying pre-defined requirements. The original Latin hypercube sampling (LHS) approach generates the entire sample set in one stage; in contrast, PLHS generates a series of smaller sub-sets (also called 'slices') such that: (1) each sub-set is Latin hypercube and achieves maximum stratification in any one-dimensional projection; (2) the progressive addition of sub-sets remains Latin hypercube; and thus (3) the entire sample set is Latin hypercube. Therefore, it has the capability to preserve the intended sampling properties throughout the sampling procedure. PLHS is deemed advantageous over the existing methods, particularly because it nearly avoids over- or under-sampling. Through different case studies, we show that PLHS has multiple advantages over one-stage sampling approaches, including improved convergence and stability of the analysis results with fewer model runs. In addition, PLHS can help to minimize the total simulation time by only running the simulations necessary to achieve the desired level of quality (e.g., accuracy and convergence rate).
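
    PLHS itself (the progressive, sliced construction) is the authors' contribution and is not reproduced here; the sketch below only shows the ordinary one-stage Latin hypercube design it extends, using scipy.stats.qmc, together with a check of the one-dimensional stratification property. The parameter bounds are hypothetical.

```python
import numpy as np
from scipy.stats import qmc

# Ordinary (one-stage) Latin hypercube design: the baseline that PLHS
# extends by producing the design in successive Latin-hypercube slices.
n_samples, n_params = 100, 4
sampler = qmc.LatinHypercube(d=n_params, seed=0)
unit_sample = sampler.random(n=n_samples)          # points in [0, 1)^d

# Scale to hypothetical parameter ranges of an environmental model.
lower = [0.1, 1e-4, 0.0, 10.0]
upper = [0.9, 1e-2, 5.0, 500.0]
params = qmc.scale(unit_sample, lower, upper)

# One-dimensional stratification: each of the n_samples equal-width bins
# contains exactly one point in every dimension.
bins = np.floor(unit_sample * n_samples).astype(int)
assert all(len(np.unique(bins[:, j])) == n_samples for j in range(n_params))
print(params[:3])
```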

  6. Coarse analysis of collective behaviors: Bifurcation analysis of the optimal velocity model for traffic jam formation

    NASA Astrophysics Data System (ADS)

    Miura, Yasunari; Sugiyama, Yuki

    2017-12-01

    We present a general method for analyzing macroscopic collective phenomena observed in many-body systems. For this purpose, we employ diffusion maps, a dimensionality-reduction technique, and systematically define a few relevant coarse-grained variables for describing macroscopic phenomena. The time evolution of macroscopic behavior is described as a trajectory in the low-dimensional space constructed by these coarse variables. We apply this method to the analysis of a traffic model, called the optimal velocity model, and reveal a bifurcation structure that features a transition to the emergence of a moving cluster, i.e., a traffic jam.
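
    The diffusion-map coarse-graining is not reproduced here; the sketch below simply simulates the optimal velocity model on a ring road, the system whose jam-forming bifurcation the authors analyze, using the commonly used OV function V(h) = tanh(h - 2) + tanh(2) and explicit Euler integration. Parameter values are illustrative; for a sensitivity below the linear-stability threshold a stop-and-go cluster emerges from a small perturbation of uniform flow.

```python
import numpy as np

def simulate_ov_model(n_cars=30, road_length=60.0, a=1.2, dt=0.05, steps=20000):
    """Optimal velocity model on a ring road (Euler integration).

    dv_i/dt = a * (V(headway_i) - v_i),  V(h) = tanh(h - 2) + tanh(2).
    For small sensitivity `a` the uniform flow is linearly unstable and a
    stop-and-go cluster (traffic jam) develops.
    """
    V = lambda h: np.tanh(h - 2.0) + np.tanh(2.0)
    x = np.linspace(0, road_length, n_cars, endpoint=False)
    x += 0.01 * np.random.default_rng(0).normal(size=n_cars)   # small perturbation
    v = np.full(n_cars, V(road_length / n_cars))
    for _ in range(steps):
        headway = (np.roll(x, -1) - x) % road_length
        v += a * (V(headway) - v) * dt
        x = (x + v * dt) % road_length
    return x, v

x, v = simulate_ov_model(a=1.0)       # below the critical sensitivity -> jam
print(v.min(), v.max())               # wide spread of speeds indicates a jam
```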

  7. Automatic movie skimming with general tempo analysis

    NASA Astrophysics Data System (ADS)

    Lee, Shih-Hung; Yeh, Chia-Hung; Kuo, C. C. J.

    2003-11-01

    In this research, story units are extracted by general tempo analysis, including the tempos of audio and visual information. Although many schemes have been proposed to successfully segment video data into shots using basic low-level features, how to group shots into meaningful units called story units is still a challenging problem. By focusing on a certain type of video, such as sports or news, we can explore models with specific application-domain knowledge. For movie content, many heuristic rules based on audiovisual cues have been proposed with limited success. We propose a method to extract story units using general tempo analysis. Experimental results are given to demonstrate the feasibility and efficiency of the proposed technique.

  8. BAYESIAN SEMI-BLIND COMPONENT SEPARATION FOR FOREGROUND REMOVAL IN INTERFEROMETRIC 21 cm OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Le; Timbie, Peter T.; Bunn, Emory F.

    In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H i Expectation–Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.

  9. PyMVPA: A python toolbox for multivariate pattern analysis of fMRI data.

    PubMed

    Hanke, Michael; Halchenko, Yaroslav O; Sederberg, Per B; Hanson, Stephen José; Haxby, James V; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability.
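    As a flavor of what such classifier-based analyses look like in Python, here is a minimal decoding sketch built on scikit-learn rather than the PyMVPA API; the toy arrays and injected signal are hypothetical stand-ins for preprocessed fMRI samples, and PyMVPA wraps comparable cross-validated classification steps around real datasets.

    ```python
    # A minimal MVPA-style decoding sketch using scikit-learn (not the PyMVPA API).
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

    rng = np.random.default_rng(0)
    X = rng.standard_normal((120, 500))              # 120 volumes x 500 voxels (toy data)
    y = np.repeat([0, 1], 60)                        # two experimental conditions
    runs = np.tile(np.repeat(np.arange(6), 10), 2)   # 6 scanner runs ("chunks")
    X[y == 1, :20] += 1.0                            # inject signal into 20 "voxels"

    # Leave-one-run-out cross-validated classification, the standard MVPA recipe.
    acc = cross_val_score(LinearSVC(), X, y, cv=LeaveOneGroupOut(), groups=runs)
    print("mean decoding accuracy:", acc.mean())
    ```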

  10. PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data

    PubMed Central

    Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561

  11. Strain analysis of SiGe microbridges

    NASA Astrophysics Data System (ADS)

    Anthony, Ross; Gilbank, Ashley; Crowe, Iain; Knights, Andrew

    2018-02-01

    We present the analysis of UV (325 nm) Raman scattering spectra from silicon-germanium (SiGe) microbridges where the SiGe has been formed using the so-called "condensation technique". As opposed to the conventional condensation technique, in which SiGe is grown epitaxially, we use high-dose ion implantation of Ge ions into SOI as a means to introduce the initial Ge profile. The subsequent oxidation both repairs implantation-induced damage and forms epitaxial Ge. Using the Si-Si and Si-Ge optical phonon modes, as well as the ratio of integrated intensities for Ge-Ge and Si-Si, we can determine both the composition and the strain of the material. We show that although the material is compressively strained following condensation, by fabricating microbridge structures we can create strain-relaxed or tensile-strained structures, which are of subsequent interest for photonic applications.
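    The composition-and-strain step amounts to inverting two linear relations between the measured phonon frequencies, the Ge fraction x, and the strain. The sketch below shows that inversion with illustrative coefficient values of the kind quoted in the SiGe Raman literature; the coefficients and the measured peak positions are assumptions for demonstration and are not taken from this paper.

    ```python
    import numpy as np

    # Assumed linear peak-shift models (cm^-1), illustrative literature-style values:
    #   w_SiSi = 520.7 - 68.0 * x - 830.0 * strain
    #   w_SiGe = 400.5 + 14.2 * x - 575.0 * strain
    A = np.array([[-68.0, -830.0],
                  [ 14.2, -575.0]])
    b0 = np.array([520.7, 400.5])

    def composition_and_strain(w_sisi, w_sige):
        """Solve the 2x2 linear system for (Ge fraction x, strain)."""
        return np.linalg.solve(A, np.array([w_sisi, w_sige]) - b0)

    x, strain = composition_and_strain(505.0, 403.0)   # hypothetical measured peaks
    print(f"x = {x:.2f}, strain = {strain:.4f}")
    ```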

  12. Interpreting Popov criteria in Lur'e systems with complex scaling stability analysis

    NASA Astrophysics Data System (ADS)

    Zhou, J.

    2018-06-01

    The paper presents a novel frequency-domain interpretation of the Popov criteria for absolute stability in Lur'e systems by means of what we call complex scaling stability analysis. The complex scaling technique is developed for exponential/asymptotic stability in LTI feedback systems, and it dispenses with open-loop pole distribution, contour/locus orientation, and prior frequency sweeping. By exploiting the technique as an alternative way of revealing positive realness of transfer functions, the re-interpretation of the Popov criteria is explicated. More specifically, the suggested frequency-domain stability conditions take the same form in the scalar and multivariable cases, and can be implemented either graphically with locus plotting or numerically without it; in particular, the latter is suitable as a design tool with auxiliary parameter freedom. The interpretation also reveals further frequency-domain facts about Lur'e systems. Numerical examples are included to illustrate the main results.

  13. Flux control coefficients determined by inhibitor titration: the design and analysis of experiments to minimize errors.

    PubMed Central

    Small, J R

    1993-01-01

    This paper is a study into the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function fitting method. For these techniques, the sources of systematic errors are identified and the effects of systematic and random errors are quantified, using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and, under all conditions studied, that the fitting method, even under conditions where the assumptions underlying the fitted function do not hold, outperformed the graph method. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434

  14. Delboeuf and Janet as influences in Freud's treatment of Emmy von N.

    PubMed

    Macmillan, M B

    1979-10-01

    An analysis is made of Freud's treatment of the patient known as Emmy von N. in which for the first time he used what he called "Breuer's technique of investigation under hypnosis." It is shown that the main component of Freud's therapy owed nothing to Breuer: the patient's traumatic memories were altered by direct suggestion under hypnosis. The abreaction which did take place seems to have resulted from Freud's expectation that it should occur. Two cases published by Delboeuf and Janet in late 1888 and early 1889 were treated by a then unusual method which analysis demonstrates to have been virtually identical to the technique used by Freud. Evidence is presented that the Delboeuf and Janet cases could have been known to Freud before he began his treatment of Emmy von N.

  15. Covert Channels in SIP for VoIP Signalling

    NASA Astrophysics Data System (ADS)

    Mazurczyk, Wojciech; Szczypiorski, Krzysztof

    In this paper, we evaluate available steganographic techniques for SIP (Session Initiation Protocol) that can be used for creating covert channels during the signalling phase of a VoIP (Voice over IP) call. Apart from characterizing existing steganographic methods, we provide new insights by introducing new techniques. We also estimate the amount of data that can be transferred in signalling messages for a typical IP telephony call.

  16. Figure analysis: A teaching technique to promote visual literacy and active learning.

    PubMed

    Wiles, Amy M

    2016-07-08

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is not a technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016.

  17. Comparative analysis of methods and optical-electronic equipment to control the form parameters of spherical mirrors

    NASA Astrophysics Data System (ADS)

    Nikitin, Alexander N.; Baryshnikov, Nikolay; Denisov, Dmitrii; Karasik, Valerii; Sakharov, Alexey; Romanov, Pavel; Sheldakova, Julia; Kudryashov, Alexis

    2018-02-01

    In this paper we consider two approaches widely used in the testing of spherical optical surfaces: the Fizeau interferometer and the Shack-Hartmann wavefront sensor. The Fizeau interferometer, widely used in optical testing, can be transformed into a device based on a Shack-Hartmann wavefront sensor, an alternative technique for checking spherical optical components. We call this device the Hartmannometer and compare its features to those of the Fizeau interferometer.

  18. Integration of Audit Data Analysis and Mining Techniques into Aide

    DTIC Science & Technology

    2006-07-01

    results produced by the anomaly-detection subsystem. A successor system to NIDES, called EMERALD [35], currently under development at SRI, extends...to represent attack scenarios in a networked environment. eBayes of SRI’s Emerald uses Bayes net technology to analyze bursts of traffic [40...snmpget2, we have to resort to the TCP raw data (packets) to see what operations these connections performed. (E.g., long login trails in the PSSWD attack

  19. Planning effectiveness may grow on fault trees.

    PubMed

    Chow, C W; Haddad, K; Mannino, B

    1991-10-01

    The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.

  20. Method to simulate and analyse induced stresses for laser crystal packaging technologies.

    PubMed

    Ribes-Pleguezuelo, Pol; Zhang, Site; Beckert, Erik; Eberhardt, Ramona; Wyrowski, Frank; Tünnermann, Andreas

    2017-03-20

    A method to simulate induced stresses for a laser crystal packaging technique, and the consequent study of birefringent effects inside the laser cavities, has been developed. The method is based on thermo-mechanical simulations performed with ANSYS 17.0. The ANSYS results were then imported into the VirtualLab Fusion software, where the input/output beams were analysed in terms of wavelengths and polarization. The study was carried out in the context of a low-stress soldering technique for glass or crystal optics packaging called the solderjet bumping technique. The outcome of the analysis showed almost no difference between the input and output laser beams for the laser cavity constructed with an yttrium aluminum garnet active laser crystal, a second-harmonic-generator beta-barium borate, and the output laser mirror made of fused silica, assembled by the low-stress solderjet bumping technique.

  1. A spot pattern test chart technique for measurement of geometric aberrations caused by an intervening medium—a novel method

    NASA Astrophysics Data System (ADS)

    Ganesan, A. R.; Arulmozhivarman, P.; Jesson, M.

    2005-12-01

    Accurate surface metrology and transmission characteristics measurements have become vital to certify manufacturing excellence in the field of glass visors, windshields, menu boards and the transportation industry. We report a simple, cost-effective and novel technique for the measurement of geometric aberrations in transparent materials such as glass sheets, Perspex, etc. The technique makes use of an array spot pattern, which we call the spot pattern test chart technique, placed in the diffraction-limited imaging position with a large field of view. Performance features include variable angular dynamic range and angular sensitivity. The aberrations caused by transparent sheets introduced as an intervening medium in the line of sight are estimated in real time using the Zernike reconstruction method. A quantitative comparative analysis between a Shack-Hartmann wavefront sensor and the proposed new method is presented and the results are discussed.

  2. Safety Guided Design of Crew Return Vehicle in Concept Design Phase Using STAMP/STPA

    NASA Astrophysics Data System (ADS)

    Nakao, H.; Katahira, M.; Miyamoto, Y.; Leveson, N.

    2012-01-01

    In the concept development and design phase of a new space system, such as a crew vehicle, designers tend to focus on how to implement new technology. Designers also consider the difficulty of using the new technology and trade off several system design candidates, then choose an optimal design from the candidates. Safety should be a key aspect driving optimal concept design. However, in past concept design activities, safety analyses such as FTA have not been used to drive the design, because such analysis techniques focus on component failure, and component failure cannot be considered in the concept design phase. The solution to these problems is to apply a new hazard analysis technique called STAMP/STPA. STAMP/STPA defines safety as a control problem rather than a failure problem and identifies hazardous scenarios and their causes. Defining control flow is essential in the concept design phase; therefore, STAMP/STPA can be a useful tool to assess the safety of system candidates and to form part of the rationale for choosing a design as the baseline of the system. In this paper, we explain our case study of safety-guided concept design using STPA, the new hazard analysis technique, and a model-based specification technique on the Crew Return Vehicle design, and we evaluate the benefits of using STAMP/STPA in the concept development phase.

  3. 77 FR 56710 - Proposed Information Collection (Call Center Satisfaction Survey): Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-13

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0744] Proposed Information Collection (Call Center Satisfaction Survey): Comment Request AGENCY: Veterans Benefits Administration, Department of... techniques or the use of other forms of information technology. Title: VBA Call Center Satisfaction Survey...

  4. Ontology-Driven Provenance Management in eScience: An Application in Parasite Research

    NASA Astrophysics Data System (ADS)

    Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.

    Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.

  5. A proposed technique for vehicle tracking, direction, and speed determination

    NASA Astrophysics Data System (ADS)

    Fisher, Paul S.; Angaye, Cleopas O.; Fisher, Howard P.

    2004-12-01

    A technique for recognition of vehicles in terms of direction, distance, and rate of change is presented. This represents very early work on this problem with significant hurdles still to be addressed. These are discussed in the paper. However, preliminary results also show promise for this technique for use in security and defense environments where the penetration of a perimeter is of concern. The material described herein indicates a process whereby the protection of a barrier could be augmented by computers and installed cameras assisting the individuals charged with this responsibility. The technique we employ is called Finite Inductive Sequences (FI) and is proposed as a means for eliminating data requiring storage and recognition where conventional mathematical models don't eliminate enough and statistical models eliminate too much. FI is a simple idea and is based upon a symbol push-out technique that allows the order (inductive base) of the model to be set to an a priori value for all derived rules. The rules are obtained from exemplar data sets, and are derived by a technique called Factoring, yielding a table of rules called a Ruling. These rules can then be used in pattern recognition applications such as described in this paper.

  6. The physics of a popsicle stick bomb

    NASA Astrophysics Data System (ADS)

    Sautel, Jérémy; Bourges, Andréane; Caussarieu, Aude; Plihon, Nicolas; Taberlet, Nicolas

    2017-10-01

    Popsicle sticks can be interlocked in the so-called "cobra weave" to form a chain under tension. When one end of the chain is released, the sticks rapidly disentangle, forming a traveling wave that propagates down the chain. In this paper, the properties of the traveling front are studied experimentally, and classical results from the theory of elasticity allow for a dimensional analysis of the height and speed of the traveling wave. The study presented here can help undergraduate students familiarize themselves with experimental techniques of image processing, and it also demonstrates the power of dimensional analysis and scaling laws.

  7. Training Humans to Categorize Monkey Calls: Auditory Feature- and Category-Selective Neural Tuning Changes.

    PubMed

    Jiang, Xiong; Chevillet, Mark A; Rauschecker, Josef P; Riesenhuber, Maximilian

    2018-04-18

    Grouping auditory stimuli into common categories is essential for a variety of auditory tasks, including speech recognition. We trained human participants to categorize auditory stimuli from a large novel set of morphed monkey vocalizations. Using fMRI-rapid adaptation (fMRI-RA) and multi-voxel pattern analysis (MVPA) techniques, we gained evidence that categorization training results in two distinct sets of changes: sharpened tuning to monkey call features (without explicit category representation) in left auditory cortex and category selectivity for different types of calls in lateral prefrontal cortex. In addition, the sharpness of neural selectivity in left auditory cortex, as estimated with both fMRI-RA and MVPA, predicted the steepness of the categorical boundary, whereas categorical judgment correlated with release from adaptation in the left inferior frontal gyrus. These results support the theory that auditory category learning follows a two-stage model analogous to the visual domain, suggesting general principles of perceptual category learning in the human brain. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. The scale invariant generator technique for quantifying anisotropic scale invariance

    NASA Astrophysics Data System (ADS)

    Lewis, G. M.; Lovejoy, S.; Schertzer, D.; Pecknold, S.

    1999-11-01

    Scale invariance is rapidly becoming a new paradigm for geophysics. However, little attention has been paid to the anisotropy that is invariably present in geophysical fields in the form of differential stratification and rotation, texture and morphology. In order to account for scaling anisotropy, the formalism of generalized scale invariance (GSI) was developed. Until now there has existed only a single fairly ad hoc GSI analysis technique valid for studying differential rotation. In this paper, we use a two-dimensional representation of the linear approximation to generalized scale invariance, to obtain a much improved technique for quantifying anisotropic scale invariance called the scale invariant generator technique (SIG). The accuracy of the technique is tested using anisotropic multifractal simulations and error estimates are provided for the geophysically relevant range of parameters. It is found that the technique yields reasonable estimates for simulations with a diversity of anisotropic and statistical characteristics. The scale invariant generator technique can profitably be applied to the scale invariant study of vertical/horizontal and space/time cross-sections of geophysical fields as well as to the study of the texture/morphology of fields.

  9. Atmospheric Precorrected Differential Absorption technique to retrieve columnar water vapor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlaepfer, D.; Itten, K.I.; Borel, C.C.

    1998-09-01

    Differential absorption techniques are suitable to retrieve the total column water vapor contents from imaging spectroscopy data. A technique called Atmospheric Precorrected Differential Absorption (APDA) is derived directly from simplified radiative transfer equations. It combines a partial atmospheric correction with a differential absorption technique. The atmospheric path radiance term is iteratively corrected during the retrieval of water vapor. This improves the results especially over low background albedos. The error of the method for various ground reflectance spectra is below 7% for most of the spectra. The channel combinations for two test cases are then defined, using a quantitative procedure, which is based on MODTRAN simulations and the image itself. An error analysis indicates that the influence of aerosols and channel calibration is minimal. The APDA technique is then applied to two AVIRIS images acquired in 1991 and 1995. The accuracy of the measured water vapor columns is within a range of ±5% compared to ground truth radiosonde data.

  10. Lip reposition surgery: A new call in periodontics

    PubMed Central

    Sheth, Tejal; Shah, Shilpi; Shah, Mihir; Shah, Ekta

    2013-01-01

    “Gummy smile” is a major concern for a large number of patients visiting the dentist. Esthetics has now become an integral part of the periodontal treatment plan. This article presents a case of a gummy smile in which esthetic correction was achieved through a periodontal plastic surgical procedure wherein a 10-12 mm partial-thickness flap was dissected apical to the mucogingival junction, followed by approximation of the flaps. This novel technique gave excellent post-operative results with enormous patient satisfaction. This chair-side surgical procedure, one of its kind with outstanding results, is very rarely performed by periodontists. Thus, considerable clinical work and literature review on this surgical technique is required. To make it a routine surgical procedure, this technique can be incorporated as a part of periodontal plastic surgery in the text. Hence, we have put forward our experience of a case with a critical analysis of the surgical technique, including its limitations. PMID:24124310

  11. Categorisation of full waveform data provided by laser scanning devices

    NASA Astrophysics Data System (ADS)

    Ullrich, Andreas; Pfennigbauer, Martin

    2011-11-01

    In 2004, a laser scanner device for commercial airborne laser scanning applications, the RIEGL LMS-Q560, was introduced to the market, making use of a radically alternative approach to the analogue signal detection and processing schemes traditionally found in LIDAR instruments: digitizing the echo signals received by the instrument for every laser pulse and analysing these echo signals off-line in a so-called full waveform analysis, in order to retrieve almost all information contained in the echo signal using transparent algorithms adaptable to specific applications. In the field of laser scanning, the somewhat unspecific term "full waveform data" has since been established. We attempt a categorisation of the different types of full waveform data found on the market. We discuss the challenges in echo digitization and waveform analysis from an instrument designer's point of view, and we address the benefits to be gained by using this technique, especially with respect to the so-called multi-target capability of pulsed time-of-flight LIDAR instruments.

  12. AI in CALL--Artificially Inflated or Almost Imminent?

    ERIC Educational Resources Information Center

    Schulze, Mathias

    2008-01-01

    The application of techniques from artificial intelligence (AI) to CALL has commonly been referred to as intelligent CALL (ICALL). ICALL is only slightly older than the "CALICO Journal", and this paper looks back at a quarter century of published research mainly in North America and by North American scholars. This "inventory…

  13. Large-scale inverse model analyses employing fast randomized data reduction

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
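    The core "sketching" idea can be illustrated with a toy linear inverse problem: multiply both the forward operator and the data by a short random matrix and solve the much smaller system. The names and sizes below are hypothetical; the paper's actual implementation is the RGA code in Julia within MADS.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_obs, k = 100_000, 100                        # many observations, small sketch
    H = rng.standard_normal((n_obs, 5))            # hypothetical linear forward model
    m_true = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
    d = H @ m_true + 0.01 * rng.standard_normal(n_obs)

    S = rng.standard_normal((k, n_obs)) / np.sqrt(k)    # Gaussian sketching matrix
    m_hat, *_ = np.linalg.lstsq(S @ H, S @ d, rcond=None)
    print(np.round(m_hat, 3))                      # close to m_true at ~1000x fewer rows
    ```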

  14. Development and verification of local/global analysis techniques for laminated composites

    NASA Technical Reports Server (NTRS)

    Griffin, O. Hayden, Jr.

    1989-01-01

    Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, models which are too large to run, either from a resource or a time standpoint, are often required. There are several approaches which can permit such analyses, including substructuring, the use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique for global/local analysis, where a global analysis is run and the results of that analysis are applied to a smaller region as boundary conditions, in as many iterations as are required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement-formulation elements which have well-known behavior when used for the analysis of laminated composites.

  15. Resolving the morphology of niobium carbonitride nano-precipitates in steel using atom probe tomography.

    PubMed

    Breen, Andrew J; Xie, Kelvin Y; Moody, Michael P; Gault, Baptiste; Yen, Hung-Wei; Wong, Christopher C; Cairney, Julie M; Ringer, Simon P

    2014-08-01

    Atom probe is a powerful technique for studying the composition of nano-precipitates, but their morphology within the reconstructed data is distorted due to the so-called local magnification effect. A new technique has been developed to mitigate this limitation by characterizing the distribution of the surrounding matrix atoms, rather than those contained within the nano-precipitates themselves. A comprehensive chemical analysis enables further information on size and chemistry to be obtained. The method enables new insight into the morphology and chemistry of niobium carbonitride nano-precipitates within ferrite for a series of Nb-microalloyed ultra-thin cast strip steels. The results are supported by complementary high-resolution transmission electron microscopy.

  16. A direct-inverse method for transonic and separated flows about airfoils

    NASA Technical Reports Server (NTRS)

    Carlson, K. D.

    1985-01-01

    A direct-inverse technique and computer program called TAMSEP that can be used for the analysis of the flow about airfoils at subsonic and low transonic freestream velocities is presented. The method is based upon a direct-inverse nonconservative full potential inviscid method, a Thwaites laminar boundary layer technique, and the Barnwell turbulent momentum integral scheme; and it is formulated using Cartesian coordinates. Since the method utilizes inverse boundary conditions in regions of separated flow, it is suitable for predicting the flowfield about airfoils having trailing edge separated flow under high lift conditions. Comparisons with experimental data indicate that the method should be a useful tool for applied aerodynamic analyses.

  17. A direct-inverse method for transonic and separated flows about airfoils

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1990-01-01

    A direct-inverse technique and computer program called TAMSEP that can be used for the analysis of the flow about airfoils at subsonic and low transonic freestream velocities is presented. The method is based upon a direct-inverse nonconservative full potential inviscid method, a Thwaites laminar boundary layer technique, and the Barnwell turbulent momentum integral scheme; and it is formulated using Cartesian coordinates. Since the method utilizes inverse boundary conditions in regions of separated flow, it is suitable for predicting the flow field about airfoils having trailing edge separated flow under high lift conditions. Comparisons with experimental data indicate that the method should be a useful tool for applied aerodynamic analyses.

  18. Facilitation techniques as predictors of crew participation in LOFT debriefings

    NASA Technical Reports Server (NTRS)

    McDonnell, L. K.

    1996-01-01

    Based on theories of adult learning and airline industry guidelines for Crew Resource Management (CRM), the stated objective during Line Oriented Flight Training (LOFT) debriefings is for instructor pilots (IP's) to facilitate crew self-analysis of performance. This study reviews 19 LOFT debriefings from two major U.S. airlines to examine the relationship between IP efforts at facilitation and associated characteristics of crew participation. A subjective rating scale called the Debriefing Assessment Battery was developed and utilized to evaluate the effectiveness of IP facilitation and the quality of crew participation. The results indicate that IP content, encouragement, and questioning techniques are highly and significantly correlated with, and can therefore predict, the degree and depth of crew participation.

  19. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  20. Visual mining business service using pixel bar charts

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Casati, Fabio

    2004-06-01

    Basic bar charts have been commonly available, but they only show highly aggregated data. Finding the valuable information hidden in the data is essential to the success of a business. We describe a new visualization technique called pixel bar charts, which are derived from regular bar charts. The basic idea of a pixel bar chart is to present all data values directly instead of aggregating them into a few data values. Pixel bar charts show the data distribution and exceptions in addition to aggregated data. The approach is to represent each data item (e.g. a business transaction) by a single pixel in the bar chart. The attribute of each data item is encoded into the pixel color and can be accessed and drilled down to detail information as needed. Different color mappings are used to represent multiple attributes. This technique has been prototyped in three business service applications at Hewlett Packard Laboratories: Business Operation Analysis, Sales Analysis, and Service Level Agreement Analysis. Our applications show the wide applicability and usefulness of this new idea.
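    A toy rendition of the idea in matplotlib: every data item becomes one pixel, bars are grouped by category, bar height reflects the number of items, and pixel color encodes an attribute. The data and layout details here are made up for illustration and do not reproduce the prototyped applications.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    # Hypothetical transaction values, grouped into four categories of varying size.
    categories = {c: rng.gamma(2.0, 50, size=rng.integers(300, 900)) for c in "ABCD"}

    fig, axes = plt.subplots(1, len(categories), sharey=True)
    width = 20                                    # pixels per bar row
    for ax, (name, values) in zip(axes, categories.items()):
        v = np.sort(values)                       # ordering makes the distribution visible
        pad = (-len(v)) % width
        img = np.pad(v, (0, pad), constant_values=np.nan).reshape(-1, width)
        ax.imshow(img, aspect="auto", origin="lower", cmap="viridis")
        ax.set_title(name)
        ax.set_xticks([])
    plt.show()
    ```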

  1. ALIF: a new promising technique for the decomposition and analysis of nonlinear and nonstationary signals

    NASA Astrophysics Data System (ADS)

    Cicone, Antonio; Zhou, Haomin; Piersanti, Mirko; Materassi, Massimo; Spogli, Luca

    2017-04-01

    Nonlinear and nonstationary signals are ubiquitous in real life. Their decomposition and analysis is of crucial importance in many research fields. Traditional techniques, like the Fourier and wavelet transforms, have proved to be limited in this context. In the last two decades, new kinds of nonlinear methods have been developed which are able to unravel hidden features of these kinds of signals. In this talk we will review the state of the art and present a new method, called Adaptive Local Iterative Filtering (ALIF). Unlike any other technique proposed so far, this method, developed originally to study one-dimensional signals, can easily be generalized to study two- or higher-dimensional signals. Furthermore, unlike most similar methods, it does not require any a priori assumption on the signal itself, so the method can be applied as it is to any kind of signal. Applications of the ALIF algorithm to real-life signal analysis will be presented, such as the behavior of the water level near the coastline in the presence of a tsunami, the length-of-day signal, the temperature and pressure measured at ground level on a global grid, and radio power scintillation from GNSS signals.
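    A minimal fixed-window iterative-filtering step, the building block that ALIF makes adaptive, can be sketched as follows. This uses a constant moving-average mask, whereas ALIF chooses the filter length locally; the function and the toy two-tone signal are illustrative, not the authors' algorithm.

    ```python
    import numpy as np

    def iterative_filtering_imf(x, window, n_iter=10):
        """One fixed-mask iterative-filtering step: repeatedly subtract a moving
        average to isolate the fastest oscillatory component (an IMF). ALIF makes
        the filter length locally adaptive; this sketch keeps it constant."""
        h = np.ones(window) / window
        imf = np.asarray(x, dtype=float).copy()
        for _ in range(n_iter):
            trend = np.convolve(imf, h, mode="same")
            imf = imf - trend
        return imf

    # Toy two-tone signal: the extracted IMF is dominated by the fast 50 Hz tone.
    t = np.linspace(0.0, 1.0, 1000, endpoint=False)
    x = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 3 * t)
    fast = iterative_filtering_imf(x, window=41)
    slow = x - fast            # remainder carries mostly the slow 3 Hz component
    ```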

  2. Wavelet analysis of frequency chaos game signal: a time-frequency signature of the C. elegans DNA.

    PubMed

    Messaoudi, Imen; Oueslati, Afef Elloumi; Lachiri, Zied

    2014-12-01

    Challenging tasks are encountered in the field of bioinformatics. The choice of the genomic sequence mapping technique is one of the most fastidious tasks, and a judicious choice helps in examining the distribution of periodic patterns that concord with the underlying structure of genomes. Despite that, the search for a coding technique that can highlight all the information contained in the DNA has not yet attracted the attention it deserves. In this paper, we propose a new mapping technique based on the chaos game, which we call the frequency chaos game signal (FCGS). The particularity of the FCGS coding resides in exploiting the statistical properties of the genomic sequence itself, which may reflect important structural and organizational features of DNA. To prove the usefulness of the FCGS approach in the detection of different local periodic patterns, we use wavelet analysis, because it provides access to information that can be obscured by other time-frequency methods such as Fourier analysis. Thus, we apply the continuous wavelet transform (CWT) with the complex Morlet wavelet as the mother wavelet function. Scalograms for the organism Caenorhabditis elegans (C. elegans) exhibit a multitude of periodic organizations of specific DNA sequences.
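    For orientation, the classic chaos-game mapping that FCGS builds on can be written in a few lines. This is the standard CGR recursion, not the authors' frequency-weighted FCGS coding, and the example sequence is made up.

    ```python
    import numpy as np

    # Corners of the unit square for the four bases (standard CGR convention).
    CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

    def chaos_game_coords(seq):
        """Classic chaos-game representation: walk halfway toward each base's corner."""
        pos = np.array([0.5, 0.5])
        coords = np.empty((len(seq), 2))
        for i, base in enumerate(seq.upper()):
            pos = 0.5 * (pos + np.asarray(CORNERS[base]))
            coords[i] = pos
        return coords

    xy = chaos_game_coords("ACGTGCTAGCTAAC")   # either column can serve as a 1-D signal
    ```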

  3. Analysis of High Spatial, Temporal, and Directional Resolution Recordings of Biological Sounds in the Southern California Bight

    DTIC Science & Technology

    2013-09-30

    transiting whales in the Southern California Bight, b) the use of passive underwater acoustic techniques for improved habitat assessment in biologically...sensitive areas and improved ecosystem modeling, and c) the application of the physics of excitable media to numerical modeling of biological choruses...was on the potential impact of man-made sounds on the calling behavior of transiting humpback whales in the Southern California Bight. The main

  4. Analysis of High Spatial, Temporal, and Directional Resolution Recordings of Biological Sounds in the Southern California Bight

    DTIC Science & Technology

    2012-09-30

    tested in this research is that the evolution of unit structure and song characteristics in the population of transiting humpback whales in the Southern...behavior of transiting humpback whales in the Southern California Bight, b) the use of passive underwater acoustic techniques for improved habitat...man-made sounds on the calling behavior of transiting humpback whales in the Southern California Bight”. The main scientific hypothesis to be

  5. Formal Verification at System Level

    NASA Astrophysics Data System (ADS)

    Mazzini, S.; Puri, S.; Mari, F.; Melatti, I.; Tronci, E.

    2009-05-01

    System-level analysis calls for a language comprehensible to experts with different backgrounds and yet precise enough to support meaningful analyses. SysML is emerging as an effective balance between such conflicting goals. In this paper we outline some of the results on SysML-based system-level functional formal verification obtained in an ESA/ESTEC study, carried out in collaboration between INTECS and La Sapienza University of Roma. The study focuses on SysML-based system-level functional requirements techniques.

  6. An adaptive, object oriented strategy for base calling in DNA sequence analysis.

    PubMed Central

    Giddings, M C; Brumley, R L; Haker, M; Smith, L M

    1993-01-01

    An algorithm has been developed for the determination of nucleotide sequence from data produced in fluorescence-based automated DNA sequencing instruments employing the four-color strategy. This algorithm takes advantage of object oriented programming techniques for modularity and extensibility. The algorithm is adaptive in that data sets from a wide variety of instruments and sequencing conditions can be used with good results. Confidence values are provided on the base calls as an estimate of accuracy. The algorithm iteratively employs confidence determinations from several different modules, each of which examines a different feature of the data for accurate peak identification. Modules within this system can be added or removed for increased performance or for application to a different task. In comparisons with commercial software, the algorithm performed well. PMID:8233787

  7. TLM-Tracker: software for cell segmentation, tracking and lineage analysis in time-lapse microscopy movies.

    PubMed

    Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter

    2012-09-01

    Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena like culture heterogeneity. In this context, computational image processing for the analysis of single-cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, called TLM-Tracker, allows for flexible and user-friendly segmentation, tracking, and lineage analysis of microbial cells in time-lapse movies. The software package, including the manual, a tutorial video, and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
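    A minimal per-frame segmentation step of the kind such tools automate, sketched with SciPy on a synthetic frame; the threshold value, blob sizes, and frame contents are arbitrary, and TLM-Tracker itself is Matlab-based and far more complete.

    ```python
    import numpy as np
    from scipy import ndimage as ndi

    # Synthetic frame: a few Gaussian "cells" on a noisy background.
    rng = np.random.default_rng(0)
    yy, xx = np.ogrid[:128, :128]
    frame = np.zeros((128, 128))
    for _ in range(5):
        cy, cx = rng.integers(20, 108, size=2)
        frame += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 30.0)
    frame += 0.05 * rng.standard_normal(frame.shape)

    mask = frame > 0.5                               # simple global threshold
    labels, n_cells = ndi.label(mask)                # connected-component segmentation
    centroids = ndi.center_of_mass(mask, labels, range(1, n_cells + 1))
    print(n_cells, "cells at", centroids)
    ```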

  8. Development of Super-Ensemble techniques for ocean analyses: the Mediterranean Sea case

    NASA Astrophysics Data System (ADS)

    Pistoia, Jenny; Pinardi, Nadia; Oddo, Paolo; Collins, Matthew; Korres, Gerasimos; Drillet, Yann

    2017-04-01

    Short-term ocean analyses for sea surface temperature (SST) in the Mediterranean Sea can be improved by a statistical post-processing technique called super-ensemble. This technique consists of a multi-linear regression algorithm applied to a Multi-Physics Multi-Model Super-Ensemble (MMSE) dataset, a collection of different operational forecasting analyses together with ad-hoc simulations produced by modifying selected numerical model parameterizations. A new linear regression algorithm based on Empirical Orthogonal Function filtering techniques is capable of preventing overfitting problems, although the best performances are achieved when we add correlation to the super-ensemble structure using a simple spatial filter applied after the linear regression. Our results show that super-ensemble performance depends on the selection of an unbiased operator and on the length of the learning period, but the quality of the generating MMSE dataset has the largest impact on the MMSE analysis Root Mean Square Error (RMSE) evaluated with respect to observed satellite SST. The lowest RMSE analysis estimates result from the following choices: a 15-day training period, an overconfident MMSE dataset (a subset with the higher-quality ensemble members), and the least-squares algorithm filtered a posteriori.
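    The regression step itself is straightforward; a toy version with synthetic ensemble members and observations might look like this. The member count, 15-sample training window, and variable names are placeholders, and the paper's EOF filtering and spatial smoothing steps are omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_train, n_members = 15, 6                        # e.g. a 15-day training window
    members = rng.standard_normal((n_train, n_members))      # ensemble SST anomalies
    obs = members @ rng.random(n_members) + 0.1 * rng.standard_normal(n_train)

    A = np.column_stack([members, np.ones(n_train)])  # regressors plus a bias term
    weights, *_ = np.linalg.lstsq(A, obs, rcond=None) # least-squares training step

    new_members = rng.standard_normal(n_members)      # members at the analysis time
    sst_estimate = new_members @ weights[:-1] + weights[-1]
    ```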

  9. Mapping brain activity in gradient-echo functional MRI using principal component analysis

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Singh, Manbir; Don, Manuel

    1997-05-01

    The detection of sites of brain activation in functional MRI has been a topic of immense research interest and many techniques have been proposed to this end. Recently, principal component analysis (PCA) has been applied to extract the activated regions and their time course of activation. This method is based on the assumption that the activation is orthogonal to other signal variations such as brain motion, physiological oscillations and other uncorrelated noises. A distinct advantage of this method is that it does not require any knowledge of the time course of the true stimulus paradigm. This technique is well suited to EPI image sequences where the sampling rate is high enough to capture the effects of physiological oscillations. In this work, we propose and apply two methods that are based on PCA to conventional gradient-echo images and investigate their usefulness as tools to extract reliable information on brain activation. The first method is a conventional technique where a single image sequence with alternating on and off stages is subjected to a principal component analysis. The second method is a PCA-based approach called the common spatial factor analysis technique (CSF). As the name suggests, this method relies on common spatial factors between the above fMRI image sequence and a background fMRI sequence. We have applied these methods to identify active brain areas during visual stimulation and motor tasks. The results from these methods are compared to those obtained by using the standard cross-correlation technique. We found good agreement in the areas identified as active across all three techniques. The results suggest that the PCA and CSF methods have good potential for detecting true stimulus-correlated changes in the presence of other interfering signals.
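    A compact sketch of the first, conventional PCA approach on a toy time-by-voxel matrix; the synthetic block design and voxel counts are assumptions, and the CSF variant, which couples a task run with a background run, is not shown.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_time, n_voxels = 80, 2000
    paradigm = np.tile(np.repeat([0.0, 1.0], 10), 4)   # alternating off/on blocks
    spatial = np.zeros(n_voxels)
    spatial[:50] = 1.0                                 # 50 "active" voxels
    data = np.outer(paradigm, spatial) + 0.5 * rng.standard_normal((n_time, n_voxels))

    data -= data.mean(axis=0)                          # remove each voxel's mean
    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    time_course = U[:, 0] * s[0]                       # dominant temporal component
    activation_map = Vt[0]                             # its spatial weights (the "map")
    ```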

  10. Combined use of quantitative ED-EPMA, Raman microspectrometry, and ATR-FTIR imaging techniques for the analysis of individual particles.

    PubMed

    Jung, Hae-Jin; Eom, Hyo-Jin; Kang, Hyun-Woo; Moreau, Myriam; Sobanska, Sophie; Ro, Chul-Un

    2014-08-21

    In this work, quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA) (called low-Z particle EPMA), Raman microspectrometry (RMS), and attenuated total reflectance Fourier transform infrared spectroscopic (ATR-FTIR) imaging were applied in combination for the analysis of the same individual airborne particles for the first time. After examining individual particles of micrometer size by low-Z particle EPMA, consecutive examinations by RMS and ATR-FTIR imaging of the same individual particles were then performed. The relocation of the same particles on Al or Ag foils was successfully carried out among the three standalone instruments for several standard samples and an indoor airborne particle sample, resulting in the successful acquisition of quality spectral data from the three single-particle analytical techniques. The combined application of the three techniques to several different standard particles confirmed that those techniques provided consistent and complementary chemical composition information on the same individual particles. Further, it was clearly demonstrated that the three different types of spectral and imaging data from the same individual particles in an indoor aerosol sample provided richer information on physicochemical characteristics of the particle ensemble than that obtainable by the combined use of two single-particle analytical techniques.

  11. Fundamental limits of reconstruction-based superresolution algorithms under local translation.

    PubMed

    Lin, Zhouchen; Shum, Heung-Yeung

    2004-01-01

    Superresolution is a technique that can produce images of a higher resolution than that of the originally captured ones. Nevertheless, improvement in resolution using such a technique is very limited in practice. This makes it significant to study the problem: "Do fundamental limits exist for superresolution?" In this paper, we focus on a major class of superresolution algorithms, called the reconstruction-based algorithms, which compute high-resolution images by simulating the image formation process. Assuming local translation among low-resolution images, this paper is the first attempt to determine the explicit limits of reconstruction-based algorithms, under both real and synthetic conditions. Based on the perturbation theory of linear systems, we obtain the superresolution limits from the conditioning analysis of the coefficient matrix. Moreover, we determine the number of low-resolution images that are sufficient to achieve the limit. Both real and synthetic experiments are carried out to verify our analysis.

  12. Guided SAR image despeckling with probabilistic non local weights

    NASA Astrophysics Data System (ADS)

    Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny

    2017-12-01

    SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction a difficult task. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non-Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with dynamically derived values based on the image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, a significant improvement is achieved in terms of performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.

  13. Automatic classification of animal vocalizations

    NASA Astrophysics Data System (ADS)

    Clemins, Patrick J.

    2005-11-01

    Bioacoustics, the study of animal vocalizations, has begun to use increasingly sophisticated analysis techniques in recent years. Some common tasks in bioacoustics are repertoire determination, call detection, individual identification, stress detection, and behavior correlation. Each research study, however, uses a wide variety of different measured variables, called features, and classification systems to accomplish these tasks. The well-established field of human speech processing has developed a number of different techniques to perform many of the aforementioned bioacoustics tasks. Mel-frequency cepstral coefficients (MFCCs) and perceptual linear prediction (PLP) coefficients are two popular feature sets. The hidden Markov model (HMM), a statistical model similar to a finite automaton, is the most commonly used supervised classification model and is capable of modeling both temporal and spectral variations. This research designs a framework that applies models from human speech processing to bioacoustic analysis tasks. The development of the generalized perceptual linear prediction (gPLP) feature extraction model is one of the more important novel contributions of the framework. Perceptual information from the species under study can be incorporated into the gPLP feature extraction model to represent the vocalizations as the animals might perceive them. By including this perceptual information and modifying parameters of the HMM classification system, this framework can be applied to a wide range of species. The effectiveness of the framework is shown by analyzing African elephant and beluga whale vocalizations. The features extracted from the African elephant data are used as input to a supervised classification system and compared to results from traditional statistical tests. The gPLP features extracted from the beluga whale data are used in an unsupervised classification system and the results are compared to labels assigned by experts. The development of a framework from which to build animal vocalization classifiers will provide bioacoustics researchers with a consistent platform to analyze and classify vocalizations. A common framework will also allow studies to compare results across species and institutions. In addition, the use of automated classification techniques can speed analysis and uncover behavioral correlations not readily apparent using traditional techniques.
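    For context, the conventional MFCC-plus-HMM pipeline that the framework generalizes can be sketched with librosa and hmmlearn. The file paths, label dictionary, and five-state model size are hypothetical, and the gPLP features proposed in the work would replace the MFCC step.

    ```python
    # Assumed inputs: wav files grouped by call type in `training_files_by_label`.
    import numpy as np
    import librosa
    from hmmlearn import hmm

    def mfcc_features(path):
        y, sr = librosa.load(path, sr=None)
        return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T      # (n_frames, 13)

    def train_call_models(training_files_by_label, n_states=5):
        """Fit one Gaussian HMM per call type on concatenated MFCC frames."""
        models = {}
        for label, paths in training_files_by_label.items():
            feats = [mfcc_features(p) for p in paths]
            X, lengths = np.vstack(feats), [len(f) for f in feats]
            models[label] = hmm.GaussianHMM(n_components=n_states).fit(X, lengths)
        return models

    def classify_call(path, models):
        feats = mfcc_features(path)
        return max(models, key=lambda label: models[label].score(feats))
    ```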

  14. A comparison in Colorado of three methods to monitor breeding amphibians

    USGS Publications Warehouse

    Corn, P.S.; Muths, E.; Iko, W.M.

    2000-01-01

    We surveyed amphibians at 4 montane and 2 plains lentic sites in northern Colorado using 3 techniques: standardized call surveys, automated recording devices (frog-loggers), and intensive surveys including capture-recapture techniques. Amphibians were observed at 5 sites. Species richness varied from 0 to 4 species at each site. Richness scores, the sums of species richness among sites, were similar among methods: 8 for call surveys, 10 for frog-loggers, and 11 for intensive surveys (9 if the non-vocal salamander Ambystoma tigrinum is excluded). The frog-logger at 1 site recorded Spea bombifrons which was not active during the times when call and intensive surveys were conducted. Relative abundance scores from call surveys failed to reflect a relatively large population of Bufo woodhousii at 1 site and only weakly differentiated among different-sized populations of Pseudacris maculata at 3 other sites. For extensive applications, call surveys have the lowest costs and fewest requirements for highly trained personnel. However, for a variety of reasons, call surveys cannot be used with equal effectiveness in all parts of North America.

  15. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.

  16. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  17. Automatic archaeological feature extraction from satellite VHR images

    NASA Astrophysics Data System (ADS)

    Jahjah, Munzer; Ulivieri, Carlo

    2010-05-01

    Archaeological applications need a methodological approach on a variable scale able to satisfy the intra-site (excavation) and the inter-site (survey, environmental research). The increased availability of high resolution and micro-scale data has substantially favoured archaeological applications and the consequent use of GIS platforms for reconstruction of archaeological landscapes based on remotely sensed data. Feature extraction from multispectral remotely sensed images is an important task before any further processing. High resolution remote sensing data, especially panchromatic, is an important input for the analysis of various types of image characteristics; it plays an important role in the visual systems for recognition and interpretation of given data. The methods proposed rely on an object-oriented approach based on a theory for the analysis of spatial structures called mathematical morphology. The term "morphology" stems from the fact that it aims at analysing object shapes and forms. It is mathematical in the sense that the analysis is based on set theory, integral geometry, and lattice algebra. Mathematical morphology has proven to be a powerful image analysis technique; two-dimensional grey tone images are seen as three-dimensional sets by associating each image pixel with an elevation proportional to its intensity level. An object of known shape and size, called the structuring element, is then used to investigate the morphology of the input set. This is achieved by positioning the origin of the structuring element at every possible position of the space and testing, for each position, whether the structuring element either is included or has a nonempty intersection with the studied set. The shape and size of the structuring element must be selected according to the morphology of the searched image structures. Two other feature extraction techniques, eCognition and the ENVI feature extraction module, were used in order to compare the results. These techniques were applied to different archaeological sites in Turkmenistan (Nisa) and in Iraq (Babylon); a further change detection analysis was applied to the Babylon site using two HR images acquired before and after the second Gulf War. The results and outputs differed because the operative scale of the sensed data determines the final result of the elaboration and the quality of the extracted information, and because each technique was sensitive to specific shapes in the input image; we mapped linear and nonlinear objects, updated the archaeological cartography, and performed an automatic change detection analysis for the Babylon site. The discussion of these techniques aims to provide the archaeological team with new instruments for the orientation and the planning of a remote sensing application.
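
    The following sketch illustrates the structuring-element probing described above using grey-tone morphological operators from scikit-image; it is a generic illustration, not the authors' workflow, and the image file name, element sizes and threshold are hypothetical choices.

        # Minimal sketch of morphological feature extraction on a panchromatic image,
        # in the spirit of the structuring-element probing described above.
        # Assumes scikit-image; "scene.tif" is a hypothetical single-band image.
        import numpy as np
        from skimage import io
        from skimage.morphology import disk, opening, closing, white_tophat

        img = io.imread("scene.tif").astype(float)

        selem = disk(5)                      # structuring element: shape/size chosen
                                             # to match the searched image structures
        opened = opening(img, selem)         # removes bright structures smaller than selem
        closed = closing(img, selem)         # removes dark structures smaller than selem
        bright_small = white_tophat(img, selem)  # bright features smaller than selem

        # A crude small-feature response: difference of openings at two scales.
        response = opening(img, disk(2)) - opening(img, disk(8))
        mask = response > np.percentile(response, 99)   # keep the strongest responses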

  18. Multi- and monofractal indices of short-term heart rate variability.

    PubMed

    Fischer, R; Akay, M; Castiglioni, P; Di Rienzo, M

    2003-09-01

    Indices of heart rate variability (HRV) based on fractal signal models have recently been shown to possess value as predictors of mortality in specific patient populations. To develop more powerful clinical indices of HRV based on a fractal signal model, the study investigated two HRV indices based on a monofractal signal model called fractional Brownian motion and an index based on a multifractal signal model called multifractional Brownian motion. The performance of the indices was compared with an HRV index in common clinical use. To compare the indices, 18 normal subjects were subjected to postural changes, and the indices were compared on their ability to respond to the resulting autonomic events in HRV recordings. The magnitude of the response to postural change (normalised by the measurement variability) was assessed by analysis of variance and multiple comparison testing. Four HRV indices were investigated for this study: the standard deviation of all normal R-R intervals; an HRV index commonly used in the clinic; detrended fluctuation analysis, an HRV index found to be the most powerful predictor of mortality in a study of patients with depressed left ventricular function; an HRV index developed using the maximum likelihood estimation (MLE) technique for a monofractal signal model; and an HRV index developed for the analysis of multifractional Brownian motion signals. The HRV index based on the MLE technique was found to respond most strongly to the induced postural changes (95% CI). The magnitude of its response (normalised by the measurement variability) was at least 25% greater than any of the other indices tested.
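
    One of the indices compared above, detrended fluctuation analysis, can be sketched in a few lines of numpy; this is an illustrative implementation on synthetic R-R intervals, not the study's MLE-based monofractal estimator.

        # Minimal sketch of detrended fluctuation analysis (DFA) on an R-R interval
        # series, one of the HRV indices compared above. Pure numpy; illustrative only.
        import numpy as np

        def dfa_alpha(rr, scales=None):
            """Return the DFA scaling exponent alpha of a 1-D series."""
            rr = np.asarray(rr, dtype=float)
            y = np.cumsum(rr - rr.mean())              # integrated, mean-removed profile
            if scales is None:
                scales = np.unique(np.logspace(np.log10(4), np.log10(len(rr) // 4), 15).astype(int))
            F = []
            for n in scales:
                n_seg = len(y) // n
                segs = y[:n_seg * n].reshape(n_seg, n)
                t = np.arange(n)
                rms = []
                for seg in segs:                       # detrend each window with a line
                    coeffs = np.polyfit(t, seg, 1)
                    rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
                F.append(np.mean(rms))
            alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
            return alpha

        rr = np.random.normal(0.8, 0.05, 2000)         # synthetic R-R intervals (seconds)
        print(dfa_alpha(rr))                           # ~0.5 for uncorrelated noise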

  19. High-Bandwidth Tactical-Network Data Analysis in a High-Performance-Computing (HPC) Environment: Voice Call Analysis

    DTIC Science & Technology

    2015-09-01

    Report sections include the gateway; voice packet flow (SIP, Session Description Protocol (SDP), and RTP); voice data analysis; call analysis; and call metrics. ...analysis processing is designed for a general VoIP system architecture based on Session Initiation Protocol (SIP) for negotiating call sessions and...employs Skinny Client Control Protocol for network communication between the phone and the local CallManager (e.g., for each dialed digit), SIP

  20. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
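
    As a rough illustration of the contrast-evolution extraction described above (not the NASA IR Contrast tool itself), the sketch below pulls a normalized contrast-versus-time curve from a flash thermography frame stack; the file name and the anomaly/reference pixel windows are hypothetical.

        # Minimal sketch: normalized contrast-versus-time evolution from a flash
        # thermography video stack (frames x rows x cols). Illustrative only; the
        # anomaly and reference regions are hypothetical pixel windows.
        import numpy as np

        frames = np.load("flash_ir_stack.npy")                     # hypothetical (T, H, W) array

        anomaly = frames[:, 100:110, 200:210].mean(axis=(1, 2))    # over suspected defect
        reference = frames[:, 20:60, 20:60].mean(axis=(1, 2))      # sound (defect-free) area

        # Normalized contrast: excess signal over the reference, relative to the
        # reference's rise above its pre-flash level.
        pre_flash = reference[:5].mean()
        contrast = (anomaly - reference) / (reference - pre_flash + 1e-12)

        peak_contrast = contrast.max()
        peak_time_frame = int(contrast.argmax())                   # frame index of peak contrast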

  1. Quantification issues of trace metal contaminants on silicon wafers by means of TOF-SIMS, ICP-MS, and TXRF

    NASA Astrophysics Data System (ADS)

    Rostam-Khani, P.; Hopstaken, M. J. P.; Vullings, P.; Noij, G.; O'Halloran, O.; Claassen, W.

    2004-06-01

    Measurement of surface metal contamination on silicon wafers is essential for yield enhancement in IC manufacturing. Vapor phase decomposition coupled with either inductively coupled plasma mass spectrometry (VPD-ICP-MS), or total reflection X-ray fluorescence (VPD-TXRF), TXRF and more recently time of flight secondary ion mass spectrometry (TOF-SIMS) are used to monitor surface metal contamination. These techniques complement each other in their respective strengths and weaknesses. For reliable and accurate quantification, so-called relative sensitivity factors (RSF) are required for TOF-SIMS analysis. For quantification purposes in VPD, the collection efficiency (CE) is important to ensure complete collection of contamination. A standard procedure has been developed that combines the determination of these RSFs as well as the collection efficiency using all the analytical techniques mentioned above. Therefore, sample wafers were intentionally contaminated and analyzed (by TOF-SIMS) directly after preparation. After VPD-ICP-MS, several scanned surfaces were analyzed again by TOF-SIMS. Comparing the intensities of the specific metals before and after the VPD-DC procedure on the scanned surface allows the determination of so-called removing efficiency (RE). In general, very good agreement was obtained comparing the four analytical techniques after updating the RSFs for TOF-SIMS. Progress has been achieved concerning the CE evaluation as well as determining the RSFs more precisely for TOF-SIMS.

  2. Using State Merging and State Pruning to Address the Path Explosion Problem Faced by Symbolic Execution

    DTIC Science & Technology

    2014-06-19

    urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5–7]. These...bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based...assume there are n different paths through the function foo. The program could potentially call the function foo a bounded number of times, resulting in n

  3. In Situ Warming and Soil Venting to Enhance the Biodegradation of JP-4 in Cold Climates: A Critical Study and Analysis

    DTIC Science & Technology

    1995-12-01

    1178-1180 (1991). Atlas, Ronald M. and Richard Bartha. Microbial Ecology: Fundamentals and Applications. 3d ed. Redwood City, CA: The Benjamin/Cummings...technique called bioventing. In cold climates, in situ bioremediation is limited to the summer when soil temperatures are sufficient to support microbial...actively warmed the soil -- warm water circulation and heat tape; the other passively warmed the plot with insulatory covers. Microbial respiration (O2

  4. Computer tomography of flows external to test models

    NASA Technical Reports Server (NTRS)

    Prikryl, I.; Vest, C. M.

    1982-01-01

    Computer tomographic techniques for reconstruction of three-dimensional aerodynamic density fields from interferograms recorded from several different viewing directions were studied. Emphasis is on the case in which an opaque object such as a test model in a wind tunnel obscures significant regions of the interferograms (projection data). A method called the Iterative Convolution Method (ICM), existing methods in which the field is represented by series expansions, and analysis of real experimental data in the form of aerodynamic interferograms are discussed.

  5. Predicting and controlling infectious disease epidemics using temporal networks

    PubMed Central

    Holme, Petter

    2013-01-01

    Infectious diseases can be considered to spread over social networks of people or animals. Mainly owing to the development of data recording and analysis techniques, an increasing amount of social contact data with time stamps has been collected in the last decade. Such temporal data capture the dynamics of social networks on a timescale relevant to epidemic spreading and can potentially lead to better ways to analyze, forecast, and prevent epidemics. However, they also call for extended analysis tools for network epidemiology, which has, to date, mostly viewed networks as static entities. We review recent results of network epidemiology for such temporal network data and discuss future developments. PMID:23513178

  6. An investigation of correlation between pilot scanning behavior and workload using stepwise regression analysis

    NASA Technical Reports Server (NTRS)

    Waller, M. C.

    1976-01-01

    An electro-optical device called an oculometer which tracks a subject's lookpoint as a time function has been used to collect data in a real-time simulation study of instrument landing system (ILS) approaches. The data describing the scanning behavior of a pilot during the instrument approaches have been analyzed by use of a stepwise regression analysis technique. A statistically significant correlation between pilot workload, as indicated by pilot ratings, and scanning behavior has been established. In addition, it was demonstrated that parameters derived from the scanning behavior data can be combined in a mathematical equation to provide a good representation of pilot workload.
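
    A forward-selection variant of stepwise regression like the one described above can be sketched with statsmodels; the scanning-behavior predictor names and the synthetic data below are hypothetical, and this is not the original study's analysis.

        # Minimal forward-selection sketch of stepwise regression: scanning-behavior
        # parameters (columns of X) are added while they significantly improve the fit
        # to the workload ratings y. Assumes statsmodels/pandas; data are hypothetical.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        X = pd.DataFrame(rng.normal(size=(40, 4)),
                         columns=["dwell_time", "fixations_per_s", "transitions", "pupil"])
        y = 2.0 * X["dwell_time"] - 1.5 * X["transitions"] + rng.normal(scale=0.5, size=40)

        selected, remaining = [], list(X.columns)
        while remaining:
            # p-value of each candidate when added to the current model
            pvals = {}
            for cand in remaining:
                model = sm.OLS(y, sm.add_constant(X[selected + [cand]])).fit()
                pvals[cand] = model.pvalues[cand]
            best = min(pvals, key=pvals.get)
            if pvals[best] > 0.05:          # stop when no candidate is significant
                break
            selected.append(best)
            remaining.remove(best)

        print("selected predictors:", selected)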

  7. Predicting and controlling infectious disease epidemics using temporal networks.

    PubMed

    Masuda, Naoki; Holme, Petter

    2013-01-01

    Infectious diseases can be considered to spread over social networks of people or animals. Mainly owing to the development of data recording and analysis techniques, an increasing amount of social contact data with time stamps has been collected in the last decade. Such temporal data capture the dynamics of social networks on a timescale relevant to epidemic spreading and can potentially lead to better ways to analyze, forecast, and prevent epidemics. However, they also call for extended analysis tools for network epidemiology, which has, to date, mostly viewed networks as static entities. We review recent results of network epidemiology for such temporal network data and discuss future developments.

  8. Innovative Techniques Simplify Vibration Analysis

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  9. How nonlinear optics can merge interferometry for high resolution imaging

    NASA Astrophysics Data System (ADS)

    Ceus, D.; Reynaud, F.; Tonello, A.; Delage, L.; Grossard, L.

    2017-11-01

    High resolution stellar interferometers are very powerful and efficient instruments for gaining a better knowledge of our Universe through the spatial coherence analysis of light. For this purpose, the optical fields collected by each telescope Ti are mixed together. From the interferometric pattern, two quantities, the contrast Cij and the phase φij, are extracted. These lead to the complex visibility Vij, with Vij = Cij exp(jφij). For each telescope doublet TiTj, it is possible to get a complex visibility Vij. The van Cittert-Zernike theorem gives a relationship between the intensity distribution of the observed object and the complex visibility. The combination of the acquired complex visibilities and a reconstruction algorithm allows image reconstruction. To avoid many technical difficulties related to infrared optics (component transmission, thermal noise, thermal cooling…), our team proposes to explore the possibility of using nonlinear optical techniques. This is a promising alternative technique for detecting infrared optical signals. In this way, we experimentally demonstrate that frequency conversion does not result in additional bias on the interferometric data supplied by a stellar interferometer. In this presentation, we report on wavelength conversion of the light collected by each telescope from the infrared domain to the visible. The interferometric pattern is observed in the visible domain with our so-called upconversion interferometer. Thereby, one can benefit from mature optical components mainly used in optical telecommunications (waveguides, couplers, multiplexers…) and efficient low-noise detection schemes up to the single-photon counting level.

  10. Multidisciplinary Responses to the Sexual Victimization of Children: Use of Control Phone Calls.

    PubMed

    Canavan, J William; Borowski, Christine; Essex, Stacy; Perkowski, Stefan

    2017-10-01

    This descriptive study addresses the question of the value of one-party consent phone calls regarding the sexual victimization of children. The authors reviewed 4 years of experience with children between the ages of 3 and 18 years selected for the control phone calls after a forensic interview by the New York State Police forensic interviewer. The forensic interviewer identified appropriate cases for control phone calls considering New York State law, the child's capacity to make the call, the presence of another person to make the call and a supportive residence. The control phone call process has been extremely effective forensically. Offenders choose to avoid trial by taking a plea bargain, thereby dramatically speeding up the criminal judicial and family court processes. An additional outcome of the control phone call is that the alleged offender's own words saved the child from the trauma of testifying in court. The control phone call reduced the need for children to repeat their stories to various interviewers. A successful control phone call gives the child a sense of vindication. This technique is the only technique that preserves the actual communication pattern between the alleged victim and the alleged offender. This can be of great value to the mental health professionals working with both the child and the alleged offender. Cautions must be considered regarding potential serious adverse effects on the child. The multidisciplinary team members must work together in the control phone call. The descriptive nature of this study did not provide the authors with adequate demographic data, a subject that should be addressed in a future prospective study.

  11. Source localization of narrow band signals in multipath environments, with application to marine mammals

    NASA Astrophysics Data System (ADS)

    Valtierra, Robert Daniel

    Passive acoustic localization has benefited from many major developments and has become an increasingly important focus point in marine mammal research. Several challenges still remain. This work seeks to address several of these challenges such as tracking the calling depths of baleen whales. In this work, data from an array of widely spaced Marine Acoustic Recording Units (MARUs) was used to achieve three dimensional localization by combining the methods Time Difference of Arrival (TDOA) and Direct-Reflected Time Difference of Arrival (DRTD) along with a newly developed autocorrelation technique. TDOA was applied to data for two dimensional (latitude and longitude) localization and depth was resolved using DRTD. Previously, DRTD had been limited to pulsed broadband signals, such as sperm whale or dolphin echolocation, where individual direct and reflected signals are separated in time. Due to the length of typical baleen whale vocalizations, individual multipath signal arrivals can overlap making time differences of arrival difficult to resolve. This problem can be solved using an autocorrelation, which can extract reflection information from overlapping signals. To establish this technique, a derivation was made to model the autocorrelation of a direct signal and its overlapping reflection. The model was exploited to derive performance limits allowing for prediction of the minimum resolvable direct-reflected time difference for a known signal type. The dependence on signal parameters (sweep rate, call duration) was also investigated. The model was then verified using both recorded and simulated data from two analysis cases for North Atlantic right whales (NARWs, Eubalaena glacialis) and humpback whales (Megaptera noveaengliae). The newly developed autocorrelation technique was then combined with DRTD and tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The combined DRTD-autocorrelation methods enabled calling depth and range estimations of a vocalizing NARW and humpback whale in two separate cases. The DRTD-autocorrelation method was then combined with TDOA to create a three dimensional track of a NARW in the Stellwagen Bank National Marine Sanctuary. Results from these experiments illustrated the potential of the combined methods to successfully resolve baleen calling depths in three dimensions.
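
    The autocorrelation step described above can be illustrated on a synthetic signal: a down-swept call overlapped by a delayed, attenuated reflection produces a secondary autocorrelation peak at the direct-reflected time difference. This is a simplified demonstration, not the paper's model or its performance-limit derivation; all parameters are hypothetical.

        # Minimal sketch: recover a direct-reflected time difference from overlapping
        # arrivals using the autocorrelation. Synthetic chirp only.
        import numpy as np
        from scipy.signal import chirp

        fs = 2000.0                                   # sample rate (Hz)
        t = np.arange(0, 1.0, 1 / fs)
        call = chirp(t, f0=400, f1=100, t1=1.0)       # down-swept call, 1 s long

        true_delay = 0.060                            # 60 ms surface-reflected path delay
        d = int(true_delay * fs)
        received = call.copy()
        received[d:] += 0.6 * call[:-d]               # overlapping, attenuated reflection

        ac = np.correlate(received, received, mode="full")[len(received) - 1:]
        ac /= ac[0]

        # Ignore the zero-lag peak region and pick the strongest secondary peak.
        min_lag = int(0.010 * fs)
        est = (min_lag + np.argmax(ac[min_lag:])) / fs
        print(f"estimated direct-reflected delay: {est * 1000:.1f} ms")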

  12. Improving Reliability of Spectrum Analysis for Software Quality Requirements Using TCM

    NASA Astrophysics Data System (ADS)

    Kaiya, Haruhiko; Tanigawa, Masaaki; Suzuki, Shunichi; Sato, Tomonori; Osada, Akira; Kaijiri, Kenji

    Quality requirements are scattered over a requirements specification, thus it is hard to measure and trace such quality requirements to validate the specification against stakeholders' needs. We proposed a technique called “spectrum analysis for quality requirements” which enabled analysts to sort a requirements specification to measure and track quality requirements in the specification. In the same way as a spectrum in optics, a quality spectrum of a specification shows a quantitative feature of the specification with respect to quality. Therefore, we can compare a specification of a system to another one with respect to quality. As a result, we can validate such a specification because we can check whether the specification has common quality features and identify its specific features relative to specifications of existing similar systems. However, our first spectrum analysis for quality requirements required a lot of effort and knowledge of a problem domain, and it was hard to reuse such knowledge to reduce the effort. We thus introduce domain knowledge called a term-characteristic map (TCM) to reuse the knowledge for our quality spectrum analysis. Through several experiments, we evaluate our spectrum analysis, and the main findings are as follows. First, we confirmed that specifications of similar systems have similar quality spectra. Second, results of spectrum analysis using TCM are objective, i.e., different analysts can generate almost the same spectra when they analyze the same specification.

  13. Methods for collection and analysis of aquatic biological and microbiological samples

    USGS Publications Warehouse

    Britton, L.J.; Greeson, P.E.

    1989-01-01

    The series of chapters on techniques describes methods used by the U.S. Geological Survey for planning and conducting water-resources investigations. The material is arranged under major subject headings called books and is further subdivided into sections and chapters. Book 5 is on laboratory analysis. Section A is on water. The unit of publication, the chapter, is limited to a narrow field of subject matter. "Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" is the fourth chapter to be published under Section A of Book 5. The chapter number includes the letter of the section.This chapter was prepared by several aquatic biologists and microbiologists of the U.S. Geological Survey to provide accurate and precise methods for the collection and analysis of aquatic biological and microbiological samples.Use of brand, firm, and trade names in this chapter is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey.This chapter supersedes "Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" edited by P.E. Greeson, T.A. Ehlke, G.A. Irwin, B.W. Lium, and K.V. Slack (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4, 1977) and also supersedes "A Supplement to-Methods for Collection and Analysis of Aquatic Biological and Microbiological Samples" by P.E. Greeson (U.S. Geological Survey Techniques of Water-Resources Investigations, Book 5, Chapter A4), Open-File Report 79-1279, 1979.

  14. The aggregated unfitted finite element method for elliptic problems

    NASA Astrophysics Data System (ADS)

    Badia, Santiago; Verdugo, Francesc; Martín, Alberto F.

    2018-07-01

    Unfitted finite element techniques are valuable tools in different applications where the generation of body-fitted meshes is difficult. However, these techniques are prone to severe ill-conditioning problems that obstruct the efficient use of iterative Krylov methods and, in consequence, hinder the practical usage of unfitted methods for realistic large scale applications. In this work, we present a technique that addresses such conditioning problems by constructing enhanced finite element spaces based on a cell aggregation technique. The presented method, called aggregated unfitted finite element method, is easy to implement, and can be used, in contrast to previous works, in Galerkin approximations of coercive problems with conforming Lagrangian finite element spaces. The mathematical analysis of the new method states that the condition number of the resulting linear system matrix scales as in standard finite elements for body-fitted meshes, without being affected by small cut cells, and that the method leads to the optimal finite element convergence order. These theoretical results are confirmed with 2D and 3D numerical experiments.

  15. Analysis of Compression Algorithm in Ground Collision Avoidance Systems (Auto-GCAS)

    NASA Technical Reports Server (NTRS)

    Schmalz, Tyler; Ryan, Jack

    2011-01-01

    Automatic Ground Collision Avoidance Systems (Auto-GCAS) utilizes Digital Terrain Elevation Data (DTED) stored onboard a plane to determine potential recovery maneuvers. Because of the current limitations of computer hardware on military airplanes such as the F-22 and F-35, the DTED must be compressed through a lossy technique called binary-tree tip-tilt. The purpose of this study is to determine the accuracy of the compressed data with respect to the original DTED. This study is mainly interested in the magnitude of the error between the two as well as the overall distribution of the errors throughout the DTED. By understanding how the errors of the compression technique are affected by various factors (topography, density of sampling points, sub-sampling techniques, etc.), modifications can be made to the compression technique resulting in better accuracy. This, in turn, would minimize unnecessary activation of A-GCAS during flight as well as maximizing its contribution to fighter safety.
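
    The error analysis described above (magnitude and distribution of the compression error against the original DTED) can be sketched as follows; the "compression" used here is a generic block-mean stand-in, not the binary-tree tip-tilt algorithm, and the tile file name is hypothetical.

        # Minimal sketch of the error analysis described above: compare original DTED
        # heights against a compressed/reconstructed version and summarize the error
        # magnitude and distribution. The "compression" here is a generic block-mean
        # stand-in, NOT the binary-tree tip-tilt algorithm.
        import numpy as np

        dted = np.load("dted_tile.npy")               # hypothetical (rows, cols) heights in m
        block = 4

        rows, cols = (dted.shape[0] // block) * block, (dted.shape[1] // block) * block
        d = dted[:rows, :cols]
        coarse = d.reshape(rows // block, block, cols // block, block).mean(axis=(1, 3))
        reconstructed = np.kron(coarse, np.ones((block, block)))   # expand back to full grid

        err = reconstructed - d
        print("max |error| (m):", np.abs(err).max())
        print("RMS error (m):  ", np.sqrt(np.mean(err ** 2)))
        print("error percentiles (m):", np.percentile(err, [1, 25, 50, 75, 99]))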

  16. Ranking the strategies for Indian medical tourism sector through the integration of SWOT analysis and TOPSIS method.

    PubMed

    Ajmera, Puneeta

    2017-10-09

    Purpose: Organizations have to evaluate their internal and external environments in this highly competitive world. Strengths, weaknesses, opportunities and threats (SWOT) analysis is a very useful technique which analyzes the strengths, weaknesses, opportunities and threats of an organization for taking strategic decisions and it also provides a foundation for the formulation of strategies. But the drawback of SWOT analysis is that it does not quantify the importance of individual factors affecting the organization and the individual factors are described in brief without weighing them. Because of this reason, SWOT analysis can be integrated with any multiple attribute decision-making (MADM) technique like the technique for order preference by similarity to ideal solution (TOPSIS), analytical hierarchy process, etc., to evaluate the best alternative among the available strategic alternatives. The paper aims to discuss these issues. Design/methodology/approach: In this study, SWOT analysis is integrated with a multicriteria decision-making technique called TOPSIS to rank different strategies for Indian medical tourism in order of priority. Findings: SO strategy (providing best facilitation and care to the medical tourists at par to developed countries) is the best strategy which matches with the four elements of S, W, O and T of SWOT matrix and 35 strategic indicators. Practical implications: This paper proposes a solution based on a combined SWOT analysis and TOPSIS approach to help the organizations to evaluate and select strategies. Originality/value: Creating a new technology or administering a new strategy always has some degree of resistance by employees. To minimize resistance, the author has used TOPSIS as it involves group thinking, requiring every manager of the organization to analyze and evaluate different alternatives and average measure of each parameter in final decision matrix.
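
    The TOPSIS ranking step can be written down compactly; the sketch below uses a hypothetical decision matrix and weights (not the paper's 35 indicators) to show the normalization, ideal/anti-ideal distances, and closeness coefficients.

        # Minimal TOPSIS sketch: rank strategy alternatives from a decision matrix.
        # Rows = strategies (e.g., SO, ST, WO, WT), columns = criteria; the numbers
        # and weights below are hypothetical.
        import numpy as np

        X = np.array([[7.0, 8.0, 6.0],     # SO
                      [6.0, 5.0, 7.0],     # ST
                      [5.0, 6.0, 4.0],     # WO
                      [4.0, 3.0, 5.0]])    # WT
        w = np.array([0.5, 0.3, 0.2])      # criterion weights (sum to 1)
        benefit = np.array([True, True, True])   # all criteria are benefit-type here

        R = X / np.sqrt((X ** 2).sum(axis=0))    # vector-normalized matrix
        V = R * w                                # weighted normalized matrix

        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

        d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
        d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
        closeness = d_minus / (d_plus + d_minus)

        for name, c in sorted(zip(["SO", "ST", "WO", "WT"], closeness),
                              key=lambda p: -p[1]):
            print(f"{name}: {c:.3f}")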

  17. Expediting Scientific Data Analysis with Reorganization of Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byna, Surendra; Wu, Kesheng

    2013-08-19

    Data producers typically optimize the layout of data files to minimize the write time. In most cases, data analysis tasks read these files in access patterns different from the write patterns, causing poor read performance. In this paper, we introduce Scientific Data Services (SDS), a framework for bridging the performance gap between writing and reading scientific data. SDS reorganizes data to match the read patterns of analysis tasks and enables transparent data reads from the reorganized data. We implemented an HDF5 Virtual Object Layer (VOL) plugin to redirect the HDF5 dataset read calls to the reorganized data. To demonstrate the effectiveness of SDS, we applied two parallel data organization techniques: a sort-based organization on plasma physics data and a transpose-based organization on mass spectrometry imaging data. We also extended the HDF5 data access API to allow selection of data based on their values through a query interface, called SDS Query. We evaluated the execution time in accessing various subsets of data through the existing HDF5 Read API and SDS Query. We showed that reading the reorganized data using SDS is up to 55X faster than reading the original data.
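
    The idea behind the sort-based reorganization can be illustrated with plain h5py: copy a dataset sorted by the field that value-based queries select on, so a range query becomes one contiguous read. This is a conceptual sketch only, not the SDS framework or its Query API; the file and dataset names are hypothetical.

        # Minimal sketch of sort-based reorganization: write a copy of the data sorted
        # by the query field, then serve value-range queries as contiguous slices.
        import numpy as np
        import h5py

        with h5py.File("particles.h5", "r") as f:
            energy = f["/plasma/energy"][...]          # field used in value-based queries
            order = np.argsort(energy)

            with h5py.File("particles_sorted.h5", "w") as g:
                for name in ("energy", "x", "y", "z"):
                    g.create_dataset(name, data=f["/plasma/" + name][...][order])

        # A query such as "energy between 1.0 and 2.0" now maps to one contiguous slice.
        with h5py.File("particles_sorted.h5", "r") as g:
            e = g["energy"][...]
            lo, hi = np.searchsorted(e, [1.0, 2.0])
            selected_x = g["x"][lo:hi]                 # contiguous read of matching records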

  18. Double differential neutron spectra generated by the interaction of a 12 MeV/nucleon 36S beam on a thick natCu target

    NASA Astrophysics Data System (ADS)

    Trinh, N. D.; Fadil, M.; Lewitowicz, M.; Ledoux, X.; Laurent, B.; Thomas, J.-C.; Clerc, T.; Desmezières, V.; Dupuis, M.; Madeline, A.; Dessay, E.; Grinyer, G. F.; Grinyer, J.; Menard, N.; Porée, F.; Achouri, L.; Delaunay, F.; Parlog, M.

    2018-07-01

    Double differential neutron spectra (energy, angle) originating from a thick natCu target bombarded by a 12 MeV/nucleon 36S16+ beam were measured by the activation method and the Time-of-flight technique at the Grand Accélérateur National d'Ions Lourds (GANIL). A neutron spectrum unfolding algorithm combining the SAND-II iterative method and Monte-Carlo techniques was developed for the analysis of the activation results that cover a wide range of neutron energies. It was implemented into a graphical user interface program, called GanUnfold. The experimental neutron spectra are compared to Monte-Carlo simulations performed using the PHITS and FLUKA codes.
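
    To give a feel for the unfolding step, the sketch below runs a simplified multiplicative iterative update (MLEM-style) that adjusts a trial spectrum until the calculated foil activities match the measured ones. It is illustrative only and is neither GanUnfold nor the exact SAND-II update; the response matrix and spectrum are synthetic.

        # Illustrative sketch only: a simplified multiplicative iterative unfolding
        # (MLEM-style), showing the general idea of adjusting a trial neutron spectrum
        # until calculated reaction rates match measured ones. NOT the GanUnfold code.
        import numpy as np

        rng = np.random.default_rng(1)
        n_groups, n_foils = 30, 6
        R = rng.random((n_foils, n_groups))        # response matrix: foil x energy group
        phi_true = np.exp(-np.linspace(0, 3, n_groups))
        A_meas = R @ phi_true                      # "measured" activities (noise-free here)

        phi = np.ones(n_groups)                    # flat initial guess spectrum
        for _ in range(200):
            A_calc = R @ phi
            correction = R.T @ (A_meas / A_calc) / R.sum(axis=0)
            phi *= correction

        # The activity residuals shrink even though the few-foil problem remains
        # under-determined for the full group structure.
        print("max relative activity residual:",
              np.max(np.abs(R @ phi - A_meas) / A_meas))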

  19. Encoding techniques for complex information structures in connectionist systems

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Two general information encoding techniques called relative position encoding and pattern similarity association are presented. They are claimed to be a convenient basis for the connectionist implementation of complex, short term information processing of the sort needed in common sense reasoning, semantic/pragmatic interpretation of natural language utterances, and other types of high level cognitive processing. The relationships of the techniques to other connectionist information-structuring methods, and also to methods used in computers, are discussed in detail. The rich inter-relationships of these other connectionist and computer methods are also clarified. The particular, simple forms are discussed that the relative position encoding and pattern similarity association techniques take in the author's own connectionist system, called Conposit, in order to clarify some issues and to provide evidence that the techniques are indeed useful in practice.

  20. Geological structure analysis in Central Java using travel time tomography technique of S waves

    NASA Astrophysics Data System (ADS)

    Palupi, I. R.; Raharjo, W.; Nurdian, S. W.; Giamboro, W. S.; Santoso, A.

    2016-11-01

    Java is one of the islands in Indonesia that is prone to earthquakes. South of Java, the Australian Plate moves toward the island and presses against it in a perpendicular direction. This plate movement formed a subduction zone and causes earthquakes. An earthquake is the release of energy due to the sudden movement of the plates. When an earthquake occurs, the released energy is recorded by seismometers in the waveform. The first wave recorded is called the P wave (primary) and the next wave is called the S wave (secondary). Both of these waves have different characteristics in terms of propagation and direction of movement. The S wave is associated with Rayleigh and Love waves, whose directions of movement are vertical and horizontal, respectively; subsurface imaging using the S wave tomography technique can describe the type of S wave passing through the medium. The variation of wave velocity under Central Java (the research area) ranges from -10% to 10% at depths of 20, 30 and 40 km, and the velocity decreases as the depth increases. The Moho discontinuity lies at a depth of 32 km under the crust, which indicates strong heterogeneity there.

  1. Order reduction for a model of marine bacteriophage evolution

    NASA Astrophysics Data System (ADS)

    Pagliarini, Silvia; Korobeinikov, Andrei

    2017-02-01

    A typical mechanistic model of viral evolution necessarily includes several time scales, which can differ by orders of magnitude. Such a diversity of time scales makes analysis of these models difficult. Reducing the order of a model is highly desirable when handling such a model. A typical approach applied to such slow-fast (or singularly perturbed) systems is the time-scale separation technique. Constructing the so-called quasi-steady-state approximation is the usual first step in applying the technique. While this technique is commonly applied, in some cases its straightforward application can lead to unsatisfactory results. In this paper we construct the quasi-steady-state approximation for a model of evolution of marine bacteriophages based on the Beretta-Kuang model. We show that for this particular model the quasi-steady-state approximation is able to produce only a qualitative but not a quantitative fit.
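
    The reduction step can be written schematically for a generic slow-fast system; this is only an illustration of the quasi-steady-state approximation, not the Beretta-Kuang bacteriophage model itself.

        % A generic slow-fast (singularly perturbed) system and its quasi-steady-state
        % approximation; schematic only, not the Beretta-Kuang model.
        \begin{align*}
          \frac{dx}{dt} &= f(x,y), &
          \varepsilon\,\frac{dy}{dt} &= g(x,y), \qquad 0 < \varepsilon \ll 1, \\
          \text{QSSA:}\quad 0 &= g(x,y) \;\Rightarrow\; y = h(x), &
          \frac{dx}{dt} &= f\bigl(x,\,h(x)\bigr).
        \end{align*}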

  2. Domain decomposition methods in aerodynamics

    NASA Technical Reports Server (NTRS)

    Venkatakrishnan, V.; Saltz, Joel

    1990-01-01

    Compressible Euler equations are solved for two-dimensional problems by a preconditioned conjugate gradient-like technique. An approximate Riemann solver is used to compute the numerical fluxes to second order accuracy in space. Two ways to achieve parallelism are tested, one which makes use of parallelism inherent in triangular solves and the other which employs domain decomposition techniques. The vectorization/parallelism in triangular solves is realized by the use of a reordering technique called wavefront ordering. This process involves the interpretation of the triangular matrix as a directed graph and the analysis of the data dependencies. It is noted that the factorization can also be done in parallel with the wavefront ordering. The performances of two ways of partitioning the domain, strips and slabs, are compared. Results on the Cray YMP are reported for an inviscid transonic test case. The performances of linear algebra kernels are also reported.
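
    Wavefront (level) ordering of a sparse triangular solve can be sketched directly from the dependency-graph view described above: each row's level is one more than the deepest level it depends on, and rows within a level can be solved in parallel. The sketch below uses a random matrix as a stand-in for the factor.

        # Minimal sketch of wavefront (level) ordering for a sparse lower-triangular
        # solve: rows are grouped into "wavefronts" whose unknowns depend only on
        # earlier wavefronts and can therefore be computed in parallel. Illustrative only.
        import numpy as np
        import scipy.sparse as sp

        # Random sparse unit lower-triangular matrix standing in for the factor L.
        n = 8
        A = sp.random(n, n, density=0.3, random_state=0).toarray()
        L = np.tril(A, k=-1) + np.eye(n)
        Ls = sp.csr_matrix(L)

        level = np.zeros(n, dtype=int)
        for i in range(n):
            cols = Ls.indices[Ls.indptr[i]:Ls.indptr[i + 1]]
            deps = cols[cols < i]                       # unknowns that row i depends on
            level[i] = 1 + level[deps].max() if len(deps) else 0

        for lvl in range(level.max() + 1):
            print(f"wavefront {lvl}: rows {np.where(level == lvl)[0].tolist()}")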

  3. Advanced smoke meter development survey and analysis

    NASA Technical Reports Server (NTRS)

    Pitz, R. W.; Penney, C. M.; Stanforth, C. M.; Shaffernocker, W. M.

    1984-01-01

    Ideal smoke meter characteristics are determined to provide a basis for evaluation of candidate systems. Five promising techniques are analyzed in detail to evaluate compliance with the practical smoke meter requirements. Four of the smoke measurement concepts are optical methods: Modulated Transmission (MODTRAN), Cross Beam Absorption Counter (CBAC), Laser Induced Incandescence (LIN), and Photoacoustic Spectroscopy (PAS). A rapid response filter instrument called a Tapered Element Oscillating Microbalance (TEOM) is also evaluated. For each technique, the theoretical principles are described, the expected performance is determined, and the advantages and disadvantages are discussed. The expected performance is evaluated against each of the smoke meter specifications, and the key questions for further study are given. The most promising smoke meter technique analyzed was MODTRAN, which is a variation on a direct transmission measurement. The soot-laden gas is passed through a transmission cell, and the gas pressure is modulated by a speaker.

  4. Autonomous facial recognition system inspired by human visual system based logarithmical image visualization technique

    NASA Astrophysics Data System (ADS)

    Wan, Qianwen; Panetta, Karen; Agaian, Sos

    2017-05-01

    Autonomous facial recognition systems are widely used in real-life applications, such as homeland border security, law enforcement identification and authentication, and video-based surveillance analysis. Issues like low image quality, non-uniform illumination, and variations in pose and facial expression can impair the performance of recognition systems. To address the non-uniform illumination challenge, we present a novel robust autonomous facial recognition system inspired by the human visual system and based on the so-called logarithmical image visualization technique. In this paper, the proposed method, for the first time, utilizes the logarithmical image visualization technique coupled with the local binary pattern to perform discriminative feature extraction for a facial recognition system. The Yale database, the Yale-B database and the AT&T database are used for computer simulation accuracy and efficiency testing. The extensive computer simulation demonstrates the method's efficiency, accuracy, and robustness to illumination variation for facial recognition.
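
    The pairing of a logarithmic brightness mapping with local binary pattern features can be sketched with scikit-image; the log mapping below is a generic stand-in for the paper's human-visual-system-based visualization step, and the image file name is hypothetical.

        # Minimal sketch: logarithmic brightness normalization (a generic stand-in for
        # the paper's logarithmical image visualization step) followed by local binary
        # pattern (LBP) histogram features. Assumes scikit-image and an RGB input image.
        import numpy as np
        from skimage import io, color
        from skimage.feature import local_binary_pattern

        img = io.imread("face.png")
        gray = color.rgb2gray(img)                    # values in [0, 1]

        # Log mapping compresses the dynamic range so dark regions keep texture detail.
        log_img = np.log1p(gray * 255.0) / np.log(256.0)

        P, R = 8, 1                                   # LBP neighborhood parameters
        lbp = local_binary_pattern(log_img, P, R, method="uniform")
        hist, _ = np.histogram(lbp, bins=np.arange(P + 3), density=True)
        # 'hist' is the per-image feature vector fed to a classifier (e.g., nearest
        # neighbor over training faces).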

  5. Integrating multi-omic features exploiting Chromosome Conformation Capture data.

    PubMed

    Merelli, Ivan; Tordini, Fabio; Drocco, Maurizio; Aldinucci, Marco; Liò, Pietro; Milanesi, Luciano

    2015-01-01

    The representation, integration, and interpretation of omic data is a complex task, in particular considering the huge amount of information that is daily produced in molecular biology laboratories all around the world. The reason is that sequencing data regarding expression profiles, methylation patterns, and chromatin domains is difficult to harmonize in a systems biology view, since genome browsers only allow coordinate-based representations, discarding functional clusters created by the spatial conformation of the DNA in the nucleus. In this context, recent progress in high throughput molecular biology techniques and bioinformatics has provided insights into chromatin interactions on a larger scale and offers a formidable support for the interpretation of multi-omic data. In particular, a novel sequencing technique called Chromosome Conformation Capture allows the analysis of the chromosome organization in the cell's natural state. When performed genome-wide, this technique is usually called Hi-C. Inspired by service applications such as Google Maps, we developed NuChart, an R package that integrates Hi-C data to describe the chromosomal neighborhood starting from the information about gene positions, with the possibility of mapping genomic features such as methylation patterns and histone modifications, along with expression profiles, onto the resulting graphs. In this paper we show the importance of the NuChart application for the integration of multi-omic data in a systems biology fashion, with particular interest in cytogenetic applications of these techniques. Moreover, we demonstrate how the integration of multi-omic data can provide useful information in understanding why genes are in certain specific positions inside the nucleus and how epigenetic patterns correlate with their expression.

  6. Multi-component separation and analysis of bat echolocation calls.

    PubMed

    DiCecco, John; Gaudette, Jason E; Simmons, James A

    2013-01-01

    The vast majority of animal vocalizations contain multiple frequency modulated (FM) components with varying amounts of non-linear modulation and harmonic instability. This is especially true of biosonar sounds where precise time-frequency templates are essential for neural information processing of echoes. Understanding the dynamic waveform design by bats and other echolocating animals may help to improve the efficacy of man-made sonar through biomimetic design. Bats are known to adapt their call structure based on the echolocation task, proximity to nearby objects, and density of acoustic clutter. To interpret the significance of these changes, a method was developed for component separation and analysis of biosonar waveforms. Techniques for imaging in the time-frequency plane are typically limited due to the uncertainty principle and interference cross terms. This problem is addressed by extending the use of the fractional Fourier transform to isolate each non-linear component for separate analysis. Once separated, empirical mode decomposition can be used to further examine each component. The Hilbert transform may then successfully extract detailed time-frequency information from each isolated component. This multi-component analysis method is applied to the sonar signals of four species of bats recorded in-flight by radiotelemetry along with a comparison of other common time-frequency representations.
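
    The Hilbert-transform step described above can be shown on a synthetic FM component: after isolation, the analytic signal gives the envelope and instantaneous frequency directly. Real use would first separate components (e.g., via the fractional Fourier transform and empirical mode decomposition) as in the paper; the parameters below are hypothetical.

        # Minimal sketch of the Hilbert-transform step: extract the instantaneous
        # frequency of an (already isolated) FM component. Synthetic signal only.
        import numpy as np
        from scipy.signal import hilbert, chirp

        fs = 250_000                                      # 250 kHz sampling rate
        t = np.arange(0, 0.003, 1 / fs)                   # 3 ms call
        component = chirp(t, f0=80_000, f1=30_000, t1=t[-1])   # down-swept FM component

        analytic = hilbert(component)
        envelope = np.abs(analytic)
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) / (2 * np.pi) * fs     # Hz, one sample shorter than t

        print(f"start {inst_freq[5]/1e3:.1f} kHz -> end {inst_freq[-5]/1e3:.1f} kHz")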

  7. The improving efficiency frontier of inpatient rehabilitation hospitals.

    PubMed

    Harrison, Jeffrey P; Kirkpatrick, Nicole

    2011-01-01

    This study uses a linear programming technique called data envelopment analysis to identify changes in the efficiency frontier of inpatient rehabilitation hospitals after implementation of the prospective payment system. The study provides a time series analysis of the efficiency frontier for inpatient rehabilitation hospitals in 2003 immediately after implementation of PPS and then again in 2006. Results indicate that the efficiency frontier of inpatient rehabilitation hospitals increased from 84% in 2003 to 85% in 2006. Similarly, an analysis of slack or inefficiency shows improvements in output efficiency over the study period. This clearly documents that efficiency in the inpatient rehabilitation hospital industry after implementation of PPS is improving. Hospital executives, health care policymakers, taxpayers, and other stakeholders benefit from studies that improve health care efficiency.
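
    A minimal sketch of the linear-programming core of data envelopment analysis is given below (an input-oriented CCR envelopment model solved with scipy.optimize.linprog). The hospital inputs/outputs are hypothetical numbers, and this is not the study's dataset or software.

        # Minimal sketch of input-oriented CCR data envelopment analysis using
        # scipy.optimize.linprog: each hospital's efficiency score is the optimal
        # theta of a small LP. Inputs/outputs below are hypothetical.
        import numpy as np
        from scipy.optimize import linprog

        # rows = hospitals (DMUs); X = inputs (beds, FTE staff), Y = outputs (discharges)
        X = np.array([[120.0, 300.0], [90.0, 260.0], [150.0, 400.0], [80.0, 210.0]])
        Y = np.array([[2100.0], [1900.0], [2300.0], [1750.0]])
        n, m, s = X.shape[0], X.shape[1], Y.shape[1]

        for o in range(n):
            c = np.r_[1.0, np.zeros(n)]                     # minimize theta
            # inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
            A_in = np.c_[-X[o].reshape(m, 1), X.T]
            # outputs: -sum_j lambda_j * y_rj <= -y_ro
            A_out = np.c_[np.zeros((s, 1)), -Y.T]
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.r_[np.zeros(m), -Y[o]]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(0, None)] * (n + 1), method="highs")
            print(f"hospital {o}: efficiency = {res.x[0]:.3f}")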

  8. CleAir Monitoring System for Particulate Matter: A Case in the Napoleonic Museum in Rome

    PubMed Central

    Bonacquisti, Valerio; Di Michele, Marta; Frasca, Francesca; Chianese, Angelo; Siani, Anna Maria

    2017-01-01

    Monitoring the air particulate concentration both outdoors and indoors has become an increasingly relevant issue over the past few decades. An innovative, fully automatic monitoring system called CleAir is presented. The system goes beyond the traditional technique (gravimetric analysis) by allowing a double monitoring approach: the traditional gravimetric analysis as well as optical spectroscopic analysis of the scattering on the same filters in steady-state conditions. The experimental data are interpreted in terms of light percolation through highly scattering matter by means of the stretched exponential evolution. CleAir has been applied to investigate the daily distribution of particulate matter within the Napoleonic Museum in Rome as a test case. PMID:28892016

  9. Pse-Analysis: a Python package for DNA/RNA and protein/peptide sequence analysis based on pseudo components and kernel methods.

    PubMed

    Liu, Bin; Wu, Hao; Zhang, Deyuan; Wang, Xiaolong; Chou, Kuo-Chen

    2017-02-21

    To expedite the pace of genome/proteome analysis, we have developed a Python package called Pse-Analysis. The powerful package can automatically complete the following five procedures: (1) sample feature extraction, (2) optimal parameter selection, (3) model training, (4) cross validation, and (5) evaluating prediction quality. All the work a user needs to do is to input a benchmark dataset along with the query biological sequences concerned. Based on the benchmark dataset, Pse-Analysis will automatically construct an ideal predictor, followed by yielding the predicted results for the submitted query samples. All the aforementioned tedious jobs can be automatically done by the computer. Moreover, the multiprocessing technique was adopted to enhance computational speed by about 6-fold. The Pse-Analysis Python package is freely accessible to the public at http://bioinformatics.hitsz.edu.cn/Pse-Analysis/, and can be directly run on Windows, Linux, and Unix.

  10. An inexpensive active optical remote sensing instrument for assessing aerosol distributions.

    PubMed

    Barnes, John E; Sharma, Nimmi C P

    2012-02-01

    Air quality studies on a broad variety of topics from health impacts to source/sink analyses, require information on the distributions of atmospheric aerosols over both altitude and time. An inexpensive, simple to implement, ground-based optical remote sensing technique has been developed to assess aerosol distributions. The technique, called CLidar (Charge Coupled Device Camera Light Detection and Ranging), provides aerosol altitude profiles over time. In the CLidar technique a relatively low-power laser transmits light vertically into the atmosphere. The transmitted laser light scatters off of air molecules, clouds, and aerosols. The entire beam from ground to zenith is imaged using a CCD camera and wide-angle (100 degree) optics which are a few hundred meters from the laser. The CLidar technique is optimized for low altitude (boundary layer and lower troposphere) measurements where most aerosols are found and where many other profiling techniques face difficulties. Currently the technique is limited to nighttime measurements. Using the CLidar technique aerosols may be mapped over both altitude and time. The instrumentation required is portable and can easily be moved to locations of interest (e.g. downwind from factories or power plants, near highways). This paper describes the CLidar technique, implementation and data analysis and offers specifics for users wishing to apply the technique for aerosol profiles.

  11. Identification of immunoglobulins using Chou's pseudo amino acid composition with feature selection technique.

    PubMed

    Tang, Hua; Chen, Wei; Lin, Hao

    2016-04-01

    Immunoglobulins, also called antibodies, are a group of cell surface proteins which are produced by the immune system in response to the presence of a foreign substance (called an antigen). They play key roles in many medical, diagnostic and biotechnological applications. Correct identification of immunoglobulins is crucial to the comprehension of humoral immune function. With the avalanche of protein sequences identified in the postgenomic age, it is highly desirable to develop computational methods to timely identify immunoglobulins. In view of this, we designed a predictor called "IGPred" by formulating protein sequences with the pseudo amino acid composition into which nine physiochemical properties of amino acids were incorporated. Jackknife cross-validated results showed that 96.3% of immunoglobulins and 97.5% of non-immunoglobulins can be correctly predicted, indicating that IGPred holds very high potential to become a useful tool for antibody analysis. For the convenience of most experimental scientists, a web-server for IGPred was established at http://lin.uestc.edu.cn/server/IGPred. We believe that the web-server will become a powerful tool to study immunoglobulins and to guide related experimental validations.
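
    A simplified version of the pseudo amino acid composition used above can be sketched with a single property scale (Kyte-Doolittle hydropathy) instead of the nine physiochemical properties incorporated by IGPred; the example sequence and the lambda/weight parameters are hypothetical.

        # Simplified sketch of Chou's type-1 pseudo amino acid composition (PseAAC)
        # using one property (Kyte-Doolittle hydropathy) only; illustrative, not IGPred.
        import numpy as np

        AA = "ACDEFGHIKLMNPQRSTVWY"
        hydropathy = {"A": 1.8, "C": 2.5, "D": -3.5, "E": -3.5, "F": 2.8, "G": -0.4,
                      "H": -3.2, "I": 4.5, "K": -3.9, "L": 3.8, "M": 1.9, "N": -3.5,
                      "P": -1.6, "Q": -3.5, "R": -4.5, "S": -0.8, "T": -0.7, "V": 4.2,
                      "W": -0.9, "Y": -1.3}
        vals = np.array([hydropathy[a] for a in AA])
        norm = {a: v for a, v in zip(AA, (vals - vals.mean()) / vals.std())}  # standardize

        def pse_aac(seq, lam=5, w=0.05):
            """Return the (20 + lam)-dimensional type-1 PseAAC feature vector."""
            freqs = np.array([seq.count(a) for a in AA], dtype=float) / len(seq)
            theta = np.array([
                np.mean([(norm[seq[i + k]] - norm[seq[i]]) ** 2
                         for i in range(len(seq) - k)])
                for k in range(1, lam + 1)])
            denom = freqs.sum() + w * theta.sum()
            return np.concatenate([freqs / denom, w * theta / denom])

        print(pse_aac("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ").round(3))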

  12. MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.

    PubMed

    Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M

    2002-05-30

    Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The Mathworks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.

  13. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and the stability analysis of the estimator is given. Theoretical analysis and simulation experimental results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer spectra. The derivative spectra of the beer and the marzipan are used to build the calibration model using partial least squares (PLS) modeling. The results show that the PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
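
    The comparison baseline mentioned above (Savitzky-Golay derivative preprocessing followed by PLS calibration) can be sketched as follows on synthetic spectra; the SPSE estimator itself is not reproduced here, and the wavelength grid, window length and component count are hypothetical choices.

        # Minimal sketch: Savitzky-Golay first-derivative preprocessing of spectra
        # followed by PLS calibration and cross-validation. Synthetic data only.
        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        wavelengths = np.linspace(1100, 2500, 400)           # nm, hypothetical NIR grid
        conc = rng.uniform(0, 10, 60)                        # analyte concentrations
        peak = np.exp(-0.5 * ((wavelengths - 1700) / 40) ** 2)
        spectra = conc[:, None] * peak + rng.normal(scale=0.02, size=(60, 400))

        # First-derivative spectra (window length and polynomial order are tuning choices).
        d1 = savgol_filter(spectra, window_length=15, polyorder=3, deriv=1, axis=1)

        pls = PLSRegression(n_components=3)
        scores = cross_val_score(pls, d1, conc, cv=5, scoring="r2")
        print("cross-validated R^2:", scores.round(3))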

  14. Methods and materials, for locating and studying spotted owls.

    Treesearch

    Eric D. Forsman

    1983-01-01

    Nocturnal calling surveys are the most effective and most frequently used technique for locating spotted owls. Roosts and general nest locations may be located during the day by calling in suspected roost or nest areas. Specific nest trees are located by: (1) baiting with a live mouse to induce owls to visit the nest, (2) calling in suspected nest areas to stimulate...

  15. What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?

    ERIC Educational Resources Information Center

    Cushion, Steve

    2006-01-01

    We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…

  16. A knowledge based system for scientific data visualization

    NASA Technical Reports Server (NTRS)

    Senay, Hikmet; Ignatius, Eve

    1992-01-01

    A knowledge-based system, called visualization tool assistant (VISTA), which was developed to assist scientists in the design of scientific data visualization techniques, is described. The system derives its knowledge from several sources which provide information about data characteristics, visualization primitives, and effective visual perception. The design methodology employed by the system is based on a sequence of transformations which decomposes a data set into a set of data partitions, maps this set of partitions to visualization primitives, and combines these primitives into a composite visualization technique design. Although the primary function of the system is to generate an effective visualization technique design for a given data set by using principles of visual perception, the system also allows users to interactively modify the design, and renders the resulting image using a variety of rendering algorithms. The current version of the system primarily supports visualization techniques having applicability in earth and space sciences, although it may easily be extended to include other techniques useful in other disciplines such as computational fluid dynamics, finite-element analysis and medical imaging.

  17. Post-coronagraphic tip-tilt sensing for vortex phase masks: The QACITS technique

    NASA Astrophysics Data System (ADS)

    Huby, E.; Baudoz, P.; Mawet, D.; Absil, O.

    2015-12-01

    Context. Small inner working angle coronagraphs, such as the vortex phase mask, are essential to exploit the full potential of ground-based telescopes in the context of exoplanet detection and characterization. However, the drawback of this attractive feature is a high sensitivity to pointing errors, which degrades the performance of the coronagraph. Aims: We propose a tip-tilt retrieval technique based on the analysis of the final coronagraphic image, hereafter called Quadrant Analysis of Coronagraphic Images for Tip-tilt Sensing (QACITS). Methods: Under the assumption of small phase aberrations, we show that the behavior of the vortex phase mask can be simply described from the entrance pupil to the Lyot stop plane with Zernike polynomials. This convenient formalism is used to establish the theoretical basis of the QACITS technique. We performed simulations to demonstrate the validity and limits of the technique, including the case of a centrally obstructed pupil. Results: The QACITS technique principle is validated with experimental results in the case of an unobstructed circular aperture, as well as simulations in the presence of a central obstruction. The typical configuration of the Keck telescope (24% central obstruction) has been simulated with additional high-order aberrations. In these conditions, our simulations show that the QACITS technique is still adapted to centrally obstructed pupils and performs tip-tilt retrieval with a precision of 5 × 10⁻² λ/D when wavefront errors amount to λ/14 rms and 10⁻² λ/D for λ/70 rms errors (with λ the wavelength and D the pupil diameter). Conclusions: We have developed and demonstrated a tip-tilt sensing technique for vortex coronagraphs. The implementation of the QACITS technique is based on the analysis of the scientific image and does not require any modification of the original setup. Current facilities equipped with a vortex phase mask can thus directly benefit from this technique to improve the contrast performance close to the axis.
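
    A toy numerical sketch of the quadrant-differential idea behind QACITS: the tip-tilt is read off the flux asymmetry between halves of the post-coronagraphic image after a linear calibration. The image model, pixel scale and calibration step are assumptions made for illustration only; the published estimator is derived from the vortex/Zernike formalism summarized above.

        import numpy as np

        def quadrant_asymmetry(img):
            """Normalized flux differences between the image halves along x and y."""
            total = img.sum()
            ny, nx = img.shape
            dx = (img[:, nx // 2:].sum() - img[:, :nx // 2].sum()) / total
            dy = (img[ny // 2:, :].sum() - img[:ny // 2, :].sum()) / total
            return dx, dy

        def toy_image(tx, ty, pix_per_lod=10.0):
            """Stand-in coronagraphic residual whose centroid shifts with the tip-tilt (in lambda/D)."""
            y, x = np.mgrid[-32.0:32.0, -32.0:32.0]
            return np.exp(-((x - pix_per_lod * tx) ** 2 + (y - pix_per_lod * ty) ** 2) / 50.0)

        # Calibrate the (assumed linear) response with a known offset, then estimate an unknown one
        ref = 0.05
        gain = quadrant_asymmetry(toy_image(ref, 0.0))[0] / ref
        dx, dy = quadrant_asymmetry(toy_image(0.03, -0.02))
        print("estimated tip-tilt (lambda/D): %.3f, %.3f" % (dx / gain, dy / gain))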

  18. Calling behavior of blue and fin whales off California

    NASA Astrophysics Data System (ADS)

    Oleson, Erin Marie

    Passive acoustic monitoring is an effective means for evaluating cetacean presence in remote regions and over long time periods, and may become an important component of cetacean abundance surveys. To use passive acoustic recordings for abundance estimation, an understanding of the behavioral ecology of cetacean calling is crucial. In this dissertation, I develop a better understanding of how blue (Balaenoptera musculus) and fin (B. physalus ) whales use sound with the goal of evaluating passive acoustic techniques for studying their populations. Both blue and fin whales produce several different call types, though the behavioral and environmental context of these calls have not been widely investigated. To better understand how calling is used by these whales off California I have employed both new technologies and traditional techniques, including acoustic recording tags, continuous long-term autonomous acoustic recordings, and simultaneous shipboard acoustic and visual surveys. The outcome of these investigations has led to several conclusions. The production of blue whale calls varies with sex, behavior, season, location, and time of day. Each blue whale call type has a distinct behavioral context, including a male-only bias in the production of song, a call type thought to function in reproduction, and the production of some calls by both sexes. Long-term acoustic records, when interpreted using all call types, provide a more accurate measure of the local seasonal presence of whales, and how they use the region annually, seasonally and daily. The relative occurrence of different call types may indicate prime foraging habitat and the presence of different segments of the population. The proportion of animals heard calling changes seasonally and geographically relative to the number seen, indicating the calibration of acoustic and visual surveys is complex and requires further study on the motivations behind call production and the behavior of calling whales. These findings will play a role in the future development of acoustic census methods and habitat studies for these species, and will provide baseline information for the determination of anthropogenic impacts on these populations.

  19. Chapter 7. Cloning and analysis of natural product pathways.

    PubMed

    Gust, Bertolt

    2009-01-01

    The identification of gene clusters of natural products has led to an enormous wealth of information about their biosynthesis and its regulation, and about self-resistance mechanisms. Well-established routine techniques are now available for the cloning and sequencing of gene clusters. The subsequent functional analysis of the complex biosynthetic machinery requires efficient genetic tools for manipulation. Until recently, techniques for the introduction of defined changes into Streptomyces chromosomes were very time-consuming. In particular, manipulation of large DNA fragments has been challenging due to the absence of suitable restriction sites for restriction- and ligation-based techniques. The homologous recombination approach called recombineering (referred to as Red/ET-mediated recombination in this chapter) has greatly facilitated targeted genetic modifications of complex biosynthetic pathways from actinomycetes by eliminating many of the time-consuming and labor-intensive steps. This chapter describes techniques for the cloning and identification of biosynthetic gene clusters, for the generation of gene replacements within such clusters, for the construction of integrative library clones and their expression in heterologous hosts, and for the assembly of entire biosynthetic gene clusters from the inserts of individual library clones. A systematic approach toward insertional mutation of a complete Streptomyces genome is shown by the use of an in vitro transposon mutagenesis procedure.

  20. Design of a 3D Navigation Technique Supporting VR Interaction

    NASA Astrophysics Data System (ADS)

    Boudoin, Pierre; Otmane, Samir; Mallem, Malik

    2008-06-01

    Multimodality is a powerful paradigm for increasing the realism and ease of interaction in Virtual Environments (VEs). In particular, the search for new metaphors and techniques for 3D interaction adapted to the navigation task is an important stage for the realization of future 3D interaction systems that support multimodality, in order to increase efficiency and usability. In this paper we propose a new multimodal 3D interaction model called Fly Over. This model is especially devoted to the navigation task. We present a qualitative comparison between Fly Over and a classical navigation technique called gaze-directed steering. The results from a preliminary evaluation on the IBISC semi-immersive Virtual Reality/Augmented Reality EVR@ platform show that Fly Over is a user-friendly and efficient navigation technique.

  1. Prediction of Hexaconazole Concentration in the Top Most Layer of Oil Palm Plantation Soil Using Exploratory Data Analysis (EDA)

    PubMed Central

    Maznah, Zainol; Halimah, Muhamad; Shitan, Mahendran; Kumar Karmokar, Provash; Najwa, Sulaiman

    2017-01-01

    Ganoderma boninense is a fungus that can affect oil palm trees and cause a serious disease called basal stem rot (BSR). This disease causes the death of more than 80% of oil palm trees midway through their economic life, and hexaconazole is one of the fungicides that can control this fungus. Hexaconazole can be applied by the soil drenching method, and it is of interest to know the concentration of the residue in the soil after treatment with respect to time. Hence, a field study was conducted in order to determine the actual concentration of hexaconazole in soil. In the present paper, a new approach that can be used to predict the concentration of pesticides in the soil is proposed. The statistical analysis revealed that Exploratory Data Analysis (EDA) techniques would be appropriate in this study. The EDA techniques were used to fit a robust resistant model and predict the concentration of the residue in the topmost layer of the soil. PMID:28060816

  2. Prediction of Hexaconazole Concentration in the Top Most Layer of Oil Palm Plantation Soil Using Exploratory Data Analysis (EDA).

    PubMed

    Maznah, Zainol; Halimah, Muhamad; Shitan, Mahendran; Kumar Karmokar, Provash; Najwa, Sulaiman

    2017-01-01

    Ganoderma boninense is a fungus that can affect oil palm trees and cause a serious disease called basal stem rot (BSR). This disease causes the death of more than 80% of oil palm trees midway through their economic life, and hexaconazole is one of the fungicides that can control this fungus. Hexaconazole can be applied by the soil drenching method, and it is of interest to know the concentration of the residue in the soil after treatment with respect to time. Hence, a field study was conducted in order to determine the actual concentration of hexaconazole in soil. In the present paper, a new approach that can be used to predict the concentration of pesticides in the soil is proposed. The statistical analysis revealed that Exploratory Data Analysis (EDA) techniques would be appropriate in this study. The EDA techniques were used to fit a robust resistant model and predict the concentration of the residue in the topmost layer of the soil.
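
    The abstracts above name EDA's robust resistant fitting without spelling out the estimator; a common EDA choice is Tukey's three-group (median-median) resistant line, sketched below on hypothetical residue-versus-time data. The data values and the log-linear form are illustrative assumptions, not the study's measurements or its exact model.

        import numpy as np

        def resistant_line(x, y):
            """Tukey's three-group resistant line: slope and intercept from group medians."""
            order = np.argsort(x)
            x, y = np.asarray(x, dtype=float)[order], np.asarray(y, dtype=float)[order]
            n = len(x)
            thirds = [slice(0, n // 3), slice(n // 3, 2 * n // 3), slice(2 * n // 3, n)]
            xm = [np.median(x[s]) for s in thirds]
            ym = [np.median(y[s]) for s in thirds]
            slope = (ym[2] - ym[0]) / (xm[2] - xm[0])
            intercept = np.median(y - slope * x)
            return slope, intercept

        # Hypothetical hexaconazole residue (mg/kg) in the topmost soil layer vs days after drenching
        days    = np.array([0, 3, 7, 14, 21, 30, 45, 60, 75, 90])
        residue = np.array([4.8, 4.1, 3.5, 2.6, 2.1, 1.5, 0.9, 0.6, 0.4, 0.3])

        slope, intercept = resistant_line(days, np.log(residue))
        print(f"fitted decline: log(residue) ~ {intercept:.2f} + {slope:.3f} * days")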

  3. Optical transmission testing based on asynchronous sampling techniques: images analysis containing chromatic dispersion using convolutional neural network

    NASA Astrophysics Data System (ADS)

    Mrozek, T.; Perlicki, K.; Tajmajer, T.; Wasilewski, P.

    2017-08-01

    The article presents an image analysis method, based on the asynchronous delay tap sampling (ADTS) technique, which is used for simultaneous monitoring of various impairments occurring in the physical layer of the optical network. The ADTS method enables visualization of the optical signal in the form of characteristics (so-called phase portraits) that change their shape under the influence of impairments such as chromatic dispersion, polarization mode dispersion and ASE noise. Using this method, a simulation model was built with OptSim 4.0. After the simulation study, data were obtained in the form of images that were further analyzed using a convolutional neural network algorithm. The main goal of the study was to train a convolutional neural network to recognize the selected impairment (distortion), then to test its accuracy and estimate the impairment for the selected set of test images. The input data consisted of processed binary images in the form of two-dimensional matrices indexed by pixel position. This article focuses only on the analysis of images containing chromatic dispersion.
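
    A minimal PyTorch sketch of the kind of convolutional classifier described above, trained here on random stand-in matrices; the input size, number of dispersion classes, network depth and the data are illustrative assumptions rather than the network or OptSim-generated phase portraits used in the study.

        import torch
        import torch.nn as nn

        class PortraitCNN(nn.Module):
            def __init__(self, n_classes=4):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.classifier = nn.Linear(32 * 16 * 16, n_classes)

            def forward(self, x):
                x = self.features(x)
                return self.classifier(x.flatten(1))

        model = PortraitCNN()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()

        # One illustrative training step on random stand-in data (batch of 8 binary 64x64 images)
        images = (torch.rand(8, 1, 64, 64) > 0.5).float()
        labels = torch.randint(0, 4, (8,))          # e.g. four chromatic-dispersion levels (assumed)
        loss = loss_fn(model(images), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        print("training loss:", loss.item())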

  4. Permanent Scatterer InSAR Analysis and Validation in the Gulf of Corinth.

    PubMed

    Elias, Panagiotis; Kontoes, Charalabos; Papoutsis, Ioannis; Kotsis, Ioannis; Marinou, Aggeliki; Paradissis, Dimitris; Sakellariou, Dimitris

    2009-01-01

    The Permanent Scatterers Interferometric SAR technique (PSInSAR) is a method that accurately estimates the near-vertical terrain deformation rates, of the order of ∼1 mm year⁻¹, overcoming the physical and technical restrictions of classic InSAR. In this paper the method is strengthened by creating a robust processing chain, incorporating PSInSAR analysis together with algorithmic adaptations for Permanent Scatterer Candidates (PSCs) and Permanent Scatterers (PSs) selection. The processing chain, called PerSePHONE, was applied and validated in the geophysically active area of the Gulf of Corinth. The analysis indicated a clear subsidence trend in the north-eastern part of the gulf, with the maximum deformation of ∼2.5 mm year⁻¹ occurring in the region north of the Gulf of Alkyonides. The validity of the results was assessed against geophysical/geological and geodetic studies conducted in the area, which include continuous seismic profiling data and GPS height measurements. All these observations converge to the same deformation pattern as the one derived by the PSInSAR technique.

  5. Permanent Scatterer InSAR Analysis and Validation in the Gulf of Corinth

    PubMed Central

    Elias, Panagiotis; Kontoes, Charalabos; Papoutsis, Ioannis; Kotsis, Ioannis; Marinou, Aggeliki; Paradissis, Dimitris; Sakellariou, Dimitris

    2009-01-01

    The Permanent Scatterers Interferometric SAR technique (PSInSAR) is a method that accurately estimates the near-vertical terrain deformation rates, of the order of ∼1 mm year⁻¹, overcoming the physical and technical restrictions of classic InSAR. In this paper the method is strengthened by creating a robust processing chain, incorporating PSInSAR analysis together with algorithmic adaptations for Permanent Scatterer Candidates (PSCs) and Permanent Scatterers (PSs) selection. The processing chain, called PerSePHONE, was applied and validated in the geophysically active area of the Gulf of Corinth. The analysis indicated a clear subsidence trend in the north-eastern part of the gulf, with the maximum deformation of ∼2.5 mm year⁻¹ occurring in the region north of the Gulf of Alkyonides. The validity of the results was assessed against geophysical/geological and geodetic studies conducted in the area, which include continuous seismic profiling data and GPS height measurements. All these observations converge to the same deformation pattern as the one derived by the PSInSAR technique. PMID:22389587

  6. Seismic Analysis Capability in NASTRAN

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.

    1984-01-01

    Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but this technique also applies to a broad class of acceleration excitations which are applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis. They are: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing RIGID Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck to perform any of the four options; the options are invoked by setting parameter values in the bulk data.

  7. Application of structured analysis to a telerobotic system

    NASA Technical Reports Server (NTRS)

    Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven

    1990-01-01

    The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.

  8. Beluga whale, Delphinapterus leucas, vocalizations from the Churchill River, Manitoba, Canada.

    PubMed

    Chmelnitsky, Elly G; Ferguson, Steven H

    2012-06-01

    Classification of animal vocalizations is often done by a human observer using aural and visual analysis but more efficient, automated methods have also been utilized to reduce bias and increase reproducibility. Beluga whale, Delphinapterus leucas, calls were described from recordings collected in the summers of 2006-2008, in the Churchill River, Manitoba. Calls (n=706) were classified based on aural and visual analysis, and call characteristics were measured; calls were separated into 453 whistles (64.2%; 22 types), 183 pulsed/noisy calls (25.9%; 15 types), and 70 combined calls (9.9%; seven types). Measured parameters varied within each call type but less variation existed in pulsed and noisy call types and some combined call types than in whistles. A more efficient and repeatable hierarchical clustering method was applied to 200 randomly chosen whistles using six call characteristics as variables; twelve groups were identified. Call characteristics varied less in cluster analysis groups than in whistle types described by visual and aural analysis and results were similar to the whistle contours described. This study provided the first description of beluga calls in Hudson Bay and using two methods provides more robust interpretations and an assessment of appropriate methods for future studies.
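
    A sketch of the hierarchical-clustering step applied to six whistle measurements, using SciPy's agglomerative clustering on standardized features; the feature names, the randomly generated stand-in data and the Ward linkage / 12-cluster cut are assumptions for illustration, not the study's measured whistles or its exact settings.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.stats import zscore

        rng = np.random.default_rng(1)
        # Columns (assumed): start freq, end freq, min freq, max freq, duration, inflection points
        whistles = np.column_stack([
            rng.normal(4.0, 1.0, 200),   # kHz
            rng.normal(6.0, 1.5, 200),   # kHz
            rng.normal(3.5, 0.8, 200),   # kHz
            rng.normal(7.0, 1.2, 200),   # kHz
            rng.normal(0.8, 0.3, 200),   # s
            rng.poisson(2, 200).astype(float),
        ])

        Z = linkage(zscore(whistles, axis=0), method="ward")
        groups = fcluster(Z, t=12, criterion="maxclust")   # cut the tree into 12 groups, echoing the study
        print("whistles per group:", np.bincount(groups)[1:])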

  9. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  10. Efficient morse decompositions of vector fields.

    PubMed

    Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene

    2008-01-01

    Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices, which are sensitive to noise and errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretations. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful for applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational costs. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach to constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces, including engine simulation data sets.
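
    A small sketch of the graph machinery behind a Morse decomposition: treat the tau-map as a directed graph over mesh cells, take strongly connected components as candidate Morse sets, and condense them into the Morse connection graph. The tiny hand-written cell mapping is an illustrative assumption, not output of the algorithm described in the paper.

        import networkx as nx

        # Outer flow-map: which cells the image of each cell overlaps after time tau (assumed)
        tau_map = {
            0: [1], 1: [2], 2: [0],        # cells 0-2 form a recurrent set (e.g. a periodic orbit)
            3: [4], 4: [5], 5: [5],        # gradient-like cells flowing into a sink at cell 5
            6: [0, 3],                     # a cell whose image straddles both regions
        }

        G = nx.DiGraph([(src, dst) for src, dsts in tau_map.items() for dst in dsts])
        morse_sets = [c for c in nx.strongly_connected_components(G)
                      if len(c) > 1 or any(G.has_edge(v, v) for v in c)]
        mcg = nx.condensation(G)           # the Morse connection graph (one node per SCC)
        print("Morse sets:", morse_sets)
        print("MCG edges:", list(mcg.edges()))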

  11. Terrestrial Radiodetermination Performance and Cost

    DOT National Transportation Integrated Search

    1977-09-01

    The report summarizes information gathered during a study of the application of electronic techniques to geographical position determination on land and on inland waterways. Systems incorporating such techniques have been called terrestrial radiodete...

  12. Visualization Techniques for Computer Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Justin M; Steed, Chad A; Patton, Robert M

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  13. Portable Electronic Nose Based on Electrochemical Sensors for Food Quality Assessment

    PubMed Central

    Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek

    2017-01-01

    The steady increase in global consumption puts a strain on agriculture and might lead to a decrease in food quality. Currently used techniques of food analysis are often labour-intensive and time-consuming and require extensive sample preparation. For that reason, there is a demand for novel methods that could be used for rapid food quality assessment. A technique based on the use of an array of chemical sensors for holistic analysis of the sample’s headspace is called electronic olfaction. In this article, a prototype of a portable, modular electronic nose intended for food analysis is described. Using the SVM method, it was possible to classify samples of poultry meat based on shelf-life with 100% accuracy, and also samples of rapeseed oil based on the degree of thermal degradation with 100% accuracy. The prototype was also used to detect adulterations of extra virgin olive oil with rapeseed oil with 82% overall accuracy. Due to the modular design, the prototype offers the advantages of solutions targeted for analysis of specific food products, at the same time retaining the flexibility of application. Furthermore, its portability allows the device to be used at different stages of the production and distribution process. PMID:29186754

  14. Overall equipment efficiency of Flexographic Printing process: A case study

    NASA Astrophysics Data System (ADS)

    Zahoor, S.; Shehzad, A.; Mufti, NA; Zahoor, Z.; Saeed, U.

    2017-12-01

    This paper reports the efficiency improvement of a flexographic printing machine by reducing breakdown time with the help of a total productive maintenance measure called overall equipment efficiency (OEE). The methodology comprises calculating the OEE of the machine before and after identifying the causes of the problems. A Pareto diagram is used to prioritize the main problem areas and the 5-whys analysis approach is used to identify the root causes of these problems. The OEE of the process improved from 34% to 40.2% over a 30-day period. It is concluded that OEE and 5-whys analysis techniques are useful in improving the effectiveness of the equipment and for continuous process improvement as well.
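
    For reference, a sketch of the standard OEE decomposition (availability × performance × quality); the shift figures below are made up for illustration and are not the press data from the case study.

        # OEE = availability x performance x quality, under the usual definitions
        def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
            run_time = planned_time - downtime
            availability = run_time / planned_time
            performance = (ideal_cycle_time * total_count) / run_time
            quality = good_count / total_count
            return availability * performance * quality

        # Hypothetical shift on the flexographic press
        print("OEE = {:.1%}".format(oee(planned_time=480,      # minutes scheduled
                                        downtime=140,          # minutes lost to breakdowns/setups
                                        ideal_cycle_time=0.5,  # minutes per job unit (assumed)
                                        total_count=480,       # job units produced
                                        good_count=430)))      # job units without defects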

  15. Analysis and implementation of the foveated vision of the raptor eye

    NASA Astrophysics Data System (ADS)

    Long, Aaron D.; Narayanan, Ram M.; Kane, Timothy J.; Rice, Terence F.; Tauber, Michael J.

    2016-05-01

    A foveated optical system has non-uniform resolution across its field of view. Typically, the resolution of such a lens is peaked in the center region of field of view, such as in the human eye. In biological systems this is often a result of localized depressions on the retina called foveae. Birds of prey, or raptors, have two foveae in each eye, each of which accounts for a localized region of high magnification within the raptor's field of view. This paper presents an analysis of the bifoveated vision of raptors and presents a method whereby this unique optical characteristic may be achieved in an optical system using freeform optics and aberration correction techniques.

  16. Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution

    NASA Technical Reports Server (NTRS)

    Zoladz, T. F.; Jones, J. H.; Jong, J.

    1992-01-01

    A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real world signal analysis applications which more often than not involve multicomponent signals.

  17. Performance Analysis and Design Synthesis (PADS) computer program. Volume 2: Program description, part 2

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The QL module of the Performance Analysis and Design Synthesis (PADS) computer program is described. Execution of this module is initiated when and if subroutine PADSI calls subroutine GROPE. Subroutine GROPE controls the high level logical flow of the QL module. The purpose of the module is to determine a trajectory that satisfies the necessary variational conditions for optimal performance. The module achieves this by solving a nonlinear multi-point boundary value problem. The numerical method employed is described. It is an iterative technique that converges quadratically when it does converge. The three basic steps of the module are: (1) initialization, (2) iteration, and (3) culmination. For Volume 1 see N73-13199.

  18. Application driven interface generation for EASIE. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Kao, Ya-Chen

    1992-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a user interface and a set of utility programs which support the rapid integration and execution of analysis programs about a central relational database. EASIE provides users with two basic modes of execution. One of them is a menu-driven execution mode, called Application-Driven Execution (ADE), which provides sufficient guidance to review data, select a menu action item, and execute an application program. The other mode of execution, called Complete Control Execution (CCE), provides an extended executive interface which allows in-depth control of the design process. Currently, the EASIE system is based on alphanumeric techniques only. It is the purpose of this project to extend the flexibility of the EASIE system in the ADE mode by implementing it in a window system. Secondly, a set of utilities will be developed to assist the experienced engineer in the generation of an ADE application.

  19. On the Development of a Computing Infrastructure that Facilitates IPPD from a Decision-Based Design Perspective

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.

  20. Analysis of nonlinear modulation between sound and vibrations in metallic structure and its use for damage detection

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Gang, Tie; Wan, Chuhao; Wang, Changxi; Luo, Zhiwei

    2015-07-01

    The vibro-acoustic modulation technique is a nonlinear ultrasonic method in nondestructive testing. This technique detects defects by monitoring the modulation components generated by the interaction between the vibration and the ultrasound wave due to the nonlinear material behaviour caused by the damage. In this work, a swept-frequency signal was used as the high-frequency excitation; Hilbert-transform-based amplitude and phase demodulation and synchronous demodulation (SD) were then used to extract the modulation information from the received signal, and the results were graphed in the time-frequency domain after a short-time Fourier transform. The demodulation results were quite different from each other. The reason for the difference was investigated by analysing the demodulation process of the two methods. According to the analysis and the subsequent verification test, the SD method was more suitable for this test, and a new index called MISD was defined to evaluate structure quality in vibro-acoustic modulation tests with swept probing excitation.
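
    A sketch of the Hilbert-transform branch of the demodulation chain, assuming a synthetic vibro-acoustic signal in which a low-frequency vibration amplitude-modulates a swept high-frequency probe, followed by a short-time Fourier transform. The signal parameters are illustrative assumptions, and the synchronous-demodulation branch and the MISD index are not reproduced here.

        import numpy as np
        from scipy.signal import hilbert, chirp, stft

        fs = 200_000                                   # Hz, assumed sampling rate
        t = np.arange(0, 0.1, 1 / fs)
        vibration = 0.2 * np.sin(2 * np.pi * 300 * t)  # low-frequency pump vibration
        probe = chirp(t, f0=40_000, f1=60_000, t1=t[-1])
        received = (1 + vibration) * probe             # amplitude modulation caused by the "damage"

        analytic = hilbert(received)
        envelope = np.abs(analytic)                    # amplitude demodulation
        inst_phase = np.unwrap(np.angle(analytic))     # phase demodulation (not used further in this sketch)

        f, tt, Z = stft(envelope - envelope.mean(), fs=fs, nperseg=4096)
        peak = f[np.argmax(np.abs(Z).mean(axis=1))]
        print(f"dominant modulation frequency ~ {peak:.0f} Hz (expected ~300 Hz)")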

  1. Generating a Multiphase Equation of State with Swarm Intelligence

    NASA Astrophysics Data System (ADS)

    Cox, Geoffrey

    2017-06-01

    Hydrocode calculations require knowledge of the variation of pressure of a material with density and temperature, which is given by the equation of state. An accurate model needs to account for discontinuities in energy, density and properties of a material across a phase boundary. When generating a multiphase equation of state the modeller attempts to balance the agreement between the available data for compression, expansion and phase boundary location. However, this can prove difficult because minor adjustments in the equation of state for a single phase can have a large impact on the overall phase diagram. Recently, Cox and Christie described a method for combining statistical-mechanics-based condensed matter physics models with a stochastic analysis technique called particle swarm optimisation. The models produced show good agreement with experiment over a wide range of pressure-temperature space. This talk details the general implementation of this technique, shows example results, and describes the types of analysis that can be performed with this method.

  2. [Application of virtual instrumentation technique in toxicological studies].

    PubMed

    Moczko, Jerzy A

    2005-01-01

    Research investigations frequently require direct connection of measuring equipment to the computer. The virtual instrumentation technique considerably facilitates the programming of sophisticated acquisition-and-analysis procedures. In the standard approach these two steps are performed sequentially with separate software tools: the acquired data are transferred with the export/import procedures of one particular program to another one, which executes the next step of the analysis. This procedure is cumbersome, time-consuming and may be a potential source of errors. In 1987 National Instruments Corporation introduced the LabVIEW language, based on the concept of graphical programming. Contrary to conventional textual languages, it allows the researcher to concentrate on the problem at hand and omit all syntactical rules. Programs developed in LabVIEW are called virtual instruments (VI) and are portable among different computer platforms such as PCs, Macintoshes, Sun SPARCstations, Concurrent PowerMAX stations, and HP PA-RISC workstations. This flexibility warrants that programs prepared for one particular platform are also appropriate for another. In the present paper the basic principles of connecting research equipment to computer systems are described.

  3. Analytical simulation and PROFAT II: a new methodology and a computer automated tool for fault tree analysis in chemical process industries.

    PubMed

    Khan, F I; Abbasi, S A

    2000-07-10

    Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on this methodology we have developed a computer-automated tool. The details are presented in this paper.
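
    A minimal sketch of the quantitative core of FTA: basic-event probabilities propagated through AND/OR gates up to the top event under an independence assumption. The tree structure and numbers are illustrative; the AS-II algorithm and the PROFAT II tool themselves are not reproduced here.

        from math import prod

        def gate(kind, probs):
            if kind == "AND":                 # all inputs must fail
                return prod(probs)
            if kind == "OR":                  # any single input failing is enough
                return 1 - prod(1 - p for p in probs)
            raise ValueError(kind)

        # Hypothetical basic-event probabilities (per demand)
        p_pump, p_valve, p_sensor, p_alarm = 1e-3, 5e-4, 2e-3, 1e-2

        cooling_lost  = gate("OR",  [p_pump, p_valve])
        no_mitigation = gate("AND", [p_sensor, p_alarm])
        top_event     = gate("AND", [cooling_lost, no_mitigation])
        print(f"P(top event) = {top_event:.2e}")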

  4. Similarities between principal components of protein dynamics and random diffusion

    NASA Astrophysics Data System (ADS)

    Hess, Berk

    2000-12-01

    Principal component analysis, also called essential dynamics, is a powerful tool for finding global, correlated motions in atomic simulations of macromolecules. It has become an established technique for analyzing molecular dynamics simulations of proteins. The first few principal components of simulations of large proteins often resemble cosines. We derive the principal components for high-dimensional random diffusion, which are almost perfect cosines. This resemblance between protein simulations and noise implies that for many proteins the time scales of current simulations are too short to obtain convergence of collective motions.
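
    A short numerical sketch of the observation above: simulate an unconstrained high-dimensional random walk, extract its principal components, and compare the leading projections with cosines of increasing order. The dimensions, step counts and cosine convention are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(42)
        steps, dim = 5000, 300
        traj = np.cumsum(rng.standard_normal((steps, dim)), axis=0)   # unconstrained diffusion

        X = traj - traj.mean(axis=0)
        # Principal components via SVD; columns of U are the time projections onto each PC
        U, S, Vt = np.linalg.svd(X, full_matrices=False)

        t = np.arange(steps)
        for k in range(3):
            cosine = np.cos(np.pi * (k + 1) * (t + 0.5) / steps)
            corr = abs(np.corrcoef(U[:, k], cosine)[0, 1])
            print(f"PC{k + 1} vs cosine of order {k + 1}: |r| = {corr:.3f}")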

  5. On the deduction of chemical reaction pathways from measurements of time series of concentrations.

    PubMed

    Samoilov, Michael; Arkin, Adam; Ross, John

    2001-03-01

    We discuss the deduction of reaction pathways in complex chemical systems from measurements of time series of chemical concentrations of reacting species. First we review a technique called correlation metric construction (CMC) and show the construction of a reaction pathway from measurements on a part of glycolysis. Then we present two new, improved methods for the analysis of time series of concentrations, entropy metric construction (EMC) and the entropy reduction method (ERM), and illustrate EMC with calculations on a model reaction system. (c) 2001 American Institute of Physics.

  6. O(-) identified at high temperatures in CaO-based catalysts for oxidative methane dimerization

    NASA Technical Reports Server (NTRS)

    Freund, F.; Maiti, G. C.; Batllo, F.; Baerns, M.

    1990-01-01

    A technique called charge-distribution analysis (CDA) is employed to study mobile charge carriers in the oxidation catalysts CaO, CaO with 11 percent Na2O, and CaO with 10 percent La2O3. A threshold temperature of about 550-600 C is identified at which highly mobile charge carriers are present, and the CDA studies show that they are O(-) states. The present investigation indicates the usefulness of CDA in catalysis research with pressed powder samples and gas/solid reactions.

  7. Self-mapping in treating suicide ideation: a case study.

    PubMed

    Robertson, Lloyd Hawkeye

    2011-03-01

    This case study traces the development and use of a self-mapping exercise in the treatment of a youth who had been at risk for re-attempting suicide. A life skills exercise was modified to identify units of culture called memes from which a map of the youth's self was prepared. A successful treatment plan followed the mapping exercise. The process of self-map construction is presented along with an interpretive analysis. It is suggested that therapists from a range of perspectives could use this technique in assessment and treatment.

  8. Application of nuclear analytical techniques using long-life sealed-tube neutron generators.

    PubMed

    Bach, P; Cluzeau, S; Lambermont, C

    1994-01-01

    The new range of sealed-tube neutron generators developed by SODERN appears to be appropriate for the industrial environment. The main characteristics are the high emission stability during the very long lifetime of the tube, flexible pulsed mode capability, safety in operation with no radiation in "off" state, and the easy transportation of equipment. Some applications of the neutron generators, called GENIE, are considered: high-sensitivity measurement of transuranic elements in nuclear waste drums, bulk material analysis for process control, and determination of the airborne pollutants for environmental monitoring.

  9. Tool development in threat assessment: syntax regularization and correlative analysis. Final report Task I and Task II, November 21, 1977-May 21, 1978. [Linguistic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miron, M.S.; Christopher, C.; Hirshfield, S.

    1978-05-01

    Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology which can aid in the determination of threat credibility, authorship identification and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly-developed, computer-automated techniques which significantly extend the technology of automated content and stylistic analysis of nuclear threat. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the purpose of syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.

  10. Codimension-Two Bifurcation Analysis in DC Microgrids Under Droop Control

    NASA Astrophysics Data System (ADS)

    Lenz, Eduardo; Pagano, Daniel J.; Tahim, André P. N.

    This paper addresses local and global bifurcations that may appear in electrical power systems, such as DC microgrids, which recently has attracted interest from the electrical engineering society. Most sources in these networks are voltage-type and operate in parallel. In such configuration, the basic technique for stabilizing the bus voltage is the so-called droop control. The main contribution of this work is a codimension-two bifurcation analysis of a small DC microgrid considering the droop control gain and the power processed by the load as bifurcation parameters. The codimension-two bifurcation set leads to practical rules for achieving a robust droop control design. Moreover, the bifurcation analysis also offers a better understanding of the dynamics involved in the problem and how to avoid possible instabilities. Simulation results are presented in order to illustrate the bifurcation analysis.

  11. Velocity control of servo systems using an integral retarded algorithm.

    PubMed

    Ramírez, Adrián; Garrido, Rubén; Mondié, Sabine

    2015-09-01

    This paper presents a design technique for the delay-based controller called Integral Retarded (IR), and its applications to velocity control of servo systems. Using spectral analysis, the technique yields a tuning strategy for the IR by assigning a triple real dominant root for the closed-loop system. This result ultimately guarantees a desired exponential decay rate σ(d) while achieving the IR tuning as explicit function of σ(d) and system parameters. The intentional introduction of delay allows using noisy velocity measurements without additional filtering. The structure of the controller is also able to avoid velocity measurements by using instead position information. The IR is compared to a classical PI, both tested in a laboratory prototype. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  12. What can health care marketing learn from bank marketing?

    PubMed

    Mindak, W A

    1988-01-01

    A useful technique in assessing opportunities for international marketers is called "lead lag" analysis. It suggests that one can predict developments, such as demand patterns, in one country by looking at an analogous country. Applying such a technique to the domestic scene, what could we predict about the development and application of marketing to the health care sector if we looked at an analogous service such as banking? Many experts believe that health care is following in the footsteps of banking and point to environmental similarities such as changes in government regulation, new forms of nontraditional competition, increased concern about retail sectors, and pressures on scarce resources. Are there lessons that health care marketers can learn from bankers that might help them avoid some false starts or expensive mistakes?

  13. Logistic regression applied to natural hazards: rare event logistic regression with replications

    NASA Astrophysics Data System (ADS)

    Guns, M.; Vanacker, V.

    2012-06-01

    Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulations into rare event logistic regression. This technique, called rare event logistic regression with replications, combines the strengths of probabilistic and statistical methods, and allows some of the limitations of previous developments to be overcome through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
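
    A sketch of the replication idea under stated assumptions: the rare events are kept, the abundant non-events are repeatedly subsampled, a logistic model is refit in each replication, and only predictors that stay significant across most replications are retained. The synthetic landslide-style data, the per-replication significance test and the retention rule are illustrative, not the authors' exact procedure.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n, p = 5000, 6
        X = rng.standard_normal((n, p))
        logit = -4.0 + 1.2 * X[:, 0] - 0.8 * X[:, 2]          # only predictors 0 and 2 truly matter
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))      # rare events (a few percent of cases)

        events, non_events = np.where(y)[0], np.where(~y)[0]
        n_reps = 200
        signif = np.zeros(p)
        for _ in range(n_reps):
            controls = rng.choice(non_events, size=len(events), replace=False)
            idx = np.concatenate([events, controls])
            fit = sm.Logit(y[idx].astype(float), sm.add_constant(X[idx])).fit(disp=0)
            signif += np.asarray(fit.pvalues)[1:] < 0.05      # skip the intercept

        print("fraction of replications in which each predictor is significant:")
        print(np.round(signif / n_reps, 2))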

  14. Automated synthesis and composition of taskblocks for control of manufacturing systems.

    PubMed

    Holloway, L E; Guan, X; Sundaravadivelu, R; Ashley, J R

    2000-01-01

    Automated control synthesis methods for discrete-event systems promise to reduce the time required to develop, debug, and modify control software. Such methods must be able to translate high-level control goals into detailed sequences of actuation and sensing signals. In this paper, we present such a technique. It relies on analysis of a system model, defined as a set of interacting components, each represented as a form of condition system Petri net. Control logic modules, called taskblocks, are synthesized from these individual models. These then interact hierarchically and sequentially to drive the system through specified control goals. The resulting controller is automatically converted to executable control code. The paper concludes with a discussion of a set of software tools developed to demonstrate the techniques on a small manufacturing system.

  15. Complementary use of ion beam elastic backscattering and recoil detection analysis for the precise determination of the composition of thin films made of light elements

    NASA Astrophysics Data System (ADS)

    Climent-Font, A.; Cervera, M.; Hernández, M. J.; Muñoz-Martín, A.; Piqueras, J.

    2008-04-01

    Rutherford backscattering spectrometry (RBS) is a well-known, powerful technique for obtaining depth profiles of the constituent elements in a thin film deposited on a substrate made of lighter elements. In its standard use the probing beam is typically 2 MeV He. Its capability to obtain precise composition profiles is severely diminished when the overlying film is made of elements lighter than the substrate. In this situation the analysis of the energy of the element recoiled from the sample in the elastic scattering event, the ERDA technique, may be advantageous. For the detection of light elements it is also possible to use beams at specific energies that produce elastic resonances with the light elements to be analyzed, with much higher scattering cross sections than the Rutherford values. This technique may be called non-RBS. In this work we report on the complementary use of ERDA with a 30 MeV Cl beam and non-RBS with 1756 keV H ions to characterize thin films made of boron, carbon and nitrogen (BCN) deposited on Si substrates.

  16. Sentiment analysis: a comparison of deep learning neural network algorithm with SVM and naïve Bayes for Indonesian text

    NASA Astrophysics Data System (ADS)

    Calvin Frans Mariel, Wahyu; Mariyah, Siti; Pramana, Setia

    2018-03-01

    Deep learning is a new era of machine learning techniques that essentially imitate the structure and function of the human brain. It is a development of deeper Artificial Neural Networks (ANNs) that use more than one hidden layer. A Deep Learning Neural Network has a great ability to recognize patterns in various data types such as pictures, audio, text, and many more. In this paper, the authors try to measure that ability by applying it to text classification. The classification task herein is done by considering the sentiment content of a text, which is also called sentiment analysis. By using several combinations of text preprocessing and feature extraction techniques, we aim to compare the modelling results of the Deep Learning Neural Network with two other commonly used algorithms, naïve Bayes and the Support Vector Machine (SVM). This algorithm comparison uses Indonesian text data with balanced and unbalanced sentiment composition. Based on the experimental simulation, the Deep Learning Neural Network clearly outperforms naïve Bayes and the SVM and offers a better F1 score, while the feature extraction technique that best improves the modelling result is the bigram.
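
    A compact scikit-learn sketch of the comparison described above: the same bigram features fed to a naïve Bayes classifier, a linear SVM and a small feed-forward neural network (standing in for the deep model). The toy English sentences replace the Indonesian corpus and, like the classifier settings, are purely illustrative.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.svm import LinearSVC
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        texts = [
            "the service was excellent and fast", "really happy with this product",
            "absolutely love the new update", "great value and friendly staff",
            "terrible experience, would not recommend", "the app keeps crashing, very annoying",
            "poor quality and slow delivery", "worst customer support ever",
        ] * 10                                    # repeated so cross-validation has enough samples
        labels = ([1] * 4 + [0] * 4) * 10

        vectorizer = CountVectorizer(ngram_range=(2, 2))   # bigram features, as in the abstract
        models = {
            "naive Bayes": MultinomialNB(),
            "linear SVM": LinearSVC(),
            "neural network": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0),
        }
        for name, clf in models.items():
            scores = cross_val_score(make_pipeline(vectorizer, clf), texts, labels, cv=5, scoring="f1")
            print(f"{name:15s} mean F1 = {scores.mean():.2f}")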

  17. Micro-structural characterization of precipitation-synthesized fluorapatite nano-material by transmission electron microscopy using different sample preparation techniques.

    PubMed

    Chinthaka Silva, G W; Ma, Longzhou; Hemmers, Oliver; Lindle, Dennis

    2008-01-01

    Fluorapatite is a naturally occurring mineral of the apatite group and is well known for its high physical and chemical stability. There is recent interest in using this ceramic as a radioactive waste form material due to its intriguing chemical and physical properties. In this study, nano-sized fluorapatite particles were synthesized using a precipitation method and the material was characterized using X-ray diffraction (XRD) and transmission electron microscopy (TEM). Two well-known methods, solution-drop and microtome cutting, were used to prepare the samples for TEM analysis. It was found that the microtome cutting technique is advantageous for examining particle shape and cross-sectional morphology as well as for obtaining ultra-thin samples. However, this method introduces artifacts and strong background contrast in high-resolution transmission electron microscopy (HRTEM) observation. On the other hand, phase image simulations showed that the solution-drop method is reliable and stable for HRTEM analysis. Therefore, in order to comprehensively analyze the microstructure and morphology of the nano-material, it is necessary to combine both the solution-drop and microtome cutting techniques for TEM sample preparation.

  18. Density-cluster NMA: A new protein decomposition technique for coarse-grained normal mode analysis.

    PubMed

    Demerdash, Omar N A; Mitchell, Julie C

    2012-07-01

    Normal mode analysis has emerged as a useful technique for investigating protein motions on long time scales. This is largely due to the advent of coarse-graining techniques, particularly Hooke's Law-based potentials and the rotational-translational blocking (RTB) method for reducing the size of the force-constant matrix, the Hessian. Here we present a new method for domain decomposition for use in RTB that is based on hierarchical clustering of atomic density gradients, which we call Density-Cluster RTB (DCRTB). The method reduces the number of degrees of freedom by 85-90% compared with the standard blocking approaches. We compared the normal modes from DCRTB against standard RTB using 1-4 residues in sequence in a single block, with good agreement between the two methods. We also show that Density-Cluster RTB and standard RTB perform well in capturing the experimentally determined direction of conformational change. Significantly, we report superior correlation of DCRTB with B-factors compared with 1-4 residue per block RTB. Finally, we show significant reduction in computational cost for Density-Cluster RTB that is nearly 100-fold for many examples. Copyright © 2012 Wiley Periodicals, Inc.

  19. Are EMS call volume predictions based on demand pattern analysis accurate?

    PubMed

    Brown, Lawrence H; Lerner, E Brooke; Larmon, Baxter; LeGassick, Todd; Taigman, Michael

    2007-01-01

    Most EMS systems determine the number of crews they will deploy in their communities, and when those crews will be scheduled, based on anticipated call volumes. Many systems use historical data to calculate their anticipated call volumes, a method of prediction known as demand pattern analysis. The objective of this study was to evaluate the accuracy of call volume predictions calculated using demand pattern analysis. Seven EMS systems provided 73 consecutive weeks of hourly call volume data. The first 20 weeks of data were used to calculate three common demand pattern analysis constructs for call volume prediction: average peak demand (AP), smoothed average peak demand (SAP), and 90th percentile rank (90%R). The 21st week served as a buffer. Actual call volumes in the last 52 weeks were then compared to the predicted call volumes by using descriptive statistics. There were 61,152 hourly observations in the test period. All three constructs accurately predicted peaks and troughs in call volume but not exact call volume. Predictions were accurate (+/-1 call) 13% of the time using AP, 10% using SAP, and 19% using 90%R. Call volumes were overestimated 83% of the time using AP, 86% using SAP, and 74% using 90%R. When call volumes were overestimated, predictions exceeded actual call volume by a median (interquartile range) of 4 (2-6) calls for AP, 4 (2-6) for SAP, and 3 (2-5) for 90%R. Call volumes were underestimated 4% of the time using AP, 4% using SAP, and 7% using 90%R predictions. When call volumes were underestimated, call volumes exceeded predictions by a median (interquartile range; maximum underestimation) of 1 (1-2; 18) call for AP, 1 (1-2; 18) for SAP, and 2 (1-3; 20) for 90%R. Results did not vary between systems. Generally, demand pattern analysis estimated or overestimated call volume, making it a reasonable predictor for ambulance staffing patterns. However, it did underestimate call volume between 4% and 7% of the time. Communities need to determine if these rates of over- and underestimation are acceptable given their resources and local priorities.
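
    A pandas sketch of the three constructs named above, computed per hour-of-week from 20 weeks of simulated hourly call counts. The exact definitions used by the participating systems (in particular the smoothing step in SAP and how peaks are selected) are not given here, so the simplified forms below are assumptions.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(7)
        hours = pd.date_range("2024-01-01", periods=20 * 7 * 24, freq="h")
        history = pd.Series(rng.poisson(3 + 2 * np.sin(np.arange(len(hours)) * 2 * np.pi / 24), len(hours)),
                            index=hours, name="calls")

        hour_of_week = history.index.dayofweek * 24 + history.index.hour
        grouped = history.groupby(hour_of_week)

        ap  = grouped.mean()                                        # average peak demand (assumed: plain mean)
        sap = ap.rolling(3, center=True, min_periods=1).mean()      # smoothed AP (assumed: 3-hour moving average)
        p90 = grouped.quantile(0.90)                                # 90th percentile rank

        prediction = pd.DataFrame({"AP": ap, "SAP": sap, "90%R": p90}).round(1)
        print(prediction.head(6))     # predicted call volume for the first six hours of the week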

  20. Control algorithms for aerobraking in the Martian atmosphere

    NASA Technical Reports Server (NTRS)

    Ward, Donald T.; Shipley, Buford W., Jr.

    1991-01-01

    The Analytic Predictor Corrector (APC) and Energy Controller (EC) atmospheric guidance concepts were adapted to control an interplanetary vehicle aerobraking in the Martian atmosphere. Changes are made to the APC to improve its robustness to density variations. These changes include adaptation of a new exit phase algorithm, an adaptive transition velocity to initiate the exit phase, refinement of the reference dynamic pressure calculation and two improved density estimation techniques. The modified controller with the hybrid density estimation technique is called the Mars Hybrid Predictor Corrector (MHPC), while the modified controller with a polynomial density estimator is called the Mars Predictor Corrector (MPC). A Lyapunov Steepest Descent Controller (LSDC) is adapted to control the vehicle. The LSDC lacked robustness, so a Lyapunov tracking exit phase algorithm is developed to guide the vehicle along a reference trajectory. This algorithm, when using the hybrid density estimation technique to define the reference path, is called the Lyapunov Hybrid Tracking Controller (LHTC). With the polynomial density estimator used to define the reference trajectory, the algorithm is called the Lyapunov Tracking Controller (LTC). These four new controllers are tested using a six degree of freedom computer simulation to evaluate their robustness. The MHPC, MPC, LHTC, and LTC show dramatic improvements in robustness over the APC and EC.

  1. Blind source computer device identification from recorded VoIP calls for forensic investigation.

    PubMed

    Jahanirad, Mehdi; Anuar, Nor Badrul; Wahab, Ainuddin Wahid Abdul

    2017-03-01

    The VoIP services provide fertile ground for criminal activity, thus identifying the transmitting computer devices from recorded VoIP calls may help the forensic investigator to reveal useful information. It also proves the authenticity of the call recording submitted to the court as evidence. This paper extended the previous study on the use of recorded VoIP calls for blind source computer device identification. Although initial results were promising, the theoretical reasoning behind them is yet to be found. The study suggested computing entropy of mel-frequency cepstrum coefficients (entropy-MFCC) from near-silent segments as an intrinsic feature set that captures the device response function due to the tolerances in the electronic components of individual computer devices. By applying the supervised learning techniques of naïve Bayesian, linear logistic regression, neural networks and support vector machines to the entropy-MFCC features, state-of-the-art identification accuracy of near 99.9% has been achieved on different sets of computer devices for both call recording and microphone recording scenarios. Furthermore, unsupervised learning techniques, including simple k-means, expectation-maximization and density-based spatial clustering of applications with noise (DBSCAN) provided promising results for the call recording dataset by assigning the majority of instances to their correct clusters. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
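
    A minimal sketch of the entropy-MFCC idea is given below, assuming librosa for MFCC extraction and scikit-learn for the classifier. The frame selection, silence threshold, histogram binning, and number of coefficients are illustrative choices rather than the settings of the cited study, and recording_paths/device_labels are hypothetical placeholders.

```python
"""Sketch of entropy-MFCC device fingerprinting (assumes librosa, scikit-learn).
Silence threshold, binning and n_mfcc are illustrative, not the study's settings."""
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def entropy_mfcc(path, n_mfcc=13, top_db=30):
    y, sr = librosa.load(path, sr=None, mono=True)
    # Keep only near-silent samples: drop the intervals librosa flags as loud.
    mask = np.ones(len(y), dtype=bool)
    for start, end in librosa.effects.split(y, top_db=top_db):
        mask[start:end] = False
    y_silent = y[mask] if mask.any() else y
    mfcc = librosa.feature.mfcc(y=y_silent, sr=sr, n_mfcc=n_mfcc)  # (n_mfcc, frames)
    feats = []
    for coeff in mfcc:                      # Shannon entropy per coefficient
        hist, _ = np.histogram(coeff, bins=32)
        p = hist[hist > 0] / hist.sum()
        feats.append(float(-(p * np.log2(p)).sum()))
    return np.array(feats)

# Usage sketch (recording_paths and device_labels are hypothetical placeholders):
# X = np.vstack([entropy_mfcc(p) for p in recording_paths])
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, device_labels)
```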

  2. Terrestrial Radiodetermination Potential Users and Their Requirements

    DOT National Transportation Integrated Search

    1976-07-01

    The report summarizes information gathered during a preliminary study of the application of electronic techniques to geographical position determination on land and on inland waterways. Systems incorporating such techniques have been called terrestri...

  3. Using Runtime Analysis to Guide Model Checking of Java Programs

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Norvig, Peter (Technical Monitor)

    2001-01-01

    This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once and observing the generated run to extract various kinds of information. This information can then be used to predict whether other different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
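
    The sketch below illustrates the general lock-order-graph idea behind runtime deadlock prediction: a cycle in the lock-acquisition-order graph observed in a single run flags a potential deadlock in other runs. It is a generic illustration under that assumption, not the specific algorithm implemented in PathFinder.

```python
"""Minimal sketch of lock-graph based deadlock prediction from one observed run."""
from collections import defaultdict

class LockOrderRecorder:
    def __init__(self):
        self.held = defaultdict(list)    # thread id -> stack of currently held locks
        self.edges = defaultdict(set)    # lock -> locks acquired while holding it

    def acquire(self, thread, lock):
        for outer in self.held[thread]:
            self.edges[outer].add(lock)  # record observed ordering outer -> lock
        self.held[thread].append(lock)

    def release(self, thread, lock):
        self.held[thread].remove(lock)

    def potential_deadlocks(self):
        """Return a cycle in the lock-order graph, if any (simple DFS)."""
        WHITE, GREY, BLACK = 0, 1, 2
        colour, stack = defaultdict(int), []

        def dfs(node):
            colour[node] = GREY
            stack.append(node)
            for nxt in self.edges[node]:
                if colour[nxt] == GREY:          # back edge -> cycle found
                    return stack[stack.index(nxt):] + [nxt]
                if colour[nxt] == WHITE:
                    cycle = dfs(nxt)
                    if cycle:
                        return cycle
            colour[node] = BLACK
            stack.pop()
            return None

        for node in list(self.edges):
            if colour[node] == WHITE:
                cycle = dfs(node)
                if cycle:
                    return cycle
        return None

# One run that never deadlocks, but whose lock orders predict a deadlock:
rec = LockOrderRecorder()
rec.acquire("T1", "A"); rec.acquire("T1", "B"); rec.release("T1", "B"); rec.release("T1", "A")
rec.acquire("T2", "B"); rec.acquire("T2", "A"); rec.release("T2", "A"); rec.release("T2", "B")
print(rec.potential_deadlocks())   # e.g. ['A', 'B', 'A']
```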

  4. PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems

    NASA Astrophysics Data System (ADS)

    da Silva, Glauco; Netto Lahoz, Carlos Henrique

    2013-09-01

    This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. It also presents some techniques and tools that support traditional dependability analysis and briefly discusses the concept of knowledge discovery and intelligent databases for critical computer systems. After that, it introduces the PRO-ELICERE process, an intelligent approach to automating ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of projects at the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).

  5. Failure analysis of woven and braided fabric reinforced composites

    NASA Technical Reports Server (NTRS)

    Naik, Rajiv A.

    1994-01-01

    A general purpose micromechanics analysis that discretely models the yarn architecture within the textile repeating unit cell was developed to predict overall, three dimensional, thermal and mechanical properties, damage initiation and progression, and strength. This analytical technique was implemented in a user-friendly, personal computer-based, menu-driven code called Textile Composite Analysis for Design (TEXCAD). TEXCAD was used to analyze plain weave and 2x2, 2-D triaxial braided composites. The calculated tension, compression, and shear strengths correlated well with available test data for both woven and braided composites. Parametric studies were performed on both woven and braided architectures to investigate the effects of parameters such as yarn size, yarn spacing, yarn crimp, braid angle, and overall fiber volume fraction on the strength properties of the textile composite.

  6. Artificial intelligence for multi-mission planetary operations

    NASA Technical Reports Server (NTRS)

    Atkinson, David J.; Lawson, Denise L.; James, Mark L.

    1990-01-01

    A brief introduction is given to an automated system called the Spacecraft Health Automated Reasoning Prototype (SHARP). SHARP is designed to demonstrate automated health and status analysis for multi-mission spacecraft and ground data systems operations. The SHARP system combines conventional computer science methodologies with artificial intelligence techniques to produce an effective method for detecting and analyzing potential spacecraft and ground systems problems. The system performs real-time analysis of spacecraft and other related telemetry, and is also capable of examining data in historical context. Telecommunications link analysis of the Voyager II spacecraft is the initial focus for evaluation of the prototype in a real-time operations setting during the Voyager spacecraft encounter with Neptune in August, 1989. The preliminary results of the SHARP project and plans for future application of the technology are discussed.

  7. Analysis of memory use for improved design and compile-time allocation of local memory

    NASA Technical Reports Server (NTRS)

    Mcniven, Geoffrey D.; Davidson, Edward S.

    1986-01-01

    Trace analysis techniques are used to study memory referencing behavior for the purpose of designing local memories and determining how to allocate them for data and instructions. In an attempt to assess the inherent behavior of the source code, the trace analysis system described here reduced the effects of the compiler and host architecture on the trace by using a technique called flattening. The variables in the trace, their associated single-assignment values, and references are histogrammed on the basis of various parameters describing memory referencing behavior. Bounds are developed specifying the amount of memory space required to store all live values in a particular histogram class. The reduction achieved in main memory traffic by allocating local memory is specified for each class.

  8. Interactive algebraic grid-generation technique

    NASA Technical Reports Server (NTRS)

    Smith, R. E.; Wiese, M. R.

    1986-01-01

    An algebraic grid generation technique and use of an associated interactive computer program are described. The technique, called the two boundary technique, is based on Hermite cubic interpolation between two fixed, nonintersecting boundaries. The boundaries are referred to as the bottom and top, and they are defined by two ordered sets of points. Left and right side boundaries which intersect the bottom and top boundaries may also be specified by two ordered sets of points. When side boundaries are specified, linear blending functions are used to conform interior interpolation to the side boundaries. Spacing between physical grid coordinates is determined as a function of boundary data and uniformly spaced computational coordinates. Control functions relating computational coordinates to parametric intermediate variables that affect the distance between grid points are embedded in the interpolation formulas. A versatile control function technique with smooth-cubic-spline functions is presented. The technique works best in an interactive graphics environment where computational displays and user responses are quickly exchanged. An interactive computer program based on the technique and called TBGG (two boundary grid generation) is also described.
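
    A compact sketch of the two-boundary idea is shown below: Hermite cubic blending in the computational direction between a bottom and a top boundary curve. The tangent (slope) control used here is a simple assumption standing in for TBGG's richer control functions.

```python
"""Sketch of two-boundary algebraic grid generation with Hermite cubic blending.
The slope control is an assumption; the actual TBGG code exposes richer controls."""
import numpy as np

def two_boundary_grid(bottom, top, n_eta, slope=0.5):
    """bottom, top: arrays of shape (n_xi, 2) with matching point counts.
    Returns a grid of shape (n_xi, n_eta, 2) blended with Hermite cubics in eta."""
    bottom, top = np.asarray(bottom, float), np.asarray(top, float)
    eta = np.linspace(0.0, 1.0, n_eta)

    # Hermite cubic basis functions for position and tangent at eta = 0 and 1.
    h00 = 2 * eta**3 - 3 * eta**2 + 1
    h10 = eta**3 - 2 * eta**2 + eta
    h01 = -2 * eta**3 + 3 * eta**2
    h11 = eta**3 - eta**2

    # Assumed tangent data: push grid lines off each boundary along the
    # bottom-to-top chord, scaled by `slope`.
    t0 = slope * (top - bottom)
    t1 = slope * (top - bottom)

    return (h00[None, :, None] * bottom[:, None, :] +
            h10[None, :, None] * t0[:, None, :] +
            h01[None, :, None] * top[:, None, :] +
            h11[None, :, None] * t1[:, None, :])

# Example: grid between a wavy bottom boundary and a flat top boundary.
xi = np.linspace(0.0, 1.0, 21)
bottom = np.column_stack([xi, 0.1 * np.sin(2 * np.pi * xi)])
top = np.column_stack([xi, np.ones_like(xi)])
print(two_boundary_grid(bottom, top, n_eta=11).shape)   # (21, 11, 2)
```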

  9. A study on Improvisation in a Musical performance using Multifractal Detrended Cross Correlation Analysis

    NASA Astrophysics Data System (ADS)

    Sanyal, Shankha; Banerjee, Archi; Patranabis, Anirban; Banerjee, Kaushik; Sengupta, Ranjan; Ghosh, Dipak

    2016-11-01

    MFDFA (the most rigorous technique to assess multifractality) was performed on four Hindustani music samples based on the same 'raga' sung by the same performer. Each music sample was divided into six parts, and the 'multifractal spectral width' was determined for each part of the four samples. The results obtained reveal that different parts of all the four sound signals possess spectral widths of widely varying values. This gives a cue of the so-called 'musical improvisation' in all music samples, keeping in mind that they belong to the bandish part of the same raga. Formal compositions in a Hindustani raga are juxtaposed with the improvised portions, where an artist manoeuvers his/her own creativity to bring out a mood that is specific to that particular performance, which is known as 'improvisation'. Further, this observation hints at the association of different emotions even in the same bandish of the same raga performed by the same artist; this interesting observation cannot be revealed unless a rigorous non-linear technique explores the nature of the musical structure. In the second part, we applied the MFDXA technique to explore 'improvisation' and its association with emotion in more depth. This technique is applied to find the degree of cross-correlation (γx) between the different parts of the samples. Pronounced correlation has been observed in the middle parts of all the four samples, evident from higher values of γx, whereas the other parts show weak correlation. This gets further support from the values of spectral width from different parts of the sample: the width of those parts is significantly different from that of the other parts. This observation is extremely new both in respect of the musical structure of so-called improvisation and the associated emotion. The importance of this study in the application area of cognitive music therapy is immense.
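
    For readers unfamiliar with the method, the following is a compact, simplified MFDFA sketch (order-1 detrending, illustrative scale and q ranges) that returns a multifractal spectral width of the kind used above; it is not the implementation used by the authors.

```python
"""Compact MFDFA sketch (order-1 detrending) returning the multifractal
spectrum width; scale and q ranges are illustrative choices."""
import numpy as np

def mfdfa_width(x, q_values=(-5, -3, -1, 1, 3, 5)):
    x = np.asarray(x, float)
    profile = np.cumsum(x - x.mean())
    n = len(profile)
    scales = np.unique(np.logspace(4, np.log2(n // 4), 12, base=2).astype(int))
    q_values = np.asarray(q_values, float)

    # Variance of each segment after local linear detrending, one array per scale.
    variances = []
    for s in scales:
        n_seg = n // s
        segs = profile[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        variances.append(np.array(
            [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2) for seg in segs]))

    # q-th order fluctuation functions and generalised Hurst exponents h(q).
    h_q = []
    for q in q_values:
        fq = [np.mean(v ** (q / 2.0)) ** (1.0 / q) for v in variances]
        h_q.append(np.polyfit(np.log(scales), np.log(fq), 1)[0])
    h_q = np.array(h_q)

    # Spectrum width from the Legendre transform of tau(q) = q*h(q) - 1.
    tau = q_values * h_q - 1.0
    alpha = np.gradient(tau, q_values)
    return alpha.max() - alpha.min()

# A monofractal signal (white noise) yields a comparatively narrow spectrum.
print(round(mfdfa_width(np.random.default_rng(1).standard_normal(8192)), 3))
```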

  10. A modified prebind engagement process reduces biomechanical loading on front row players during scrummaging: a cross-sectional study of 11 elite teams.

    PubMed

    Cazzola, Dario; Preatoni, Ezio; Stokes, Keith A; England, Michael E; Trewartha, Grant

    2015-04-01

    Biomechanical studies of the rugby union scrum have typically been conducted using instrumented scrum machines, but a large-scale biomechanical analysis of live contested scrummaging is lacking. We investigated whether the biomechanical loading experienced by professional front row players during the engagement phase of live contested rugby scrums could be reduced using a modified engagement procedure. Eleven professional teams (22 forward packs) performed repeated scrum trials for each of the three engagement techniques, outdoors, on natural turf. The engagement processes were the 2011/2012 (referee calls crouch-touch-pause-engage), 2012/2013 (referee calls crouch-touch-set) and 2013/2014 (props prebind with the opposition prior to the 'Set' command; PreBind) variants. Forces were estimated by pressure sensors on the shoulders of the front row players of one forward pack. Inertial Measurement Units were placed on an upper spine cervical landmark (C7) of the six front row players to record accelerations. Players' motion was captured by multiple video cameras from three viewing perspectives and analysed in transverse and sagittal planes of motion. The PreBind technique reduced biomechanical loading in comparison with the other engagement techniques, with engagement speed, peak forces and peak accelerations of upper spine landmarks reduced by approximately 20%. There were no significant differences between techniques in terms of body kinematics and average force during the sustained push phase. Using a scrum engagement process which involves binding with the opposition prior to the engagement reduces the stresses acting on players and therefore may represent a possible improvement for players' safety. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  11. Continuous state-space representation of a bucket-type rainfall-runoff model: a case study with the GR4 model using state-space GR4 (version 1.0)

    NASA Astrophysics Data System (ADS)

    Santos, Léonard; Thirel, Guillaume; Perrin, Charles

    2018-04-01

    In many conceptual rainfall-runoff models, the water balance differential equations are not explicitly formulated. These differential equations are solved sequentially by splitting the equations into terms that can be solved analytically with a technique called operator splitting. As a result, only the solutions of the split equations are used to present the different models. This article provides a methodology to make the governing water balance equations of a bucket-type rainfall-runoff model explicit and to solve them continuously. This is done by setting up a comprehensive state-space representation of the model. By representing it in this way, the operator splitting, which makes the structural analysis of the model more complex, could be removed. In this state-space representation, the lag functions (unit hydrographs), which are frequent in rainfall-runoff models and make the resolution of the representation difficult, are first replaced by a so-called Nash cascade and then solved with a robust numerical integration technique. To illustrate this methodology, the GR4J model is taken as an example. The substitution of the unit hydrographs with a Nash cascade, even if it modifies the model behaviour when solved using operator splitting, does not modify it when the state-space representation is solved using an implicit integration technique. Indeed, the flow time series simulated by the new representation of the model are very similar to those simulated by the classic model. The use of a robust numerical technique that approximates a continuous-time model also improves the lag parameter consistency across time steps and provides a more time-consistent model with time-independent parameters.
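
    The replacement of unit hydrographs by a Nash cascade solved with a robust implicit scheme can be illustrated with the simplified sketch below: a chain of linear reservoirs integrated with backward Euler. It is a stand-in under stated assumptions, not the state-space GR4 code itself.

```python
"""Sketch of a Nash cascade of linear reservoirs integrated with implicit
(backward) Euler; parameters are illustrative, not GR4J's."""
import numpy as np

def nash_cascade(inflow, n_reservoirs=3, k=0.5, dt=1.0):
    """inflow: array of input rates per time step; k: outflow constant [1/step]."""
    store = np.zeros(n_reservoirs)
    outflow = np.empty(len(inflow))
    for t, q_in in enumerate(inflow):
        q = q_in
        for i in range(n_reservoirs):
            # Backward Euler for dS/dt = q_in - k*S (unconditionally stable):
            # S_new = (S_old + dt*q_in) / (1 + dt*k)
            store[i] = (store[i] + dt * q) / (1.0 + dt * k)
            q = k * store[i]          # outflow of one reservoir feeds the next
        outflow[t] = q
    return outflow

# A unit pulse of rainfall produces the familiar delayed, smoothed hydrograph.
pulse = np.zeros(20); pulse[0] = 1.0
print(np.round(nash_cascade(pulse), 3))
```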

  12. Plume Impingement to the Lunar Surface: A Challenging Problem for DSMC

    NASA Technical Reports Server (NTRS)

    Lumpkin, Forrest; Marichalar, Jermiah; Piplica, Anthony

    2007-01-01

    The President's Vision for Space Exploration calls for the return of human exploration of the Moon. The plans are ambitious and call for the creation of a lunar outpost. Lunar Landers will therefore be required to land near predeployed hardware, and the dust storm created by the Lunar Lander's plume impingement to the lunar surface presents a hazard. Knowledge of the number density, size distribution, and velocity of the grains in the dust cloud entrained into the flow is needed to develop mitigation strategies. An initial step to acquire such knowledge is simulating the associated plume impingement flow field. The following paper presents results from a loosely coupled continuum flow solver/Direct Simulation Monte Carlo (DSMC) technique for simulating the plume impingement of the Apollo Lunar module on the lunar surface. These cases were chosen for initial study to allow for comparison with available Apollo video. The relatively high engine thrust and the desire to simulate interesting cases near touchdown result in flow that is nearly entirely continuum. The DSMC region of the flow field was simulated using NASA's DSMC Analysis Code (DAC) and must begin upstream of the impingement shock for the loosely coupled technique to succeed. It was therefore impossible to achieve mean free path resolution with a reasonable number of molecules (say 100 million), as is shown. In order to mitigate accuracy and performance issues when using such large cells, advanced techniques such as collision limiting and nearest neighbor collisions were employed. The final paper will assess the benefits and shortcomings of such techniques. In addition, the effects of plume orientation, plume altitude, and lunar topography, such as craters, on the flow field, the surface pressure distribution, and the surface shear stress distribution are presented.

  13. [Key informers. When and How?].

    PubMed

    Martín González, R

    2009-03-01

    When information obtained through duly designed and developed studies is not available, the solution to certain problems that affect the population or that respond to certain questions may be approached by using the information and experience provided by the so-called key informer. The key informer is defined as a person who is in contact with the community or with the problem to be studied, who is considered to have good knowledge of the situation and therefore who is considered an expert. The search for consensus is the basis to obtain information through the key informers. The techniques used have different characteristics based on whether the experts chosen meet together or not, whether they are guided or not, whether they interact with each other or not. These techniques include the survey, the Delphi technique, the nominal group technique, brainwriting, brainstorming, the Phillips 66 technique, the 6-3-5 technique, the community forum and the community impressions technique. Information provided by key informers through the search for consensus is relevant when this is not available or cannot be obtained by other methods. It has permitted the analysis of the existing neurological care model, elaboration of recommendations on visit times for the out-patient neurological care, and the elaboration of guidelines and recommendations for the management of prevalent neurological problems.

  14. Detecting and classifying method based on similarity matching of Android malware behavior with profile.

    PubMed

    Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang

    2016-01-01

    Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive. On-device techniques are constrained by device resources, while off-device techniques need to be always online. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. For that, Andro-profiler classifies malware by exploiting the behavior profiles extracted from the integrated system logs, including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing the integrated system logs. By comparing the behavior profile of a malicious application with the representative behavior profile for each malware family using a weighted similarity matching technique, Andro-profiler detects and classifies it into malware families. The experiment results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98%, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
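
    The weighted similarity matching step can be illustrated as below, treating a behavior profile as a vector of behavior counts and comparing it against family-representative profiles with a weighted cosine similarity. The feature names, weights, and threshold are illustrative, not those of Andro-profiler.

```python
"""Sketch of weighted similarity matching between a behaviour profile and
family-representative profiles; weights and threshold are illustrative."""
import numpy as np

def weighted_similarity(profile, reference, weights):
    """Weighted cosine similarity between two behaviour-count vectors."""
    p, r, w = (np.asarray(v, float) for v in (profile, reference, weights))
    num = np.sum(w * p * r)
    den = np.sqrt(np.sum(w * p * p)) * np.sqrt(np.sum(w * r * r))
    return 0.0 if den == 0 else num / den

def classify(profile, family_profiles, weights, threshold=0.8):
    scores = {fam: weighted_similarity(profile, ref, weights)
              for fam, ref in family_profiles.items()}
    best = max(scores, key=scores.get)
    return (best if scores[best] >= threshold else "benign/unknown"), scores

# Toy profiles over four behaviour features (e.g. sms, net, file, crypto calls).
families = {"FakeInst": [40, 5, 2, 0], "DroidKungFu": [1, 30, 20, 10]}
weights = [2.0, 1.0, 1.0, 1.5]          # emphasise SMS and crypto behaviour
print(classify([35, 4, 1, 0], families, weights))
```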

  15. Intimate Debate Technique: Medicinal Use of Marijuana

    ERIC Educational Resources Information Center

    Herreid, Clyde Freeman; DeRei, Kristie

    2007-01-01

    Classroom debates used to be familiar exercises to students schooled in past generations. In this article, the authors describe the technique called "intimate debate". To cooperative learning specialists, the technique is known as "structured debate" or "constructive debate". It is a powerful method for dealing with case topics that involve…

  16. The Influence of Judgment Calls on Meta-Analytic Findings.

    PubMed

    Tarrahi, Farid; Eisend, Martin

    2016-01-01

    Previous research has suggested that judgment calls (i.e., methodological choices made in the process of conducting a meta-analysis) have a strong influence on meta-analytic findings and question their robustness. However, prior research applies case study comparison or reanalysis of a few meta-analyses with a focus on a few selected judgment calls. These studies neglect the fact that different judgment calls are related to each other and simultaneously influence the outcomes of a meta-analysis, and that meta-analytic findings can vary due to non-judgment call differences between meta-analyses (e.g., variations of effects over time). The current study analyzes the influence of 13 judgment calls in 176 meta-analyses in marketing research by applying a multivariate, multilevel meta-meta-analysis. The analysis considers simultaneous influences from different judgment calls on meta-analytic effect sizes and controls for alternative explanations based on non-judgment call differences between meta-analyses. The findings suggest that judgment calls have only a minor influence on meta-analytic findings, whereas non-judgment call differences between meta-analyses are more likely to explain differences in meta-analytic findings. The findings support the robustness of meta-analytic results and conclusions.

  17. Beluga whale (Delphinapterus leucas) vocalizations and call classification from the eastern Beaufort Sea population.

    PubMed

    Garland, Ellen C; Castellote, Manuel; Berchok, Catherine L

    2015-06-01

    Beluga whales, Delphinapterus leucas, have a graded call system; call types exist on a continuum, making classification challenging. A description of vocalizations from the eastern Beaufort Sea beluga population during its spring migration is presented here, using both a non-parametric classification tree analysis (CART) and a Random Forest analysis. Twelve frequency and duration measurements were made on 1019 calls recorded over 14 days off Icy Cape, Alaska, resulting in 34 identifiable call types with 83% agreement in classification for both CART and Random Forest analyses. This high level of agreement in classification, with an initial subjective classification of calls into 36 categories, demonstrates that the methods applied here provide a quantitative analysis of a graded call dataset. Further, as calls cannot be attributed to individuals using single sensor passive acoustic monitoring efforts, these methods provide a comprehensive analysis of data where the influence of pseudo-replication of calls from individuals is unknown. This study is the first to describe the vocal repertoire of a beluga population using a robust and repeatable methodology. A baseline eastern Beaufort Sea beluga population repertoire is presented here, against which the call repertoire of other seasonally sympatric Alaskan beluga populations can be compared.
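
    A minimal sketch of the CART versus Random Forest comparison is shown below, assuming scikit-learn and a table of the 12 frequency and duration measurements per call. The arrays are random placeholders standing in for the real measurements and call-type labels, so the printed scores are not meaningful.

```python
"""Sketch of a CART / Random Forest comparison on call-measurement features;
the data below are placeholders, not the beluga measurements."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1019, 12))          # placeholder frequency/duration measurements
y = rng.integers(0, 34, size=1019)       # placeholder call-type labels

cart = DecisionTreeClassifier(min_samples_leaf=5, random_state=0)
forest = RandomForestClassifier(n_estimators=500, random_state=0)

for name, model in (("CART", cart), ("Random Forest", forest)):
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean cross-validated agreement with labels = {acc:.2f}")
```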

  18. Digital computer technique for setup and checkout of an analog computer

    NASA Technical Reports Server (NTRS)

    Ambaruch, R.

    1968-01-01

    Computer program technique, called Analog Computer Check-Out Routine Digitally /ACCORD/, generates complete setup and checkout data for an analog computer. In addition, the correctness of the analog program implementation is validated.

  19. Essentials of Suggestopedia: A Primer for Practitioners.

    ERIC Educational Resources Information Center

    Caskey, Owen L.; Flake, Muriel H.

    Suggestology is the scientific study of the psychology of suggestion, and Suggestopedia is the application of relaxation and suggestion techniques to learning. The approach applied to learning processes (called Suggestopedic) developed by Dr. Georgi Lozanov (called the Lozanov Method) utilizes mental and physical relaxation, deep breathing,…

  20. Nebula--a web-server for advanced ChIP-seq data analysis.

    PubMed

    Boeva, Valentina; Lermine, Alban; Barette, Camille; Guillouf, Christel; Barillot, Emmanuel

    2012-10-01

    ChIP-seq consists of chromatin immunoprecipitation and deep sequencing of the extracted DNA fragments. It is the technique of choice for accurate characterization of the binding sites of transcription factors and other DNA-associated proteins. We present a web service, Nebula, which allows inexperienced users to perform a complete bioinformatics analysis of ChIP-seq data. Nebula was designed for both bioinformaticians and biologists. It is based on the Galaxy open source framework. Galaxy already includes a large number of functionalities for mapping reads and peak calling. We added the following to Galaxy: (i) peak calling with FindPeaks and a module for immunoprecipitation quality control, (ii) de novo motif discovery with ChIPMunk, (iii) calculation of the density and the cumulative distribution of peak locations relative to gene transcription start sites, (iv) annotation of peaks with genomic features and (v) annotation of genes with peak information. Nebula generates the graphs and the enrichment statistics at each step of the process. During Steps 3-5, Nebula optionally repeats the analysis on a control dataset and compares these results with those from the main dataset. Nebula can also incorporate gene expression (or gene modulation) data during these steps. In summary, Nebula is an innovative web service that provides an advanced ChIP-seq analysis pipeline providing ready-to-publish results. Nebula is available at http://nebula.curie.fr/ Supplementary data are available at Bioinformatics online.

  1. Comparison of frequency-domain and time-domain rotorcraft vibration control methods

    NASA Technical Reports Server (NTRS)

    Gupta, N. K.

    1984-01-01

    Active control of rotor-induced vibration in rotorcraft has received significant attention recently. Two classes of techniques have been proposed. The more developed approach works with harmonic analysis of measured time histories and is called the frequency-domain approach. The more recent approach computes the control input directly using the measured time history data and is called the time-domain approach. The report summarizes the results of a theoretical investigation to compare the two approaches. Five specific areas were addressed: (1) techniques to derive models needed for control design (system identification methods), (2) robustness with respect to errors, (3) transient response, (4) susceptibility to noise, and (5) implementation difficulties. The system identification methods are more difficult for the time-domain models. The time-domain approach is more robust (e.g., has higher gain and phase margins) than the frequency-domain approach. It might thus be possible to avoid doing real-time system identification in the time-domain approach by storing models at a number of flight conditions. The most significant error source is the variation in open-loop vibrations caused by pilot inputs, maneuvers or gusts. The implementation requirements are similar except that the time-domain approach can be much simpler to implement if real-time system identification were not necessary.

  2. Interference tables: a useful model for interference analysis in asynchronous multicarrier transmission

    NASA Astrophysics Data System (ADS)

    Medjahdi, Yahia; Terré, Michel; Ruyet, Didier Le; Roviras, Daniel

    2014-12-01

    In this paper, we investigate the impact of timing asynchronism on the performance of multicarrier techniques in a spectrum coexistence context. Two multicarrier schemes are considered: cyclic prefix-based orthogonal frequency division multiplexing (CP-OFDM) with a rectangular pulse shape and filter bank-based multicarrier (FBMC) with physical layer for dynamic spectrum access and cognitive radio (PHYDYAS) and isotropic orthogonal transform algorithm (IOTA) waveforms. First, we present the general concept of the so-called power spectral density (PSD)-based interference tables which are commonly used for multicarrier interference characterization in a spectrum sharing context. After highlighting the limits of this approach, we propose a new family of interference tables called 'instantaneous interference tables'. The proposed tables give the interference power caused by a given interfering subcarrier on a victim one, not only as a function of the spectral distance separating both subcarriers but also with respect to the timing misalignment between the subcarrier holders. In contrast to the PSD-based interference tables, the accuracy of the proposed tables has been validated through different simulation results. Furthermore, due to the better frequency localization of both PHYDYAS and IOTA waveforms, the FBMC technique is demonstrated to be more robust to timing asynchronism than the OFDM one. Such a result makes FBMC a potential candidate for the physical layer of future cognitive radio systems.

  3. Electroencephalography signatures of attention-deficit/hyperactivity disorder: clinical utility.

    PubMed

    Alba, Guzmán; Pereda, Ernesto; Mañas, Soledad; Méndez, Leopoldo D; González, Almudena; González, Julián J

    2015-01-01

    This work reviews the techniques and the most important results on the use of electroencephalography (EEG) to extract different measures that can be clinically useful for studying subjects with attention-deficit/hyperactivity disorder (ADHD). First, we discuss briefly and in simple terms the EEG analysis and processing techniques most used in the context of ADHD. We review techniques that both analyze individual EEG channels (univariate measures) and study the statistical interdependence between different EEG channels (multivariate measures), the so-called functional brain connectivity. Among the former ones, we review the classical indices of absolute and relative spectral power and estimations of the complexity of the channels, such as the approximate entropy and the Lempel-Ziv complexity. Among the latter ones, we focus on the magnitude squared coherence and on different measures based on the concept of generalized synchronization and its estimation in the state space. Second, from a historical point of view, we present the most important results achieved with these techniques and their clinical utility (sensitivity, specificity, and accuracy) to diagnose ADHD. Finally, we propose future research lines based on these results.
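
    As an example of one univariate measure named above, the sketch below estimates a simplified (LZ78-style) Lempel-Ziv complexity of a channel binarised around its median. The binarisation and normalisation choices are common conventions, not necessarily those of the reviewed studies.

```python
"""Sketch of a simplified (LZ78-style) Lempel-Ziv complexity for one EEG channel."""
import numpy as np

def lempel_ziv_complexity(signal):
    med = np.median(signal)
    s = "".join("1" if v > med else "0" for v in signal)   # binarise around median
    phrases, current = set(), ""
    for ch in s:                       # LZ78-style parse: count new phrases
        current += ch
        if current not in phrases:
            phrases.add(current)
            current = ""
    c = len(phrases) + (1 if current else 0)
    n = len(s)
    return c * np.log2(n) / n          # normalised so random sequences approach ~1

rng = np.random.default_rng(0)
t = np.arange(2048)
print(lempel_ziv_complexity(np.sin(2 * np.pi * t / 128)))  # regular signal: low value
print(lempel_ziv_complexity(rng.standard_normal(2048)))    # irregular signal: higher
```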

  4. Visualizing Ebolavirus Particles Using Single-Particle Interferometric Reflectance Imaging Sensor (SP-IRIS).

    PubMed

    Carter, Erik P; Seymour, Elif Ç; Scherr, Steven M; Daaboul, George G; Freedman, David S; Selim Ünlü, M; Connor, John H

    2017-01-01

    This chapter describes an approach for the label-free imaging and quantification of intact Ebola virus (EBOV) and EBOV viruslike particles (VLPs) using a light microscopy technique. In this technique, individual virus particles are captured onto a silicon chip that has been printed with spots of virus-specific capture antibodies. These captured virions are then detected using an optical approach called interference reflectance imaging. This approach allows for the detection of each virus particle that is captured on an antibody spot and can resolve the filamentous structure of EBOV VLPs without the need for electron microscopy. Capture of VLPs and virions can be done from a variety of sample types ranging from tissue culture medium to blood. The technique also allows automated quantitative analysis of the number of virions captured. This can be used to identify the virus concentration in an unknown sample. In addition, this technique offers the opportunity to easily image virions captured from native solutions without the need for additional labeling approaches while offering a means of assessing the range of particle sizes and morphologies in a quantitative manner.

  5. [Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].

    PubMed

    Golder, W

    1999-09-01

    To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62 and 49%, 32 and 41%, and 6 and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors make increasingly use of statistical experts' opinion and programs. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.

  6. Combining LCT tools for the optimization of an industrial process: material and energy flow analysis and best available techniques.

    PubMed

    Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares

    2011-09-15

    Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included in the scope of the LCT philosophy. That is the case of the material and energy flow analysis (MEFA), a tool derived from the industrial metabolism definition. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, the BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify the so-called improvable flows by MEFA, so that the appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. Fourteen improvable flows have been identified and seven candidate BATs have been proposed, aiming to reduce these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and selects the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. Big data in medical science--a biostatistical view.

    PubMed

    Binder, Harald; Blettner, Maria

    2015-02-27

    Inexpensive techniques for measurement and data storage now enable medical researchers to acquire far more data than can conveniently be analyzed by traditional methods. The expression "big data" refers to quantities on the order of magnitude of a terabyte (10^12 bytes); special techniques must be used to evaluate such huge quantities of data in a scientifically meaningful way. Whether data sets of this size are useful and important is an open question that currently confronts medical science. In this article, we give illustrative examples of the use of analytical techniques for big data and discuss them in the light of a selective literature review. We point out some critical aspects that should be considered to avoid errors when large amounts of data are analyzed. Machine learning techniques enable the recognition of potentially relevant patterns. When such techniques are used, certain additional steps should be taken that are unnecessary in more traditional analyses; for example, patient characteristics should be differentially weighted. If this is not done as a preliminary step before similarity detection, which is a component of many data analysis operations, characteristics such as age or sex will be weighted no higher than any one out of 10 000 gene expression values. Experience from the analysis of conventional observational data sets can be called upon to draw conclusions about potential causal effects from big data sets. Big data techniques can be used, for example, to evaluate observational data derived from the routine care of entire populations, with clustering methods used to analyze therapeutically relevant patient subgroups. Such analyses can provide complementary information to clinical trials of the classic type. As big data analyses become more popular, various statistical techniques for causality analysis in observational data are becoming more widely available. This is likely to be of benefit to medical science, but specific adaptations will have to be made according to the requirements of the applications.

  8. Structural zooming research and development of an interactive computer graphical interface for stress analysis of cracks

    NASA Technical Reports Server (NTRS)

    Gerstle, Walter

    1989-01-01

    Engineering problems sometimes involve the numerical solution of boundary value problems over domains containing geometric features with widely varying scales. Often, a detailed solution is required at one or more of these features. Small details in large structures may have profound effects upon global performance. Conversely, large-scale conditions may affect local performance. Many man-hours and CPU-hours are currently spent in modeling such problems. With the structural zooming technique, it is now possible to design an integrated program which allows the analyst to interactively focus upon a small region of interest, to modify the local geometry, and then to obtain highly accurate responses in that region which reflect both the properties of the overall structure and the local detail. A boundary integral equation analysis program, called BOAST, was recently developed for the stress analysis of cracks. This program can accurately analyze two-dimensional linear elastic fracture mechanics problems with far less computational effort than existing finite element codes. An interactive computer graphical interface to BOAST was written. The graphical interface would have several requirements: it would be menu-driven, with mouse input; all aspects of input would be entered graphically; the results of a BOAST analysis would be displayed pictorially, and the user would also be able to probe interactively to get numerical values of displacement and stress at desired locations within the analysis domain; the entire procedure would be integrated into a single, easy-to-use package; and it would be written using calls to the graphics package called HOOPS. The program is nearing completion. All of the preprocessing features are working satisfactorily and have been debugged. The postprocessing features are under development, and rudimentary postprocessing should be available by the end of the summer. The program was developed and run on a VAX workstation, and must be ported to the SUN workstation. This activity is currently underway.

  9. New fluorescence techniques for high-throughput drug discovery.

    PubMed

    Jäger, S; Brand, L; Eggeling, C

    2003-12-01

    The rapid increase of compound libraries as well as new targets emerging from the Human Genome Project require constant progress in pharmaceutical research. An important tool is High-Throughput Screening (HTS), which has evolved as an indispensable instrument in the pre-clinical target-to-IND (Investigational New Drug) discovery process. HTS requires machinery, which is able to test more than 100,000 potential drug candidates per day with respect to a specific biological activity. This calls for certain experimental demands especially with respect to sensitivity, speed, and statistical accuracy, which are fulfilled by using fluorescence technology instrumentation. In particular the recently developed family of fluorescence techniques, FIDA (Fluorescence Intensity Distribution Analysis), which is based on confocal single-molecule detection, has opened up a new field of HTS applications. This report describes the application of these new techniques as well as of common fluorescence techniques--such as confocal fluorescence lifetime and anisotropy--to HTS. It gives experimental examples and presents advantages and disadvantages of each method. In addition the most common artifacts (auto-fluorescence or quenching by the drug candidates) emerging from the fluorescence detection techniques are highlighted and correction methods for confocal fluorescence read-outs are presented, which are able to circumvent this deficiency.

  10. A Smoothing Technique for the Multifractal Analysis of a Medium Voltage Feeders Electric Current

    NASA Astrophysics Data System (ADS)

    de Santis, Enrico; Sadeghian, Alireza; Rizzi, Antonello

    2017-12-01

    The current paper presents a data-driven detrending technique that allows smoothing complex sinusoidal trends from a real-world electric load time series before applying Multifractal Detrended Fluctuation Analysis (MFDFA). The algorithm, which we call Smoothed Sort and Cut Fourier Detrending (SSC-FD), is based on a suitable smoothing of high-power periodicities, operating directly on the Fourier spectrum through a polynomial fitting of the DFT. The main aim is to disambiguate the characteristic slowly varying periodicities, which can impair the MFDFA analysis, from the residual signal in order to study its correlation properties. The algorithm's performance is evaluated on a simple benchmark test consisting of a persistent series with a known Hurst exponent and ten superimposed sinusoidal harmonics. Moreover, the behavior of the algorithm parameters is assessed by computing the MFDFA on the well-known sunspot data, whose correlation characteristics are reported in the literature. In both cases, the SSC-FD method eliminates the apparent crossover induced by the synthetic and natural periodicities. Results are compared with some existing detrending methods within the MFDFA paradigm. Finally, a study of the multifractal characteristics of the electric load time series detrended by the SSC-FD algorithm is provided, showing strong persistent behavior and an appreciable amplitude of the multifractal spectrum, which allows us to conclude that the series at hand has multifractal characteristics.
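
    The general idea, sorting Fourier bins by their excess power over a smooth polynomial baseline and cutting the strongest ones down to that baseline before inverting back to the time domain, can be sketched as follows. The polynomial order, the number of removed bins, and the synthetic example are illustrative; this is not the authors' SSC-FD implementation.

```python
"""Sketch of Fourier-domain detrending in the spirit of SSC-FD (illustrative only)."""
import numpy as np

def fourier_detrend(x, n_remove=10, poly_order=5):
    x = np.asarray(x, float)
    X = np.fft.rfft(x - x.mean())
    amp = np.abs(X)

    # Smooth baseline of the amplitude spectrum: polynomial fit in log-log space.
    freqs = np.arange(1, len(amp))
    coeff = np.polyfit(np.log(freqs), np.log(amp[1:] + 1e-12), poly_order)
    baseline = np.exp(np.polyval(coeff, np.log(freqs)))

    # Sort bins by excess power over the baseline and cut the strongest ones
    # down to the baseline level (phases are preserved).
    excess = amp[1:] / baseline
    worst = np.argsort(excess)[-n_remove:] + 1
    X[worst] *= baseline[worst - 1] / amp[worst]
    return np.fft.irfft(X, n=len(x))

# Persistent noise plus strong sinusoidal "load" periodicities.
rng = np.random.default_rng(2)
t = np.arange(4096)
series = np.cumsum(rng.standard_normal(t.size)) * 0.05
series += 3 * np.sin(2 * np.pi * t / 24) + 2 * np.sin(2 * np.pi * t / 168)
residual = fourier_detrend(series)
print(round(float(np.std(series)), 2), round(float(np.std(residual)), 2))
```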

  11. [Marketing mix in a radiology department: challenges for future radiologists in management].

    PubMed

    Claikens, B

    1998-08-01

    Radiology has gained an enviable position among medical specialties. Developments in new technology expand its horizons, and the volume of radiologic imaging techniques and procedures increases far more than the overall growth in health care services. In this position radiology has become a prime target for restrictions, cutbacks, and controlled financing in an era of managed care and new national health care policy based on partially fixed budgets. Future health care providers have to choose the best available diagnostic and therapeutic techniques. Evidence-based medicine, cost-utility analysis, diagnostic performance analysis, patient outcome analysis, technology assessment and guidelines for practice are means to guide us through our obligatory choice. Our major objective is to use the most performant available imaging technique or intervention to achieve the best possible outcome for our patient at the lowest possible cost. A strategic response from radiologists is required to meet the imperatives of this new management situation. They must do far more than interpret imaging procedures. They must work as efficient managers of imaging resources, organise their practices and define their marketing strategies using the different so-called marketing-mix elements. The challenges will be great but the rewards are worth our best efforts. In this article we highlight the marketing responsibilities of future radiologists and their clinical practice in this new socio-economic environment and we present different useful marketing tools.

  12. The pearls of using real-world evidence to discover social groups

    NASA Astrophysics Data System (ADS)

    Cardillo, Raymond A.; Salerno, John J.

    2005-03-01

    In previous work, we introduced a new paradigm called Uni-Party Data Community Generation (UDCG) and a new methodology to discover social groups (a.k.a. community models) called Link Discovery based on Correlation Analysis (LDCA). We further advanced this work by experimenting with a corpus of evidence obtained from a Ponzi scheme investigation. That work identified several UDCG algorithms, developed what we called "Importance Measures" to compare the accuracy of the algorithms based on ground truth, and presented a Concept of Operations (CONOPS) that criminal investigators could use to discover social groups. However, that work used a rather small random sample of manually edited documents because the evidence contained far too many OCR and other extraction errors. Deferring the evidence extraction errors allowed us to continue experimenting with UDCG algorithms, but meant using only a small fraction of the available evidence. In an attempt to discover techniques that are more practical in the near term, our most recent work focuses on being able to use an entire corpus of real-world evidence to discover social groups. This paper discusses the complications of extracting evidence, suggests a method of performing name resolution, presents a new UDCG algorithm, and discusses our future direction in this area.

  13. A short review of variants calling for single-cell-sequencing data with applications.

    PubMed

    Wei, Zhuohui; Shu, Chang; Zhang, Changsheng; Huang, Jingying; Cai, Hongmin

    2017-11-01

    The field of single-cell sequencing is rapidly expanding, and many techniques have been developed in the past decade. With this technology, biologists can study not only the heterogeneity between two adjacent cells in the same tissue or organ, but also the evolutionary relationships and degenerative processes in a single cell. Calling variants is the main purpose of analyzing single-cell sequencing (SCS) data. Currently, some popular methods used for bulk-cell-sequencing data analysis are tailored directly to deal with SCS data. However, SCS requires an extra step of genome amplification to accumulate enough material to satisfy sequencing needs. The amplification yields large biases and thus raises challenges for using the bulk-cell-sequencing methods. In order to provide guidance for the development of specialized analysis methods as well as the use of currently developed tools for SCS, this paper aims to bridge the gap. In this paper, we first introduced two popular genome amplification methods and compared their capabilities. Then we introduced a few popular models for calling single-nucleotide polymorphisms and copy-number variations. Finally, breakthrough applications of SCS were summarized to demonstrate its potential in researching cell evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Topics in the Detection of Gravitational Waves from Compact Binary Inspirals

    NASA Astrophysics Data System (ADS)

    Kapadia, Shasvath Jagat

    Orbiting compact binaries - such as binary black holes, binary neutron stars and neutron star-black hole binaries - are among the most promising sources of gravitational waves observable by ground-based interferometric detectors. Despite numerous sophisticated engineering techniques, the gravitational wave signals will be buried deep within noise generated by various instrumental and environmental processes, and need to be extracted via a signal processing technique referred to as matched filtering. Matched filtering requires large banks of signal templates that are faithful representations of the true gravitational waveforms produced by astrophysical binaries. The accurate and efficient production of templates is thus crucial to the success of signal processing and data analysis. To that end, the dissertation presents a numerical technique that calibrates existing analytical (Post-Newtonian) waveforms, which are relatively inexpensive, to more accurate fiducial waveforms that are computationally expensive to generate. The resulting waveform family is significantly more accurate than the analytical waveforms, without incurring additional computational costs of production. Certain kinds of transient background noise artefacts, called "glitches", can masquerade as gravitational wave signals for short durations and throw off the matched-filter algorithm. Distinguishing glitches from true gravitational wave signals is a highly non-trivial exercise in data analysis which has been attempted with varying degrees of success. We present here a machine-learning based approach that exploits the various attributes of glitches and signals within detector data to provide a classification scheme that is a significant improvement over previous methods. The dissertation concludes by investigating the possibility of detecting a non-linear DC imprint, called the Christodoulou memory, produced in the arms of ground-based interferometers by the recently detected gravitational waves. The memory, which is even smaller in amplitude than the primary (detected) gravitational waves, will almost certainly not be seen in the current detection event. Nevertheless, future space-based detectors will likely be sensitive enough to observe the memory.
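
    A minimal matched-filter sketch is shown below for the white-noise case: the data stream is correlated with a unit-norm template, and the peak of the output gives the signal-to-noise ratio and time of arrival. Real searches filter in the frequency domain with noise-PSD weighting over large template banks; the chirp-like template here is purely illustrative.

```python
"""Minimal matched-filter sketch for white noise (simplified illustration)."""
import numpy as np

def matched_filter_snr(data, template):
    template = template / np.sqrt(np.sum(template ** 2))   # unit-norm template
    return np.correlate(data, template, mode="valid")       # sliding inner product

rng = np.random.default_rng(3)
n, sigma = 4096, 1.0
t = np.linspace(0, 1, 256)
template = np.sin(2 * np.pi * (20 * t + 40 * t ** 2)) * np.hanning(t.size)  # chirp-like

data = rng.normal(0, sigma, n)
data[1500:1500 + template.size] += template      # signal buried at the noise level

snr = matched_filter_snr(data, template) / sigma
print(int(np.argmax(np.abs(snr))), float(np.abs(snr).max()))  # peak near 1500, SNR >> 1
```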

  15. Perspectives: A Challenging Patriotism

    ERIC Educational Resources Information Center

    Boyte, Harry C.

    2012-01-01

    In a time of alarm about the poisoning of electoral politics, public passions inflamed by sophisticated techniques of mass polarization, and fears that the country is losing control of its collective future, higher education is called upon to take leadership in "reinventing citizenship." It needs to respond to that call on a scale unprecedented in…

  16. Teaching Free Expression in Word and Example (Commentary).

    ERIC Educational Resources Information Center

    Merrill, John

    1991-01-01

    Suggests that the teaching of free expression may be the highest calling of a communications or journalism professor. Argues that freedom must be tempered by a sense of ethics. Calls upon teachers to encourage students to analyze the questions surrounding free expression. Describes techniques for scrutinizing journalistic myths. (SG)

  17. Hands-free human-machine interaction with voice

    NASA Astrophysics Data System (ADS)

    Juang, B. H.

    2004-05-01

    Voice is a natural communication interface between a human and a machine. The machine, when placed in today's communication networks, may be configured to provide automation to save substantial operating cost, as demonstrated in AT&T's VRCP (Voice Recognition Call Processing), or to facilitate intelligent services, such as virtual personal assistants, to enhance individual productivity. These intelligent services often need to be accessible anytime, anywhere (e.g., in cars when the user is in a hands-busy-eyes-busy situation or during meetings where constantly talking to a microphone is either undesirable or impossible), and thus call for advanced signal processing and automatic speech recognition techniques which support what we call "hands-free" human-machine communication. These techniques entail a broad spectrum of technical ideas, ranging from use of directional microphones and acoustic echo cancellation to robust speech recognition. In this talk, we highlight a number of key techniques that were developed for hands-free human-machine communication in the mid-1990s after Bell Labs became a unit of Lucent Technologies. A video clip will be played to demonstrate the accomplishment.

  18. Evaluation of architectures for an ASP MPEG-4 decoder using a system-level design methodology

    NASA Astrophysics Data System (ADS)

    Garcia, Luz; Reyes, Victor; Barreto, Dacil; Marrero, Gustavo; Bautista, Tomas; Nunez, Antonio

    2005-06-01

    Trends in multimedia consumer electronics, digital video and audio, aim to reach users through low-cost mobile devices connected to data broadcasting networks with limited bandwidth. An emergent broadcasting network is the digital audio broadcasting network (DAB) which provides CD quality audio transmission together with robustness and efficiency techniques to allow good quality reception in motion conditions. This paper focuses on the system-level evaluation of different architectural options to allow low bandwidth digital video reception over DAB, based on video compression techniques. Profiling and design space exploration techniques are applied over the ASP MPEG-4 decoder in order to find out the best HW/SW partition given the application and platform constraints. An innovative SystemC-based system-level design tool, called CASSE, is being used for modelling, exploration and evaluation of different ASP MPEG-4 decoder HW/SW partitions. System-level trade offs and quantitative data derived from this analysis are also presented in this work.

  19. Results from the MACHO Galactic Pixel Lensing Search

    NASA Astrophysics Data System (ADS)

    Drake, Andrew J.; Minniti, Dante; Alcock, Charles; Allsman, Robyn A.; Alves, David; Axelrod, Tim S.; Becker, Andrew C.; Bennett, David; Cook, Kem H.; Freeman, Ken C.; Griest, Kim; Lehner, Matt; Marshall, Stuart; Peterson, Bruce; Pratt, Mark; Quinn, Peter; Rodgers, Alex; Stubbs, Chris; Sutherland, Will; Tomaney, Austin; Vandehei, Thor; Welch, Doug L.

    The MACHO, EROS, OGLE and AGAPE collaborations have been studying the nature of the Galactic halo for a number of years using microlensing events. The MACHO group undertakes observations of the LMC, SMC and Galactic Bulge, monitoring the light curves of millions of stars to detect microlensing. Most of these fields are crowded to the extent that all the monitored stars are blended. Such crowding makes the performance of accurate photometry difficult. We apply the new technique of Difference Image Analysis (DIA) on archival data to improve the photometry and increase both the detection sensitivity and effective search area. The application of this technique also allows us to detect so-called 'pixel lensing' events. These are microlensing events where the source star is only detectable during lensing. The detection of these events will allow a large increase in the number of detected microlensing events. We present a light curve demonstrating the detection of a pixel lensing event with this technique.

  20. Wavelength resolved neutron transmission analysis to identify single crystal particles in historical metallurgy

    NASA Astrophysics Data System (ADS)

    Barzagli, E.; Grazzi, F.; Salvemini, F.; Scherillo, A.; Sato, H.; Shinohara, T.; Kamiyama, T.; Kiyanagi, Y.; Tremsin, A.; Zoppi, Marco

    2014-07-01

    The phase composition and the microstructure of four ferrous Japanese arrows of the Edo period (17th-19th century) have been determined through two complementary neutron techniques: position-sensitive wavelength-resolved neutron transmission analysis (PS-WRNTA) and time-of-flight neutron diffraction (ToF-ND). The standard ToF-ND technique was applied using the INES diffractometer at the ISIS pulsed neutron source in the UK, while the innovative PS-WRNTA measurements were performed at the J-PARC neutron source on the BL-10 NOBORU beamline using a high-spatial-resolution, high-time-resolution neutron imaging detector. With ToF-ND we were able to obtain information about the quantitative distribution of the metal and non-metal phases, the texture level, the strain level and the domain size of each of the samples, which are important parameters for gaining knowledge about the technological level of the Japanese weapons. Starting from this base of data, the more complex PS-WRNTA was applied to the same samples. This experimental technique exploits the presence of the so-called Bragg edges, in the time-of-flight spectrum of neutrons transmitted through crystalline materials, to map the microstructural properties of samples. The two techniques are non-invasive and can be easily applied to archaeometry for an accurate microstructure mapping of metal and ceramic artifacts.

  1. Optimization of the Divergent method for genotyping single nucleotide variations using SYBR Green-based single-tube real-time PCR.

    PubMed

    Gentilini, Fabio; Turba, Maria E

    2014-01-01

    A novel technique, called Divergent, for single-tube real-time PCR genotyping of point mutations without the use of fluorescently labeled probes has recently been reported. This novel PCR technique utilizes a set of four primers and a particular denaturation temperature to simultaneously amplify two different amplicons which extend in opposite directions from the point mutation. The two amplicons can readily be detected using melt curve analysis downstream of a closed-tube real-time PCR. In the present study, some critical aspects of the original method were specifically addressed to further develop the technique for genotyping the DNM1 c.G767T mutation responsible for exercise-induced collapse in Labrador retriever dogs. The improved Divergent assay was easily set up using a standard two-step real-time PCR protocol. The melting temperature difference between the mutated and the wild-type amplicons was approximately 5°C, which could be promptly detected by all the thermal cyclers. The upgraded assay yielded accurate results with 157 pg of genomic DNA per reaction. This optimized technique represents a flexible and inexpensive alternative to the minor groove binder fluorescently labeled method and to high-resolution melt analysis for high-throughput, robust and cheap genotyping of single nucleotide variations. Copyright © 2014 Elsevier B.V. All rights reserved.
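
    A minimal sketch of the genotype-calling step only, assuming hypothetical melt-peak temperatures and tolerance; the abstract states only that the two amplicons differ by roughly 5°C, and peak detection from the raw melt curve is not shown.

      # Hypothetical melt-peak temperatures; the abstract only states that the
      # mutant and wild-type amplicons differ by roughly 5 degrees C.
      WT_TM, MUT_TM, TOL = 82.0, 77.0, 1.0    # degrees C (illustrative values)

      def call_genotype(peak_temps):
          """Call the DNM1 c.G767T genotype from the melt-peak temperatures
          observed in one closed-tube reaction (simplified)."""
          has_wt = any(abs(t - WT_TM) <= TOL for t in peak_temps)
          has_mut = any(abs(t - MUT_TM) <= TOL for t in peak_temps)
          if has_wt and has_mut:
              return "heterozygous"
          if has_mut:
              return "mutant homozygous"
          if has_wt:
              return "wild-type"
          return "no call"

      print(call_genotype([82.1]))         # wild-type
      print(call_genotype([77.3, 81.8]))   # heterozygous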

  2. Sparse dictionary learning for resting-state fMRI analysis

    NASA Astrophysics Data System (ADS)

    Lee, Kangjoo; Han, Paul Kyu; Ye, Jong Chul

    2011-09-01

    Recently, there has been increased interest in the use of neuroimaging techniques to investigate what happens in the brain at rest. Functional imaging studies have revealed that default-mode network activity is disrupted in Alzheimer's disease (AD). However, there is as yet no consensus on the choice of analysis method for the application of resting-state analysis to disease classification. This paper proposes a novel compressed sensing based resting-state fMRI analysis tool called Sparse-SPM. As the brain's functional systems have been shown to have features of complex networks according to graph theoretical analysis, we apply a graph model to represent a sparse combination of information flows from a complex-network perspective. In particular, a new concept of a spatially adaptive design matrix is proposed, implemented through sparse dictionary learning. The proposed approach shows better performance than other conventional methods, such as independent component analysis (ICA) and the seed-based approach, in classifying AD patients from normal controls using resting-state analysis.
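
    This is not the Sparse-SPM implementation, but a minimal sketch of the underlying idea using scikit-learn's DictionaryLearning on a synthetic voxels-by-time matrix; the matrix orientation, atom count, iteration limit and sparsity penalty are illustrative assumptions.

      import numpy as np
      from sklearn.decomposition import DictionaryLearning

      # Toy resting-state data: rows are time points, columns are voxels.
      rng = np.random.default_rng(1)
      n_time, n_voxels, n_atoms = 120, 500, 10
      X = rng.normal(size=(n_time, n_voxels))

      # Learn a dictionary of temporal atoms; each voxel's time course is then
      # represented as a sparse combination of these atoms (a simplified form
      # of the "spatially adaptive design matrix" idea).
      dl = DictionaryLearning(n_components=n_atoms, alpha=1.0, max_iter=50,
                              transform_algorithm="lasso_lars", random_state=0)
      codes = dl.fit_transform(X.T)         # one sparse code per voxel
      dictionary = dl.components_           # shape (n_atoms, n_time)
      print(codes.shape, dictionary.shape)  # (500, 10) (10, 120)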

  3. Comparing Noun Phrasing Techniques for Use with Medical Digital Library Tools.

    ERIC Educational Resources Information Center

    Tolle, Kristin M.; Chen, Hsinchun

    2000-01-01

    Describes a study that investigated the use of a natural language processing technique called noun phrasing to determine whether it is a viable technique for medical information retrieval. Evaluates four noun phrase generation tools for their ability to isolate noun phrases from medical journal abstracts, focusing on precision and recall.…
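
    A minimal sketch of the two evaluation metrics named in the record, precision and recall, computed over sets of extracted versus reference noun phrases; the example phrases are invented.

      def precision_recall(extracted, reference):
          """Precision = correct extractions / all extractions;
          recall = correct extractions / all reference phrases."""
          extracted, reference = set(extracted), set(reference)
          tp = len(extracted & reference)
          precision = tp / len(extracted) if extracted else 0.0
          recall = tp / len(reference) if reference else 0.0
          return precision, recall

      # Illustrative (invented) phrases from a medical abstract.
      gold = {"myocardial infarction", "beta blockers", "left ventricle"}
      tool_output = {"myocardial infarction", "beta blockers", "infarction risk"}
      print(precision_recall(tool_output, gold))  # (0.667, 0.667) approximately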

  4. Underwater Photo-Elicitation: A New Experiential Marine Education Technique

    ERIC Educational Resources Information Center

    Andrews, Steve; Stocker, Laura; Oechel, Walter

    2018-01-01

    Underwater photo-elicitation is a novel experiential marine education technique that combines direct experience in the marine environment with the use of digital underwater cameras. A program called Show Us Your Ocean! (SUYO!) was created, utilising a mixed methodology (qualitative and quantitative methods) to test the efficacy of this technique.…

  5. Q-Technique and Graphics Research.

    ERIC Educational Resources Information Center

    Kahle, Roger R.

    Because Q-technique is as appropriate for use with visual and design items as for use with words, it is not stymied by the topics one is likely to encounter in graphics research. In particular Q-technique is suitable for studying the so-called "congeniality" of typography, for various copytesting usages, and for multivariate graphics research. The…

  6. Writing with Basals: A Sentence Combining Approach to Comprehension.

    ERIC Educational Resources Information Center

    Reutzel, D. Ray; Merrill, Jimmie D.

    Sentence combining techniques can be used with basal readers to help students develop writing skills. The first technique is addition, characterized by using the connecting word "and" to join two or more base sentences together. The second technique is called "embedding," and is characterized by putting parts of two or more base sentences together…

  7. Singularity-sensitive gauge-based radar rainfall adjustment methods for urban hydrological applications

    NASA Astrophysics Data System (ADS)

    Wang, L.-P.; Ochoa-Rodríguez, S.; Onof, C.; Willems, P.

    2015-09-01

    Gauge-based radar rainfall adjustment techniques have been widely used to improve the applicability of radar rainfall estimates to large-scale hydrological modelling. However, their use for urban hydrological applications is limited as they were mostly developed based upon Gaussian approximations and therefore tend to smooth out so-called "singularities" (features of a non-Gaussian field) that can be observed in the fine-scale rainfall structure. Overlooking the singularities could be critical, given that their distribution is highly consistent with that of local extreme magnitudes. This deficiency may cause large errors in the subsequent urban hydrological modelling. To address this limitation and improve the applicability of adjustment techniques at urban scales, a method is proposed herein which incorporates a local singularity analysis into existing adjustment techniques and allows the preservation of the singularity structures throughout the adjustment process. In this paper the proposed singularity analysis is incorporated into the Bayesian merging technique and the performance of the resulting singularity-sensitive method is compared with that of the original Bayesian (non-singularity-sensitive) technique and the commonly used mean field bias adjustment. This test is conducted using as a case study four storm events observed in the Portobello catchment (53 km²) (Edinburgh, UK) during 2011, for which radar estimates, dense rain gauge and sewer flow records, as well as a recently calibrated urban drainage model were available. The results suggest that, in general, the proposed singularity-sensitive method can effectively preserve the non-normality in the local rainfall structure, while retaining the ability of the original adjustment techniques to generate nearly unbiased estimates. Moreover, the ability of the singularity-sensitive technique to preserve the non-normality in rainfall estimates often leads to better reproduction of the urban drainage system's dynamics, particularly of peak runoff flows.

  8. Analysis of spectra using correlation functions

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Norton, Robert H.

    1988-01-01

    A novel method is presented for the quantitative analysis of spectra based on the properties of the cross correlation between a real spectrum and either a numerical synthesis or a laboratory simulation. A new goodness-of-fit criterion called the heteromorphic coefficient H is proposed that has the property of being zero when a fit is achieved and varying smoothly through zero as the iteration proceeds, providing a powerful tool for automatic or near-automatic analysis. It is also shown that H can be rendered substantially noise-immune, permitting the analysis of very weak spectra well below the apparent noise level and, as a byproduct, providing Doppler shift and radial velocity information with excellent precision. The technique is in regular use in the Atmospheric Trace Molecule Spectroscopy (ATMOS) project and operates in an interactive, real-time computing environment with turn-around times of a few seconds or less.

  9. Analysis of Mouse Growth Plate Development

    PubMed Central

    Mangiavini, Laura; Merceron, Christophe; Schipani, Ernestina

    2016-01-01

    To investigate skeletal development, pathophysiological mechanisms of cartilage and bone disease, and eventually assess innovative treatments, the mouse is a very important resource. During embryonic development, mesenchymal condensations are formed, and cells within these mesenchymal condensations either directly differentiate into osteoblasts and give origin to intramembranous bone, or differentiate into chondrocytes and form a cartilaginous anlage. The cartilaginous anlage or fetal growth plate is then replaced with bone. This process is also called endochondral bone development, and it is responsible for the generation of most of our skeleton. In this Review, we will discuss in detail the most common in vivo and in vitro techniques our laboratory is currently using for the analysis of the mouse fetal growth plate during development. PMID:26928664

  10. Quantitative image analysis for investigating cell-matrix interactions

    NASA Astrophysics Data System (ADS)

    Burkel, Brian; Notbohm, Jacob

    2017-07-01

    The extracellular matrix provides both chemical and physical cues that control cellular processes such as migration, division, differentiation, and cancer progression. Cells can mechanically alter the matrix by applying forces that result in matrix displacements, which in turn may localize to form dense bands along which cells may migrate. To quantify the displacements, we use confocal microscopy and fluorescent labeling to acquire high-contrast images of the fibrous material. Using a technique for quantitative image analysis called digital volume correlation, we then compute the matrix displacements. Our experimental technology offers a means to quantify matrix mechanics and cell-matrix interactions. We are now using these experimental tools to modulate mechanical properties of the matrix to study cell contraction and migration.
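
    A minimal sketch of the idea behind digital volume correlation, assuming integer-voxel displacements and a brute-force search for a single subvolume; practical DVC adds subvoxel interpolation and processes many subvolumes across the field of view.

      import numpy as np

      def subvolume_displacement(ref, deformed, corner, size, search=5):
          """Estimate the integer-voxel displacement of one subvolume by
          maximizing the normalized cross-correlation over a small search
          window (brute force, for illustration only)."""
          z, y, x = corner
          t = ref[z:z+size, y:y+size, x:x+size]
          t = (t - t.mean()) / t.std()
          best, best_shift = -np.inf, (0, 0, 0)
          for dz in range(-search, search + 1):
              for dy in range(-search, search + 1):
                  for dx in range(-search, search + 1):
                      w = deformed[z+dz:z+dz+size, y+dy:y+dy+size, x+dx:x+dx+size]
                      w = (w - w.mean()) / w.std()
                      ncc = np.mean(t * w)
                      if ncc > best:
                          best, best_shift = ncc, (dz, dy, dx)
          return best_shift

      # Synthetic test: a random volume standing in for the fibrous matrix,
      # shifted by (2, -1, 3) voxels.
      rng = np.random.default_rng(2)
      ref = rng.normal(size=(48, 48, 48))
      deformed = np.roll(ref, shift=(2, -1, 3), axis=(0, 1, 2))
      print(subvolume_displacement(ref, deformed, corner=(16, 16, 16), size=12))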

  11. Raman spectroscopy and immunohistochemistry for schwannoma characterization: a case study

    NASA Astrophysics Data System (ADS)

    Neto, Lazaro P. M.; das Chagas, Maurilio J.; Carvalho, Luis Felipe C. S.; Ferreira, Isabelle; dos Santos, Laurita; Haddad, Marcelo; Loddi, Vinicius; Martin, Airton A.

    2016-03-01

    A schwannoma is a tumour of the tissue that covers nerves, called the nerve sheath. Schwannomas are often benign tumors of Schwann cells, which are the principal glia of the peripheral nervous system (PNS). Preoperative diagnosis of this lesion is usually difficult; therefore, new techniques are being studied for presurgical evaluation. Among these, Raman spectroscopy, which enables the biochemical identification of the analyzed tissue through its optical properties, may be used as a tool for schwannoma diagnosis. The aim of this study was to discriminate between normal nervous tissue and schwannoma through confocal Raman spectroscopy and Raman optical fiber-based techniques combined with immunohistochemical analysis. Twenty spectra were analyzed, from a normal nerve tissue sample (10) and a schwannoma (10), acquired using a Holospec f/1.8 (Kayser Optical Systems) coupled to an optical fiber with a 785 nm laser line source. The data were pre-processed and vector normalized. Mean and standard deviation analysis was performed, together with cluster analysis. AML, 1A4, CD34, Desmin and S-100 protein markers were used for the immunohistochemical analysis. Immunohistochemistry was positive only for the S-100 protein marker, which confirmed the neural origin of the schwannoma. The immunohistochemical analysis was important for determining the source of the lesion, whereas Raman spectroscopy was able to differentiate tissue types, indicating important biochemical changes between normal tissue and the benign neoplasm.

  12. Robust detection of chromosomal interactions from small numbers of cells using low-input Capture-C

    PubMed Central

    Oudelaar, A. Marieke; Davies, James O.J.; Downes, Damien J.; Higgs, Douglas R.

    2017-01-01

    Abstract Chromosome conformation capture (3C) techniques are crucial to understanding tissue-specific regulation of gene expression, but current methods generally require large numbers of cells. This hampers the investigation of chromatin architecture in rare cell populations. We present a new low-input Capture-C approach that can generate high-quality 3C interaction profiles from 10 000–20 000 cells, depending on the resolution used for analysis. We also present a PCR-free, sequencing-free 3C technique based on NanoString technology called C-String. By comparing C-String and Capture-C interaction profiles we show that the latter are not skewed by PCR amplification. Furthermore, we demonstrate that chromatin interactions detected by Capture-C do not depend on the degree of cross-linking by performing experiments with varying formaldehyde concentrations. PMID:29186505

  13. The Electron Microscopy Outreach Program: A Web-based resource for research and education.

    PubMed

    Sosinsky, G E; Baker, T S; Hand, G; Ellisman, M H

    1999-01-01

    We have developed a centralized World Wide Web (WWW)-based environment that serves as a resource of software tools and expertise for biological electron microscopy. A major focus is molecular electron microscopy, but the site also includes information and links on structural biology at all levels of resolution. This site serves to help integrate or link structural biology techniques in accordance with user needs. The WWW site, called the Electron Microscopy (EM) Outreach Program (URL: http://emoutreach.sdsc.edu), provides scientists with computational and educational tools for their research and edification. In particular, we have set up a centralized resource containing course notes, references, and links to image analysis and three-dimensional reconstruction software for investigators wanting to learn about EM techniques either within or outside of their fields of expertise. Copyright 1999 Academic Press.

  14. Improving high resolution retinal image quality using speckle illumination HiLo imaging

    PubMed Central

    Zhou, Xiaolin; Bedggood, Phillip; Metha, Andrew

    2014-01-01

    Retinal image quality from flood illumination adaptive optics (AO) ophthalmoscopes is adversely affected by out-of-focus light scatter due to the lack of confocality. This effect is more pronounced in small eyes, such as that of rodents, because the requisite high optical power confers a large dioptric thickness to the retina. A recently-developed structured illumination microscopy (SIM) technique called HiLo imaging has been shown to reduce the effect of out-of-focus light scatter in flood illumination microscopes and produce pseudo-confocal images with significantly improved image quality. In this work, we adopted the HiLo technique to a flood AO ophthalmoscope and performed AO imaging in both (physical) model and live rat eyes. The improvement in image quality from HiLo imaging is shown both qualitatively and quantitatively by using spatial spectral analysis. PMID:25136486
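
    A simplified sketch of the HiLo fusion idea, assuming the local contrast of the speckle frame can serve directly as an optical-sectioning weight; the published algorithm additionally band-passes the difference image and chooses the scaling factor eta to match the two bands at the crossover frequency, so the parameters below are illustrative only.

      import numpy as np
      from scipy.ndimage import gaussian_filter, uniform_filter

      def hilo(uniform_img, speckle_img, sigma=4.0, win=7, eta=1.0):
          """Simplified HiLo fusion: low frequencies come from the uniform image
          weighted by local speckle contrast (which drops for out-of-focus
          light); high frequencies come from the uniform image itself."""
          diff = speckle_img - uniform_img
          mean_d = uniform_filter(diff, win)
          var_d = uniform_filter(diff ** 2, win) - mean_d ** 2
          contrast = np.sqrt(np.clip(var_d, 0, None))
          contrast /= uniform_filter(uniform_img, win) + 1e-9
          lo = gaussian_filter(contrast * uniform_img, sigma)      # sectioned low-pass
          hi = uniform_img - gaussian_filter(uniform_img, sigma)   # complementary high-pass
          return eta * lo + hi

      # Usage with synthetic frames standing in for AO ophthalmoscope data.
      rng = np.random.default_rng(3)
      uniform_img = rng.random((128, 128))
      speckle_img = uniform_img * (1 + 0.5 * rng.standard_normal((128, 128)))
      fused = hilo(uniform_img, speckle_img)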

  15. Improving high resolution retinal image quality using speckle illumination HiLo imaging.

    PubMed

    Zhou, Xiaolin; Bedggood, Phillip; Metha, Andrew

    2014-08-01

    Retinal image quality from flood illumination adaptive optics (AO) ophthalmoscopes is adversely affected by out-of-focus light scatter due to the lack of confocality. This effect is more pronounced in small eyes, such as that of rodents, because the requisite high optical power confers a large dioptric thickness to the retina. A recently-developed structured illumination microscopy (SIM) technique called HiLo imaging has been shown to reduce the effect of out-of-focus light scatter in flood illumination microscopes and produce pseudo-confocal images with significantly improved image quality. In this work, we adopted the HiLo technique to a flood AO ophthalmoscope and performed AO imaging in both (physical) model and live rat eyes. The improvement in image quality from HiLo imaging is shown both qualitatively and quantitatively by using spatial spectral analysis.

  16. Evaluation of concrete cover by surface wave technique: Identification procedure

    NASA Astrophysics Data System (ADS)

    Piwakowski, Bogdan; Kaczmarek, Mariusz; Safinowski, Paweł

    2012-05-01

    Concrete cover degradation is induced by aggressive agents in the environment, such as moisture, chemicals or temperature variations. Due to degradation, a thin (a few millimeters thick) surface layer usually has a porosity slightly higher than that of the deeper, sound material. Non-destructive evaluation of the concrete cover is vital to monitor the integrity of concrete structures and prevent their irreversible damage. In this paper the methodology of the classical technique used for ground structure recovery, called Multichannel Analysis of Surface Waves, is discussed as an NDT tool in the civil engineering domain to characterize the concrete cover. In order to obtain the velocity as a function of depth, the dispersion of surface waves is used as an input for solving the inverse problem. The paper describes the inversion procedure and provides a practical example of the use of the developed system.

  17. In situ characterization of natural pyrite bioleaching using electrochemical noise technique

    NASA Astrophysics Data System (ADS)

    Chen, Guo-bao; Yang, Hong-ying; Li, Hai-jun

    2016-02-01

    An in situ characterization technique called electrochemical noise (ECN) was used to investigate the bioleaching of natural pyrite. ECN experiments were conducted in four active systems (sulfuric acid, ferric-ion, 9k culture medium, and bioleaching solutions). The ECN data were analyzed in both the time and frequency domains. Spectral noise impedance spectra obtained from power spectral density (PSD) plots for the different systems were compared. A reaction mechanism was also proposed on the basis of the experimental data analysis. The bioleaching system exhibits the lowest noise resistance, 0.101 MΩ. The bioleaching of natural pyrite is considered to be a bio-battery reaction, which distinguishes it from the chemical oxidation reactions in the ferric-ion and culture-medium (9k) solutions. The corrosion of the pyrite becomes more severe over the course of long-term bioleaching testing.
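
    A minimal sketch of the standard time- and frequency-domain ECN quantities mentioned above (noise resistance and spectral noise impedance), computed here on synthetic records rather than measured pyrite data; amplitudes, sampling rate and segment length are illustrative.

      import numpy as np
      from scipy.signal import welch

      def ecn_analysis(potential, current, fs):
          """Standard ECN quantities: time-domain noise resistance
          Rn = std(E)/std(I) and spectral noise impedance
          Rsn(f) = sqrt(PSD_E(f)/PSD_I(f)) from Welch power spectral densities."""
          rn = np.std(potential) / np.std(current)
          f, psd_e = welch(potential, fs=fs, nperseg=1024)
          _, psd_i = welch(current, fs=fs, nperseg=1024)
          return rn, f, np.sqrt(psd_e / psd_i)

      # Synthetic records standing in for measured potential/current noise.
      rng = np.random.default_rng(4)
      fs, n = 10.0, 4096                               # 10 Hz sampling
      potential = 2e-4 * rng.standard_normal(n)        # volts
      current = 1e-9 * rng.standard_normal(n)          # amperes
      rn, f, rsn = ecn_analysis(potential, current, fs)
      print(f"noise resistance Rn = {rn / 1e6:.3f} MOhm")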

  18. An implementation and performance measurement of the progressive retry technique

    NASA Technical Reports Server (NTRS)

    Suri, Gaurav; Huang, Yennun; Wang, Yi-Min; Fuchs, W. Kent; Kintala, Chandra

    1995-01-01

    This paper describes a recovery technique called progressive retry for bypassing software faults in message-passing applications. The technique is implemented as reusable modules to provide application-level software fault tolerance. The paper describes the implementation of the technique and presents results from the application of progressive retry to two telecommunications systems. The results presented show that the technique is helpful in reducing the total recovery time for message-passing applications.

  19. Validation of a new technique to detect Cryptosporidium spp. oocysts in bovine feces.

    PubMed

    Inácio, Sandra Valéria; Gomes, Jancarlo Ferreira; Oliveira, Bruno César Miranda; Falcão, Alexandre Xavier; Suzuki, Celso Tetsuo Nagase; Dos Santos, Bianca Martins; de Aquino, Monally Conceição Costa; de Paula Ribeiro, Rafaela Silva; de Assunção, Danilla Mendes; Casemiro, Pamella Almeida Freire; Meireles, Marcelo Vasconcelos; Bresciani, Katia Denise Saraiva

    2016-11-01

    Cryptosporidiosis arouses strong interest in the scientific community due to its important zoonotic potential, although it was initially considered a rare and opportunistic disease. The parasitological diagnosis of the causative agent of this disease, the protozoan Cryptosporidium spp., requires the use of specific concentration and permanent staining techniques, which are laborious and costly and are difficult to use in routine laboratory testing. In view of the above, we conducted the feasibility assessment, development, evaluation and intralaboratory validation of a new parasitological technique for optical microscopy analysis of Cryptosporidium spp. oocysts, called TF-Test Coccidia, using fecal samples from calves from the city of Araçatuba, São Paulo. To confirm the aforementioned parasite and prove the diagnostic efficiency of the new technique, we used two methodologies established in the scientific literature: parasite concentration by centrifugal sedimentation with negative staining with malachite green (CSN-Malachite) and Nested-PCR. We observed good effectiveness of the TF-Test Coccidia technique, which was statistically equivalent to CSN-Malachite. Thus, we verified the effectiveness of the TF-Test Coccidia parasitological technique for the detection of Cryptosporidium spp. oocysts and observed good concentration and morphology of the parasite, with a low amount of debris in the fecal smear. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Perspective on Kraken Mare Shores

    NASA Image and Video Library

    2015-02-12

    This Cassini Synthetic Aperture Radar (SAR) image is presented as a perspective view and shows a landscape near the eastern shoreline of Kraken Mare, a hydrocarbon sea in Titan's north polar region. This image was processed using a technique for handling noise that results in clearer views that can be easier for researchers to interpret. The technique, called despeckling, also is useful for producing altimetry data and 3-D views called digital elevation maps. Scientists have used a technique called radargrammetry to determine the altitude of surface features in this view at a resolution of approximately half a mile, or 1 kilometer. The altimetry reveals that the area is smooth overall, with a maximum amplitude of 0.75 mile (1.2 kilometers) in height. The topography also shows that all observed channels flow downhill. The presence of what scientists call "knickpoints" -- locations on a river where a sharp change in slope occurs -- might indicate stratification in the bedrock, erosion mechanisms at work or a particular way the surface responds to runoff events, such as floods following large storms. One such knickpoint is visible just above the lower left corner, where an area of bright slopes is seen. The image was obtained during a flyby of Titan on April 10, 2007. A more traditional radar image of this area on Titan is seen in PIA19046. http://photojournal.jpl.nasa.gov/catalog/PIA19051

  1. Maltodextrin: a novel excipient used in sugar-based orally disintegrating tablets and phase transition process.

    PubMed

    Elnaggar, Yosra Shaaban R; El-Massik, Magda A; Abdallah, Ossama Y; Ebian, Abd Elazim R

    2010-06-01

    The recent challenge in orally disintegrating tablet (ODT) manufacturing encompasses the compromise between instantaneous disintegration, sufficient hardness, and standard processing equipment. The current investigation constitutes one attempt to meet this challenge. Maltodextrin, in the present work, was utilized as a novel excipient to prepare ODT of meclizine. Tablets were prepared by both direct compression and wet granulation techniques. The effect of maltodextrin concentration on ODT characteristics--manifested as hardness and disintegration time--was studied. The effect of conditioning (40 degrees C and 75% relative humidity) as a post-compression treatment on ODT characteristics was also assessed. Furthermore, maltodextrin's pronounced hardening effect was investigated using differential scanning calorimetry (DSC) and X-ray analysis. Results revealed that in both techniques, rapid disintegration (30-40 s) would be achieved at the cost of tablet hardness (about 1 kg). Post-compression conditioning of tablets resulted in an increase in hardness (3 kg), while keeping rapid disintegration (30-40 s) according to FDA guidance for ODT. However, the direct compression-conditioning technique exhibited the drawbacks of a long conditioning time and the appearance of the so-called patch effect. These problems were, however, absent in the wet granulation-conditioning technique. DSC and X-ray analysis suggested the involvement of glass-elastic deformation in the maltodextrin hardening effect. High-performance liquid chromatography analysis of meclizine ODT suggested no degradation of the drug under the applied conditions of temperature and humidity. Overall, the results suggest that maltodextrin is a promising saccharide for the production of ODT with an acceptable hardness-disintegration time compromise, utilizing standard processing equipment and the phenomenon of phase transition.

  2. Gene Profiling Technique to Accelerate Stem Cell Therapies for Eye Diseases

    MedlinePlus

    ... like RPE. They also use a technique called quantitative RT-PCR to measure the expression of genes ... higher in iPS cells than mature RPE. But quantitative RT-PCR only permits the simultaneous measurement of ...

  3. Using comparative genome analysis to identify problems in annotated microbial genomes.

    PubMed

    Poptsova, Maria S; Gogarten, J Peter

    2010-07-01

    Genome annotation is a tedious task that is mostly done by automated methods; however, the accuracy of these approaches has been questioned since the beginning of the sequencing era. Genome annotation is a multilevel process, and errors can emerge at different stages: during sequencing, as a result of gene-calling procedures, and in the process of assigning gene functions. Missed or wrongly annotated genes differentially impact different types of analyses. Here we discuss and demonstrate how the methods of comparative genome analysis can refine annotations by locating missing orthologues. We also discuss possible reasons for errors and show that the second-generation annotation systems, which combine multiple gene-calling programs with similarity-based methods, perform much better than the first annotation tools. Since old errors may propagate to the newly sequenced genomes, we emphasize that the problem of continuously updating popular public databases is an urgent and unresolved one. Due to the progress in genome-sequencing technologies, automated annotation techniques will remain the main approach in the future. Researchers need to be aware of the existing errors in the annotation of even well-studied genomes, such as Escherichia coli, and consider additional quality control for their results.

  4. Chaos as an intermittently forced linear system.

    PubMed

    Brunton, Steven L; Brunton, Bingni W; Proctor, Joshua L; Kaiser, Eurika; Kutz, J Nathan

    2017-05-30

    Understanding the interplay of order and disorder in chaos is a central challenge in modern quantitative science. Approximate linear representations of nonlinear dynamics have long been sought, driving considerable interest in Koopman theory. We present a universal, data-driven decomposition of chaos as an intermittently forced linear system. This work combines delay embedding and Koopman theory to decompose chaotic dynamics into a linear model in the leading delay coordinates with forcing by low-energy delay coordinates; this is called the Hankel alternative view of Koopman (HAVOK) analysis. This analysis is applied to the Lorenz system and real-world examples including Earth's magnetic field reversal and measles outbreaks. In each case, forcing statistics are non-Gaussian, with long tails corresponding to rare intermittent forcing that precedes switching and bursting phenomena. The huge amount of data generated in fields like neuroscience or finance calls for effective strategies that mine data to reveal underlying dynamics. Here Brunton et al. develop a data-driven technique to analyze chaotic systems and predict their dynamics in terms of a forced linear model.
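
    A minimal sketch of a HAVOK-style analysis of the Lorenz system (delay embedding into a Hankel matrix, SVD, then a linear fit of the leading delay coordinates with the last retained coordinate treated as forcing); the delay length and rank below are illustrative choices, not the paper's values.

      import numpy as np

      # Simulate the Lorenz system (simple RK4) to get a chaotic scalar signal.
      def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          x, y, z = s
          return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

      dt, n = 0.01, 20000
      traj = np.empty((n, 3))
      traj[0] = (1.0, 1.0, 1.0)
      for k in range(n - 1):
          s = traj[k]
          k1 = lorenz_rhs(s); k2 = lorenz_rhs(s + dt / 2 * k1)
          k3 = lorenz_rhs(s + dt / 2 * k2); k4 = lorenz_rhs(s + dt * k3)
          traj[k + 1] = s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
      x = traj[:, 0]

      # HAVOK-style analysis: Hankel matrix -> SVD -> linear regression of the
      # leading delay coordinates, with the last retained one as forcing.
      q, r = 100, 15                                           # delays, rank
      H = np.lib.stride_tricks.sliding_window_view(x, q).T     # Hankel matrix
      U, S, Vt = np.linalg.svd(H, full_matrices=False)
      V = Vt[:r].T                                             # delay coordinates
      dV = np.gradient(V, dt, axis=0)                          # time derivatives
      # Fit dv_i/dt = A v + B v_r for i = 1..r-1; split the fit afterwards.
      coeffs, *_ = np.linalg.lstsq(V, dV[:, :r - 1], rcond=None)
      A, B = coeffs[:r - 1].T, coeffs[r - 1]                   # dynamics, forcing
      print(A.shape, B.shape)                                  # (14, 14) (14,)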

  5. Effects of different analysis techniques and recording duty cycles on passive acoustic monitoring of killer whales.

    PubMed

    Riera, Amalis; Ford, John K; Ross Chapman, N

    2013-09-01

    Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplemental method to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time and long term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two different duty cycles, 1/3 and 2/3, were tested. Both the use of long term spectral averages and a lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimations of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency.

  6. Support vector machine and principal component analysis for microarray data classification

    NASA Astrophysics Data System (ADS)

    Astuti, Widi; Adiwijaya

    2018-03-01

    Cancer is a leading cause of death worldwide, although a significant proportion of cases can be cured if detected early. In recent decades, a technology called microarray has taken an important role in the diagnosis of cancer. By using data mining techniques, microarray data classification can be performed to improve the accuracy of cancer diagnosis compared to traditional techniques. Microarray data are characterized by small sample sizes but very high dimensionality. This poses a challenge for researchers to provide solutions for microarray data classification with high performance in both accuracy and running time. This research proposed the use of Principal Component Analysis (PCA) as a dimension reduction method along with a Support Vector Machine (SVM), optimized by kernel functions, as a classifier for microarray data classification. The proposed scheme was applied to seven data sets using 5-fold cross validation, and evaluation and analysis were then conducted in terms of both accuracy and running time. The results showed that the scheme can obtain 100% accuracy for the Ovarian and Lung Cancer data when Linear and Cubic kernel functions are used. In terms of running time, PCA greatly reduced the running time for every data set.
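
    A minimal sketch of the described pipeline (PCA for dimension reduction followed by a kernel SVM under 5-fold cross validation), run here on synthetic data with invented dimensions rather than the seven microarray data sets.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      # Synthetic "microarray" data: few samples, many genes, two classes.
      rng = np.random.default_rng(5)
      X = rng.normal(size=(60, 2000))
      y = np.repeat([0, 1], 30)
      X[y == 1, :20] += 1.5                  # a small block of informative genes

      model = make_pipeline(StandardScaler(),
                            PCA(n_components=20),
                            SVC(kernel="poly", degree=3))   # "Cubic" kernel
      scores = cross_val_score(model, X, y, cv=5)
      print(f"5-fold accuracy: {scores.mean():.3f}")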

  7. A tale of two species: neural integration in zebrafish and monkeys

    PubMed Central

    Joshua, Mati; Lisberger, Stephen G.

    2014-01-01

    Selection of a model organism creates a tension between competing constraints. The recent explosion of modern molecular techniques has revolutionized the analysis of neural systems in organisms that are amenable to genetic techniques. Yet, the non-human primate remains the gold-standard for the analysis of the neural basis of behavior, and as a bridge to the operation of the human brain. The challenge is to generalize across species in a way that exposes the operation of circuits as well as the relationship of circuits to behavior. Eye movements provide an opportunity to cross the bridge from mechanism to behavior through research on diverse species. Here, we review experiments and computational studies on a circuit function called “neural integration” that occurs in the brainstems of larval zebrafish, non-human primates, and species “in between”. We show that analysis of circuit structure using modern molecular and imaging approaches in zebrafish has remarkable explanatory power for the details of the responses of integrator neurons in the monkey. The combination of research from the two species has led to a much stronger hypothesis for the implementation of the neural integrator than could have been achieved using either species alone. PMID:24797331

  8. A tale of two species: Neural integration in zebrafish and monkeys.

    PubMed

    Joshua, M; Lisberger, S G

    2015-06-18

    Selection of a model organism creates tension between competing constraints. The recent explosion of modern molecular techniques has revolutionized the analysis of neural systems in organisms that are amenable to genetic techniques. Yet, the non-human primate remains the gold-standard for the analysis of the neural basis of behavior, and as a bridge to the operation of the human brain. The challenge is to generalize across species in a way that exposes the operation of circuits as well as the relationship of circuits to behavior. Eye movements provide an opportunity to cross the bridge from mechanism to behavior through research on diverse species. Here, we review experiments and computational studies on a circuit function called "neural integration" that occurs in the brainstems of larval zebrafish, primates, and species "in between". We show that analysis of circuit structure using modern molecular and imaging approaches in zebrafish has remarkable explanatory power for details of the responses of integrator neurons in the monkey. The combination of research from the two species has led to a much stronger hypothesis for the implementation of the neural integrator than could have been achieved using either species alone. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.

  9. Component-Level Electronic-Assembly Repair (CLEAR) Spacecraft Circuit Diagnostics by Analog and Complex Signature Analysis

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Wade, Raymond P.; Izadnegahdar, Alain

    2011-01-01

    The Component-Level Electronic-Assembly Repair (CLEAR) project at the NASA Glenn Research Center is aimed at developing technologies that will enable space-flight crews to perform in situ component-level repair of electronics on Moon and Mars outposts, where there is no existing infrastructure for logistics spares. These technologies must provide effective repair capabilities yet meet the payload and operational constraints of space facilities. Effective repair depends on a diagnostic capability that is versatile but easy to use by crew members that have limited training in electronics. CLEAR studied two techniques that involve extensive precharacterization of "known good" circuits to produce graphical signatures that provide an easy-to-use comparison method to quickly identify faulty components. Analog Signature Analysis (ASA) allows relatively rapid diagnostics of complex electronics by technicians with limited experience. Because of frequency limits and the growing dependence on broadband technologies, ASA must be augmented with other capabilities. To meet this challenge while preserving ease of use, CLEAR proposed an alternative called Complex Signature Analysis (CSA). Tests of ASA and CSA were used to compare capabilities and to determine if the techniques provided an overlapping or complementary capability. The results showed that the methods are complementary.

  10. Combined slope ratio analysis and linear-subtraction: An extension of the Pearce ratio method

    NASA Astrophysics Data System (ADS)

    De Waal, Sybrand A.

    1996-07-01

    A new technique, called combined slope ratio analysis, has been developed by extending the Pearce element ratio or conserved-denominator method (Pearce, 1968) to its logical conclusions. If two stoichiometric substances are mixed and certain chemical components are uniquely contained in either one of the two mixing substances, then by treating these unique components as conserved, the composition of the substance not containing the relevant component can be accurately calculated within the limits allowed by analytical and geological error. The calculated composition can then be subjected to rigorous statistical testing using the linear-subtraction method recently advanced by Woronow (1994). Application of combined slope ratio analysis to the rocks of the Uwekahuna Laccolith, Hawaii, USA, and the lavas of the 1959-summit eruption of Kilauea Volcano, Hawaii, USA, yields results that are consistent with field observations.

  11. Developing tools for digital radar image data evaluation

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.; Raggam, J.

    1986-01-01

    The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment to manipulate and analyze airborne and satellite radar images. One aim is to create radar products for the user from the original data to enhance the ease of understanding the contents. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.

  12. Resolving and quantifying overlapped chromatographic bands by transmutation

    PubMed

    Malinowski

    2000-09-15

    A new chemometric technique called "transmutation" is developed for the purpose of sharpening overlapped chromatographic bands in order to quantify the components. The "transmutation function" is created from the chromatogram of the pure component of interest, obtained from the same instrument, operating under the same experimental conditions used to record the unresolved chromatogram of the sample mixture. The method is used to quantify mixtures containing toluene, ethylbenzene, m-xylene, naphthalene, and biphenyl from unresolved chromatograms previously reported. The results are compared to those obtained using window factor analysis, rank annihilation factor analysis, and matrix regression analysis. Unlike the latter methods, the transmutation method is not restricted to two-dimensional arrays of data, such as those obtained from HPLC/DAD, but is also applicable to chromatograms obtained from single detector experiments. Limitations of the method are discussed.

  13. Principle of the electrically induced Transient Current Technique

    NASA Astrophysics Data System (ADS)

    Bronuzzi, J.; Moll, M.; Bouvet, D.; Mapelli, A.; Sallese, J. M.

    2018-05-01

    In the field of detector development for High Energy Physics, the so-called Transient Current Technique (TCT) is used to characterize the electric field profile and the charge trapping inside silicon radiation detectors, where particles or photons create electron-hole pairs in the bulk of a semiconductor device such as a PiN diode. In the standard approach, the TCT signal originates from the free carriers generated close to the surface of a silicon detector by short pulses of light or by alpha particles. This work proposes a new principle of charge injection by means of lateral PN junctions implemented in one of the detector electrodes, called the electrical TCT (el-TCT). This technique is fully compatible with CMOS technology and therefore opens new perspectives for the assessment of radiation detector performance.

  14. Calculating phase equilibrium properties of plasma pseudopotential model using hybrid Gibbs statistical ensemble Monte-Carlo technique

    NASA Astrophysics Data System (ADS)

    Butlitsky, M. A.; Zelener, B. B.; Zelener, B. V.

    2015-11-01

    Earlier, a two-component pseudopotential plasma model, which we call the “shelf Coulomb” model, was developed. A Monte-Carlo study of the canonical NVT ensemble with periodic boundary conditions was undertaken to calculate equations of state, pair distribution functions, internal energies and other thermodynamic properties of the model. In the present work, an attempt is made to apply the so-called hybrid Gibbs statistical ensemble Monte-Carlo technique to this model. Initial simulation results show qualitatively similar behavior in the critical point region for both methods. The Gibbs ensemble technique allowed us to estimate the position of the melting curve and the triple point of the model (in reduced temperature and specific volume coordinates): T* ≈ 0.0476, v* ≈ 6 × 10⁻⁴.

  15. Development of a cooperative operational rendezvous plan for Eureca and other maneuvering Shuttle payloads

    NASA Technical Reports Server (NTRS)

    Gavin, R. T.

    1987-01-01

    This paper discusses the development of a new class of US Space Shuttle rendezvous missions which involve a maneuvering target vehicle. The objective of the analysis was to develop an operational plan to take advantage of the target spacecraft's maneuvering ability by making it responsible for a portion of the maneuvers necessary to achieve rendezvous. This work resulted in the development of a region in space relative to the Shuttle, called the control box, into which the target vehicle maneuvers. Furthermore, a mission operations plan was developed to implement the control box technique.

  16. Instruction set commutivity

    NASA Technical Reports Server (NTRS)

    Windley, P.

    1992-01-01

    We present a state property called congruence and show how it can be used to demonstrate the commutativity of instructions in a modern load-store architecture. Our analysis is particularly important in pipelined microprocessors, where instructions are frequently reordered to avoid costly delays in execution caused by hazards. Our work has significant implications for safety- and security-critical applications, since reordering can easily change the meaning of an instruction sequence and current techniques are largely ad hoc. Our work is done in a mechanical theorem prover and results in a set of trustworthy rules for instruction reordering. The mechanization makes it practical to analyze the entire instruction set.

  17. Analysis of an Unusual Mirror in a 16th-Century Painting: A Museum Exercise for Physics Students

    NASA Astrophysics Data System (ADS)

    Swaminathan, Sudha; Lamelas, Frank

    2017-04-01

    Physics students at Worcester State University visit the Worcester Art Museum (WAM) at the end of a special 100-level course called Physics in Art. The students have studied geometrical optics, and they have been introduced to concepts in atomic physics. The purpose of the museum tour is to show how physics-based techniques can be used in a nontraditional lab setting. Other examples of the use of museum-based art in physics instruction include analyses of Pointillism and image resolution, and of reflections in soap bubbles in 17th- and 18th-century paintings.

  18. A semester-long project for teaching basic techniques in molecular biology such as restriction fragment length polymorphism analysis to undergraduate and graduate students.

    PubMed

    DiBartolomeis, Susan M

    2011-01-01

    Several reports on science education suggest that students at all levels learn better if they are immersed in a project that is long term, yielding results that require analysis and interpretation. I describe a 12-wk laboratory project suitable for upper-level undergraduates and first-year graduate students, in which the students molecularly locate and map a gene from Drosophila melanogaster called dusky and one of dusky's mutant alleles. The mapping strategy uses restriction fragment length polymorphism analysis; hence, students perform most of the basic techniques of molecular biology (DNA isolation, restriction enzyme digestion and mapping, plasmid vector subcloning, agarose and polyacrylamide gel electrophoresis, DNA labeling, and Southern hybridization) toward the single goal of characterizing dusky and the mutant allele dusky(73). Students work as individuals, pairs, or in groups of up to four students. Some exercises require multitasking and collaboration between groups. Finally, results from everyone in the class are required for the final analysis. Results of pre- and postquizzes and surveys indicate that student knowledge of appropriate topics and skills increased significantly, students felt more confident in the laboratory, and students found the laboratory project interesting and challenging. Former students report that the lab was useful in their careers.

  19. A Semester-Long Project for Teaching Basic Techniques in Molecular Biology Such as Restriction Fragment Length Polymorphism Analysis to Undergraduate and Graduate Students

    PubMed Central

    DiBartolomeis, Susan M.

    2011-01-01

    Several reports on science education suggest that students at all levels learn better if they are immersed in a project that is long term, yielding results that require analysis and interpretation. I describe a 12-wk laboratory project suitable for upper-level undergraduates and first-year graduate students, in which the students molecularly locate and map a gene from Drosophila melanogaster called dusky and one of dusky's mutant alleles. The mapping strategy uses restriction fragment length polymorphism analysis; hence, students perform most of the basic techniques of molecular biology (DNA isolation, restriction enzyme digestion and mapping, plasmid vector subcloning, agarose and polyacrylamide gel electrophoresis, DNA labeling, and Southern hybridization) toward the single goal of characterizing dusky and the mutant allele dusky73. Students work as individuals, pairs, or in groups of up to four students. Some exercises require multitasking and collaboration between groups. Finally, results from everyone in the class are required for the final analysis. Results of pre- and postquizzes and surveys indicate that student knowledge of appropriate topics and skills increased significantly, students felt more confident in the laboratory, and students found the laboratory project interesting and challenging. Former students report that the lab was useful in their careers. PMID:21364104

  20. Teaching Business Management to Engineers: The Impact of Interactive Lectures

    ERIC Educational Resources Information Center

    Rambocas, Meena; Sastry, Musti K. S.

    2017-01-01

    Some education specialists are challenging the use of traditional strategies in classrooms and are calling for the use of contemporary teaching and learning techniques. In response to these calls, many field experiments that compare different teaching and learning strategies have been conducted. However, to date, little is known on the outcomes of…

  1. 78 FR 69705 - 60-Day Notice of Proposed Information Collection: Mortgagee's Application for Partial Settlement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... calling the toll-free Federal Relay Service at (800) 877-8339. FOR FURTHER INFORMATION CONTACT: Steve... through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available... techniques or other forms of information technology, e.g., permitting electronic submission of responses. HUD...

  2. Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls

    NASA Astrophysics Data System (ADS)

    Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.

    2015-10-01

    Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.

  3. Incorporating principal component analysis into air quality ...

    EPA Pesticide Factsheets

    The efficacy of standard air quality model evaluation techniques is becoming compromised as the simulation periods continue to lengthen in response to ever increasing computing capacity. Accordingly, the purpose of this paper is to demonstrate a statistical approach called Principal Component Analysis (PCA) with the intent of motivating its use by the evaluation community. One of the main objectives of PCA is to identify, through data reduction, the recurring and independent modes of variation (or signals) within a very large dataset, thereby summarizing the essential information of that dataset so that meaningful and descriptive conclusions can be made. In this demonstration, PCA is applied to a simple evaluation metric – the model bias associated with EPA's Community Multi-scale Air Quality (CMAQ) model when compared to weekly observations of sulfate (SO₄²⁻) and ammonium (NH₄⁺) ambient air concentrations measured by the Clean Air Status and Trends Network (CASTNet). The advantages of using this technique are demonstrated as it identifies strong and systematic patterns of CMAQ model bias across a myriad of spatial and temporal scales that are neither constrained to geopolitical boundaries nor monthly/seasonal time periods (a limitation of many current studies). The technique also identifies locations (station–grid cell pairs) that are used as indicators for a more thorough diagnostic evaluation, thereby hastening and facilitating understanding of the problem.
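
    A minimal sketch of the general idea, assuming the weekly bias values are arranged into a station-by-week matrix whose leading principal components pair recurring spatial patterns with temporal score series; the data below are invented, not CASTNet values.

      import numpy as np

      # Invented bias matrix: rows = monitoring stations, columns = weekly
      # model-minus-observation bias (the real analysis uses weekly CASTNet
      # sulfate and ammonium concentrations against CMAQ).
      rng = np.random.default_rng(6)
      n_sites, n_weeks = 80, 260
      seasonal = np.sin(2 * np.pi * np.arange(n_weeks) / 52)   # a seasonal signal
      bias = (np.outer(rng.normal(size=n_sites), seasonal)
              + 0.3 * rng.normal(size=(n_sites, n_weeks)))

      # PCA via SVD of the anomaly matrix: each mode pairs a spatial loading
      # pattern with a temporal score series.
      anom = bias - bias.mean(axis=1, keepdims=True)
      U, S, Vt = np.linalg.svd(anom, full_matrices=False)
      explained = S ** 2 / np.sum(S ** 2)
      spatial_loadings, temporal_scores = U[:, :3], Vt[:3]
      print("variance explained by first 3 modes:", np.round(explained[:3], 3))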

  4. Multimodal Pressure-Flow Analysis: Application of Hilbert Huang Transform in Cerebral Blood Flow Regulation

    NASA Astrophysics Data System (ADS)

    Lo, Men-Tzung; Hu, Kun; Liu, Yanhui; Peng, C.-K.; Novak, Vera

    2008-12-01

    Quantification of nonlinear interactions between two nonstationary signals presents a computational challenge in different research fields, especially for assessments of physiological systems. Traditional approaches that are based on theories of stationary signals cannot resolve nonstationarity-related issues and, thus, cannot reliably assess nonlinear interactions in physiological systems. In this review we discuss a new technique called the multimodal pressure-flow (MMPF) method, which utilizes the Hilbert-Huang transformation to quantify the interaction between nonstationary cerebral blood flow velocity (BFV) and blood pressure (BP) for the assessment of dynamic cerebral autoregulation (CA). CA is an important mechanism responsible for controlling cerebral blood flow in response to fluctuations in systemic BP within a few heart-beats. The MMPF analysis decomposes BP and BFV signals into multiple empirical modes adaptively, so that the fluctuations caused by a specific physiologic process can be represented in a corresponding empirical mode. Using this technique, we showed that dynamic CA can be characterized by specific phase delays between the decomposed BP and BFV oscillations, and that the phase shifts are significantly reduced in hypertensive, diabetic and stroke subjects with impaired CA. Additionally, the new technique can reliably assess CA using both induced BP/BFV oscillations during clinical tests and spontaneous BP/BFV fluctuations during resting conditions.
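
    A simplified sketch of the phase-delay step only: given one matched pair of BP and BFV oscillatory modes, the instantaneous phase difference is taken from the Hilbert transform. MMPF obtains the modes by empirical mode decomposition; here a Butterworth band-pass on surrogate signals stands in for that step, and all parameters are illustrative.

      import numpy as np
      from scipy.signal import hilbert, butter, filtfilt

      def phase_delay_deg(bp_mode, bfv_mode):
          """Mean instantaneous BP-minus-BFV phase (degrees) for one matched
          oscillatory mode, a simplified stand-in for the MMPF phase shift."""
          phi_bp = np.unwrap(np.angle(hilbert(bp_mode)))
          phi_bfv = np.unwrap(np.angle(hilbert(bfv_mode)))
          return np.degrees(np.mean(phi_bp - phi_bfv))

      # Surrogate 0.1 Hz oscillations in which BFV leads BP by 45 degrees.
      fs = 10.0
      t = np.arange(0, 300, 1 / fs)
      rng = np.random.default_rng(7)
      bp = np.sin(2 * np.pi * 0.1 * t) + 0.2 * rng.standard_normal(t.size)
      bfv = np.sin(2 * np.pi * 0.1 * t + np.pi / 4) + 0.2 * rng.standard_normal(t.size)
      b, a = butter(2, [0.05, 0.15], btype="band", fs=fs)
      print(phase_delay_deg(filtfilt(b, a, bp), filtfilt(b, a, bfv)))  # about -45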

  5. Directional frequency and recording (DIFAR) sensors in seafloor recorders to locate calling bowhead whales during their fall migration.

    PubMed

    Greene, Charles R; McLennan, Miles Wm; Norman, Robert G; McDonald, Trent L; Jakubczak, Ray S; Richardson, W John

    2004-08-01

    Bowhead whales, Balaena mysticetus, migrate west during fall approximately 10-75 km off the north coast of Alaska, passing the petroleum developments around Prudhoe Bay. Oil production operations on an artificial island 5 km offshore create sounds heard by some whales. As part of an effort to assess whether migrating whales deflect farther offshore at times with high industrial noise, an acoustical approach was selected for localizing calling whales. The technique incorporated DIFAR (directional frequency and recording) sonobuoy techniques. An array of 11 DASARs (directional autonomous seafloor acoustic recorders) was built and installed with unit-to-unit separation of 5 km. When two or more DASARs detected the same call, the whale location was determined from the bearing intersections. This article describes the acoustic methods used to determine the locations of the calling bowhead whales and shows the types and precision of the data acquired. Calibration transmissions at GPS-measured times and locations provided measures of the individual DASAR clock drift and directional orientation. The standard error of the bearing measurements at distances of 3-4 km was approximately 1.35 degrees after corrections for gain imbalance in the two directional sensors. During 23 days in 2002, 10,587 bowhead calls were detected and 8383 were localized.
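
    A minimal sketch of the localization step, intersecting bearing lines from two sensors in a local flat east/north frame with bearings measured clockwise from north; positions and bearings below are invented, and the real analysis combines many calibrated sensors with error estimates.

      import numpy as np

      def locate_from_bearings(p1, b1, p2, b2):
          """Intersect two bearing lines (degrees clockwise from north) from
          sensors at p1 and p2 (local east/north coordinates, km)."""
          d1 = np.array([np.sin(np.radians(b1)), np.cos(np.radians(b1))])
          d2 = np.array([np.sin(np.radians(b2)), np.cos(np.radians(b2))])
          # Solve p1 + t1*d1 = p2 + t2*d2 for the range t1 along bearing 1.
          A = np.column_stack([d1, -d2])
          t1, _ = np.linalg.solve(A, np.asarray(p2) - np.asarray(p1))
          return np.asarray(p1) + t1 * d1

      # Two recorders 5 km apart; a calling whale at (3, 8) km gives these bearings.
      whale = np.array([3.0, 8.0])
      p1, p2 = np.array([0.0, 0.0]), np.array([5.0, 0.0])
      b1 = np.degrees(np.arctan2(whale[0] - p1[0], whale[1] - p1[1]))
      b2 = np.degrees(np.arctan2(whale[0] - p2[0], whale[1] - p2[1]))
      print(locate_from_bearings(p1, b1, p2, b2))   # approximately [3. 8.]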

  6. Variable horizon in a peridynamic medium

    DOE PAGES

    Silling, Stewart A.; Littlewood, David J.; Seleson, Pablo

    2015-12-10

    Here, a notion of material homogeneity is proposed for peridynamic bodies with variable horizon but constant bulk properties. A relation is derived that scales the force state according to the position-dependent horizon while keeping the bulk properties unchanged. Using this scaling relation, if the horizon depends on position, artifacts called ghost forces may arise in a body under a homogeneous deformation. These artifacts depend on the second derivative of the horizon and can be reduced by employing a modified equilibrium equation using a new quantity called the partial stress. Bodies with piecewise constant horizon can be modeled without ghost forces by using a simpler technique called a splice. As a limiting case of zero horizon, both the partial stress and splice techniques can be used to achieve local-nonlocal coupling. Computational examples, including dynamic fracture in a one-dimensional model with local-nonlocal coupling, illustrate the methods.

  7. Node fingerprinting: an efficient heuristic for aligning biological networks.

    PubMed

    Radu, Alex; Charleston, Michael

    2014-10-01

    With the continuing increase in availability of biological data and improvements to biological models, biological network analysis has become a promising area of research. An emerging technique for the analysis of biological networks is through network alignment. Network alignment has been used to calculate genetic distance, similarities between regulatory structures, and the effect of external forces on gene expression, and to depict conditional activity of expression modules in cancer. Network alignment is algorithmically complex, and therefore we must rely on heuristics, ideally as efficient and accurate as possible. The majority of current techniques for network alignment rely on precomputed information, such as with protein sequence alignment, or on tunable network alignment parameters, which may introduce an increased computational overhead. Our presented algorithm, which we call Node Fingerprinting (NF), is appropriate for performing global pairwise network alignment without precomputation or tuning, can be fully parallelized, and is able to quickly compute an accurate alignment between two biological networks. It has performed as well as or better than existing algorithms on biological and simulated data, and with fewer computational resources. The algorithmic validation performed demonstrates the low computational resource requirements of NF.

  8. Post-processing of auditory steady-state responses to correct spectral leakage.

    PubMed

    Felix, Leonardo Bonato; de Sá, Antonio Mauricio Ferreira Leite Miranda; Mendes, Eduardo Mazoni Andrade Marçal; Moraes, Márcio Flávio Dutra

    2009-06-30

    Auditory steady-state responses (ASSRs) are electrical manifestations of the brain due to high-rate sound stimulation. These evoked responses can be used to assess the hearing capabilities of a subject in an objective, automatic fashion. Usually, the detection protocol is accomplished by frequency-domain techniques, such as magnitude-squared coherence, whose estimation is based on the fast Fourier transform (FFT) of several data segments. In practice, the FFT-based spectrum may spread the energy of a given frequency into its side bins; this escape of energy in the spectrum is called spectral leakage. The distortion of the spectrum due to leakage may severely compromise the statistical significance of objective detection. This work presents an offline, a posteriori method for spectral leakage minimization in the frequency-domain analysis of ASSRs using a coherent sampling criterion and interpolation in time. The technique was applied to the local field potentials of 10 Wistar rats and the results, together with those from simulated data, indicate that a leakage-free analysis of ASSRs is possible for any dataset if the methods shown in this paper are followed.
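
    As a rough illustration of the leakage problem and of the time-interpolation remedy described above, the following sketch uses made-up sampling parameters (1 kHz rate, 40.4 Hz tone) rather than the paper's recording protocol.

    ```python
    # Sketch: spectral leakage from a non-coherently sampled tone, and its
    # reduction by resampling (interpolating) the record so that it holds an
    # integer number of stimulus cycles. Parameters are illustrative only.
    import numpy as np

    fs, n = 1000.0, 1000          # sampling rate (Hz) and record length
    f_stim = 40.4                 # stimulus frequency not aligned with the FFT bins
    t = np.arange(n) / fs
    x = np.sin(2 * np.pi * f_stim * t)

    # Non-coherent record: energy of the 40.4 Hz tone leaks into neighbouring bins.
    leaky = np.abs(np.fft.rfft(x)) / n

    # Coherent resampling: stretch the time axis so the record spans an integer
    # number of cycles, then interpolate the signal onto the new grid.
    cycles = round(f_stim * n / fs)               # nearest whole number of cycles
    t_new = np.linspace(0, cycles / f_stim, n, endpoint=False)
    x_coherent = np.interp(t_new, t, x)
    clean = np.abs(np.fft.rfft(x_coherent)) / n

    peak = int(np.argmax(clean))
    print("bins around the peak (non-coherent):", leaky[peak - 1:peak + 2])
    print("bins around the peak (resampled):   ", clean[peak - 1:peak + 2])
    ```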

  9. Performance Analysis of Physical Layer Security of Opportunistic Scheduling in Multiuser Multirelay Cooperative Networks

    PubMed Central

    Shim, Kyusung; Do, Nhu Tri; An, Beongku

    2017-01-01

    In this paper, we study the physical layer security (PLS) of opportunistic scheduling for uplink scenarios of multiuser multirelay cooperative networks. To this end, we propose a low-complexity, yet comparable secrecy performance source relay selection scheme, called the proposed source relay selection (PSRS) scheme. Specifically, the PSRS scheme first selects the least vulnerable source and then selects the relay that maximizes the system secrecy capacity for the given selected source. Additionally, the maximal ratio combining (MRC) technique and the selection combining (SC) technique are considered at the eavesdropper, respectively. Investigating the system performance in terms of secrecy outage probability (SOP), closed-form expressions of the SOP are derived. The developed analysis is corroborated through Monte Carlo simulation. Numerical results show that the PSRS scheme significantly improves the secure ability of the system compared to that of the random source relay selection scheme, but does not outperform the optimal joint source relay selection (OJSRS) scheme. However, the PSRS scheme drastically reduces the required amount of channel state information (CSI) estimations compared to that required by the OJSRS scheme, specially in dense cooperative networks. PMID:28212286

  10. Modeling and visualizing cell type switching.

    PubMed

    Ghaffarizadeh, Ahmadreza; Podgorski, Gregory J; Flann, Nicholas S

    2014-01-01

    Understanding cellular differentiation is critical in explaining development and for taming diseases such as cancer. Differentiation is conventionally represented using bifurcating lineage trees. However, these lineage trees cannot readily capture or quantify all the types of transitions now known to occur between cell types, including transdifferentiation or differentiation off standard paths. This work introduces a new analysis and visualization technique that is capable of representing all possible transitions between cell states compactly, quantitatively, and intuitively. This method considers the regulatory network of transcription factors that control cell type determination and then performs an analysis of network dynamics to identify stable expression profiles and the potential cell types that they represent. A visualization tool called CellDiff3D creates an intuitive three-dimensional graph that shows the overall direction and probability of transitions between all pairs of cell types within a lineage. In this study, the influence of gene expression noise and mutational changes during myeloid cell differentiation are presented as a demonstration of the CellDiff3D technique, a new approach to quantify and envision all possible cell state transitions in any lineage network.

  11. Longitudinal changes in the visual field and optic disc in glaucoma.

    PubMed

    Artes, Paul H; Chauhan, Balwantray C

    2005-05-01

    The nature and mode of functional and structural progression in open-angle glaucoma is a subject of considerable debate in the literature. While there is a traditionally held viewpoint that optic disc and/or nerve fibre layer changes precede visual field changes, there is surprisingly little published evidence from well-controlled prospective studies in this area, specifically with modern perimetric and imaging techniques. In this paper, we report on clinical data from both glaucoma patients and normal controls collected prospectively over several years, to address the relationship between visual field and optic disc changes in glaucoma using standard automated perimetry (SAP), high-pass resolution perimetry (HRP) and confocal scanning laser tomography (CSLT). We use several methods of analysis of longitudinal data and describe a new technique called "evidence of change" analysis which facilitates comparison between different tests. We demonstrate that current clinical indicators of visual function (SAP and HRP) and measures of optic disc structure (CSLT) provide largely independent measures of progression. We discuss the reasons for these findings as well as several methodological issues that pose challenges to elucidating the true structure-function relationship in glaucoma.

  12. Uncovering Pompeii: Examining Evidence.

    ERIC Educational Resources Information Center

    Yell, Michael M.

    2001-01-01

    Presents a lesson plan on Pompeii (Italy) for middle school students that utilizes a teaching technique called interactive presentation. Describes the technique's five phases: (1) discrepant event inquiry; (2) discussion/presentation; (3) cooperative learning activity; (4) writing for understanding activity; and (5) whole-class discussion and…

  13. Alarm Fatigue vs User Expectations Regarding Context-Aware Alarm Handling in Hospital Environments Using CallMeSmart.

    PubMed

    Solvoll, Terje; Arntsen, Harald; Hartvigsen, Gunnar

    2017-01-01

    Surveys and research show that mobile communication systems in hospital settings are old and cause frequent interruptions. In the quest to remedy this, an Android-based communication system called CallMeSmart tries to encapsulate most of the frequent communication into one handheld device, focusing on reducing interruptions and at the same time making the workday easier for healthcare workers. The objective of CallMeSmart is to use context-awareness techniques to automatically monitor the availability of physicians and nurses, and to use this information to prevent or route phone calls, text messages, pages and alarms that would otherwise compromise patient care. In this paper, we present the results from interviewing nurses on alarm fatigue and their expectations regarding context-aware alarm handling using CallMeSmart.

  14. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method called the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz) was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine in dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, each with its own I-kaz 3D coefficient. This coefficient was examined and its relationship with flank wear land (VB) was determined. A regression model was developed from this relationship, and its results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.

  15. A comparative critical study between FMEA and FTA risk analysis methods

    NASA Astrophysics Data System (ADS)

    Cristea, G.; Constantinescu, DM

    2017-10-01

    Today an overwhelming number of different risk analysis techniques are in use, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points) and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industry are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of the failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. FMEA is used to capture potential failures/risks and their impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, which is why this is often not done; as a consequence, possible failure modes may not be identified. To address these shortcomings, it is proposed to use a combination of FTA and FMEA.
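
    The RPN mentioned above is conventionally computed as Severity × Occurrence × Detection, each scored 1-10, which yields the 1-1000 range; a minimal sketch with invented failure modes and scores follows.

    ```python
    # Sketch of the conventional Risk Priority Number calculation used in FMEA:
    # each failure mode is scored 1-10 for severity, occurrence and detection,
    # and RPN = S * O * D, giving the 1-1000 range mentioned above.
    failure_modes = [
        # (description, severity, occurrence, detection) -- illustrative scores
        ("connector corrosion", 7, 4, 6),
        ("software watchdog timeout", 9, 2, 3),
        ("seal wear", 5, 6, 7),
    ]

    ranked = sorted(
        ((desc, s * o * d) for desc, s, o, d in failure_modes),
        key=lambda item: item[1],
        reverse=True,
    )
    for desc, rpn in ranked:
        print(f"RPN {rpn:4d}  {desc}")
    ```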

  16. Partial least squares correspondence analysis: A framework to simultaneously analyze behavioral and genetic data.

    PubMed

    Beaton, Derek; Dunlop, Joseph; Abdi, Hervé

    2016-12-01

    For nearly a century, detecting the genetic contributions to cognitive and behavioral phenomena has been a core interest for psychological research. Recently, this interest has been reinvigorated by the availability of genotyping technologies (e.g., microarrays) that provide new genetic data, such as single nucleotide polymorphisms (SNPs). These SNPs-which represent pairs of nucleotide letters (e.g., AA, AG, or GG) found at specific positions on human chromosomes-are best considered as categorical variables, but this coding scheme can make difficult the multivariate analysis of their relationships with behavioral measurements, because most multivariate techniques developed for the analysis between sets of variables are designed for quantitative variables. To palliate this problem, we present a generalization of partial least squares-a technique used to extract the information common to 2 different data tables measured on the same observations-called partial least squares correspondence analysis-that is specifically tailored for the analysis of categorical and mixed ("heterogeneous") data types. Here, we formally define and illustrate-in a tutorial format-how partial least squares correspondence analysis extends to various types of data and design problems that are particularly relevant for psychological research that include genetic data. We illustrate partial least squares correspondence analysis with genetic, behavioral, and neuroimaging data from the Alzheimer's Disease Neuroimaging Initiative. R code is available on the Comprehensive R Archive Network and via the authors' websites. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. CometBoards Users Manual Release 1.0

    NASA Technical Reports Server (NTRS)

    Guptill, James D.; Coroneos, Rula M.; Patnaik, Surya N.; Hopkins, Dale A.; Berke, Lazlo

    1996-01-01

    Several nonlinear mathematical programming algorithms for structural design applications are available at present. These include the sequence of unconstrained minimizations technique, the method of feasible directions, and the sequential quadratic programming technique. The optimality criteria technique and the fully utilized design concept are two other structural design methods. A project was undertaken to bring all these design methods under a common computer environment so that a designer can select any one of these tools that may be suitable for his/her application. To facilitate selection of a design algorithm, to validate and check out the computer code, and to ascertain the relative merits of the design tools, modest finite element structural analysis programs based on the concept of stiffness and integrated force methods have been coupled to each design method. The code that contains both these design and analysis tools, by reading input information from analysis and design data files, can cast the design of a structure as a minimum-weight optimization problem. The code can then solve it with a user-specified optimization technique and a user-specified analysis method. This design code is called CometBoards, which is an acronym for Comparative Evaluation Test Bed of Optimization and Analysis Routines for the Design of Structures. This manual describes for the user a step-by-step procedure for setting up the input data files and executing CometBoards to solve a structural design problem. The manual includes the organization of CometBoards; instructions for preparing input data files; the procedure for submitting a problem; illustrative examples; and several demonstration problems. A set of 29 structural design problems have been solved by using all the optimization methods available in CometBoards. A summary of the optimum results obtained for these problems is appended to this users manual. CometBoards, at present, is available for Posix-based Cray and Convex computers, Iris and Sun workstations, and the VM/CMS system.

  18. Performance of US teaching hospitals: a panel analysis of cost inefficiency.

    PubMed

    Rosko, Michael D

    2004-02-01

    This research summarizes an analysis of the impact of environmental pressures on hospital inefficiency during the period 1990-1999. The panel design included 616 hospitals. Of these, 211 were academic medical centers and 415 were hospitals with smaller teaching programs. The primary sources of data were the American Hospital Association's Annual Survey of Hospitals and Medicare Cost Reports. Hospital inefficiency was estimated by a regression technique called stochastic frontier analysis. This technique estimates a "best practice cost frontier" for each hospital that is based on the hospital's outputs and input prices. The cost efficiency of each hospital was defined as the ratio of the stochastic frontier total costs to observed total costs. Average inefficiency declined from 14.35% in 1990 to 11.42% in 1998. It increased to 11.78% in 1999. Decreases in inefficiency were associated with the HMO penetration rate and time. Increases in inefficiency were associated with for-profit ownership status and Medicare share of admissions. The implementation of the provisions of the Balanced Budget Act of 1997 was followed by a small decrease in average hospital inefficiency. Analysis found that the SFA results were moderately sensitive to the specification of the teaching output variable. Thus, although the SFA technique can be useful for detecting differences in inefficiency between groups of hospitals (i.e., those with high versus those with low Medicare shares or for-profit versus not-for-profit hospitals), its relatively low precision indicates it should not be used for exact estimates of the magnitude of differences associated with inefficiency-effects variables.
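
    A minimal sketch of the efficiency ratio described above, with invented cost figures; the exact inefficiency definition used in the study may differ.

    ```python
    # Sketch of the efficiency measure described above: cost efficiency is the
    # ratio of the estimated best-practice (frontier) cost to observed cost, and
    # inefficiency is quoted here as the percentage shortfall from full efficiency.
    # The cost figures are invented for illustration, not taken from the study.
    hospitals = {
        "A": {"frontier_cost": 90.0e6, "observed_cost": 100.0e6},
        "B": {"frontier_cost": 120.0e6, "observed_cost": 128.0e6},
    }

    for name, c in hospitals.items():
        efficiency = c["frontier_cost"] / c["observed_cost"]
        inefficiency_pct = (1.0 - efficiency) * 100.0
        print(f"hospital {name}: efficiency {efficiency:.3f}, inefficiency {inefficiency_pct:.1f}%")
    ```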

  19. 78 FR 70957 - 60-Day Notice of Proposed Information Collection: HUD-Owned Real Estate Good Neighbor Next Door...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ... calling the toll-free Federal Relay Service at (800) 877-8339. FOR FURTHER INFORMATION CONTACT: Ivery W... number through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available... automated collection techniques or other forms of information technology, e.g., permitting electronic...

  20. 78 FR 67384 - 60-Day Notice of Proposed Information Collection: FHA-Insured Mortgage Loan Servicing Involving...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... hearing or speech impairments may access this number through TTY by calling the toll-free Federal Relay... calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available documents submitted to... techniques or other forms of information technology, e.g., permitting electronic submission of responses. HUD...

  1. 78 FR 75364 - 60-Day Notice of Proposed Information Collection: Application for FHA Insured Mortgages

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-11

    ... through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. FOR FURTHER INFORMATION... through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available... techniques or other forms of information technology, e.g., permitting electronic submission of responses. HUD...

  2. Encourage Students to Read through the Use of Data Visualization

    ERIC Educational Resources Information Center

    Bandeen, Heather M.; Sawin, Jason E.

    2012-01-01

    Instructors are always looking for new ways to engage students in reading assignments. The authors present a few techniques that rely on a web-based data visualization tool called Wordle (wordle.net). Wordle creates word frequency representations called word clouds. The larger a word appears within a cloud, the more frequently it occurs within a…
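
    A minimal sketch of the word-frequency counting that underlies a Wordle-style cloud, with an invented snippet of text and stop-word list.

    ```python
    # Minimal sketch of the word-frequency counting behind a Wordle-style cloud:
    # the count for each word determines its display size. Stop-word list and
    # text are illustrative only.
    from collections import Counter
    import re

    text = """Reading assignments engage students when students see the
    words they and their classmates use most often in the reading."""
    stop_words = {"the", "and", "they", "in", "when", "see", "use", "most", "often"}

    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in stop_words]
    for word, count in Counter(words).most_common(5):
        print(f"{word:12s} {count}")
    ```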

  3. Spectrum transformation for divergent iterations

    NASA Technical Reports Server (NTRS)

    Gupta, Murli M.

    1991-01-01

    Certain spectrum transformation techniques are described that can be used to transform a diverging iteration into a converging one. Two techniques, called spectrum scaling and spectrum enveloping, are considered, and how to obtain the optimum values of the transformation parameters is discussed. Numerical examples are given to show how these techniques can be used to transform diverging iterations into converging ones; they can also be used to accelerate the convergence of otherwise convergent iterations.
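
    The scaling idea can be illustrated with a simplified relaxation-style sketch; the map, the parameter value, and the form of the scaling are assumptions made for illustration and not necessarily the paper's exact transformation.

    ```python
    # Simplified sketch of transforming a diverging fixed-point iteration into a
    # converging one by scaling: a relaxation parameter damps the iteration
    # operator's spectrum. The specific map and parameter are illustrative only.
    def g(x):
        return 3.0 - 1.5 * x       # fixed point at x = 1.2, but |g'(x)| = 1.5 > 1: diverges

    def iterate(update, x0, steps=20):
        x = x0
        for _ in range(steps):
            x = update(x)
        return x

    omega = 0.5                    # scaling/relaxation parameter
    scaled = lambda x: (1.0 - omega) * x + omega * g(x)   # effective slope shrunk to -0.25

    print("plain iteration:  ", iterate(g, 0.0))       # grows without bound
    print("scaled iteration: ", iterate(scaled, 0.0))  # settles near 1.2
    ```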

  4. Optimization of Online Searching by Pre-Recording the Search Statements: A Technique for the HP-2645A Terminal.

    ERIC Educational Resources Information Center

    Oberhauser, O. C.; Stebegg, K.

    1982-01-01

    Describes the terminal's capabilities, ways to store and call up lines of statements, cassette tapes needed during searches, and master tape's use for login storage. Advantages of the technique and two sources are listed. (RBF)

  5. Development of a mix design process for cold-in-place rehabilitation using foamed asphalt.

    DOT National Transportation Integrated Search

    2003-12-01

    This study evaluates one of the recycling techniques used to rehabilitate pavement, called Cold In-Place Recycling (CIR). CIR is one of the fastest growing road rehabilitation techniques because it is quick and cost-effective. The document reports on...

  6. Classification by diagnosing all absorption features (CDAF) for the most abundant minerals in airborne hyperspectral images

    NASA Astrophysics Data System (ADS)

    Mobasheri, Mohammad Reza; Ghamary-Asl, Mohsen

    2011-12-01

    Imaging through hyperspectral technology is a powerful tool that can be used to spectrally identify and spatially map materials based on their specific absorption characteristics in the electromagnetic spectrum. A robust method called Tetracorder has shown its effectiveness at material identification and mapping, using a set of algorithms within an expert system decision-making framework. In this study, using some stages of Tetracorder, a technique called classification by diagnosing all absorption features (CDAF) is introduced. This technique enables one to assign a class to the most abundant mineral in each pixel with high accuracy. The technique is based on the derivation of information from reflectance spectra of the image. This can be done through extraction of the spectral absorption features of any mineral from its respective laboratory-measured reflectance spectrum, and comparing them with those extracted from the pixels in the image. The CDAF technique has been executed on the AVIRIS image, where the results show an overall accuracy of better than 96%.

  7. Dispatch-assisted CPR: where are the hold-ups during calls to emergency dispatchers? A preliminary analysis of caller-dispatcher interactions during out-of-hospital cardiac arrest using a novel call transcription technique.

    PubMed

    Clegg, Gareth R; Lyon, Richard M; James, Scott; Branigan, Holly P; Bard, Ellen G; Egan, Gerry J

    2014-01-01

    Survival from out-of-hospital cardiac arrest (OHCA) is dependent on the chain of survival. Early recognition of cardiac arrest and provision of bystander cardiopulmonary resuscitation (CPR) are key determinants of OHCA survival. Emergency medical dispatchers play a key role in cardiac arrest recognition and giving telephone CPR advice. The interaction between caller and dispatcher can influence the time to bystander CPR and quality of resuscitation. We sought to pilot the use of emergency call transcription to audit and evaluate the holdups in performing dispatch-assisted CPR. A retrospective case selection of 50 consecutive suspected OHCA was performed. Audio recordings of calls were downloaded from the emergency medical dispatch centre computer database. All calls were transcribed using proprietary software and voice dialogue was compared with the corresponding stage on the Medical Priority Dispatch System (MPDS). Time to progress through each stage and number of caller-dispatcher interactions were calculated. Of the 50 downloaded calls, 47 were confirmed cases of OHCA. Call transcription was successfully completed for all OHCA calls. Bystander CPR was performed in 39 (83%) of these. In the remaining cases, the caller decided the patient was beyond help (n = 7) or the caller said that they were physically unable to perform CPR (n = 1). MPDS stages varied substantially in time to completion. Stage 9 (determining if the patient is breathing through airway instructions) took the longest time to complete (median = 59 s, IQR 22-82 s). Stage 11 (giving CPR instructions) also took a relatively longer time to complete compared to the other stages (median = 46 s, IQR 37-75 s). Stage 5 (establishing the patient's age) took the shortest time to complete (median = 5.5s, IQR 3-9s). Transcription of OHCA emergency calls and caller-dispatcher interaction compared to MPDS stage is feasible. Confirming whether a patient is breathing and completing CPR instructions required the longest time and most interactions between caller and dispatcher. Use of call transcription has the potential to identify key factors in caller-dispatcher interaction that could improve time to CPR and further research is warranted in this area. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
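
    The per-stage timing summary lends itself to a short sketch; the stage labels and durations below are invented examples, not the study's data.

    ```python
    # Sketch of the timing analysis described above: for each protocol stage,
    # compute the median and interquartile range of the time spent in it.
    # Stage names and durations are invented examples, not the study's data.
    import numpy as np

    stage_durations_s = {
        "establish age (stage 5)": [4, 5, 6, 7, 9, 3, 5, 6],
        "check breathing (stage 9)": [20, 45, 59, 70, 82, 35, 61, 90],
        "CPR instructions (stage 11)": [37, 40, 46, 55, 75, 50, 44, 66],
    }

    for stage, samples in stage_durations_s.items():
        q1, median, q3 = np.percentile(samples, [25, 50, 75])
        print(f"{stage:28s} median = {median:5.1f} s, IQR = {q1:.1f}-{q3:.1f} s")
    ```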

  8. Use of a Smartphone as a Colorimetric Analyzer in Paper-based Devices for Sensitive and Selective Determination of Mercury in Water Samples.

    PubMed

    Jarujamrus, Purim; Meelapsom, Rattapol; Pencharee, Somkid; Obma, Apinya; Amatatongchai, Maliwan; Ditcharoen, Nadh; Chairam, Sanoe; Tamuang, Suparb

    2018-01-01

    A smartphone application, called CAnal, was developed as a colorimetric analyzer in paper-based devices for sensitive and selective determination of mercury(II) in water samples. Measurement on the double layer of a microfluidic paper-based analytical device (μPAD), fabricated by an alkyl ketene dimer (AKD)-inkjet printing technique with a special design and doped with unmodified silver nanoparticles (AgNPs) on the detection zones, was performed by monitoring the gray intensity in the blue channel of the AgNPs, which disintegrate when exposed to mercury(II) on the μPAD. Under the optimized conditions, the developed approach showed high sensitivity, a low limit of detection (0.003 mg L⁻¹, 3SD(blank)/slope of the calibration curve), small sample volume uptake (2 × 2 μL), and short analysis time. The linearity of this technique ranged from 0.01 to 10 mg L⁻¹ (r² = 0.993). Furthermore, practical analysis of various water samples was also demonstrated to have acceptable performance that was in agreement with the data from cold vapor atomic absorption spectrophotometry (CV-AAS), a conventional method. The proposed technique allows for a rapid, simple (instant report of the final mercury(II) concentration in water samples via the smartphone display), sensitive, selective, and on-site analysis with high sample throughput (48 samples h⁻¹, n = 3) of trace mercury(II) in water samples, which is suitable for end users who are unskilled in analyzing mercury(II) in water samples.
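
    A minimal sketch of the calibration arithmetic implied above (linear fit plus LOD = 3 × SD of the blank divided by the slope); the concentrations, intensities, and blank replicates are made-up values.

    ```python
    # Sketch of the calibration step implied above: fit intensity change against
    # mercury(II) standards, then estimate the detection limit as
    # 3 * SD(blank) / slope. Concentrations and intensities are made-up values.
    import numpy as np

    conc_mg_L = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0, 10.0])
    signal = np.array([2.1, 9.8, 20.5, 101.0, 198.0, 1005.0, 1990.0])  # e.g. gray-intensity change
    blank_replicates = np.array([0.4, -0.2, 0.3, 0.1, -0.3, 0.2])

    slope, intercept = np.polyfit(conc_mg_L, signal, 1)
    r2 = np.corrcoef(conc_mg_L, signal)[0, 1] ** 2
    lod = 3 * np.std(blank_replicates, ddof=1) / slope

    print(f"slope = {slope:.1f}, r^2 = {r2:.4f}, LOD ~ {lod:.4f} mg/L")
    ```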

  9. Current trends in gamma radiation detection for radiological emergency response

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sanjoy; Guss, Paul; Maurer, Richard

    2011-09-01

    Passive and active detection of gamma rays from shielded radioactive materials, including special nuclear materials, is an important task for any radiological emergency response organization. This article reports on the current trends and status of gamma radiation detection objectives and measurement techniques as applied to nonproliferation and radiological emergencies. In recent years, since the establishment of the Domestic Nuclear Detection Office by the Department of Homeland Security, a tremendous amount of progress has been made in detection materials (scintillators, semiconductors), imaging techniques (Compton imaging, use of active masking and hybrid imaging), data acquisition systems with digital signal processing, field programmable gate arrays and embedded isotopic analysis software (viz. gamma detector response and analysis software [GADRAS]1), fast template matching, and data fusion (merging radiological data with geo-referenced maps, digital imagery to provide better situational awareness). In this stride to progress, a significant amount of inter-disciplinary research and development has taken place-techniques and spin-offs from medical science (such as x-ray radiography and tomography), materials engineering (systematic planned studies on scintillators to optimize several qualities of a good scintillator, nanoparticle applications, quantum dots, and photonic crystals, just to name a few). No trend analysis of radiation detection systems would be complete without mentioning the unprecedented strategic position taken by the National Nuclear Security Administration (NNSA) to deter, detect, and interdict illicit trafficking in nuclear and other radioactive materials across international borders and through the global maritime transportation-the so-called second line of defense.

  10. Resonance Energy Transfer-Based Molecular Switch Designed Using a Systematic Design Process Based on Monte Carlo Methods and Markov Chains

    NASA Astrophysics Data System (ADS)

    Rallapalli, Arjun

    A RET network consists of a network of photo-active molecules called chromophores that can participate in inter-molecular energy transfer called resonance energy transfer (RET). RET networks are used in a variety of applications including cryptographic devices, storage systems, light harvesting complexes, biological sensors, and molecular rulers. In this dissertation, we focus on creating a RET device called closed-diffusive exciton valve (C-DEV) in which the input to output transfer function is controlled by an external energy source, similar to a semiconductor transistor like the MOSFET. Due to their biocompatibility, molecular devices like the C-DEVs can be used to introduce computing power in biological, organic, and aqueous environments such as living cells. Furthermore, the underlying physics in RET devices are stochastic in nature, making them suitable for stochastic computing in which true random distribution generation is critical. In order to determine a valid configuration of chromophores for the C-DEV, we developed a systematic process based on user-guided design space pruning techniques and built-in simulation tools. We show that our C-DEV is 15x better than C-DEVs designed using ad hoc methods that rely on limited data from prior experiments. We also show ways in which the C-DEV can be improved further and how different varieties of C-DEVs can be combined to form more complex logic circuits. Moreover, the systematic design process can be used to search for valid chromophore network configurations for a variety of RET applications. We also describe a feasibility study for a technique used to control the orientation of chromophores attached to DNA. Being able to control the orientation can expand the design space for RET networks because it provides another parameter to tune their collective behavior. While results showed limited control over orientation, the analysis required the development of a mathematical model that can be used to determine the distribution of dipoles in a given sample of chromophore constructs. The model can be used to evaluate the feasibility of other potential orientation control techniques.

  11. Structural Embeddings: Mechanization with Method

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Rushby, John

    1999-01-01

    The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.

  12. Secondary signal imaging (SSI) electron tomography (SSI-ET): A new three-dimensional metrology for mesoscale specimens in transmission electron microscope.

    PubMed

    Han, Chang Wan; Ortalan, Volkan

    2015-09-01

    We have demonstrated a new electron tomography technique utilizing the secondary signals (secondary electrons and backscattered electrons) for ultra-thick (a few μm) specimens. The Monte Carlo electron scattering simulations reveal that the amount of backscattered electrons generated by 200 and 300 keV incident electrons is a monotonic function of the sample thickness, and this causes a thickness contrast satisfying the projection requirement for tomographic reconstruction. An additional contribution from the secondary electrons emitted at the edges of the specimens enhances the visibility of the surface features. The acquired SSI tilt series of specimens having mesoscopic dimensions are successfully reconstructed, verifying that this new technique, the so-called secondary signal imaging electron tomography (SSI-ET), can be directly utilized for 3D structural analysis of mesoscale structures. Published by Elsevier Ltd.

  13. Extraction of decision rules via imprecise probabilities

    NASA Astrophysics Data System (ADS)

    Abellán, Joaquín; López, Griselda; Garach, Laura; Castellano, Javier G.

    2017-05-01

    Data analysis techniques can be applied to discover important relations among features. This is the main objective of the Information Root Node Variation (IRNV) technique, a new method to extract knowledge from data via decision trees. The decision trees used by the original method were built using classic split criteria. The performance of new split criteria based on imprecise probabilities and uncertainty measures, called credal split criteria, differs significantly from the performance obtained using the classic criteria. This paper extends the IRNV method using two credal split criteria: one based on a mathematical parametric model, and the other based on a non-parametric model. The performance of the method is analyzed using a case study of traffic accident data to identify patterns related to the severity of an accident. We found that a larger number of rules is generated, significantly supplementing the information obtained using the classic split criteria.

  14. Application of gamma imaging techniques for the characterisation of position sensitive gamma detectors

    NASA Astrophysics Data System (ADS)

    Habermann, T.; Didierjean, F.; Duchêne, G.; Filliger, M.; Gerl, J.; Kojouharov, I.; Li, G.; Pietralla, N.; Schaffner, H.; Sigward, M.-H.

    2017-11-01

    A device to characterize position-sensitive germanium detectors has been implemented at GSI. The main component of this so-called scanning table is a gamma camera that is capable of producing online 2D images of the scanned detector by means of a PET technique. To calibrate the gamma camera, Compton imaging is employed. The 2D data can be processed further offline to obtain depth information. Of main interest is the response of the scanned detector in terms of the digitized pulse shapes from the preamplifier. This is an important input for the pulse-shape analysis algorithms in use for gamma-tracking arrays in gamma spectroscopy. To validate the scanning table, a comparison of its results with a second scanning table implemented at the IPHC Strasbourg is envisaged. For this purpose a pixelated germanium detector has been scanned.

  15. Parallel computation in a three-dimensional elastic-plastic finite-element analysis

    NASA Technical Reports Server (NTRS)

    Shivakumar, K. N.; Bigelow, C. A.; Newman, J. C., Jr.

    1992-01-01

    A CRAY parallel processing technique called autotasking was implemented in a three-dimensional elasto-plastic finite-element code. The technique was evaluated on two CRAY supercomputers, a CRAY 2 and a CRAY Y-MP. Autotasking was implemented in all major portions of the code, except the matrix equations solver. Compiler directives alone were not able to properly multitask the code; user-inserted directives were required to achieve better performance. It was noted that the connect time, rather than wall-clock time, was more appropriate to determine speedup in multiuser environments. For a typical example problem, a speedup of 2.1 (1.8 when the solution time was included) was achieved in a dedicated environment and 1.7 (1.6 with solution time) in a multiuser environment on a four-processor CRAY 2 supercomputer. The speedup on a three-processor CRAY Y-MP was about 2.4 (2.0 with solution time) in a multiuser environment.

  16. Computer-aided boundary delineation of agricultural lands

    NASA Technical Reports Server (NTRS)

    Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt

    1989-01-01

    The National Agricultural Statistics Service of the United States Department of Agriculture (USDA) presently uses labor-intensive aerial photographic interpretation techniques to divide large geographical areas into manageable-sized units for estimating domestic crop and livestock production. Prototype software, the computer-aided stratification (CAS) system, was developed to automate the procedure, and currently runs on a Sun-based image processing system. With a background display of LANDSAT Thematic Mapper and United States Geological Survey Digital Line Graph data, the operator uses a cursor to delineate agricultural areas, called sampling units, which are assigned to strata of land-use and land-cover types. The resultant stratified sampling units are used as input into subsequent USDA sampling procedures. As a test, three counties in Missouri were chosen for application of the CAS procedures. Subsequent analysis indicates that CAS was five times faster in creating sampling units than the manual techniques were.

  17. Early diagnosis of tongue malignancy using laser induced fluorescence spectroscopy technique

    NASA Astrophysics Data System (ADS)

    Patil, Ajeetkumar; Unnikrishnan V., K.; Ongole, Ravikiran; Pai, Keerthilatha M.; Kartha, V. B.; Chidangil, Santhosh

    2015-07-01

    Oral cancer together with pharyngeal cancer is the sixth most common malignancy reported worldwide and one with high mortality ratio among all malignancies [1]. Worldwide 450,000 new cases are estimated in 2014[2]. About 90% are a type of cancer called squamous cell carcinoma (SCC). SCC of the tongue is the most common oral malignancy accounting for approximately 40% of all oral carcinomas. One of the important factors for successful therapy of any malignancy is early diagnosis. Although considerable progress has been made in understanding the cellular and molecular mechanisms of tumorigenesis, lack of reliable diagnostic methods for early detection leading to delay in therapy is an important factor responsible for the increase in the mortality rate in various types of cancers. Spectroscopy techniques are extremely sensitive for the analysis of biochemical changes in cellular systems. These techniques can provide a valuable information on alterations that occur during the development of cancer. This is especially important in oral cancer, where "tumor detection is complicated by a tendency towards field cancerization, leading to multi-centric lesions" and "current techniques detect malignant change too late" [3], and "biopsies are not representative of the whole premalignant lesion". [4

  18. ProteinAC: a frequency domain technique for analyzing protein dynamics

    NASA Astrophysics Data System (ADS)

    Bozkurt Varolgunes, Yasemin; Demir, Alper

    2018-03-01

    It is widely believed that the interactions of proteins with ligands and other proteins are determined by their dynamic characteristics as opposed to only static, time-invariant processes. We propose a novel computational technique, called ProteinAC (PAC), that can be used to analyze small scale functional protein motions as well as interactions with ligands directly in the frequency domain. PAC was inspired by a frequency domain analysis technique that is widely used in electronic circuit design, and can be applied to both coarse-grained and all-atom models. It can be considered as a generalization of previously proposed static perturbation-response methods, where the frequency of the perturbation becomes the key. We discuss the precise relationship of PAC to static perturbation-response schemes. We show that the frequency of the perturbation may be an important factor in protein dynamics. Perturbations at different frequencies may result in completely different response behavior while magnitude and direction are kept constant. Furthermore, we introduce several novel frequency dependent metrics that can be computed via PAC in order to characterize response behavior. We present results for the ferric binding protein that demonstrate the potential utility of the proposed techniques.

  19. A B-TOF mass spectrometer for the analysis of ions with extreme high start-up energies.

    PubMed

    Lezius, M

    2002-03-01

    Weak magnetic deflection is combined with two acceleration stage time-of-flight mass spectrometry and subsequent position-sensitive ion detection. The experimental method, called B-TOF mass spectrometry, is described with respect to its theoretical background and some experimental results. It is demonstrated that the technique has distinct advantages over other approaches, with special respect to the identification and analysis of very highly energetic ions with an initially large energy broadening (up to 1 MeV) and with high charge states (up to 30+). Similar energetic targets are a common case in intense laser-matter interaction processes found during laser ablation, laser-cluster and laser-molecule interaction and fast particle and x-ray generation from laser-heated plasma. Copyright 2002 John Wiley & Sons, Ltd.

  20. Structural analysis of the industrial grade calcite

    NASA Astrophysics Data System (ADS)

    Shah, Rajiv P.; Raval, Kamlesh G.

    2017-05-01

    The chemical, optical and structural characterization of industrial grade calcite was carried out by EDAX, FT-IR and XRD. EDAX is a widely used technique to analyze the chemical components in a material. FT-IR (Fourier Transform Infra-Red) is the preferred method of infrared spectroscopy; the resultant spectrum represents the molecular absorption and transmission, creating a molecular fingerprint of the sample. The atomic planes of a crystal cause an incident beam of X-rays to interfere with one another as they leave the crystal; this phenomenon is called X-ray diffraction (XRD). Data analysis of the EDAX, FT-IR and XRD measurements was carried out with the help of various instruments and software to obtain results for these industrial grade materials, which are mostly used in the ceramics industry.
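
    The XRD identification rests on Bragg's law, nλ = 2d sin θ; the short worked sketch below uses a typical Cu Kα wavelength and an approximate calcite peak position as illustrative assumptions.

    ```python
    # Worked sketch of Bragg's law, n*lambda = 2*d*sin(theta), as used to read
    # d-spacings from an XRD pattern. The Cu K-alpha wavelength and the ~29.4
    # degree calcite (104) peak position are typical literature values used here
    # only as an illustration.
    import math

    wavelength_angstrom = 1.5406      # Cu K-alpha
    two_theta_deg = 29.4              # approximate position of calcite's strongest peak

    theta = math.radians(two_theta_deg / 2.0)
    d_spacing = wavelength_angstrom / (2.0 * math.sin(theta))   # n = 1
    print(f"d ~ {d_spacing:.3f} angstrom")                      # ~3.04 angstrom
    ```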

  1. X-ray absorption spectroscopic studies of mononuclear non-heme iron enzymes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westre, Tami E.

    Fe-K-edge X-ray absorption spectroscopy (XAS) has been used to investigate the electronic and geometric structure of the iron active site in non-heme iron enzymes. A new theoretical extended X-ray absorption fine structure (EXAFS) analysis approach, called GNXAS, has been tested on data for iron model complexes to evaluate the utility and reliability of this new technique, especially with respect to the effects of multiple-scattering. In addition, a detailed analysis of the 1s→3d pre-edge feature has been developed as a tool for investigating the oxidation state, spin state, and geometry of iron sites. Edge and EXAFS analyses have then been applied to the study of non-heme iron enzyme active sites.

  2. Silver and tin plating as medieval techniques of producing counterfeit coins and their identification by means of micro-XRF

    NASA Astrophysics Data System (ADS)

    Hložek, M.; Trojek, T.

    2017-08-01

    Archaeological surveys and metal detector prospecting yield a great number of coins from the medieval period. Naturally, some of these are counterfeit, which an experienced numismatist can determine without using chemical methods. The production of counterfeit coins in the Middle Ages took place in castles, caves or other remote areas where waste from this activity can still be found today: copper sheets, technical ceramics and counterfeit coins. Until recently, it has been assumed that medieval counterfeit coins were made by silver-plating copper blanks using an amalgam. However, the performed analyses reveal that there were many more techniques for counterfeiting coins. Other techniques were based on, e.g., tin amalgam plating of the blanks or alloying a so-called white metal with silver-like appearance from which the coins were minted. Current chemical analyses indicate that the coins were often tinned by hot dipping with no amalgamation. Micro-X-ray fluorescence analysis has been chosen as a suitable non-destructive method to identify the chemical elements present in the investigated artifacts and to quantify their concentrations. In addition, a quick technique that reveals the plating was applied. This technique utilizes the detected fluorescence ratio Kα/Kβ of copper, which is the main ingredient of many historical metallic materials.
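
    A minimal sketch of how such a Kα/Kβ ratio check might be applied; the reference ratio, tolerance, and count values are illustrative assumptions, not the measured data.

    ```python
    # Sketch of the plating check described above: a surface layer attenuates the
    # copper K-alpha and K-beta lines differently, so the detected K-alpha/K-beta
    # ratio deviates from the value measured on bare bulk copper. The reference
    # ratio and tolerance below are illustrative assumptions.
    def looks_plated(k_alpha_counts, k_beta_counts, bulk_ratio=7.5, tolerance=0.10):
        """Flag a measurement whose Cu K-alpha/K-beta ratio differs from bulk copper."""
        ratio = k_alpha_counts / k_beta_counts
        return abs(ratio - bulk_ratio) / bulk_ratio > tolerance, ratio

    for label, (ka, kb) in {"bare copper blank": (15000, 2000),
                            "suspect coin": (12400, 2050)}.items():
        flagged, ratio = looks_plated(ka, kb)
        print(f"{label:18s} Ka/Kb = {ratio:.2f}  plating suspected: {flagged}")
    ```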

  3. Beam modulation: A novel ToF-technique for high resolution diffraction at the Beamline for European Materials Engineering Research (BEER)

    NASA Astrophysics Data System (ADS)

    Rouijaa, M.; Kampmann, R.; Šaroun, J.; Fenske, J.; Beran, P.; Müller, M.; Lukáš, P.; Schreyer, A.

    2018-05-01

    The Beamline for European Materials Engineering Research (BEER) is under construction at the European Spallation Source (ESS) in Lund, Sweden. A basic requirement on BEER is to make the best use of the long ESS pulse (2.86 ms) for engineering investigations. High-resolution diffraction, however, demands timing resolution up to 0.1%, corresponding to a pulse length down to about 70 μs for the case of thermal neutrons (λ ∼ 1.8 Å). Such timing resolution can be achieved by pulse shaping techniques that cut a short section out of the long pulse, and thus pay for resolution with a strong loss of intensity. In contrast to this, BEER proposes a novel operation mode, called the pulse modulation technique, based on a new chopper design, which extracts several short pulses out of the long ESS pulse and hence leads to a remarkable gain in intensity compared to existing conventional pulse-shaping techniques. The potential of the new technique can be exploited to full advantage for investigating strains and textures of highly symmetric materials. Due to its instrument design and the high brilliance of the ESS pulse, BEER is expected to become the European flagship for engineering research for strain mapping and texture analysis.

  4. Determination of the optimal number of components in independent components analysis.

    PubMed

    Kassouf, Amine; Jouan-Rimbaud Bouveresse, Delphine; Rutledge, Douglas N

    2018-03-01

    Independent components analysis (ICA) may be considered as one of the most established blind source separation techniques for the treatment of complex data sets in analytical chemistry. Like other similar methods, the determination of the optimal number of latent variables, in this case, independent components (ICs), is a crucial step before any modeling. Therefore, validation methods are required in order to decide about the optimal number of ICs to be used in the computation of the final model. In this paper, three new validation methods are formally presented. The first one, called Random_ICA, is a generalization of the ICA_by_blocks method. Its specificity resides in the random way of splitting the initial data matrix into two blocks, and then repeating this procedure several times, giving a broader perspective for the selection of the optimal number of ICs. The second method, called KMO_ICA_Residuals is based on the computation of the Kaiser-Meyer-Olkin (KMO) index of the transposed residual matrices obtained after progressive extraction of ICs. The third method, called ICA_corr_y, helps to select the optimal number of ICs by computing the correlations between calculated proportions and known physico-chemical information about samples, generally concentrations, or between a source signal known to be present in the mixture and the signals extracted by ICA. These three methods were tested using varied simulated and experimental data sets and compared, when necessary, to ICA_by_blocks. Results were relevant and in line with expected ones, proving the reliability of the three proposed methods. Copyright © 2017 Elsevier B.V. All rights reserved.
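
    A condensed sketch in the spirit of the block-based validation described above (random split into two blocks, ICA on each, cross-block comparison of components); the synthetic data and the matching criterion are simplified assumptions, not the authors' implementation.

    ```python
    # Condensed sketch of a block-based ICA validation: split the samples
    # randomly into two blocks, extract k independent components from each, and
    # check how well the component loadings from the two blocks match.
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    # Synthetic mixture: 3 latent sources observed through 40 samples x 25 variables.
    S = rng.laplace(size=(40, 3))
    A = rng.normal(size=(3, 25))
    X = S @ A + 0.05 * rng.normal(size=(40, 25))

    def block_agreement(X, k, rng):
        idx = rng.permutation(X.shape[0])
        half = X.shape[0] // 2
        loadings = []
        for block in (X[idx[:half]], X[idx[half:]]):
            ica = FastICA(n_components=k, random_state=0, max_iter=2000)
            ica.fit(block)
            loadings.append(ica.components_)          # k x n_variables
        # Pair components across blocks by absolute correlation of their loadings.
        corr = np.abs(np.corrcoef(loadings[0], loadings[1])[:k, k:])
        return corr.max(axis=1).mean()

    for k in range(1, 6):
        print(f"{k} ICs: mean cross-block correlation = {block_agreement(X, k, rng):.2f}")
    ```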

  5. CNV-ROC: A cost effective, computer-aided analytical performance evaluator of chromosomal microarrays

    PubMed Central

    Goodman, Corey W.; Major, Heather J.; Walls, William D.; Sheffield, Val C.; Casavant, Thomas L.; Darbro, Benjamin W.

    2016-01-01

    Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high throughput, low cost, analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. PMID:25595567
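
    A minimal sketch of the calibration idea (per-probe truth labels from the higher-resolution array, ROC analysis to pick a log2-ratio threshold); the simulated data and the Youden's J threshold rule are illustrative assumptions.

    ```python
    # Sketch of the calibration idea described above: treat per-probe calls from
    # the higher-resolution array as the truth labels, and use an ROC curve to
    # pick a log2-ratio threshold for the lower-resolution array.
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    rng = np.random.default_rng(1)
    truth = rng.integers(0, 2, size=500)                       # per-probe CNV yes/no from the high-res array
    log2_ratio = np.where(truth == 1,
                          rng.normal(0.6, 0.3, size=500),      # probes inside simulated CNVs
                          rng.normal(0.0, 0.3, size=500))      # probes outside

    fpr, tpr, thresholds = roc_curve(truth, log2_ratio)
    best = np.argmax(tpr - fpr)                                # Youden's J statistic
    print(f"AUC = {auc(fpr, tpr):.3f}; suggested log2-ratio threshold ~ {thresholds[best]:.2f}")
    ```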

  6. [Body, rights and comprehensive health: Analysis of the parliamentary debates on the Gender Identity and Assisted Fertilization Laws (Argentina, 2011-2013)].

    PubMed

    Farji Neer, Anahí

    2015-09-01

    In this paper we present an analysis of the parliamentary debates of the Gender Identity Law (No. 26743) and the Assisted Fertilization Law (No. 26862) carried out in the Argentine National Congress between 2011 and 2013. Using a qualitative content analysis technique, the stenographic records of the debates were analyzed to explore the following questions: How was the public problem to which each law responds characterized? How was the mission of each law conceptualized? To what extent did those definitions call into question ideas of health and illness, in including in the public health system coverage for certain medical treatments of body optimization or modification? In the process of sanctioning both laws, the concepts of health and disease were put into dispute as moral categories. In this context, an expanded concept of comprehensive health arose, in which desires regarding reproduction and the body were included.

  7. Development of an unsteady aerodynamics model to improve correlation of computed blade stresses with test data

    NASA Technical Reports Server (NTRS)

    Gangwani, S. T.

    1985-01-01

    A reliable, operational rotor aeroelastic analysis that correctly predicts the vibration levels for a helicopter is utilized to test various unsteady aerodynamics models with the objective of improving the correlation between test and theory. This analysis, called the Rotor Aeroelastic Vibration (RAVIB) computer program, is based on a frequency-domain forced response analysis which utilizes transfer matrix techniques to model helicopter/rotor dynamic systems of varying degrees of complexity. The results for the AH-1G helicopter rotor were compared with flight test data during high speed operation, and they indicated a reasonably good correlation for the beamwise and chordwise blade bending moments, but for torsional moments the correlation was poor. As a result, a new aerodynamics model based on unstalled synthesized data derived from the large amplitude oscillating airfoil experiments was developed and tested.

  8. Location and Size Planning of Distributed Photovoltaic Generation in Distribution network System Based on K-means Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Lu, Siqi; Wang, Xiaorong; Wu, Junyong

    2018-01-01

    The paper presents a method, based on a data-driven K-means clustering analysis algorithm, to generate planning scenarios for the location and size planning of distributed photovoltaic (PV) units in the network. Taking the power losses of the network, the installation and maintenance costs of distributed PV, the profit of distributed PV and the voltage offset as objectives, and the locations and sizes of distributed PV as decision variables, the Pareto optimal front is obtained through a self-adaptive genetic algorithm (GA), and solutions are ranked by a method called technique for order preference by similarity to an ideal solution (TOPSIS). Finally, the planning schemes at the top of the ranking list are selected according to different planning emphases after a detailed analysis. The proposed method is applied to a 10-kV distribution network in Gansu Province, China, and the results are discussed.
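
    A minimal TOPSIS sketch for ranking candidate schemes; the decision matrix, weights, and benefit/cost flags are invented for illustration.

    ```python
    # Minimal TOPSIS sketch for ranking candidate PV siting/sizing schemes by
    # several criteria; the decision matrix, weights, and benefit/cost flags are
    # invented for illustration.
    import numpy as np

    # rows = candidate schemes, columns = (losses, cost, profit, voltage offset)
    decision = np.array([
        [1.20, 4.5, 8.0, 0.020],
        [1.05, 5.2, 8.6, 0.018],
        [1.35, 3.9, 7.1, 0.025],
    ])
    weights = np.array([0.3, 0.25, 0.3, 0.15])
    benefit = np.array([False, False, True, False])   # only profit is "larger is better"

    norm = decision / np.linalg.norm(decision, axis=0)          # vector-normalize each criterion
    weighted = norm * weights
    ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
    anti_ideal = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

    d_plus = np.linalg.norm(weighted - ideal, axis=1)
    d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
    closeness = d_minus / (d_plus + d_minus)

    for rank, i in enumerate(np.argsort(-closeness), start=1):
        print(f"rank {rank}: scheme {i}  closeness = {closeness[i]:.3f}")
    ```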

  9. Latent morpho-semantic analysis : multilingual information retrieval with character n-grams and mutual information.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bader, Brett William; Chew, Peter A.; Abdelali, Ahmed

    We describe an entirely statistics-based, unsupervised, and language-independent approach to multilingual information retrieval, which we call Latent Morpho-Semantic Analysis (LMSA). LMSA overcomes some of the shortcomings of related previous approaches such as Latent Semantic Analysis (LSA). LMSA has an important theoretical advantage over LSA: it combines well-known techniques in a novel way to break the terms of LSA down into units which correspond more closely to morphemes. Thus, it has a particular appeal for use with morphologically complex languages such as Arabic. We show through empirical results that the theoretical advantages of LMSA can translate into significant gains in precision in multilingual information retrieval tests. These gains are not matched either when a standard stemmer is used with LSA, or when terms are indiscriminately broken down into n-grams.

  10. Functional feature embedded space mapping of fMRI data.

    PubMed

    Hu, Jin; Tian, Jie; Yang, Lei

    2006-01-01

    We have proposed a new method for fMRI data analysis which is called Functional Feature Embedded Space Mapping (FFESM). Our work mainly focuses on the experimental design with periodic stimuli which can be described by a number of Fourier coefficients in the frequency domain. A nonlinear dimension reduction technique Isomap is applied to the high dimensional features obtained from frequency domain of the fMRI data for the first time. Finally, the presence of activated time series is identified by the clustering method in which the information theoretic criterion of minimum description length (MDL) is used to estimate the number of clusters. The feasibility of our algorithm is demonstrated by real human experiments. Although we focus on analyzing periodic fMRI data, the approach can be extended to analyze non-periodic fMRI data (event-related fMRI) by replacing the Fourier analysis with a wavelet analysis.

  11. A numerical formulation and algorithm for limit and shakedown analysis of large-scale elastoplastic structures

    NASA Astrophysics Data System (ADS)

    Peng, Heng; Liu, Yinghua; Chen, Haofeng

    2018-05-01

    In this paper, a novel direct method called the stress compensation method (SCM) is proposed for limit and shakedown analysis of large-scale elastoplastic structures. Without needing to solve the specific mathematical programming problem, the SCM is a two-level iterative procedure based on a sequence of linear elastic finite element solutions where the global stiffness matrix is decomposed only once. In the inner loop, the static admissible residual stress field for shakedown analysis is constructed. In the outer loop, a series of decreasing load multipliers are updated to approach to the shakedown limit multiplier by using an efficient and robust iteration control technique, where the static shakedown theorem is adopted. Three numerical examples up to about 140,000 finite element nodes confirm the applicability and efficiency of this method for two-dimensional and three-dimensional elastoplastic structures, with detailed discussions on the convergence and the accuracy of the proposed algorithm.

  12. Recent advances of liquid chromatography-(tandem) mass spectrometry in clinical and forensic toxicology.

    PubMed

    Peters, Frank T

    2011-01-01

    Liquid chromatography (LC) coupled to mass spectrometry (MS) or tandem mass spectrometry (MS/MS) has become increasingly important in clinical and forensic toxicology as well as doping control and is now a robust and reliable technique for routine analysis in these fields. In recent years, methods for LC-MS(/MS)-based systematic toxicological analysis using triple quadrupole or ion trap instruments have been considerably improved and a new screening approach based on high-resolution MS analysis using benchtop time-of-flight MS instruments has been developed. Moreover, many applications for so-called multi-target screening and/or quantification of drugs, poisons, and/or their metabolites in various biomatrices have been published. The present paper will provide an overview and discuss these recent developments, focusing on the literature published after 2006. Copyright © 2010 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  13. Beluga whale, Delphinapterus leucas, vocalizations and their relation to behaviour in the Churchill River, Manitoba, Canada

    NASA Astrophysics Data System (ADS)

    Chmelnitsky, Elly Golda

    The investigation of a species' repertoire and the contexts in which different calls are used is central to understanding vocal communication among animals. Beluga whale, Delphinapterus leucas, calls were classified and described in association with behaviours, from recordings collected in the Churchill River, Manitoba, during the summers of 2006-2008. Calls were subjectively classified based on sound and visual analysis into whistles (64.2% of total calls; 22 call types), pulsed or noisy calls (25.9%; 15 call types), and combined calls (9.9%; seven types). A hierarchical cluster analysis, using six call measurements as variables, separated whistles into 12 groups and results were compared to subjective classification. Beluga calls associated with social interactions, travelling, feeding, and interactions with the boat were described. Call type percentages, relative proportions of different whistle contours (shapes), average frequency, and call duration varied with behaviour. Generally, higher percentages of whistles, more broadband pulsed and noisy calls, and shorter calls (<0.49s) were produced during behaviours associated with higher levels of activity and/or apparent arousal. Information on call types, call characteristics, and behavioural context of calls can be used for automated detection and classification methods and in future studies on call meaning and function.

  14. ToTem: a tool for variant calling pipeline optimization.

    PubMed

    Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka

    2018-06-26

    High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process and with the possibility of plugging almost any tool or code. To prevent an over-fitting of pipeline parameters, ToTem ensures the reproducibility of these by using cross validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables allowing an optimal pipeline to be selected, based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at  https://totem.software .
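    The following is a schematic of the kind of cross-validated parameter sweep that ToTem automates; it is not ToTem itself. The caller `run_pipeline`, the parameter names, and the toy truth sets are all hypothetical stand-ins for a real variant-calling pipeline and benchmark data.

    ```python
    # Schematic parameter sweep with cross-validated F-measure scoring.
    from itertools import product
    from statistics import mean
    import random

    random.seed(0)
    # Toy "truth" variant sets for six samples (chromosome, position).
    TRUTH = {f"s{i}": {("chr1", p) for p in random.sample(range(1000), 20)} for i in range(6)}

    def run_pipeline(sample, min_base_quality, min_allele_frequency):
        # Dummy caller: recovers most true variants plus some noise, with a
        # parameter-dependent trade-off between sensitivity and false positives.
        keep = 1.0 - min_allele_frequency * 5 - (30 - min_base_quality) * 0.005
        calls = {v for v in TRUTH[sample] if random.random() < keep}
        calls |= {("chr1", random.randrange(1000)) for _ in range(30 - min_base_quality)}
        return calls

    def f1(called, truth):
        tp = len(called & truth)
        p = tp / len(called) if called else 0.0
        r = tp / len(truth) if truth else 0.0
        return 2 * p * r / (p + r) if p + r else 0.0

    def cross_validated_f1(samples, params, k=3):
        folds = [samples[i::k] for i in range(k)]   # held-out groups guard against over-fitting
        return mean(mean(f1(run_pipeline(s, **params), TRUTH[s]) for s in fold) for fold in folds)

    grid = {"min_base_quality": [10, 20, 30], "min_allele_frequency": [0.01, 0.05]}
    samples = sorted(TRUTH)
    best = max((dict(zip(grid, vals)) for vals in product(*grid.values())),
               key=lambda p: cross_validated_f1(samples, p))
    print(best)
    ```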

  15. New GMO regulations for old: Determining a new future for EU crop biotechnology.

    PubMed

    Davison, John; Ammann, Klaus

    2017-01-02

    In this review, current EU GMO regulations are subjected to a point-by-point analysis to determine their suitability for agriculture in modern Europe. Our analysis concerns present GMO regulations as well as suggestions for possible new regulations for genome editing and New Breeding Techniques (for which no regulations presently exist). Firstly, the present GMO regulations stem from the early days of recombinant DNA and are not adapted to current scientific understanding on this subject. Scientific understanding of GMOs has changed, and these regulations are now not only unfit for their original purpose, but the purpose itself is no longer scientifically valid. Indeed, they defy scientific, economic, and even common sense. A major EU regulatory preconception is that GM crops are basically different from their parent crops. Thus, the EU regulations are "process based" regulations that discriminate against GMOs simply because they are GMOs. However, current scientific evidence shows a blending of classical crops and their GMO counterparts with no clear demarcation line between them. Canada has a "product based" approach and determines the safety of each new crop variety independently of the process used to obtain it. We advise that the EC re-writes its outdated regulations and moves toward such a product based approach. Secondly, over the last few years new genome editing techniques (sometimes called New Breeding Techniques) have evolved. These techniques are basically mutagenesis techniques that can generate genomic diversity and have vast potential for crop improvement. They are not GMO based techniques (any more than mutagenesis is a GMO technique), since in many cases no new DNA is introduced. Thus they cannot simply be lumped together with GMOs (as many anti-GMO NGOs would prefer). The EU currently has no regulations to cover these new techniques. In this review, we make suggestions as to how these new gene-edited crops may be regulated. The EU is at a turning point where the wrong decision could destroy European agricultural competitiveness for decades to come.

  16. Altai pika (Ochotona alpina) alarm calls: individual acoustic variation and the phenomenon of call-synchronous ear folding behavior.

    PubMed

    Volodin, Ilya A; Matrosova, Vera A; Frey, Roland; Kozhevnikova, Julia D; Isaeva, Inna L; Volodina, Elena V

    2018-06-11

    Non-hibernating pikas collect winter food reserves and store them in hay piles. Individualization of alarm calls might allow discrimination between colony members and conspecifics trying to steal food items from a colony pile. We investigated vocal posture, vocal tract length, and individual acoustic variation of alarm calls, emitted by wild-living Altai pikas Ochotona alpina toward a researcher. Recording started when a pika started calling and lasted as long as possible. The alarm call series of 442 individual callers from different colonies consisted of discrete short (0.073-0.157 s), high-frequency (7.31-15.46 kHz), and frequency-modulated calls separated by irregular intervals. Analysis of 442 discrete calls, the second of each series, revealed that 44.34% calls lacked nonlinear phenomena, in 7.02% nonlinear phenomena covered less than half of call duration, and in 48.64% nonlinear phenomena covered more than half of call duration. Peak frequencies varied among individuals but always fitted one of three maxima corresponding to the vocal tract resonance frequencies (formants) calculated for an estimated 45-mm oral vocal tract. Discriminant analysis using variables of 8 calls per series of 36 different callers, each from a different colony, correctly assigned over 90% of the calls to individuals. Consequently, Altai pika alarm calls are individualistic and nonlinear phenomena might further increase this acoustic individualization. Additionally, video analysis revealed a call-synchronous, very fast (0.13-0.23 s) folding, depression, and subsequent re-expansion of the pinna confirming an earlier report of this behavior that apparently contributes to protecting the hearing apparatus from damage by the self-generated high-intensity alarm calls.
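    As an illustration of the discriminant-analysis step reported above, the sketch below classifies synthetic callers from a small set of acoustic measurements; the data, variable values, and noise levels are simulated assumptions, not the pika recordings.

    ```python
    # Sketch of caller identification with linear discriminant analysis.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_callers, calls_per_caller, n_vars = 36, 8, 6     # 36 callers, 8 calls, 6 measures

    # Each caller gets its own mean acoustic profile; its calls vary around it.
    caller_means = rng.normal(0, 1.5, size=(n_callers, n_vars))
    X = np.vstack([rng.normal(m, 0.5, size=(calls_per_caller, n_vars)) for m in caller_means])
    y = np.repeat(np.arange(n_callers), calls_per_caller)

    lda = LinearDiscriminantAnalysis()
    acc = cross_val_score(lda, X, y, cv=4).mean()      # chance level would be 1/36
    print(f"cross-validated assignment accuracy: {acc:.2f}")
    ```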

  17. Using telephony data to facilitate discovery of clinical workflows.

    PubMed

    Rucker, Donald W

    2017-04-19

    Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units using an enterprise unified communications server log file and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds, and of these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
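    A minimal sketch of the record-parsing and edge-list step described above. The column names, extensions, and cost-center dictionary are assumptions for illustration; the real analysis parsed an enterprise unified communications server log.

    ```python
    # Turn call detail records into a weighted edge list for a graph tool.
    import csv, io
    from collections import Counter

    # Stand-in for call detail records (caller, callee, duration in seconds).
    cdr_text = """caller,callee,duration_s
    x72001,x72002,25
    x72001,x72002,31
    x72002,x72003,240
    x72001,x72003,18
    """
    cost_center = {"x72001": "Emergency Dept", "x72002": "Radiology", "x72003": "Pharmacy"}

    pair_counts, pair_seconds = Counter(), Counter()
    for row in csv.DictReader(io.StringIO(cdr_text)):
        pair = (row["caller"].strip(), row["callee"].strip())
        pair_counts[pair] += 1
        pair_seconds[pair] += int(row["duration_s"])

    # Edge list: repetitive, high-volume call patterns become heavily weighted
    # edges between the clinical units in the cost-center dictionary.
    with open("edges.csv", "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["source", "target", "calls", "total_seconds"])
        for (src, dst), n in pair_counts.most_common():
            w.writerow([cost_center.get(src, src), cost_center.get(dst, dst),
                        n, pair_seconds[(src, dst)]])
    ```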

  18. Two-photon imaging and analysis of neural network dynamics

    NASA Astrophysics Data System (ADS)

    Lütcke, Henry; Helmchen, Fritjof

    2011-08-01

    The glow of a starry night sky, the smell of a freshly brewed cup of coffee or the sound of ocean waves breaking on the beach are representations of the physical world that have been created by the dynamic interactions of thousands of neurons in our brains. How the brain mediates perceptions, creates thoughts, stores memories and initiates actions remains one of the most profound puzzles in biology, if not all of science. A key to a mechanistic understanding of how the nervous system works is the ability to measure and analyze the dynamics of neuronal networks in the living organism in the context of sensory stimulation and behavior. Dynamic brain properties have been fairly well characterized on the microscopic level of individual neurons and on the macroscopic level of whole brain areas largely with the help of various electrophysiological techniques. However, our understanding of the mesoscopic level comprising local populations of hundreds to thousands of neurons (so-called 'microcircuits') remains comparably poor. Predominantly, this has been due to the technical difficulties involved in recording from large networks of neurons with single-cell spatial resolution and near-millisecond temporal resolution in the brain of living animals. In recent years, two-photon microscopy has emerged as a technique which meets many of these requirements and thus has become the method of choice for the interrogation of local neural circuits. Here, we review the state-of-research in the field of two-photon imaging of neuronal populations, covering the topics of microscope technology, suitable fluorescent indicator dyes, staining techniques, and in particular analysis techniques for extracting relevant information from the fluorescence data. We expect that functional analysis of neural networks using two-photon imaging will help to decipher fundamental operational principles of neural microcircuits.

  19. Modeling Complex Chemical Systems: Problems and Solutions

    NASA Astrophysics Data System (ADS)

    van Dijk, Jan

    2016-09-01

    Non-equilibrium plasmas in complex gas mixtures are at the heart of numerous contemporary technologies. They typically contain dozens to hundreds of species, involved in hundreds to thousands of reactions. Chemists and physicists have always been interested in what are now called chemical reduction techniques (CRT's). The idea of such CRT's is that they reduce the number of species that need to be considered explicitly without compromising the validity of the model. This is usually achieved on the basis of an analysis of the reaction time scales of the system under study, which identifies species that are in partial equilibrium after a given time span. The first such CRT that has been widely used in plasma physics was developed in the 1960's and resulted in the concept of effective ionization and recombination rates. It was later generalized to systems in which multiple levels are affected by transport. In recent years there has been a renewed interest in tools for chemical reduction and reaction pathway analysis. An example of the latter is the PumpKin tool. Another trend is that techniques that have previously been developed in other fields of science are adapted as to be able to handle the plasma state of matter. Examples are the Intrinsic Low-Dimensional Manifold (ILDM) method and its derivatives, which originate from combustion engineering, and the general-purpose Principal Component Analysis (PCA) technique. In this contribution we will provide an overview of the most common reduction techniques, then critically assess the pros and cons of the methods that have gained most popularity in recent years. Examples will be provided for plasmas in argon and carbon dioxide.

  20. On statistical inference in time series analysis of the evolution of road safety.

    PubMed

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident-occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
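    As a small illustration of the ARMA-type approach mentioned above, the sketch below fits an ARMA(1,1) model with an exposure covariate to synthetic data with serially correlated disturbances; the data and the model order are illustrative assumptions.

    ```python
    # ARMA-type model with an exogenous exposure measure (synthetic example).
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(42)
    n = 40
    exposure = np.linspace(100, 180, n) + rng.normal(0, 5, n)   # e.g. vehicle-km driven

    # Serially correlated disturbances: plain OLS here would understate the
    # standard errors, which is exactly the pitfall described in the abstract.
    noise = np.zeros(n)
    for i in range(1, n):
        noise[i] = 0.7 * noise[i - 1] + rng.normal(0, 8)
    fatalities = 900 - 2.5 * exposure + noise

    model = ARIMA(fatalities, exog=exposure, order=(1, 0, 1))   # ARMA(1,1) + covariate
    result = model.fit()
    print(result.summary().tables[1])   # exposure coefficient with honest std errors
    ```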

  1. Adaptation of warrant price with Black Scholes model and historical volatility

    NASA Astrophysics Data System (ADS)

    Aziz, Khairu Azlan Abd; Idris, Mohd Fazril Izhar Mohd; Saian, Rizauddin; Daud, Wan Suhana Wan

    2015-05-01

    This project discusses the pricing of warrants in Malaysia. The Black-Scholes model with a non-dividend approach and a linear interpolation technique was applied in pricing the call warrants. Three call warrants listed in Bursa Malaysia were selected randomly from UiTM's datastream. The findings show that the volatility of each call warrant differs from the others. We used the historical volatility, which describes the price movement by which an underlying share is expected to fluctuate within a period. The price obtained from the Black-Scholes model is compared with the actual market price. Mispricing of the call warrants contributes to under- or overvaluation. Other variables such as the interest rate, time to maturity, exercise price and underlying stock price are involved in pricing call warrants, as well as in measuring the moneyness of call warrants.
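    A minimal sketch of the pricing steps described above, assuming illustrative numbers rather than the warrants in the study: annualized historical volatility is estimated from log returns of the underlying, then plugged into the non-dividend Black-Scholes call formula.

    ```python
    # Historical volatility + Black-Scholes call price (no dividends).
    import numpy as np
    from scipy.stats import norm

    def historical_volatility(prices, trading_days=252):
        log_returns = np.diff(np.log(prices))
        return log_returns.std(ddof=1) * np.sqrt(trading_days)

    def black_scholes_call(S, K, T, r, sigma):
        d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

    # Illustrative daily closing prices of the underlying share.
    underlying = np.array([2.30, 2.34, 2.28, 2.41, 2.39, 2.45, 2.50, 2.47, 2.52, 2.55])
    sigma = historical_volatility(underlying)
    price = black_scholes_call(S=2.55, K=2.40, T=0.5, r=0.03, sigma=sigma)
    print(f"volatility = {sigma:.2%}, model call price = {price:.4f}")
    # Moneyness of the call warrant is simply S / K (here 2.55 / 2.40).
    ```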

  2. Trick or Technique?

    ERIC Educational Resources Information Center

    Sheard, Michael

    2009-01-01

    More often than one might at first imagine, a simple trick involving integration by parts can be used to compute indefinite integrals in unexpected and amusing ways. A systematic look at the trick illuminates the question of whether the trick is useful enough to be called an actual technique of integration.
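    A classic example of the kind of trick the article discusses (not necessarily the article's own example) is to integrate by parts twice and solve the resulting equation for the original integral:

    ```latex
    \begin{align*}
    I = \int e^{x}\cos x \, dx
      &= e^{x}\cos x + \int e^{x}\sin x \, dx \\
      &= e^{x}\cos x + e^{x}\sin x - \int e^{x}\cos x \, dx
       = e^{x}(\cos x + \sin x) - I, \\
    \text{so}\quad 2I &= e^{x}(\cos x + \sin x)
      \quad\Longrightarrow\quad
      I = \tfrac{1}{2}e^{x}(\sin x + \cos x) + C .
    \end{align*}
    ```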

  3. Metamodeling Techniques to Aid in the Aggregation Process of Large Hierarchical Simulation Models

    DTIC Science & Technology

    2008-08-01

    [Fragmented snippet; recoverable diagram labels: Campaign Level Model, Campaign Level Outputs, Aggregation, Metamodeling, Complexity (Spatial, Temporal, etc.).] … reduction, are called variance reduction techniques (VRT) [Law, 2006]. The implementation of some type of VRT can prove to be a very valuable tool …

  4. The Effect of Perceiving a Calling on Pakistani Nurses' Organizational Commitment, Organizational Citizenship Behavior, and Job Stress.

    PubMed

    Afsar, Bilal; Shahjehan, Asad; Cheema, Sadia; Javed, Farheen

    2018-03-01

    People differ considerably in the way in which they express and experience their nursing careers. The positive effects associated with having a calling may differ substantially based on individuals' abilities to live out their callings. In a working world where many individuals have little to no choice in their type of employment and thus are unable to live out a calling even if they have one, the current study examined how perceiving a calling and living a calling interacted to predict organizational commitment, organizational citizenship behavior, and job stress with career commitment mediating the effect of the interactions on the three outcome variables. The purpose of the study is to investigate the mediating effect of career commitment between the relationships of calling and (a) nurses' attitudes (organizational commitment), (b) behaviors (organizational citizenship behavior), and (c) subjective experiences regarding work (job stress). Using a descriptive exploratory design, data were collected from 332 registered nurses working in Pakistani hospitals. Descriptive analysis and hierarchical regression analysis were used for data analysis. Living a calling moderated the effect of calling on career commitment, organizational citizenship behavior, and job stress, and career commitment fully mediated the effect of calling on organizational commitment, organizational citizenship behavior, and job stress. Increasing the understanding of calling, living a calling, and career commitment may increase nurses' organizational commitment and organizational citizenship behavior and decrease job stress. The study provided evidence to help nursing managers and health policy makers integrate knowledge and skills related to calling into career interventions and help nurses discover their calling.

  5. The mediating effect of calling on the relationship between medical school students' academic burnout and empathy.

    PubMed

    Chae, Su Jin; Jeong, So Mi; Chung, Yoon-Sok

    2017-09-01

    This study is aimed at identifying the relationships between medical school students' academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. A mixed method study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students' empathy, and calling were utilized. For statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For qualitative approach, eight medical students participated in a focus group interview. The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students' empathy skills.

  6. A new stratification of mourning dove call-count routes

    USGS Publications Warehouse

    Blankenship, L.H.; Humphrey, A.B.; MacDonald, D.

    1971-01-01

    The mourning dove (Zenaidura macroura) call-count survey is a nationwide audio-census of breeding mourning doves. Recent analyses of the call-count routes have utilized a stratification based upon physiographic regions of the United States. An analysis of 5 years of call-count data, based upon stratification using potential natural vegetation, has demonstrated that this new stratification results in strata with greater homogeneity than the physiographic strata, provides lower error variance, and hence generates greater precision in the analysis without an increase in call-count routes. Error variance was reduced approximately 30 percent for the contiguous United States. This indicates that future analysis based upon the new stratification will result in an increased ability to detect significant year-to-year changes.

  7. Cold Calling and Web Postings: Do They Improve Students' Preparation and Learning in Statistics?

    ERIC Educational Resources Information Center

    Levy, Dan

    2014-01-01

    Getting students to prepare well for class is a common challenge faced by instructors all over the world. This study investigates the effects that two frequently used techniques to increase student preparation--web postings and cold calling--have on student outcomes. The study is based on two experiments and a qualitative study conducted in a…

  8. Comparative Analysis of Sequential Proximal Optimizing Technique Versus Kissing Balloon Inflation Technique in Provisional Bifurcation Stenting: Fractal Coronary Bifurcation Bench Test.

    PubMed

    Finet, Gérard; Derimay, François; Motreff, Pascal; Guerin, Patrice; Pilet, Paul; Ohayon, Jacques; Darremont, Olivier; Rioufol, Gilles

    2015-08-24

    This study used a fractal bifurcation bench model to compare 6 optimization sequences for coronary bifurcation provisional stenting, including 1 novel sequence without kissing balloon inflation (KBI), comprising initial proximal optimizing technique (POT) + side-branch inflation (SBI) + final POT, called "re-POT." In provisional bifurcation stenting, KBI fails to improve the rate of major adverse cardiac events. Proximal geometric deformation increases the rate of in-stent restenosis and target lesion revascularization. A bifurcation bench model was used to compare KBI alone, KBI after POT, KBI with asymmetric inflation pressure after POT, and 2 sequences without KBI: initial POT plus SBI, and initial POT plus SBI with final POT (called "re-POT"). For each protocol, 5 stents were tested using 2 different drug-eluting stent designs: that is, a total of 60 tests. Compared with the classic KBI-only sequence and those associating POT with modified KBI, the re-POT sequence gave significantly (p < 0.05) better geometric results: it reduced SB ostium stent-strut obstruction from 23.2 ± 6.0% to 5.6 ± 8.3%, provided perfect proximal stent apposition with almost perfect circularity (ellipticity index reduced from 1.23 ± 0.02 to 1.04 ± 0.01), reduced proximal area overstretch from 24.2 ± 7.6% to 8.0 ± 0.4%, and reduced global strut malapposition from 40 ± 6.2% to 2.6 ± 1.4%. In comparison with 5 other techniques, the re-POT sequence significantly optimized the final result of provisional coronary bifurcation stenting, maintaining circular geometry while significantly reducing SB ostium strut obstruction and global strut malapposition. These experimental findings confirm that provisional stenting may be optimized more effectively without KBI using re-POT. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  9. Fabrication of dense wavelength division multiplexing filters with large useful area

    NASA Astrophysics Data System (ADS)

    Lee, Cheng-Chung; Chen, Sheng-Hui; Hsu, Jin-Cherng; Kuo, Chien-Cheng

    2006-08-01

    Dense Wavelength Division Multiplexers (DWDM), a kind of narrow band-pass filter, are extremely sensitive to the optical thickness error in each composite layer. Therefore, achieving a large useful coating area is extremely difficult because of the uniformity problem. To enlarge the useful coating area, it is necessary to improve their design and their fabrication. In this study, we discuss how the tooling factors at different positions and for different materials are related to the optical performance of the design. 100GHz DWDM filters were fabricated by E-gun evaporation with ion-assisted deposition (IAD). To improve the coating uniformity, an analysis technique called shaping tooling factor (STF) was used to analyze the deviation of the optical thickness in different materials so as to enlarge the useful coating area. Also, a technique of etching the deposited layers with oxygen ions was introduced. When the above techniques were applied in the fabrication of 100 GHz DWDM filters, the uniformity was better than +/-0.002% over an area of 72 mm in diameter and better than +/-0.0006% over 20 mm in diameter.

  10. Compressible cavitation with stochastic field method

    NASA Astrophysics Data System (ADS)

    Class, Andreas; Dumond, Julien

    2012-11-01

    Non-linear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally, the simulation of pdf transport requires Monte-Carlo codes based on Lagrange particles or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic field method, which solves pdf transport based on Euler fields, has been proposed; it eliminates the necessity to mix Euler and Lagrange techniques or prescribed pdf assumptions. In the present work, part of a PhD project on the design and analysis of a Passive Outflow Reducer relying on cavitation, a first application of the stochastic field method to multi-phase flow, and in particular to cavitating flow, is presented. The application considered is a nozzle subjected to high velocity flow so that sheet cavitation is observed near the nozzle surface in the divergent section. It is demonstrated that the stochastic field formulation captures the wide range of pdf shapes present at different locations. The method is compatible with finite-volume codes where all existing physical models available for Lagrange techniques, presumed pdf or binning methods can be easily extended to the stochastic field formulation.

  11. Fundamentals and techniques of nonimaging optics research

    NASA Astrophysics Data System (ADS)

    Winston, R.; Ogallagher, J.

    1987-07-01

    Nonimaging Optics differs from conventional approaches in its relaxation of unnecessary constraints on energy transport imposed by the traditional methods for optimizing image formation and its use of more broadly based analytical techniques such as phase space representations of energy flow, radiative transfer analysis, thermodynamic arguments, etc. Based on these means, techniques for designing optical elements which approach and in some cases attain the maximum concentration permitted by the Second Law of Thermodynamics were developed. The most widely known of these devices are the family of Compound Parabolic Concentrators (CPC's) and their variants and the so-called Flow-Line or trumpet concentrator derived from the geometric vector flux formalism developed under this program. Applications of these and other such ideal or near-ideal devices permit increases of typically a factor of four (though in some cases as much as an order of magnitude) in the concentration above that possible with conventional means. Present efforts can be classed into two main areas: (1) classical geometrical nonimaging optics, and (2) logical extensions of nonimaging concepts to the physical optics domain.

  12. Fundamentals and techniques of nonimaging optics research at the University of Chicago

    NASA Astrophysics Data System (ADS)

    Winston, R.; Ogallagher, J.

    1986-11-01

    Nonimaging Optics differs from conventional approaches in its relaxation of unnecessary constraints on energy transport imposed by the traditional methods for optimizing image formation and its use of more broadly based analytical techniques such as phase space representations of energy flow, radiative transfer analysis, thermodynamic arguments, etc. Based on these means, techniques for designing optical elements which approach and in some cases attain the maximum concentration permitted by the Second Law of Thermodynamics were developed. The most widely known of these devices are the family of Compound Parabolic Concentrators (CPC's) and their variants and the so-called Flow-Line concentrator derived from the geometric vector flux formalism developed under this program. Applications of these and other such ideal or near-ideal devices permit increases of typically a factor of four (though in some cases as much as an order of magnitude) in the concentration above that possible with conventional means. In the most recent phase, our efforts can be classed into two main areas: (a) 'classical' geometrical nonimaging optics, and (b) logical extensions of nonimaging concepts to the physical optics domain.

  13. An explicit approach to detecting and characterizing submersed aquatic vegetation using a single-beam digital echosounder

    NASA Astrophysics Data System (ADS)

    Sabol, Bruce M.

    2005-09-01

    There has been a longstanding need for an objective and cost-effective technique to detect, characterize, and quantify submersed aquatic vegetation at spatial scales between direct physical sampling and remote aerial-based imaging. Acoustic-based approaches for doing so are reviewed, and an explicit approach, using a narrow, single-beam echosounder, is described in detail. This heuristic algorithm is based on the spatial distribution of a thresholded signal generated from a high-frequency, narrow-beam echosounder operated in a vertical orientation from a survey boat. The physical basis, rationale, and implementation of this algorithm are described, and data documenting performance are presented. Using this technique, it is possible to generate orders of magnitude more data than would be available using previous techniques with a comparable level of effort. Thus, new analysis and interpretation approaches are called for which can make full use of these data. Several example analyses are shown for environmental-effects application studies. Current operational window and performance limitations are identified, and thoughts on potential processing approaches to improve performance are discussed.

  14. Prospective motion correction of high-resolution magnetic resonance imaging data in children.

    PubMed

    Brown, Timothy T; Kuperman, Joshua M; Erhart, Matthew; White, Nathan S; Roddey, J Cooper; Shankaranarayanan, Ajit; Han, Eric T; Rettmann, Dan; Dale, Anders M

    2010-10-15

    Motion artifacts pose significant problems for the acquisition and analysis of high-resolution magnetic resonance imaging data. These artifacts can be particularly severe when studying pediatric populations, where greater patient movement reduces the ability to clearly view and reliably measure anatomy. In this study, we tested the effectiveness of a new prospective motion correction technique, called PROMO, as applied to making neuroanatomical measures in typically developing school-age children. This method attempts to address the problem of motion at its source by keeping the measurement coordinate system fixed with respect to the subject throughout image acquisition. The technique also performs automatic rescanning of images that were acquired during intervals of particularly severe motion. Unlike many previous techniques, this approach adjusts for both in-plane and through-plane movement, greatly reducing image artifacts without the need for additional equipment. Results show that the use of PROMO notably enhances subjective image quality, reduces errors in Freesurfer cortical surface reconstructions, and significantly improves the subcortical volumetric segmentation of brain structures. Further applications of PROMO for clinical and cognitive neuroscience are discussed. Copyright 2010 Elsevier Inc. All rights reserved.

  15. Non-uniform sampling: post-Fourier era of NMR data collection and processing.

    PubMed

    Kazimierczuk, Krzysztof; Orekhov, Vladislav

    2015-11-01

    The invention of multidimensional techniques in the 1970s revolutionized NMR, making it the general tool of structural analysis of molecules and materials. In the most straightforward approach, the signal sampling in the indirect dimensions of a multidimensional experiment is performed in the same manner as in the direct dimension, i.e. with a grid of equally spaced points. This results in lengthy experiments with a resolution often far from optimum. To circumvent this problem, numerous sparse-sampling techniques have been developed in the last three decades, including two traditionally distinct approaches: the radial sampling and non-uniform sampling. This mini review discusses the sparse signal sampling and reconstruction techniques from the point of view of an underdetermined linear algebra problem that arises when a full, equally spaced set of sampled points is replaced with sparse sampling. Additional assumptions that are introduced to solve the problem, as well as the shape of the undersampled Fourier transform operator (visualized as so-called point spread function), are shown to be the main differences between various sparse-sampling methods. Copyright © 2015 John Wiley & Sons, Ltd.

  16. On the utilization of engineering knowledge in design optimization

    NASA Technical Reports Server (NTRS)

    Papalambros, P.

    1984-01-01

    Some current research work conducted at the University of Michigan is described to illustrate efforts for incorporating knowledge in optimization in a nontraditional way. The incorporation of available knowledge in a logic structure is examined in two circumstances. The first examines the possibility of introducing global design information in a local active set strategy implemented during the iterations of projection-type algorithms for nonlinearly constrained problems. The technique used combines global and local monotonicity analysis of the objective and constraint functions. The second examines a knowledge-based program which aids the user in creating configurations that are most desirable from the manufacturing assembly viewpoint. The data bank used is the classification scheme suggested by Boothroyd. The important aspect of this program is that it is an aid for synthesis intended for use in the design concept phase in a way similar to the so-called idea-triggers in creativity-enhancement techniques like brain-storming. The idea generation, however, is not random but is driven by the goal of achieving the best acceptable configuration.

  17. Measurement of Stress Distribution Around a Circular Hole in a Plate Under Bending Moment Using Phase-shifting Method with Reflective Polariscope Arrangement

    NASA Astrophysics Data System (ADS)

    Baek, Tae Hyun

    Photoelasticity is one of the most widely used whole-field optical methods for stress analysis. The technique of birefringent coatings, also called the method of photoelastic coatings, extends the classical procedures of model photoelasticity to the measurement of surface strains in opaque models made of any structural material. Photoelastic phase-shifting method can be used for the determination of the phase values of isochromatics and isoclinics. In this paper, photoelastic phase-shifting technique and conventional Babinet-Soleil compensation method were utilized to analyze a specimen with a triangular hole and a circular hole under bending. Photoelastic phase-shifting technique is whole-field measurement. On the other hand, conventional compensation method is point measurement. Three groups of results were obtained by phase-shifting method with reflective polariscope arrangement, conventional compensation method and FEM simulation, respectively. The results from the first two methods agree with each other relatively well considering experiment error. The advantage of photoelastic phase-shifting method is that it is possible to measure the stress distribution accurately close to the edge of holes.

  18. MindDigger: Feature Identification and Opinion Association for Chinese Movie Reviews

    NASA Astrophysics Data System (ADS)

    Zhao, Lili; Li, Chunping

    In this paper, we present a prototype system called MindDigger, which can be used to analyze the opinions in Chinese movie reviews. Different from previous research that employed techniques on product reviews, we focus on Chinese movie reviews, in which opinions are expressed in subtle and varied ways. The system designed in this work aims to extract the opinion expressions and assign them to the corresponding features. The core tasks include feature and opinion extraction, and feature-opinion association. To deal with Chinese effectively, several novel approaches based on syntactic analysis are proposed in this paper. Evaluation results show that the performance is satisfactory.

  19. NERVA dynamic analysis methodology, SPRVIB

    NASA Technical Reports Server (NTRS)

    Vronay, D. F.

    1972-01-01

    The general dynamic computer code called SPRVIB (Spring Vib) developed in support of the NERVA (nuclear engine for rocket vehicle application) program is described. Using normal mode techniques, the program computes kinematical responses of a structure caused by various combinations of harmonic and elliptic forcing functions or base excitations. Provision is made for a graphical type of force or base excitation input to the structure. A description of the required input format and a listing of the program are presented, along with several examples illustrating the use of the program. SPRVIB is written in FORTRAN 4 computer language for use on the CDC 6600 or the IBM 360/75 computers.

  20. The robot's eyes - Stereo vision system for automated scene analysis

    NASA Technical Reports Server (NTRS)

    Williams, D. S.

    1977-01-01

    Attention is given to the robot stereo vision system which maintains the image produced by solid-state detector television cameras in a dynamic random access memory called RAPID. The imaging hardware consists of sensors (two solid-state image arrays using a charge injection technique), a video-rate analog-to-digital converter, the RAPID memory, and various types of computer-controlled displays, and preprocessing equipment (for reflexive actions, processing aids, and object detection). The software is aimed at locating objects and determining traversability. An object-tracking algorithm is discussed, and it is noted that tracking speed is in the 50-75 pixels/s range.

  1. The 'relics of Joan of Arc': a forensic multidisciplinary analysis.

    PubMed

    Charlier, P; Poupon, J; Eb, A; De Mazancourt, P; Gilbert, T; Huynh-Charlier, I; Loublier, Y; Verhille, A M; Moulheirat, C; Patou-Mathis, M; Robbiola, L; Montagut, R; Masson, F; Etcheberry, A; Brun, L; Willerslev, E; de la Grandmaison, G Lorin; Durigon, M

    2010-01-30

    Archaeological remains can provide concrete cases, making it possible to develop, refine or validate medico-legal techniques. In the case of the so-called 'Joan of Arc's relics' (a group of bone and archaeological remains known as the 'Bottle of Chinon'), 14 specialists analysed the samples such as a cadaver X of carbonised aspect: forensic anthropologist, medical examiners, pathologists, geneticists, radiologist, biochemists, palynologists, zoologist and archaeologist. Materials, methods and results of this study are presented here. This study aims to offer an exploitable methodology for the modern medico-legal cases of small quantities of human bones of carbonised aspect. 2009 Elsevier Ireland Ltd. All rights reserved.

  2. On Using Meta-Modeling and Multi-Modeling to Address Complex Problems

    ERIC Educational Resources Information Center

    Abu Jbara, Ahmed

    2013-01-01

    Models, created using different modeling techniques, usually serve different purposes and provide unique insights. While each modeling technique might be capable of answering specific questions, complex problems require multiple models interoperating to complement/supplement each other; we call this Multi-Modeling. To address the syntactic and…

  3. Implications of Windowing Techniques for CAI.

    ERIC Educational Resources Information Center

    Heines, Jesse M.; Grinstein, Georges G.

    This paper discusses the use of a technique called windowing in computer assisted instruction to allow independent control of functional areas in complex CAI displays and simultaneous display of output from a running computer program and coordinated instructional material. Two obstacles to widespread use of CAI in computer science courses are…

  4. Ketso: A New Tool for Extension Professionals

    ERIC Educational Resources Information Center

    Bates, James S.

    2016-01-01

    Extension professionals employ many techniques and tools to obtain feedback, input, information, and data from stakeholders, research participants, and program learners. An information-gathering tool called Ketso is described in this article. This tool and its associated techniques can be used in all phases of program development, implementation,…

  5. Measuring the apparent size of the Moon with a digital camera

    NASA Astrophysics Data System (ADS)

    Ellery, Adam; Hughes, Stephen

    2012-09-01

    The Moon appears to be much larger closer to the horizon than when higher in the sky. This is called the ‘Moon illusion’ since the observed size of the Moon is not actually larger when the Moon is just above the horizon. This paper describes a technique for verifying that the observed size of the Moon is not larger on the horizon. The technique can be performed easily in a high-school teaching environment. Moreover, the technique demonstrates the surprising fact that the observed size of the Moon is actually smaller on the horizon due to atmospheric refraction. For the purposes of this paper, several images of the Moon were taken with it close to the horizon and close to the zenith. The images were processed using a free program called ImageJ. The Moon was found to be 5.73 ± 0.04% smaller in area on the horizon than at the zenith.
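    The area-comparison step can be reproduced in a few lines; the sketch below uses synthetic disk images with assumed pixel diameters in place of the thresholded ImageJ measurements.

    ```python
    # Compare the apparent Moon area in two images by counting disk pixels.
    import numpy as np

    def disk_image(diameter_px, size=400):
        y, x = np.mgrid[:size, :size]
        r = np.hypot(x - size / 2, y - size / 2)
        return (r <= diameter_px / 2).astype(float)

    horizon = disk_image(190.0)     # assumed diameter near the horizon (pixels)
    zenith = disk_image(195.7)      # assumed diameter near the zenith (pixels)

    area_h, area_z = horizon.sum(), zenith.sum()   # pixel counts above threshold
    print(f"horizon disk is {100 * (1 - area_h / area_z):.2f}% smaller in area")
    print("diameter ratio:", np.sqrt(area_h / area_z))  # diameters scale as sqrt(area)
    ```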

  6. Modeling and performance analysis using extended fuzzy-timing Petri nets for networked virtual environments.

    PubMed

    Zhou, Y; Murata, T; Defanti, T A

    2000-01-01

    Despite their attractive properties, networked virtual environments (net-VEs) are notoriously difficult to design, implement, and test due to the concurrency, real-time and networking features in these systems. Net-VEs demand high quality-of-service (QoS) requirements on the network to maintain natural and real-time interactions among users. The current practice for net-VE design is basically trial and error, empirical, and totally lacks formal methods. This paper proposes to apply a Petri net formal modeling technique to a net-VE called NICE (narrative immersive constructionist/collaborative environment), to predict the net-VE performance based on simulation, and to improve the net-VE performance. NICE is essentially a network of collaborative virtual reality systems called the CAVE (CAVE automatic virtual environment). First, we introduce extended fuzzy-timing Petri net (EFTN) modeling and analysis techniques. Then, we present EFTN models of the CAVE, NICE, and the transport layer protocol used in NICE: the transmission control protocol (TCP). We show the possibility analysis based on the EFTN model for the CAVE. Then, by using these models and Design/CPN as the simulation tool, we conducted various simulations to study real-time behavior, network effects and performance (latencies and jitters) of NICE. Our simulation results are consistent with experimental data.

  7. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed Central

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed notably regarding to its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation. PMID:12463921

  8. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed notably regarding to its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation.

  9. Time-frequency analysis of electric motors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bentley, C.L.; Dunn, M.E.; Mattingly, J.K.

    1995-12-31

    Physical signals such as the current of an electric motor become nonstationary as a consequence of degraded operation and broken parts. In this instance, their power spectral densities become time dependent, and time-frequency analysis techniques become the appropriate tools for signal analysis. The first among these techniques, generally called the short-time Fourier transform (STFT) method, is the Gabor transform (GT) of a signal S(t), which decomposes the signal into time-local frequency modes: $GT(\tau,\omega)=\int S(t)\,\Phi(t-\tau)\,e^{-i\omega t}\,dt$, where the window function, $\Phi(t-\tau)$, is a normalized Gaussian. Alternatively, one can decompose the signal into its multi-resolution representation at different levels of magnification. This representation is achieved by the continuous wavelet transform (CWT), $W_g(a,b)=\frac{1}{\sqrt{a}}\int S(t)\,g\!\left(\frac{t-b}{a}\right)dt$, where the function g(t) is a kernel of zero average belonging to a family of scaled and shifted wavelet kernels. The CWT can be interpreted as the action of a microscope that locates the signal by the shift parameter b and adjusts its magnification by changing the scale parameter a. The Fourier-transformed CWT, $W_g(a,\omega)$, acts as a filter that places the high-frequency content of a signal into the lower end of the scale spectrum and vice versa for the low frequencies. Signals from a motor in three different states were analyzed.
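    A direct numerical sketch of the Gabor transform as written above (toy signal, window width, and frequency grid chosen for illustration):

    ```python
    # Discrete approximation of GT(tau, omega) = integral S(t) Phi(t - tau) exp(-i omega t) dt
    # with a Gaussian window normalized to unit energy.
    import numpy as np

    def gabor_transform(signal, t, taus, omegas, sigma=0.05):
        dt = t[1] - t[0]
        gt = np.zeros((len(taus), len(omegas)), dtype=complex)
        for i, tau in enumerate(taus):
            window = np.exp(-0.5 * ((t - tau) / sigma) ** 2)
            window /= np.sqrt(np.sum(window**2) * dt)          # normalize window energy
            for j, w in enumerate(omegas):
                gt[i, j] = np.sum(signal * window * np.exp(-1j * w * t)) * dt
        return gt

    # Toy nonstationary signal: the frequency shifts halfway through, a crude
    # analogue of a motor whose spectral content changes with degradation.
    t = np.linspace(0, 1, 2000)
    s = np.where(t < 0.5, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 120 * t))
    taus = np.linspace(0, 1, 50)
    omegas = 2 * np.pi * np.linspace(10, 200, 100)
    power = np.abs(gabor_transform(s, t, taus, omegas))**2     # time-dependent spectrum
    ```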

  10. Statistical inference of dynamic resting-state functional connectivity using hierarchical observation modeling.

    PubMed

    Sojoudi, Alireza; Goodyear, Bradley G

    2016-12-01

    Spontaneous fluctuations of blood-oxygenation level-dependent functional magnetic resonance imaging (BOLD fMRI) signals are highly synchronous between brain regions that serve similar functions. This provides a means to investigate functional networks; however, most analysis techniques assume functional connections are constant over time. This may be problematic in the case of neurological disease, where functional connections may be highly variable. Recently, several methods have been proposed to determine moment-to-moment changes in the strength of functional connections over an imaging session (so-called dynamic connectivity). Here a novel analysis framework based on a hierarchical observation modeling approach was proposed to permit statistical inference of the presence of dynamic connectivity. A two-level linear model composed of overlapping sliding windows of fMRI signals, incorporating the fact that overlapping windows are not independent, was described. To test this approach, datasets were synthesized whereby functional connectivity was either constant (significant or insignificant) or modulated by an external input. The method successfully determines the statistical significance of a functional connection in phase with the modulation, and it exhibits greater sensitivity and specificity in detecting regions with variable connectivity, when compared with sliding-window correlation analysis. For real data, this technique possesses greater reproducibility and provides a more discriminative estimate of dynamic connectivity than sliding-window correlation analysis. Hum Brain Mapp 37:4566-4580, 2016. © 2016 Wiley Periodicals, Inc.
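    For comparison, the sliding-window correlation baseline referred to above can be sketched as follows, using synthetic two-region signals whose coupling is switched on and off by a slow modulation (window length and step are arbitrary choices):

    ```python
    # Sliding-window correlation on synthetic two-region BOLD-like signals.
    import numpy as np

    rng = np.random.default_rng(7)
    n_vols, win, step = 300, 40, 5
    t = np.arange(n_vols)

    # Connectivity modulated by a slow external input, as in the synthetic datasets.
    modulation = (np.sin(2 * np.pi * t / 150) > 0).astype(float)
    shared = rng.normal(size=n_vols)
    region_a = shared + rng.normal(0, 1.0, n_vols)
    region_b = modulation * shared + rng.normal(0, 1.0, n_vols)

    starts = range(0, n_vols - win + 1, step)
    dynamic_r = np.array([np.corrcoef(region_a[s:s + win], region_b[s:s + win])[0, 1]
                          for s in starts])
    # dynamic_r rises and falls with the modulation; note that overlapping windows
    # are not independent, which is what the two-level model above accounts for.
    ```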

  11. Exploratory and spatial data analysis (EDA-SDA) for determining regional background levels and anomalies of potentially toxic elements in soils from Catorce-Matehuala, Mexico

    USGS Publications Warehouse

    Chiprés, J.A.; Castro-Larragoitia, J.; Monroy, M.G.

    2009-01-01

    The threshold between geochemical background and anomalies can be influenced by the methodology selected for its estimation. Environmental evaluations, particularly those conducted in mineralized areas, must consider this when trying to determine the natural geochemical status of a study area, quantify human impacts, or establish soil restoration values for contaminated sites. Some methods in environmental geochemistry incorporate the premise that anomalies (natural or anthropogenic) and background data are characterized by their own probabilistic distributions. One of these methods uses exploratory data analysis (EDA) on regional geochemical data sets coupled with a geographic information system (GIS) to spatially understand the processes that influence the geochemical landscape in a technique that can be called a spatial data analysis (SDA). This EDA-SDA methodology was used to establish the regional background range from the area of Catorce-Matehuala in north-central Mexico. Probability plots of the data, particularly for those areas affected by human activities, show that the regional geochemical background population is composed of smaller subpopulations associated with factors such as soil type and parent material. This paper demonstrates that the EDA-SDA method offers more certainty in defining thresholds between geochemical background and anomaly than a numeric technique, making it a useful tool for regional geochemical landscape analysis and environmental geochemistry studies.

  12. On-call service of neurosurgeons in Germany: organization, use of communication services, and personal acceptance of modern technologies.

    PubMed

    Brenke, Christopher; Lassel, Elke A; Terris, Darcey; Kurt, Aysel; Schmieder, Kirsten; Schoenberg, Stefan O; Weisser, Gerald

    2014-05-01

    A significant proportion of acute care neurosurgical patients present to hospital outside regular working hours. The objective of our study was to evaluate the structure of neurosurgical on-call services in Germany, the use of modern communication devices and teleradiology services, and the personal acceptance of modern technologies by neurosurgeons. A nationwide survey of all 141 neurosurgical departments in Germany was performed. The questionnaire consisted of two parts: one for neurosurgical departments and one for individual neurosurgeons. The questionnaire, available online and mailed in paper form, included 21 questions about on-call service structure; the availability and use of communication devices, teleradiology services, and other information services; and neurosurgeons' personal acceptance of modern technologies. The questionnaire return rate from departments was 63.1% (89/141), whereas 187 individual neurosurgeons responded. For 57.3% of departments, teleradiology services were available and were frequently used by 62.2% of neurosurgeons. A further 23.6% of departments described using smartphone screenshots of computed tomography (CT) images transmitted by multimedia messaging service (MMS), and 8.6% of images were described as sent by unencrypted email. Although 47.0% of neurosurgeons reported owning a smartphone, only 1.1% used their phone for on-call image communication. Teleradiology services were observed to be widely used by on-call neurosurgeons in Germany. Nevertheless, a significant number of departments appear to use outdated techniques or techniques that leave patient data unprotected. On-call neurosurgeons in Germany report a willingness to adopt more modern approaches, utilizing readily available smartphones or tablet technology. Georg Thieme Verlag KG Stuttgart · New York.

  13. Patome: a database server for biological sequence annotation and analysis in issued patents and published patent applications.

    PubMed

    Lee, Byungwook; Kim, Taehyung; Kim, Seon-Kyu; Lee, Kwang H; Lee, Doheon

    2007-01-01

    With the advent of automated and high-throughput techniques, the number of patent applications containing biological sequences has been increasing rapidly. However, they have attracted relatively little attention compared to other sequence resources. We have built a database server called Patome, which contains biological sequence data disclosed in patents and published applications, as well as their analysis information. The analysis is divided into two steps. The first is an annotation step in which the disclosed sequences were annotated with RefSeq database. The second is an association step where the sequences were linked to Entrez Gene, OMIM and GO databases, and their results were saved as a gene-patent table. From the analysis, we found that 55% of human genes were associated with patenting. The gene-patent table can be used to identify whether a particular gene or disease is related to patenting. Patome is available at http://www.patome.org/; the information is updated bimonthly.

  14. Patome: a database server for biological sequence annotation and analysis in issued patents and published patent applications

    PubMed Central

    Lee, Byungwook; Kim, Taehyung; Kim, Seon-Kyu; Lee, Kwang H.; Lee, Doheon

    2007-01-01

    With the advent of automated and high-throughput techniques, the number of patent applications containing biological sequences has been increasing rapidly. However, they have attracted relatively little attention compared to other sequence resources. We have built a database server called Patome, which contains biological sequence data disclosed in patents and published applications, as well as their analysis information. The analysis is divided into two steps. The first is an annotation step in which the disclosed sequences were annotated with RefSeq database. The second is an association step where the sequences were linked to Entrez Gene, OMIM and GO databases, and their results were saved as a gene–patent table. From the analysis, we found that 55% of human genes were associated with patenting. The gene–patent table can be used to identify whether a particular gene or disease is related to patenting. Patome is available at http://www.patome.org/; the information is updated bimonthly. PMID:17085479

  15. Nanomaterials as Assisted Matrix of Laser Desorption/Ionization Time-of-Flight Mass Spectrometry for the Analysis of Small Molecules.

    PubMed

    Lu, Minghua; Yang, Xueqing; Yang, Yixin; Qin, Peige; Wu, Xiuru; Cai, Zongwei

    2017-04-21

    Matrix-assisted laser desorption/ionization (MALDI), a soft ionization method, coupled with time-of-flight mass spectrometry (TOF MS), has become an indispensable tool for analyzing macromolecules such as peptides, proteins, nucleic acids and polymers. However, the application of MALDI to the analysis of small molecules (<700 Da) has been a great challenge because of interference from the conventional matrix in the low-mass region. To overcome this drawback, much attention has been paid to exploring interference-free methods over the past decade. The technique of applying nanomaterials as the matrix for laser desorption/ionization (LDI), also called nanomaterial-assisted laser desorption/ionization (nanomaterial-assisted LDI), has attracted considerable attention for the analysis of low-molecular-weight compounds by TOF MS. This review mainly summarizes the applications of different types of nanomaterials, including carbon-based, metal-based and metal-organic frameworks, as assisted matrices for LDI in the analysis of small biological molecules, environmental pollutants and other low-molecular-weight compounds.

  16. Nanomaterials as Assisted Matrix of Laser Desorption/Ionization Time-of-Flight Mass Spectrometry for the Analysis of Small Molecules

    PubMed Central

    Lu, Minghua; Yang, Xueqing; Yang, Yixin; Qin, Peige; Wu, Xiuru; Cai, Zongwei

    2017-01-01

    Matrix-assisted laser desorption/ionization (MALDI), a soft ionization method, coupled with time-of-flight mass spectrometry (TOF MS), has become an indispensable tool for analyzing macromolecules such as peptides, proteins, nucleic acids and polymers. However, the application of MALDI to the analysis of small molecules (<700 Da) has been a great challenge because of interference from the conventional matrix in the low-mass region. To overcome this drawback, much attention has been paid to exploring interference-free methods over the past decade. The technique of applying nanomaterials as the matrix for laser desorption/ionization (LDI), also called nanomaterial-assisted laser desorption/ionization (nanomaterial-assisted LDI), has attracted considerable attention for the analysis of low-molecular-weight compounds by TOF MS. This review mainly summarizes the applications of different types of nanomaterials, including carbon-based, metal-based and metal-organic frameworks, as assisted matrices for LDI in the analysis of small biological molecules, environmental pollutants and other low-molecular-weight compounds. PMID:28430138

  17. Non-Gradient Blue Native Polyacrylamide Gel Electrophoresis.

    PubMed

    Luo, Xiaoting; Wu, Jinzi; Jin, Zhen; Yan, Liang-Jun

    2017-02-02

    Gradient blue native polyacrylamide gel electrophoresis (BN-PAGE) is a well established and widely used technique for activity analysis of high-molecular-weight proteins, protein complexes, and protein-protein interactions. Since its inception in the early 1990s, a variety of minor modifications have been made to this gradient gel analytical method. Here we provide a major modification of the method, which we call non-gradient BN-PAGE. The procedure, similar to that of non-gradient SDS-PAGE, is simple because there is no expensive gradient maker involved. The non-gradient BN-PAGE protocols presented herein provide guidelines on the analysis of mitochondrial protein complexes, in particular, dihydrolipoamide dehydrogenase (DLDH) and those in the electron transport chain. Protocols for the analysis of blood esterases or mitochondrial esterases are also presented. The non-gradient BN-PAGE method may be tailored for analysis of specific proteins according to their molecular weight regardless of whether the target proteins are hydrophobic or hydrophilic.

  18. Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Tae-Hyuk; Chai, Juanjuan; Pan, Chongle

    Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm's performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source codes and binaries freely available at http://sigma.omicsbio.org.

  19. Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance

    DOE PAGES

    Ahn, Tae-Hyuk; Chai, Juanjuan; Pan, Chongle

    2014-09-29

    Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm's performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source codes and binaries freely available at http://sigma.omicsbio.org.
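
    As a simplified illustration of strain-level read assignment, the sketch below assigns each read to its highest-scoring reference genome and reports relative abundances. Sigma itself uses a probabilistic model with hypothesis testing and confidence intervals, so this best-hit toy (with hypothetical alignment scores) only conveys the general idea.

```python
from collections import defaultdict

def assign_reads(alignments):
    """alignments: {read_id: {genome_id: alignment_score}} (hypothetical input).
    Each read is assigned to its highest-scoring reference genome."""
    counts = defaultdict(int)
    for read, hits in alignments.items():
        best = max(hits, key=hits.get)   # best-hit assignment
        counts[best] += 1
    total = sum(counts.values())
    return {g: n / total for g, n in counts.items()}  # relative abundances

example = {
    "r1": {"strainA": 60, "strainB": 42},
    "r2": {"strainA": 55},
    "r3": {"strainB": 58, "strainA": 20},
}
print(assign_reads(example))
```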

  20. Direct determination of the local Hamaker constant of inorganic surfaces based on scanning force microscopy

    NASA Astrophysics Data System (ADS)

    Krajina, Brad A.; Kocherlakota, Lakshmi S.; Overney, René M.

    2014-10-01

    The energetics involved in the bonding fluctuations between nanometer-sized silicon dioxide (SiO2) probes and highly oriented pyrolytic graphite (HOPG) and molybdenum disulfide (MoS2) could be quantified directly and locally on the submicron scale via a time-temperature superposition analysis of the lateral forces between scanning force microscopy silicon dioxide probes and inorganic sample surfaces. The so-called "intrinsic friction analysis" (IFA) provided direct access to the Hamaker constants for HOPG and MoS2, as well as the control sample, calcium fluoride (CaF2). The use of a scanning probe enables nanoscopic analysis of bonding fluctuations, thereby overcoming challenges associated with larger scale inhomogeneity and surface roughness common to conventional techniques used to determine surface free energies and dielectric properties. A complementary numerical analysis based on optical and electron energy loss spectroscopy and the Lifshitz quantum electrodynamic theory of van der Waals interactions is provided and confirms quantitatively the IFA results.

  1. Direct determination of the local Hamaker constant of inorganic surfaces based on scanning force microscopy.

    PubMed

    Krajina, Brad A; Kocherlakota, Lakshmi S; Overney, René M

    2014-10-28

    The energetics involved in the bonding fluctuations between nanometer-sized silicon dioxide (SiO2) probes and highly oriented pyrolytic graphite (HOPG) and molybdenum disulfide (MoS2) could be quantified directly and locally on the submicron scale via a time-temperature superposition analysis of the lateral forces between scanning force microscopy silicon dioxide probes and inorganic sample surfaces. The so-called "intrinsic friction analysis" (IFA) provided direct access to the Hamaker constants for HOPG and MoS2, as well as the control sample, calcium fluoride (CaF2). The use of a scanning probe enables nanoscopic analysis of bonding fluctuations, thereby overcoming challenges associated with larger scale inhomogeneity and surface roughness common to conventional techniques used to determine surface free energies and dielectric properties. A complementary numerical analysis based on optical and electron energy loss spectroscopy and the Lifshitz quantum electrodynamic theory of van der Waals interactions is provided and confirms quantitatively the IFA results.

  2. A comparison of two micro-beam X-ray emission techniques for actinide elemental distribution in microscopic particles originating from the hydrogen bombs involved in the Palomares (Spain) and Thule (Greenland) accidents

    NASA Astrophysics Data System (ADS)

    Jimenez-Ramos, M. C.; Eriksson, M.; García-López, J.; Ranebo, Y.; García-Tenorio, R.; Betti, M.; Holm, E.

    2010-09-01

    A comparative study has been performed in order to validate and gain confidence in two micro-beam techniques for the characterization of microscopic particles containing actinide elements (mixed plutonium and uranium): particle-induced X-ray emission with a nuclear microprobe (μ-PIXE) and synchrotron-radiation-induced X-ray fluorescence in a confocal alignment (confocal SR μ-XRF). Inter-comparison of the two techniques is essential because the X-ray production cross-sections for U and Pu are different for protons and photons and not well defined in the open literature, especially for Pu. The particles studied consisted of nuclear weapons material and originate either in the so-called Palomares accident in Spain, 1966, or in the Thule accident in Greenland, 1968. In the determination of the average Pu/U mass ratios (not corrected for self-absorption) in the analysed microscopic particles, the results from both techniques show very good agreement. In addition, the suitability of both techniques for the analysis with good resolution (down to a few μm) of the Pu/U distribution within the particles has been demonstrated. The set of results obtained with both techniques has provided important information concerning the characterization of the remaining fissile material in the areas affected by the aircraft accidents. This type of information is essential for long-term impact assessments of contaminated sites.

  3. Los Alamos, Toshiba probing Fukushima with cosmic rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Christopher

    2014-06-16

    Los Alamos National Laboratory has announced an impending partnership with Toshiba Corporation to use a Los Alamos technique called muon tomography to safely peer inside the cores of the Fukushima Daiichi reactors and create high-resolution images of the damaged nuclear material inside without ever breaching the cores themselves. The initiative could reduce the time required to clean up the disabled complex by at least a decade and greatly reduce radiation exposure to personnel working at the plant. Muon radiography (also called cosmic-ray radiography) uses secondary particles generated when cosmic rays collide with upper regions of Earth's atmosphere to create images of the objects that the particles, called muons, penetrate. The process is analogous to an X-ray image, except muons are produced naturally and do not damage the materials they contact. Muon radiography has been used before in imaginative applications such as mapping the interior of the Great Pyramid at Giza, but Los Alamos's muon tomography technique represents a vast improvement over earlier technology.

  4. Los Alamos, Toshiba probing Fukushima with cosmic rays

    ScienceCinema

    Morris, Christopher

    2018-01-16

    Los Alamos National Laboratory has announced an impending partnership with Toshiba Corporation to use a Los Alamos technique called muon tomography to safely peer inside the cores of the Fukushima Daiichi reactors and create high-resolution images of the damaged nuclear material inside without ever breaching the cores themselves. The initiative could reduce the time required to clean up the disabled complex by at least a decade and greatly reduce radiation exposure to personnel working at the plant. Muon radiography (also called cosmic-ray radiography) uses secondary particles generated when cosmic rays collide with upper regions of Earth's atmosphere to create images of the objects that the particles, called muons, penetrate. The process is analogous to an X-ray image, except muons are produced naturally and do not damage the materials they contact. Muon radiography has been used before in imaginative applications such as mapping the interior of the Great Pyramid at Giza, but Los Alamos's muon tomography technique represents a vast improvement over earlier technology.

  5. Training Needs Analysis and Evaluation for New Technologies through the Use of Problem-Based Inquiry

    ERIC Educational Resources Information Center

    Casey, Matthew Scott; Doverspike, Dennis

    2005-01-01

    The analysis of calls to a help desk, in this case calls to a computer help desk, can serve as a rich source of information on the real world problems that individuals are having with the implementation of a new technology. Thus, we propose that an analysis of help desk calls, a form of problem-based inquiry, can serve as a fast and low cost means…

  6. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but the problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.
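
    A minimal sketch of the mechanical core of an STPA-style pass: crossing each control action with the four standard unsafe-control-action categories (paraphrased here as guidewords) to seed candidate scenarios. The control actions listed are hypothetical stand-ins, and a real analysis also needs the functional control structure and the context in which each action becomes hazardous.

```python
# The four standard STPA categories of unsafe control actions (UCAs), paraphrased.
GUIDEWORDS = [
    "not provided when required",
    "provided when unsafe",
    "provided too early or too late",
    "stopped too soon or applied too long",
]

def candidate_ucas(control_actions):
    """Cross each control action with each guideword to seed the hazard analysis."""
    return [(action, gw) for action in control_actions for gw in GUIDEWORDS]

# Hypothetical HTV-like control actions, for illustration only.
for action, gw in candidate_ucas(["fire descent engines", "abort approach"]):
    print(f"UCA candidate: '{action}' {gw}")
```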

  7. Evaluation of the Infinium Methylation 450K technology.

    PubMed

    Dedeurwaerder, Sarah; Defrance, Matthieu; Calonne, Emilie; Denis, Hélène; Sotiriou, Christos; Fuks, François

    2011-12-01

    Studies of DNA methylomes hold enormous promise for biomedicine but are hampered by the technological challenges of analyzing many samples cost-effectively. Recently, a major extension of the previous Infinium HumanMethylation27 BeadChip® (Illumina, Inc. CA, USA), called Infinium HumanMethylation450 (Infinium Methylation 450K; Illumina, Inc. CA, USA) was developed. This upgraded technology is a hybrid of two different chemical assays, the Infinium I and Infinium II assays, allowing (for 12 samples in parallel) assessment of the methylation status of more than 480,000 cytosines distributed over the whole genome. In this article, we evaluate Infinium Methylation 450K on cell lines and tissue samples, highlighting some of its advantages but also some of its limitations. In particular, we compare the methylation values of the Infinium I and Infinium II assays. We used Infinium Methylation 450K to profile: first, the well-characterized HCT116 wild-type and double-knockout cell lines and then, 16 breast tissue samples (including eight normal and eight primary tumor samples). Absolute methylation values (β-values) were extracted with the GenomeStudio™ software and then subjected to detailed analysis. While this technology appeared highly robust as previously shown, we noticed a divergence between the β-values retrieved from the type I and type II Infinium assays. Specifically, the β-values obtained from Infinium II probes were less accurate and reproducible than those obtained from Infinium I probes. This suggests that data from the type I and type II assays should be considered separately in any downstream bioinformatic analysis. To be able to deal with the Infinium I and Infinium II data together, we developed and tested a new correction technique, which we called 'peak-based correction'. The idea was to rescale the Infinium II data on the basis of the Infinium I data. While this technique should be viewed as an approximation method, it significantly improves the quality of Infinium II data. Infinium 450K is a powerful technique in terms of reagent costs, time of labor, sample throughput and coverage. It holds great promise for the better understanding of the epigenetic component in health and disease. Yet, due to the nature of its design comprising two different chemical assays, analysis of the whole set of data is not as easy as initially anticipated. Correction strategies, such as the peak-based approach proposed here, are a step towards adequate output data analysis.
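
    The sketch below conveys the general idea of a peak-based rescaling: estimate the unmethylated and methylated density peaks for each assay type and linearly map the Infinium II values onto the Infinium I peaks. This is a simplified approximation on the beta-value scale; the published correction operates somewhat differently (e.g., on M-values, with separate handling of the two peaks), so treat this purely as an illustration of the concept.

```python
import numpy as np

def peak_based_rescale(beta_II, beta_I):
    """Rescale Infinium II beta-values so their unmethylated/methylated density
    peaks line up with the Infinium I peaks (rough approximation of the idea)."""
    def peaks(b, grid=np.linspace(0, 1, 201)):
        hist, edges = np.histogram(b, bins=grid)
        centers = 0.5 * (edges[:-1] + edges[1:])
        lo = centers[:100][np.argmax(hist[:100])]   # unmethylated peak (< 0.5)
        hi = centers[100:][np.argmax(hist[100:])]   # methylated peak (>= 0.5)
        return lo, hi

    lo1, hi1 = peaks(np.asarray(beta_I, float))
    lo2, hi2 = peaks(np.asarray(beta_II, float))
    # Linear map sending the type II peaks onto the type I peaks.
    rescaled = lo1 + (np.asarray(beta_II, float) - lo2) * (hi1 - lo1) / (hi2 - lo2)
    return np.clip(rescaled, 0, 1)
```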

  8. An empirical analysis of the corporate call decision

    NASA Astrophysics Data System (ADS)

    Carlson, Murray Dean

    1998-12-01

    In this thesis we provide insights into the behavior of financial managers of utility companies by studying their decisions to redeem callable preferred shares. In particular, we investigate whether or not an option pricing based model of the call decision, with managers who maximize shareholder value, does a better job of explaining callable preferred share prices and call decisions than do other models of the decision. In order to perform these tests, we extend an empirical technique introduced by Rust (1987) to include the use of information from preferred share prices in addition to the call decisions. The model we develop to value the option embedded in a callable preferred share differs from standard models in two ways. First, as suggested in Kraus (1983), we explicitly account for transaction costs associated with a redemption. Second, we account for state variables that are observed by the decision makers but not by the preferred shareholders. We interpret these unobservable state variables as the benefits and costs associated with a change in capital structure that can accompany a call decision. When we add this variable, our empirical model changes from one which predicts exactly when a share should be called to one which predicts the probability of a call as the function of the observable state. These two modifications of the standard model result in predictions of calls, and therefore of callable preferred share prices, that are consistent with several previously unexplained features of the data; we show that the predictive power of the model is improved in a statistical sense by adding these features to the model. The pricing and call probability functions from our model do a good job of describing call decisions and preferred share prices for several utilities. Using data from shares of the Pacific Gas and Electric Co. (PGE) we obtain reasonable estimates for the transaction costs associated with a call. Using a formal empirical test, we are able to conclude that the managers of the Pacific Gas and Electric Company clearly take into account the value of the option to delay the call when making their call decisions. Overall, the model seems to be robust to tests of its specification and does a better job of describing the data than do simpler models of the decision making process. Limitations in the data do not allow us to perform the same tests in a larger cross-section of utility companies. However, we are able to estimate transaction cost parameters for many firms and these do not seem to vary significantly from those of PGE. This evidence does not cause us to reject our hypothesis that managerial behavior is consistent with a model in which managers maximize shareholder value.

  9. Transient analysis of intercalation electrodes for parameter estimation

    NASA Astrophysics Data System (ADS)

    Devan, Sheba

    An essential part of integrating batteries as power sources in any application, be it a large scale automotive application or a small scale portable application, is an efficient Battery Management System (BMS). The combination of a battery with the microprocessor based BMS (called "smart battery") helps prolong the life of the battery by operating in the optimal regime and provides accurate information regarding the battery to the end user. The main purposes of BMS are cell protection, monitoring and control, and communication between different components. These purposes are fulfilled by tracking the change in the parameters of the intercalation electrodes in the batteries. Consequently, the functions of the BMS should be prompt, which requires the methodology of extracting the parameters to be efficient in time. The traditional transient techniques applied so far may not be suitable due to reasons such as the inability to apply these techniques when the battery is under operation, long experimental time, etc. The primary aim of this research work is to design a fast, accurate and reliable technique that can be used to extract parameter values of the intercalation electrodes. A methodology based on analysis of the short time response to a sinusoidal input perturbation, in the time domain is demonstrated using a porous electrode model for an intercalation electrode. It is shown that the parameters associated with the interfacial processes occurring in the electrode can be determined rapidly, within a few milliseconds, by measuring the response in the transient region. The short time analysis in the time domain is then extended to a single particle model that involves bulk diffusion in the solid phase in addition to interfacial processes. A systematic procedure for sequential parameter estimation using sensitivity analysis is described. Further, the short time response and the input perturbation are transformed into the frequency domain using Fast Fourier Transform (FFT) to generate impedance spectra to derive immediate qualitative information regarding the nature of the system. The short time analysis technique gives the ability to perform both time domain and frequency domain analysis using data measured within short durations.
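
    As a generic illustration of turning a short-time response to a sinusoidal perturbation into an impedance estimate, the sketch below applies an FFT to a simulated current input and a hypothetical voltage response and reads off the complex impedance at the excitation frequency. The paper's model-based short-time analysis (porous electrode and single particle models, sensitivity-driven parameter estimation) is considerably more involved; the signal here is synthetic.

```python
import numpy as np

fs, f0 = 1000.0, 10.0                     # sampling rate and perturbation frequency (Hz)
t = np.arange(0, 1.0, 1 / fs)
i_in = 1e-3 * np.sin(2 * np.pi * f0 * t)  # sinusoidal current perturbation (A)
# Hypothetical measured short-time voltage response (would come from the cell).
v_out = 5e-3 * np.sin(2 * np.pi * f0 * t + 0.3) + 1e-4 * np.random.randn(t.size)

I, V = np.fft.rfft(i_in), np.fft.rfft(v_out)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
k = np.argmin(np.abs(freqs - f0))         # bin of the excitation frequency
Z = V[k] / I[k]                           # complex impedance at f0
print(abs(Z), np.angle(Z))                # magnitude (ohm) and phase (rad)
```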

  10. Blocking Strategies for Performing Entity Resolution in a Distributed Computing Environment

    ERIC Educational Resources Information Center

    Wang, Pei

    2016-01-01

    Entity resolution (ER) is an O(n²) problem where n is the number of records to be processed. The pair-wise nature of ER makes it impractical to perform on large datasets without the use of a technique called blocking. In blocking the records are separated into groups (called blocks) in such a way the records most likely to match are…
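
    A minimal sketch of blocking for entity resolution: records are grouped by a blocking key, and candidate pairs are generated only within blocks, avoiding the full O(n²) cross product. The blocking key used here (surname initial plus ZIP code) and the records are hypothetical.

```python
from collections import defaultdict
from itertools import combinations

def block(records, key):
    """Group records by a blocking key; comparisons happen only within a block."""
    blocks = defaultdict(list)
    for rec in records:
        blocks[key(rec)].append(rec)
    return blocks

def candidate_pairs(blocks):
    for recs in blocks.values():
        yield from combinations(recs, 2)

people = [
    {"name": "Jon Smith", "zip": "68502"},
    {"name": "John Smith", "zip": "68502"},
    {"name": "Ann Lee", "zip": "10001"},
]
# Hypothetical blocking key: first letter of surname plus ZIP code.
pairs = list(candidate_pairs(block(people, key=lambda r: (r["name"].split()[-1][0], r["zip"]))))
print(len(pairs))  # 1 candidate pair instead of 3 for the full cross product
```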

  11. Measurement of the spatially distributed temperature and soot loadings in a laminar diffusion flame using a Cone-Beam Tomography technique

    NASA Astrophysics Data System (ADS)

    Zhao, Huayong; Williams, Ben; Stone, Richard

    2014-01-01

    A new low-cost optical diagnostic technique, called Cone Beam Tomographic Three Colour Spectrometry (CBT-TCS), has been developed to measure the planar distributions of temperature, soot particle size, and soot volume fraction in a co-flow axi-symmetric laminar diffusion flame. The image of a flame is recorded by a colour camera, and then by using colour interpolation and applying a cone beam tomography algorithm, a colour map can be reconstructed that corresponds to a diametral plane. Look-up tables calculated using Planck's law and different scattering models are then employed to deduce the temperature, approximate average soot particle size and soot volume fraction in each voxel (volumetric pixel). A sensitivity analysis of the look-up tables shows that the results have a high temperature resolution but a relatively low soot particle size resolution. The assumptions underlying the technique are discussed in detail. Sample data from an ethylene laminar diffusion flame are compared with data in the literature for similar flames. The comparison shows very consistent temperature and soot volume fraction profiles. Further analysis indicates that the differences seen in comparison with published results are within the measurement uncertainties. This methodology is ready to be applied to measure 3D data by capturing multiple flame images from different angles for non-axisymmetric flames.
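
    The look-up-table idea can be illustrated with simple two-colour ratio pyrometry: Planck's law gives the expected intensity ratio of two wavelength channels as a function of temperature, and the measured ratio is inverted against that table. The wavelengths and measured ratio below are hypothetical, and the actual CBT-TCS method additionally accounts for soot emissivity/scattering models and the tomographic reconstruction.

```python
import numpy as np

h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    """Spectral radiance of a blackbody at wavelength lam (m) and temperature T (K)."""
    return (2 * h * c**2 / lam**5) / (np.exp(h * c / (lam * kB * T)) - 1.0)

def ratio_temperature(ratio_measured, lam1, lam2, T_grid=np.linspace(1000, 2500, 3001)):
    """Invert the two-colour intensity ratio with a simple look-up table."""
    model = planck(lam1, T_grid) / planck(lam2, T_grid)
    return T_grid[np.argmin(np.abs(model - ratio_measured))]

# Hypothetical red/green channel ratio for a soot-radiating flame region.
print(ratio_temperature(ratio_measured=4.2, lam1=620e-9, lam2=540e-9))  # ~1600 K
```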

  12. Chroma intra prediction based on inter-channel correlation for HEVC.

    PubMed

    Zhang, Xingyu; Gisquet, Christophe; François, Edouard; Zou, Feng; Au, Oscar C

    2014-01-01

    In this paper, we investigate a new inter-channel coding mode, called the LM mode, proposed for the next-generation video coding standard, High Efficiency Video Coding (HEVC). This mode exploits inter-channel correlation using reconstructed luma to predict chroma linearly, with parameters derived from neighboring reconstructed luma and chroma pixels at both encoder and decoder to avoid overhead signaling. We analyze the LM mode and prove that the LM parameters for predicting original chroma and reconstructed chroma are statistically the same. We also analyze the error sensitivity of the LM parameters. We identify some problematic situations for the LM mode and propose three novel LM-like modes, called LMA, LML, and LMO, to address them. To limit the increase in complexity due to the LM-like modes, we propose some fast algorithms with the help of some new cost functions. We further identify some potentially problematic conditions in the parameter estimation (including the regression dilution problem) and introduce a novel model correction technique to detect and correct those conditions. Simulation results suggest that considerable BD-rate reduction can be achieved by the proposed LM-like modes and model correction technique. In addition, the performance gain of the two techniques appears to be essentially additive when combined.
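
    The essence of the LM mode is a linear predictor, pred_chroma = alpha * recon_luma + beta, with the parameters fitted to neighbouring reconstructed pixels so no side information needs to be signalled. The sketch below shows a floating-point least-squares derivation on hypothetical neighbour samples; the actual HEVC derivation uses fixed-point integer arithmetic and downsampled luma for 4:2:0 content.

```python
import numpy as np

def lm_parameters(neigh_luma, neigh_chroma):
    """Least-squares (alpha, beta) from neighbouring reconstructed luma/chroma samples."""
    x, y = np.asarray(neigh_luma, float), np.asarray(neigh_chroma, float)
    var_x = max((x * x).mean() - x.mean() ** 2, 1e-9)     # guard against a flat neighbourhood
    alpha = ((x * y).mean() - x.mean() * y.mean()) / var_x
    beta = y.mean() - alpha * x.mean()
    return alpha, beta

def predict_chroma(recon_luma_block, alpha, beta):
    """Apply the linear model to the co-located reconstructed luma block."""
    return alpha * np.asarray(recon_luma_block, float) + beta

# Hypothetical neighbouring samples (luma already downsampled for 4:2:0).
a, b = lm_parameters([60, 80, 100, 120], [40, 50, 60, 70])
print(a, b)  # expect roughly alpha = 0.5, beta = 10
```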

  13. ENSO and its modulations on annual and multidecadal timescales revealed by Nonlinear Laplacian Spectral Analysis

    NASA Astrophysics Data System (ADS)

    Giannakis, D.; Slawinska, J. M.

    2016-12-01

    The variability of the Indo-Pacific Ocean on interannual to multidecadal timescales is investigated in a millennial control run of CCSM4 and in observations using a recently introduced technique called Nonlinear Laplacian Spectral Analysis (NLSA). Through this technique, drawbacks associated with ad hoc pre-filtering of the input data are avoided, enabling recovery of low-frequency and intermittent modes not accessible previously via classical approaches. Here, a multiscale hierarchy of modes is identified for Indo-Pacific SST and numerous linkages between these patterns are revealed. On interannual timescales, a mode with spatiotemporal pattern corresponding to the fundamental component of ENSO emerges, along with modulations of the annual cycle by ENSO in agreement with ENSO combination mode theory. In spatiotemporal reconstructions, these patterns capture the seasonal southward migration of SST and zonal wind anomalies associated with termination of El Niño and La Niña events. Notably, this family of modes explains a significant portion of SST variance in Eastern Indian Ocean regions employed in the definition of Indian Ocean dipole (IOD) indices, suggesting that it should be useful for understanding the linkage of these indices with ENSO and the interaction of the Indian and Pacific Oceans. In model data, we find that the ENSO and ENSO combination modes are modulated on multidecadal timescales by a mode predominantly active in the western tropical Pacific - we call this mode West Pacific Multidecadal Oscillation (WPMO). Despite the relatively low variance explained by this mode, its dynamical role appears to be significant as it has clear sign-dependent modulating relationships with the interannual modes carrying most of the variance. In particular, cold WPMO events are associated with anomalous Central Pacific westerlies favoring stronger ENSO events, while warm WPMO events suppress ENSO activity. Moreover, the WPMO has significant climatic impacts as demonstrated here through its strong correlation with decadal precipitation over Australia. As an extension of this work, we discuss the deterministic and stochastic aspects of the variability of these modes and their potential predictability based on nonparametric kernel analog forecasting techniques.

  14. Detailed temporal structure of communication networks in groups of songbirds.

    PubMed

    Stowell, Dan; Gill, Lisa; Clayton, David

    2016-06-01

    Animals in groups often exchange calls, in patterns whose temporal structure may be influenced by contextual factors such as physical location and the social network structure of the group. We introduce a model-based analysis for temporal patterns of animal call timing, originally developed for networks of firing neurons. This has advantages over cross-correlation analysis in that it can correctly handle common-cause confounds and provides a generative model of call patterns with explicit parameters for the influences between individuals. It also has advantages over standard Markovian analysis in that it incorporates detailed temporal interactions which affect timing as well as sequencing of calls. Further, a fitted model can be used to generate novel synthetic call sequences. We apply the method to calls recorded from groups of domesticated zebra finch (Taeniopygia guttata) individuals. We find that the communication network in these groups has stable structure that persists from one day to the next, and that 'kernels' reflecting the temporal range of influence have a characteristic structure for a calling individual's effect on itself, its partner and on others in the group.

  15. Probabilistic topic modeling for the analysis and classification of genomic sequences

    PubMed Central

    2015-01-01

    Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well-defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequences clustering and classification is proposed. The method is based on k-mers representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra short sequences and it exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
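
    A minimal sketch of the k-mer/topic-model idea using scikit-learn: sequences are turned into k-mer count vectors and LDA infers per-sequence topic mixtures, which could then feed a downstream classifier. The toy sequences and parameters are hypothetical and far smaller than the 16S barcode data used in the paper.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy sequences standing in for 16S barcode reads (hypothetical).
seqs = ["ACGTACGTGGCA", "ACGTACGTGGTT", "TTGGCCAATTGG", "TTGGCCAATTCC"]
labels = ["taxonA", "taxonA", "taxonB", "taxonB"]   # would train the downstream classifier

k = 4  # k-mer length
vec = CountVectorizer(analyzer="char", ngram_range=(k, k), lowercase=False)
X = vec.fit_transform(seqs)                      # k-mer count matrix (sequences x k-mers)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_mix = lda.fit_transform(X)                 # per-sequence topic proportions
print(topic_mix.round(2))                        # topic mixtures as features for classification
```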

  16. A Statistical Methodology for Detecting and Monitoring Change in Forest Ecosystems Using Remotely Sensed Imagery

    NASA Astrophysics Data System (ADS)

    Mills, R. T.; Kumar, J.; Hoffman, F. M.; Hargrove, W. W.; Spruce, J.

    2011-12-01

    Variations in vegetation phenology, the annual temporal pattern of leaf growth and senescence, can be a strong indicator of ecological change or disturbance. However, phenology is also strongly influenced by seasonal, interannual, and long-term trends in climate, making identification of changes in forest ecosystems a challenge. Forest ecosystems are vulnerable to extreme weather events, insect and disease attacks, wildfire, harvesting, and other land use change. Normalized difference vegetation index (NDVI), a remotely sensed measure of greenness, provides a proxy for phenology. NDVI for the conterminous United States (CONUS) derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) at 250 m resolution was used in this study to develop phenological signatures of ecological regimes called phenoregions. By applying a quantitative data mining technique to the NDVI measurements taken every eight days over the entire MODIS record, annual maps of phenoregions were developed. This geospatiotemporal cluster analysis technique employs high performance computing resources, enabling analysis of such very large data sets. This technique produces a prescribed number of prototypical phenological states to which every location belongs in any year. Analysis of the shifts among phenological states yields information about responses to interannual climate variability and, more importantly, changes in ecosystem health due to disturbances. Moreover, a large change in the phenological states occupied by a single location over time indicates a significant disturbance or ecological shift. This methodology has been applied for identification of various forest disturbance events, including wildfire, tree mortality due to Mountain Pine Beetle, and other insect infestation and diseases, as well as extreme events like storms and hurricanes in the U.S. Results from the analysis of phenological state dynamics will be presented, along with disturbance and validation data.
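
    As a simplified stand-in for the geospatiotemporal clustering, the sketch below clusters per-cell annual NDVI trajectories into a prescribed number of prototypical phenological states with ordinary k-means. The study uses a scalable high-performance clustering over the full MODIS record; the synthetic NDVI data here are purely illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical NDVI time series: one row per map cell, one column per 8-day composite.
rng = np.random.default_rng(0)
n_cells, n_steps = 1000, 46
t = np.linspace(0, 2 * np.pi, n_steps)
ndvi = 0.5 + 0.3 * np.sin(t)[None, :] + 0.05 * rng.standard_normal((n_cells, n_steps))

k = 5  # prescribed number of prototypical phenological states
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(ndvi)
phenoregion_of_cell = km.labels_          # annual phenoregion membership per cell
prototypes = km.cluster_centers_          # prototype annual NDVI trajectories
print(np.bincount(phenoregion_of_cell))   # cells per phenoregion
```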

  17. Multiset canonical correlations analysis and multispectral, truly multitemporal remote sensing data.

    PubMed

    Nielsen, Allan Aasbjerg

    2002-01-01

    This paper describes two- and multiset canonical correlations analysis (CCA) for data fusion, multisource, multiset, or multitemporal exploratory data analysis. These techniques transform multivariate multiset data into new orthogonal variables called canonical variates (CVs) which, when applied in remote sensing, exhibit ever-decreasing similarity (as expressed by correlation measures) over sets consisting of 1) spectral variables at fixed points in time (R-mode analysis), or 2) temporal variables with fixed wavelengths (T-mode analysis). The CVs are invariant to linear and affine transformations of the original variables within sets which means, for example, that the R-mode CVs are insensitive to changes over time in offset and gain in a measuring device. In a case study, CVs are calculated from Landsat Thematic Mapper (TM) data with six spectral bands over six consecutive years. Both R- and T-mode CVs clearly exhibit the desired characteristic: they show maximum similarity for the low-order canonical variates and minimum similarity for the high-order canonical variates. These characteristics are seen both visually and in objective measures. The results from the multiset CCA R- and T-mode analyses are very different. This difference is ascribed to the noise structure in the data. The CCA methods are related to partial least squares (PLS) methods. This paper very briefly describes multiset CCA-based multiset PLS. Also, the CCA methods can be applied as multivariate extensions to empirical orthogonal functions (EOF) techniques. Multiset CCA is well-suited for inclusion in geographical information systems (GIS).
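
    A two-set illustration of the idea with scikit-learn: canonical variates are extracted from two synthetic six-band "dates" that share a common signal, and their canonical correlations decrease with component order. The multiset extension used in the paper goes beyond this pairwise case, and the data below are simulated.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
n = 500                                   # pixels
latent = rng.standard_normal((n, 1))      # shared signal between the two dates
X_t1 = np.hstack([latent + 0.5 * rng.standard_normal((n, 1)) for _ in range(6)])  # 6 bands, date 1
X_t2 = np.hstack([latent + 0.5 * rng.standard_normal((n, 1)) for _ in range(6)])  # 6 bands, date 2

cca = CCA(n_components=3).fit(X_t1, X_t2)
U, V = cca.transform(X_t1, X_t2)          # canonical variates for each set
for i in range(3):
    print(np.corrcoef(U[:, i], V[:, i])[0, 1])  # decreasing canonical correlations
```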

  18. The mediating effect of calling on the relationship between medical school students’ academic burnout and empathy

    PubMed Central

    2017-01-01

    Purpose This study is aimed at identifying the relationships between medical school students’ academic burnout, empathy, and calling, and determining whether their calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed-methods study was conducted. One hundred twenty-seven medical students completed a survey. Scales measuring academic burnout, medical students’ empathy, and calling were utilized. For the statistical analysis, correlation analysis, descriptive statistics analysis, and hierarchical multiple regression analyses were conducted. For the qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant, negative correlation with academic burnout, while having a significant, positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students’ academic burnout and empathy. As such, this study provides baseline data for an education that could improve medical students’ empathy skills. PMID:28870019

  19. A comparative analysis of the statistical properties of large mobile phone calling networks.

    PubMed

    Li, Ming-Xia; Jiang, Zhi-Qiang; Xie, Wen-Jie; Miccichè, Salvatore; Tumminello, Michele; Zhou, Wei-Xing; Mantegna, Rosario N

    2014-05-30

    Mobile phone calling is one of the most widely used communication methods in modern society. The records of calls among mobile phone users provide a valuable proxy for understanding human communication patterns embedded in social networks. Mobile phone users call each other, forming a directed calling network. If only reciprocal calls are considered, we obtain an undirected mutual calling network. The preferential communication behavior between two connected users can be statistically tested, and it results in two Bonferroni networks with statistically validated edges. We perform a comparative analysis of the statistical properties of these four networks, which are constructed from the calling records of more than nine million individuals in Shanghai over a period of 110 days. We find that these networks share many common structural properties and also exhibit idiosyncratic features when compared with previously studied large mobile calling networks. The empirical findings provide an intriguing picture of a representative large social network that might shed new light on the modelling of large social networks.
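
    A minimal sketch of constructing the directed calling network and its undirected mutual (reciprocal-call) counterpart with networkx, using a handful of hypothetical call records; the statistically validated Bonferroni networks described in the paper require an additional validation step not shown here.

```python
import networkx as nx

# Hypothetical call detail records: (caller, callee, number_of_calls).
cdr = [("a", "b", 5), ("b", "a", 3), ("a", "c", 2), ("d", "a", 1)]

directed = nx.DiGraph()
for u, v, w in cdr:
    directed.add_edge(u, v, weight=w)

# Mutual calling network: keep only reciprocated pairs as undirected edges.
mutual = nx.Graph()
for u, v in directed.edges():
    if directed.has_edge(v, u):
        mutual.add_edge(u, v)

print(directed.number_of_edges(), mutual.number_of_edges())  # 4 directed, 1 mutual
```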

  20. Tools for the functional interpretation of metabolomic experiments.

    PubMed

    Chagoyen, Monica; Pazos, Florencio

    2013-11-01

    The so-called 'omics' approaches used in modern biology aim at massively characterizing the molecular repertoires of living systems at different levels. Metabolomics is one of the latest additions to the 'omics' family, and it deals with the characterization of the set of metabolites in a given biological system. As metabolomic techniques become more massive and allow characterizing larger sets of metabolites, automatic methods for analyzing these sets in order to obtain meaningful biological information are required. Only recently have the first tools specifically designed for this task in metabolomics appeared. They are based on approaches previously used in transcriptomics and other 'omics', such as annotation enrichment analysis. These, together with generic tools for metabolic analysis and visualization not specifically designed for metabolomics, will surely be in the toolbox of researchers doing metabolomic experiments in the near future.
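
    Annotation enrichment analysis of a metabolite set typically reduces to a hypergeometric test per pathway, as sketched below with scipy. The counts are hypothetical, and real tools add multiple-testing correction across all pathways tested.

```python
from scipy.stats import hypergeom

# Hypothetical numbers for one pathway annotation.
N = 3000   # metabolites in the background set
K = 40     # background metabolites annotated to the pathway
n = 60     # metabolites detected as changed in the experiment
k = 6      # changed metabolites that fall in the pathway

# P(X >= k): probability of seeing at least k pathway members by chance.
p_value = hypergeom.sf(k - 1, N, K, n)
print(p_value)
```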

  1. Membership-degree preserving discriminant analysis with applications to face recognition.

    PubMed

    Yang, Zhangjing; Liu, Chuancai; Huang, Pu; Qian, Jianjun

    2013-01-01

    In pattern recognition, feature extraction techniques have been widely employed to reduce the dimensionality of high-dimensional data. In this paper, we propose a novel feature extraction algorithm called membership-degree preserving discriminant analysis (MPDA) based on the Fisher criterion and fuzzy set theory for face recognition. In the proposed algorithm, the membership degree of each sample to particular classes is first calculated by the fuzzy k-nearest neighbor (FKNN) algorithm to characterize the similarity between each sample and class centers, and then the membership degree is incorporated into the definition of the between-class scatter and the within-class scatter. The feature extraction criterion of maximizing the ratio of the between-class scatter to the within-class scatter is then applied. Experimental results on the ORL, Yale, and FERET face databases demonstrate the effectiveness of the proposed algorithm.
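
    A sketch of the first step: computing membership degrees with a fuzzy k-nearest-neighbour rule (the common Keller-style initialization, in which a sample's own class receives a 0.51 floor). These degrees would then weight the between- and within-class scatter matrices. The exact weighting scheme used by MPDA may differ, and the data here are synthetic.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def fuzzy_memberships(X, y, k=5):
    """Membership degree of each sample to each class from its k nearest neighbours."""
    classes = np.unique(y)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)              # first neighbour is the sample itself
    U = np.zeros((X.shape[0], classes.size))
    for i, neigh in enumerate(idx[:, 1:]):
        for j, c in enumerate(classes):
            frac = np.mean(y[neigh] == c)  # fraction of neighbours in class c
            U[i, j] = 0.51 + 0.49 * frac if y[i] == c else 0.49 * frac
    return U  # would weight the between/within-class scatter in MPDA

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(3, 1, (20, 4))])
y = np.array([0] * 20 + [1] * 20)
print(fuzzy_memberships(X, y).round(2)[:3])
```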

  2. Gold rush - A swarm dynamics in games

    NASA Astrophysics Data System (ADS)

    Zelinka, Ivan; Bukacek, Michal

    2017-07-01

    This paper is focused on swarm intelligence techniques and their practical use in computer games. The aim is to show how swarm dynamics can be generated by a multiplayer game and then recorded, analyzed, and eventually controlled. We also discuss the possibility of using swarm intelligence in place of game players. Based on our previous experiments, two games using swarm algorithms are briefly mentioned here: the strategy game StarCraft: Brood War, and TicTacToe, in which the SOMA algorithm has also taken the role of a player against a human opponent. The open research reported here has shown the potential benefit of swarm computation in the field of strategy games and of player strategies based on recording and analyzing swarm behavior. We propose a new game, called Gold Rush, as an experimental environment for human or artificial swarm behavior and its subsequent analysis.

  3. Decorin content and near infrared spectroscopy analysis of dried collagenous biomaterial samples.

    PubMed

    Aldema-Ramos, Mila L; Castell, Joan Carles; Muir, Zerlina E; Adzet, Jose Maria; Sabe, Rosa; Schreyer, Suzanne

    2012-12-14

    The efficient removal of proteoglycans, such as decorin, from the hide when processing it to leather by traditional means is generally acceptable and beneficial for leather quality, especially for softness and flexibility. A patented waterless or acetone dehydration method that can generate a product similar to leather called Dried Collagenous Biomaterial (known as BCD) was developed but has no effect on decorin removal efficiency. The Alcian Blue colorimetric technique was used to assay the sulfated glycosaminoglycan (sGAG) portion of decorin. The corresponding residual decorin content was correlated to the mechanical properties of the BCD samples and was comparable to the control leather made traditionally. The waterless dehydration and instantaneous chrome tanning process is a good eco-friendly alternative to transforming hides to leather because no additional effects were observed after examination using NIR spectroscopy and additional chemometric analysis.

  4. Statistical physics in foreign exchange currency and stock markets

    NASA Astrophysics Data System (ADS)

    Ausloos, M.

    2000-09-01

    Problems in economy and finance have attracted the interest of statistical physicists all over the world. Fundamental problems pertain to the existence or not of long-, medium-, and/or short-range power-law correlations in various economic systems, to the presence of financial cycles, and to economic considerations, including economic policy. A method like detrended fluctuation analysis is recalled, emphasizing its value in sorting out correlation ranges, thereby leading to predictability at short horizons. The (m, k)-Zipf method is presented for sorting out short-range correlations in the sign and amplitude of the fluctuations. A well-known financial analysis technique, the so-called moving average, is shown to raise questions for physicists about fractional Brownian motion properties. Among spectacular results, the possibility of crash predictions has been demonstrated through the log-periodicity of financial index oscillations.
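
    A compact sketch of detrended fluctuation analysis: integrate the mean-removed series, detrend it in windows of increasing size, and read the scaling exponent off the log-log slope of the fluctuation function; white noise should give an exponent near 0.5. This is a generic textbook implementation, not tied to any particular financial series.

```python
import numpy as np

def dfa(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: returns the scaling exponent alpha."""
    x = np.asarray(x, float)
    profile = np.cumsum(x - x.mean())               # integrated, mean-removed series
    F = []
    for s in scales:
        n_seg = profile.size // s
        segs = profile[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)             # local linear trend
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))              # fluctuation function F(s)
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return alpha

print(dfa(np.random.randn(4096)))  # white noise gives alpha close to 0.5
```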

  5. High-throughput tetrad analysis.

    PubMed

    Ludlow, Catherine L; Scott, Adrian C; Cromie, Gareth A; Jeffery, Eric W; Sirr, Amy; May, Patrick; Lin, Jake; Gilbert, Teresa L; Hays, Michelle; Dudley, Aimée M

    2013-07-01

    Tetrad analysis has been a gold-standard genetic technique for several decades. Unfortunately, the need to manually isolate, disrupt and space tetrads has relegated its application to small-scale studies and limited its integration with high-throughput DNA sequencing technologies. We have developed a rapid, high-throughput method, called barcode-enabled sequencing of tetrads (BEST), that uses (i) a meiosis-specific GFP fusion protein to isolate tetrads by FACS and (ii) molecular barcodes that are read during genotyping to identify spores derived from the same tetrad. Maintaining tetrad information allows accurate inference of missing genetic markers and full genotypes of missing (and presumably nonviable) individuals. An individual researcher was able to isolate over 3,000 yeast tetrads in 3 h, an output equivalent to that of almost 1 month of manual dissection. BEST is transferable to other microorganisms for which meiotic mapping is significantly more laborious.

  6. Virtual prototyping of drop test using explicit analysis

    NASA Astrophysics Data System (ADS)

    Todorov, Georgi; Kamberov, Konstantin

    2017-12-01

    Increased requirements for reliability and safety, included in contemporary standards and norms, have a high impact on new product development. New numerical techniques based on virtual prototyping technology facilitate improving the product development cycle, resulting in reduced time and money spent on this stage as well as increased knowledge about particular failure mechanisms. The so-called "drop test" has become nearly a mandatory step in the development of any human-operated product. This study aims to demonstrate dynamic behaviour assessment of a structure under impact loads, based on virtual prototyping using a typical nonlinear analysis - explicit dynamics. An example is presented, based on a plastic container that is used as a cartridge for a dispenser machine exposed to various working conditions. Different drop orientations were analyzed, and critical load cases and design weaknesses were found. Several design modifications have been proposed, based on a detailed review of the analysis results.

  7. Hybrid Feature Extraction-based Approach for Facial Parts Representation and Recognition

    NASA Astrophysics Data System (ADS)

    Rouabhia, C.; Tebbikh, H.

    2008-06-01

    Face recognition is a specialized image processing task which has attracted considerable attention in computer vision. In this article, we develop a new facial recognition system from video sequence images, dedicated to the identification of persons whose faces are partly occluded. This system is based on a hybrid image feature extraction technique called ACPDL2D (Rouabhia et al. 2007), which combines two-dimensional principal component analysis and two-dimensional linear discriminant analysis with a neural network. We performed the feature extraction task on the eyes and the nose images separately, and then a Multi-Layer Perceptron classifier is used. Compared to the whole face, the simulation results are in favor of the facial parts in terms of memory capacity and recognition rate (99.41% for the eyes part, 98.16% for the nose part, and 97.25% for the whole face).

  8. Analysis of the plasma-wall interaction in the Heliotron E device

    NASA Astrophysics Data System (ADS)

    Motojima, O.; Mizuuchi, T.; Besshou, S.; Iiyoshi, A.; Uo, K.; Yamashina, T.; Mohri, M.; Satake, T.; Hashiba, M.; Amemiya, S.; Miwa, H.

    1984-12-01

    The plasma-wall interaction (PWI) of currentless plasmas with temperatures Teo, Tio ≤ 1.1 keV, density N̄e = (2-10)×10^13 cm^-3, and volume-averaged beta value β̄ ≤ 2% was investigated. We have observed that PWI took place mainly where the divertor field lines intersected the chamber wall (called divertor traces). Boundary plasmas were measured with electrostatic probes, which showed the presence of the divertor region with parameters in the range Ned = 10^10-10^11 cm^-3 and Ted = 10-50 eV. Surface analysis techniques (ESCA, AES, and RBS) were applied to analyze the surface probes (Si, graphite, and stainless steel) and the test pieces (SiC, TiC, and stainless steel), which were irradiated by plasmas for short and long times, respectively.

  9. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  10. Targeting New Teachers & Teaching by Novel Techniques.

    ERIC Educational Resources Information Center

    Hopkins, Patricia; And Others

    In 1988-89, the Science Academy, a magnet program at LBJ High School (Austin, Texas), was awarded a two-year grant called Double TNT to "target new teachers" and "teach by novel techniques." The purposes of the program include: (1) interesting minority and female students in science; (2) attracting minority and female students…

  11. Accuracy of vertical radial plume mapping technique in measuring lagoon gas emission

    USDA-ARS?s Scientific Manuscript database

    Recently, the U.S. Environmental Protection Agency (USEPA) posted a ground-based optical remote sensing method on its website called OTM 10 for measuring fugitive gas emission flux from area sources such as closed landfills. The OTM 10 utilizes the vertical radial plume mapping (VRPM) technique to c...

  12. The use of soil electrical conductivity to investigate soil homogeneity in Story County, Iowa, USA

    USDA-ARS?s Scientific Manuscript database

    Precision agriculture, environmental applications, and land use planning needs have led to calls for more detailed soil maps. A remote sensing technique that can differentiate soils with a high degree of accuracy would be ideal for soil survey purposes. One technique that has shown promise in Iowa i...

  13. For Mole Problems, Call Avogadro: 602-1023.

    ERIC Educational Resources Information Center

    Uthe, R. E.

    2002-01-01

    Describes techniques to help introductory students become familiar with Avogadro's number and mole calculations. Techniques involve estimating numbers of common objects then calculating the length of time needed to count large numbers of them. For example, the immense amount of time required to count a mole of sand grains at one grain per second…

  14. A New Forensic Picture Polygraph Technique for Terrorist and Crime Deception System

    ERIC Educational Resources Information Center

    Costello, R. H. Brian; Axton, JoAnn; Gold, Karen L.

    2006-01-01

    The Forensic Terrorist Detection System called Pinocchio Assessment Profile (PAP) employs standard issue polygraphs for a non-verbal picture technique originated as a biofeedback careers interest instrument. The system can be integrated readily into airport screening protocols. However, the method does not rely on questioning or foreign language…

  15. A Study of the Influence of Advertising Techniques on Selection of Instructional Reading Materials by Prospective Teachers.

    ERIC Educational Resources Information Center

    Greenlaw, M. Jean; And Others

    This study examined the effect of three different modes of presentation on elementary education majors' selection and rating of materials for reading instruction. Materials were chosen to represent each of the following propaganda techniques: glittering generalities, name calling, transfer, testimonial, bandwagon, and card stacking. Students in…

  16. Asphalt Pavements Session 2E-3 : Warm Mix Asphalt : Laboratory Evaluation and Pavement Design

    DOT National Transportation Integrated Search

    2003-12-01

    This study evaluates one of the recycling techniques used to rehabilitate pavement, called Cold In-Place Recycling (CIR). CIR is one of the fastest growing road rehabilitation techniques because it is quick and cost-effective. The document reports on...

  17. Development and evaluation of the photoload sampling technique

    Treesearch

    Robert E. Keane; Laura J. Dickinson

    2007-01-01

    Wildland fire managers need better estimates of fuel loading so they can accurately predict potential fire behavior and effects of alternative fuel and ecosystem restoration treatments. This report presents the development and evaluation of a new fuel sampling method, called the photoload sampling technique, to quickly and accurately estimate loadings for six common...

  18. Educator and participant perceptions and cost analysis of stage-tailored educational telephone calls.

    PubMed

    Esters, Onikia N; Boeckner, Linda S; Hubert, Melanie; Horacek, Tanya; Kritsch, Karen R; Oakland, Mary J; Lohse, Barbara; Greene, Geoffrey; Nitzke, Susan

    2008-01-01

    The objective was to identify strengths and weaknesses of nutrition education delivered via telephone calls as part of a larger stage-of-change tailored intervention with mailed materials. Evaluative feedback was elicited from the educators who placed the calls and the respondents who received them, through an internet and telephone survey spanning 10 states in the midwestern United States: 21 educators were reached via the internet and 50 young adults via telephone. Rankings of intervention components, ratings of key aspects of the educational calls, and cost data (as provided by a lead researcher in each state) were summarized with descriptive statistics. Educational calls required 6 to 17 minutes of preparation time and 8 to 15 minutes of contact time, with a mean estimated cost of $5.82 per call. Low-income young adults favored print materials over educational calls; however, the calls were reported to have positive effects on motivating participants to set goals. Educators who use educational telephone calls to reach young adults, a highly mobile target audience, may require a robust and flexible contact plan.
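    As a hedged back-of-envelope illustration of how a per-call figure like the reported $5.82 mean can be decomposed, the sketch below combines the preparation and contact times reported in the abstract with a wage rate and materials overhead that are hypothetical, not taken from the study.

```python
# Back-of-envelope per-call cost in the spirit of the cost analysis above.
# Preparation and contact times are the ranges reported in the abstract;
# the wage rate and overhead are hypothetical assumptions.
def call_cost(prep_min, contact_min, wage_per_hour=15.0, overhead=1.00):
    staff_time_hours = (prep_min + contact_min) / 60.0
    return staff_time_hours * wage_per_hour + overhead

low = call_cost(prep_min=6, contact_min=8)     # shortest reported times
high = call_cost(prep_min=17, contact_min=15)  # longest reported times
print(f"Per-call cost range: ${low:.2f} - ${high:.2f}")  # the reported $5.82 mean falls within this range
```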

  19. Improvements in analysis techniques for segmented mirror arrays

    NASA Astrophysics Data System (ADS)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology for analyzing mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, which allows surface deformation to be analyzed at the individual segment level, offers insight into the mechanical behavior of the segments that is unavailable from analysis at the parent array level alone. In addition, the capability to characterize the full displacement-vector deformation of collections of points allows predicted mechanical disturbances of assembly interfaces to be analyzed relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking analysis has the advantage of being comparable to the measurements used in assembly of the hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
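    As a hedged sketch of the array-versus-segment differentiation described above (not SigFit's implementation), one simple way to isolate segment-level surface deformation is to fit and remove a best-fit plane (piston/tip/tilt) from each segment's surface-normal displacements and report the residual RMS. The geometry and displacement fields below are synthetic.

```python
# Minimal sketch (not SigFit): separate segment-level surface deformation from
# array-level rigid-body motion by removing a best-fit plane per segment and
# reporting the residual surface RMS. Node coordinates and displacements are synthetic.
import numpy as np

def segment_residual_rms(x, y, dz):
    """Remove the best-fit plane dz ≈ a + b*x + c*y and return the residual RMS."""
    A = np.column_stack([np.ones_like(x), x, y])
    coeffs, *_ = np.linalg.lstsq(A, dz, rcond=None)
    residual = dz - A @ coeffs
    return np.sqrt(np.mean(residual ** 2))

rng = np.random.default_rng(1)
for seg in range(3):  # three hypothetical segments
    x = rng.uniform(-0.5, 0.5, 200)                                   # node coords, m
    y = rng.uniform(-0.5, 0.5, 200)
    tilt = 1e-6 * (seg + 1) * x                                       # array-level rigid tilt
    bump = 5e-8 * np.sin(6 * np.pi * x) * np.cos(6 * np.pi * y)       # segment-level deformation
    rms = segment_residual_rms(x, y, tilt + bump)
    print(f"segment {seg}: residual surface RMS = {rms * 1e9:.1f} nm")
```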

  20. A burnout prediction model based around char morphology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tao Wu; Edward Lester; Michael Cloke

    Several combustion models have been developed that can make predictions about coal burnout and burnout potential. Most of these kinetic models require standard parameters such as volatile content and particle size to make a burnout prediction. This article presents a new model called the char burnout (ChB) model, which also uses detailed information about char morphology in its prediction. The input data to the model are based on information derived from two different image analysis techniques: one generates characterization data from real char samples, and the other predicts char types based on characterization data from image analysis of coal particles. The pyrolyzed chars in this study were created in a drop tube furnace operating at 1300°C, 200 ms, and 1% oxygen. Modeling results were compared with a different carbon burnout kinetic model as well as the actual burnout data from refiring the same chars in a drop tube furnace operating at 1300°C, 5% oxygen, and residence times of 200, 400, and 600 ms. The good agreement between the ChB model and the experimental data indicates that including char morphology in combustion models could well improve model predictions. 38 refs., 5 figs., 6 tabs.
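    The abstract does not give the ChB model's equations, so the sketch below only illustrates the general idea of letting image-analysis-derived char type fractions drive a simple first-order burnout estimate. The char classes, fractions, and rate constants are hypothetical and are not taken from the article.

```python
# Illustrative sketch only (not the ChB model): weight a first-order burnout
# estimate by morphology-derived char type fractions. All fractions and rate
# constants below are hypothetical assumptions.
import math

# Hypothetical char type fractions from image analysis (thin-walled chars burn fastest)
char_fractions = {"tenuisphere": 0.45, "crassisphere": 0.35, "dense/solid": 0.20}
# Hypothetical first-order rate constants at furnace conditions, 1/s
rate_constants = {"tenuisphere": 9.0, "crassisphere": 5.0, "dense/solid": 2.0}

def predicted_burnout(residence_time_s):
    """Mass-weighted burnout assuming each char type burns with first-order kinetics."""
    return sum(frac * (1.0 - math.exp(-rate_constants[ct] * residence_time_s))
               for ct, frac in char_fractions.items())

for t_ms in (200, 400, 600):   # residence times used in the refiring experiments
    print(f"t = {t_ms} ms: predicted burnout fraction = {predicted_burnout(t_ms / 1000):.2f}")
```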
