Sample records for processing technique called

  1. Encoding techniques for complex information structures in connectionist systems

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Two general information encoding techniques called relative position encoding and pattern similarity association are presented. They are claimed to be a convenient basis for the connectionist implementation of complex, short term information processing of the sort needed in common sense reasoning, semantic/pragmatic interpretation of natural language utterances, and other types of high level cognitive processing. The relationships of the techniques to other connectionist information-structuring methods, and also to methods used in computers, are discussed in detail. The rich inter-relationships of these other connectionist and computer methods are also clarified. The particular, simple forms that the relative position encoding and pattern similarity association techniques take in the authors' own connectionist system, called Conposit, are discussed in order to clarify some issues and to provide evidence that the techniques are indeed useful in practice.

  2. What Does CALL Have to Offer Computer Science and What Does Computer Science Have to Offer CALL?

    ERIC Educational Resources Information Center

    Cushion, Steve

    2006-01-01

    We will argue that CALL can usefully be viewed as a subset of computer software engineering and can profit from adopting some of the recent progress in software development theory. The unified modelling language has become the industry standard modelling technique and the accompanying unified process is rapidly gaining acceptance. The manner in…

  3. Essentials of Suggestopedia: A Primer for Practitioners.

    ERIC Educational Resources Information Center

    Caskey, Owen L.; Flake, Muriel H.

    Suggestology is the scientific study of the psychology of suggestion, and Suggestopedia is the application of relaxation and suggestion techniques to learning. The approach to learning processes developed by Dr. Georgi Lozanov (called Suggestopedic, or the Lozanov Method) utilizes mental and physical relaxation, deep breathing,…

  4. Toward energy harvesting using active materials and conversion improvement by nonlinear processing.

    PubMed

    Guyomar, Daniel; Badel, Adrien; Lefeuvre, Elie; Richard, Claude

    2005-04-01

    This paper presents a new technique of electrical energy generation using mechanically excited piezoelectric materials and a nonlinear process. This technique, called synchronized switch harvesting (SSH), is derived from synchronized switch damping (SSD), a nonlinear technique previously developed to address the problem of vibration damping on mechanical structures. The technique results in a significant increase in the electromechanical conversion capability of piezoelectric materials. Compared with the standard technique, the harvested electrical power may be increased by more than 900%. The performance of the nonlinear processing is demonstrated on structures excited at their resonance frequency as well as out of resonance.
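    A minimal numerical sketch, not taken from the paper, of the voltage-inversion behavior that underlies synchronized-switch techniques such as SSD and SSH: a simplified lumped model in which the open-circuit piezoelectric voltage tracks the displacement and is inverted, with losses, each time the switch fires at a displacement extremum. The displacement waveform, coupling constant, and inversion factor are illustrative assumptions.

    ```python
    import numpy as np

    # Toy lumped model: between switching events the piezo voltage follows the
    # displacement (open circuit, dV = (alpha/C0) * du); at each displacement
    # extremum the switch briefly fires and inverts the voltage with some loss.
    alpha_over_C0 = 1.0   # assumed coupling-to-capacitance ratio (illustrative)
    gamma = 0.8           # assumed inversion efficiency, 0 < gamma < 1 (illustrative)

    t = np.linspace(0.0, 5.0, 5000)
    u = np.sin(2 * np.pi * t)              # assumed sinusoidal displacement at resonance
    du = np.diff(u, prepend=u[0])

    v = np.zeros_like(u)
    for k in range(1, len(t)):
        v[k] = v[k - 1] + alpha_over_C0 * du[k]
        if du[k] * du[k - 1] < 0:          # slope sign change: displacement extremum
            v[k] = -gamma * v[k]           # synchronized switch: voltage inversion

    print("open-circuit voltage amplitude:", np.abs(alpha_over_C0 * u).max())
    print("switched voltage amplitude    :", np.abs(v).max())
    ```

    In this toy model the switched voltage amplitude settles several times above the open-circuit amplitude, which is the kind of conversion improvement the abstract reports in a far more rigorous setting.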

  5. Electronic Materials and Processing: Proceedings of the First Electronic Materials and Processing Congress Held in Conjunction with the 1988 World Materials Congress, Chicago, Illinois, USA, 24-30 September 1988

    DTIC Science & Technology

    1988-01-01

    ...usually be traced to a combination of new concepts, materials, and device principles, for example, growth techniques. New materials called heterostructures can be made of combinations of metals, semiconductors, and insulators. Quantum... semiconductors one on top of the other, then the process is called... combinations of compound semiconductors such as GaAs have an intrinsically... have direct energy band gaps that facilitate the efficient recombination of...

  6. Multidisciplinary Responses to the Sexual Victimization of Children: Use of Control Phone Calls.

    PubMed

    Canavan, J William; Borowski, Christine; Essex, Stacy; Perkowski, Stefan

    2017-10-01

    This descriptive study addresses the question of the value of one-party consent phone calls regarding the sexual victimization of children. The authors reviewed 4 years of experience with children between the ages of 3 and 18 years selected for control phone calls after a forensic interview by the New York State Police forensic interviewer. The forensic interviewer identified appropriate cases for control phone calls considering New York State law, the child's capacity to make the call, the presence of another person to make the call, and a supportive residence. The control phone call process has been extremely effective forensically. Offenders choose to avoid trial by taking a plea bargain, thereby dramatically speeding up the criminal judicial and family court processes. An additional outcome of the control phone call is that the alleged offender's own words saved the child from the trauma of testifying in court. The control phone call reduced the need for children to repeat their stories to various interviewers. A successful control phone call gives the child a sense of vindication. This is the only technique that preserves the actual communication pattern between the alleged victim and the alleged offender, which can be of great value to the mental health professionals working with both the child and the alleged offender. Cautions must be considered regarding potential serious adverse effects on the child. The multidisciplinary team members must work together in the control phone call. The descriptive nature of this study did not allow the authors to collect adequate demographic data, a subject that should be addressed in a future prospective study.

  7. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    PubMed

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands, and 3) band genotype classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) at one locus of the sugarcanes. These gel images presented many challenges for automated lane/band segmentation, including lane distortion, band deformity, a high degree of background noise, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and the DNA bands contained within them are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing it with an all-banding reference, which was created by clustering the existing bands into a non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool for DNA gel electrophoresis images, called GELect, which was written in Java and made available through the ImageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix and intelligently extract distorted and even doublet bands that are difficult to identify with existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically, allowing users to efficiently conduct large-scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
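    A minimal sketch, not the GELect code, of the three-step workflow the abstract describes (lane segmentation, band extraction, genotype calling), assuming a simple grayscale gel image in which dark bands sit on a bright background. The fixed lane width, peak parameters, and reference band positions are illustrative assumptions, and real gels would need distortion correction.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def genotype_gel(image, n_lanes, ref_positions, tol=5):
        """Toy 1D-gel genotyper: segment lanes, find bands, call genotypes.

        image:         2D array, rows = migration axis, columns = lane axis
        n_lanes:       number of lanes to cut the image into (assumed equal width)
        ref_positions: row indices of the reference (non-redundant) bands
        """
        signal = image.max() - image                 # dark bands become bright peaks
        lane_width = image.shape[1] // n_lanes
        calls = []
        for i in range(n_lanes):
            # 1) lane segmentation: fixed-width vertical strips (real tools correct
            #    for lane distortion; this sketch does not)
            lane = signal[:, i * lane_width:(i + 1) * lane_width]
            # 2) band extraction: peaks in the lane's row-wise intensity profile
            profile = lane.mean(axis=1)
            bands, _ = find_peaks(profile, prominence=profile.std())
            # 3) genotype calling: presence/absence against the reference bands
            call = [int(any(abs(b - r) <= tol for b in bands)) for r in ref_positions]
            calls.append(call)
        return calls
    ```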

  8. Hands-free human-machine interaction with voice

    NASA Astrophysics Data System (ADS)

    Juang, B. H.

    2004-05-01

    Voice is a natural communication interface between a human and a machine. The machine, when placed in today's communication networks, may be configured to provide automation to save substantial operating cost, as demonstrated in AT&T's VRCP (Voice Recognition Call Processing), or to facilitate intelligent services, such as virtual personal assistants, to enhance individual productivity. These intelligent services often need to be accessible anytime, anywhere (e.g., in cars when the user is in a hands-busy-eyes-busy situation, or during meetings where constantly talking into a microphone is either undesirable or impossible), and thus call for advanced signal processing and automatic speech recognition techniques which support what we call ``hands-free'' human-machine communication. These techniques entail a broad spectrum of technical ideas, ranging from the use of directional microphones and acoustic echo cancellation to robust speech recognition. In this talk, we highlight a number of key techniques that were developed for hands-free human-machine communication in the mid-1990s after Bell Labs became a unit of Lucent Technologies. A video clip will be played to demonstrate the accomplishment.

  9. Comparing Noun Phrasing Techniques for Use with Medical Digital Library Tools.

    ERIC Educational Resources Information Center

    Tolle, Kristin M.; Chen, Hsinchun

    2000-01-01

    Describes a study that investigated the use of a natural language processing technique called noun phrasing to determine whether it is a viable technique for medical information retrieval. Evaluates four noun phrase generation tools for their ability to isolate noun phrases from medical journal abstracts, focusing on precision and recall.…

  10. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  11. Development of a mix design process for cold-in-place rehabilitation using foamed asphalt.

    DOT National Transportation Integrated Search

    2003-12-01

    This study evaluates one of the recycling techniques used to rehabilitate pavement, called Cold In-Place Recycling (CIR). CIR is one of the fastest growing road rehabilitation techniques because it is quick and cost-effective. The document reports on...

  12. Psychological Dynamics of Adolescent Satanism.

    ERIC Educational Resources Information Center

    Moriarty, Anthony R.; Story, Donald W.

    1990-01-01

    Attempts to describe the psychological processes that predispose an individual to adopt a Satanic belief system. Describes processes in terms of child-parent relationships and the developmental tasks of adolescence. Proposes a model called the web of psychic tension to represent the process of Satanic cult adoption. Describes techniques for…

  13. Optical processing for semiconductor device fabrication

    NASA Technical Reports Server (NTRS)

    Sopori, Bhushan L.

    1994-01-01

    A new technique for semiconductor device processing is described that uses optical energy to produce local heating/melting in the vicinity of a preselected interface of the device. This process, called optical processing, invokes assistance of photons to enhance interface reactions such as diffusion and melting, as compared to the use of thermal heating alone. Optical processing is performed in a 'cold wall' furnace, and requires considerably lower energies than furnace or rapid thermal annealing. This technique can produce some device structures with unique properties that cannot be produced by conventional thermal processing. Some applications of optical processing involving semiconductor-metal interfaces are described.

  14. Metamodeling Techniques to Aid in the Aggregation Process of Large Hierarchical Simulation Models

    DTIC Science & Technology

    2008-08-01

    [Figure residue: ...-level outputs, campaign-level model, campaign-level outputs, aggregation, metamodeling, complexity (spatial, temporal, etc.)] ...reduction, are called variance reduction techniques (VRT) [Law, 2006]. The implementation of some type of VRT can prove to be a very valuable tool...

  15. A proposed technique for vehicle tracking, direction, and speed determination

    NASA Astrophysics Data System (ADS)

    Fisher, Paul S.; Angaye, Cleopas O.; Fisher, Howard P.

    2004-12-01

    A technique for recognition of vehicles in terms of direction, distance, and rate of change is presented. This represents very early work on this problem with significant hurdles still to be addressed. These are discussed in the paper. However, preliminary results also show promise for this technique for use in security and defense environments where the penetration of a perimeter is of concern. The material described herein indicates a process whereby the protection of a barrier could be augmented by computers and installed cameras assisting the individuals charged with this responsibility. The technique we employ is called Finite Inductive Sequences (FI) and is proposed as a means for eliminating data requiring storage and recognition where conventional mathematical models don't eliminate enough and statistical models eliminate too much. FI is a simple idea and is based upon a symbol push-out technique that allows the order (inductive base) of the model to be set to an a priori value for all derived rules. The rules are obtained from exemplar data sets, and are derived by a technique called Factoring, yielding a table of rules called a Ruling. These rules can then be used in pattern recognition applications such as described in this paper.

  16. Nanoforging - Innovation in three-dimensional processing and shaping of nanoscaled structures.

    PubMed

    Landefeld, Andreas; Rösler, Joachim

    2014-01-01

    This paper describes the shaping of freestanding objects out of metallic structures in the nano- and submicron size. The technique used, called nanoforging, is very similar to the macroscopic forging process. With spring actuated tools produced by focused ion beam milling, controlled forging is demonstrated. With only three steps, a conical bar stock is transformed to a flat- and semicircular bent bar stock. Compared with other forming techniques in the reduced scale, nanoforging represents a beneficial approach in forming freestanding metallic structures, due to its simplicity, and supplements other forming techniques.

  17. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are a combination of visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  18. [Organization development of the public health system].

    PubMed

    Pfaff, Holger; Klein, Jürgen

    2002-05-15

    Changes in the German health care system require changes in health care institutions. Organizational development (OD) techniques can help them to cope successfully with their changing environment. OD is defined as a collective process of learning aiming to induce intended organizational change. OD is based on social science methods and conducted by process-oriented consultants. In contrast to techniques of organizational design, OD is characterized by employee participation. One of the most important elements of OD is the so-called "survey-feedback-technique". Five examples illustrate how the survey-feedback-technique can be used to facilitate organisational learning. OD technique supports necessary change in health care organizations. It should be used more frequently.

  19. Blocking Strategies for Performing Entity Resolution in a Distributed Computing Environment

    ERIC Educational Resources Information Center

    Wang, Pei

    2016-01-01

    Entity resolution (ER) is an O(n²) problem where n is the number of records to be processed. The pair-wise nature of ER makes it impractical to perform on large datasets without the use of a technique called blocking. In blocking, the records are separated into groups (called blocks) in such a way that the records most likely to match are…
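    The record is truncated, but the blocking idea itself is simple. Below is a minimal sketch, not the dissertation's code, that groups records by a blocking key (here an assumed key of surname prefix plus birth year) so that pair-wise matching is only attempted within blocks rather than over all O(n²) pairs.

    ```python
    from collections import defaultdict
    from itertools import combinations

    def block_records(records, key_fn):
        """Group records by a blocking key; only records sharing a key are compared."""
        blocks = defaultdict(list)
        for rec in records:
            blocks[key_fn(rec)].append(rec)
        return blocks

    def candidate_pairs(blocks):
        """Yield the pairs that survive blocking (the only pairs sent to matching)."""
        for recs in blocks.values():
            yield from combinations(recs, 2)

    # Illustrative records and blocking key (surname prefix + birth year).
    records = [
        {"name": "Smith, John", "dob": "1980-01-02"},
        {"name": "Smyth, John", "dob": "1980-01-02"},
        {"name": "Smith, Jon",  "dob": "1980-01-02"},
        {"name": "Jones, Mary", "dob": "1975-07-30"},
    ]
    key = lambda r: (r["name"][:3].lower(), r["dob"][:4])
    pairs = list(candidate_pairs(block_records(records, key)))
    print(len(pairs), "candidate pairs instead of", 4 * 3 // 2)
    ```

    The example also shows blocking's usual trade-off: "Smyth, John" lands in a different block than "Smith, John", so an aggressive key can discard true matches along with the unnecessary comparisons.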

  20. Adaptive frequency-difference matched field processing for high frequency source localization in a noisy shallow ocean.

    PubMed

    Worthmann, Brian M; Song, H C; Dowling, David R

    2017-01-01

    Remote source localization in the shallow ocean at frequencies significantly above 1 kHz is virtually impossible for conventional array signal processing techniques due to environmental mismatch. A recently proposed technique called frequency-difference matched field processing (Δf-MFP) [Worthmann, Song, and Dowling (2015). J. Acoust. Soc. Am. 138(6), 3549-3562] overcomes imperfect environmental knowledge by shifting the signal processing to frequencies below the signal's band through the use of a quadratic product of frequency-domain signal amplitudes called the autoproduct. This paper extends these prior Δf-MFP results to various adaptive MFP processors found in the literature, with particular emphasis on minimum variance distortionless response, multiple constraint method, multiple signal classification, and matched mode processing at signal-to-noise ratios (SNRs) from -20 to +20 dB. Using measurements from the 2011 Kauai Acoustic Communications Multiple University Research Initiative experiment, the localization performance of these techniques is analyzed and compared to Bartlett Δf-MFP. The results show that a source broadcasting a frequency sweep from 11.2 to 26.2 kHz through a 106-m-deep sound channel over a distance of 3 km and recorded on a 16-element sparse vertical array can be localized using Δf-MFP techniques within average range and depth errors of 200 and 10 m, respectively, at SNRs down to 0 dB.
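    A minimal sketch, under stated assumptions rather than the authors' code, of the autoproduct at the heart of Δf-MFP: for each receiver, the frequency-domain signal at two in-band frequencies separated by Δf is multiplied, with one factor conjugated, producing a quantity that can be processed at the much lower difference frequency. The sample rate, in-band frequency, and Δf below are illustrative.

    ```python
    import numpy as np

    def frequency_difference_autoproduct(x, fs, f_low, delta_f):
        """Autoproduct AP(f) = P(f + delta_f) * conj(P(f)) for one receiver.

        x:       time series from one hydrophone
        fs:      sample rate in Hz
        f_low:   in-band frequency f (Hz) at which to form the product
        delta_f: difference frequency (Hz), well below the signal band
        """
        spectrum = np.fft.rfft(x)
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        i_low = np.argmin(np.abs(freqs - f_low))
        i_high = np.argmin(np.abs(freqs - (f_low + delta_f)))
        return spectrum[i_high] * np.conj(spectrum[i_low])

    # Illustrative use on a 16-element array: one autoproduct sample per receiver,
    # which would then replace the usual field vector fed to an MFP processor.
    fs, f_low, delta_f = 100_000.0, 15_000.0, 50.0
    array_data = np.random.randn(16, 4096)      # stand-in for recorded signals
    w = np.array([frequency_difference_autoproduct(x, fs, f_low, delta_f)
                  for x in array_data])
    print(w.shape)   # (16,) complex autoproduct samples at the 50 Hz difference frequency
    ```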

  1. Gorilla Mothers Also Matter! New Insights on Social Transmission in Gorillas (Gorilla gorilla gorilla) in Captivity

    PubMed Central

    Luef, Eva Maria; Pika, Simone

    2013-01-01

    The present paper describes two distinct behaviors relating to food processing and communication that were observed in a community of five separately housed groups of lowland gorillas (Gorilla gorilla gorilla) in captivity during two study periods one decade apart: (1) a food processing technique to separate wheat from chaff, the so-called puff-blowing technique; and (2) a male display used to attract the attention of visitors, the so-called throw-kiss-display. We investigated (a) whether the behaviors were transmitted within the respective groups; and if yes, (b) their possible mode of transmission. Our results showed that only the food processing technique spread from three to twenty-one individuals during the ten-year period, whereas the communicative display died out completely. The main transmission mode of the puff-blowing technique was the mother-offspring dyad: offspring of puff-blowing mothers showed the behavior, while the offspring of non-puff-blowing mothers did not. These results strongly support the role mothers play in the acquisition of novel skills and vertical social transmission. Furthermore, they suggest that behaviors, which provide a direct benefit to individuals, have a high chance of social transmission while the loss of benefits can result in the extinction of behaviors. PMID:24312184

  2. Gorilla mothers also matter! New insights on social transmission in gorillas (Gorilla gorilla gorilla) in captivity.

    PubMed

    Luef, Eva Maria; Pika, Simone

    2013-01-01

    The present paper describes two distinct behaviors relating to food processing and communication that were observed in a community of five separately housed groups of lowland gorillas (Gorilla gorilla gorilla) in captivity during two study periods one decade apart: (1) a food processing technique to separate wheat from chaff, the so-called puff-blowing technique; and (2) a male display used to attract the attention of visitors, the so-called throw-kiss-display. We investigated (a) whether the behaviors were transmitted within the respective groups; and if yes, (b) their possible mode of transmission. Our results showed that only the food processing technique spread from three to twenty-one individuals during the ten-year period, whereas the communicative display died out completely. The main transmission mode of the puff-blowing technique was the mother-offspring dyad: offspring of puff-blowing mothers showed the behavior, while the offspring of non-puff-blowing mothers did not. These results strongly support the role mothers play in the acquisition of novel skills and vertical social transmission. Furthermore, they suggest that behaviors, which provide a direct benefit to individuals, have a high chance of social transmission while the loss of benefits can result in the extinction of behaviors.

  3. Activity Summaries as a Classroom Assessment Tool.

    ERIC Educational Resources Information Center

    McGee, Steven; Kirby, Jennifer; Croft, Steven K.

    This study explored the usefulness of a classroom assessment technique called the activity summary template. It is proposed that the activity summary template enables students to process and organize information learned during an investigation. This process will in turn help students to achieve greater learning outcomes. The activity summary…

  4. Improving Group Processes in Transdisciplinary Case Studies for Sustainability Learning

    ERIC Educational Resources Information Center

    Hansmann, Ralf; Crott, Helmut W.; Mieg, Harald A.; Scholz, Roland W.

    2009-01-01

    Purpose: Deficient group processes such as conformity pressure can lead to inadequate group decisions with negative social, economic, or environmental consequences. The study aims to investigate how a group technique (called INFO) improves students' handling of conformity pressure and their collective judgments in the context of a…

  5. A Comparison of Research Techniques Used in the Collective Bargaining Process.

    ERIC Educational Resources Information Center

    Ahrens, Stephen W.

    Types of research currently being used in negotiating salaries and fringe benefits for faculty are discussed. As collective bargaining becomes more widespread among public colleges and universities it is suggested that the institutional researcher will be called upon to provide research relevant to the arbitration process. Master contract…

  6. NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.

    ERIC Educational Resources Information Center

    Zhou, Lina; Zhang, Dongsong

    2003-01-01

    Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…

  7. Nanoforging – Innovation in three-dimensional processing and shaping of nanoscaled structures

    PubMed Central

    Rösler, Joachim

    2014-01-01

    Summary Background: This paper describes the shaping of freestanding objects out of metallic structures in the nano- and submicron size. The technique used, called nanoforging, is very similar to the macroscopic forging process. Results: With spring actuated tools produced by focused ion beam milling, controlled forging is demonstrated. With only three steps, a conical bar stock is transformed to a flat- and semicircular bent bar stock. Conclusion: Compared with other forming techniques in the reduced scale, nanoforging represents a beneficial approach in forming freestanding metallic structures, due to its simplicity, and supplements other forming techniques. PMID:25161840

  8. Using a Polytope to Estimate Efficient Production Functions of Joint Product Processes.

    ERIC Educational Resources Information Center

    Simpson, William A.

    In the last decade, a modeling technique has been developed to handle complex input/output analyses where outputs involve joint products and there are no known mathematical relationships linking the outputs or inputs. The technique uses the geometrical concept of a six-dimensional shape called a polytope to analyze the efficiency of each…

  9. The Sandtray Technique for Swedish Children 1945-1960: Diagnostics, Psychotherapy and Processes of Individualisation

    ERIC Educational Resources Information Center

    Nelson, Karin Zetterqvist

    2011-01-01

    The present article examines the development of a diagnostic and therapeutic technique named The Sandtray at the Erica Foundation, a privately-run child counselling service in Stockholm. Originally it was called The World, developed by the British paediatrician and child psychiatrist Margaret Lowenfeld. In the 1930s it was imported to Sweden,…

  10. Asphalt Pavements Session 2E-1 : Development of Mix Design Process for Cold-In-Place Recycling [SD .WMV (720x480/29fps/203.0 MB)

    DOT National Transportation Integrated Search

    2003-12-01

    This study evaluates one of the recycling techniques used to rehabilitate pavement, called Cold In-Place Recycling (CIR). CIR is one of the fastest growing road rehabilitation techniques because it is quick and cost-effective. The document reports on...

  11. Mastering Overdetection and Underdetection in Learner-Answer Processing: Simple Techniques for Analysis and Diagnosis

    ERIC Educational Resources Information Center

    Blanchard, Alexia; Kraif, Olivier; Ponton, Claude

    2009-01-01

    This paper presents a "didactic triangulation" strategy to cope with the problem of reliability of NLP applications for computer-assisted language learning (CALL) systems. It is based on the implementation of basic but well mastered NLP techniques and puts the emphasis on an adapted gearing between computable linguistic clues and didactic features…

  12. The Buffer Diagnostic Prototype: A fault isolation application using CLIPS

    NASA Technical Reports Server (NTRS)

    Porter, Ken

    1994-01-01

    This paper describes problem domain characteristics and development experiences from using CLIPS 6.0 in a proof-of-concept troubleshooting application called the Buffer Diagnostic Prototype. The problem domain is a large digital communications subsystem called the real-time network (RTN), which was designed to upgrade the launch processing system used for shuttle support at KSC. The RTN enables up to 255 computers to share 50,000 data points with millisecond response times. The RTN's extensive built-in test capability but lack of any automatic fault isolation capability presents a unique opportunity for a diagnostic expert system application. The Buffer Diagnostic Prototype addresses RTN diagnosis with a multiple strategy approach. A novel technique called 'faulty causality' employs inexact qualitative models to process test results. Experimental knowledge provides a capability to recognize symptom-fault associations. The implementation utilizes rule-based and procedural programming techniques, including a goal-directed control structure and a simple text-based generic user interface that may be reusable for other rapid prototyping applications. Although limited in scope, this project demonstrates a diagnostic approach that may be adapted to troubleshoot a broad range of equipment.

  13. Charged-particle emission tomography

    PubMed Central

    Ding, Yijun; Caucci, Luca; Barrett, Harrison H.

    2018-01-01

    Purpose Conventional charged-particle imaging techniques —such as autoradiography —provide only two-dimensional (2D) black ex vivo images of thin tissue slices. In order to get volumetric information, images of multiple thin slices are stacked. This process is time consuming and prone to distortions, as registration of 2D images is required. We propose a direct three-dimensional (3D) autoradiography technique, which we call charged-particle emission tomography (CPET). This 3D imaging technique enables imaging of thick tissue sections, thus increasing laboratory throughput and eliminating distortions due to registration. CPET also has the potential to enable in vivo charged-particle imaging with a window chamber or an endoscope. Methods Our approach to charged-particle emission tomography uses particle-processing detectors (PPDs) to estimate attributes of each detected particle. The attributes we estimate include location, direction of propagation, and/or the energy deposited in the detector. Estimated attributes are then fed into a reconstruction algorithm to reconstruct the 3D distribution of charged-particle-emitting radionuclides. Several setups to realize PPDs are designed. Reconstruction algorithms for CPET are developed. Results Reconstruction results from simulated data showed that a PPD enables CPET if the PPD measures more attributes than just the position from each detected particle. Experiments showed that a two-foil charged-particle detector is able to measure the position and direction of incident alpha particles. Conclusions We proposed a new volumetric imaging technique for charged-particle-emitting radionuclides, which we have called charged-particle emission tomography (CPET). We also proposed a new class of charged-particle detectors, which we have called particle-processing detectors (PPDs). When a PPD is used to measure the direction and/or energy attributes along with the position attributes, CPET is feasible. PMID:28370094

  14. Charged-particle emission tomography.

    PubMed

    Ding, Yijun; Caucci, Luca; Barrett, Harrison H

    2017-06-01

    Conventional charged-particle imaging techniques - such as autoradiography - provide only two-dimensional (2D) black ex vivo images of thin tissue slices. In order to get volumetric information, images of multiple thin slices are stacked. This process is time consuming and prone to distortions, as registration of 2D images is required. We propose a direct three-dimensional (3D) autoradiography technique, which we call charged-particle emission tomography (CPET). This 3D imaging technique enables imaging of thick tissue sections, thus increasing laboratory throughput and eliminating distortions due to registration. CPET also has the potential to enable in vivo charged-particle imaging with a window chamber or an endoscope. Our approach to charged-particle emission tomography uses particle-processing detectors (PPDs) to estimate attributes of each detected particle. The attributes we estimate include location, direction of propagation, and/or the energy deposited in the detector. Estimated attributes are then fed into a reconstruction algorithm to reconstruct the 3D distribution of charged-particle-emitting radionuclides. Several setups to realize PPDs are designed. Reconstruction algorithms for CPET are developed. Reconstruction results from simulated data showed that a PPD enables CPET if the PPD measures more attributes than just the position from each detected particle. Experiments showed that a two-foil charged-particle detector is able to measure the position and direction of incident alpha particles. We proposed a new volumetric imaging technique for charged-particle-emitting radionuclides, which we have called charged-particle emission tomography (CPET). We also proposed a new class of charged-particle detectors, which we have called particle-processing detectors (PPDs). When a PPD is used to measure the direction and/or energy attributes along with the position attributes, CPET is feasible. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  15. Applying industrial process improvement techniques to increase efficiency in a surgical practice.

    PubMed

    Reznick, David; Niazov, Lora; Holizna, Eric; Siperstein, Allan

    2014-10-01

    The goal of this study was to examine how industrial process improvement techniques could help streamline the preoperative workup. Lean process improvement was used to streamline patient workup at an endocrine surgery service at a tertiary medical center utilizing multidisciplinary collaboration. The program consisted of several major changes in how patients are processed in the department. The goal was to shorten the wait time between initial call and consult visit and between consult and surgery. We enrolled 1,438 patients in the program. The wait time from the initial call until consult was reduced from 18.3 ± 0.7 to 15.4 ± 0.9 days. Wait time from consult until operation was reduced from 39.9 ± 1.5 to 33.9 ± 1.3 days for the overall practice and to 15.0 ± 4.8 days for low-risk patients. Patient cancellations were reduced from 27.9 ± 2.4% to 17.3 ± 2.5%. Overall patient flow increased from 30.9 ± 5.1 to 52.4 ± 5.8 consults per month (all P < .01). Utilizing process improvement methodology, surgery patients can benefit from an improved, streamlined process with significant reduction in wait time from call to initial consult and initial consult to surgery, with reduced cancellations. This generalized process has resulted in increased practice throughput and efficiency and is applicable to any surgery practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Generic Modeling of a Life Support System for Process Technology Comparison

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.

  17. Spotlight-Mode Synthetic Aperture Radar Processing for High-Resolution Lunar Mapping

    NASA Technical Reports Server (NTRS)

    Harcke, Leif; Weintraub, Lawrence; Yun, Sang-Ho; Dickinson, Richard; Gurrola, Eric; Hensley, Scott; Marechal, Nicholas

    2010-01-01

    During the 2008-2009 year, the Goldstone Solar System Radar was upgraded to support radar mapping of the lunar poles at 4 m resolution. The finer resolution of the new system and the accompanying migration through resolution cells called for spotlight, rather than delay-Doppler, imaging techniques. A new pre-processing system supports fast-time Doppler removal and motion compensation to a point. Two spotlight imaging techniques which compensate for phase errors due to i) out of focus-plane motion of the radar and ii) local topography, have been implemented and tested. One is based on the polar format algorithm followed by a unique autofocus technique, the other is a full bistatic time-domain backprojection technique. The processing system yields imagery of the specified resolution. Products enabled by this new system include topographic mapping through radar interferometry, and change detection techniques (amplitude and coherent change) for geolocation of the NASA LCROSS mission impact site.

  18. Innovation in Sales Training

    ERIC Educational Resources Information Center

    Spencer, R. W.

    1974-01-01

    The British Gas Corporation has formulated and refined the incident process of training into their own method, which they call developing case study. Sales trainees learn indoor and outdoor sales techniques for selling central heating through self-taught case studies. (DS)

  19. Using decision-tree classifier systems to extract knowledge from databases

    NASA Technical Reports Server (NTRS)

    St.clair, D. C.; Sabharwal, C. L.; Hacke, Keith; Bond, W. E.

    1990-01-01

    One difficulty in applying artificial intelligence techniques to the solution of real world problems is that the development and maintenance of many AI systems, such as those used in diagnostics, require large amounts of human resources. At the same time, databases frequently exist which contain information about the process(es) of interest. Recently, efforts to reduce development and maintenance costs of AI systems have focused on using machine learning techniques to extract knowledge from existing databases. Research is described in the area of knowledge extraction using a class of machine learning techniques called decision-tree classifier systems. Results of this research suggest ways of performing knowledge extraction which may be applied in numerous situations. In addition, a measurement called the concept strength metric (CSM) is described which can be used to determine how well the resulting decision tree can differentiate between the concepts it has learned. The CSM can be used to determine whether or not additional knowledge needs to be extracted from the database. An experiment involving real world data is presented to illustrate the concepts described.
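    A minimal sketch, not the authors' system, of the basic idea of learning a decision tree from an existing database and reading the induced rules back out, using scikit-learn. The toy diagnostic table is an illustrative assumption, and the concept strength metric described in the abstract is not implemented here.

    ```python
    from sklearn.tree import DecisionTreeClassifier, export_text

    # Toy "database" of past diagnostic cases: two sensor readings and a fault label.
    X = [[0.9, 0.1], [0.8, 0.2], [0.2, 0.9], [0.1, 0.8], [0.5, 0.5], [0.4, 0.6]]
    y = ["overheat", "overheat", "leak", "leak", "ok", "leak"]

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

    # The learned tree is itself the extracted knowledge: it can be dumped as
    # human-readable rules and used to classify new cases.
    print(export_text(tree, feature_names=["temperature", "pressure_drop"]))
    print(tree.predict([[0.85, 0.15]]))   # expected: ['overheat']
    ```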

  20. Flood mapping from Sentinel-1 and Landsat-8 data: a case study from river Evros, Greece

    NASA Astrophysics Data System (ADS)

    Kyriou, Aggeliki; Nikolakopoulos, Konstantinos

    2015-10-01

    Floods are sudden and temporary natural events, affecting areas which are not normally covered by water. Because floods strongly influence both society and the natural environment, flood mapping is crucial. Remote sensing data can be used to develop flood maps in an efficient and effective way. This work focuses on the expansion of water bodies overtopping the natural levees of the river Evros, invading and flooding the surrounding areas. Different flood mapping techniques were applied using data from active and passive remote sensing sensors, namely Sentinel-1 and Landsat-8 respectively. Spaceborne image pairs obtained from Sentinel-1 were processed in this study. Each pair included an image acquired during the flood, called the "crisis image", and another acquired before the event, called the "archived image". Both images, covering the same area, were processed to produce a map showing the spread of the flood. Multispectral data from Landsat-8 were also processed in order to detect and map the flooded areas. Different image processing techniques were applied and the results were compared to the respective results of the radar data processing.

  1. Parallel computing in genomic research: advances and applications

    PubMed Central

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today’s genomic experiments have to process the so-called “biological big data” that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities. PMID:26604801

  2. Parallel computing in genomic research: advances and applications.

    PubMed

    Ocaña, Kary; de Oliveira, Daniel

    2015-01-01

    Today's genomic experiments have to process the so-called "biological big data" that is now reaching the size of terabytes and petabytes. To process this huge amount of data, scientists may require weeks or months if they use their own workstations. Parallelism techniques and high-performance computing (HPC) environments can be applied to reduce the total processing time and to ease the management, treatment, and analysis of this data. However, running bioinformatics experiments in HPC environments such as clouds, grids, clusters, and graphics processing units requires expertise from scientists to integrate computational, biological, and mathematical techniques and technologies. Several solutions have already been proposed to allow scientists to process their genomic experiments using HPC capabilities and parallelism techniques. This article presents a systematic review of the literature surveying the most recently published research involving genomics and parallel computing. Our objective is to gather the main characteristics, benefits, and challenges that can be considered by scientists when running their genomic experiments to benefit from parallelism techniques and HPC capabilities.
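    A minimal sketch, not from the review, of the kind of parallelism the article surveys, assuming an embarrassingly parallel task (per-sequence GC content, a stand-in for heavier analyses) spread across local cores with Python's multiprocessing. The in-memory dataset is an illustrative assumption; real pipelines would stream files on a cluster or cloud back end.

    ```python
    from multiprocessing import Pool

    def gc_content(seq: str) -> float:
        """Fraction of G/C bases in one sequence (a stand-in for heavier analysis)."""
        seq = seq.upper()
        return (seq.count("G") + seq.count("C")) / max(len(seq), 1)

    if __name__ == "__main__":
        # Illustrative in-memory "dataset"; real pipelines would read FASTA/FASTQ
        # chunks from disk or object storage and distribute them across nodes.
        sequences = ["ACGTACGGCC", "TTTTAAAACG", "GGGCCCATAT"] * 100_000
        with Pool() as pool:                        # one worker per available core
            results = pool.map(gc_content, sequences, chunksize=10_000)
        print(sum(results) / len(results))
    ```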

  3. The Timing of Island Effects in Nonnative Sentence Processing

    ERIC Educational Resources Information Center

    Felser, Claudia; Cunnings, Ian; Batterham, Claire; Clahsen, Harald

    2012-01-01

    Using the eye-movement monitoring technique in two reading comprehension experiments, this study investigated the timing of constraints on wh-dependencies (so-called island constraints) in first- and second-language (L1 and L2) sentence processing. The results show that both L1 and L2 speakers of English are sensitive to extraction islands during…

  4. Wave scheduling - Decentralized scheduling of task forces in multicomputers

    NASA Technical Reports Server (NTRS)

    Van Tilborg, A. M.; Wittie, L. D.

    1984-01-01

    Decentralized operating systems that control large multicomputers need techniques to schedule competing parallel programs called task forces. Wave scheduling is a probabilistic technique that uses a hierarchical distributed virtual machine to schedule task forces by recursively subdividing and issuing wavefront-like commands to processing elements capable of executing individual tasks. Wave scheduling is highly resistant to processing element failures because it uses many distributed schedulers that dynamically assign scheduling responsibilities among themselves. The scheduling technique is trivially extensible as more processing elements join the host multicomputer. A simple model of scheduling cost is used by every scheduler node to distribute scheduling activity and minimize wasted processing capacity by using perceived workload to vary decentralized scheduling rules. At low to moderate levels of network activity, wave scheduling is only slightly less efficient than a central scheduler in its ability to direct processing elements to accomplish useful work.

  5. Process Product Integrity Audits: A Hardware Auditing Technique for the '90s'

    NASA Technical Reports Server (NTRS)

    Taylor, Mike

    1994-01-01

    The Space Shuttle program has experienced hardware problems that have delayed several shuttle launches. A NASA review determined that the problems could have been prevented. NASA further concluded that a new kind of quality emphasis at all Space Shuttle prime contractors and subcontractors was necessary to ensure mission success. To meet this challenge, NASA initiated an innovative review process called Process/Product Integrity Audits (PPIA).

  6. Comparison of Point Matching Techniques for Road Network Matching

    NASA Astrophysics Data System (ADS)

    Hackeloeer, A.; Klasing, K.; Krisp, J. M.; Meng, L.

    2013-05-01

    Map conflation investigates the unique identification of geographical entities across different maps depicting the same geographic region. It involves a matching process which aims to find commonalities between geographic features. A specific subdomain of conflation called Road Network Matching establishes correspondences between road networks of different maps on multiple layers of abstraction, ranging from elementary point locations to high-level structures such as road segments or even subgraphs derived from the induced graph of a road network. The process of identifying points located on different maps by means of geometrical, topological and semantic information is called point matching. This paper provides an overview of various techniques for point matching, which is a fundamental requirement for subsequent matching steps focusing on complex high-level entities in geospatial networks. Common point matching approaches as well as certain combinations of these are described, classified and evaluated. Furthermore, a novel similarity metric called the Exact Angular Index is introduced, which considers both topological and geometrical aspects. The results offer a basis for further research on a bottom-up matching process for complex map features, which must rely upon findings derived from suitable point matching algorithms. In the context of Road Network Matching, reliable point matches provide an immediate starting point for finding matches between line segments describing the geometry and topology of road networks, which may in turn be used for performing a structural high-level matching on the network level.
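    A minimal sketch of the most basic geometrical point-matching step only: mutual nearest neighbours within a distance threshold between the junction sets of two road networks. This is not the paper's method and not the Exact Angular Index, whose definition is not given in the abstract; the coordinates and threshold are illustrative.

    ```python
    import numpy as np

    def mutual_nearest_matches(points_a, points_b, max_dist):
        """Match points of map A to map B by mutual nearest neighbour within max_dist."""
        a = np.asarray(points_a, dtype=float)
        b = np.asarray(points_b, dtype=float)
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)   # pairwise distances
        nearest_b = d.argmin(axis=1)          # for each A point, closest B point
        nearest_a = d.argmin(axis=0)          # for each B point, closest A point
        matches = []
        for i, j in enumerate(nearest_b):
            if nearest_a[j] == i and d[i, j] <= max_dist:           # mutual and close enough
                matches.append((i, j))
        return matches

    # Two small "road networks" whose junctions are slightly displaced versions
    # of each other (illustrative coordinates, e.g. metres in a local frame).
    map_a = [(0, 0), (100, 0), (100, 80), (300, 300)]
    map_b = [(2, 1), (101, -1), (99, 82), (500, 500)]
    print(mutual_nearest_matches(map_a, map_b, max_dist=10))   # [(0, 0), (1, 1), (2, 2)]
    ```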

  7. Popular Theatre: A Technique for Participatory Research. Participatory Research Project. Working Paper No. 5.

    ERIC Educational Resources Information Center

    Kidd, Ross; Byram, Martin

    Popular theatre that speaks to the common man in his own language and deals with directly relevant problems can be an effective adult education tool in the process Paulo Freire calls conscientization--a process aiming to radically transform social reality and improve people's lives. It can also serve as a medium for participatory research. Popular…

  8. Applying traditional signal processing techniques to social media exploitation for situational understanding

    NASA Astrophysics Data System (ADS)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
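    A minimal sketch, not Apollo itself, of the collective-filtering idea behind such pipelines: iteratively re-estimating source reliability and claim credibility from a matrix of who reported what, in the style of simple truth-discovery algorithms. The report matrix and iteration count are illustrative assumptions.

    ```python
    import numpy as np

    def truth_discovery(report, iters=20):
        """report[i, j] = 1 if source i asserted claim j, else 0.

        Iteratively: claim credibility = reliability-weighted vote of its sources;
        source reliability = mean credibility of the claims the source asserted.
        """
        n_sources, n_claims = report.shape
        reliability = np.full(n_sources, 0.5)
        for _ in range(iters):
            weight = report * reliability[:, None]
            credibility = weight.sum(axis=0) / (reliability.sum() + 1e-9)
            asserted = report.sum(axis=1) + 1e-9
            reliability = (report * credibility[None, :]).sum(axis=1) / asserted
        return credibility, reliability

    # Five sources, three claimed events: sources 0-3 corroborate events 0 and 1,
    # source 4 alone pushes event 2 (e.g. a conflicted or unreliable relay).
    report = np.array([[1, 1, 0],
                       [1, 1, 0],
                       [1, 0, 0],
                       [0, 1, 0],
                       [0, 0, 1]], dtype=float)
    cred, rel = truth_discovery(report)
    print(np.round(cred, 2), np.round(rel, 2))
    ```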

  9. Measuring the apparent size of the Moon with a digital camera

    NASA Astrophysics Data System (ADS)

    Ellery, Adam; Hughes, Stephen

    2012-09-01

    The Moon appears to be much larger closer to the horizon than when higher in the sky. This is called the ‘Moon illusion’ since the observed size of the Moon is not actually larger when the Moon is just above the horizon. This paper describes a technique for verifying that the observed size of the Moon is not larger on the horizon. The technique can be performed easily in a high-school teaching environment. Moreover, the technique demonstrates the surprising fact that the observed size of the Moon is actually smaller on the horizon due to atmospheric refraction. For the purposes of this paper, several images of the Moon were taken with it close to the horizon and close to the zenith. The images were processed using a free program called ImageJ. The Moon was found to be 5.73 ± 0.04% smaller in area on the horizon than at the zenith.
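    A minimal sketch of the same measurement done in Python rather than ImageJ: threshold each Moon photograph, count the bright pixels as the Moon's disc area, and compare horizon and zenith images taken with identical camera settings. The file names and the threshold rule are illustrative assumptions.

    ```python
    import numpy as np
    from PIL import Image

    def moon_area_pixels(path, thresh_frac=0.5):
        """Count pixels brighter than thresh_frac of the image maximum (the Moon disc)."""
        gray = np.asarray(Image.open(path).convert("L"), dtype=float)
        return int((gray > thresh_frac * gray.max()).sum())

    # Hypothetical file names for photos taken with identical camera settings.
    a_horizon = moon_area_pixels("moon_horizon.jpg")
    a_zenith = moon_area_pixels("moon_zenith.jpg")
    print("area ratio (horizon/zenith):", a_horizon / a_zenith)
    print("percent difference:", 100 * (a_zenith - a_horizon) / a_zenith)
    ```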

  10. EPA Science Matters Newsletter: Greener Cleanups at Hazardous Waste Sites (Published August 2013)

    EPA Pesticide Factsheets

    Read about the EPA’s Smart Energy Resources Guide (SERG). The guide covers techniques for superfund managers to reduce cleanup emissions in a process called green remediation, and can be used by any site remediation and redevelopment manager.

  11. Issue Scanning: Finding the Future...Maybe.

    ERIC Educational Resources Information Center

    Plog, Michael; Sweeney, Jim; Weiss, Barry

    Issue Scanning, sometimes called Environmental Scanning, is used in many business, government, educational, and nonprofit organizations. The technique is supposed to monitor the "pulse" of the external environment. The scanning process should lessen the randomness of the information used in decision making, and it should alert managers…

  12. Esquisse d'une grammaire de l'imaginaire (Sketch of a Grammar of the Fanciful).

    ERIC Educational Resources Information Center

    Ruck, Heribert

    1986-01-01

    Proposes an approach to teaching grammar that calls on the student's imagination and frees the learning process from classroom routine. The technique uses examples of specific constructions in French poetry to illustrate principles of grammar and discourse. (MSE)

  13. MuLoG, or How to Apply Gaussian Denoisers to Multi-Channel SAR Speckle Reduction?

    PubMed

    Deledalle, Charles-Alban; Denis, Loic; Tabti, Sonia; Tupin, Florence

    2017-09-01

    Speckle reduction is a longstanding topic in synthetic aperture radar (SAR) imaging. Since most current and planned SAR imaging satellites operate in polarimetric, interferometric, or tomographic modes, SAR images are multi-channel and speckle reduction techniques must jointly process all channels to recover polarimetric and interferometric information. The distinctive nature of SAR signal (complex-valued, corrupted by multiplicative fluctuations) calls for the development of specialized methods for speckle reduction. Image denoising is a very active topic in image processing with a wide variety of approaches and many denoising algorithms available, almost always designed for additive Gaussian noise suppression. This paper proposes a general scheme, called MuLoG (MUlti-channel LOgarithm with Gaussian denoising), to include such Gaussian denoisers within a multi-channel SAR speckle reduction technique. A new family of speckle reduction algorithms can thus be obtained, benefiting from the ongoing progress in Gaussian denoising, and offering several speckle reduction results often displaying method-specific artifacts that can be dismissed by comparison between results.
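    A minimal single-channel sketch of the underlying idea only: take the logarithm of the speckled intensity so the multiplicative fluctuations become approximately additive, run an off-the-shelf Gaussian denoiser, then exponentiate back. This is not the full MuLoG algorithm, which handles multi-channel (polarimetric/interferometric) covariance data and embeds the denoiser in an iterative scheme with bias handling; the Gaussian filter and simulated speckle below are illustrative stand-ins.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def log_domain_despeckle(intensity, denoiser=lambda img: gaussian_filter(img, 2.0)):
        """Single-channel toy version: log transform, additive-noise denoiser, exp back."""
        log_img = np.log(np.maximum(intensity, 1e-10))   # multiplicative noise becomes additive
        return np.exp(denoiser(log_img))                 # any Gaussian denoiser can be plugged in

    # Illustrative single-look speckle: exponentially distributed intensity around
    # a piecewise-constant reflectivity map.
    rng = np.random.default_rng(0)
    reflectivity = np.ones((128, 128))
    reflectivity[:, 64:] = 4.0
    speckled = reflectivity * rng.exponential(1.0, size=reflectivity.shape)
    despeckled = log_domain_despeckle(speckled)
    print(float(speckled.std()), float(despeckled.std()))
    ```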

  14. Perspective on Kraken Mare Shores

    NASA Image and Video Library

    2015-02-12

    This Cassini Synthetic Aperture Radar (SAR) image is presented as a perspective view and shows a landscape near the eastern shoreline of Kraken Mare, a hydrocarbon sea in Titan's north polar region. This image was processed using a technique for handling noise that results in clearer views that can be easier for researchers to interpret. The technique, called despeckling, also is useful for producing altimetry data and 3-D views called digital elevation maps. Scientists have used a technique called radargrammetry to determine the altitude of surface features in this view at a resolution of approximately half a mile, or 1 kilometer. The altimetry reveals that the area is smooth overall, with a maximum amplitude of 0.75 mile (1.2 kilometers) in height. The topography also shows that all observed channels flow downhill. The presence of what scientists call "knickpoints" -- locations on a river where a sharp change in slope occurs -- might indicate stratification in the bedrock, erosion mechanisms at work or a particular way the surface responds to runoff events, such as floods following large storms. One such knickpoint is visible just above the lower left corner, where an area of bright slopes is seen. The image was obtained during a flyby of Titan on April 10, 2007. A more traditional radar image of this area on Titan is seen in PIA19046. http://photojournal.jpl.nasa.gov/catalog/PIA19051

  15. Los Alamos, Toshiba probing Fukushima with cosmic rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Christopher

    2014-06-16

    Los Alamos National Laboratory has announced an impending partnership with Toshiba Corporation to use a Los Alamos technique called muon tomography to safely peer inside the cores of the Fukushima Daiichi reactors and create high-resolution images of the damaged nuclear material inside without ever breaching the cores themselves. The initiative could reduce the time required to clean up the disabled complex by at least a decade and greatly reduce radiation exposure to personnel working at the plant. Muon radiography (also called cosmic-ray radiography) uses secondary particles generated when cosmic rays collide with upper regions of Earth's atmosphere to create images of the objects that the particles, called muons, penetrate. The process is analogous to an X-ray image, except muons are produced naturally and do not damage the materials they contact. Muon radiography has been used before in imaginative applications such as mapping the interior of the Great Pyramid at Giza, but Los Alamos's muon tomography technique represents a vast improvement over earlier technology.

  16. Los Alamos, Toshiba probing Fukushima with cosmic rays

    ScienceCinema

    Morris, Christopher

    2018-01-16

    Los Alamos National Laboratory has announced an impending partnership with Toshiba Corporation to use a Los Alamos technique called muon tomography to safely peer inside the cores of the Fukushima Daiichi reactors and create high-resolution images of the damaged nuclear material inside without ever breaching the cores themselves. The initiative could reduce the time required to clean up the disabled complex by at least a decade and greatly reduce radiation exposure to personnel working at the plant. Muon radiography (also called cosmic-ray radiography) uses secondary particles generated when cosmic rays collide with upper regions of Earth's atmosphere to create images of the objects that the particles, called muons, penetrate. The process is analogous to an X-ray image, except muons are produced naturally and do not damage the materials they contact. Muon radiography has been used before in imaginative applications such as mapping the interior of the Great Pyramid at Giza, but Los Alamos's muon tomography technique represents a vast improvement over earlier technology.

  17. The Strategic Thinking Process: Efficient Mobilization of Human Resources for System Definition

    PubMed Central

    Covvey, H. D.

    1987-01-01

    This paper describes the application of several group management techniques to the creation of needs specifications and information systems strategic plans in health care institutions. The overall process is called the “Strategic Thinking Process”. It is a formal methodology that can reduce the time and cost of creating key documents essential for the successful implementation of health care information systems.

  18. Memory Forensics: Review of Acquisition and Analysis Techniques

    DTIC Science & Technology

    2013-11-01

    Management Overview Processes running on modern multitasking operating systems operate on an abstraction of RAM, called virtual memory [7]. In these systems...information such as user names, email addresses and passwords [7]. Analysts also use tools such as WinHex to identify headers or other suspicious data within

  19. The "Iron Inventor": Using Creative Problem Solving to Spur Student Creativity

    ERIC Educational Resources Information Center

    Lee, Seung Hwan; Hoffman, K. Douglas

    2014-01-01

    Based on the popular television show the "Iron Chef," an innovative marketing activity called the "Iron Inventor" is introduced. Using the creative problem-solving approach and active learning techniques, the Iron Inventor facilitates student learning pertaining to the step-by-step processes of creating a new product and…

  20. The development of the deterministic nonlinear PDEs in particle physics to stochastic case

    NASA Astrophysics Data System (ADS)

    Abdelrahman, Mahmoud A. E.; Sohaly, M. A.

    2018-06-01

    In the present work, an accurate method called the Riccati-Bernoulli Sub-ODE technique is used to solve the deterministic and stochastic cases of the Phi-4 equation and the nonlinear Foam Drainage equation. The control of the randomness in the input is also studied for the stability of the stochastic process solution.

  1. Path Planning For A Class Of Cutting Operations

    NASA Astrophysics Data System (ADS)

    Tavora, Jose

    1989-03-01

    Optimizing processing time in some contour-cutting operations requires solving the so-called no-load path problem. This problem is formulated and an approximate resolution method (based on heuristic search techniques) is described. Results for real-life instances (clothing layouts in the apparel industry) are presented and evaluated.
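
    The abstract does not reproduce the heuristic itself; the sketch below shows one common greedy strategy for shortening no-load (rapid-traverse) moves between cuts, assuming each contour is reduced to a single entry point. The coordinates are hypothetical.

```python
import math

def nearest_neighbour_order(points, start=(0.0, 0.0)):
    """Greedy ordering of cut entry points to shorten total no-load travel.

    points : list of (x, y) entry points, one per contour to be cut
    start  : tool park/start position
    Returns the visiting order and the total no-load distance.
    """
    remaining = list(points)
    order, pos, total = [], start, 0.0
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        total += math.dist(pos, nxt)
        remaining.remove(nxt)
        order.append(nxt)
        pos = nxt
    return order, total

# Hypothetical clothing-layout entry points (metres)
entries = [(0.2, 0.5), (1.4, 0.3), (0.9, 1.1), (2.0, 0.8)]
route, dist = nearest_neighbour_order(entries)
print(route, round(dist, 3))
```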

  2. ChemBrowser: a flexible framework for mining chemical documents.

    PubMed

    Wu, Xian; Zhang, Li; Chen, Ying; Rhodes, James; Griffin, Thomas D; Boyer, Stephen K; Alba, Alfredo; Cai, Keke

    2010-01-01

    The ability to extract chemical and biological entities and relations from text documents automatically has great value to biochemical research and development activities. The growing maturity of text mining and artificial intelligence technologies shows promise in enabling such automatic chemical entity extraction capabilities (called "Chemical Annotation" in this paper). Many techniques have been reported in the literature, ranging from dictionary and rule-based techniques to machine learning approaches. In practice, we found that no single technique works well in all cases. A combinatorial approach that allows one to quickly compose different annotation techniques together for a given situation is most effective. In this paper, we describe the key challenges we face in real-world chemical annotation scenarios. We then present a solution called ChemBrowser which has a flexible framework for chemical annotation. ChemBrowser includes a suite of customizable processing units that might be utilized in a chemical annotator, a high-level language that describes the composition of various processing units that would form a chemical annotator, and an execution engine that translates the composition language to an actual annotator that can generate annotation results for a given set of documents. We demonstrate the impact of this approach by tailoring an annotator for extracting chemical names from patent documents and show how this annotator can be easily modified with simple configuration alone.
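
    ChemBrowser's composition language and processing units are not given in the abstract; the sketch below only illustrates the underlying idea of composing simple annotators (a dictionary matcher and a crude suffix rule) into a single pipeline. All terms, rules and names are hypothetical.

```python
import re
from typing import Callable, List, Tuple

Annotation = Tuple[int, int, str]          # (start, end, matched text)
Annotator = Callable[[str], List[Annotation]]

def dictionary_annotator(terms: List[str]) -> Annotator:
    """Flags exact occurrences of known chemical names."""
    def annotate(text: str) -> List[Annotation]:
        hits = []
        for term in terms:
            for m in re.finditer(re.escape(term), text, flags=re.IGNORECASE):
                hits.append((m.start(), m.end(), m.group()))
        return hits
    return annotate

def suffix_annotator(suffixes=("ol", "ene", "ane")) -> Annotator:
    """Very crude rule: words ending in typical chemical suffixes."""
    pattern = re.compile(r"\b\w+(?:" + "|".join(suffixes) + r")\b")
    return lambda text: [(m.start(), m.end(), m.group()) for m in pattern.finditer(text)]

def compose(*annotators: Annotator) -> Annotator:
    """Combine annotators; results are merged and de-duplicated."""
    def annotate(text: str) -> List[Annotation]:
        return sorted({a for ann in annotators for a in ann(text)})
    return annotate

annotator = compose(dictionary_annotator(["benzene", "ethanol"]), suffix_annotator())
print(annotator("The sample contained ethanol, toluene and water."))
```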

  3. Diffusion bonding aeroengine components

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, G. A.; Broughton, T.

    1988-10-01

    The use of diffusion bonding processes at Rolls-Royce for the manufacture of titanium-alloy aircraft engine components and structures is described. A liquid-phase diffusion bonding process called activated diffusion bonding has been developed for the manufacture of the hollow titanium wide chord fan blade. In addition, solid-state diffusion bonding is being used in the manufacture of hollow vane/blade airfoil constructions mainly in conjunction with superplastic forming and hot forming techniques.

  4. Titration in the treatment of the more troubled patient.

    PubMed

    Winer, J A; Ornstein, E D

    2001-01-01

    This article defines and discusses a modification of technique recommended by the authors in the psychoanalytic treatment of more troubled patients--a modification they call titration. Titration is defined as a conscious decision by the analyst to increase or decrease assistance (or gratification) gradually, in order to facilitate the analytic process. The authors emphasize the complexity of decisions in treatment by focusing on the decision-making processes that titration requires. Guidelines and a case vignette are presented. The authors conclude by considering some of the politics involved in the introduction of technique modifications, the salience of the titration concept, and directions for further exploration.

  5. Iterative repair for scheduling and rescheduling

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Davis, Eugene; Deale, Michael

    1991-01-01

    An iterative repair search method called constraint-based simulated annealing is described. Simulated annealing is a hill-climbing search technique capable of escaping local minima. The utility of the constraint-based framework is shown by comparing search performance with and without the constraint framework on a suite of randomly generated problems. Results are also shown of applying the technique to the NASA Space Shuttle ground processing problem. These experiments show that the search method scales to complex, real-world problems and exhibits interesting anytime behavior.
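
    The NASA ground-processing model is not reproduced in this record; the toy sketch below illustrates the general iterative-repair idea, using simulated annealing to reduce constraint violations in an invented resource-and-slot scheduling problem.

```python
import math, random

random.seed(1)

# Toy problem: assign each task to one of N time slots; tasks sharing a
# resource must not occupy the same slot (the "constraints").
tasks = {"t1": "crane", "t2": "crane", "t3": "bay", "t4": "bay", "t5": "crane"}
slots = range(4)

def violations(schedule):
    """Count pairs of tasks that clash on a resource in the same slot."""
    v = 0
    items = list(schedule.items())
    for i, (a, sa) in enumerate(items):
        for b, sb in items[i + 1:]:
            if sa == sb and tasks[a] == tasks[b]:
                v += 1
    return v

def repair(schedule, temp=2.0, cooling=0.95, steps=500):
    current, cost = dict(schedule), violations(schedule)
    for _ in range(steps):
        task = random.choice(list(current))          # pick a task to move
        candidate = dict(current)
        candidate[task] = random.choice(list(slots)) # repair by reassignment
        delta = violations(candidate) - cost
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current, cost = candidate, cost + delta
        temp *= cooling
        if cost == 0:
            break
    return current, cost

initial = {t: 0 for t in tasks}                      # everything in slot 0
final, remaining = repair(initial)
print(final, "violations:", remaining)
```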

  6. Applications for Gradient Metal Alloys Fabricated Using Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Hofmann, Douglas C.; Borgonia, John Paul C.; Dillon, Robert P.; Suh, Eric J.; Mulder, jerry L.; Gardner, Paul B.

    2013-01-01

    Recently, additive manufacturing (AM) techniques have been developed that may shift the paradigm of traditional metal production by allowing complex net-shaped hardware to be built up layer-by-layer, rather than being machined from a billet. The AM process is ubiquitous with polymers due to their low melting temperatures, fast curing, and controllable viscosity, and 3D printers are widely available as commercial or consumer products. 3D printing with metals is inherently more complicated than with polymers due to their higher melting temperatures and reactivity with air, particularly when heated or molten. The process generally requires a high-power laser or other focused heat source, like an electron beam, for precise melting and deposition. Several promising metal AM techniques have been developed, including laser deposition (also called laser engineered net shaping or LENS® and laser deposition technology (LDT)), direct metal laser sintering (DMLS), and electron beam free-form (EBF). These machines typically use powders or wire feedstock that are melted and deposited using a laser or electron beam. Complex net-shape parts have been widely demonstrated using these (and other) AM techniques and the process appears to be a promising alternative to machining in some cases. Rather than simply competing with traditional machining for cost and time savings, the true advantage of AM involves the fabrication of hardware that cannot be produced using other techniques. This could include parts with "blind" features (like foams or trusses), parts that are difficult to machine conventionally, or parts made from materials that do not exist in bulk forms. In this work, the inventors identify that several AM techniques can be used to develop metal parts that change composition from one location in the part to another, allowing for complete control over the mechanical or physical properties. This changes the paradigm for conventional metal fabrication, which relies on an assortment of "post-processing" methods to locally alter properties (such as coating, heat treating, work hardening, shot peening, etching, anodizing, among others). Building the final part in an additive process allows for the development of an entirely new class of metals, so-called "functionally graded metals" or "gradient alloys." By carefully blending feedstock materials with different properties in an AM process, hardware can be developed with properties that cannot be obtained using other techniques but with the added benefit of the net-shaped fabrication that AM allows.

  7. Encapsulation materials research

    NASA Technical Reports Server (NTRS)

    Willis, P.

    1985-01-01

    The successful use of outdoor mounting racks as an accelerated aging technique (these devices are called optal reactors); a beginning list of candidate pottant materials for thin-film encapsulation, which process at temperatures well below 100 C; and description of a preliminary flame retardant formulation for ethylene vinyl acetate which could function to increase module flammability ratings are presented.

  8. Analysis of Questionnaires Applied in the Evaluation Process of Academicians in Higher Education Institutes

    ERIC Educational Resources Information Center

    Kalayci, Nurdan; Cimen, Orhan

    2012-01-01

    The aim of this study is to examine the questionnaires used to evaluate teaching performance in higher education institutes and called "Instructor and Course Evaluation Questionnaires (ICEQ)" in terms of questionnaire preparation techniques and components of curriculum. Obtaining at least one ICEQ belonging to any state and private…

  9. Statistical Process Control Techniques for the Telecommunications Systems Manager

    DTIC Science & Technology

    1992-03-01

    products that are out of tolerance and bad designs. The third type of defect, mistakes, are remedied by Poka-Yoke methods that are introduced later...based on total production costs plus quality costs. Once production is underway, interventions are determined by their impact on the QLF. F. POKA-YOKE ...Mistakes require process improvements called Poka-Yoke or mistake proofing. Shigeo Shingo developed Poka-Yoke methods to incorporate 100% inspection at

  10. Planning effectiveness may grow on fault trees.

    PubMed

    Chow, C W; Haddad, K; Mannino, B

    1991-10-01

    The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.
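
    As a generic reminder of the fault-tree arithmetic such an analysis relies on (not the tool used in the study), independent basic events can be combined through AND/OR gates as sketched below; the events and probabilities are hypothetical.

```python
# Minimal fault-tree evaluation: basic events combined through AND/OR gates,
# assuming independent events (illustrative, not the study's own tool).
def and_gate(*probs):
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(*probs):
    q = 1.0
    for x in probs:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical top event: "strategic threat missed"
p_data_unavailable = 0.05
p_analyst_overlooks = 0.10
p_no_external_review = 0.20

# Missed if data is unavailable OR (analyst overlooks AND no external review)
p_top = or_gate(p_data_unavailable, and_gate(p_analyst_overlooks, p_no_external_review))
print(round(p_top, 4))   # 0.069
```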

  11. Evolutionary neural networks for anomaly detection based on the behavior of a program.

    PubMed

    Han, Sang-Jun; Cho, Sung-Bae

    2006-06-01

    The process of learning the behavior of a given program by using machine-learning techniques (based on system-call audit data) is effective for detecting intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are some of the representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. In order to apply this approach to real-world problems successfully, it is important to determine the structures and weights of the neural networks. However, finding the appropriate structures requires very long time periods because there are no suitable analytical solutions. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of using ENNs is that it takes less time to obtain superior neural networks than when using conventional approaches. This is because they discover the structures and weights of the neural networks simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.
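
    The paper's ENN encoding and the DARPA feature set are not reproduced here; the sketch below conveys only the flavour of evolving network weights against labelled traces, with synthetic data standing in for system-call n-gram features. A full ENN would evolve the network structure as well as the weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for system-call n-gram count vectors (8 features per trace)
X = rng.random((200, 8))
y = (X[:, 0] + X[:, 3] > 1.0).astype(float)      # synthetic "intrusion" label

def forward(w, X):
    hidden = np.tanh(X @ w[:8].reshape(8, 1))    # one hidden unit, weights 0-7
    return (hidden[:, 0] * w[8] + w[9] > 0)      # output weight + bias

def fitness(w):
    return (forward(w, X) == y).mean()

# Evolve weight vectors directly (structure is fixed in this toy example)
pop = rng.normal(size=(40, 10))
for generation in range(60):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]               # keep the best 10
    children = parents[rng.integers(0, 10, size=30)] \
               + rng.normal(scale=0.2, size=(30, 10))     # mutate copies
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("training accuracy:", round(fitness(best), 3))
```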

  12. [Health protection for rural workers: the need to standardize techniques for quantifying dermal exposure to pesticides].

    PubMed

    Selmi, Giuliana da Fontoura Rodrigues; Trapé, Angelo Zanaga

    2014-05-01

    Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles in sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and the need to establish a single methodology for quantification of dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.

  13. On the Development of a Computing Infrastructure that Facilitates IPPD from a Decision-Based Design Perspective

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.

  14. Aid for the Medical Laboratory

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A process for separating chemical compounds in fluids resulted from a Jet Propulsion Laboratory (JPL)/LAPD project. The technique involves pouring a blood or urine sample into an extraction tube where packing material contained in a disposable tube called an "extraction column" absorbs water and spreads the specimen as a thin film, making it easy to identify specific components. When a solvent passes through the packing material, the desired compound dissolves and exits through the tube's bottom stem and is collected. Called AUDRI, Automated Drug Identification, it is commercially produced by Analytichem International which has successfully advanced the original technology.

  15. Systems-oriented survey of noncontact temperature measurement techniques for rapid thermal processing

    NASA Astrophysics Data System (ADS)

    Peyton, David; Kinoshita, Hiroyuki; Lo, G. Q.; Kwong, Dim-Lee

    1991-04-01

    Rapid Thermal Processing (RTP) is becoming a popular approach for future ULSI manufacturing due to its unique low thermal budget and process flexibility. Furthermore, when RTP is combined with Chemical Vapor Deposition (CVD) in the so-called RTP-CVD technology, it can be used to deposit ultrathin films with extremely sharp interfaces and excellent material qualities. One major consequence of this type of processing, however, is the need for extremely tight control of wafer temperature, both to obtain reproducible results for process control and to minimize slip and warpage arising from nonuniformities in temperature. Specifically, temperature measurement systems suitable for RTP must have both high precision (within 1-2 degrees) and a short response time, producing an accurate reading on the order of milliseconds for closed-loop control. Any such in-situ measurement technique must be non-contact, since thermocouples cannot meet the response-time requirements and have problems with conductive heat flow in the wafer. To date, optical pyrometry has been the most widely used technique for RTP systems, although a number of other techniques are being considered and researched. This article examines several such techniques from a systems perspective: optical pyrometry (both conventional and a new approach using ellipsometric techniques for concurrent emissivity measurement), Raman scattering, infrared laser thermometry, optical diffraction thermometry, and photoacoustic thermometry. Each approach is evaluated in terms of its actual or estimated manufacturing cost, remote sensing capability, precision, repeatability, dependence on processing history, range
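
    As a reminder of why emissivity uncertainty dominates pyrometry error in this setting, the sketch below inverts Planck's law for a single-wavelength pyrometer under an assumed emissivity; the wavelength, temperature and emissivity values are illustrative only.

```python
import math

# Physical constants (SI)
H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e8        # speed of light, m/s
K = 1.380649e-23        # Boltzmann constant, J/K

def brightness_to_temperature(radiance, wavelength, emissivity=1.0):
    """Invert Planck's law to estimate surface temperature.

    radiance   : measured spectral radiance, W m^-3 sr^-1 (per unit wavelength)
    wavelength : pyrometer wavelength, m
    emissivity : assumed spectral emissivity of the wafer
    """
    c1 = 2.0 * H * C**2
    c2 = H * C / (wavelength * K)
    return c2 / math.log(1.0 + emissivity * c1 / (wavelength**5 * radiance))

def planck(T, wavelength, emissivity=1.0):
    """Spectral radiance of a grey body (for the round-trip check below)."""
    c1 = 2.0 * H * C**2
    c2 = H * C / (wavelength * K)
    return emissivity * c1 / (wavelength**5 * (math.exp(c2 / T) - 1.0))

# Round-trip check at an assumed 1100 K wafer, 0.95 um wavelength, emissivity 0.7
L = planck(1100.0, 0.95e-6, emissivity=0.7)
print(round(brightness_to_temperature(L, 0.95e-6, emissivity=0.7), 1))   # ~1100.0
```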

  16. A modified prebind engagement process reduces biomechanical loading on front row players during scrummaging: a cross-sectional study of 11 elite teams.

    PubMed

    Cazzola, Dario; Preatoni, Ezio; Stokes, Keith A; England, Michael E; Trewartha, Grant

    2015-04-01

    Biomechanical studies of the rugby union scrum have typically been conducted using instrumented scrum machines, but a large-scale biomechanical analysis of live contested scrummaging is lacking. We investigated whether the biomechanical loading experienced by professional front row players during the engagement phase of live contested rugby scrums could be reduced using a modified engagement procedure. Eleven professional teams (22 forward packs) performed repeated scrum trials for each of the three engagement techniques, outdoors, on natural turf. The engagement processes were the 2011/2012 (referee calls crouch-touch-pause-engage), 2012/2013 (referee calls crouch-touch-set) and 2013/2014 (props prebind with the opposition prior to the 'Set' command; PreBind) variants. Forces were estimated by pressure sensors on the shoulders of the front row players of one forward pack. Inertial Measurement Units were placed on an upper spine cervical landmark (C7) of the six front row players to record accelerations. Players' motion was captured by multiple video cameras from three viewing perspectives and analysed in transverse and sagittal planes of motion. The PreBind technique reduced biomechanical loading in comparison with the other engagement techniques, with engagement speed, peak forces and peak accelerations of upper spine landmarks reduced by approximately 20%. There were no significant differences between techniques in terms of body kinematics and average force during the sustained push phase. Using a scrum engagement process which involves binding with the opposition prior to the engagement reduces the stresses acting on players and therefore may represent a possible improvement for players' safety.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghamarian, Iman; Samimi, Peyman

    The presence and interaction of nanotwins, geometrically necessary dislocations, and grain boundaries play a key role in the mechanical properties of nanostructured crystalline materials. Therefore, it is vital to determine the orientation, width and distance of nanotwins, the angle and axis of grain boundary misorientations as well as the type and the distributions of dislocations in an automatic and statistically meaningful fashion in a relatively large area. In this paper, such details are provided using a transmission electron microscope-based orientation microscopy technique called ASTAR™/precession electron diffraction. The remarkable spatial resolution of this technique (~ 2 nm) enables highly detailed characterization of nanotwins, grain boundaries and the configuration of dislocations. This orientation microscopy technique provides the raw data required for the determination of these parameters. The procedures to post-process the ASTAR™/PED datasets in order to obtain the important (and currently largely hidden) details of nanotwins as well as quantifications of dislocation density distributions are described in this study.

    Highlights:
    • EBSD cannot characterize defects such as dislocations, grain boundaries and nanotwins in severely deformed metals.
    • A TEM-based orientation microscopy technique called ASTAR™/PED was used to resolve the problem.
    • Locations and orientations of nanotwins, dislocation density distributions and grain boundary characters can be resolved.
    • This work provides the basis for further studies on the interactions between dislocations, grain boundaries and nanotwins.
    • The computational part is explained in sufficient detail to help readers post-process their own data.

  18. NMR Hole-Burning Experiments on Superionic Conductor Glasses

    NASA Astrophysics Data System (ADS)

    Kawamura, J.; Kuwata, N.; Hattori, T.

    2004-04-01

    Inhomogeneity is an inherent feature of glass: it is the density and concentration fluctuation frozen in at the glass transition temperature. This inhomogeneity plays a significant role in so-called superionic conductor glasses (SIG), since the mobile ions seek to move through energetically favorable paths. The localization of mobile ions in SIG near the 2nd glass transition is a remaining issue, where trapping, percolation and many-body interactions play a role. In order to investigate the trapping process in SIG, the authors have applied the 109Ag NMR Hole-Burning technique to AgI-containing SIG glasses. Using this technique, the slowing down of the site-exchange rates between different sites was evaluated.

  19. Subcritical Transition in Channel Flows

    NASA Astrophysics Data System (ADS)

    Maestri, Joseph; Hall, Philip

    2014-11-01

    Exact-coherent structures, or colloquially non-linear solutions to the Navier-Stokes equations, have been the subject of great interest over the past decade due to their relevance in understanding the process of transition to turbulence in shear flows. Over the past few years the relationship between high Reynolds number vortex-wave interaction theory and such states has been elucidated in a number of papers and has provided a solid asymptotic framework to understand the so-called self-sustaining process that maintains such structures. In this talk, we will discuss this relationship before talking about recent work on solving the vortex-wave interaction equations using numerical techniques in order to propose laminar-flow control techniques.

  20. Tissue Engineering the Cornea: The Evolution of RAFT

    PubMed Central

    Levis, Hannah J.; Kureshi, Alvena K.; Massie, Isobel; Morgan, Louise; Vernon, Amanda J.; Daniels, Julie T.

    2015-01-01

    Corneal blindness affects over 10 million people worldwide and current treatment strategies often involve replacement of the defective layer with healthy tissue. Due to a worldwide donor cornea shortage and the absence of suitable biological scaffolds, recent research has focused on the development of tissue engineering techniques to create alternative therapies. This review will detail how we have refined the simple engineering technique of plastic compression of collagen to a process we now call Real Architecture for 3D Tissues (RAFT). The RAFT production process has been standardised, and steps have been taken to consider Good Manufacturing Practice compliance. The evolution of this process has allowed us to create biomimetic epithelial and endothelial tissue equivalents suitable for transplantation and ideal for studying cell-cell interactions in vitro. PMID:25809689

  1. Safety Auditing and Assessments

    NASA Technical Reports Server (NTRS)

    Goodin, James Ronald (Ronnie)

    2005-01-01

    Safety professionals typically do not engage in audits and independent assessments with the same vigor as our quality brethren. Taking advantage of industry and government experience in conducting value-added Independent Assessments or Audits benefits a safety program. Most other organizations simply call this process "internal audits." Sources of audit training are presented and compared. A relation of logic between audit techniques and mishap investigation is discussed. An example of an audit process is offered. Shortcomings and pitfalls of auditing are covered.

  2. Safety Auditing and Assessments

    NASA Astrophysics Data System (ADS)

    Goodin, Ronnie

    2005-12-01

    Safety professionals typically do not engage in audits and independent assessments with the same vigor as our quality brethren. Taking advantage of industry and government experience in conducting value-added Independent Assessments or Audits benefits a safety program. Most other organizations simply call this process "internal audits." Sources of audit training are presented and compared. A relation of logic between audit techniques and mishap investigation is discussed. An example of an audit process is offered. Shortcomings and pitfalls of auditing are covered.

  3. Time-frequency and advanced frequency estimation techniques for the investigation of bat echolocation calls.

    PubMed

    Kopsinis, Yannis; Aboutanios, Elias; Waters, Dean A; McLaughlin, Steve

    2010-02-01

    In this paper, techniques for time-frequency analysis and investigation of bat echolocation calls are studied. Particularly, enhanced resolution techniques are developed and/or used in this specific context for the first time. When compared to traditional time-frequency representation methods, the proposed techniques are more capable of showing previously unseen features in the structure of bat echolocation calls. It should be emphasized that although the study is focused on bat echolocation recordings, the results are more general and applicable to many other types of signal.
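
    The enhanced-resolution estimators developed in the paper are not reproduced here; the sketch below shows only the baseline time-frequency step (a spectrogram of a synthetic, loosely bat-like FM sweep) that such methods aim to improve on. The sampling rate and sweep parameters are invented.

```python
import numpy as np
from scipy import signal

fs = 250_000                                   # 250 kHz sampling, typical for bat work
t = np.arange(0, 0.01, 1 / fs)                 # a 10 ms synthetic call
# Downward frequency-modulated sweep, 80 kHz -> 30 kHz (loosely bat-like)
call = signal.chirp(t, f0=80_000, f1=30_000, t1=t[-1], method="linear")
call *= np.hanning(len(call))                  # simple amplitude envelope

f, tau, Sxx = signal.spectrogram(call, fs=fs, nperseg=256, noverlap=224)
peak_track = f[Sxx.argmax(axis=0)]             # dominant frequency per time frame
print(peak_track[:5])                          # sweep starts near 80 kHz
```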

  4. BPSK Demodulation Using Digital Signal Processing

    NASA Technical Reports Server (NTRS)

    Garcia, Thomas R.

    1996-01-01

    A digital communications signal is a sinusoidal waveform that is modified by a binary (digital) information signal. The sinusoidal waveform is called the carrier. The carrier may be modified in amplitude, frequency, phase, or a combination of these. In this project, a binary phase shift keyed (BPSK) signal is the communication signal. In a BPSK signal, the phase of the carrier is set to one of two states, 180 degrees apart, by a binary (i.e., 1 or 0) information signal. A digital signal is a sampled version of a "real world" time-continuous signal. The digital signal is generated by sampling the continuous signal at discrete points in time. The rate at which the signal is sampled is called the sampling rate (f_s). The device that performs this operation is called an analog-to-digital (A/D) converter or a digitizer. The digital signal is composed of the sequence of individual values of the sampled BPSK signal. Digital signal processing (DSP) is the modification of the digital signal by mathematical operations. A device that performs this processing is called a digital signal processor. After processing, the digital signal may then be converted back to an analog signal using a digital-to-analog (D/A) converter. The goal of this project is to develop a system that will recover the digital information from a BPSK signal using DSP techniques. The project is broken down into the following steps: (1) Development of the algorithms required to demodulate the BPSK signal; (2) Simulation of the system; and (3) Implementation of a BPSK receiver using digital signal processing hardware.
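
    A minimal sketch of the coherent demodulation idea outlined above is given below, assuming perfect carrier and bit timing recovery (which a real receiver must perform itself); the rates and bit pattern are arbitrary.

```python
import numpy as np

fs, fc, rb = 48_000, 6_000, 1_200          # sample rate, carrier, bit rate (Hz)
spb = fs // rb                             # samples per bit
bits = np.random.default_rng(3).integers(0, 2, 32)

# Modulator: phase 0 for '1', phase 180 degrees for '0'
t = np.arange(len(bits) * spb) / fs
symbols = np.repeat(2 * bits - 1, spb)     # +1 / -1
bpsk = symbols * np.cos(2 * np.pi * fc * t)

# Coherent demodulator: mix with a local carrier replica, then integrate-and-dump
mixed = bpsk * np.cos(2 * np.pi * fc * t)  # assumes perfect carrier/phase recovery
integrated = mixed.reshape(-1, spb).sum(axis=1)
recovered = (integrated > 0).astype(int)

print("bit errors:", int(np.sum(recovered != bits)))   # 0 for this clean signal
```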

  5. [Dream in the land of paradoxical sleep].

    PubMed

    Pire, E; Herman, G; Cambron, L; Maquet, P; Poirrier, R

    2008-01-01

    Paradoxical sleep (PS or REM sleep) is traditionally a matter for neurophysiology, a science of the brain. Dream is associated with neuropsychology and sciences of the mind. The relationships between sleep and dream are better understood in the light of new methodologies in both domains, particularly those of basic neuroscience, which elucidate the mechanisms underlying PS, and functional imaging techniques. Data from these approaches are placed here in the perspective of rather old clinical observations of human cerebral lesions and of the phylogeny of vertebrates, in order to support a theory of dream. Dreams may be seen as a living marker of a cognitivo-emotional process, called here the "eidetic process", involving posterior brain and limbic structures, which keeps running during wakefulness but is subjected, at that time, to the leading role of a cognitivo-rational process, called here the "thought process". The latter is of instrumental origin in human beings. It involves prefrontal cortices (executive tasks) and frontal/parietal cortices (attention) in the brain. Some clinical implications of the theory are illustrated.

  6. Modern Radar Techniques for Geophysical Applications: Two Examples

    NASA Technical Reports Server (NTRS)

    Arokiasamy, B. J.; Bianchi, C.; Sciacca, U.; Tutone, G.; Zirizzotti, A.; Zuccheretti, E.

    2005-01-01

    The last decade of the evolution of radar was heavily influenced by the rapid increase in information processing capabilities. Advances in solid-state HF radio devices, digital technology, computing architectures and software have allowed designers to develop very efficient radars. In designing modern radars, the emphasis goes toward simplification of the system hardware and reduction of overall power, which is compensated for by coding and real-time signal processing techniques. Radars are commonly employed in geophysical radio soundings such as ionospheric probing, stratosphere-mesosphere measurements, weather forecasting, GPR and radio-glaciology. In the Laboratorio di Geofisica Ambientale of the Istituto Nazionale di Geofisica e Vulcanologia (INGV), Rome, Italy, we developed two pulse compression radars. The first is an HF radar called AIS-INGV (Advanced Ionospheric Sounder), designed both for research and for the routine service of HF radio wave propagation forecasting. The second is a VHF radar called GLACIORADAR, which will substitute for the high-power envelope radar used by the Italian glaciological group. It will be employed in studying the subglacial structures of Antarctica, giving information about layering, the bedrock and subglacial lakes if present. These are low-power radars, which rely heavily on advanced hardware and powerful real-time signal processing. Additional information is included in the original extended abstract.
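
    The AIS-INGV and GLACIORADAR designs are not detailed in this record; the sketch below only illustrates the pulse-compression principle they rely on, correlating a received echo with a replica of the transmitted linear-FM chirp. All parameters are invented.

```python
import numpy as np
from scipy import signal

fs = 1_000_000                              # 1 MHz sampling
t = np.arange(0, 100e-6, 1 / fs)            # 100 us transmitted pulse
tx = signal.chirp(t, f0=0, f1=100_000, t1=t[-1])   # linear FM (chirp) pulse

# Simulated receive window: the chirp comes back delayed by 300 samples plus noise
rx = np.zeros(1000)
delay = 300
rx[delay:delay + len(tx)] += 0.2 * tx
rx += 0.05 * np.random.default_rng(0).normal(size=rx.size)

# Pulse compression = correlation with the transmitted replica (matched filter)
compressed = signal.correlate(rx, tx, mode="valid")
print("estimated delay (samples):", int(np.argmax(np.abs(compressed))))  # ~300
```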

  7. Automating "Word of Mouth" to Recommend Classes to Students: An Application of Social Information Filtering Algorithms

    ERIC Educational Resources Information Center

    Booker, Queen Esther

    2009-01-01

    An approach used to tackle the problem of helping online students find the classes they want and need is a filtering technique called "social information filtering," a general approach to personalized information filtering. Social information filtering essentially automates the process of "word-of-mouth" recommendations: items are recommended to a…
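
    As a rough illustration of social information filtering (not the system described in the article), a user-based collaborative filter over a tiny invented rating matrix might look like the following.

```python
import numpy as np

# Rows = students, columns = classes; entries are ratings (0 = not taken)
ratings = np.array([
    [5, 4, 0, 0, 1],
    [4, 5, 0, 1, 0],
    [0, 0, 5, 4, 0],
    [5, 0, 0, 0, 2],
], dtype=float)

def recommend(user, ratings, k=2):
    """Suggest an unrated class using ratings from the most similar users."""
    norms = np.linalg.norm(ratings, axis=1, keepdims=True)
    sims = (ratings @ ratings[user]) / (norms[:, 0] * norms[user, 0] + 1e-12)
    sims[user] = -1.0                               # exclude the user themselves
    neighbours = np.argsort(sims)[-k:]              # k most similar students
    predicted = ratings[neighbours].mean(axis=0)    # average neighbour ratings
    predicted[ratings[user] > 0] = -1.0             # only recommend unseen classes
    return int(np.argmax(predicted))

print("recommend class index:", recommend(0, ratings))
```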

  8. The Effects of Background Noise on Dichotic Listening to Consonant-Vowel Syllables

    ERIC Educational Resources Information Center

    Sequeira, Sarah Dos Santos; Specht, Karsten; Hamalainen, Heikki; Hugdahl, Kenneth

    2008-01-01

    Lateralization of verbal processing is frequently studied with the dichotic listening technique, yielding a so-called right ear advantage (REA) to consonant-vowel (CV) syllables. However, little is known about how background noise affects the REA. To address this issue, we presented CV-syllables either in silence or with traffic background noise…

  9. Claim audits: a relic of the indemnity age?

    PubMed

    Ellender, D E

    1997-09-01

    Traditional claim audits offering quick fixes to specific problems or to recover overpayments will not provide benefit managers with the data and action plan they need to make informed decisions about cost-effective benefit administration. Today's benefits environment calls for a comprehensive review of claim administration, incorporating traditional audit techniques into a quality improvement audit process.

  10. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  11. Particle dynamics during nanoparticle synthesis by laser ablation in a background gas

    NASA Astrophysics Data System (ADS)

    Nakata, Yoshiki; Muramoto, Junichi; Okada, Tatsuo; Maeda, Mitsuo

    2002-02-01

    Particle dynamics during Si nanoparticle synthesis in a laser-ablation plume in different background gases were investigated by laser-spectroscopic imaging techniques. Two-dimensional laser-induced fluorescence and ultraviolet Rayleigh scattering techniques were used to visualize the spatial distributions of the Si atoms and the grown nanoparticles, respectively. We have developed a visualization technique called re-decomposition laser-induced fluorescence to observe small nanoparticles (hereafter called clusters) which are difficult to observe by the conventional imaging techniques. In this article, the whole process of nanoparticle synthesis in different background gases of He, Ne, Ar, N2 and O2 was investigated by these techniques. In He, Ne, Ar and N2 background gases at 10 Torr, the clustering of the Si atoms started 200, 250, 300 and 800 μs after ablation, respectively. The growth rate of the clusters in He background gas was much larger than that in the other gases. The spatial distributions of the Si nanoparticles were mushroom-like in He, N2 and O2, and column-like in Ne and Ar. It is thought that the difference in distribution was caused by differences in the flow characteristics of the background gases, which would imply that the viscosity of the background gas is one of the main governing parameters.

  12. Modified Powder-in-Tube Technique Based on the Consolidation Processing of Powder Materials for Fabricating Specialty Optical Fibers

    PubMed Central

    Auguste, Jean-Louis; Humbert, Georges; Leparmentier, Stéphanie; Kudinova, Maryna; Martin, Pierre-Olivier; Delaizir, Gaëlle; Schuster, Kay; Litzkendorf, Doris

    2014-01-01

    The objective of this paper is to demonstrate the interest of a consolidation process associated with the powder-in-tube technique in order to fabricate a long length of specialty optical fibers. This so-called Modified Powder-in-Tube (MPIT) process is very flexible and paves the way to multimaterial optical fiber fabrications with different core and cladding glassy materials. Another feature of this technique lies in the sintering of the preform under reducing or oxidizing atmosphere. The fabrication of such optical fibers implies different constraints that we have to deal with, namely chemical species diffusion or mechanical stress due to the mismatches between thermal expansion coefficients and working temperatures of the fiber materials. This paper focuses on preliminary results obtained with a lanthano-aluminosilicate glass used as the core material for the fabrication of all-glass fibers or specialty Photonic Crystal Fibers (PCFs). To complete the panel of original microstructures now available by the MPIT technique, we also present several optical fibers in which metallic particles or microwires are included into a silica-based matrix. PMID:28788176

  13. Reflectometric measurement of plasma imaging and applications

    NASA Astrophysics Data System (ADS)

    Mase, A.; Ito, N.; Oda, M.; Komada, Y.; Nagae, D.; Zhang, D.; Kogi, Y.; Tobimatsu, S.; Maruyama, T.; Shimazu, H.; Sakata, E.; Sakai, F.; Kuwahara, D.; Yoshinaga, T.; Tokuzawa, T.; Nagayama, Y.; Kawahata, K.; Yamaguchi, S.; Tsuji-Iio, S.; Domier, C. W.; Luhmann, N. C., Jr.; Park, H. K.; Yun, G.; Lee, W.; Padhi, S.; Kim, K. W.

    2012-01-01

    Progress in microwave and millimeter-wave technologies has made advanced diagnostics possible in various fields, such as plasma diagnostics, radio astronomy, alien substance detection, airborne and spaceborne imaging radars (called synthetic aperture radars), and living-body measurements. Transmission, reflection, scattering, and radiation processes of electromagnetic waves are utilized as diagnostic tools. In this report we focus on reflectometric measurements and their applications to biological signals (vital signal detection and breast cancer detection) as well as plasma diagnostics, specifically by use of imaging and ultra-wideband radar techniques.

  14. Transmission ultrasonography. [time delay spectrometry for soft tissue transmission imaging

    NASA Technical Reports Server (NTRS)

    Heyser, R. C.; Le Croissette, D. H.

    1973-01-01

    Review of the results of the application of an advanced signal-processing technique, called time delay spectrometry, in obtaining soft tissue transmission images by transmission ultrasonography, both in vivo and in vitro. The presented results include amplitude ultrasound pictures and phase ultrasound pictures obtained by this technique. While amplitude ultrasonographs of tissue are closely analogous to X-ray pictures in that differential absorption is imaged, phase ultrasonographs represent an entirely new source of information based on differential time of propagation. Thus, a new source of information is made available for detailed analysis.

  15. NCTM of liquids at high temperatures using polarization techniques

    NASA Technical Reports Server (NTRS)

    Krishnan, Shankar; Weber, J. K. Richard; Nordine, Paul C.; Schiffman, Robert A.

    1990-01-01

    Temperature measurement and control are extremely important in any materials processing application. However, conventional techniques for non-contact temperature measurement (mainly optical pyrometry) are very uncertain because of unknown or varying surface emittance. Optical properties, like other properties, change during processing. A dynamic, in-situ measurement of optical properties, including the emittance, is required. Intersonics is developing new technologies that use polarized laser light scattering to determine the surface emittance of freely radiating bodies concurrently with conventional optical pyrometry. Together these are sufficient to determine the true surface temperature of the target. Intersonics is currently developing a system called DAPP, the Division of Amplitude Polarimetric Pyrometer, that uses polarization information to measure the true thermodynamic temperature of freely radiating objects. This instrument has potential use in materials processing applications in ground- and space-based equipment. Results of thermophysical and thermodynamic measurements using laser reflection as a temperature measuring tool are presented. The impact of these techniques on thermophysical property measurements at high temperature is discussed.

  16. [Techniques for rapid production of monoclonal antibodies for use with antibody technology].

    PubMed

    Kamada, Haruhiko

    2012-01-01

    A monoclonal antibody (Mab), due to its specific binding ability to a target protein, can potentially be one of the most useful tools for the functional analysis of proteins in recent proteomics-based research. However, the production of Mabs is a very time-consuming and laborious process (i.e., preparation of recombinant antigens, immunization of animals, preparation of hybridomas), making it the rate-limiting step in using Mabs in high-throughput proteomics research, which relies heavily on comprehensive and rapid methods. Therefore, there is a great demand for new methods to efficiently generate Mabs against a group of proteins identified by proteome analysis. Here, we describe a useful method called the "Antibody proteomic technique" for the rapid generation of Mabs to pharmaceutical targets, which were identified by proteomic analyses of disease samples (e.g., tumor tissue). We also introduce another method to find profitable targets on vasculature, which is called the "Vascular proteomic technique". Our results suggest that this method for the rapid generation of Mabs to proteins may be very useful in proteomics-based research as well as in clinical applications.

  17. Developments in signal processing and interpretation in laser tapping

    NASA Astrophysics Data System (ADS)

    Perton, M.; Neron, C.; Blouin, A.; Monchalin, J.-P.

    2013-01-01

    A novel technique called laser-tapping, based on thermoelastic excitation by laser as in laser-ultrasonics, has previously been introduced for inspecting honeycomb and foam core structures. If the top skin is delaminated or detached from the substrate, the detached layer is driven into vibration. The interpretation of the vibrations in terms of Lamb wave resonances is first discussed for a flat-bottom-hole configuration and then used to determine appropriate signal processing for samples such as honeycomb structures.

  18. CLAss-Specific Subspace Kernel Representations and Adaptive Margin Slack Minimization for Large Scale Classification.

    PubMed

    Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan

    2018-02-01

    In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing the classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm called adaptive margin slack minimization to iteratively improve the classification accuracy through adaptive data selection. We motivate each part separately and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: memory-efficient sequential processing for sequential data processing and parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed techniques.

  19. Technique and cue selection for graphical presentation of generic hyperdimensional data

    NASA Astrophysics Data System (ADS)

    Howard, Lee M.; Burton, Robert P.

    2013-12-01

    Several presentation techniques have been created for visualization of data with more than three variables. Packages have been written, each of which implements a subset of these techniques. However, these packages generally fail to provide all the features needed by the user during the visualization process. Further, packages generally limit support for presentation techniques to a few techniques. A new package called Petrichor accommodates all necessary and useful features together in one system. Any presentation technique may be added easily through an extensible plugin system. Features are supported by a user interface that allows easy interaction with data. Annotations allow users to mark up visualizations and share information with others. By providing a hyperdimensional graphics package that easily accommodates presentation techniques and includes a complete set of features, including those that are rarely or never supported elsewhere, the user is provided with a tool that facilitates improved interaction with multivariate data to extract and disseminate information.

  20. Fabrication of glass gas cells for the HALOE and MAPS satellite experiments

    NASA Technical Reports Server (NTRS)

    Sullivan, E. M.; Walthall, H. G.

    1984-01-01

    The Halogen Occultation Experiment (HALOE) and the Measurement of Air Pollution from Satellites (MAPS) experiment are satellite-borne experiments which measure trace constituents in the Earth's atmosphere. The instruments which obtain the data for these experiments are based on the gas filter correlation radiometer measurement technique. In this technique, small samples of the gases of interest are encapsulated in glass cylinders, called gas cells, which act as very selective optical filters. This report describes the techniques employed in the fabrication of the gas cells for the HALOE and MAPS instruments. Details of the method used to fuse the sapphire windows (required for IR transmission) to the glass cell bodies are presented along with detailed descriptions of the jigs and fixtures used during the assembly process. The techniques and equipment used for window inspection and for pairing the HALOE windows are discussed. Cell body materials and the steps involved in preparing the cell bodies for the glass-to-sapphire fusion process are given.

  1. Dynamics of the brain: Mathematical models and non-invasive experimental studies

    NASA Astrophysics Data System (ADS)

    Toronov, V.; Myllylä, T.; Kiviniemi, V.; Tuchin, V. V.

    2013-10-01

    Dynamics is an essential aspect of brain function. In this article we review theoretical models of neural and haemodynamic processes in the human brain and experimental non-invasive techniques developed to study brain functions and to measure dynamic characteristics, such as neurodynamics, neurovascular coupling, haemodynamic changes due to brain activity and autoregulation, and the cerebral metabolic rate of oxygen. We focus on emerging theoretical biophysical models and experimental functional neuroimaging results, obtained mostly by functional magnetic resonance imaging (fMRI) and near-infrared spectroscopy (NIRS). We also include our current results on the effects of blood pressure variations on cerebral haemodynamics and on simultaneous measurements of fast processes in the brain by near-infrared spectroscopy and a very novel functional MRI technique called magnetic resonance encephalography. Based on rapid progress in theoretical and experimental techniques, growing computational capacities, and the combined use of rapidly improving and emerging neuroimaging techniques, we anticipate great achievements in the overall knowledge of the human brain during the next decade.

  2. Documentation Driven Development for Complex Real-Time Systems

    DTIC Science & Technology

    2004-12-01

    This paper presents a novel approach for development of complex real-time systems, called the documentation-driven development (DDD) approach. This...time systems. DDD will also support automated software generation based on a computational model and some relevant techniques. DDD includes two main...stakeholders to be easily involved in development processes and, therefore, significantly improve the agility of software development for complex real

  3. Characterization and genome-wide association mapping of resistance to leaf rust, stem rust and stripe rust in a geographically diverse collection of spring wheat landraces

    USDA-ARS?s Scientific Manuscript database

    The challenge posed by rapidly changing wheat rust pathogens, both in virulence and in environmental adaptation, calls for the development and application of new techniques to accelerate the process of breeding for durable resistance. To expand the wheat resistance gene pool available for germplasm ...

  4. Highly Concurrent Scalar Processing.

    DTIC Science & Technology

    1986-01-01

    rearrangement arise from data dependencies between instructions, hence it is critical that artificial dependencies are eliminated whenever possible...An important class of artificial dependencies arises due to register reuse. In the following example, no parallelism can be exploited in the...specific procedure call site. The use of intelligent procedure expansion techniques is expected to be crucial to the achievement of high performance

  5. A Laboratory Course for Teaching Laboratory Techniques, Experimental Design, Statistical Analysis, and Peer Review Process to Undergraduate Science Students

    ERIC Educational Resources Information Center

    Gliddon, C. M.; Rosengren, R. J.

    2012-01-01

    This article describes a 13-week laboratory course called Human Toxicology taught at the University of Otago, New Zealand. This course used a guided inquiry based laboratory coupled with formative assessment and collaborative learning to develop in undergraduate students the skills of problem solving/critical thinking, data interpretation and…

  6. Convergence Properties of a Class of Probabilistic Adaptive Schemes Called Sequential Reproductive Plans. Psychology and Education Series, Technical Report No. 210.

    ERIC Educational Resources Information Center

    Martin, Nancy

    Presented is a technical report concerning the use of a mathematical model describing certain aspects of the duplication and selection processes in natural genetic adaptation. This reproductive plan/model occurs in artificial genetics (the use of ideas from genetics to develop general problem solving techniques for computers). The reproductive…

  7. Foucault, Counselling and the Aesthetics of Existence

    ERIC Educational Resources Information Center

    Peters, Michael A.

    2005-01-01

    Michel Foucault was drawn late in life to study the "arts of the self" in Greco-Roman culture as a basis, following Nietzsche, for what he called an "aesthetics of existence." By this, he meant a set of creative and experimental processes and techniques by which an individual turns him- or herself into a work of art. For Nietzsche, it was above…

  8. Dynamic Optimization

    NASA Technical Reports Server (NTRS)

    Laird, Philip

    1992-01-01

    We distinguish static and dynamic optimization of programs: whereas static optimization modifies a program before runtime and is based only on its syntactical structure, dynamic optimization is based on the statistical properties of the input source and examples of program execution. Explanation-based generalization is a commonly used dynamic optimization method, but its effectiveness as a speedup-learning method is limited, in part because it fails to separate the learning process from the program transformation process. This paper describes a dynamic optimization technique called a learn-optimize cycle that first uses a learning element to uncover predictable patterns in the program execution and then uses an optimization algorithm to map these patterns into beneficial transformations. The technique has been used successfully for dynamic optimization of pure Prolog.

  9. Standardized shrinking LORETA-FOCUSS (SSLOFO): a new algorithm for spatio-temporal EEG source reconstruction.

    PubMed

    Liu, Hesheng; Schimpf, Paul H; Dong, Guoya; Gao, Xiaorong; Yang, Fusheng; Gao, Shangkai

    2005-10-01

    This paper presents a new algorithm called Standardized Shrinking LORETA-FOCUSS (SSLOFO) for solving the electroencephalogram (EEG) inverse problem. Multiple techniques are combined in a single procedure to robustly reconstruct the underlying source distribution with high spatial resolution. This algorithm uses a recursive process which takes the smooth estimate of sLORETA as initialization and then employs the re-weighted minimum norm introduced by FOCUSS. An important technique called standardization is involved in the recursive process to enhance the localization ability. The algorithm is further improved by automatically adjusting the source space according to the estimate of the previous step, and by the inclusion of temporal information. Simulation studies are carried out on both spherical and realistic head models. The algorithm achieves very good localization ability on noise-free data. It is capable of recovering complex source configurations with arbitrary shapes and can produce high quality images of extended source distributions. We also characterized the performance with noisy data in a realistic head model. An important feature of this algorithm is that the temporal waveforms are clearly reconstructed, even for closely spaced sources. This provides a convenient way to estimate neural dynamics directly from the cortical sources.
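
    SSLOFO's standardization and source-space shrinking steps are not reproduced here; the sketch below shows only the FOCUSS-style re-weighted minimum-norm core that the algorithm builds on, applied to an invented toy lead-field.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inverse problem: 8 sensors, 40 candidate sources, 2 truly active
A = rng.normal(size=(8, 40))                 # lead-field / gain matrix
x_true = np.zeros(40)
x_true[[5, 22]] = [1.0, -0.8]
y = A @ x_true

def focuss(A, y, iterations=15):
    """Re-weighted minimum norm (FOCUSS-style); SSLOFO adds standardization
    and shrinking of the source space on top of this core loop."""
    x = np.linalg.pinv(A) @ y                # smooth minimum-norm initialisation
    for _ in range(iterations):
        W = np.diag(np.abs(x))               # weight by the previous estimate
        x = W @ np.linalg.pinv(A @ W) @ y    # weighted minimum-norm update
    return x

x_hat = focuss(A, y)
print(np.flatnonzero(np.abs(x_hat) > 0.1))   # should concentrate on sources 5, 22
```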

  10. Validation of nonlinear interferometric vibrational imaging as a molecular OCT technique by the use of Raman microscopy

    NASA Astrophysics Data System (ADS)

    Benalcazar, Wladimir A.; Jiang, Zhi; Marks, Daniel L.; Geddes, Joseph B.; Boppart, Stephen A.

    2009-02-01

    We validate a molecular imaging technique called Nonlinear Interferometric Vibrational Imaging (NIVI) by comparing vibrational spectra with those acquired from Raman microscopy. This broadband coherent anti-Stokes Raman scattering (CARS) technique uses heterodyne detection and OCT acquisition and design principles to interfere a CARS signal generated by a sample with a local oscillator signal generated separately by a four-wave mixing process. These are mixed and demodulated by spectral interferometry. Its confocal configuration allows the acquisition of 3D images based on endogenous molecular signatures. Images from both phantom and mammary tissues have been acquired by this instrument and its spectrum is compared with its spontaneous Raman signatures.

  11. Closed loop cavitation control - A step towards sonomechatronics.

    PubMed

    Saalbach, Kai-Alexander; Ohrdes, Hendrik; Twiefel, Jens

    2018-06-01

    In the field of sonochemistry, many processes are made possible by the generation of cavitation. This article is about closed-loop control of ultrasound-assisted processes with the aim of controlling the intensity of cavitation-based sonochemical processes. This is the basis for a new research field which the authors call "sonomechatronics". In order to apply closed-loop control, a so-called self-sensing technique is applied, which uses the ultrasound transducer's electrical signals to gain information about cavitation activity. Experiments are conducted to find out if this self-sensing technique is capable of determining the state and intensity of acoustic cavitation. A distinct frequency component in the transducer's current signal is found to be a good indicator for the onset and termination of transient cavitation. Measurements show that, depending on the boundary conditions, the onset and termination of transient cavitation occur at different thresholds, with the onset occurring at a higher value in most cases. This known hysteresis effect offers the additional possibility of achieving an energetic optimization by controlling cavitation generation. Using the cavitation indicator for the implementation of a double set-point closed-loop control, the mean driving current was reduced by approximately 15% compared to the value needed to exceed the transient cavitation threshold. The results presented show great potential for the field of sonomechatronics. Nevertheless, further investigations are necessary in order to design application-specific sonomechatronic processes.
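
    The authors' controller is not specified in the abstract; the following sketch merely illustrates a double set-point (hysteresis) update rule of the kind described, with invented thresholds, step size and plant response.

```python
def double_setpoint_control(indicator, current, lower=0.2, upper=0.5,
                            step=0.05, i_min=0.5, i_max=3.0):
    """One update of a double set-point (hysteresis) loop for the drive current.

    indicator : cavitation-activity measure extracted from the transducer current
    current   : present drive-current amplitude (A)
    The current is raised while the indicator sits below `lower` (cavitation
    lost) and lowered again once it exceeds `upper` (well past onset),
    exploiting the onset/termination hysteresis described in the abstract.
    """
    if indicator < lower:
        current = min(current + step, i_max)   # cavitation lost: push harder
    elif indicator > upper:
        current = max(current - step, i_min)   # well above onset: back off to save energy
    return current

# Simulated loop with a crude, purely illustrative plant model
current, history = 1.0, []
for k in range(40):
    indicator = 0.0 if current < 1.2 else (current - 1.0)   # fake cavitation response
    current = double_setpoint_control(indicator, current)
    history.append(round(current, 2))
print(history[-5:])   # settles at a reduced current between the two set points
```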

  12. Microencapsulation of nanoemulsions: novel Trojan particles for bioactive lipid molecule delivery

    PubMed Central

    Li, Xiang; Anton, Nicolas; Ta, Thi Minh Chau; Zhao, Minjie; Messaddeq, Nadia; Vandamme, Thierry F

    2011-01-01

    Background Nanoemulsions consist of very stable nanodroplets of oil dispersed in an aqueous phase, typically below 300 nm in size. They can be used to obtain a very fine, homogeneous dispersion of lipophilic compounds in water, thus facilitating their handling and use in nanomedicine. However, the drawback is that they are suspended in an aqueous medium. This study proposes a novel technique for drying lipid nanoemulsion suspensions to create so-called Trojan particles, ie, polymer microparticles (around 2 μm) which very homogeneously “entrap” the nano-oil droplets (around 150 nm) in their core. Methods Microencapsulation of the nanoemulsions was performed using a spray-drying process and resulted in a dried powder of microparticles. By using a low-energy nanoemulsification method and relatively gentle spray-drying, the process was well suited to sensitive molecules. The model lipophilic molecule tested was vitamin E acetate, encapsulated at around 20% in dried powder. Results We showed that the presence of nanoemulsions in solution before spray-drying had a significant impact on microparticle size, distribution, and morphology. However, the process itself did not destroy the oil nanodroplets, which could easily be redispersed when the powder was put back in contact with water. High-performance liquid chromatography follow-up of the integrity of the vitamin E acetate showed that the molecules were intact throughout the process, as well as when conserved in their dried form. Conclusion This study proposes a novel technique using a spray-drying process to microencapsulate nanoemulsions. The multiscale objects formed, so-called Trojan microparticles, were shown to successfully encapsulate, protect, and release the lipid nanodroplets. PMID:21760727

  13. Microencapsulation of nanoemulsions: novel Trojan particles for bioactive lipid molecule delivery.

    PubMed

    Li, Xiang; Anton, Nicolas; Ta, Thi Minh Chau; Zhao, Minjie; Messaddeq, Nadia; Vandamme, Thierry F

    2011-01-01

    Nanoemulsions consist of very stable nanodroplets of oil dispersed in an aqueous phase, typically below 300 nm in size. They can be used to obtain a very fine, homogeneous dispersion of lipophilic compounds in water, thus facilitating their handling and use in nanomedicine. However, the drawback is that they are suspended in an aqueous medium. This study proposes a novel technique for drying lipid nanoemulsion suspensions to create so-called Trojan particles, ie, polymer microparticles (around 2 μm) which very homogeneously "entrap" the nano-oil droplets (around 150 nm) in their core. Microencapsulation of the nanoemulsions was performed using a spray-drying process and resulted in a dried powder of microparticles. By using a low-energy nanoemulsification method and relatively gentle spray-drying, the process was well suited to sensitive molecules. The model lipophilic molecule tested was vitamin E acetate, encapsulated at around 20% in dried powder. We showed that the presence of nanoemulsions in solution before spray-drying had a significant impact on microparticle size, distribution, and morphology. However, the process itself did not destroy the oil nanodroplets, which could easily be redispersed when the powder was put back in contact with water. High-performance liquid chromatography follow-up of the integrity of the vitamin E acetate showed that the molecules were intact throughout the process, as well as when conserved in their dried form. This study proposes a novel technique using a spray-drying process to microencapsulate nanoemulsions. The multiscale objects formed, so-called Trojan microparticles, were shown to successfully encapsulate, protect, and release the lipid nanodroplets.

  14. Laser induced photoluminescence studies of primary photochemical production processes of cometary radicals

    NASA Technical Reports Server (NTRS)

    Jackson, W. M.

    1977-01-01

    A tunable vacuum ultraviolet flash lamp was constructed. This unique flash lamp was coupled with a tunable dye laser detector and permits the experimenter to measure the production rates of ground state radicals as a function of wavelength. A new technique for producing fluorescent radicals was discovered. This technique called multiphoton ultraviolet photodissociation is currently being applied to several problems of both cometary and stratospheric interest. It was demonstrated that NO2 will dissociate to produce an excited fragment and the radiation can possibly be used for remote detection of this species.

  15. Decision e Informacion en Solucion de Problemas. Publicacion No. 77 (Information and Decision Making in Problem Solving. Publication No. 77).

    ERIC Educational Resources Information Center

    Rimoldi, Horacio J. A.; And Others

    A technique using information and decision-making theories to evaluate problem solving tactics is presented. In problem solving, the process of solution is evaluated by investigating the questions that the subject doing the problem solving asks. The sequence of questions asked is called a tactic. It is assumed that: (1) tactics are the observable…

  16. Coherent-Phase Monitoring Of Cavitation In Turbomachines

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1996-01-01

    Digital electronic signal-processing system analyzes outputs of accelerometers mounted on turbomachine to detect vibrations characteristic of cavitation. Designed to overcome limitation imposed by interference from discrete components. System digitally implements technique called "coherent-phase wide-band demodulation" (CPWBD), using phase-only (PO) filtering along with envelope detection to search for unique coherent-phase relationship associated with cavitation and to minimize influence of large-amplitude discrete components.
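
    The brief does not give implementation details, but the combination of phase-only filtering and envelope detection it mentions can be sketched as follows; the function below is a generic NumPy/SciPy illustration under assumed conventions, not the CPWBD algorithm itself.

      import numpy as np
      from scipy.signal import hilbert

      def phase_only_envelope(x):
          """Phase-only filtering followed by envelope detection (sketch).

          The spectrum is whitened (unit magnitude, original phase), which
          de-emphasizes large discrete components relative to broadband bursts;
          the Hilbert envelope of the result is then available for
          coherent-phase / cavitation analysis.
          """
          X = np.fft.fft(x)
          po = np.fft.ifft(np.exp(1j * np.angle(X))).real   # phase-only signal
          return np.abs(hilbert(po))                        # envelope of PO signal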

  17. Resonance Energy Transfer-Based Molecular Switch Designed Using a Systematic Design Process Based on Monte Carlo Methods and Markov Chains

    NASA Astrophysics Data System (ADS)

    Rallapalli, Arjun

    A RET network consists of a network of photo-active molecules called chromophores that can participate in inter-molecular energy transfer called resonance energy transfer (RET). RET networks are used in a variety of applications including cryptographic devices, storage systems, light harvesting complexes, biological sensors, and molecular rulers. In this dissertation, we focus on creating a RET device called closed-diffusive exciton valve (C-DEV) in which the input to output transfer function is controlled by an external energy source, similar to a semiconductor transistor like the MOSFET. Due to their biocompatibility, molecular devices like the C-DEVs can be used to introduce computing power in biological, organic, and aqueous environments such as living cells. Furthermore, the underlying physics in RET devices are stochastic in nature, making them suitable for stochastic computing in which true random distribution generation is critical. In order to determine a valid configuration of chromophores for the C-DEV, we developed a systematic process based on user-guided design space pruning techniques and built-in simulation tools. We show that our C-DEV is 15x better than C-DEVs designed using ad hoc methods that rely on limited data from prior experiments. We also show ways in which the C-DEV can be improved further and how different varieties of C-DEVs can be combined to form more complex logic circuits. Moreover, the systematic design process can be used to search for valid chromophore network configurations for a variety of RET applications. We also describe a feasibility study for a technique used to control the orientation of chromophores attached to DNA. Being able to control the orientation can expand the design space for RET networks because it provides another parameter to tune their collective behavior. While results showed limited control over orientation, the analysis required the development of a mathematical model that can be used to determine the distribution of dipoles in a given sample of chromophore constructs. The model can be used to evaluate the feasibility of other potential orientation control techniques.

  18. A deterministic compressive sensing model for bat biosonar.

    PubMed

    Hague, David A; Buck, John R; Bilik, Igal

    2012-12-01

    The big brown bat (Eptesicus fuscus) uses frequency modulated (FM) echolocation calls to accurately estimate range and resolve closely spaced objects in clutter and noise. They resolve glints spaced down to 2 μs in time delay, which surpasses what traditional signal processing techniques can achieve using the same echolocation call. The Matched Filter (MF) attains 10-12 μs resolution while the Inverse Filter (IF) achieves higher resolution at the cost of significantly degraded detection performance. Recent work by Fontaine and Peremans [J. Acoust. Soc. Am. 125, 3052-3059 (2009)] demonstrated that a sparse representation of bat echolocation calls coupled with a decimating sensing method facilitates distinguishing closely spaced objects over realistic SNRs. Their work raises the intriguing question of whether sensing approaches structured more like a mammalian auditory system contain the necessary information for the hyper-resolution observed in behavioral tests. This research estimates sparse echo signatures using a gammatone filterbank decimation sensing method which loosely models the processing of the bat's auditory system. The decimated filterbank outputs are processed with ℓ1 minimization. Simulations demonstrate that this model maintains higher resolution than the MF and significantly better detection performance than the IF for SNRs of 5-45 dB while undersampling the return signal by a factor of six.
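
    For readers unfamiliar with the recovery step, the following is a minimal ISTA sketch of the generic l1-regularized least-squares problem that such l1 minimization usually refers to; the sensing matrix A (standing in here for the decimated gammatone filterbank response) and the measurement vector b are hypothetical placeholders.

      import numpy as np

      def ista(A, b, lam, n_iter=200):
          """Sketch: solve min_x 0.5*||A x - b||^2 + lam*||x||_1 via ISTA."""
          x = np.zeros(A.shape[1])
          step = 1.0 / np.linalg.norm(A, 2) ** 2           # 1 / Lipschitz constant of the gradient
          for _ in range(n_iter):
              z = x - step * (A.T @ (A @ x - b))           # gradient step on the quadratic term
              x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft thresholding
          return x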

  19. Large-deviation properties of Brownian motion with dry friction.

    PubMed

    Chen, Yaming; Just, Wolfram

    2014-10-01

    We investigate piecewise-linear stochastic models with regard to the probability distribution of functionals of the stochastic processes, a question that occurs frequently in large deviation theory. The functionals that we are looking into in detail are related to the time a stochastic process spends at a phase space point or in a phase space region, as well as to the motion with inertia. For a Langevin equation with discontinuous drift, we extend the so-called backward Fokker-Planck technique for non-negative support functionals to arbitrary support functionals, to derive explicit expressions for the moments of the functional. Explicit solutions for the moments and for the distribution of the so-called local time, the occupation time, and the displacement are derived for the Brownian motion with dry friction, including quantitative measures to characterize deviation from Gaussian behavior in the asymptotic long time limit.
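
    For orientation, the setting the abstract refers to is commonly written as below; the notation (dry-friction strength Delta, noise strength D) is assumed here rather than taken from the paper.

      \dot v(t) = -\Delta\,\mathrm{sgn}\!\big(v(t)\big) + \xi(t),
      \qquad \langle \xi(t)\,\xi(t') \rangle = 2D\,\delta(t-t'),

      T_{\mathrm{loc}} = \int_0^t \delta\big(v(t')\big)\,dt' \;\; \text{(local time at } v=0\text{)},
      \qquad
      T_{\mathrm{occ}} = \int_0^t \Theta\big(v(t')\big)\,dt' \;\; \text{(occupation time of } v>0\text{)},
      \qquad
      x(t) = \int_0^t v(t')\,dt' \;\; \text{(displacement)}.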

  20. Virtual embryology: a 3D library reconstructed from human embryo sections and animation of development process.

    PubMed

    Komori, M; Miura, T; Shiota, K; Minato, K; Takahashi, T

    1995-01-01

    The volumetric shape of a human embryo and its development are hard to comprehend when viewed as 2D schematics in a textbook or as microscopic sectional images. In this paper, a CAI and research support system for human embryology using multimedia presentation techniques is described. In this system, 3D data are acquired from a series of sliced specimens. The 3D structure can be viewed interactively by rotating, extracting, and truncating the whole body or an individual organ. Moreover, the development process of embryos can be animated using a morphing technique applied to specimens at several stages. The system is intended to be used interactively, like a virtual reality system; hence, the system is called Virtual Embryology.

  1. Coherent diffractive imaging of time-evolving samples with improved temporal resolution

    DOE PAGES

    Ulvestad, A.; Tripathi, A.; Hruszkewycz, S. O.; ...

    2016-05-19

    Bragg coherent x-ray diffractive imaging is a powerful technique for investigating dynamic nanoscale processes in nanoparticles immersed in reactive, realistic environments. Its temporal resolution is limited, however, by the oversampling requirements of three-dimensional phase retrieval. Here, we show that incorporating the entire measurement time series, which is typically a continuous physical process, into phase retrieval allows the oversampling requirement at each time step to be reduced, leading to a subsequent improvement in the temporal resolution by a factor of 2-20 times. The increased time resolution will allow imaging of faster dynamics and of radiation-dose-sensitive samples. Furthermore, this approach, which we call "chrono CDI," may find use in improving the time resolution in other imaging techniques.

  2. Homogenization of CZ Si wafers by Tabula Rasa annealing

    NASA Astrophysics Data System (ADS)

    Meduňa, M.; Caha, O.; Kuběna, J.; Kuběna, A.; Buršík, J.

    2009-12-01

    The precipitation of interstitial oxygen in Czochralski-grown silicon has been investigated by infrared absorption spectroscopy, chemical etching, transmission electron microscopy and X-ray diffraction after application of a homogenization annealing process called Tabula Rasa. The influence of this homogenization step, which consists of short-time annealing at high temperature, has been observed for various temperatures and times. The experimental results on the interstitial oxygen decay in Si wafers and the absorption spectra of SiOx precipitates during precipitation annealing at 1000 °C were compared across techniques for various Tabula Rasa temperatures. The differences in oxygen precipitation, precipitate morphology and point-defect evolution between samples with and without Tabula Rasa applied are evident from all the experimental techniques used. The results qualitatively correlate with the prediction of the homogenization annealing process based on classical nucleation theory.

  3. Magnetic vortex nucleation modes in static magnetic fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vanatka, Marek; Urbanek, Michal; Jira, Roman

    The magnetic vortex nucleation process in nanometer- and micrometer-sized magnetic disks undergoes several phases with distinct spin configurations called the nucleation states. Before formation of the final vortex state, small submicron disks typically proceed through the so-called C-state while the larger micron-sized disks proceed through the more complicated vortex-pair state or the buckling state. This work classifies the nucleation states using micromagnetic simulations and provides evidence for the stability of vortex-pair and buckling states in static magnetic fields using magnetic imaging techniques and electrical transport measurements. Lorentz Transmission Electron Microscopy and Magnetic Transmission X-ray Microscopy are employed to reveal the details of spin configuration in each of the nucleation states. We further show that it is possible to unambiguously identify these states by electrical measurements via the anisotropic magnetoresistance effect. Combination of the electrical transport and magnetic imaging techniques confirms stability of a vortex-antivortex-vortex spin configuration which emerges from the buckling state in static magnetic fields.

  4. Magnetic vortex nucleation modes in static magnetic fields

    DOE PAGES

    Vanatka, Marek; Urbanek, Michal; Jira, Roman; ...

    2017-10-03

    The magnetic vortex nucleation process in nanometer- and micrometer-sized magnetic disks undergoes several phases with distinct spin configurations called the nucleation states. Before formation of the final vortex state, small submicron disks typically proceed through the so-called C-state while the larger micron-sized disks proceed through the more complicated vortex-pair state or the buckling state. This work classifies the nucleation states using micromagnetic simulations and provides evidence for the stability of vortex-pair and buckling states in static magnetic fields using magnetic imaging techniques and electrical transport measurements. Lorentz Transmission Electron Microscopy and Magnetic Transmission X-ray Microscopy are employed to reveal the details of spin configuration in each of the nucleation states. We further show that it is possible to unambiguously identify these states by electrical measurements via the anisotropic magnetoresistance effect. Combination of the electrical transport and magnetic imaging techniques confirms stability of a vortex-antivortex-vortex spin configuration which emerges from the buckling state in static magnetic fields.

  5. Rapid Prototyping of Continuous Fiber Reinforced Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, R.; Green, C.; Phillips, T.; Cipriani, R.; Yarlagadda, S.; Gillespie, J.; Effinger, M.; Cooper, K. C.; Gordon, Gail (Technical Monitor)

    2002-01-01

    For ceramics to be used as structural components in high temperature applications, their fracture toughness is improved by embedding continuous ceramic fibers. Ceramic matrix composite (CMC) materials allow increasing the overall operating temperature, raising the temperature safety margins, avoiding the need for cooling, and improving the damping capacity, while reducing the weight at the same time. They also need to be reliable and available in large quantities. In this paper, an innovative rapid prototyping technique to fabricate continuous fiber reinforced ceramic matrix composites is described. The process is simple, robust and will be widely applicable to a number of high temperature material systems. This technique was originally developed at the University of Delaware Center for Composite Materials (UD-CCM) for rapid fabrication of polymer matrix composites by a technique called automated tow placement or ATP. The results of mechanical property and microstructural characterization are presented, together with examples of complex shapes and parts. It is believed that the process will be able to create complex shaped parts at an order of magnitude lower cost than current CVI and PIP processes.

  6. Optical surface analysis: a new technique for the inspection and metrology of optoelectronic films and wafers

    NASA Astrophysics Data System (ADS)

    Bechtler, Laurie; Velidandla, Vamsi

    2003-04-01

    In response to demand for higher volumes and greater product capability, integrated optoelectronic device processing is rapidly increasing in complexity, benefiting from techniques developed for conventional silicon integrated circuit processing. The needs for high product yield and low manufacturing cost are also similar to the silicon wafer processing industry. This paper discusses the design and use of an automated inspection instrument called the Optical Surface Analyzer (OSA) to evaluate two critical production issues in optoelectronic device manufacturing: (1) film thickness uniformity, and (2) defectivity at various process steps. The OSA measurement instrument is better suited to photonics process development than most equipment developed for conventional silicon wafer processing in two important ways: it can handle both transparent and opaque substrates (unlike most inspection and metrology tools), and it is a full-wafer inspection method that captures defects and film variations over the entire substrate surface (unlike most film thickness measurement tools). Measurement examples will be provided in the paper for a variety of films and substrates used for optoelectronics manufacturing.

  7. The CSSIAR v.1.00 Software: A new tool based on SIAR to assess soil redistribution using Compound Specific Stable Isotopes

    NASA Astrophysics Data System (ADS)

    Sergio, de los Santos-Villalobos; Claudio, Bravo-Linares; dos Anjos Roberto, Meigikos; Renan, Cardoso; Max, Gibbs; Andrew, Swales; Lionel, Mabit; Gerd, Dercon

    Soil erosion is one of the biggest challenges for food production around the world. Many techniques have been used to evaluate and mitigate soil degradation. Nowadays, isotopic techniques are becoming a powerful tool to assess soil apportionment. One of the innovative techniques used is Compound Specific Stable Isotope (CSSI) analysis, which has been used to track sediments and identify their sources by the isotopic signature of δ13C in specific fatty acids. The application of this technique to soil apportionment has been developed recently; however, there is a lack of user-friendly software for data processing and interpretation. The aim of this article is to introduce a new open-source tool, called the CSSIAR v.1.00 Software, for working with data sets generated by use of the CSSI technique to assess soil apportionment.

  8. Estimation of Dynamical Parameters in Atmospheric Data Sets

    NASA Technical Reports Server (NTRS)

    Wenig, Mark O.

    2004-01-01

    In this study a new technique is used to derive dynamical parameters from atmospheric data sets. This technique, called the structure tensor technique, can be used to estimate dynamical parameters such as motion, source strengths, diffusion constants or exponential decay rates. A general mathematical framework was developed for the direct estimation of the physical parameters that govern the underlying processes from image sequences. This estimation technique can be adapted to the specific physical problem under investigation, so it can be used in a variety of applications in trace gas, aerosol, and cloud remote sensing. The fundamental algorithm will be extended to the analysis of multi-channel (e.g., multiple trace gas) image sequences and to provide solutions to the extended aperture problem. In this study, sensitivity studies have been performed to determine the usability of this technique for data sets with different resolutions in time and space and different dimensions.
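
    As an illustration of the basic structure-tensor idea (not the extended multi-channel formulation developed in the study), the sketch below estimates per-pixel motion from a spatio-temporal image sequence; the array names and neighborhood size are hypothetical.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def structure_tensor_motion(seq, patch=5):
          """Estimate per-pixel velocity (vx, vy) from a sequence seq of shape (t, y, x)."""
          It, Iy, Ix = np.gradient(seq.astype(float))        # spatio-temporal gradients
          J = {}
          for a, ga in (("x", Ix), ("y", Iy), ("t", It)):
              for b, gb in (("x", Ix), ("y", Iy), ("t", It)):
                  J[a + b] = uniform_filter(ga * gb, size=patch)  # locally averaged tensor entries
          det = J["xx"] * J["yy"] - J["xy"] ** 2
          det = np.where(np.abs(det) < 1e-12, np.nan, det)   # guard ill-conditioned pixels
          vx = (-J["yy"] * J["xt"] + J["xy"] * J["yt"]) / det
          vy = ( J["xy"] * J["xt"] - J["xx"] * J["yt"]) / det
          return vx, vy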

  9. Automatic welding detection by an intelligent tool pipe inspection

    NASA Astrophysics Data System (ADS)

    Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.

    2015-07-01

    This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and its performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
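
    The abstract does not list the exact toolchain, so the following scikit-learn sketch only illustrates the general shape of such a pipeline (scaling, attribute selection, an SVM classifier, and cross-validated ROC scoring); the feature matrix X and label vector y are hypothetical.

      from sklearn.pipeline import Pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      def weld_detector_scores(X, y):
          """X: (n_windows, n_features) features from de-noised inspection signals;
          y: (n_windows,) labels, 1 = weld, 0 = no weld (hypothetical arrays)."""
          model = Pipeline([
              ("scale", StandardScaler()),
              ("select", SelectKBest(f_classif, k=min(20, X.shape[1]))),
              ("svm", SVC(kernel="rbf", probability=True)),
          ])
          # 10-fold cross-validated ROC AUC, analogous to the validation described
          return cross_val_score(model, X, y, cv=10, scoring="roc_auc")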

  10. Direct evaluation of fault trees using object-oriented programming techniques

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1989-01-01

    Object-oriented programming techniques are used in an algorithm for the direct evaluation of fault trees. The algorithm combines a simple bottom-up procedure for trees without repeated events with a top-down recursive procedure for trees with repeated events. The object-oriented approach results in a dynamic modularization of the tree at each step in the reduction process. The algorithm reduces the number of recursive calls required to solve trees with repeated events and calculates intermediate results as well as the solution of the top event. The intermediate results can be reused if part of the tree is modified. An example is presented in which the results of the algorithm implemented with conventional techniques are compared to those of the object-oriented approach.
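
    A minimal object-oriented sketch of the bottom-up part of such an evaluation (valid when no event is repeated, so that gate inputs are independent) is shown below; the class names and probabilities are illustrative only, and the top-down treatment of repeated events is not included.

      class BasicEvent:
          def __init__(self, p):
              self.p = p
          def probability(self):
              return self.p

      class AndGate:
          def __init__(self, *children):
              self.children = children
          def probability(self):
              prob = 1.0
              for c in self.children:          # product of child probabilities
                  prob *= c.probability()
              return prob

      class OrGate:
          def __init__(self, *children):
              self.children = children
          def probability(self):
              prob = 1.0
              for c in self.children:          # 1 - product of complements
                  prob *= 1.0 - c.probability()
              return 1.0 - prob

      # Example: TOP = (A AND B) OR C, with illustrative probabilities
      top = OrGate(AndGate(BasicEvent(0.1), BasicEvent(0.2)), BasicEvent(0.05))
      print(top.probability())                 # approximately 0.069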

  11. Investigation on the use of optimization techniques for helicopter airframe vibrations design studies

    NASA Technical Reports Server (NTRS)

    Sreekanta Murthy, T.

    1992-01-01

    Results of the investigation of formal nonlinear programming-based numerical optimization techniques for helicopter airframe vibration reduction are summarized. The objective and constraint functions and the sensitivity expressions used in the formulation of airframe vibration optimization problems are presented and discussed. Implementation of a new computational procedure based on MSC/NASTRAN and CONMIN in a computer program system called DYNOPT for optimizing airframes subject to strength, frequency, dynamic response, and dynamic stress constraints is described. An optimization methodology is proposed which is thought to provide a new way of applying formal optimization techniques during the various phases of the airframe design process. Numerical results obtained from the application of the DYNOPT optimization code to a helicopter airframe are discussed.

  12. Vision Trainer Teaches Focusing Techniques at Home

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Based on work Stanford Research Institute did for Ames Research Center, Joseph Trachtman developed a vision trainer to treat visual focusing problems in the 1980s. In 2014, Trachtman, operating out of Seattle, released a home version of the device called the Zone-Trac. The inventor has found that the biofeedback process used by the technology induces an alpha-wave brain state, causing increased hand-eye coordination and reaction times, among other effects.

  13. Genetic Algorithms and Their Application to the Protein Folding Problem

    DTIC Science & Technology

    1993-12-01

    and symbolic methods, random methods such as Monte Carlo simulation and simulated annealing, distance geometry, and molecular dynamics. Many of these...calculated energies with those obtained using the molecular simulation software package called CHARMm. 9) Test both the simple and parallel simple genetic...homology-based, and simplification techniques. 3.21 Molecular Dynamics. Perhaps the most natural approach is to actually simulate the folding process. This

  14. Automating the Air Force Retail-Level Equipment Management Process: An Application of Microcomputer-Based Information Systems Techniques

    DTIC Science & Technology

    1988-09-01

    could use the assistance of a microcomputer-based management information system. However, adequate system design and development requires an in-depth...understanding of the Equipment Management Section and the environment in which it functions were asked and answered. Then, a management information system was...designed, developed, and tested. The management information system is called the Equipment Management Information System (EMIS).

  15. A High Performance SOAP Engine for Grid Computing

    NASA Astrophysics Data System (ADS)

    Wang, Ning; Welzl, Michael; Zhang, Liang

    Web Service technology still has many defects that make its usage for Grid computing problematic, most notably the low performance of the SOAP engine. In this paper, we develop a novel SOAP engine called SOAPExpress, which adopts two key techniques for improving processing performance: SCTP data transport and dynamic early binding based data mapping. Experimental results show a significant and consistent performance improvement of SOAPExpress over Apache Axis.

  16. Fingerprint image enhancement by differential hysteresis processing.

    PubMed

    Blotta, Eduardo; Moler, Emilce

    2004-05-10

    A new method to enhance defective fingerprint images through digital image processing tools is presented in this work. When the fingerprints have been taken without any care, blurred and in some cases mostly illegible, as in the case presented here, their classification and comparison become nearly impossible. A combination of spatial domain filters, including a technique called differential hysteresis processing (DHP), is applied to improve these kinds of images. This set of filtering methods proved to be satisfactory in a wide range of cases by uncovering hidden details that helped to identify persons. Dactyloscopy experts from Policia Federal Argentina and the EAAF have validated these results.

  17. The productive techniques and constitutive effects of 'evidence-based policy' and 'consumer participation' discourses in health policy processes.

    PubMed

    Lancaster, K; Seear, K; Treloar, C; Ritter, A

    2017-03-01

    For over twenty years there have been calls for greater 'consumer' participation in health decision-making. While it is recognised by governments and other stakeholders that 'consumer' participation is desirable, barriers to meaningful involvement nonetheless remain. It has been suggested that the reifying of 'evidence-based policy' may be limiting opportunities for participation, through the way this discourse legitimates particular voices to the exclusion of others. Others have suggested that assumptions underpinning the very notion of the 'affected community' or 'consumers' as fixed and bounded 'policy publics' need to be problematised. In this paper, drawing on interviews (n = 41) with individuals closely involved in Australian drug policy discussions, we critically interrogate the productive techniques and constitutive effects of 'evidence-based policy' and 'consumer participation' discourses in the context of drug policy processes. To inform our analysis, we draw on and combine a number of critical perspectives including Foucault's concept of subjugated knowledges, the work of feminist theorists, as well as recent work regarding conceptualisations of emergent policy publics. First, we explore how the subject position of 'consumer' might be seen as enacted in the material-discursive practices of 'evidence-based policy' and 'consumer participation' in drug policy processes. Secondly, we consider the centralising power-effects of the dominant 'evidence-based policy' paradigm, and how resistance may be thought about in this context. We suggest that such interrogation has potential to recast the call for 'consumer' participation in health policy decision-making and drug policy processes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Three Reading Comprehension Strategies: TELLS, Story Mapping, and QARs.

    ERIC Educational Resources Information Center

    Sorrell, Adrian L.

    1990-01-01

    Three reading comprehension strategies are presented to assist learning-disabled students: an advance organizer technique called "TELLS Fact or Fiction" used before reading a passage, a schema-based technique called "Story Mapping" used while reading, and a postreading method of categorizing questions called…

  19. Detection and Length Estimation of Linear Scratch on Solid Surfaces Using an Angle Constrained Ant Colony Technique

    NASA Astrophysics Data System (ADS)

    Pal, Siddharth; Basak, Aniruddha; Das, Swagatam

    In many manufacturing areas the detection of surface defects is one of the most important processes in quality control. Currently, in order to detect small scratches on solid surfaces, most industries working in material manufacturing rely primarily on visual inspection. In this article we propose a hybrid computational intelligence technique to automatically detect a linear scratch on a solid surface and estimate its length (in pixel units) simultaneously. The approach is based on a swarm intelligence algorithm called Ant Colony Optimization (ACO) and image preprocessing with Wiener and Sobel filters as well as the Canny edge detector. The ACO algorithm is mostly used to compensate for the broken parts of the scratch. Our experimental results confirm that the proposed technique can be used for detecting scratches in noisy and degraded images, even when it is very difficult for conventional image processing to distinguish the scratch area from its background.
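
    The pre-processing stage described above (Wiener filtering, Sobel gradients, Canny edge detection) can be sketched as follows with SciPy and OpenCV; the parameter values are hypothetical, and the ACO step that bridges broken scratch segments is not shown.

      import cv2
      import numpy as np
      from scipy.signal import wiener

      def preprocess_for_scratch(gray):
          """gray: 2-D grayscale image; returns gradient magnitude and edge map."""
          den = wiener(gray.astype(float), (5, 5))                   # Wiener noise filtering
          den = cv2.normalize(den, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
          gx = cv2.Sobel(den, cv2.CV_64F, 1, 0, ksize=3)             # horizontal gradient
          gy = cv2.Sobel(den, cv2.CV_64F, 0, 1, ksize=3)             # vertical gradient
          magnitude = cv2.magnitude(gx, gy)
          edges = cv2.Canny(den, 50, 150)                            # candidate scratch pixels
          return magnitude, edges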

  20. A smart technique for attendance system to recognize faces through parallelism

    NASA Astrophysics Data System (ADS)

    Prabhavathi, B.; Tanuja, V.; Madhu Viswanatham, V.; Rajashekhara Babu, M.

    2017-11-01

    The face is a major part of recognising a person, and with the help of image processing techniques we can exploit a person's physical features. In the old approach used in schools and colleges, the professor calls each student's name and then marks the attendance. In this paper we deviate from that old approach and adopt a new one based on image processing techniques, presenting automatic attendance marking for students in a classroom. First, an image of the classroom is captured and stored in a data record. To the images stored in the database we apply an algorithm that includes steps such as histogram classification, noise removal, face detection and face recognition. These steps detect the faces, which are then compared with the database, and attendance is marked automatically if the system recognizes the faces.

  1. STRCMACS: An extensive set of Macros for structured programming in OS/360 assembly language

    NASA Technical Reports Server (NTRS)

    Barth, C. W.

    1974-01-01

    Two techniques are discussed that have been most often referred to as structured programming. One is that of programming with high level control structures (such as the if and while) replacing the branch instruction (goto-less programming); the other is the process of developing a program by progressively refining descriptions of components in terms of more primitive components (called stepwise refinement or top-down programming). In addition to discussing what these techniques are, it is shown why their use is advised and how both can be implemented in OS assembly language by the use of a special macro instruction package.

  2. Machine vision for real time orbital operations

    NASA Technical Reports Server (NTRS)

    Vinz, Frank L.

    1988-01-01

    Machine vision for automation and robotic operation of Space Station era systems has the potential for increasing the efficiency of orbital servicing, repair, assembly and docking tasks. A machine vision research project is described in which a TV camera is used for inputing visual data to a computer so that image processing may be achieved for real time control of these orbital operations. A technique has resulted from this research which reduces computer memory requirements and greatly increases typical computational speed such that it has the potential for development into a real time orbital machine vision system. This technique is called AI BOSS (Analysis of Images by Box Scan and Syntax).

  3. E-GRASP/Eratosthenes: a mission proposal for millimetric TRF realization

    NASA Astrophysics Data System (ADS)

    Biancale, Richard; Pollet, Arnaud; Coulot, David; Mandea, Mioara

    2017-04-01

    The ITRF is currently worked out by independent concatenation of space technique information. GNSS, DORIS, SLR and VLBI data are processed independently by analysis centers before combination centers form mono-technique sets, which are then combined to produce official ITRF solutions. This approach performs quite well, although systematic differences between techniques remain visible, for instance in the origin or scale parameters of the underlying terrestrial frames. Improvement and homogenization of the TRF are expected in the future, provided that dedicated multi-technique platforms are used to best advantage. The goal set by GGOS of realizing the terrestrial reference system with an accuracy of 1 mm and a long-term stability of 0.1 mm/yr can then be achieved in the E-GRASP/Eratosthenes scenario. This mission, proposed to ESA in response to the 2017 Earth Explorer-9 call, was already scientifically well assessed in the 2016 EE9 call. It co-locates all of the fundamental space-based geodetic instruments, GNSS and DORIS receivers, laser retro-reflectors, and a VLBI transmitter, on the same satellite platform on a highly eccentric orbit, with particular attention paid to the time and space metrology on board. Different kinds of simulations were performed, both for discriminating the best orbital scenario according to geometric, technical and physical criteria and for assessing the expected performance of the TRF against the GGOS goals. The presentation will focus on the mission scenario and the simulation results.

  4. A robust seeding technique for the growth of single grain (RE)BCO and (RE)BCO-Ag bulk superconductors

    NASA Astrophysics Data System (ADS)

    Namburi, Devendra K.; Shi, Yunhua; Dennis, Anthony R.; Durrell, John H.; Cardwell, David A.

    2018-04-01

    Bulk, single grains of RE-Ba-Cu-O [(RE)BCO] high temperature superconductors have significant potential for a wide range of applications, including trapped field magnets, energy storage flywheels, superconducting mixers and magnetic separators. One of the main challenges in the production of these materials by the so-called top seeded melt growth technique is the reliable seeding of large, single grains, which are required for high field applications. A chemically aggressive liquid phase comprising BaCuO2 and CuO is generated during the single grain growth process, which comes into direct contact with the seed crystal either instantaneously or via infiltration through a buffer pellet, if employed in the process. This can cause either partial or complete melting of the seed, leading subsequently to growth failure. Here, the underlying mechanisms of seed crystal melting and the role of seed porosity in the single grain growth process are investigated. We identify seed porosity as a key limitation in the reliable and successful fabrication of large grain (RE)BCO bulk superconductors for the first time, and propose the use of Mg-doped NdBCO generic seeds fabricated via the infiltration growth technique to reduce the effects of seed porosity on the melt growth process. Finally, we demonstrate that the use of such seeds leads to better resistance to melting during the single grain growth process, and therefore to a more reliable fabrication technique.

  5. Fluid Structure Interaction Techniques For Extrusion And Mixing Processes

    NASA Astrophysics Data System (ADS)

    Valette, Rudy; Vergnes, Bruno; Coupez, Thierry

    2007-05-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by using a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes located inside the so-called immersed domain, each sub-domain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computation of the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.

  6. On splice site prediction using weight array models: a comparison of smoothing techniques

    NASA Astrophysics Data System (ADS)

    Taher, Leila; Meinicke, Peter; Morgenstern, Burkhard

    2007-11-01

    In most eukaryotic genes, protein-coding exons are separated by non-coding introns which are removed from the primary transcript by a process called "splicing". The positions where introns are cut and exons are spliced together are called "splice sites". Thus, computational prediction of splice sites is crucial for gene finding in eukaryotes. Weight array models are a powerful probabilistic approach to splice site detection. Parameters for these models are usually derived from m-tuple frequencies in trusted training data and subsequently smoothed to avoid zero probabilities. In this study we compare three different ways of parameter estimation for m-tuple frequencies, namely (a) non-smoothed probability estimation, (b) standard pseudo counts and (c) a Gaussian smoothing procedure that we recently developed.
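
    As a concrete illustration of option (b), standard pseudocount smoothing of position-specific m-tuple frequencies can be written as below; the function and variable names are hypothetical, and the Gaussian procedure of option (c) is not reproduced.

      from collections import Counter

      def pseudocount_probs(tuples_at_position, num_possible_tuples, alpha=1.0):
          """P(t) = (count(t) + alpha) / (N + alpha * K), with K = num_possible_tuples
          (e.g. 4**m for DNA m-tuples) and N the number of training sequences;
          unseen tuples implicitly receive probability alpha / (N + alpha * K)."""
          counts = Counter(tuples_at_position)
          total = sum(counts.values())
          denom = total + alpha * num_possible_tuples
          return {t: (c + alpha) / denom for t, c in counts.items()}

      # Example: dinucleotide (m = 2) counts at one position of the weight array model
      print(pseudocount_probs(["AG", "AG", "AT", "CG"], num_possible_tuples=16))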

  7. The Modern Design of Experiments: A Technical and Marketing Framework

    NASA Technical Reports Server (NTRS)

    DeLoach, R.

    2000-01-01

    A new wind tunnel testing process under development at NASA Langley Research Center, called Modern Design of Experiments (MDOE), differs from conventional wind tunnel testing techniques on a number of levels. Chief among these is that MDOE focuses on the generation of adequate prediction models rather than high-volume data collection. Some cultural issues attached to this and other distinctions between MDOE and conventional wind tunnel testing are addressed in this paper.

  8. Getting Over the Barrel- Achieving Independence from Foreign Oil in 2018

    DTIC Science & Technology

    2009-02-03

    material called kerogen. Kerogen can be converted into oil via heating in the chemical process of pyrolysis. Depending on the richness of oil shale, it...vegetable oil, animal fat, corn, soybeans, jatropha seed oil, palm oil, switch grass and even algae. Biofuel production techniques and technologies...vary widely based on the input source - sugar-based, starch-based or oil-based. This document only examines corn-based ethanol production. The other

  9. Novel optical scanning cryptography using Fresnel telescope imaging.

    PubMed

    Yan, Aimin; Sun, Jianfeng; Hu, Zhijuan; Zhang, Jingtao; Liu, Liren

    2015-07-13

    We propose a new method called modified optical scanning cryptography using Fresnel telescope imaging technique for encryption and decryption of remote objects. An image or object can be optically encrypted on the fly by Fresnel telescope scanning system together with an encryption key. For image decryption, the encrypted signals are received and processed with an optical coherent heterodyne detection system. The proposed method has strong performance through use of secure Fresnel telescope scanning with orthogonal polarized beams and efficient all-optical information processing. The validity of the proposed method is demonstrated by numerical simulations and experimental results.

  10. 47 CFR 22.921 - 911 call processing procedures; 911-only calling mode.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false 911 call processing procedures; 911-only calling mode. 22.921 Section 22.921 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.921 911 call processing...

  11. 47 CFR 22.921 - 911 call processing procedures; 911-only calling mode.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 2 2013-10-01 2013-10-01 false 911 call processing procedures; 911-only calling mode. 22.921 Section 22.921 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.921 911 call processing...

  12. Symbolically Modeling Concurrent MCAPI Executions

    NASA Technical Reports Server (NTRS)

    Fischer, Topher; Mercer, Eric; Rungta, Neha

    2011-01-01

    Improper use of Inter-Process Communication (IPC) within concurrent systems often creates data races which can lead to bugs that are challenging to discover. Techniques that use Satisfiability Modulo Theories (SMT) problems to symbolically model possible executions of concurrent software have recently been proposed for use in the formal verification of software. In this work we describe a new technique for modeling executions of concurrent software that use a message passing API called MCAPI. Our technique uses an execution trace to create an SMT problem that symbolically models all possible concurrent executions and follows the same sequence of conditional branch outcomes as the provided execution trace. We check if there exists a satisfying assignment to the SMT problem with respect to specific safety properties. If such an assignment exists, it provides the conditions that lead to the violation of the property. We show how our method models behaviors of MCAPI applications that are ignored in previously published techniques.
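
    To give a flavor of an SMT encoding of interleavings (using the Z3 Python bindings as a stand-in solver; the event names and the property below are a made-up example, not the paper's encoding), consider two tasks that each send one message to the same endpoint, and a safety property asserting that the first message is always received first.

      from z3 import Ints, Solver, Distinct, sat

      s1, s2, r1, r2 = Ints("s1 s2 r1 r2")   # symbolic positions in a total event order

      solver = Solver()
      solver.add(Distinct(s1, s2, r1, r2))   # every event occupies its own slot
      solver.add(s1 < r1, s2 < r2)           # a send happens before its receive
      solver.add(r2 < r1)                    # negation of the property "r1 before r2"

      if solver.check() == sat:              # a model is a concrete violating interleaving
          print("property can be violated, e.g.:", solver.model())
      else:
          print("property holds for all modeled interleavings")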

  13. Assembly processes comparison for a miniaturized laser used for the Exomars European Space Agency mission

    NASA Astrophysics Data System (ADS)

    Ribes-Pleguezuelo, Pol; Inza, Andoni Moral; Basset, Marta Gilaberte; Rodríguez, Pablo; Rodríguez, Gemma; Laudisio, Marco; Galan, Miguel; Hornaff, Marcel; Beckert, Erik; Eberhardt, Ramona; Tünnermann, Andreas

    2016-11-01

    A miniaturized diode-pumped solid-state laser (DPSSL) designed as part of the Raman laser spectrometer (RLS) instrument for the European Space Agency (ESA) Exomars 2020 mission is assembled and tested against the mission purpose and requirements. Two different processes were tried for assembling the laser: one based on adhesives, following traditional laser manufacturing practice; the other based on a low-stress, organic-free soldering technique called solderjet bumping. To validate the processes, the manufactured devices were subjected to mechanical, thermal-cycling, radiation, and optical functional tests. The comparison showed that the soldered device offered improved reliability of optical performance relative to the adhesive-assembled device.

  14. 47 CFR 22.921 - 911 call processing procedures; 911-only calling mode.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false 911 call processing procedures; 911-only... CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.921 911 call processing procedures; 911-only calling mode. Mobile telephones manufactured after February 13, 2000 that are capable of...

  15. 47 CFR 22.921 - 911 call processing procedures; 911-only calling mode.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 2 2012-10-01 2012-10-01 false 911 call processing procedures; 911-only... CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.921 911 call processing procedures; 911-only calling mode. Mobile telephones manufactured after February 13, 2000 that are capable of...

  16. Microelectrode voltammetry of multi-electron transfers complicated by coupled chemical equilibria: a general theory for the extended square scheme.

    PubMed

    Laborda, Eduardo; Gómez-Gil, José María; Molina, Angela

    2017-06-28

    A very general and simple theoretical solution is presented for the current-potential-time response of reversible multi-electron transfer processes complicated by homogeneous chemical equilibria (the so-called extended square scheme). The expressions presented here are applicable regardless of the number of electrons transferred and coupled chemical processes, and they are particularized for a wide variety of microelectrode geometries. The voltammetric response of very different systems presenting multi-electron transfers is considered for the most widely-used techniques (namely, cyclic voltammetry, square wave voltammetry, differential pulse voltammetry and steady state voltammetry), studying the influence of the microelectrode geometry and the number and thermodynamics of the (electro)chemical steps. Most appropriate techniques and procedures for the determination of the 'interaction' between successive transfers are discussed. Special attention is paid to those situations where homogeneous chemical processes, such as protonation, complexation or ion association, affect the electrochemical behaviour of the system by different stabilization of the oxidation states.

  17. Digression and Value Concatenation to Enable Privacy-Preserving Regression.

    PubMed

    Li, Xiao-Bai; Sarkar, Sumit

    2014-09-01

    Regression techniques can be used not only for legitimate data analysis, but also to infer private information about individuals. In this paper, we demonstrate that regression trees, a popular data-analysis and data-mining technique, can be used to effectively reveal individuals' sensitive data. This problem, which we call a "regression attack," has not been addressed in the data privacy literature, and existing privacy-preserving techniques are not appropriate in coping with this problem. We propose a new approach to counter regression attacks. To protect against privacy disclosure, our approach introduces a novel measure, called digression , which assesses the sensitive value disclosure risk in the process of building a regression tree model. Specifically, we develop an algorithm that uses the measure for pruning the tree to limit disclosure of sensitive data. We also propose a dynamic value-concatenation method for anonymizing data, which better preserves data utility than a user-defined generalization scheme commonly used in existing approaches. Our approach can be used for anonymizing both numeric and categorical data. An experimental study is conducted using real-world financial, economic and healthcare data. The results of the experiments demonstrate that the proposed approach is very effective in protecting data privacy while preserving data quality for research and analysis.

  18. Overall equipment efficiency of Flexographic Printing process: A case study

    NASA Astrophysics Data System (ADS)

    Zahoor, S.; Shehzad, A.; Mufti, NA; Zahoor, Z.; Saeed, U.

    2017-12-01

    This paper reports the efficiency improvement of a flexographic printing machine by reducing breakdown time with the help of a total productive maintenance measure called overall equipment efficiency (OEE). The methodology comprises calculating the OEE of the machine before and after identifying the causes of the problems. A Pareto diagram is used to prioritize the main problem areas, and a 5-whys analysis is used to identify the root causes of these problems. The OEE of the process is improved from 34% to 40.2% over a 30-day period. It is concluded that OEE and 5-whys analysis are useful techniques for improving equipment effectiveness and for continuous process improvement.
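
    For reference, OEE is conventionally computed as the product of availability, performance and quality rates; the short sketch below uses made-up numbers, not the case-study data.

      def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
          """Conventional OEE = availability x performance x quality."""
          run_time = planned_time - downtime
          availability = run_time / planned_time
          performance = (ideal_cycle_time * total_count) / run_time
          quality = good_count / total_count
          return availability * performance * quality

      # Example with hypothetical shift data (minutes and piece counts)
      print(round(oee(planned_time=480, downtime=120, ideal_cycle_time=0.5,
                      total_count=500, good_count=470), 3))   # about 0.49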

  19. Double emulsion solvent evaporation techniques used for drug encapsulation.

    PubMed

    Iqbal, Muhammad; Zafar, Nadiah; Fessi, Hatem; Elaissari, Abdelhamid

    2015-12-30

    Double emulsions are complex systems, also called "emulsions of emulsions", in which the droplets of the dispersed phase contain one or more types of smaller dispersed droplets themselves. Double emulsions have the potential for encapsulation of both hydrophobic as well as hydrophilic drugs, cosmetics, foods and other high value products. Techniques based on double emulsions are commonly used for the encapsulation of hydrophilic molecules, which suffer from low encapsulation efficiency because of rapid drug partitioning into the external aqueous phase when using single emulsions. The main issue when using double emulsions is their production in a well-controlled manner, with homogeneous droplet size by optimizing different process variables. In this review special attention has been paid to the application of double emulsion techniques for the encapsulation of various hydrophilic and hydrophobic anticancer drugs, anti-inflammatory drugs, antibiotic drugs, proteins and amino acids and their applications in theranostics. Moreover, the optimized ratio of the different phases and other process parameters of double emulsions are discussed. Finally, the results published regarding various types of solvents, stabilizers and polymers used for the encapsulation of several active substances via double emulsion processes are reported. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Turnover of Lipidated LC3 and Autophagic Cargoes in Mammalian Cells.

    PubMed

    Rodríguez-Arribas, M; Yakhine-Diop, S M S; González-Polo, R A; Niso-Santano, M; Fuentes, J M

    2017-01-01

    Macroautophagy (usually referred to as autophagy) is the most important degradation system in mammalian cells. It is responsible for the elimination of protein aggregates, organelles, and other cellular content. During autophagy, these materials (i.e., cargo) must be engulfed by a double-membrane structure called an autophagosome, which delivers the cargo to the lysosome to complete its degradation. Autophagy is a very dynamic pathway called autophagic flux. The process involves all the steps that are implicated in cargo degradation from autophagosome formation. There are several techniques to monitor autophagic flux. Among them, the method most used experimentally to assess autophagy is the detection of LC3 protein processing and p62 degradation by Western blotting. In this chapter, we provide a detailed and straightforward protocol for this purpose in cultured mammalian cells, including a brief set of notes concerning problems associated with the Western-blotting detection of LC3 and p62. © 2017 Elsevier Inc. All rights reserved.

  1. Fluctuations in protein synthesis from a single RNA template: stochastic kinetics of ribosomes.

    PubMed

    Garai, Ashok; Chowdhury, Debashish; Ramakrishnan, T V

    2009-01-01

    Proteins are polymerized by cyclic machines called ribosomes, which use their messenger RNA (mRNA) track also as the corresponding template, and the process is called translation. We explore, in depth and detail, the stochastic nature of the translation. We compute various distributions associated with the translation process; one of them--namely, the dwell time distribution--has been measured in recent single-ribosome experiments. The form of the distribution, which fits best with our simulation data, is consistent with that extracted from the experimental data. For our computations, we use a model that captures both the mechanochemistry of each individual ribosome and their steric interactions. We also demonstrate the effects of the sequence inhomogeneities of real genes on the fluctuations and noise in translation. Finally, inspired by recent advances in the experimental techniques of manipulating single ribosomes, we make theoretical predictions on the force-velocity relation for individual ribosomes. In principle, all our predictions can be tested by carrying out in vitro experiments.

  2. Electrochemically active biofilms: facts and fiction. A review

    PubMed Central

    Babauta, Jerome; Renslow, Ryan; Lewandowski, Zbigniew; Beyenal, Haluk

    2014-01-01

    This review examines the electrochemical techniques used to study extracellular electron transfer in the electrochemically active biofilms that are used in microbial fuel cells and other bioelectrochemical systems. Electrochemically active biofilms are defined as biofilms that exchange electrons with conductive surfaces: electrodes. Following the electrochemical conventions, and recognizing that electrodes can be considered reactants in these bioelectrochemical processes, biofilms that deliver electrons to the biofilm electrode are called anodic, ie electrode-reducing, biofilms, while biofilms that accept electrons from the biofilm electrode are called cathodic, ie electrode-oxidizing, biofilms. How to grow these electrochemically active biofilms in bioelectrochemical systems is discussed and also the critical choices made in the experimental setup that affect the experimental results. The reactor configurations used in bioelectrochemical systems research are also described and the authors demonstrate how to use selected voltammetric techniques to study extracellular electron transfer in bioelectrochemical systems. Finally, some critical concerns with the proposed electron transfer mechanisms in bioelectrochemical systems are addressed together with the prospects of bioelectrochemical systems as energy-converting and energy-harvesting devices. PMID:22856464

  3. Scalloping minimization in deep Si etching on Unaxis DSE tools

    NASA Astrophysics Data System (ADS)

    Lai, Shouliang; Johnson, Dave J.; Westerman, Russ J.; Nolan, John J.; Purser, David; Devre, Mike

    2003-01-01

    Sidewall smoothness is often a critical requirement for many MEMS devices, such as microfluidic devices and chemical, biological and optical transducers, while a fast silicon etch rate is another. For such applications, time-division multiplex (TDM) etch processes, the so-called "Bosch" processes, are widely employed. However, in the conventional TDM processes, rough sidewalls result due to scallop formation. To date, the amplitude of the scalloping has been directly linked to the silicon etch rate. At Unaxis USA Inc., we have developed a proprietary fast gas switching technique that is effective for scalloping minimization in deep silicon etching processes. In this technique, process cycle times can be reduced from several seconds to as little as a fraction of a second. Scallop amplitudes can be reduced with shorter process cycles. More importantly, as the scallop amplitude is progressively reduced, the silicon etch rate can be maintained relatively constant at high values. An optimized experiment has shown that at an etch rate in excess of 7 μm/min, scallops with a length of 116 nm and a depth of 35 nm were obtained. The fast gas switching approach offers an ideal manufacturing solution for MEMS applications where extremely smooth sidewalls and a fast etch rate are crucial.

  4. Combining active learning and semi-supervised learning techniques to extract protein interaction sentences.

    PubMed

    Song, Min; Yu, Hwanjo; Han, Wook-Shin

    2011-11-24

    Protein-protein interaction (PPI) extraction has been a focal point of much biomedical research and of many database curation tools. Both Active Learning (AL) and Semi-supervised SVMs (SSL) have recently been applied to extract PPIs automatically. In this paper, we explore combining AL with SSL to improve the performance of the PPI task. We propose a novel PPI extraction technique called PPISpotter that combines Deterministic Annealing-based SSL with an AL technique to extract protein-protein interactions. In addition, we extract a comprehensive set of features from MEDLINE records using Natural Language Processing (NLP) techniques, which further improves the SVM classifiers. In our feature selection technique, syntactic, semantic, and lexical properties of text are incorporated, which boosts system performance significantly. By conducting experiments with three different PPI corpora, we show that PPISpotter is superior to the other techniques incorporated into semi-supervised SVMs, such as Random Sampling, Clustering, and Transductive SVMs, in terms of precision, recall, and F-measure. Our system is a novel, state-of-the-art technique for efficiently extracting protein-protein interaction pairs.
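
    For illustration, a toy pool-based active learning loop with an SVM and margin-based uncertainty sampling is sketched below; the synthetic features, the scikit-learn classifier and the query budget are assumptions, and the deterministic-annealing semi-supervised component of PPISpotter is not reproduced.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.svm import SVC

        # Toy pool-based active learning with margin (uncertainty) sampling.
        # Text feature extraction from MEDLINE is not reproduced; features are synthetic.
        X, y = make_classification(n_samples=500, n_features=20, random_state=0)
        rng = np.random.default_rng(0)
        labeled = list(rng.choice(len(X), size=20, replace=False))
        pool = [i for i in range(len(X)) if i not in labeled]

        for _ in range(10):                                 # 10 query rounds
            clf = SVC(kernel="linear").fit(X[labeled], y[labeled])
            margins = np.abs(clf.decision_function(X[pool]))
            query = pool.pop(int(np.argmin(margins)))       # most uncertain example
            labeled.append(query)                           # oracle provides its label

        print("final training-set size:", len(labeled))
        print("accuracy on remaining pool:", round(clf.score(X[pool], y[pool]), 3))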

  5. Information Processing Techniques Program. Volume 1. Packet Speech Systems Technology

    DTIC Science & Technology

    1980-03-31

    DMA transfer is enabled from the 2652 serial I/O device to the buffer memory. This enables automatic reception of an incoming packet without CPU...conference speaker. Producing multiple copies at the source wastes network bandwidth and is likely to cause local overload conditions for a large... wasted. If the setup fails because ST can find no route with sufficient capacity, the phone will have rung and possibly been answered but the call will

  6. An earth imaging camera simulation using wide-scale construction of reflectance surfaces

    NASA Astrophysics Data System (ADS)

    Murthy, Kiran; Chau, Alexandra H.; Amin, Minesh B.; Robinson, M. Dirk

    2013-10-01

    Developing and testing advanced ground-based image processing systems for earth-observing remote sensing applications presents a unique challenge that requires advanced imagery simulation capabilities. This paper presents an earth-imaging multispectral framing camera simulation system called PayloadSim (PaySim) capable of generating terabytes of photorealistic simulated imagery. PaySim leverages previous work in 3-D scene-based image simulation, adding a novel method for automatically and efficiently constructing 3-D reflectance scenes by draping tiled orthorectified imagery over a geo-registered Digital Elevation Map (DEM). PaySim's modeling chain is presented in detail, with emphasis given to the techniques used to achieve computational efficiency. These techniques as well as cluster deployment of the simulator have enabled tuning and robust testing of image processing algorithms, and production of realistic sample data for customer-driven image product development. Examples of simulated imagery of Skybox's first imaging satellite are shown.

  7. Towards simultaneous measurements of electronic and structural properties in ultra-fast x-ray free electron laser absorption spectroscopy experiments

    NASA Astrophysics Data System (ADS)

    Gaudin, J.; Fourment, C.; Cho, B. I.; Engelhorn, K.; Galtier, E.; Harmand, M.; Leguay, P. M.; Lee, H. J.; Nagler, B.; Nakatsutsumi, M.; Ozkan, C.; Störmer, M.; Toleikis, S.; Tschentscher, Th; Heimann, P. A.; Dorchies, F.

    2014-04-01

    The rapidly growing field of ultrafast science with X-ray lasers unveils atomic-scale processes with unprecedented time resolution, bringing the so-called "molecular movie" within reach. X-ray absorption spectroscopy is one of the most powerful x-ray techniques, providing both local atomic order and electronic structure when coupled with ad-hoc theory. Collecting absorption spectra within a few x-ray pulses is possible only in a dispersive setup. We demonstrate ultrafast time-resolved measurements of the LIII-edge x-ray absorption near-edge spectra of irreversibly laser-excited molybdenum using an average of only a few x-ray pulses, with a signal-to-noise ratio limited only by the saturation level of the detector. The simplicity of the experimental set-up makes this technique versatile and applicable to a wide range of pump-probe experiments, particularly in the case of non-reversible processes.

  8. Towards simultaneous measurements of electronic and structural properties in ultra-fast x-ray free electron laser absorption spectroscopy experiments

    PubMed Central

    Gaudin, J.; Fourment, C.; Cho, B. I.; Engelhorn, K.; Galtier, E.; Harmand, M.; Leguay, P. M.; Lee, H. J.; Nagler, B.; Nakatsutsumi, M.; Ozkan, C.; Störmer, M.; Toleikis, S.; Tschentscher, Th; Heimann, P. A.; Dorchies, F.

    2014-01-01

    The rapidly growing field of ultrafast science with X-ray lasers unveils atomic-scale processes with unprecedented time resolution, bringing the so-called “molecular movie” within reach. X-ray absorption spectroscopy is one of the most powerful x-ray techniques, providing both local atomic order and electronic structure when coupled with ad-hoc theory. Collecting absorption spectra within a few x-ray pulses is possible only in a dispersive setup. We demonstrate ultrafast time-resolved measurements of the LIII-edge x-ray absorption near-edge spectra of irreversibly laser-excited molybdenum using an average of only a few x-ray pulses, with a signal-to-noise ratio limited only by the saturation level of the detector. The simplicity of the experimental set-up makes this technique versatile and applicable to a wide range of pump-probe experiments, particularly in the case of non-reversible processes. PMID:24740172

  9. SoundView: an auditory guidance system based on environment understanding for the visually impaired people.

    PubMed

    Nie, Min; Ren, Jie; Li, Zhengjun; Niu, Jinhai; Qiu, Yihong; Zhu, Yisheng; Tong, Shanbao

    2009-01-01

    Without visual information, blind people face hardships in shopping, reading, finding objects, and other daily activities. Therefore, we developed a portable auditory guidance system, called SoundView, for visually impaired people. This prototype system consists of a mini-CCD camera, a digital signal processing unit, and an earphone, working with built-in customizable auditory coding algorithms. Employing environment understanding techniques, SoundView processes the images from the camera and detects objects tagged with barcodes. The recognized objects in the environment are then encoded into stereo speech signals for the blind user through an earphone. The user is able to recognize the type, motion state, and location of objects of interest with the help of SoundView. Compared with other visual assistant techniques, SoundView is object-oriented and has the advantages of low cost, small size, light weight, low power consumption, and easy customization.

  10. Real-time catheter localization and visualization using three-dimensional echocardiography

    NASA Astrophysics Data System (ADS)

    Kozlowski, Pawel; Bandaru, Raja Sekhar; D'hooge, Jan; Samset, Eigil

    2017-03-01

    Real-time three-dimensional transesophageal echocardiography (RT3D-TEE) is increasingly used during minimally invasive cardiac surgeries (MICS). In many cath labs, RT3D-TEE is already one of the requisite tools for image guidance during MICS. However, the visualization of the catheter is not always satisfactory, making 3D-TEE challenging to use as the only modality for guidance. We propose a novel technique for better visualization of the catheter along with the cardiac anatomy using TEE alone, exploiting both beamforming and post-processing methods. We extended our earlier method, called Delay and Standard Deviation (DASD) beamforming, to 3D in order to enhance specular reflections. The beamformed image was further post-processed by the Frangi filter to segment the catheter. Multivariate visualization techniques enabled us to render both the standard tissue image and the DASD beamformed image on a clinical ultrasound scanner simultaneously. A frame rate of 15 FPS was achieved.

  11. Does Nursing Facility Use of Habilitation Therapy Improve Performance on Quality Measures?

    PubMed

    Fitzler, Sandra; Raia, Paul; Buckley, Fredrick O; Wang, Mei

    2016-12-01

    The purpose of this Centers for Medicare & Medicaid Services (CMS) Innovation study was to evaluate the impact of habilitation therapy techniques and a behavior team for managing dementia-related behaviors on 12 quality measures, including 10 publicly reported Minimum Data Set (MDS) measures and 2 nursing home process measures. A prospective design was used to assess the changes in the measures. A total of 30 Massachusetts nursing homes participated in the project over a 12-month period. Project participation required the creation of an interdisciplinary behavior team, habilitation therapy training, a facility visit by the program coordinator, attendance at bimonthly support and sharing calls, and monthly collection of process measure data. Participating facilities showed improvement in 9 of the 12 reported measures. Findings indicate potential quality improvement from having nursing homes learn habilitation therapy techniques and know how to use the interdisciplinary team to manage problem behaviors. © The Author(s) 2016.

  12. Towards simultaneous measurements of electronic and structural properties in ultra-fast x-ray free electron laser absorption spectroscopy experiments

    DOE PAGES

    Gaudin, J.; Fourment, C.; Cho, B. I.; ...

    2014-04-17

    The rapidly growing field of ultrafast science with X-ray lasers unveils atomic-scale processes with unprecedented time resolution, bringing the so-called “molecular movie” within reach. X-ray absorption spectroscopy is one of the most powerful x-ray techniques, providing both local atomic order and electronic structure when coupled with ad-hoc theory. Collecting absorption spectra within a few x-ray pulses is possible only in a dispersive setup. We demonstrate ultrafast time-resolved measurements of the LIII-edge x-ray absorption near-edge spectra of irreversibly laser-excited molybdenum using an average of only a few x-ray pulses, with a signal-to-noise ratio limited only by the saturation level of the detector. The simplicity of the experimental set-up makes this technique versatile and applicable to a wide range of pump-probe experiments, particularly in the case of non-reversible processes.

  13. A series connection architecture for large-area organic photovoltaic modules with a 7.5% module efficiency.

    PubMed

    Hong, Soonil; Kang, Hongkyu; Kim, Geunjin; Lee, Seongyu; Kim, Seok; Lee, Jong-Hoon; Lee, Jinho; Yi, Minjin; Kim, Junghwan; Back, Hyungcheol; Kim, Jae-Ryoung; Lee, Kwanghee

    2016-01-05

    The fabrication of organic photovoltaic modules via printing techniques has been the greatest challenge for their commercial manufacture. The current module architecture, which is based on a monolithic geometry consisting of serially interconnected stripe-patterned subcells with finite widths, requires highly sophisticated patterning processes that significantly increase the complexity of printing production lines and cause serious reductions in module efficiency due to so-called aperture loss in the series connection regions. Herein we demonstrate an innovative module structure that can simultaneously reduce both patterning processes and aperture loss. By using a charge recombination feature that occurs at contacts between electron- and hole-transport layers, we devise a series connection method that facilitates module fabrication without patterning the charge transport layers. With the successive deposition of component layers using slot-die and doctor-blade printing techniques, we achieve a high module efficiency reaching 7.5% with an area of 4.15 cm².

  14. Bovine somatic cell nuclear transfer.

    PubMed

    Ross, Pablo J; Cibelli, Jose B

    2010-01-01

    Somatic cell nuclear transfer (SCNT) is a technique by which the nucleus of a differentiated cell is introduced into an oocyte from which its genetic material has been removed by a process called enucleation. In mammals, the reconstructed embryo is artificially induced to initiate embryonic development (activation). The oocyte turns the somatic cell nucleus into an embryonic nucleus. This process is called nuclear reprogramming and involves an important change of cell fate, by which the somatic cell nucleus becomes capable of generating all the cell types required for the formation of a new individual, including extraembryonic tissues. Therefore, after transfer of a cloned embryo to a surrogate mother, an offspring genetically identical to the animal from which the somatic cells were isolated is born. Cloning by nuclear transfer has potential applications in agriculture and biomedicine, but is limited by low efficiency. Cattle were the second mammalian species to be cloned after Dolly the sheep, and it is probably the most widely used species for SCNT experiments. This is in part due to the high availability of bovine oocytes and the relatively higher efficiency levels usually obtained in cattle. Given the wide utilization of this species for cloning, several alternatives to this basic protocol can be found in the literature. Here we describe a basic protocol for bovine SCNT currently being used in our laboratory, which is amenable to the use of the nuclear transplantation technique for research or commercial purposes.

  15. Minimization of model representativity errors in identification of point source emission from atmospheric concentration measurements

    NASA Astrophysics Data System (ADS)

    Sharan, Maithili; Singh, Amit Kumar; Singh, Sarvesh Kumar

    2017-11-01

    Estimation of an unknown atmospheric release from a finite set of concentration measurements is considered an ill-posed inverse problem. Besides ill-posedness, the estimation process is influenced by instrumental errors in the measured concentrations and by model representativity errors. The study highlights the effect of minimizing model representativity errors on the source estimation. This is described in an adjoint modelling framework and carried out in three steps. First, an estimation of the point source parameters (location and intensity) is carried out using an inversion technique. Second, a linear regression relationship is established between the measured concentrations and the corresponding concentrations predicted using the retrieved source parameters. Third, this relationship is utilized to modify the adjoint functions. Further, source estimation is carried out using these modified adjoint functions to analyse the effect of such modifications. The process is tested for two well-known inversion techniques, called renormalization and least-squares. The proposed methodology and inversion techniques are evaluated for a real scenario by using concentration measurements from the Idaho diffusion experiment in low-wind stable conditions. With both inversion techniques, a significant improvement is observed in the retrieved source estimates after minimizing the representativity errors.
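
    A schematic illustration of the first two steps (least-squares retrieval of location and intensity, then a regression between measured and predicted concentrations) is sketched below; the one-dimensional Gaussian kernel, receptor layout and noise level are arbitrary stand-ins for a real dispersion model.

        import numpy as np

        # Illustrative least-squares point-source retrieval on a 1-D domain.
        rng = np.random.default_rng(1)
        receptors = np.linspace(0.0, 10.0, 15)          # receptor positions
        true_x, true_q = 3.7, 5.0                       # true location and intensity

        def forward(x_src, positions, width=1.5):
            """Unit-intensity concentration predicted at each receptor."""
            return np.exp(-0.5 * ((positions - x_src) / width) ** 2)

        measured = true_q * forward(true_x, receptors) + 0.05 * rng.standard_normal(15)

        # Step 1: scan candidate locations; for each, the best intensity is the
        # closed-form least-squares scaling q = <c_meas, c_unit> / <c_unit, c_unit>.
        best = None
        for x in np.linspace(0.0, 10.0, 401):
            c_unit = forward(x, receptors)
            q = measured @ c_unit / (c_unit @ c_unit)
            resid = np.sum((measured - q * c_unit) ** 2)
            if best is None or resid < best[2]:
                best = (x, q, resid)

        x_hat, q_hat, _ = best
        # Step 2: regression between measured and predicted concentrations, which
        # could then be used to re-weight the adjoint functions.
        pred = q_hat * forward(x_hat, receptors)
        slope, intercept = np.polyfit(pred, measured, 1)
        print(f"estimated location {x_hat:.2f}, intensity {q_hat:.2f}")
        print(f"measured = {slope:.2f} * predicted + {intercept:.3f}")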

  16. Optimization technique for problems with an inequality constraint

    NASA Technical Reports Server (NTRS)

    Russell, K. J.

    1972-01-01

    A general technique uses a modified version of an existing technique termed the pattern search technique. A new procedure called the parallel move strategy permits the pattern search technique to be used with problems involving a constraint.
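
    The report itself is not detailed enough here to reproduce the parallel move strategy, so the sketch below only illustrates a generic pattern (compass) search handling an inequality constraint through a quadratic penalty; the objective, constraint and step-size schedule are assumptions.

        import numpy as np

        # Generic compass/pattern search minimizing f(x) subject to g(x) <= 0,
        # handled with a simple penalty (a stand-in, not the report's strategy).
        def f(x):                 # objective
            return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

        def g(x):                 # inequality constraint g(x) <= 0
            return x[0] + x[1] - 1.0

        def penalized(x, mu=1e3):
            return f(x) + mu * max(0.0, g(x)) ** 2

        x = np.zeros(2)
        step = 1.0
        while step > 1e-6:
            improved = False
            for d in np.vstack([np.eye(2), -np.eye(2)]):   # the four pattern moves
                trial = x + step * d
                if penalized(trial) < penalized(x):
                    x, improved = trial, True
                    break
            if not improved:
                step *= 0.5                                 # shrink the pattern

        print("solution:", x.round(4), " constraint g(x) =", round(g(x), 4))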

  17. Pseudo-shading technique in the two-dimensional domain: a post-processing algorithm for enhancing the Z-buffer of a three-dimensional binary image.

    PubMed

    Tan, A C; Richards, R

    1989-01-01

    Three-dimensional (3D) medical graphics is becoming popular in clinical use on tomographic scanners. Research work in 3D reconstructive display of computerized tomography (CT) and magnetic resonance imaging (MRI) scans on conventional computers has produced many so-called pseudo-3D images. The quality of these images depends on the rendering algorithm, the coarseness of the digitized object, the number of grey levels and the image screen resolution. CT and MRI data are fundamentally voxel based and they produce images that are coarse because of the resolution of the data acquisition system. 3D images produced by the Z-buffer depth shading technique suffer loss of detail when complex objects with fine textural detail need to be displayed. Attempts have been made to improve the display of voxel objects, and existing techniques have shown the improvement possible using these post-processing algorithms. The improved rendering technique works on the Z-buffer image to generate a shaded image using a single light source in any direction. The effectiveness of the technique in generating a shaded image has been shown to be a useful means of presenting 3D information for clinical use.
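
    A minimal sketch of Z-buffer pseudo-shading is given below, assuming a synthetic hemispherical depth map: surface normals are estimated from depth gradients and shaded with a Lambertian model and an arbitrary light direction, in the spirit of the post-processing described above.

        import numpy as np

        # Pseudo-shading of a Z-buffer (depth image) with a single directional light.
        h = w = 128
        ys, xs = np.mgrid[-1:1:h*1j, -1:1:w*1j]
        r2 = xs**2 + ys**2
        zbuf = np.where(r2 < 0.8, np.sqrt(np.clip(0.8 - r2, 0.0, None)), 0.0)

        gy, gx = np.gradient(zbuf)                 # depth gradients
        normals = np.dstack([-gx, -gy, np.ones_like(zbuf)])
        normals /= np.linalg.norm(normals, axis=2, keepdims=True)

        light = np.array([1.0, 1.0, 1.0])
        light = light / np.linalg.norm(light)      # light direction (arbitrary)
        shaded = np.clip(normals @ light, 0.0, 1.0)   # Lambertian shading
        shaded[zbuf == 0.0] = 0.0                  # keep background dark

        print("shaded image:", shaded.shape, "grey-level range:",
              shaded.min().round(2), "-", shaded.max().round(2))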

  18. JIGSAW: Preference-directed, co-operative scheduling

    NASA Technical Reports Server (NTRS)

    Linden, Theodore A.; Gaw, David

    1992-01-01

    Techniques that enable humans and machines to cooperate in the solution of complex scheduling problems have evolved out of work on the daily allocation and scheduling of Tactical Air Force resources. A generalized, formal model of these applied techniques is being developed. It is called JIGSAW by analogy with the multi-agent, constructive process used when solving jigsaw puzzles. JIGSAW begins from this analogy and extends it by propagating local preferences into global statistics that dynamically influence the value and variable ordering decisions. The statistical projections also apply to abstract resources and time periods--allowing more opportunities to find a successful variable ordering by reserving abstract resources and deferring the choice of a specific resource or time period.

  19. Differential Binary Encoding Method for Calibrating Image Sensors Based on IOFBs

    PubMed Central

    Fernández, Pedro R.; Lázaro-Galilea, José Luis; Gardel, Alfredo; Espinosa, Felipe; Bravo, Ignacio; Cano, Ángel

    2012-01-01

    Image transmission using incoherent optical fiber bundles (IOFBs) requires prior calibration to obtain the spatial in-out fiber correspondence necessary to reconstruct the image captured by the pseudo-sensor. This information is recorded in a Look-Up Table called the Reconstruction Table (RT), used later for reordering the fiber positions and reconstructing the original image. This paper presents a very fast method based on image-scanning using spaces encoded by a weighted binary code to obtain the in-out correspondence. The results demonstrate that this technique yields a remarkable reduction in processing time and the image reconstruction quality is very good compared to previous techniques based on spot or line scanning, for example. PMID:22666023
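
    The following sketch illustrates the principle of binary-coded scanning on a simulated one-dimensional bundle: log2(N) coded patterns are enough to recover the in-out correspondence (the Reconstruction Table); the bundle size and the random scrambling are assumptions, and the weighting details of the actual method are not reproduced.

        import numpy as np

        # Binary-coded calibration of an incoherent fiber bundle: every output
        # fiber records, pattern by pattern, the bit sequence of the input fiber
        # it maps to, which decodes directly into a Reconstruction Table.
        rng = np.random.default_rng(0)
        n_fibers = 256
        n_bits = int(np.log2(n_fibers))            # 8 coded stripe patterns

        perm = rng.permutation(n_fibers)           # unknown in->out scrambling

        reconstruction_table = np.zeros(n_fibers, dtype=int)
        for out_fiber in range(n_fibers):
            in_fiber = perm[out_fiber]
            code = 0
            for b in range(n_bits):                # "capture" each binary pattern
                bit = (in_fiber >> b) & 1          # stripe pattern b at this input
                code |= bit << b
            reconstruction_table[out_fiber] = code

        assert np.array_equal(reconstruction_table, perm)
        print("recovered in-out correspondence for", n_fibers,
              "fibers using", n_bits, "coded patterns")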

  20. Signal processing for the profoundly deaf.

    PubMed

    Boothyroyd, A

    1990-01-01

    Profound deafness, defined here as a hearing loss in excess of 90 dB, is characterized by high thresholds, reduced hearing range in the intensity and frequency domains, and poor resolution in the frequency and time domains. The high thresholds call for hearing aids with unusually high gains or remote microphones that can be placed close to the signal source. The former option creates acoustic feedback problems for which digital signal processing may yet offer solutions. The latter option calls for carrier wave technology that is already available. The reduced frequency and intensity ranges would appear to call for frequency and/or amplitude compression. It might also be argued, however, that any attempts to compress the acoustic signal into the limited hearing range of the profoundly deaf will be counterproductive because of poor frequency and time resolution, especially when the signal is present in noise. In experiments with a 2-channel compression system, only 1 of 9 subjects showed an improvement of perception with the introduction of fast-release (20 ms) compression. The other 8 experienced no benefit or a slight deterioration of performance. These results support the concept of providing the profoundly deaf with simpler, rather than more complex, patterns, perhaps through the use of feature extraction hearing aids. Data from users of cochlear implants already employing feature extraction techniques also support this concept.

  1. Dynamic test input generation for multiple-fault isolation

    NASA Technical Reports Server (NTRS)

    Schaefer, Phil

    1990-01-01

    Recent work in Causal Reasoning has provided practical techniques for multiple-fault diagnosis. These techniques provide a hypothesis/measurement diagnosis cycle. Using probabilistic methods, they choose the best measurements to make, then update fault hypotheses in response. For many applications, such as computers and spacecraft, few measurement points may be accessible, or values may change quickly as the system under diagnosis operates. In these cases, a hypothesis/measurement cycle is insufficient. A technique is presented for a hypothesis/test-input/measurement diagnosis cycle. In contrast to generating tests a priori for determining device functionality, it dynamically generates tests in response to current knowledge about fault probabilities. It is shown how the mathematics previously used for measurement specification can be applied to the test input generation process. An example from an efficient implementation called Multi-Purpose Causal (MPC) is presented.

  2. Essential Psychoanalysis: Toward a Re-Appraisal of the Relationship between Psychoanalysis and Dynamic Psychotherapy.

    PubMed

    Sripada, Bhaskar

    2015-09-01

    Freud stated that any line of investigation which recognizes transference and resistance, regardless of its results, was entitled to call itself psychoanalysis (Freud, 1914a, p. 16). Separately he wrote that psychoanalysis was the science of unconscious mental processes (Freud, 1925, p. 70). Combining these two ideas defines Essential Psychoanalysis: Any line of treatment, theory, or science which recognizes the facts of unconscious, transference, or resistance, and takes them as the starting point of its work, regardless of its results, is psychoanalysis. Freud formulated two conflicting definitions of psychoanalysis: Essential Psychoanalysis, applicable to all analysts regardless of their individuality and Extensive Psychoanalysis, modeled on his individuality. They differ in how psychoanalytic technique is viewed. For Essential Psychoanalysis, flexible recommendations constitute psychoanalytic technique, whereas for Extensive Psychoanalysis, rules constitute a key part of psychoanalytic technique.

  3. Processing the Bouguer anomaly map of Biga and the surrounding area by the cellular neural network: application to the southwestern Marmara region

    NASA Astrophysics Data System (ADS)

    Aydogan, D.

    2007-04-01

    An image processing technique called the cellular neural network (CNN) approach is used in this study to locate geological features giving rise to gravity anomalies such as faults or the boundary of two geologic zones. CNN is a stochastic image processing technique based on template optimization using the neighborhood relationships of cells. These cells can be characterized by a functional block diagram that is typical of neural network theory. The functionality of CNN is described in its entirety by a number of small matrices (A, B and I) called the cloning template. CNN can also be considered to be a nonlinear convolution of these matrices. This template describes the strength of the nearest neighbor interconnections in the network. The recurrent perceptron learning algorithm (RPLA) is used in optimization of cloning template. The CNN and standard Canny algorithms were first tested on two sets of synthetic gravity data with the aim of checking the reliability of the proposed approach. The CNN method was compared with classical derivative techniques by applying the cross-correlation method (CC) to the same anomaly map as this latter approach can detect some features that are difficult to identify on the Bouguer anomaly maps. This approach was then applied to the Bouguer anomaly map of Biga and its surrounding area, in Turkey. Structural features in the area between Bandirma, Biga, Yenice and Gonen in the southwest Marmara region are investigated by applying the CNN and CC to the Bouguer anomaly map. Faults identified by these algorithms are generally in accordance with previously mapped surface faults. These examples show that the geologic boundaries can be detected from Bouguer anomaly maps using the cloning template approach. A visual evaluation of the outputs of the CNN and CC approaches is carried out, and the results are compared with each other. This approach provides quantitative solutions based on just a few assumptions, which makes the method more powerful than the classical methods.
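
    The sketch below iterates a discrete-time CNN with the classic edge-extraction templates A, B and I on a synthetic binary image; these templates and the test image are illustrative only and are not the templates optimized with RPLA in the study.

        import numpy as np
        from scipy.signal import convolve2d

        # Discrete-time cellular neural network iteration x' = A*y + B*u + I,
        # with the piecewise-linear output y = 0.5*(|x+1| - |x-1|).
        A = np.array([[0, 0, 0], [0, 2, 0], [0, 0, 0]], float)
        B = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]], float)
        I = -1.0

        u = -np.ones((32, 32))                 # background = -1
        u[8:24, 8:24] = 1.0                    # bright square; its edge is the target

        x = np.zeros_like(u)
        for _ in range(50):                    # iterate the cell dynamics
            y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))
            x = convolve2d(y, A, mode="same") + convolve2d(u, B, mode="same") + I

        edges = 0.5 * (np.abs(x + 1) - np.abs(x - 1)) > 0
        print("edge pixels detected:", int(edges.sum()))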

  4. Topics in the Detection of Gravitational Waves from Compact Binary Inspirals

    NASA Astrophysics Data System (ADS)

    Kapadia, Shasvath Jagat

    Orbiting compact binaries - such as binary black holes, binary neutron stars and neutron star-black hole binaries - are among the most promising sources of gravitational waves observable by ground-based interferometric detectors. Despite numerous sophisticated engineering techniques, the gravitational wave signals will be buried deep within noise generated by various instrumental and environmental processes, and need to be extracted via a signal processing technique referred to as matched filtering. Matched filtering requires large banks of signal templates that are faithful representations of the true gravitational waveforms produced by astrophysical binaries. The accurate and efficient production of templates is thus crucial to the success of signal processing and data analysis. To that end, the dissertation presents a numerical technique that calibrates existing analytical (Post-Newtonian) waveforms, which are relatively inexpensive, to more accurate fiducial waveforms that are computationally expensive to generate. The resulting waveform family is significantly more accurate than the analytical waveforms, without incurring additional computational costs of production. Certain kinds of transient background noise artefacts, called "glitches", can masquerade as gravitational wave signals for short durations and throw off the matched-filter algorithm. Distinguishing glitches from true gravitational wave signals is a highly non-trivial exercise in data analysis which has been attempted with varying degrees of success. We present here a machine-learning based approach that exploits the various attributes of glitches and signals within detector data to provide a classification scheme that is a significant improvement over previous methods. The dissertation concludes by investigating the possibility of detecting a non-linear DC imprint, called the Christodoulou memory, produced in the arms of ground-based interferometers by the recently detected gravitational waves. The memory, which is even smaller in amplitude than the primary (detected) gravitational waves, will almost certainly not be seen in the current detection event. Nevertheless, future space-based detectors will likely be sensitive enough to observe the memory.
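
    A toy matched filter is sketched below, assuming white Gaussian noise and a damped-sinusoid template standing in for an inspiral waveform; amplitudes and sampling are arbitrary.

        import numpy as np

        # Slide a known template through noisy data and locate the correlation peak.
        rng = np.random.default_rng(2)
        n, t0 = 4096, 1500
        t = np.arange(256)
        template = np.sin(2 * np.pi * 0.05 * t) * np.exp(-t / 120.0)
        template /= np.linalg.norm(template)

        data = rng.standard_normal(n)                    # white Gaussian detector noise
        data[t0:t0 + template.size] += 6.0 * template    # buried signal

        snr = np.correlate(data, template, mode="valid")  # matched-filter output
        peak = int(np.argmax(np.abs(snr)))
        print(f"recovered time index {peak} (true {t0}), peak SNR {snr[peak]:.1f}")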

  5. Architecture and settings optimization procedure of a TES frequency domain multiplexed readout firmware

    NASA Astrophysics Data System (ADS)

    Clenet, A.; Ravera, L.; Bertrand, B.; den Hartog, R.; Jackson, B.; van Leeuwen, B.-J.; van Loon, D.; Parot, Y.; Pointecouteau, E.; Sournac, A.

    2014-11-01

    IRAP is developing the readout electronics of the SPICA-SAFARI TES bolometer arrays. Based on the frequency domain multiplexing technique, the readout electronics provides the AC signals to voltage-bias the detectors; it demodulates the data; and it computes a feedback to linearize the detection chain. The feedback is computed with a specific technique, the so-called baseband feedback (BBFB), which ensures that the loop is stable even with long propagation and processing delays (i.e., several μs) and with fast signals (i.e., frequency carriers of the order of 5 MHz). To optimize the power consumption we took advantage of the reduced science signal bandwidth to decouple the signal sampling frequency from the data processing rate. This technique allowed a reduction of the power consumption of the circuit by a factor of 10. Beyond the firmware architecture, the optimization of the instrument concerns the characterization routines and the definition of the optimal parameters. Indeed, to operate a TES array one has to properly define about 21000 parameters. We defined a set of procedures to automatically characterize these parameters and find the optimal settings.

  6. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    NASA Astrophysics Data System (ADS)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on the nodes located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computation of the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.

  7. Laser-induced Forward Transfer of Ag Nanopaste.

    PubMed

    Breckenfeld, Eric; Kim, Heungsoo; Auyeung, Raymond C Y; Piqué, Alberto

    2016-03-31

    Over the past decade, there has been much development of non-lithographic methods(1-3) for printing metallic inks or other functional materials. Many of these processes such as inkjet(3) and laser-induced forward transfer (LIFT)(4) have become increasingly popular as interest in printable electronics and maskless patterning has grown. These additive manufacturing processes are inexpensive, environmentally friendly, and well suited for rapid prototyping, when compared to more traditional semiconductor processing techniques. While most direct-write processes are confined to two-dimensional structures and cannot handle materials with high viscosity (particularly inkjet), LIFT can transcend both constraints if performed properly. Congruent transfer of three dimensional pixels (called voxels), also referred to as laser decal transfer (LDT)(5-9), has recently been demonstrated with the LIFT technique using highly viscous Ag nanopastes to fabricate freestanding interconnects, complex voxel shapes, and high-aspect-ratio structures. In this paper, we demonstrate a simple yet versatile process for fabricating a variety of micro- and macroscale Ag structures. Structures include simple shapes for patterning electrical contacts, bridging and cantilever structures, high-aspect-ratio structures, and single-shot, large area transfers using a commercial digital micromirror device (DMD) chip.

  8. Laser-induced Forward Transfer of Ag Nanopaste

    PubMed Central

    Breckenfeld, Eric; Kim, Heungsoo; Auyeung, Raymond C. Y.; Piqué, Alberto

    2016-01-01

    Over the past decade, there has been much development of non-lithographic methods1-3 for printing metallic inks or other functional materials. Many of these processes such as inkjet3 and laser-induced forward transfer (LIFT)4 have become increasingly popular as interest in printable electronics and maskless patterning has grown. These additive manufacturing processes are inexpensive, environmentally friendly, and well suited for rapid prototyping, when compared to more traditional semiconductor processing techniques. While most direct-write processes are confined to two-dimensional structures and cannot handle materials with high viscosity (particularly inkjet), LIFT can transcend both constraints if performed properly. Congruent transfer of three dimensional pixels (called voxels), also referred to as laser decal transfer (LDT)5-9, has recently been demonstrated with the LIFT technique using highly viscous Ag nanopastes to fabricate freestanding interconnects, complex voxel shapes, and high-aspect-ratio structures. In this paper, we demonstrate a simple yet versatile process for fabricating a variety of micro- and macroscale Ag structures. Structures include simple shapes for patterning electrical contacts, bridging and cantilever structures, high-aspect-ratio structures, and single-shot, large area transfers using a commercial digital micromirror device (DMD) chip. PMID:27077645

  9. Processing of Antenna-Array Signals on the Basis of the Interference Model Including a Rank-Deficient Correlation Matrix

    NASA Astrophysics Data System (ADS)

    Rodionov, A. A.; Turchin, V. I.

    2017-06-01

    We propose a new method of signal processing in antenna arrays, which is called Maximum-Likelihood Signal Classification. The proposed method is based on a model in which the interference includes a component with a rank-deficient correlation matrix. Using numerical simulation, we show that the proposed method ensures that the variance of the estimated arrival angle of the plane wave is close to the Cramér-Rao lower bound, and that it is more efficient than the well-known MUSIC method. It is also shown that the proposed technique can be efficiently used for estimating the time dependence of the useful signal.

  10. Biochips: A fruitful product of solid state physics and molecular biology

    NASA Astrophysics Data System (ADS)

    Mendoza-Alvarez, Julio G.

    1998-08-01

    The application of the standard high-resolution photolithography techniques used in the semiconductor device industry to the growth of a chain of nucleotides with a precise and well-known sequence has made possible the fabrication of a new kind of device, the so-called biochips. At the National Polytechnic Institute in Mexico we have joined a multidisciplinary scientific group, and we are in the process of developing the technical capabilities to set up a processing lab to fabricate biochips focused on very specific applications in the area of cancer detection.

  11. Growth of high quality bulk size single crystals of inverted solubility lithium sulphate monohydrate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silambarasan, A.; Rajesh, P., E-mail: rajeshp@ssn.edu.in; Ramasamy, P.

    2015-06-24

    The paper summarizes the process of growing large lithium sulfate monohydrate (LSMH) single crystals. We have established a procedure to grow high-quality, bulk-size single crystals of inverted-solubility LSMH by a newly developed unidirectional crystallization technique called the Sankaranarayanan-Ramasamy (SR) method. The convective flow in solution crystal growth and the conditions for growing crystals of various aspects are discussed. A good-quality LSMH single crystal of size 20 mm × 80 mm was grown without cracks, localized defects, or inclusions. The as-grown crystals are suitable for piezoelectric and nonlinear optical applications.

  12. Creating a halo traction wheelchair resource manual: using the EBP approach.

    PubMed

    Difazio, Rachel

    2003-04-01

    This article describes a clinically based project that used evidence-based practice (EBP). It follows the EBP process of: (1) identifying a clinical problem and stating a clinical question that focuses the process; (2) doing a literature search for the best research evidence; (3) using query techniques, such as phone calls and e-mails, to determine best clinical practice among similar institutions; and (4) drawing a practice conclusion: to accept the status quo, to instigate a change of practice, or to do more research. This project was an interdisciplinary effort orchestrated by the surgical program nurses at Boston Children's Hospital. Copyright 2003, Elsevier Inc. All rights reserved.

  13. Studies in astronomical time series analysis. IV - Modeling chaotic and random processes with linear filters

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.

    1990-01-01

    While chaos arises only in nonlinear systems, standard linear time series models are nevertheless useful for analyzing data from chaotic processes. This paper introduces such a model, the chaotic moving average. This time-domain model is based on the theorem that any chaotic process can be represented as the convolution of a linear filter with an uncorrelated process called the chaotic innovation. A technique, minimum phase-volume deconvolution, is introduced to estimate the filter and innovation. The algorithm measures the quality of a model using the volume covered by the phase-portrait of the innovation process. Experiments on synthetic data demonstrate that the algorithm accurately recovers the parameters of simple chaotic processes. Though tailored for chaos, the algorithm can detect both chaos and randomness, distinguish them from each other, and separate them if both are present. It can also recover nonminimum-delay pulse shapes in non-Gaussian processes, both random and chaotic.
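
    As a loose illustration, the sketch below builds a "chaotic moving average" by filtering a logistic-map innovation and then recovers the innovation by inverse filtering with the known filter; the filter coefficients are arbitrary, and the minimum phase-volume estimation step itself is not attempted.

        import numpy as np

        # A chaotic "innovation" (centered logistic map) passed through a linear
        # FIR filter, then recovered by exact recursive inverse filtering.
        n = 2000
        x = np.empty(n)
        x[0] = 0.4
        for i in range(1, n):                   # logistic map, fully chaotic at r=4
            x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
        innovation = x - x.mean()

        h = np.array([1.0, 0.5, 0.25])          # illustrative moving-average filter
        observed = np.convolve(innovation, h)[:n]   # the "chaotic moving average"

        # Invert the FIR filter: e[t] = (obs[t] - h1*e[t-1] - h2*e[t-2]) / h0
        recovered = np.zeros(n)
        for t in range(n):
            acc = observed[t]
            for k in range(1, len(h)):
                if t - k >= 0:
                    acc -= h[k] * recovered[t - k]
            recovered[t] = acc / h[0]

        print("max recovery error:", float(np.max(np.abs(recovered - innovation))))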

  14. The endpoint detection technique for deep submicrometer plasma etching

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Du, Zhi-yun; Zeng, Yong; Lan, Zhong-went

    2009-07-01

    The availability of reliable optical sensor technology provides opportunities to better characterize and control plasma etching processes in real time; such sensors can play an important role in endpoint detection, fault diagnostics, process feedback control, and so on. The optical emission spectroscopy (OES) method becomes deficient in the case of deep submicrometer gate etching. In the newly developed high-density inductively coupled plasma (HD-ICP) etching system, interferometry endpoint (IEP) detection is introduced to obtain the endpoint. An IEP fringe-count algorithm is investigated to predict the endpoint, and its signal is then used to control the etch rate and to call the endpoint together with the OES signal in the over-etch (OE) process step. The experimental results show that IEP together with OES provides extra process control margin for advanced devices with thinner gate oxides.

  15. Covert Channels in SIP for VoIP Signalling

    NASA Astrophysics Data System (ADS)

    Mazurczyk, Wojciech; Szczypiorski, Krzysztof

    In this paper, we evaluate available steganographic techniques for SIP (Session Initiation Protocol) that can be used for creating covert channels during the signalling phase of a VoIP (Voice over IP) call. Apart from characterizing existing steganographic methods, we provide new insights by introducing new techniques. We also estimate the amount of data that can be transferred in signalling messages for a typical IP telephony call.

  16. Adaptive correction of ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane

    2017-04-01

    Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site-specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS) and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members; however, the parameters of the regression equation are retrieved by exploiting the second-order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread, based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy. We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO-LEPS ensemble forecasts. Deterministic verification scores (e.g., mean absolute error, bias) and probabilistic scores (e.g., CRPS) are used to evaluate the post-processing techniques. We conclude that the new adaptive method outperforms the simpler running bias correction. The proposed adaptive method often outperforms the MBM method in removing bias. The MBM method has the advantage of correcting the ensemble spread, although it needs more training data.
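
    A stripped-down sequential bias correction is sketched below: a scalar Kalman-filter-like bias state is updated as observations arrive and the same correction is applied to every ensemble member; the noise variances, ensemble size and synthetic data are assumptions, and the scheme is far simpler than the full adaptive method described above.

        import numpy as np

        # Sequential (Kalman-filter-like) bias correction of a toy ensemble.
        rng = np.random.default_rng(3)
        days, members = 200, 16
        truth = 15 + 5 * np.sin(np.linspace(0, 6 * np.pi, days))      # "observations"
        ensemble = truth[:, None] + 2.0 + rng.normal(0, 1.5, (days, members))  # +2 bias

        bias, p = 0.0, 1.0            # bias estimate and its error variance
        q, r = 0.01, 1.0              # process / observation noise variances (assumed)
        corrected = np.empty_like(ensemble)
        for t in range(days):
            corrected[t] = ensemble[t] - bias          # apply the current correction
            innov = ensemble[t].mean() - bias - truth[t]
            p += q
            k = p / (p + r)                            # Kalman gain
            bias += k * innov                          # update bias with today's obs
            p *= (1 - k)

        print("raw  ensemble-mean MAE:", round(np.abs(ensemble.mean(1) - truth).mean(), 2))
        print("corr ensemble-mean MAE:", round(np.abs(corrected.mean(1) - truth).mean(), 2))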

  17. Techniques for Automatically Generating Biographical Summaries from News Articles

    DTIC Science & Technology

    2007-09-01

    non-trivial because of the many NLP areas that must be used to efficiently extract the relevant facts. Yet, no study has been done to determine how... AI) research is called Natural Language Processing (NLP). NLP seeks to find ways for computers to read and write documents in as human a way as

  18. 47 CFR 22.921 - 911 call processing procedures; 911-only calling mode.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... programming in the mobile unit that determines the handling of a non-911 call and permit the call to be... CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.921 911 call processing procedures; 911-only calling mode. Mobile telephones manufactured after February 13, 2000 that are capable of...

  19. Constraint-based Data Mining

    NASA Astrophysics Data System (ADS)

    Boulicaut, Jean-Francois; Jeudy, Baptiste

    Knowledge Discovery in Databases (KDD) is a complex interactive process. The promising theoretical framework of inductive databases considers it to be essentially a querying process, enabled by a query language that can deal either with raw data or with the patterns that hold in the data. Mining patterns turns out to be the so-called inductive query evaluation process, for which constraint-based Data Mining techniques have to be designed. An inductive query specifies declaratively the desired constraints, and algorithms are used to compute the patterns satisfying the constraints in the data. We survey important results of this active research domain. This chapter emphasizes a real breakthrough for hard problems concerning local pattern mining under various constraints, and it points out the current directions of research as well.
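
    As a small illustration of an inductive query, the sketch below mines itemsets under both a minimum-frequency constraint and a syntactic constraint (the pattern must contain a given item) with a naive levelwise search; the transactions and thresholds are made up.

        from itertools import combinations

        # Inductive query: itemsets occurring in >= min_support transactions
        # AND containing the item "a".
        transactions = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c"}, {"a", "b", "c"}]
        items = sorted(set().union(*transactions))
        min_support = 2

        def support(itemset):
            return sum(itemset <= t for t in transactions)

        answers = []
        for size in range(1, len(items) + 1):
            level = [frozenset(c) for c in combinations(items, size)]
            # frequency is anti-monotone: once a level is empty, the search can stop
            level = [s for s in level if support(s) >= min_support]
            answers.extend(s for s in level if "a" in s)   # the syntactic constraint
            if not level:
                break

        for s in answers:
            print(sorted(s), "support =", support(s))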

  20. CO2 Capture Using Electric Fields: Low-Cost Electrochromic Film on Plastic for Net-Zero Energy Building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2010-01-01

    Broad Funding Opportunity Announcement Project: Two faculty members at Lehigh University created a new technique called supercapacitive swing adsorption (SSA) that uses electrical charges to encourage materials to capture and release CO2. Current CO2 capture methods include expensive processes that involve changes in temperature or pressure. Lehigh University’s approach uses electric fields to improve the ability of inexpensive carbon sorbents to trap CO2. Because this process uses electric fields and not electric current, the overall energy consumption is projected to be much lower than conventional methods. Lehigh University is now optimizing the materials to maximize CO2 capture and minimize the energy needed for the process.

  1. Decontamination of Nuclear Liquid Wastes Status of CEA and AREVA R and D: Application to Fukushima Waste Waters - 12312

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fournel, B.; Barre, Y.; Lepeytre, C.

    2012-07-01

    Liquid waste decontamination processes are mainly based on two techniques: bulk processes and the so-called cartridge processes. The first technique has been developed for the French nuclear fuel reprocessing industry since the 60's in Marcoule and La Hague. It is a proven and mature technology which has been successfully and quickly implemented by AREVA at the Fukushima site for the processing of contaminated waters. The second technique, involving cartridge processes, offers new opportunities for the use of innovative adsorbents. The AREVA process developed for Fukushima and some results obtained on site will be presented, as well as laboratory-scale results obtained in CEA laboratories. Examples of new adsorbent developments for liquid waste decontamination are also given. A chemical process unit based on the co-precipitation technique has been successfully and quickly implemented by AREVA at the Fukushima site for the processing of contaminated waters. The asset of this technique is its ability to process large volumes in a continuous mode. Several chemical products can be used to address specific radioelements such as Cs, Sr, and Ru. Its drawback is the production of sludge (about 1% in volume of the initial liquid volume). CEA developed strategies to model the co-precipitation phenomena in order to, firstly, minimize the quantity of added chemical reactants and, secondly, minimize the size of co-precipitation units. We are on the way to designing compact units that could be mobilized very quickly and efficiently in case of an accidental situation. Addressing the problem of sludge conditioning, cementation appears to be a very attractive solution. The Fukushima accident has focused attention on optimizations that should be taken into account in future studies: - to better take into account non-typical aqueous matrices like seawater; - to enlarge the spectrum of radioelements that can be efficiently processed, especially short-lived radioelements that are usually less present in standard effluents resulting from nuclear activities; - to develop reversible solid adsorbents for cartridge-type applications in order to minimize wastes. (authors)

  2. 77 FR 56710 - Proposed Information Collection (Call Center Satisfaction Survey): Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-13

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0744] Proposed Information Collection (Call Center Satisfaction Survey): Comment Request AGENCY: Veterans Benefits Administration, Department of... techniques or the use of other forms of information technology. Title: VBA Call Center Satisfaction Survey...

  3. The DCU: the detector control unit for SPICA-SAFARI

    NASA Astrophysics Data System (ADS)

    Clénet, Antoine; Ravera, Laurent; Bertrand, Bernard; den Hartog, Roland H.; Jackson, Brian D.; van Leeuven, Bert-Joost; van Loon, Dennis; Parot, Yann; Pointecouteau, Etienne; Sournac, Anthony

    2014-08-01

    IRAP is developing the warm electronics, the so-called Detector Control Unit (DCU), in charge of the readout of SPICA-SAFARI's TES-type detectors. The architecture of the electronics used to read out the 3 500 sensors of the 3 focal plane arrays is based on the frequency domain multiplexing technique (FDM). In each of the 24 detection channels, the data of up to 160 pixels are multiplexed in the frequency domain between 1 and 3.3 MHz. The DCU provides the AC signals to voltage-bias the detectors; it demodulates the detector data, which are read out in the cold by a SQUID; and it computes a feedback signal for the SQUID to linearize the detection chain in order to optimize its dynamic range. The feedback is computed with a specific technique, the so-called baseband feedback (BBFB), which ensures that the loop is stable even with long propagation and processing delays (i.e., several µs) and with fast signals (i.e., frequency carriers at 3.3 MHz). This digital signal processing is complex and has to be done at the same time for the 3 500 pixels. It thus requires an optimisation of the power consumption. We took advantage of the relatively reduced science signal bandwidth (i.e., 20 - 40 Hz) to decouple the signal sampling frequency (10 MHz) from the data processing rate. Thanks to this method we managed to reduce the total number of operations per second, and thus the power consumption of the digital processing circuit, by a factor of 10. Moreover, we used time multiplexing techniques to share the resources of the circuit (e.g., a single BBFB module processes 32 pixels). The current version of the firmware is under validation in a Xilinx Virtex 5 FPGA; the final version will be developed in a space-qualified digital ASIC. Beyond the firmware architecture, the optimization of the instrument concerns the characterization routines and the definition of the optimal parameters. Indeed, the operation of the detection and readout chains requires properly defining more than 17 500 parameters (about 5 parameters per pixel). Thus it is mandatory to work out an automatic procedure to set up these optimal values. We defined a fast algorithm which characterizes the phase correction to be applied by the BBFB firmware and the pixel resonance frequencies. We also defined a technique to choose the AC-carrier initial phases in such a way that the amplitude of their sum is minimized (for better use of the DAC dynamic range).
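
    The carrier-phase idea mentioned at the end can be illustrated with a toy search that picks initial phases reducing the peak amplitude of the summed carriers (and hence the required DAC range); the carrier count, frequencies and random-search strategy below are assumptions, not SAFARI values or the actual procedure.

        import numpy as np

        # Crest-factor reduction: choose carrier phases so that the composite
        # multiplexed signal has a smaller peak amplitude.
        rng = np.random.default_rng(4)
        fs, n = 10e6, 8192
        t = np.arange(n) / fs
        freqs = np.linspace(1e6, 3.3e6, 40)                 # 40 illustrative carriers

        def peak(phases):
            s = np.sum(np.sin(2 * np.pi * freqs[:, None] * t + phases[:, None]), axis=0)
            return np.max(np.abs(s))

        zero_phase = peak(np.zeros(len(freqs)))
        best = min(peak(rng.uniform(0, 2 * np.pi, len(freqs))) for _ in range(200))
        print(f"peak with aligned phases            : {zero_phase:.1f}")
        print(f"best peak over 200 random phase sets: {best:.1f}")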

  4. Involving staff pharmacists in management decisions.

    PubMed

    Robinson, L A; Vanderveen, T W

    1977-03-01

    Various administrative techniques used to bring staff pharmacists in a decentralized, satellite pharmacy system into the managerial decision-making process are discussed. These techniques include a staff pharmacist on-call procedure to discourage absenteeism, and the concept of a head pharmacist to serve as a link with departmental administration. The head pharmacist works in the satellite pharmacy, is responsible for its daily operation, and is the spokesman for the satellite. Active roles for the head pharmacist in the selection and evaluation of technicians are outlined. Management skills are developed in head pharmacists through a program of special classes and discussion groups. It is concluded that this program has improved the credibility of administrative decisions and has tapped an underused source of ideas and talent.

  5. Early driver fatigue detection from electroencephalography signals using artificial neural networks.

    PubMed

    King, L M; Nguyen, H T; Lal, S K L

    2006-01-01

    This paper describes a driver fatigue detection system using an artificial neural network (ANN). Using electroencephalogram (EEG) data sampled from 20 professional truck drivers and 35 non-professional drivers, the time-domain data are processed into alpha, beta, delta, and theta bands and then presented to the neural network to detect the onset of driver fatigue. The neural network uses a training optimization technique called the magnified gradient function (MGF). This technique reduces the time required for training by modifying the standard back-propagation (SBP) algorithm. The MGF is shown to classify professional driver fatigue with 81.49% accuracy (80.53% sensitivity, 82.44% specificity) and non-professional driver fatigue with 83.06% accuracy (84.04% sensitivity and 82.08% specificity).

  6. HMMBinder: DNA-Binding Protein Prediction Using HMM Profile Based Features.

    PubMed

    Zaman, Rianon; Chowdhury, Shahana Yasmin; Rashid, Mahmood A; Sharma, Alok; Dehzangi, Abdollah; Shatabda, Swakkhar

    2017-01-01

    DNA-binding proteins often play an important role in various processes within the cell. Over the last decade, a wide range of classification algorithms and feature extraction techniques have been used to solve this problem. In this paper, we propose a novel DNA-binding protein prediction method called HMMBinder. HMMBinder uses monogram and bigram features extracted from the HMM profiles of the protein sequences. To the best of our knowledge, this is the first application of HMM profile based features for the DNA-binding protein prediction problem. We applied Support Vector Machines (SVM) as the classification technique in HMMBinder. Our method was tested on standard benchmark datasets. We experimentally show that our method outperforms the state-of-the-art methods found in the literature.
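
    A sketch of monogram and bigram feature extraction from an HMM profile matrix is given below; the random profile, its dimensions and the normalization are placeholders, and the subsequent SVM training is only mentioned, not implemented.

        import numpy as np

        # Monogram and bigram features from an HMM profile matrix
        # (one row per residue, one column per state/amino acid).
        rng = np.random.default_rng(5)
        seq_len, n_states = 120, 20
        profile = rng.random((seq_len, n_states))           # placeholder HMM profile

        monogram = profile.mean(axis=0)                     # 20 values: column averages
        bigram = (profile[:-1].T @ profile[1:]) / (seq_len - 1)   # 20x20 transition-like
        features = np.concatenate([monogram, bigram.ravel()])     # 420-dim feature vector

        print("feature vector length:", features.size)
        # These features could then be fed to any SVM implementation for
        # DNA-binding vs. non-binding classification.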

  7. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed Central

    Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub, as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation. PMID:12463921

  8. Towards elicitation of users requirements for hospital information system: from a care process modelling technique to a web based collaborative tool.

    PubMed

    Staccini, Pascal M; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius

    2002-01-01

    Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed notably regarding to its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation.

  9. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  10. AI in CALL--Artificially Inflated or Almost Imminent?

    ERIC Educational Resources Information Center

    Schulze, Mathias

    2008-01-01

    The application of techniques from artificial intelligence (AI) to CALL has commonly been referred to as intelligent CALL (ICALL). ICALL is only slightly older than the "CALICO Journal", and this paper looks back at a quarter century of published research mainly in North America and by North American scholars. This "inventory…

  11. Reducing software mass through behavior control. [of planetary roving robots

    NASA Technical Reports Server (NTRS)

    Miller, David P.

    1992-01-01

    Attention is given to the tradeoff between communication and computation as regards a planetary rover (both these subsystems are very power-intensive, and both can be the major driver of the rover's power subsystem, and therefore the minimum mass and size of the rover). Software techniques that can be used to reduce the requirements on both communication and computation, allowing the overall robot mass to be greatly reduced, are discussed. Novel approaches to autonomous control, called behavior control, work in an entirely different way, and for many tasks will yield a similar or superior level of autonomy to traditional control techniques, while greatly reducing the computational demand. Traditional systems have several expensive processes that operate serially, while behavior techniques employ robot capabilities that run in parallel. Traditional systems make extensive world models, while behavior control systems use minimal world models or none at all.

  12. Persistence Mapping Using EUV Solar Imager Data

    NASA Technical Reports Server (NTRS)

    Thompson, B. J.; Young, C. A.

    2016-01-01

    We describe a simple image processing technique that is useful for the visualization and depiction of gradually evolving or intermittent structures in solar physics extreme-ultraviolet imagery. The technique is an application of image segmentation, which we call "Persistence Mapping," to isolate extreme values in a data set, and is particularly useful for the problem of capturing phenomena that are evolving in both space and time. While integration or "time-lapse" imaging uses the full sample (of size N ), Persistence Mapping rejects (N - 1)/N of the data set and identifies the most relevant 1/N values using the following rule: if a pixel reaches an extreme value, it retains that value until that value is exceeded. The simplest examples isolate minima and maxima, but any quantile or statistic can be used. This paper demonstrates how the technique has been used to extract the dynamics in long-term evolution of comet tails, erupting material, and EUV dimming regions.
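
    Because the rule is stated explicitly (a pixel keeps its most extreme value until that value is exceeded), Persistence Mapping amounts to a running maximum (or minimum) along the time axis of an image cube. A minimal sketch, with synthetic data standing in for an EUV image sequence:

      import numpy as np

      def persistence_map(frames, mode="max"):
          """Persistence Mapping over a (time, ny, nx) image cube: each pixel keeps
          the most extreme value it has reached so far in the sequence."""
          op = np.maximum if mode == "max" else np.minimum
          return op.accumulate(frames, axis=0)

      # Synthetic cube standing in for 50 frames of EUV imagery.
      rng = np.random.default_rng(2)
      frames = rng.random((50, 64, 64))
      persisted = persistence_map(frames, mode="max")
      print(persisted.shape)  # the last frame holds the per-pixel maxima of the whole set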

  13. Domain decomposition methods in aerodynamics

    NASA Technical Reports Server (NTRS)

    Venkatakrishnan, V.; Saltz, Joel

    1990-01-01

    Compressible Euler equations are solved for two-dimensional problems by a preconditioned conjugate gradient-like technique. An approximate Riemann solver is used to compute the numerical fluxes to second order accuracy in space. Two ways to achieve parallelism are tested, one which makes use of parallelism inherent in triangular solves and the other which employs domain decomposition techniques. The vectorization/parallelism in triangular solves is realized by the use of a recording technique called wavefront ordering. This process involves the interpretation of the triangular matrix as a directed graph and the analysis of the data dependencies. It is noted that the factorization can also be done in parallel with the wave front ordering. The performances of two ways of partitioning the domain, strips and slabs, are compared. Results on Cray YMP are reported for an inviscid transonic test case. The performances of linear algebra kernels are also reported.
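
    The wavefront-ordering idea can be illustrated in a few lines: treat the lower-triangular factor as a directed graph, assign each row a level one deeper than its deepest dependency, and solve all rows within a level concurrently. A small sketch using dense storage and a toy matrix, purely for illustration:

      import numpy as np

      def wavefronts(L):
          """Group the rows of a lower-triangular matrix into wavefronts: row i
          depends on every row j < i with L[i, j] != 0, and all rows sharing a
          level can be eliminated in parallel during the triangular solve."""
          n = L.shape[0]
          level = np.zeros(n, dtype=int)
          for i in range(n):
              deps = np.flatnonzero(L[i, :i])
              level[i] = 1 + level[deps].max() if deps.size else 0
          return [np.flatnonzero(level == k) for k in range(level.max() + 1)]

      # Toy sparsity pattern (values are placeholders).
      L = np.array([[1., 0., 0., 0.],
                    [0., 1., 0., 0.],
                    [1., 0., 1., 0.],
                    [0., 1., 1., 1.]])
      for k, rows in enumerate(wavefronts(L)):
          print("wavefront", k, "-> rows", rows)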

  14. Railway crossing risk area detection using linear regression and terrain drop compensation techniques.

    PubMed

    Chen, Wen-Yuan; Wang, Mei; Fu, Zhou-Xing

    2014-06-16

    Most railway accidents happen at railway crossings. Therefore, how to detect humans or objects present in the risk area of a railway crossing and thus prevent accidents are important tasks. In this paper, three strategies are used to detect the risk area of a railway crossing: (1) we use a terrain drop compensation (TDC) technique to solve the problem of the concavity of railway crossings; (2) we use a linear regression technique to predict the position and length of an object from image processing; (3) we have developed a novel strategy called calculating local maximum Y-coordinate object points (CLMYOP) to obtain the ground points of the object. In addition, image preprocessing is also applied to filter out the noise and successfully improve the object detection. From the experimental results, it is demonstrated that our scheme is an effective and corrective method for the detection of railway crossing risk areas.

  15. Railway Crossing Risk Area Detection Using Linear Regression and Terrain Drop Compensation Techniques

    PubMed Central

    Chen, Wen-Yuan; Wang, Mei; Fu, Zhou-Xing

    2014-01-01

    Most railway accidents happen at railway crossings. Therefore, how to detect humans or objects present in the risk area of a railway crossing and thus prevent accidents are important tasks. In this paper, three strategies are used to detect the risk area of a railway crossing: (1) we use a terrain drop compensation (TDC) technique to solve the problem of the concavity of railway crossings; (2) we use a linear regression technique to predict the position and length of an object from image processing; (3) we have developed a novel strategy called calculating local maximum Y-coordinate object points (CLMYOP) to obtain the ground points of the object. In addition, image preprocessing is also applied to filter out the noise and successfully improve the object detection. From the experimental results, it is demonstrated that our scheme is an effective and corrective method for the detection of railway crossing risk areas. PMID:24936948

  16. Visual Image Sensor Organ Replacement

    NASA Technical Reports Server (NTRS)

    Maluf, David A.

    2014-01-01

    This innovation is a system that augments human vision through a technique called "Sensing Super-position" using a Visual Instrument Sensory Organ Replacement (VISOR) device. The VISOR device translates visual and other sensors (i.e., thermal) into sounds to enable very difficult sensing tasks. Three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. Because the human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns, the translation of images into sounds reduces the risk of accidentally filtering out important clues. The VISOR device was developed to augment the current state-of-the-art head-mounted (helmet) display systems. It provides the ability to sense beyond the human visible light range, to increase human sensing resolution, to use wider angle visual perception, and to improve the ability to sense distances. It also allows compensation for movement by the human or changes in the scene being viewed.
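
    The record does not give the VISOR mapping itself, so the sketch below only illustrates the general idea of sensing super-position: sweep the image column by column in time, let image rows set tone frequencies, and let pixel brightness set tone amplitudes. The log-spaced frequency range, sweep rate, and sample rate are illustrative assumptions, not the device's actual parameters.

      import numpy as np

      def image_to_audio(img, fs=11025, col_dur=0.05, f_lo=200.0, f_hi=4000.0):
          """Render a 2D brightness map as audio: columns sweep left to right in
          time, rows map to tone frequency, and brightness sets tone amplitude."""
          ny, nx = img.shape
          freqs = np.logspace(np.log10(f_lo), np.log10(f_hi), ny)[::-1]  # top row = high pitch
          t = np.arange(int(fs * col_dur)) / fs
          tones = np.sin(2 * np.pi * freqs[:, None] * t[None, :])
          audio = np.concatenate([img[:, c] @ tones for c in range(nx)])
          return audio / max(np.abs(audio).max(), 1e-12)  # normalize to [-1, 1]

      signal = image_to_audio(np.eye(32))  # a bright diagonal streak as the sensed image
      print(signal.shape)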

  17. DeMAID/GA an Enhanced Design Manager's Aid for Intelligent Decomposition

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1996-01-01

    Many companies are looking for new tools and techniques to aid a design manager in making decisions that can reduce the time and cost of a design cycle. One tool is the Design Manager's Aid for Intelligent Decomposition (DeMAID). Since the initial public release of DeMAID in 1989, much research has been done in the areas of decomposition, concurrent engineering, parallel processing, and process management; many new tools and techniques have emerged. Based on these recent research and development efforts, numerous enhancements have been added to DeMAID to further aid the design manager in saving both cost and time in a design cycle. The key enhancement, a genetic algorithm (GA), will be available in the next public release called DeMAID/GA. The GA sequences the design processes to minimize the cost and time in converging a solution. The major enhancements in the upgrade of DeMAID to DeMAID/GA are discussed in this paper. A sample conceptual design project is used to show how these enhancements can be applied to improve the design cycle.
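
    The abstract describes the GA's role (sequencing design processes to reduce costly iteration) without giving its operators, so the sketch below is only a stripped-down evolutionary search over orderings of a small, randomly generated dependency matrix: the fitness counts couplings that point backwards in the sequence, and the only operator is a swap mutation; none of this is DeMAID/GA's actual implementation.

      import numpy as np

      rng = np.random.default_rng(3)
      N = 8
      dep = (rng.random((N, N)) < 0.3).astype(int)  # dep[i, j] = 1: process i needs j's output
      np.fill_diagonal(dep, 0)

      def feedbacks(order):
          """Couplings that point backwards in the sequence (costly iteration loops)."""
          pos = np.argsort(order)  # pos[k] = position of process k in the sequence
          return sum(1 for i in range(N) for j in range(N) if dep[i, j] and pos[j] > pos[i])

      def evolve(pop_size=40, generations=200, mut_rate=0.5):
          pop = [rng.permutation(N) for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=feedbacks)
              survivors = pop[: pop_size // 2]          # keep the best half
              children = []
              for parent in survivors:                  # swap-mutated copies
                  child = parent.copy()
                  if rng.random() < mut_rate:
                      a, b = rng.choice(N, size=2, replace=False)
                      child[a], child[b] = child[b], child[a]
                  children.append(child)
              pop = survivors + children
          return min(pop, key=feedbacks)

      best = evolve()
      print("sequence:", best, "remaining feedback couplings:", feedbacks(best))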

  18. An Evaluation of Understandability of Patient Journey Models in Mental Health.

    PubMed

    Percival, Jennifer; McGregor, Carolyn

    2016-07-28

    There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.

  19. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for internet of things (IoT) based applications has inadvertently forced a move towards higher-complexity integrated circuits supporting SoC designs. Such a sharp increase in complexity calls for far more sophisticated validation strategies, and it has pushed researchers to develop a variety of methodologies to overcome this problem; in essence, this has brought about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs in the infancy of the verification process of an SoC in order to reduce the time consumed and achieve a fast time to market for the system. In this paper we therefore focus on the verification methodology that can be applied at the register transfer level (RTL) of an SoC based on the AMBA bus design. In addition, a verification method called the Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as an effort towards a fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, avoiding bottlenecks in the validation platform.

  20. An improved numerical method to compute neutron/gamma deexcitation cascades starting from a high spin state

    DOE PAGES

    Regnier, D.; Litaize, O.; Serot, O.

    2015-12-23

    Numerous nuclear processes involve the deexcitation of a compound nucleus through the emission of several neutrons, gamma-rays and/or conversion electrons. The characteristics of such a deexcitation are commonly derived from a total statistical framework often called the "Hauser–Feshbach" method. In this work, we highlight a numerical limitation of this kind of method in the case of the deexcitation of a high spin initial state. To circumvent this issue, an improved technique called the Fluctuating Structure Properties (FSP) method is presented. Two FSP algorithms are derived and benchmarked on the calculation of the total radiative width for a thermal neutron capture on 238U. We compare the standard method with these FSP algorithms for the prediction of particle multiplicities in the deexcitation of a high spin level of 143Ba. The gamma multiplicity turns out to be very sensitive to the numerical method. The bias between the two techniques can reach 1.5 γ/cascade. Lastly, the uncertainty of these calculations coming from the lack of knowledge on nuclear structure is estimated via the FSP method.

  1. Temporal abstraction and inductive logic programming for arrhythmia recognition from electrocardiograms.

    PubMed

    Carrault, G; Cordier, M-O; Quiniou, R; Wang, F

    2003-07-01

    This paper proposes a novel approach to cardiac arrhythmia recognition from electrocardiograms (ECGs). ECGs record the electrical activity of the heart and are used to diagnose many heart disorders. The numerical ECG is first temporally abstracted into series of time-stamped events. Temporal abstraction makes use of artificial neural networks to extract interesting waves and their features from the input signals. A temporal reasoner called a chronicle recogniser processes such series in order to discover temporal patterns called chronicles which can be related to cardiac arrhythmias. Generally, it is difficult to elicit an accurate set of chronicles from a doctor. Thus, we propose to learn automatically from symbolic ECG examples the chronicles discriminating the arrhythmias belonging to some specific subset. Since temporal relationships are of major importance, inductive logic programming (ILP) is the tool of choice as it enables first-order relational learning. The approach has been evaluated on real ECGs taken from the MIT-BIH database. The performance of the different modules as well as the efficiency of the whole system is presented. The results are rather good and demonstrate that integrating numerical techniques for low level perception and symbolic techniques for high level classification is very valuable.

  2. Celebrating variability and a call to limit systematisation: the example of the Behaviour Change Technique Taxonomy and the Behaviour Change Wheel.

    PubMed

    Ogden, Jane

    2016-09-01

    Within any discipline there is always a degree of variability. For medicine it takes the form of Health Professional's behaviour, for education it's the style and content of the classroom, and for health psychology, it can be found in patient's behaviour, the theories used and clinical practice. Over recent years, attempts have been made to reduce this variability through the use of the Behaviour Change Technique Taxonomy, the COM-B and the Behaviour Change Wheel. This paper argues that although the call for better descriptions of what is done is useful for clarity and replication, this systematisation may be neither feasible nor desirable. In particular, it is suggested that the gaps inherent in the translational process from coding a protocol to behaviour will limit the effectiveness of reducing patient variability, that theory variability is necessary for the health and well-being of a discipline and that practice variability is central to the professional status of our practitioners. It is therefore argued that we should celebrate rather than remove this variability in order for our discipline to thrive and for us to remain as professionals rather than as technicians.

  3. Processing of semen by density gradient centrifugation selects spermatozoa with longer telomeres for assisted reproduction techniques.

    PubMed

    Yang, Qingling; Zhang, Nan; Zhao, Feifei; Zhao, Wanli; Dai, Shanjun; Liu, Jinhao; Bukhari, Ihtisham; Xin, Hang; Niu, Wenbing; Sun, Yingpu

    2015-07-01

    The ends of eukaryotic chromosomes contain specialized chromatin structures called telomeres, the length of which plays a key role in early human embryonic development. Although the effect of sperm preparation techniques on major sperm characteristics, such as concentration, motility and morphology have been previously documented, the possible status of telomere length and its relation with sperm preparation techniques is not well-known for humans. The aim of this study was to investigate the role of density gradient centrifugation in the selection of spermatozoa with longer telomeres for use in assisted reproduction techniques in 105 samples before and after sperm processing. After density gradient centrifugation, the average telomere length of the sperm was significantly longer (6.51 ± 2.54 versus 5.16 ± 2.29, P < 0.01), the average motile sperm rate was significantly higher (77.9 ± 11.8 versus 44.6 ± 11.2, P < 0.01), but average DNA fragmentation rate was significantly lower (11.1 ± 5.9 versus 25.9 ± 12.9, P < 0.01) compared with raw semen. Additionally, telomere length was positively correlated with semen sperm count (rs = 0.58; P < 0.01). In conclusion, density gradient centrifugation is a useful technique for selection of sperm with longer telomeres. Copyright © 2015 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  4. Development of lead zirconate titanate cantilevers on the micrometer length scale

    NASA Astrophysics Data System (ADS)

    Martin, Christopher Robert

    The objective of this research project was to fabricate a functional ferroelectric microcantilever from patterned lead zirconate titanate (PZT) thin films. Cantilevers fabricated from ferroelectric materials have tremendous potential in sensing applications, particularly due to the increased sensitivity that miniaturized devices offer. This thesis highlights and explores a number of the processing issues that hindered the production of a working prototype. PZT is patterned using soft lithography-inspired techniques from a PZT chemical precursor solution derived by the chelation synthesis route. As the ability to pattern ceramic materials derived from sol-gels on the micrometer scale is a relatively new technology, this thesis aims to expand the scientific understanding of new issues that arise when working with these patterned films. For example, the use of Micromolding in Capillaries (MIMIC) to pattern the PZT thin films results in the evolution of topographical distortions from the shape of the original mold as the patterned thin film shrinks during drying and sintering. The factors that contribute to this effect have been explained and a new processing technique called MicroChannel Molding (muCM) was developed. This new process combines the advantages of soft lithography with traditional silicon microfabrication techniques to ensure compatibility with current industrial practices. This work lays the foundation for the future production of working ferroelectric microcantilevers. The proposed microfabrication process is described along with descriptions of each processing difficulty that was encountered. Modifications to the process are proposed along with the descriptions of alternative processing techniques that were attempted for the benefit of future researchers. This dissertation concludes with the electronic characterization of micropatterned PZT thin films. To our knowledge, the ferroelectric properties of patterned PZT thin films have never been directly characterized before. The properties are measured with a commercial ferroelectric test system connected through a conductive Atomic Force Microscope tip. The films patterned by MIMIC and muCM are compared to large-area spin cast films to identify the role that the processing method has on the resulting properties.

  5. Real-time access of large volume imagery through low-bandwidth links

    NASA Astrophysics Data System (ADS)

    Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew

    2010-04-01

    Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge. This challenge is compounded through rapid increases in sensor collection volumes, both with larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed a system called AGILE (Advanced Geospatial Imagery Library Enterprise) Access as an innovative approach based on standard off-the-shelf techniques to solving this problem. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. Our solution system is implemented in an accredited, deployable form, incorporating a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system to disadvantaged users as demonstrated in real-world operational environments.

  6. Exploiting Software Tool Towards Easier Use And Higher Efficiency

    NASA Astrophysics Data System (ADS)

    Lin, G. H.; Su, J. T.; Deng, Y. Y.

    2006-08-01

    In developing countries, making maximum use of data from instruments built in-house is very important. This relates not only to maximizing the science return on earlier investment -- deep accumulations in every aspect -- but also to science output. Based on this idea, we are developing a software package (called THDP: Tool of Huairou Data Processing). It is used to handle a series of issues that necessarily arise when processing the data. This paper discusses its design purpose, functions, methods and specialities. The primary vehicle for general data interpretation is through various techniques of data visualization and interaction. In the software we employed an object-oriented approach, which is appropriate to this vehicle; it is imperative that the approach provides not only the required functions, but does so in as convenient a fashion as possible. As a result, the software not only makes data processing easier to learn for beginners and further improvement more convenient for experienced users, but also greatly increases efficiency in every phase, including analysis, parameter adjustment and result display. Under the framework of the virtual observatory, developing countries should study more new related technologies of this kind, which can advance the ability and efficiency of scientific research, like the software we are developing.

  7. High Harmonic Generation XUV Spectroscopy for Studying Ultrafast Photophysics of Coordination Complexes

    NASA Astrophysics Data System (ADS)

    Ryland, Elizabeth S.; Lin, Ming-Fu; Benke, Kristin; Verkamp, Max A.; Zhang, Kaili; Vura-Weis, Josh

    2017-06-01

    Extreme ultraviolet (XUV) spectroscopy is an inner shell technique that probes the M_{2,3}-edge excitation of atoms. Absorption of the XUV photon causes a 3p→3d transition, the energy and shape of which is directly related to the element and ligand environment. This technique is thus element-, oxidation state-, spin state-, and ligand field specific. A process called high-harmonic generation (HHG) enables the production of ultrashort (~20 fs) pulses of collimated XUV photons in a tabletop instrument. This allows transient XUV spectroscopy to be conducted as an in-lab experiment, where it was previously only possible at accelerator-based light sources. Additionally, ultrashort pulses provide the capability for unprecedented time resolution (~50 fs IRF). This technique has the capacity to serve a pivotal role in the study of electron and energy transfer processes in materials and chemical biology. I will present the XUV transient absorption instrument we have built, along with ultrafast transient M_{2,3}-edge absorption data of a series of small inorganic molecules in order to demonstrate the high specificity and time resolution of this tabletop technique as well as how our group is applying it to the study of ultrafast electronic dynamics of coordination complexes.

  8. Holographic femtosecond laser processing and its application to biological materials (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Hayasaki, Yoshio

    2017-02-01

    Femtosecond laser processing is a promising tool for fabricating novel and useful structures on the surfaces of and inside materials. An enormous number of pulse irradiation points will be required for fabricating actual structures with millimeter scale, and therefore, the throughput of femtosecond laser processing must be improved for practical adoption of this technique. One promising method to improve throughput is parallel pulse generation based on a computer-generated hologram (CGH) displayed on a spatial light modulator (SLM), a technique called holographic femtosecond laser processing. The holographic method has the advantages such as high throughput, high light use efficiency, and variable, instantaneous, and 3D patterning. Furthermore, the use of an SLM gives an ability to correct unknown imperfections of the optical system and inhomogeneity in a sample using in-system optimization of the CGH. Furthermore, the CGH can adaptively compensate in response to dynamic unpredictable mechanical movements, air and liquid disturbances, a shape variation and deformation of the target sample, as well as adaptive wavefront control for environmental changes. Therefore, it is a powerful tool for the fabrication of biological cells and tissues, because they have free form, variable, and deformable structures. In this paper, we present the principle and the experimental setup of holographic femtosecond laser processing, and the effective way for processing the biological sample. We demonstrate the femtosecond laser processing of biological materials and the processing properties.

  9. Input-output relationship in social communications characterized by spike train analysis

    NASA Astrophysics Data System (ADS)

    Aoki, Takaaki; Takaguchi, Taro; Kobayashi, Ryota; Lambiotte, Renaud

    2016-10-01

    We study the dynamical properties of human communication through different channels, i.e., short messages, phone calls, and emails, adopting techniques from neuronal spike train analysis in order to characterize the temporal fluctuations of successive interevent times. We first measure the so-called local variation (LV) of incoming and outgoing event sequences of users and find that these in- and out-LV values are positively correlated for short messages and uncorrelated for phone calls and emails. Second, we analyze the response-time distribution after receiving a message to focus on the input-output relationship in each of these channels. We find that the time scales and amplitudes of response differ between the three channels. To understand the effects of the response-time distribution on the correlations between the LV values, we develop a point process model whose activity rate is modulated by incoming and outgoing events. Numerical simulations of the model indicate that a quick response to incoming events and a refractory effect after outgoing events are key factors to reproduce the positive LV correlations.
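
    The local variation statistic used here comes from neuronal spike-train analysis: for successive interevent intervals T_i it is LV = 3/(n-1) * sum_i ((T_i - T_{i+1}) / (T_i + T_{i+1}))^2, roughly 1 for Poisson-like sequences, below 1 for regular ones, and above 1 for bursty ones. A minimal sketch on synthetic event sequences (the paper's own event data are not reproduced here):

      import numpy as np

      def local_variation(event_times):
          """LV = 3/(n-1) * sum(((T_i - T_{i+1}) / (T_i + T_{i+1}))**2) over the
          successive interevent intervals T_i of a single event sequence."""
          T = np.diff(np.sort(np.asarray(event_times, dtype=float)))
          num, den = T[:-1] - T[1:], T[:-1] + T[1:]
          return 3.0 * np.mean((num / den) ** 2)

      # Synthetic sequences: regular, Poisson-like, and bursty event times.
      rng = np.random.default_rng(4)
      regular = np.arange(0.0, 100.0, 1.0)
      poisson = np.cumsum(rng.exponential(1.0, 100))
      bursty = np.cumsum(rng.exponential(rng.choice([0.1, 3.0], 100)))
      for name, seq in [("regular", regular), ("poisson", poisson), ("bursty", bursty)]:
          print(name, round(local_variation(seq), 2))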

  10. "I experienced freedom within the frame of my own narrative": The contribution of psychodrama techniques to experiential learning in teacher training

    NASA Astrophysics Data System (ADS)

    ter Avest, Ina

    2017-02-01

    To prepare Dutch students in education for critical situations in their professional life as a teacher, part of their training is to ask them to reflect upon their own experiences in their life as a child, a pupil and a student - experiences of crucial moments or with significant others which are still of the utmost importance to them. This article underlines the significance of so-called "experiential learning" in student career counselling. In this context, experiential learning is understood as an extension of in-depth reflection on critical incidents and critical persons in the biography of pre-service teachers. This reflection - customary and effective in Dutch teacher training - is a verbal process. However, this technique does not seem to be adequate for many students from other cultural backgrounds (e.g. second-generation descendants of migrant workers). By consequence, some of these students are not able to take newly offered information on board, but remain imprisoned in their own culture-related narrative, their own ethnic society of mind. Research has shown that for these students, psychodrama techniques, focusing on non-verbal and playful aspects of reflection, seem to be more suitable. The author of this article presents a sample case from a pilot study which used one of the psychodrama techniques called the empty chair. The findings of the pilot study are promising in the sense that experiencing different I-positions does seem to help students from other cultural backgrounds to develop agency in responding to hitherto unfamiliar and confusing situations.

  11. Tomographic Aperture-Encoded Particle Tracking Velocimetry: A New Approach to Volumetric PIV

    NASA Astrophysics Data System (ADS)

    Troolin, Dan; Boomsma, Aaron; Lai, Wing; Pothos, Stamatios; Fluid Mechanics Research Instruments Team

    2016-11-01

    Volumetric velocity fields are useful in a wide variety of fluid mechanics applications. Several types of three-dimensional imaging methods have been used in the past to varying degrees of success, for example, 3D PTV (Maas et al., 1993), DDPIV (Peireira et al., 2006), Tomographic PIV (Elsinga, 2006), and V3V (Troolin and Longmire, 2009), among others. Each of these techniques has shown advantages and disadvantages in different areas. With the advent of higher resolution and lower noise cameras with higher stability levels, new techniques are emerging that combine the advantages of the existing techniques. This talk describes a new technique called Tomographic Aperture-Encoded Particle Tracking Velocimetry (TAPTV), in which segmented triangulation and diameter tolerance are used to achieve three-dimensional particle tracking with extremely high particle densities (on the order of ppp = 0.2 or higher) without the drawbacks normally associated with ghost particles (for example in TomoPIV). The results are highly spatially-resolved data with very fast processing times. A detailed explanation of the technique as well as plots, movies, and experimental considerations will be discussed.

  12. Applying data mining techniques to medical time series: an empirical case study in electroencephalography and stabilometry.

    PubMed

    Anguera, A; Barreiro, J M; Lara, J A; Lizcano, D

    2016-01-01

    One of the major challenges in the medical domain today is how to exploit the huge amount of data that this field generates. To do this, approaches are required that are capable of discovering knowledge that is useful for decision making in the medical field. Time series are data types that are common in the medical domain and require specialized analysis techniques and tools, especially if the information of interest to specialists is concentrated within particular time series regions, known as events. This research followed the steps specified by the so-called knowledge discovery in databases (KDD) process to discover knowledge from medical time series derived from stabilometric (396 series) and electroencephalographic (200) patient electronic health records (EHR). The view offered in the paper is based on the experience gathered as part of the VIIP project. Knowledge discovery in medical time series has a number of difficulties and implications that are highlighted by illustrating the application of several techniques that cover the entire KDD process through two case studies. This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques.

  13. Large Terrain Continuous Level of Detail 3D Visualization Tool

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    This software solved the problem of displaying terrains that are usually too large to be displayed on standard workstations in real time. The software can visualize terrain data sets composed of billions of vertices, and can display these data sets at greater than 30 frames per second. The Large Terrain Continuous Level of Detail 3D Visualization Tool allows large terrains, which can be composed of billions of vertices, to be visualized in real time. It utilizes a continuous level of detail technique called clipmapping to support this. It offloads much of the work involved in breaking up the terrain into levels of details onto the GPU (graphics processing unit) for faster processing.

  14. Space-filling designs for computer experiments: A review

    DOE PAGES

    Joseph, V. Roshan

    2016-01-29

    Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given for a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
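
    The maximum projection (MaxPro) criterion penalizes pairs of design points that come close together in any projection onto a subset of factors: it sums 1 / prod_k (x_ik - x_jk)^2 over all point pairs, smaller being better. The sketch below only evaluates that criterion and keeps the best of many random Latin hypercubes; the published MaxPro designs use a dedicated optimizer rather than this crude search.

      import numpy as np

      rng = np.random.default_rng(5)

      def random_lhs(n, p):
          """Random Latin hypercube sample of n points in [0, 1]^p."""
          return (np.argsort(rng.random((n, p)), axis=0) + rng.random((n, p))) / n

      def maxpro_criterion(X):
          """Sum over point pairs of 1 / prod_k (x_ik - x_jk)^2 (smaller is better);
          it penalizes points that collapse together in any projection of factors."""
          total = 0.0
          for i in range(len(X) - 1):
              gaps2 = (X[i] - X[i + 1:]) ** 2
              total += np.sum(1.0 / np.prod(gaps2, axis=1))
          return total

      # Crude construction: keep the best of many random Latin hypercubes.
      candidates = [random_lhs(20, 3) for _ in range(500)]
      best = min(candidates, key=maxpro_criterion)
      print("best criterion value:", maxpro_criterion(best))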

  15. Space-filling designs for computer experiments: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan

    Improving the quality of a product/process using a computer simulator is a much less expensive option than the real physical testing. However, simulation using computationally intensive computer models can be time consuming and therefore, directly doing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used for overcoming this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, a special emphasis is given for a recently developed space-filling design called maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.

  16. Evolutionary learning processes as the foundation for behaviour change.

    PubMed

    Crutzen, Rik; Peters, Gjalt-Jorn Ygram

    2018-03-01

    We argue that the active ingredients of behaviour change interventions, often called behaviour change methods (BCMs) or techniques (BCTs), can usefully be placed on a dimension of psychological aggregation. We introduce evolutionary learning processes (ELPs) as fundamental building blocks that are on a lower level of psychological aggregation than BCMs/BCTs. A better understanding of ELPs is useful to select the appropriate BCMs/BCTs to target determinants of behaviour, or vice versa, to identify potential determinants targeted by a given BCM/BCT, and to optimally translate them into practical applications. Using these insights during intervention development may increase the likelihood of developing effective interventions - both in terms of behaviour change as well as maintenance of behaviour change.

  17. Elucidating rhizosphere processes by mass spectrometry - A review.

    PubMed

    Rugova, Ariana; Puschenreiter, Markus; Koellensperger, Gunda; Hann, Stephan

    2017-03-01

    The presented review discusses state-of-the-art mass spectrometric methods, which have been developed and applied for investigation of chemical processes in the soil-root interface, the so-called rhizosphere. Rhizosphere soil's physical and chemical characteristics are to a great extent influenced by a complex mixture of compounds released from plant roots, i.e. root exudates, which have a high impact on nutrient and trace element dynamics in the soil-root interface as well as on microbial activities or soil physico-chemical characteristics. Chemical characterization as well as accurate quantification of the compounds present in the rhizosphere is a major prerequisite for a better understanding of rhizosphere processes and requires the development and application of advanced sampling procedures in combination with highly selective and sensitive analytical techniques. During the last years, targeted and non-targeted mass spectrometry-based methods have emerged and their combination with specific separation methods for various elements and compounds of a wide polarity range have been successfully applied in several studies. With this review we critically discuss the work that has been conducted within the last decade in the context of rhizosphere research and elemental or molecular mass spectrometry emphasizing different separation techniques as GC, LC and CE. Moreover, selected applications such as metal detoxification or nutrient acquisition will be discussed regarding the mass spectrometric techniques applied in studies of root exudates in plant-bacteria interactions. Additionally, a more recent isotope probing technique as novel mass spectrometry based application is highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. A comparison in Colorado of three methods to monitor breeding amphibians

    USGS Publications Warehouse

    Corn, P.S.; Muths, E.; Iko, W.M.

    2000-01-01

    We surveyed amphibians at 4 montane and 2 plains lentic sites in northern Colorado using 3 techniques: standardized call surveys, automated recording devices (frog-loggers), and intensive surveys including capture-recapture techniques. Amphibians were observed at 5 sites. Species richness varied from 0 to 4 species at each site. Richness scores, the sums of species richness among sites, were similar among methods: 8 for call surveys, 10 for frog-loggers, and 11 for intensive surveys (9 if the non-vocal salamander Ambystoma tigrinum is excluded). The frog-logger at 1 site recorded Spea bombifrons which was not active during the times when call and intensive surveys were conducted. Relative abundance scores from call surveys failed to reflect a relatively large population of Bufo woodhousii at 1 site and only weakly differentiated among different-sized populations of Pseudacris maculata at 3 other sites. For extensive applications, call surveys have the lowest costs and fewest requirements for highly trained personnel. However, for a variety of reasons, call surveys cannot be used with equal effectiveness in all parts of North America.

  19. PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems

    NASA Astrophysics Data System (ADS)

    da Silva, Glauco; Netto Lahoz, Carlos Henrique

    2013-09-01

    This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. Some techniques and tools that support traditional dependability analysis are also presented, and the concept of knowledge discovery and intelligent databases for critical computer systems is briefly discussed. After that, the PRO-ELICERE process is introduced, an intelligent approach to automating ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of the projects of the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).

  20. GNSS Radio Occultation Observations as a data source for Ionospheric Assimilation: COSMIC-1 & COSMIC-2

    NASA Astrophysics Data System (ADS)

    Yue, X.; Schreiner, W. S.; Kuo, Y. H.

    2014-12-01

    Since the pioneering GPS/MET mission, the low Earth orbit (LEO) based global navigation satellite system (GNSS) radio occultation (RO) technique has been a powerful tool for ionosphere monitoring. Since then, many LEO satellites have been launched with RO payloads, including CHAMP, GRACE, SAC-C/D, COSMIC, C/NOFS, Metop-A/B, and TerraSAR-X/TanDEM-X. COSMIC was the first constellation of satellites dedicated primarily to RO and delivering RO data in near real time. Currently, UCAR CDAAC processes most of these missions' RO data for the community. Owing to the success of the COSMIC mission, a follow-on mission called COSMIC-2 will be launched in 2016 and 2018. The COSMIC-2 RO data volume will be 4-6 times that of COSMIC because of the doubled number of satellites and GNSS signals. In this paper we describe: (1) data processing and quality in UCAR/CDAAC; (2) ionospheric data assimilation results based on COSMIC data; (3) an OSSE study for COSMIC-2.

  1. A modular positron camera for the study of industrial processes

    NASA Astrophysics Data System (ADS)

    Leadbeater, T. W.; Parker, D. J.

    2011-10-01

    Positron imaging techniques rely on the detection of the back-to-back annihilation photons arising from positron decay within the system under study. A standard technique, called positron emitting particle tracking (PEPT) [1], uses a number of these detected events to rapidly determine the position of a positron emitting tracer particle introduced into the system under study. Typical applications of PEPT are in the study of granular and multi-phase materials in the disciplines of engineering and the physical sciences. Using components from redundant medical PET scanners a modular positron camera has been developed. This camera consists of a number of small independent detector modules, which can be arranged in custom geometries tailored towards the application in question. The flexibility of the modular camera geometry allows for high photon detection efficiency within specific regions of interest, the ability to study large and bulky systems and the application of PEPT to difficult or remote processes as the camera is inherently transportable.
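
    The record does not spell out the PEPT location algorithm, so the sketch below shows one common formulation of the location step: find the point minimizing the summed squared distance to the detected coincidence lines, then repeatedly discard the lines lying farthest from it (assumed corrupted by scatter) and refit. The geometry, noise levels, and discard fraction are illustrative assumptions.

      import numpy as np

      def closest_point(points, dirs):
          """Least-squares point minimizing the summed squared distance to a set of
          3D lines, each given by a point on the line and a unit direction."""
          A, b = np.zeros((3, 3)), np.zeros(3)
          for a, d in zip(points, dirs):
              P = np.eye(3) - np.outer(d, d)   # projector perpendicular to the line
              A += P
              b += P @ a
          return np.linalg.solve(A, b)

      def line_distances(x, points, dirs):
          r = points - x
          along = np.sum(r * dirs, axis=1)[:, None] * dirs
          return np.linalg.norm(r - along, axis=1)

      def pept_locate(points, dirs, keep=0.5, rounds=4):
          """Iterative location step: fit, then discard the lines farthest from the
          current estimate (assumed corrupted by scatter) and refit."""
          for _ in range(rounds):
              x = closest_point(points, dirs)
              nearest = np.argsort(line_distances(x, points, dirs))
              nearest = nearest[: max(3, int(keep * len(points)))]
              points, dirs = points[nearest], dirs[nearest]
          return closest_point(points, dirs)

      # Synthetic event set: lines passing near a true tracer position plus outliers.
      rng = np.random.default_rng(6)
      true_pos = np.array([10.0, -5.0, 2.0])
      dirs = rng.normal(size=(200, 3))
      dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
      points = true_pos + rng.normal(scale=0.5, size=(200, 3))
      points[:20] += rng.normal(scale=50.0, size=(20, 3))   # corrupted (scattered) events
      print("estimated tracer position:", pept_locate(points, dirs))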

  2. A critical review on tablet disintegration.

    PubMed

    Quodbach, Julian; Kleinebudde, Peter

    2016-09-01

    Tablet disintegration is an important factor for drug release and can be modified with excipients called tablet disintegrants. Tablet disintegrants act via different mechanisms and the efficacy of these excipients is influenced by various factors. In this review, the existing literature on tablet disintegration is critically reviewed. Potential disintegration mechanisms, as well as impact factors on the disintegration process will be discussed based on experimental evidence. Search terms for Scopus and Web of Science included "tablet disintegration", "mechanism tablet disintegration", "superdisintegrants", "disintegrants", "swelling force", "disintegration force", "disintegration mechanisms", as well as brand names of commonly applied superdisintegrants. References of identified papers were screened as well. Experimental data supports swelling and shape recovery as main mechanisms of action of disintegrants. Other tablet excipients and different manufacturing techniques greatly influence the disintegration process. The use of different excipients, experimental setups and manufacturing techniques, as well as the demand for original research led to a distinct patchwork of knowledge. Broader, more systematic approaches are necessary not only to structure the past but also future findings.

  3. Principal components technique analysis for vegetation and land use discrimination. [Brazilian cerrados

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Formaggio, A. R.; Dossantos, J. R.; Dias, L. A. V.

    1984-01-01

    An automatic pre-processing technique called Principal Components (PRINCO) was evaluated for analyzing LANDSAT digitized data on land use and vegetation cover in the Brazilian cerrados. The chosen pilot area, 223/67 of MSS/LANDSAT 3, was classified on a GE Image-100 System through a maximum-likelihood algorithm (MAXVER). The same procedure was applied to the PRINCO-treated image. PRINCO consists of a linear transformation performed on the original bands in order to eliminate the information redundancy of the LANDSAT channels. After PRINCO, only two channels were used, thus reducing computer effort. The grey levels of the original channels and of the PRINCO channels for the five identified classes (grassland, "cerrado", burned areas, anthropic areas, and gallery forest) were obtained through the MAXVER algorithm. This algorithm also presented the average performance for both cases. In order to evaluate the results, the Jeffreys-Matusita distance (JM-distance) between classes was computed. The classification matrix, obtained through MAXVER after PRINCO pre-processing, showed approximately the same average performance in class separability.
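
    A minimal sketch of the two-stage scheme described above, using synthetic 4-band pixels as a stand-in for the MSS data, scikit-learn's PCA for the PRINCO transformation, and a Gaussian quadratic discriminant as a stand-in for the MAXVER maximum-likelihood classifier; the class means and noise levels are placeholders.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

      # Synthetic stand-in for labelled 4-band MSS pixels from training fields.
      rng = np.random.default_rng(7)
      classes = ["grassland", "cerrado", "burned", "anthropic", "gallery forest"]
      means = rng.uniform(30, 120, size=(len(classes), 4))        # per-class band means
      X = np.vstack([m + rng.normal(scale=8, size=(200, 4)) for m in means])
      y = np.repeat(np.arange(len(classes)), 200)

      # PRINCO step: decorrelate the four bands and keep two principal components.
      pca = PCA(n_components=2).fit(X)
      Z = pca.transform(X)
      print("variance retained by 2 components:", pca.explained_variance_ratio_.sum())

      # MAXVER stand-in: Gaussian maximum-likelihood classification on the two channels.
      maxver = QuadraticDiscriminantAnalysis().fit(Z, y)
      print("training accuracy on PRINCO channels:", maxver.score(Z, y))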

  4. Semiconductor photoelectrochemistry

    NASA Technical Reports Server (NTRS)

    Buoncristiani, A. M.; Byvik, C. E.

    1983-01-01

    Semiconductor photoelectrochemical reactions are investigated. A model of the charge transport processes in the semiconductor, based on semiconductor device theory, is presented. It incorporates the nonlinear processes characterizing the diffusion and reaction of charge carriers in the semiconductor. The model is used to study conditions limiting useful energy conversion, specifically the saturation of current flow due to high light intensity. Numerical results describing charge distributions in the semiconductor and their effects on the electrolyte are obtained. Experimental results include: an estimate of the rate at which a semiconductor photoelectrode is capable of converting electromagnetic energy into chemical energy; the effect of cell temperature on the efficiency; a method for determining the point of zero zeta potential for macroscopic semiconductor samples; a technique using platinized titanium dioxide powders and ultraviolet radiation to produce chlorine, bromine, and iodine from solutions containing their respective ions; the photoelectrochemical properties of a class of layered compounds called transition metal thiophosphates; and a technique used to produce high conversion efficiency from laser radiation to chemical energy.

  5. Noncausal telemetry data recovery techniques

    NASA Technical Reports Server (NTRS)

    Tsou, H.; Lee, R.; Mileant, A.; Hinedi, S.

    1995-01-01

    Cost efficiency is becoming a major driver in future space missions. Because of the constraints on total cost, including design, implementation, and operation, future spacecraft are limited in terms of their size power and complexity. Consequently, it is expected that future missions will operate on marginal space-to-ground communication links that, in turn, can pose an additional risk on the successful scientific data return of these missions. For low data-rate and low downlink-margin missions, the buffering of the telemetry signal for further signal processing to improve data return is a possible strategy; it has been adopted for the Galileo S-band mission. This article describes techniques used for postprocessing of buffered telemetry signal segments (called gaps) to recover data lost during acquisition and resynchronization. Two methods, one for a closed-loop and the other one for an open-loop configuration, are discussed in this article. Both of them can be used in either forward or backward processing of signal segments, depending on where a gap is specifically situated in a pass.

  6. QoS measurement of workflow-based web service compositions using Colored Petri net.

    PubMed

    Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service composition (WB-WSC) is one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the WB-WSC category. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide good quality, with respect to customers' requirements, for a composed web service. Thus quality of service (QoS), which refers to non-functional parameters, needs to be measured so that the quality level of a given web service composition can be established. This paper sought a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.

  7. A series connection architecture for large-area organic photovoltaic modules with a 7.5% module efficiency

    PubMed Central

    Hong, Soonil; Kang, Hongkyu; Kim, Geunjin; Lee, Seongyu; Kim, Seok; Lee, Jong-Hoon; Lee, Jinho; Yi, Minjin; Kim, Junghwan; Back, Hyungcheol; Kim, Jae-Ryoung; Lee, Kwanghee

    2016-01-01

    The fabrication of organic photovoltaic modules via printing techniques has been the greatest challenge for their commercial manufacture. Current module architecture, which is based on a monolithic geometry consisting of serially interconnecting stripe-patterned subcells with finite widths, requires highly sophisticated patterning processes that significantly increase the complexity of printing production lines and cause serious reductions in module efficiency due to so-called aperture loss in series connection regions. Herein we demonstrate an innovative module structure that can simultaneously reduce both patterning processes and aperture loss. By using a charge recombination feature that occurs at contacts between electron- and hole-transport layers, we devise a series connection method that facilitates module fabrication without patterning the charge transport layers. With the successive deposition of component layers using slot-die and doctor-blade printing techniques, we achieve a high module efficiency reaching 7.5% with area of 4.15 cm2. PMID:26728507

  8. Analytical simulation and PROFAT II: a new methodology and a computer automated tool for fault tree analysis in chemical process industries.

    PubMed

    Khan, F I; Abbasi, S A

    2000-07-10

    Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named as AS-II), which makes the application of FTA simpler, quicker, and cheaper; thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on the methodology we have developed a computer-automated tool. The details are presented in this paper.
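
    Assuming independent base events, probability propagation through a fault tree reduces to two rules: an AND-gate multiplies its children's probabilities, and an OR-gate yields 1 - prod(1 - p_i). A minimal sketch on an illustrative tree; the event names and probabilities are placeholders, not taken from PROFAT II.

      def gate_probability(node):
          """Probability of a fault-tree node assuming independent base events:
          AND-gates multiply child probabilities, OR-gates combine them as
          1 - prod(1 - p_i)."""
          if "prob" in node:                      # base (initiating) event
              return node["prob"]
          child_p = [gate_probability(c) for c in node["children"]]
          if node["gate"] == "AND":
              p = 1.0
              for q in child_p:
                  p *= q
              return p
          p_none = 1.0                            # OR-gate
          for q in child_p:
              p_none *= 1.0 - q
          return 1.0 - p_none

      # Illustrative tree: the top event needs both a pump failure (seal leak OR
      # power fault) and a failed interlock.
      tree = {"gate": "AND", "children": [
          {"gate": "OR", "children": [{"prob": 0.02}, {"prob": 0.01}]},
          {"prob": 0.05},
      ]}
      print("top event probability:", gate_probability(tree))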

  9. Novel Online Diagnostic Analysis for In-Flight Particle Properties in Cold Spraying

    NASA Astrophysics Data System (ADS)

    Koivuluoto, Heli; Matikainen, Ville; Larjo, Jussi; Vuoristo, Petri

    2018-02-01

    In cold spraying, powder particles are accelerated by preheated supersonic gas stream to high velocities and sprayed on a substrate. The particle velocities depend on the equipment design and process parameters, e.g., on the type of the process gas and its pressure and temperature. These, in turn, affect the coating structure and the properties. The particle velocities in cold spraying are high, and the particle temperatures are low, which can, therefore, be a challenge for the diagnostic methods. A novel optical online diagnostic system, HiWatch HR, will open new possibilities for measuring particle in-flight properties in cold spray processes. The system employs an imaging measurement technique called S-PTV (sizing-particle tracking velocimetry), first introduced in this research. This technique enables an accurate particle size measurement also for small diameter particles with a large powder volume. The aim of this study was to evaluate the velocities of metallic particles sprayed with HPCS and LPCS systems and with varying process parameters. The measured in-flight particle properties were further linked to the resulting coating properties. Furthermore, the camera was able to provide information about variations during the spraying, e.g., fluctuating powder feeding, which is important from the process control and quality control point of view.

  10. Particle image velocimetry for the Surface Tension Driven Convection Experiment using a particle displacement tracking technique

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Pline, Alexander D.

    1991-01-01

    The Surface Tension Driven Convection Experiment (STDCE) is a Space Transportation System flight experiment to study both transient and steady thermocapillary fluid flows aboard the USML-1 Spacelab mission planned for 1992. One of the components of data collected during the experiment is a video record of the flow field. This qualitative data is then quantified using an all electronic, two-dimensional particle image velocimetry technique called particle displacement tracking (PDT) which uses a simple space domain particle tracking algorithm. The PDT system is successful in producing velocity vector fields from the raw video data. Application of the PDT technique to a sample data set yielded 1606 vectors in 30 seconds of processing time. A bottom viewing optical arrangement is used to image the illuminated plane, which causes keystone distortion in the final recorded image. A coordinate transformation was incorporated into the system software to correct this viewing angle distortion. PDT processing produced 1.8 percent false identifications, due to random particle locations. A highly successful routine for removing the false identifications was also incorporated, reducing the number of false identifications to 0.2 percent.
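
    The abstract does not spell out the tracking algorithm, but a minimal sketch of a space-domain particle displacement tracking step (nearest-neighbour pairing of particle centroids between consecutive frames) might look like the following; the function name and search radius are illustrative assumptions:

```python
# Illustrative nearest-neighbour pairing of particle centroids between two
# video frames, in the spirit of space-domain particle displacement tracking.
# Not the STDCE/PDT code; parameters are assumptions for the example.
import numpy as np
from scipy.spatial import cKDTree

def track_displacements(frame_a, frame_b, max_disp=5.0):
    """frame_a, frame_b: (N, 2) arrays of particle centroids in pixels.
    Returns matched start positions and displacement vectors."""
    tree = cKDTree(frame_b)
    dist, idx = tree.query(frame_a, distance_upper_bound=max_disp)
    valid = np.isfinite(dist)          # unmatched points come back with dist = inf
    start = frame_a[valid]
    end = frame_b[idx[valid]]
    return start, end - start          # displacements; divide by frame interval for velocity

a = np.array([[10.0, 12.0], [40.0, 41.0]])
b = np.array([[11.5, 12.5], [42.0, 43.5]])
pos, disp = track_displacements(a, b)
print(disp)
```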

  11. CARES: Completely Automated Robust Edge Snapper for carotid ultrasound IMT measurement on a multi-institutional database of 300 images: a two stage system combining an intensity-based feature approach with first order absolute moments

    NASA Astrophysics Data System (ADS)

    Molinari, Filippo; Acharya, Rajendra; Zeng, Guang; Suri, Jasjit S.

    2011-03-01

    The carotid intima-media thickness (IMT) is the most widely used marker for the progression of atherosclerosis and the onset of cardiovascular disease. Computer-aided measurements improve accuracy, but usually require user interaction. In this paper we characterized a new and completely automated technique for carotid segmentation and IMT measurement based on the merits of two previously developed techniques. We used an integrated approach of intelligent image feature extraction and line fitting for automatically locating the carotid artery in the image frame, followed by wall interface extraction based on a Gaussian edge operator. We called our system CARES. We validated CARES on a multi-institutional database of 300 carotid ultrasound images. IMT measurement bias was 0.032 +/- 0.141 mm, better than other automated techniques and comparable to that of user-driven methodologies. Our novel approach processed 96% of the images, yielding a figure of merit of 95.7%. CARES ensured complete automation and high accuracy in IMT measurement; hence it could be a suitable clinical tool for processing of large datasets in multicenter studies involving atherosclerosis.

  12. Particle image velocimetry for the surface tension driven convection experiment using a particle displacement tracking technique

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Pline, Alexander D.

    1991-01-01

    The Surface Tension Driven Convection Experiment (STDCE) is a Space Transportation System flight experiment to study both transient and steady thermocapillary fluid flows aboard the USML-1 Spacelab mission planned for 1992. One of the components of data collected during the experiment is a video record of the flow field. This qualitative data is then quantified using an all electronic, two-dimensional particle image velocimetry technique called particle displacement tracking (PDT) which uses a simple space domain particle tracking algorithm. The PDT system is successful in producing velocity vector fields from the raw video data. Application of the PDT technique to a sample data set yielded 1606 vectors in 30 seconds of processing time. A bottom viewing optical arrangement is used to image the illuminated plane, which causes keystone distortion in the final recorded image. A coordinate transformation was incorporated into the system software to correct this viewing angle distortion. PDT processing produced 1.8 percent false identifications, due to random particle locations. A highly successful routine for removing the false identifications was also incorporated, reducing the number of false identifications to 0.2 percent.

  13. Interactive, graphical processing unit-based evaluation of evacuation scenarios at the state scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perumalla, Kalyan S; Aaby, Brandon G; Yoginath, Srikanth B

    2011-01-01

    In large-scale scenarios, transportation modeling and simulation is severely constrained by simulation time. For example, few real-time simulators scale to evacuation traffic scenarios at the level of an entire state, such as Louisiana (approximately 1 million links) or Florida (2.5 million links). New simulation approaches are needed to overcome severe computational demands of conventional (microscopic or mesoscopic) modeling techniques. Here, a new modeling and execution methodology is explored that holds the potential to provide a tradeoff among the level of behavioral detail, the scale of transportation network, and real-time execution capabilities. A novel, field-based modeling technique and its implementation on graphical processing units are presented. Although additional research with input from domain experts is needed for refining and validating the models, the techniques reported here afford interactive experience at very large scales of multi-million road segments. Illustrative experiments on a few state-scale networks are described based on an implementation of this approach in a software system called GARFIELD. Current modeling capabilities and implementation limitations are described, along with possible use cases and future research.

  14. Optoelectronic image processing for cervical cancer screening

    NASA Astrophysics Data System (ADS)

    Narayanswamy, Ramkumar; Sharpe, John P.; Johnson, Kristina M.

    1994-05-01

    Automation of the Pap-smear cervical screening method is highly desirable as it relieves tedium for the human operators, reduces cost and should increase accuracy and provide repeatability. We present here the design for a high-throughput optoelectronic system which forms the first stage of a two stage system to automate pap-smear screening. We use a mathematical morphological technique called the hit-or-miss transform to identify the suspicious areas on a pap-smear slide. This algorithm is implemented using a VanderLugt architecture and a time-sequential ANDing smart pixel array.
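
    For readers unfamiliar with the hit-or-miss transform, a purely digital toy example is shown below (the structuring elements are arbitrary assumptions; the paper implements the same morphological operation optically with a VanderLugt correlator):

```python
# Toy digital hit-or-miss transform: find isolated single pixels in a binary image.
import numpy as np
from scipy.ndimage import binary_hit_or_miss

image = np.zeros((7, 7), dtype=int)
image[1, 1] = 1                # an isolated bright pixel (the "suspicious" object)
image[4:7, 4:7] = 1            # a larger blob that should not match

hit = np.array([[0, 0, 0],
                [0, 1, 0],
                [0, 0, 0]])    # foreground pattern: a single pixel
miss = np.array([[1, 1, 1],
                 [1, 0, 1],
                 [1, 1, 1]])   # background pattern: all eight neighbours empty

detected = binary_hit_or_miss(image, structure1=hit, structure2=miss)
print(np.argwhere(detected))   # -> [[1 1]]: only the isolated pixel matches
```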

  15. Protein blotting protocol for beginners.

    PubMed

    Petrasovits, Lars A

    2014-01-01

    The transfer and immobilization of biological macromolecules onto solid nitrocellulose or nylon (polyvinylidene difluoride (PVDF)) membranes subsequently followed by specific detection is referred to as blotting. DNA blots are called Southerns after the inventor of the technique, Edwin Southern. By analogy, RNA blots are referred to as northerns and protein blots as westerns (Burnette, Anal Biochem 112:195-203, 1981). With few exceptions, western blotting involves five steps, namely, sample collection, preparation, separation, immobilization, and detection. In this chapter, protocols for the entire process from sample collection to detection are described.

  16. Chemotaxonomic identification of single bacteria by micro-Raman spectroscopy: application to clean-room-relevant biological contaminations.

    PubMed

    Rösch, Petra; Harz, Michaela; Schmitt, Michael; Peschke, Klaus-Dieter; Ronneberger, Olaf; Burkhardt, Hans; Motzkus, Hans-Walter; Lankers, Markus; Hofer, Stefan; Thiele, Hans; Popp, Jürgen

    2005-03-01

    Microorganisms, such as bacteria, which might be present as contamination inside an industrial food or pharmaceutical clean room process need to be identified on short time scales in order to minimize possible health hazards as well as production downtimes causing financial deficits. Here we describe the first results of single-particle micro-Raman measurements in combination with a classification method, the so-called support vector machine technique, allowing for a fast, reliable, and nondestructive online identification method for single bacteria.
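
    A compact sketch of how a support vector machine might be trained on pre-processed Raman spectra (the synthetic feature matrix, labels, and kernel settings below are illustrative assumptions, not the authors' setup):

```python
# Hypothetical SVM classification of single-cell Raman spectra.
# X would hold one baseline-corrected, normalised spectrum per row;
# y holds the corresponding species labels. Both are synthetic here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 600))          # 120 spectra x 600 wavenumber bins (synthetic)
y = rng.integers(0, 3, size=120)         # 3 mock bacterial species

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```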

  17. Chemotaxonomic Identification of Single Bacteria by Micro-Raman Spectroscopy: Application to Clean-Room-Relevant Biological Contaminations

    PubMed Central

    Rösch, Petra; Harz, Michaela; Schmitt, Michael; Peschke, Klaus-Dieter; Ronneberger, Olaf; Burkhardt, Hans; Motzkus, Hans-Walter; Lankers, Markus; Hofer, Stefan; Thiele, Hans; Popp, Jürgen

    2005-01-01

    Microorganisms, such as bacteria, which might be present as contamination inside an industrial food or pharmaceutical clean room process need to be identified on short time scales in order to minimize possible health hazards as well as production downtimes causing financial deficits. Here we describe the first results of single-particle micro-Raman measurements in combination with a classification method, the so-called support vector machine technique, allowing for a fast, reliable, and nondestructive online identification method for single bacteria. PMID:15746368

  18. Self-mapping in treating suicide ideation: a case study.

    PubMed

    Robertson, Lloyd Hawkeye

    2011-03-01

    This case study traces the development and use of a self-mapping exercise in the treatment of a youth who had been at risk for re-attempting suicide. A life skills exercise was modified to identify units of culture called memes from which a map of the youth's self was prepared. A successful treatment plan followed the mapping exercise. The process of self-map construction is presented along with an interpretive analysis. It is suggested that therapists from a range of perspectives could use this technique in assessment and treatment.

  19. Advanced sensor-simulation capability

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Kalman, Linda S.; Keller, Robert A.

    1990-09-01

    This paper provides an overview of an advanced simulation capability currently in use for analyzing visible and infrared sensor systems. The software system, called VISTAS (VISIBLE/INFRARED SENSOR TRADES, ANALYSES, AND SIMULATIONS) combines classical image processing techniques with detailed sensor models to produce static and time dependent simulations of a variety of sensor systems including imaging, tracking, and point target detection systems. Systems modelled to date include space-based scanning line-array sensors as well as staring 2-dimensional array sensors which can be used for either imaging or point source detection.

  20. Adaptive strategy for joint measurements

    NASA Astrophysics Data System (ADS)

    Uola, Roope; Luoma, Kimmo; Moroder, Tobias; Heinosaari, Teiko

    2016-08-01

    We develop a technique to find simultaneous measurements for noisy quantum observables in finite-dimensional Hilbert spaces. We use the method to derive lower bounds for the noise needed to make incompatible measurements jointly measurable. Using our strategy together with recent developments in the field of one-sided quantum information processing, we show that the attained lower bounds are tight for various symmetric sets of quantum measurements. We use this characterisation to prove the existence of so-called 4-Specker sets, i.e., sets of four incompatible observables with compatible subsets in the qubit case.
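
    For context (a standard result rather than one from this paper), the best-known pairwise criterion of this kind is Busch's condition for two noisy, unbiased qubit observables with Bloch vectors \eta\vec{a} and \eta\vec{b} (|\vec{a}| = |\vec{b}| = 1, noise parameter \eta): they are jointly measurable exactly when

```latex
\left\lVert \eta\vec{a} + \eta\vec{b} \right\rVert
+ \left\lVert \eta\vec{a} - \eta\vec{b} \right\rVert \le 2 .
```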

  1. Alternatives in the complement and structure of NASA teleprocessing resources

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results are presented of a program to identify technical innovations which would have an impact on NASA data processing and describe as fully as possible the development work necessary to exploit them. Seven of these options for NASA development, as the opportunities to participate in and enhance the advancing information system technology were called, are reported. A detailed treatment is given of three of the options, involving minicomputers, mass storage devices and software development techniques. These areas were picked by NASA as having the most potential for improving their operations.

  2. Application of nuclear analytical techniques using long-life sealed-tube neutron generators.

    PubMed

    Bach, P; Cluzeau, S; Lambermont, C

    1994-01-01

    The new range of sealed-tube neutron generators developed by SODERN appears to be appropriate for the industrial environment. The main characteristics are the high emission stability during the very long lifetime of the tube, flexible pulsed mode capability, safety in operation with no radiation in "off" state, and the easy transportation of equipment. Some applications of the neutron generators, called GENIE, are considered: high-sensitivity measurement of transuranic elements in nuclear waste drums, bulk material analysis for process control, and determination of the airborne pollutants for environmental monitoring.

  3. A technique for determining cloud free versus cloud contaminated pixels in satellite imagery

    NASA Technical Reports Server (NTRS)

    Wohlman, Richard A.

    1994-01-01

    Weather forecasting has been called the second oldest profession. To do so accurately and with some consistency requires an ability to understand the processes which create the clouds, drive the winds, and produce the ever-changing atmospheric conditions. Measurement of basic parameters such as temperature, water vapor content, pressure, wind speed and wind direction throughout the three-dimensional atmosphere forms the foundation upon which a modern forecast is created. Doppler radar and space-borne remote sensing have provided forecasters with new tools with which to ply their trade.

  4. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    NASA Astrophysics Data System (ADS)

    Yang, Kuojun; Tian, Shulin; Zeng, Hao; Qiu, Lei; Guo, Lianping

    2014-04-01

    In traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and oscilloscope is blind to the input signal. Thus, this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in shorter time, dead time in traditional DSO that causes the loss of measured signal needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, three-dimensional waveform mapping (TWM) technique, which converts sampled data to displayed waveform, is proposed. With this technique, not only the process speed is improved, but also the probability information of waveform is displayed with different brightness. Thus, a three-dimensional waveform is shown to the user. To reduce processing time further, parallel TWM which processes several sampled points simultaneously, and dual-port random access memory based pipelining technique which can process one sampling point in one clock period are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) are used for storing sampled data alternately, thus the acquisition can continue during data processing. Therefore, the dead time of DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope and a combined pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experiment results show that the WCR of the designed oscilloscope is 6 250 000 wfms/s (waveforms per second), the highest value in all existing oscilloscopes. The testing results also prove that there is no dead time in our oscilloscope, thus realizing the seamless acquisition.
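
    A minimal software analogue of the three-dimensional waveform mapping idea, accumulating many acquisitions into a time-voltage histogram whose hit counts drive display brightness (array sizes and code are illustrative assumptions; the paper's implementation is in hardware):

```python
# Illustrative waveform-to-display mapping: each acquired record is binned into
# a (time, voltage) grid; the accumulated hit count plays the role of brightness.
import numpy as np

def accumulate_waveforms(records, n_cols=500, n_rows=256, vmin=-1.5, vmax=1.5):
    """records: iterable of 1-D sample arrays (one record per trigger)."""
    display = np.zeros((n_rows, n_cols), dtype=np.uint32)
    for samples in records:
        cols = np.linspace(0, n_cols - 1, num=len(samples)).astype(int)
        rows = np.clip(((samples - vmin) / (vmax - vmin) * (n_rows - 1)).astype(int),
                       0, n_rows - 1)
        np.add.at(display, (rows, cols), 1)     # accumulate hit counts per cell
    return display

t = np.linspace(0, 1, 1000)
records = [np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.randn(t.size) for _ in range(200)]
brightness = accumulate_waveforms(records)
print(brightness.max(), brightness.sum())
```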

  5. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Kuojun, E-mail: kuojunyang@gmail.com; Guo, Lianping; School of Electrical and Electronic Engineering, Nanyang Technological University

    In traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and oscilloscope is blind to the input signal. Thus, this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in shorter time, dead time in traditional DSO that causes the loss of measured signal needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, three-dimensional waveform mapping (TWM) technique, which converts sampled data to displayed waveform, is proposed. With this technique, not only the process speed is improved, but also the probability information of waveform is displayed with different brightness. Thus, a three-dimensional waveform is shown to the user. To reduce processing time further, parallel TWM which processes several sampled points simultaneously, and dual-port random access memory based pipelining technique which can process one sampling point in one clock period are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) are used for storing sampled data alternately, thus the acquisition can continue during data processing. Therefore, the dead time of DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope and a combined pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experiment results show that the WCR of the designed oscilloscope is 6 250 000 wfms/s (waveforms per second), the highest value in all existing oscilloscopes. The testing results also prove that there is no dead time in our oscilloscope, thus realizing the seamless acquisition.

  6. Application driven interface generation for EASIE. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Kao, Ya-Chen

    1992-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a user interface and a set of utility programs which support the rapid integration and execution of analysis programs about a central relational database. EASIE provides users with two basic modes of execution. One of them is a menu-driven execution mode, called Application-Driven Execution (ADE), which provides sufficient guidance to review data, select a menu action item, and execute an application program. The other mode of execution, called Complete Control Execution (CCE), provides an extended executive interface which allows in-depth control of the design process. Currently, the EASIE system is based on alphanumeric techniques only. It is the purpose of this project to extend the flexibility of the EASIE system in the ADE mode by implementing it in a window system. Secondly, a set of utilities will be developed to assist the experienced engineer in the generation of an ADE application.

  7. Second Thoughts on Educational Innovation and New Faculty

    NASA Astrophysics Data System (ADS)

    Ringwald, F. A.

    2003-05-01

    New Math. Teaching machines. Programmed instruction. Whole Language. Educational television. Self-esteem. Process writing. Writing about "feelings." We have seen many educational innovations in recent years. Now, it's Peer Instruction, also called Active Learning, also called Learner-Centered Teaching. Fine, I say: after all, what I do is active and learner-centered, too. But is the introduction to new techniques really what new faculty need? Wouldn't the effort be better spent helping them with making their teaching better, as opposed to merely innovative? There are many things they need to know that almost no one is telling them, the most important of which is that they are not alone when dealing with the special problems of today's students. New faculty also need to be active and productive researchers, who are increasingly expected to involve students in research. This talk will examine these and other issues for new faculty.

  8. Mapping Base Modifications in DNA by Transverse-Current Sequencing

    NASA Astrophysics Data System (ADS)

    Alvarez, Jose R.; Skachkov, Dmitry; Massey, Steven E.; Kalitsov, Alan; Velev, Julian P.

    2018-02-01

    Sequencing DNA modifications and lesions, such as methylation of cytosine and oxidation of guanine, is even more important and challenging than sequencing the genome itself. The traditional methods for detecting DNA modifications are either insensitive to these modifications or require additional processing steps to identify a particular type of modification. Transverse-current sequencing in nanopores can potentially identify the canonical bases and base modifications in the same run. In this work, we demonstrate that the most common DNA epigenetic modifications and lesions can be detected with any predefined accuracy based on their tunneling current signature. Our results are based on simulations of the nanopore tunneling current through DNA molecules, calculated using nonequilibrium electron-transport methodology within an effective multiorbital model derived from first-principles calculations, followed by a base-calling algorithm accounting for neighbor current-current correlations. This methodology can be integrated with existing experimental techniques to improve base-calling fidelity.

  9. Deconvolution of the relaxations associated with local and segmental motions in poly(methacrylate)s containing dichlorinated benzyl moieties in the ester residue.

    PubMed

    Dominguez-Espinosa, Gustavo; Díaz-Calleja, Ricardo; Riande, Evaristo; Gargallo, Ligia; Radic, Deodato

    2005-09-15

    The relaxation behavior of poly(2,3-dichlorobenzyl methacrylate) is studied by broadband dielectric spectroscopy in the frequency range of 10^(-1)-10^(9) Hz and temperature interval of 303-423 K. The isotherms representing the dielectric loss of the glassy polymer in the frequency domain present a single absorption, called the beta process. At temperatures close to Tg, the dynamical alpha relaxation already overlaps with the beta process, the degree of overlapping increasing with temperature. The deconvolution of the alpha and beta relaxations is facilitated using the retardation spectra calculated from the isotherms utilizing linear programming regularization parameter techniques. The temperature dependence of the beta relaxation presents a crossover associated with a change in activation energy of the local processes. The distance between the alpha and beta peaks, expressed as log(fmax,beta/fmax,alpha) where fmax is the frequency at the peak maximum, follows Arrhenius behavior in the temperature range of 310-384 K. Above 384 K, the distance between the peaks remains nearly constant and, as a result, the alpha-beta onset temperature exhibited by many polymers is not reached in this system. The fraction of relaxation carried out through the alpha process, without beta assistance, is larger than 60% in the temperature range of 310-384 K where the so-called Williams ansatz holds.
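
    The Arrhenius behaviour referred to above can be written out explicitly (a standard relation restated for clarity, not a result quoted from the paper):

```latex
f_{\max} = f_0 \exp\!\left(-\frac{E_a}{k_B T}\right)
\quad\Rightarrow\quad
\log_{10}\!\frac{f_{\max,\beta}}{f_{\max,\alpha}}
  = \log_{10}\!\frac{f_{0,\beta}}{f_{0,\alpha}}
    - \frac{E_{a,\beta} - E_{a,\alpha}}{k_B T \ln 10},
```

    so the peak separation is linear in 1/T as long as both processes keep (locally) Arrhenius-like temperature dependences.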

  10. The Next Frontier: Quantitative Biochemistry in Living Cells.

    PubMed

    Honigmann, Alf; Nadler, André

    2018-01-09

    Researchers striving to convert biology into an exact science foremost rely on structural biology and biochemical reconstitution approaches to obtain quantitative data. However, cell biological research is moving at an ever-accelerating speed into areas where these approaches lose much of their edge. Intrinsically unstructured proteins and biochemical interaction networks composed of interchangeable, multivalent, and unspecific interactions pose unique challenges to quantitative biology, as do processes that occur in discrete cellular microenvironments. Here we argue that a conceptual change in our way of conducting biochemical experiments is required to take on these new challenges. We propose that reconstitution of cellular processes in vitro should be much more focused on mimicking the cellular environment in vivo, an approach that requires detailed knowledge of the material properties of cellular compartments, essentially requiring a material science of the cell. In a similar vein, we suggest that quantitative biochemical experiments in vitro should be accompanied by corresponding experiments in vivo, as many newly relevant cellular processes are highly context-dependent. In essence, this constitutes a call for chemical biologists to convert their discipline from a proof-of-principle science to an area that could rightfully be called quantitative biochemistry in living cells. In this essay, we discuss novel techniques and experimental strategies with regard to their potential to fulfill such ambitious aims.

  11. CD process control through machine learning

    NASA Astrophysics Data System (ADS)

    Utzny, Clemens

    2016-10-01

    For the specific requirements of the 14 nm and 20 nm site applications, a new CD map approach was developed at the AMTC. This approach relies on a well-established machine learning technique called recursive partitioning. Recursive partitioning is a powerful technique which creates a decision tree by successively testing whether the quantity of interest can be explained by one of the supplied covariates. The test performed is generally a statistical test with a pre-supplied significance level. Once the test indicates a significant association between the variable of interest and a covariate, a split is performed at the threshold value which minimizes the variation within the newly obtained groups. This partitioning is repeated recursively until either no significant association can be detected or the resulting subgroup size falls below a pre-supplied level.
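
    A compact, hypothetical sketch of the recursive partitioning loop described above (significance test, then a variance-minimising split, then recursion); the AMTC implementation and its statistical details are not given in the abstract, so everything below is an illustrative assumption:

```python
# Illustrative recursive partitioning: at each node, test each covariate for
# significant association with the response; if one passes, split at the
# threshold that minimises the within-group variance, then recurse.
import numpy as np
from scipy import stats

def partition(X, y, names, alpha=0.05, min_size=20, depth=0):
    if len(y) < 2 * min_size:
        return
    # pick the covariate with the most significant (Pearson) association
    pvals = [stats.pearsonr(X[:, j], y)[1] for j in range(X.shape[1])]
    j = int(np.argmin(pvals))
    if pvals[j] >= alpha:
        return                                  # stop: nothing significant
    # choose the split threshold minimising summed within-group variance
    candidates = np.unique(X[:, j])[1:-1]
    def cost(t):
        left, right = y[X[:, j] <= t], y[X[:, j] > t]
        return left.var() * left.size + right.var() * right.size
    t_best = min(candidates, key=cost)
    print("  " * depth + f"split on {names[j]} at {t_best:.3g} (p={pvals[j]:.2g})")
    mask = X[:, j] <= t_best
    partition(X[mask], y[mask], names, alpha, min_size, depth + 1)
    partition(X[~mask], y[~mask], names, alpha, min_size, depth + 1)

rng = np.random.default_rng(1)
X = rng.uniform(size=(400, 3))                  # hypothetical covariates, e.g. dose, focus, position
y = 50 + 4.0 * (X[:, 0] > 0.5) + rng.normal(scale=0.5, size=400)   # synthetic CD values
partition(X, y, names=["dose", "focus", "position"])
```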

  12. Maize embryogenesis.

    PubMed

    Fontanet, Pilar; Vicient, Carlos M

    2008-01-01

    Plant embryo development is a complex process that includes several coordinated events. Mature maize embryos consist of a well-differentiated embryonic axis surrounded by a single massive cotyledon called the scutellum. The mature embryo axis also includes lateral roots and several developed leaves. In contrast to Arabidopsis, in which the orientations of cell divisions are precisely established, only the first planes of cell division are predictable in maize embryos. These distinctive characteristics, together with the availability of a large collection of embryo mutants, well-developed molecular biology and tissue culture tools, established genetics, and its economic importance, make maize a good model plant for grass embryogenesis. Here, we describe basic concepts and techniques necessary for studying maize embryo development: how to grow maize in greenhouses and basic techniques for in vitro embryo culture, somatic embryogenesis and in situ hybridization.

  13. Linear increases in carbon nanotube density through multiple transfer technique.

    PubMed

    Shulaker, Max M; Wei, Hai; Patil, Nishant; Provine, J; Chen, Hong-Yu; Wong, H-S P; Mitra, Subhasish

    2011-05-11

    We present a technique to increase carbon nanotube (CNT) density beyond the as-grown CNT density. We perform multiple transfers, whereby we transfer CNTs from several growth wafers onto the same target surface, thereby linearly increasing CNT density on the target substrate. This process, called transfer of nanotubes through multiple sacrificial layers, is highly scalable, and we demonstrate linear CNT density scaling up to 5 transfers. We also demonstrate that this linear CNT density increase results in an ideal linear increase in drain-source currents of carbon nanotube field effect transistors (CNFETs). Experimental results demonstrate that CNT density can be improved from 2 to 8 CNTs/μm, accompanied by an increase in drain-source CNFET current from 4.3 to 17.4 μA/μm.

  14. The More Things Change the More They Stay the Same

    NASA Astrophysics Data System (ADS)

    Moore, John W.

    1998-01-01

    In what year would you guess that these statements appeared in this Journal? Students can be classified as problem oriented or answer oriented. The answer-oriented student ... does little or no reflective thinking. ...To simply work a problem for a student may not be educational at all. The student should be taught the process used in the solution. ...My experience indicates that an answer-oriented attitude can be changed. ...But one can't do much teaching of problem-solving techniques and at the same time get on with the day's lecture. ...Problem-solving technique is a tool of learning. ...To teach it well should be about the most rewarding academic activity. ...A year of stressing methods of problem solving would alter the orientation and motivation of many students we now call poor.

  15. Monitoring the englacial fracture state using virtual-reflector seismology

    NASA Astrophysics Data System (ADS)

    Lindner, F.; Weemstra, C.; Walter, F.; Hadziioannou, C.

    2017-12-01

    Fracturing and changes in the englacial macroscopic water content change the elastic bulk properties of ice bodies. Small seismic velocity variations, resulting from such changes, can be measured using a technique called coda-wave interferometry. Here, coda refers to the later-arriving, multiply scattered waves. Often, this technique is applied to so-called virtual-source responses, which can be obtained using seismic interferometry (a simple crosscorrelation process). Compared to other media (e.g., the Earth's crust), however, ice bodies exhibit relatively little scattering. This complicates the application of coda-wave interferometry to the retrieved virtual-source responses. In this work, we therefore investigate the applicability of coda-wave interferometry to virtual-source responses obtained using two alternative seismic interferometric techniques, namely, seismic interferometry by multidimensional deconvolution (SI by MDD), and virtual-reflector seismology (VRS). To that end, we use synthetic data, as well as active-source glacier data acquired on Glacier de la Plaine Morte, Switzerland. Both SI by MDD and VRS allow the retrieval of more accurate virtual-source responses. In particular, the dependence of the retrieved virtual-source responses on the illumination pattern is reduced. We find that this results in more accurate glacial phase-velocity estimates. In addition, VRS introduces virtual reflections from a receiver contour (partly) enclosing the medium of interest. By acting as a sort of virtual reverberation, the coda resulting from the application of VRS significantly increases seismic monitoring capabilities, in particular in cases where natural scattering coda is not available.
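
    The "simple crosscorrelation process" of classical seismic interferometry mentioned above can be sketched as follows; the synthetic data and parameters are assumptions for illustration only:

```python
# Illustrative virtual-source retrieval by crosscorrelation: summing the
# crosscorrelations of the wavefields recorded at two receivers over many
# sources approximates the response of a virtual source at receiver A.
import numpy as np
from scipy.signal import correlate

def virtual_source(rec_a, rec_b):
    """rec_a, rec_b: (n_sources, n_samples) records at receivers A and B.
    Returns the stacked crosscorrelation (the virtual-source estimate)."""
    n = rec_a.shape[1]
    stack = np.zeros(2 * n - 1)
    for a, b in zip(rec_a, rec_b):
        stack += correlate(b, a, mode="full")
    return stack / rec_a.shape[0]

rng = np.random.default_rng(2)
n_src, n_samp, lag = 50, 400, 25          # B records a 25-sample delayed copy of A
rec_a = rng.normal(size=(n_src, n_samp))
rec_b = np.roll(rec_a, lag, axis=1)
cc = virtual_source(rec_a, rec_b)
print(int(np.argmax(cc)) - (n_samp - 1))  # peak near +25: the inter-receiver traveltime
```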

  16. Faster, better, cheaper: lean labs are the key to future survival.

    PubMed

    Bryant, Patsy M; Gulling, Richard D

    2006-03-28

    Process improvement techniques have been used in manufacturing for many years to rein in costs and improve quality. Health care is now grappling with similar challenges. The Department of Laboratory Services at Good Samaritan Hospital, a 560-bed facility in Dayton, OH, used the Lean process improvement method in a 12-week project to streamline its core laboratory processes. By analyzing the flow of samples through the system and identifying value-added and non-value-added steps, both in the laboratory and during the collection process, Good Samaritan's project team redesigned systems and reconfigured the core laboratory layout to trim collection-to-results time from 65 minutes to 40 minutes. As a result, virtually all morning results are available to physicians by 7 a.m., critical values are called to nursing units within 30 minutes, and core laboratory services are optimally staffed for maximum cost-effectiveness.

  17. Electrohydrodynamic atomization: A two-decade effort to produce and process micro-/nanoparticulate materials

    PubMed Central

    Xie, Jingwei; Jiang, Jiang; Davoodi, Pooya; Srinivasan, M. P.; Wang, Chi-Hwa

    2014-01-01

    Electrohydrodynamic atomization (EHDA), also called the electrospray technique, has been studied for more than one century. However, since the 1990s it has begun to be used to produce and process micro-/nanostructured materials. Owing to the simplicity and flexibility in the EHDA experimental setup, it has been successfully employed to generate particulate materials with controllable compositions, structures, sizes, morphologies, and shapes. EHDA has also been used to deposit micro- and nanoparticulate materials on surfaces in a well-controlled manner. All these attributes make EHDA a fascinating tool for preparing and assembling a wide range of micro- and nanostructured materials which have been exploited for use in pharmaceutics, food, and healthcare to name a few. Our goal is to review this field, which allows scientists and engineers to learn about the EHDA technique and how it might be used to create, process, and assemble micro-/nanoparticulate materials with unique and intriguing properties. We begin with a brief introduction to the mechanism and setup of the EHDA technique. We then discuss issues critical to successful application of the EHDA technique, including control of composition, size, shape, morphology, structure of particulate materials and their assembly. We also illustrate a few of the many potential applications of particulate materials, especially in the area of drug delivery and regenerative medicine. Next, we review the simulation and modeling of Taylor cone-jet formation for a single and co-axial nozzle. The mathematical modeling of particle transport and deposition is presented to provide a deeper understanding of the effective parameters in the preparation, collection, and patterning processes. We conclude this article with a discussion on perspectives and future possibilities in this field. PMID:25684778

  18. High frequency source localization in a shallow ocean sound channel using frequency difference matched field processing.

    PubMed

    Worthmann, Brian M; Song, H C; Dowling, David R

    2015-12-01

    Matched field processing (MFP) is an established technique for source localization in known multipath acoustic environments. Unfortunately, in many situations, particularly those involving high frequency signals, imperfect knowledge of the actual propagation environment prevents accurate propagation modeling and source localization via MFP fails. For beamforming applications, this actual-to-model mismatch problem was mitigated through a frequency downshift, made possible by a nonlinear array-signal-processing technique called frequency difference beamforming [Abadi, Song, and Dowling (2012). J. Acoust. Soc. Am. 132, 3018-3029]. Here, this technique is extended to conventional (Bartlett) MFP using simulations and measurements from the 2011 Kauai Acoustic Communications MURI experiment (KAM11) to produce ambiguity surfaces at frequencies well below the signal bandwidth where the detrimental effects of mismatch are reduced. Both the simulation and experimental results suggest that frequency difference MFP can be more robust against environmental mismatch than conventional MFP. In particular, signals of frequency 11.2 kHz-32.8 kHz were broadcast 3 km through a 106-m-deep shallow ocean sound channel to a sparse 16-element vertical receiving array. Frequency difference MFP unambiguously localized the source in several experimental data sets with average peak-to-side-lobe ratio of 0.9 dB, average absolute-value range error of 170 m, and average absolute-value depth error of 10 m.
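
    The quantity manipulated by frequency difference methods is the frequency-difference autoproduct of the measured field; in the notation of the cited beamforming work (restated here for clarity):

```latex
AP_{\Delta}(\mathbf{r}_j,\omega,\Delta\omega)
  = P(\mathbf{r}_j,\omega + \Delta\omega)\, P^{*}(\mathbf{r}_j,\omega),
```

    which to leading order behaves like a field at the much lower difference frequency \Delta\omega, so Bartlett replicas can be computed at \Delta\omega, where environmental mismatch is less damaging.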

  19. Influence of substrate metal alloy type on the properties of hydroxyapatite coatings deposited using a novel ambient temperature deposition technique.

    PubMed

    Barry, J N; Cowley, A; McNally, P J; Dowling, D P

    2014-03-01

    Hydroxyapatite (HA) coatings are applied widely to enhance the level of osteointegration onto orthopedic implants. Atmospheric plasma spray (APS) is typically used for the deposition of these coatings; however, HA crystalline changes regularly occur during this high-thermal process. This article reports on the evaluation of a novel low-temperature (<47°C) HA deposition technique, called CoBlast, for the application of crystalline HA coatings. To-date, reports on the CoBlast technique have been limited to titanium alloy substrates. This study addresses the suitability of the CoBlast technique for the deposition of HA coatings on a number of alternative metal alloys utilized in the fabrication of orthopedic devices. In addition to titanium grade 5, both cobalt chromium and stainless steel 316 were investigated. In this study, HA coatings were deposited using both the CoBlast and the plasma sprayed techniques, and the resultant HA coating and substrate properties were evaluated and compared. The CoBlast-deposited HA coatings were found to present similar surface morphologies, interfacial properties, and composition irrespective of the substrate alloy type. Coating thickness however displayed some variation with the substrate alloy, ranging from 2.0 to 3.0 μm. This perhaps is associated with the electronegativity of the metal alloys. The APS-treated samples exhibited evidence of both coating, and significantly, substrate phase alterations for two metal alloys; titanium grade 5 and cobalt chrome. Conversely, the CoBlast-processed samples exhibited no phase changes in the substrates after depositions. The APS alterations were attributed to the brief, but high-intensity temperatures experienced during processing. Copyright © 2013 Wiley Periodicals, Inc.

  20. Charged-particle emission tomography

    NASA Astrophysics Data System (ADS)

    Ding, Yijun

    Conventional charged-particle imaging techniques, such as autoradiography, provide only two-dimensional (2D) images of thin tissue slices. To get volumetric information, images of multiple thin slices are stacked. This process is time-consuming and prone to distortions, as registration of 2D images is required. We propose a direct three-dimensional (3D) autoradiography technique, which we call charged-particle emission tomography (CPET). This 3D imaging technique enables imaging of thick sections, thus increasing laboratory throughput and eliminating distortions due to registration. In CPET, molecules or cells of interest are labeled so that they emit charged particles without significant alteration of their biological function. Therefore, by imaging the source of the charged particles, one can gain information about the distribution of the molecules or cells of interest. Two special cases of CPET are beta emission tomography (BET) and alpha emission tomography (alphaET), where the charged particles employed are fast electrons and alpha particles, respectively. A crucial component of CPET is the charged-particle detector. Conventional charged-particle detectors are sensitive only to the 2D positions of the detected particles. We propose a new detector concept, which we call a particle-processing detector (PPD). A PPD measures attributes of each detected particle, including location, direction of propagation, and/or the energy deposited in the detector. Reconstruction algorithms for CPET are developed, and reconstruction results from simulated data are presented for both BET and alphaET. The results show that, in addition to position, direction and energy provide valuable information for 3D reconstruction of CPET. Several designs of particle-processing detectors are described. Experimental results for one detector are discussed. With appropriate detector design and careful data analysis, it is possible to measure direction and energy, as well as position of each detected particle. The null functions of CPET with PPDs that measure different combinations of attributes are calculated through singular-value decomposition. In general, the more particle attributes are measured from each detection event, the smaller the null space of CPET is. In other words, the higher dimension the data space is, the more information about an object can be recovered from CPET.

  1. Nonlinear acoustics in cicada mating calls enhance sound propagation.

    PubMed

    Hughes, Derke R; Nuttall, Albert H; Katz, Richard A; Carter, G Clifford

    2009-02-01

    An analysis of cicada mating calls, measured in field experiments, indicates that the very high levels of acoustic energy radiated by this relatively small insect are mainly attributed to the nonlinear characteristics of the signal. The cicada emits one of the loudest sounds in all of the insect population with a sound production system occupying a physical space typically less than 3 cc. The sounds made by tymbals are amplified by the hollow abdomen, functioning as a tuned resonator, but models of the signal based solely on linear techniques do not fully account for a sound radiation capability that is so disproportionate to the insect's size. The nonlinear behavior of the cicada signal is demonstrated by combining the mutual information and surrogate data techniques; the results obtained indicate decorrelation when the phase-randomized and non-phase-randomized data separate. The Volterra expansion technique is used to fit the nonlinearity in the insect's call. The second-order Volterra estimate provides further evidence that the cicada mating calls are dominated by nonlinear characteristics and also suggests that the medium contributes to the cicada's efficient sound propagation. Application of the same principles has the potential to improve radiated sound levels for sonar applications.

  2. Methods and materials, for locating and studying spotted owls.

    Treesearch

    Eric D. Forsman

    1983-01-01

    Nocturnal calling surveys are the most effective and most frequently used technique for locating spotted owls. Roosts and general nest locations may be located during the day by calling in suspected roost or nest areas. Specific nest trees are located by: (1) baiting with a live mouse to induce owls to visit the nest, (2) calling in suspected nest areas to stimulate...

  3. Gas Shielding Technology for Welding and Brazing

    NASA Technical Reports Server (NTRS)

    Nunes, Arthur J.; Gradl, Paul R.

    2012-01-01

    Welding is a common method that allows two metallic materials to be joined together with high structural integrity. When joints need to be leak-tight, light-weight, or free of contaminant-trapping seams or surface asperities, welding tends to be specified. There are many welding techniques, each with its own advantages and disadvantages. Some of these techniques include Forge Welding, Gas Tungsten Arc Welding, Friction Stir Welding, and Laser Beam Welding to name a few. Whichever technique is used, the objective is a structural joint that meets the requirements of a particular component or assembly. A key practice in producing quality welds is the use of shielding gas. This article discusses various weld techniques, quality of the welds, and importance of shielding gas in each of those techniques. Metallic bonds, or joints, are produced when metals are put into intimate contact. In the solid-state "blacksmith welding" process, now called Forge Welding (FOW), the site to be joined is pounded into intimate contact. The surfaces to be joined usually need to be heated to make it easier to deform the metal. The surfaces are sprinkled with a flux to melt surface oxides and given a concave shape so that surface contamination can be squeezed out of the joint as the surfaces are pounded together; otherwise the surface contamination would be trapped in the joint and would weaken the weld. In solid-state welding processes surface oxides or other contamination are typically squeezed out of the joint in "flash."

  4. Granular computing with multiple granular layers for brain big data processing.

    PubMed

    Wang, Guoyin; Xu, Ji

    2014-12-01

    Big data is the term for a collection of datasets so huge and complex that it becomes difficult to process using on-hand theoretical models and technique tools. Brain big data is one of the most typical and important kinds of big data, collected using powerful equipment such as functional magnetic resonance imaging, multichannel electroencephalography, magnetoencephalography, positron emission tomography, and near infrared spectroscopic imaging, as well as various other devices. Granular computing with multiple granular layers, referred to as multi-granular computing (MGrC) for short hereafter, is an emerging computing paradigm of information processing, which simulates the multi-granular intelligent thinking model of the human brain. It concerns the processing of complex information entities called information granules, which arise in the process of data abstraction and derivation of information and even knowledge from data. This paper analyzes three basic mechanisms of MGrC, namely granularity optimization, granularity conversion, and multi-granularity joint computation, and discusses the potential of introducing MGrC into intelligent processing of brain big data.

  5. Molecular tagging techniques and their applications to the study of complex thermal flow phenomena

    NASA Astrophysics Data System (ADS)

    Chen, Fang; Li, Haixing; Hu, Hui

    2015-08-01

    This review article reports the recent progress in the development of a new group of molecule-based flow diagnostic techniques, which include molecular tagging velocimetry (MTV) and molecular tagging thermometry (MTT), for both qualitative flow visualization of thermally induced flow structures and quantitative whole-field measurements of flow velocity and temperature distributions. The MTV and MTT techniques can also be easily combined to result in a so-called molecular tagging velocimetry and thermometry (MTV&T) technique, which is capable of achieving simultaneous measurements of flow velocity and temperature distribution in fluid flows. Instead of using tiny particles, the molecular tagging techniques (MTV, MTT, and MTV&T) use phosphorescent molecules, which can be turned into long-lasting glowing marks upon excitation by photons of appropriate wavelength, as the tracers for the flow velocity and temperature measurements. The unique attraction and implementation of the molecular tagging techniques are demonstrated by three application examples, which include: (1) to quantify the unsteady heat transfer process from a heated cylinder to the surrounding fluid flow in order to examine the thermal effects on the wake instabilities behind the heated cylinder operating in mixed and forced heat convection regimes, (2) to reveal the time evolution of unsteady heat transfer and phase-changing process inside micro-sized, icing water droplets in order to elucidate the underlying physics pertinent to aircraft icing phenomena, and (3) to achieve simultaneous droplet size, velocity and temperature measurements of "in-flight" droplets to characterize the dynamic and thermodynamic behaviors of flying droplets in spray flows.
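
    As background on how MTT extracts temperature (a standard lifetime-ratio formulation, not a detail taken from this review): phosphorescence decays with a temperature-dependent lifetime \tau(T), so for two gated intensity images of equal exposure separated by a delay \Delta t,

```latex
R = \frac{I_2}{I_1} = \exp\!\left(-\frac{\Delta t}{\tau(T)}\right),
```

    and a prior calibration of \tau against temperature converts the measured ratio field into a temperature field.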

  6. Automatic classification of animal vocalizations

    NASA Astrophysics Data System (ADS)

    Clemins, Patrick J.

    2005-11-01

    Bioacoustics, the study of animal vocalizations, has begun to use increasingly sophisticated analysis techniques in recent years. Some common tasks in bioacoustics are repertoire determination, call detection, individual identification, stress detection, and behavior correlation. Each research study, however, uses a wide variety of different measured variables, called features, and classification systems to accomplish these tasks. The well-established field of human speech processing has developed a number of different techniques to perform many of the aforementioned bioacoustics tasks. Mel-frequency cepstral coefficients (MFCCs) and perceptual linear prediction (PLP) coefficients are two popular feature sets. The hidden Markov model (HMM), a statistical model similar to a finite automaton, is the most commonly used supervised classification model and is capable of modeling both temporal and spectral variations. This research designs a framework that applies models from human speech processing for bioacoustic analysis tasks. The development of the generalized perceptual linear prediction (gPLP) feature extraction model is one of the more important novel contributions of the framework. Perceptual information from the species under study can be incorporated into the gPLP feature extraction model to represent the vocalizations as the animals might perceive them. By including this perceptual information and modifying parameters of the HMM classification system, this framework can be applied to a wide range of species. The effectiveness of the framework is shown by analyzing African elephant and beluga whale vocalizations. The features extracted from the African elephant data are used as input to a supervised classification system and compared to results from traditional statistical tests. The gPLP features extracted from the beluga whale data are used in an unsupervised classification system and the results are compared to labels assigned by experts. The development of a framework from which to build animal vocalization classifiers will provide bioacoustics researchers with a consistent platform to analyze and classify vocalizations. A common framework will also allow studies to compare results across species and institutions. In addition, the use of automated classification techniques can speed analysis and uncover behavioral correlations not readily apparent using traditional techniques.
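
    A minimal sketch of the generic MFCC-plus-HMM pipeline that such a framework builds on, using off-the-shelf libraries and a hypothetical audio file ("elephant_call.wav"); the gPLP features introduced in the dissertation are not implemented here, standard MFCCs stand in for them:

```python
# Generic MFCC + hidden Markov model pipeline for a single vocalization class.
# "elephant_call.wav" is a hypothetical file name; standard MFCCs stand in
# for the gPLP features described in the dissertation.
import librosa
import numpy as np
from hmmlearn import hmm

y, sr = librosa.load("elephant_call.wav", sr=None)
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T     # shape (frames, 13)

model = hmm.GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
model.fit(mfcc)                                          # train on one class's calls

# Classification idea: fit one HMM per call type, then assign a new call to the
# model with the highest log-likelihood.
print("log-likelihood under this model:", model.score(mfcc))
```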

  7. A review on recent technologies for the manufacture of pulmonary drugs.

    PubMed

    Hadiwinoto, Gabriela Daisy; Lip Kwok, Philip Chi; Lakerveld, Richard

    2018-01-01

    This review discusses recent developments in the manufacture of inhalable dry powder formulations. Pulmonary drugs have distinct advantages compared with other drug administration routes. However, requirements on drug properties complicate their manufacture. Control over crystallization to make particles with the desired properties in a single step is often infeasible, which calls for micronization techniques. Although spray drying produces particles in the desired size range, a stable solid state may not be attainable. Supercritical fluids may be used as a solvent or antisolvent, which significantly reduces solvent waste. Future directions include application areas such as biopharmaceuticals for dry powder inhalers and new processing strategies to improve the control over particle formation, such as continuous manufacturing with in-line process analytical technologies.

  8. Accretor: Generative Materiality in the Work of Driessens and Verstappen.

    PubMed

    Whitelaw, Mitchell

    2015-01-01

    Accretor, by the Dutch artists Erwin Driessens and Maria Verstappen, is a generative artwork that adopts and adapts artificial life techniques to produce intricate three-dimensional forms. This article introduces and analyzes Accretor, considering the enigmatic quality of the generated objects and in particular the role of materiality in this highly computational work. Accretor demonstrates a tangled continuity between digital and physical domains, where the constraints and affordances of matter inform both formal processes and aesthetic interpretations. Drawing on Arp's notion of the concrete artwork and McCormack and Dorin's notion of the computational sublime, the article finally argues that Accretor demonstrates what might be called a processual sublime, evoking expansive processes that span both computational and non-computational systems.

  9. Estimating the number of double-strand breaks formed during meiosis from partial observation.

    PubMed

    Toyoizumi, Hiroshi; Tsubouchi, Hideo

    2012-12-01

    Analyzing the basic mechanism of DNA double-strand breaks (DSB) formation during meiosis is important for understanding sexual reproduction and genetic diversity. The location and amount of meiotic DSBs can be examined by using a common molecular biological technique called Southern blotting, but only a subset of the total DSBs can be observed; only DSB fragments still carrying the region recognized by a Southern blot probe are detected. With the assumption that DSB formation follows a nonhomogeneous Poisson process, we propose two estimators of the total number of DSBs on a chromosome: (1) an estimator based on the Nelson-Aalen estimator, and (2) an estimator based on a record value process. Further, we compared their asymptotic accuracy.
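
    For reference, the Nelson-Aalen estimator underlying the first proposed estimator is the standard cumulative-hazard estimator (restated here, not the paper's adaptation to partially observed DSBs):

```latex
\hat{H}(t) = \sum_{t_i \le t} \frac{d_i}{n_i},
```

    where t_i are the observed event times, d_i the number of events at t_i, and n_i the number at risk just before t_i.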

  10. Development and Validation of Instruments to Measure Learning of Expert-Like Thinking

    NASA Astrophysics Data System (ADS)

    Adams, Wendy K.; Wieman, Carl E.

    2011-06-01

    This paper describes the process for creating and validating an assessment test that measures the effectiveness of instruction by probing how well that instruction causes students in a class to think like experts about specific areas of science. The design principles and process are laid out and it is shown how these align with professional standards that have been established for educational and psychological testing and the elements of assessment called for in a recent National Research Council study on assessment. The importance of student interviews for creating and validating the test is emphasized, and the appropriate interview procedures are presented. The relevance and use of standard psychometric statistical tests are discussed. Additionally, techniques for effective test administration are presented.

  11. Smart signal processing for an evolving electric grid

    NASA Astrophysics Data System (ADS)

    Silva, Leandro Rodrigues Manso; Duque, Calos Augusto; Ribeiro, Paulo F.

    2015-12-01

    Electric grids are interconnected complex systems consisting of generation, transmission, distribution, and active loads, recently called prosumers as they produce and consume electric energy. Additionally, these encompass a vast array of equipment such as machines, power transformers, capacitor banks, power electronic devices, motors, etc. that are continuously evolving in their demand characteristics. Given these conditions, signal processing is becoming an essential assessment tool to enable the engineer and researcher to understand, plan, design, and operate the complex and smart electronic grid of the future. This paper focuses on recent developments associated with signal processing applied to power system analysis in terms of characterization and diagnostics. The following techniques are reviewed and their characteristics and applications discussed: active power system monitoring, sparse representation of power system signal, real-time resampling, and time-frequency (i.e., wavelets) applied to power fluctuations.

  12. A Machine Learning Method for Power Prediction on the Mobile Devices.

    PubMed

    Chen, Da-Ren; Chen, You-Shyang; Chen, Lin-Chih; Hsu, Ming-Yang; Chiang, Kai-Feng

    2015-10-01

    Energy profiling and estimation have been popular areas of research in multicore mobile architectures. While short sequences of system calls have been recognized by machine learning as pattern descriptions for anomalous detection, power consumption of running processes with respect to system-call patterns are not well studied. In this paper, we propose a fuzzy neural network (FNN) for training and analyzing process execution behaviour with respect to series of system calls, parameters and their power consumptions. On the basis of the patterns of a series of system calls, we develop a power estimation daemon (PED) to analyze and predict the energy consumption of the running process. In the initial stage, PED categorizes sequences of system calls as functional groups and predicts their energy consumptions by FNN. In the operational stage, PED is applied to identify the predefined sequences of system calls invoked by running processes and estimates their energy consumption.

  13. Use of simulated experiments for material characterization of brittle materials subjected to high strain rate dynamic tension

    PubMed Central

    Saletti, Dominique

    2017-01-01

    Rapid progress in ultra-high-speed imaging has allowed material properties to be studied at high strain rates by applying full-field measurements and inverse identification methods. Nevertheless, the sensitivity of these techniques still requires a better understanding, since various extrinsic factors present during an actual experiment make it difficult to separate different sources of errors that can significantly affect the quality of the identified results. This study presents a methodology using simulated experiments to investigate the accuracy of the so-called spalling technique (used to study tensile properties of concrete subjected to high strain rates) by numerically simulating the entire identification process. The experimental technique uses the virtual fields method and the grid method. The methodology consists of reproducing the recording process of an ultra-high-speed camera by generating sequences of synthetically deformed images of a sample surface, which are then analysed using the standard tools. The investigation of the uncertainty of the identified parameters, such as Young's modulus along with the stress–strain constitutive response, is addressed by introducing the most significant user-dependent parameters (i.e. acquisition speed, camera dynamic range, grid sampling, blurring), proving that the used technique can be an effective tool for error investigation. This article is part of the themed issue ‘Experimental testing and modelling of brittle materials at high strain rates’. PMID:27956505

  14. Electroencephalography signatures of attention-deficit/hyperactivity disorder: clinical utility.

    PubMed

    Alba, Guzmán; Pereda, Ernesto; Mañas, Soledad; Méndez, Leopoldo D; González, Almudena; González, Julián J

    2015-01-01

    This work reviews the techniques and the most important results on the use of electroencephalography (EEG) to extract different measures that can be clinically useful for studying subjects with attention-deficit/hyperactivity disorder (ADHD). First, we discuss briefly and in simple terms the EEG analysis and processing techniques most used in the context of ADHD. We review techniques that both analyze individual EEG channels (univariate measures) and study the statistical interdependence between different EEG channels (multivariate measures), the so-called functional brain connectivity. Among the former, we review the classical indices of absolute and relative spectral power and estimations of the complexity of the channels, such as the approximate entropy and the Lempel-Ziv complexity. Among the latter, we focus on the magnitude squared coherence and on different measures based on the concept of generalized synchronization and its estimation in the state space. Second, from a historical point of view, we present the most important results achieved with these techniques and their clinical utility (sensitivity, specificity, and accuracy) to diagnose ADHD. Finally, we propose future research lines based on these results.
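
    Two of the univariate measures mentioned above are easy to sketch. The snippet below computes the relative spectral power of a channel in a chosen band (via Welch's method) and a simple Lempel-Ziv complexity estimate based on LZ78-style phrase counting of the median-binarised signal; the band limits and the particular LZ variant are illustrative choices, not those of the reviewed studies.

    ```python
    import numpy as np
    from scipy.signal import welch

    def relative_band_power(x, fs, band, total=(0.5, 45.0)):
        """Relative spectral power of one EEG channel in a frequency band (Welch PSD)."""
        f, pxx = welch(x, fs=fs, nperseg=int(2 * fs))
        in_band = (f >= band[0]) & (f <= band[1])
        in_total = (f >= total[0]) & (f <= total[1])
        return pxx[in_band].sum() / pxx[in_total].sum()

    def lz_complexity(x):
        """LZ78-style phrase count of the median-binarised signal
        (one of several Lempel-Ziv complexity variants)."""
        med = np.median(x)
        s = "".join("1" if v > med else "0" for v in x)
        phrases, phrase, count = set(), "", 0
        for ch in s:
            phrase += ch
            if phrase not in phrases:        # a new phrase ends here
                phrases.add(phrase)
                count += 1
                phrase = ""
        return count

    # example: theta-band relative power and complexity of a synthetic 256 Hz channel
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(256 * 30)
    print(relative_band_power(eeg, fs=256, band=(4.0, 8.0)), lz_complexity(eeg))
    ```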

  15. Identification of Age-Related Macular Degeneration Using OCT Images

    NASA Astrophysics Data System (ADS)

    Arabi, Punal M., Dr; Krishna, Nanditha; Ashwini, V.; Prathibha, H. M.

    2018-02-01

    Age-related Macular Degeneration (AMD) is one of the leading retinal diseases of recent years. Macular degeneration occurs when the central portion of the retina, called the macula, deteriorates. Because the deterioration occurs with age, it is commonly referred to as Age-related Macular Degeneration. This disease can be visualized by several imaging modalities such as fundus imaging, Optical Coherence Tomography (OCT), and many others. Optical Coherence Tomography is a widely used technique for screening for Age-related Macular Degeneration because of its ability to detect very minute changes in the retina. Healthy and AMD-affected OCT images are classified by extracting the Retinal Pigmented Epithelium (RPE) layer of the images using image processing techniques. The extracted layer is divided into samples, the number of white pixels in each sample is counted, and the mean pixel count is calculated. The average mean value is calculated for both the healthy and the AMD-affected images, a threshold value is fixed, and a decision rule is framed to classify the images of interest. The proposed method showed an accuracy of 75%.
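
    A minimal sketch of the described decision rule is given below, assuming the RPE layer has already been extracted as a binary mask; the number of samples, the threshold value, and the direction of the rule (lower mean count labelled as AMD) are assumptions for illustration only.

    ```python
    import numpy as np

    def classify_oct(rpe_mask, n_samples=10, threshold=120.0):
        """Split the binary RPE-layer mask into vertical samples, count white pixels
        per sample, and compare the mean count with a previously fixed threshold.
        Threshold value and rule direction are illustrative assumptions."""
        columns = np.array_split(rpe_mask, n_samples, axis=1)
        mean_count = float(np.mean([np.count_nonzero(c) for c in columns]))
        label = "AMD" if mean_count < threshold else "Healthy"   # assumed direction
        return label, mean_count

    # usage with a hypothetical binary mask of an OCT B-scan:
    # label, score = classify_oct(rpe_mask)
    ```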

  16. Calling behavior of blue and fin whales off California

    NASA Astrophysics Data System (ADS)

    Oleson, Erin Marie

    Passive acoustic monitoring is an effective means for evaluating cetacean presence in remote regions and over long time periods, and may become an important component of cetacean abundance surveys. To use passive acoustic recordings for abundance estimation, an understanding of the behavioral ecology of cetacean calling is crucial. In this dissertation, I develop a better understanding of how blue (Balaenoptera musculus) and fin (B. physalus ) whales use sound with the goal of evaluating passive acoustic techniques for studying their populations. Both blue and fin whales produce several different call types, though the behavioral and environmental context of these calls have not been widely investigated. To better understand how calling is used by these whales off California I have employed both new technologies and traditional techniques, including acoustic recording tags, continuous long-term autonomous acoustic recordings, and simultaneous shipboard acoustic and visual surveys. The outcome of these investigations has led to several conclusions. The production of blue whale calls varies with sex, behavior, season, location, and time of day. Each blue whale call type has a distinct behavioral context, including a male-only bias in the production of song, a call type thought to function in reproduction, and the production of some calls by both sexes. Long-term acoustic records, when interpreted using all call types, provide a more accurate measure of the local seasonal presence of whales, and how they use the region annually, seasonally and daily. The relative occurrence of different call types may indicate prime foraging habitat and the presence of different segments of the population. The proportion of animals heard calling changes seasonally and geographically relative to the number seen, indicating the calibration of acoustic and visual surveys is complex and requires further study on the motivations behind call production and the behavior of calling whales. These findings will play a role in the future development of acoustic census methods and habitat studies for these species, and will provide baseline information for the determination of anthropogenic impacts on these populations.

  17. Digital audio watermarking using moment-preserving thresholding

    NASA Astrophysics Data System (ADS)

    Choi, DooSeop; Jung, Hae Kyung; Choi, Hyuk; Kim, Taejeong

    2007-09-01

    The Moment-Preserving Thresholding (MPT) technique has been used in digital image processing for decades, especially in image binarization and image compression. Its main strength lies in the fact that the binary values that the MPT produces as a result, called representative values, are usually unaffected when the signal being thresholded goes through a signal processing operation. The two representative values in MPT together with the threshold value are obtained by solving the system of the preservation equations for the first, second, and third moments. Relying on this robustness of the representative values to various signal processing attacks considered in the watermarking context, this paper proposes a new watermarking scheme for audio signals. The watermark is embedded in the root-sum-square (RSS) of the two representative values of each signal block using the quantization technique. As a result, the RSS values are modified by scaling the signal according to the watermark bit sequence under the constraint of inaudibility relative to the human psycho-acoustic model. We also address and suggest solutions to the problem of synchronization and power scaling attacks. Experimental results show that the proposed scheme maintains high audio quality and robustness to various attacks including MP3 compression, re-sampling, jittering, and DA/AD conversion.
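
    For reference, the sketch below shows one way to compute the two representative values of a signal block by preserving its first three sample moments (a Prony-style solution of the preservation equations) and the root-sum-square (RSS) used for embedding; the exact numerical procedure of the cited scheme may differ.

    ```python
    import numpy as np

    def moment_preserving_representatives(x):
        """Two representative values (z0 < z1), the fraction p0 mapped to z0, and the
        root-sum-square (RSS) of the representatives, preserving the first three
        sample moments of block x (Prony-style solution of the preservation equations)."""
        x = np.asarray(x, dtype=float)
        m1, m2, m3 = (np.mean(x ** k) for k in (1, 2, 3))
        # z0, z1 are roots of z^2 + c1*z + c0 = 0, where the coefficients satisfy
        # m_{k+2} + c1*m_{k+1} + c0*m_k = 0 for k = 0, 1
        c0, c1 = np.linalg.solve([[1.0, m1], [m1, m2]], [-m2, -m3])
        z0, z1 = np.sort(np.roots([1.0, c1, c0]).real)
        p0 = (z1 - m1) / (z1 - z0)               # fraction of samples assigned to z0
        return z0, z1, p0, np.hypot(z0, z1)

    # example on a synthetic two-level audio block
    rng = np.random.default_rng(1)
    block = np.concatenate([rng.normal(-0.4, 0.05, 600), rng.normal(0.7, 0.05, 400)])
    print(moment_preserving_representatives(block))
    ```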

  18. Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data

    NASA Astrophysics Data System (ADS)

    Pathak, Jaideep; Lu, Zhixin; Hunt, Brian R.; Girvan, Michelle; Ott, Edward

    2017-12-01

    We use recent advances in the machine learning area known as "reservoir computing" to formulate a method for model-free estimation from data of the Lyapunov exponents of a chaotic process. The technique uses a limited time series of measurements as input to a high-dimensional dynamical system called a "reservoir." After the reservoir's response to the data is recorded, linear regression is used to learn a large set of parameters, called the "output weights." The learned output weights are then used to form a modified autonomous reservoir designed to be capable of producing an arbitrarily long time series whose ergodic properties approximate those of the input signal. When successful, we say that the autonomous reservoir reproduces the attractor's "climate." Since the reservoir equations and output weights are known, we can compute the derivatives needed to determine the Lyapunov exponents of the autonomous reservoir, which we then use as estimates of the Lyapunov exponents for the original input generating system. We illustrate the effectiveness of our technique with two examples, the Lorenz system and the Kuramoto-Sivashinsky (KS) equation. In the case of the KS equation, we note that the high dimensional nature of the system and the large number of Lyapunov exponents yield a challenging test of our method, which we find the method successfully passes.
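
    A minimal reservoir-computing sketch in the spirit of the description above is shown below: a random recurrent network is driven by a scalar input series and ridge regression fits the output weights to predict the next input value. The toy input signal, reservoir size, and hyper-parameters are illustrative assumptions, and the subsequent Lyapunov-exponent computation is not included.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # toy scalar input series (a stand-in for the measured chaotic signal)
    T = 2000
    u = np.sin(0.02 * np.arange(T)) + 0.1 * rng.standard_normal(T)

    N = 300                                                # reservoir size
    W_in = rng.uniform(-0.5, 0.5, size=N)                  # input weights
    W = rng.standard_normal((N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))        # spectral radius below 1

    # drive the reservoir with the input and record its states
    r = np.zeros(N)
    states = np.zeros((T, N))
    for t in range(T):
        r = np.tanh(W @ r + W_in * u[t])
        states[t] = r

    # learn the output weights by ridge regression to predict the next input value
    X, y = states[:-1], u[1:]
    beta = 1e-6
    W_out = np.linalg.solve(X.T @ X + beta * np.eye(N), X.T @ y)

    print("one-step training MSE:", np.mean((X @ W_out - y) ** 2))
    ```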

  19. Full-Physics Inverse Learning Machine for Satellite Remote Sensing Retrievals

    NASA Astrophysics Data System (ADS)

    Loyola, D. G.

    2017-12-01

    The satellite remote sensing retrievals are usually ill-posed inverse problems that are typically solved by finding a state vector that minimizes the residual between simulated data and real measurements. The classical inversion methods are very time-consuming as they require iterative calls to complex radiative-transfer forward models to simulate radiances and Jacobians, and subsequent inversion of relatively large matrices. In this work we present a novel and extremely fast algorithm for solving inverse problems called full-physics inverse learning machine (FP-ILM). The FP-ILM algorithm consists of a training phase in which machine learning techniques are used to derive an inversion operator based on synthetic data generated using a radiative transfer model (which expresses the "full-physics" component) and the smart sampling technique, and an operational phase in which the inversion operator is applied to real measurements. FP-ILM has been successfully applied to the retrieval of the SO2 plume height during volcanic eruptions and to the retrieval of ozone profile shapes from UV/VIS satellite sensors. Furthermore, FP-ILM will be used for the near-real-time processing of the upcoming generation of European Sentinel sensors with their unprecedented spectral and spatial resolution and associated large increases in the amount of data.
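
    The two-phase structure can be sketched as follows, with a hypothetical toy forward model standing in for the radiative-transfer code: synthetic state/radiance pairs are generated in a training phase, a regressor learns the inversion operator, and in the operational phase it is applied to a noisy measurement. All function forms, parameter ranges, and the choice of regressor are assumptions for illustration.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)

    def forward_model(state):
        """Hypothetical stand-in for a radiative-transfer forward model: maps a
        2-element state vector to a 10-channel 'radiance' spectrum."""
        x = np.linspace(0.0, 1.0, 10)
        return state[0] * np.exp(-x / (state[1] + 0.1)) + 0.01 * np.sin(20.0 * x * state[1])

    # training phase: sample the state space and simulate the corresponding radiances
    states = rng.uniform([0.5, 0.1], [1.5, 1.0], size=(5000, 2))
    radiances = np.array([forward_model(s) for s in states])
    inverse_operator = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0)
    inverse_operator.fit(radiances, states)      # learn the measurement -> state mapping

    # operational phase: apply the learned inversion to a noisy "measurement"
    truth = np.array([1.2, 0.4])
    measurement = forward_model(truth) + 1e-3 * rng.standard_normal(10)
    print(inverse_operator.predict(measurement[None, :]), truth)
    ```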

  20. Maltodextrin: a novel excipient used in sugar-based orally disintegrating tablets and phase transition process.

    PubMed

    Elnaggar, Yosra Shaaban R; El-Massik, Magda A; Abdallah, Ossama Y; Ebian, Abd Elazim R

    2010-06-01

    The recent challenge in orally disintegrating tablets (ODT) manufacturing encompasses the compromise between instantaneous disintegration, sufficient hardness, and standard processing equipment. The current investigation constitutes one attempt to meet this challenge. Maltodextrin, in the present work, was utilized as a novel excipient to prepare ODT of meclizine. Tablets were prepared by both direct compression and wet granulation techniques. The effect of maltodextrin concentrations on ODT characteristics--manifested as hardness and disintegration time--was studied. The effect of conditioning (40 degrees C and 75% relative humidity) as a post-compression treatment on ODT characteristics was also assessed. Furthermore, the pronounced hardening effect of maltodextrin was investigated using differential scanning calorimetry (DSC) and X-ray analysis. Results revealed that in both techniques, rapid disintegration (30-40 s) would be achieved at the cost of tablet hardness (about 1 kg). Post-compression conditioning of tablets resulted in an increase in hardness (3 kg), while keeping rapid disintegration (30-40 s) according to the guidance of the FDA for ODT. However, the direct compression-conditioning technique exhibited drawbacks of long conditioning time and appearance of the so-called patch effect. These problems were, however, absent in the wet granulation-conditioning technique. DSC and X-ray analysis suggested involvement of glass-elastic deformation in the maltodextrin hardening effect. High-performance liquid chromatography analysis of meclizine ODT suggested no degradation of the drug by the applied conditions of temperature and humidity. Overall, the results suggest that maltodextrin is a promising saccharide for the production of ODT with an acceptable hardness-disintegration time compromise, utilizing standard processing equipment and the phenomenon of phase transition.

  1. Interferometry-based free space communication and information processing

    NASA Astrophysics Data System (ADS)

    Arain, Muzammil Arshad

    This dissertation studies, analyzes, and experimentally demonstrates the innovative use of the interference phenomenon in the field of opto-electronic information processing and optical communications. A number of optical systems using interferometric techniques, both in the optical and the electronic domains, have been demonstrated in the fields of signal transmission and processing, optical metrology, defense, and physical sensors. Specifically it has been shown that the interference of waves in the form of holography can be exploited to realize a novel optical scanner called Code Multiplexed Optical Scanner (C-MOS). The C-MOS features large aperture, wide scan angles, 3-D beam control, no moving parts, and high beam scanning resolution. A C-MOS based free space optical transceiver for bi-directional communication has also been experimentally demonstrated. For high speed, large bandwidth, and high frequency operation, an optically implemented reconfigurable RF transversal filter design is presented that implements a wide range of filtering algorithms. A number of techniques using heterodyne interferometry via acousto-optic devices for optical path length measurements have been described. Finally, a whole new class of interferometric sensors for optical metrology and sensing applications is presented. A non-traditional interferometric output signal processing scheme has been developed. Applications include, for example, temperature sensors for harsh environments over a wide temperature range from room temperature to 1000°C.

  2. Correntropy-based partial directed coherence for testing multivariate Granger causality in nonlinear processes

    NASA Astrophysics Data System (ADS)

    Kannan, Rohit; Tangirala, Arun K.

    2014-06-01

    Identification of directional influences in multivariate systems is of prime importance in several applications of engineering and sciences such as plant topology reconstruction, fault detection and diagnosis, and neurosciences. A spectrum of related directionality measures, ranging from linear measures such as partial directed coherence (PDC) to nonlinear measures such as transfer entropy, have emerged over the past two decades. The PDC-based technique is simple and effective, but being a linear directionality measure has limited applicability. On the other hand, transfer entropy, despite being a robust nonlinear measure, is computationally intensive and practically implementable only for bivariate processes. The objective of this work is to develop a nonlinear directionality measure, termed as KPDC, that possesses the simplicity of PDC but is still applicable to nonlinear processes. The technique is founded on a nonlinear measure called correntropy, a recently proposed generalized correlation measure. The proposed method is equivalent to constructing PDC in a kernel space where the PDC is estimated using a vector autoregressive model built on correntropy. A consistent estimator of the KPDC is developed and important theoretical results are established. A permutation scheme combined with the sequential Bonferroni procedure is proposed for testing hypothesis on absence of causality. It is demonstrated through several case studies that the proposed methodology effectively detects Granger causality in nonlinear processes.
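
    The core quantity underlying the KPDC measure is correntropy. A simple sample estimator with a Gaussian kernel, together with its centered variant, is sketched below; building the full kernel-space VAR model and the PDC on top of it is not shown, and the kernel width is an illustrative choice.

    ```python
    import numpy as np

    def correntropy(x, y, sigma=1.0):
        """Sample estimate of correntropy V(x, y) with a Gaussian kernel."""
        d = np.asarray(x) - np.asarray(y)
        return np.mean(np.exp(-d ** 2 / (2.0 * sigma ** 2)))

    def centered_correntropy(x, y, sigma=1.0):
        """Centered correntropy: subtracts the value expected under independence,
        estimated by averaging the kernel over all sample pairs."""
        x, y = np.asarray(x), np.asarray(y)
        pairwise = x[:, None] - y[None, :]
        v_indep = np.mean(np.exp(-pairwise ** 2 / (2.0 * sigma ** 2)))
        return correntropy(x, y, sigma) - v_indep

    # example: two weakly coupled nonlinear series
    rng = np.random.default_rng(2)
    a = rng.standard_normal(1000)
    b = 0.5 * a ** 2 + 0.5 * rng.standard_normal(1000)
    print(centered_correntropy(a, b, sigma=1.0))
    ```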

  3. Design of a 3D Navigation Technique Supporting VR Interaction

    NASA Astrophysics Data System (ADS)

    Boudoin, Pierre; Otmane, Samir; Mallem, Malik

    2008-06-01

    Multimodality is a powerful paradigm to increase the realism and ease of interaction in Virtual Environments (VEs). In particular, the search for new metaphors and techniques for 3D interaction adapted to the navigation task is an important stage for the realization of future 3D interaction systems that support multimodality, in order to increase efficiency and usability. In this paper we propose a new multimodal 3D interaction model called Fly Over. This model is especially devoted to the navigation task. We present a qualitative comparison between Fly Over and a classical navigation technique called gaze-directed steering. The results from a preliminary evaluation on the IBISC semi-immersive Virtual Reality/Augmented Reality EVR@ platform show that Fly Over is a user-friendly and efficient navigation technique.

  4. Pattern Recognition in Optical Remote Sensing Data Processing

    NASA Astrophysics Data System (ADS)

    Kozoderov, Vladimir; Kondranin, Timofei; Dmitriev, Egor; Kamentsev, Vladimir

    Computational procedures for retrieving land surface biophysical parameters imply that modeling techniques are available for describing the outgoing radiation, together with monitoring techniques of remote sensing data processing using registered radiances between the related optical sensors and the land surface objects called “patterns”. Pattern recognition techniques are a valuable approach to the processing of remote sensing data for images of the land surface - atmosphere system. Many simplified codes of the direct and inverse problems of atmospheric optics are considered applicable for imagery processing of low and middle spatial resolution. Provided the authors are not concerned with the accuracy of the final information products, they can rely on these standard procedures. The emerging necessity of processing data of high spectral and spatial resolution given by imaging spectrometers calls for newly defined pattern recognition techniques. The proposed tools, which combine different types of classifiers with the parameter retrieval procedures for the forested environment, are argued to have much wider application than image feature and object shape extraction, which relates to photometry and geometry in the pixel-level reflectance representation of the forested land cover. The pixel fraction and reflectance of “end-members” (sunlit forest canopy, sunlit background and shaded background for a particular view and solar illumination angle) are only one part of the listed techniques. It is assumed that each pixel views collections of the individual forest trees and the pixel-level reflectance can thus be computed as a linear mixture of sunlit tree tops, sunlit background (or understory) and shadows. Instead of these photometry and geometry constraints, improved models are developed for the functional description of outgoing spectral radiation, in which parameters of the forest canopy such as the vegetation biomass density for particular forest species and age are embedded. This permits us to calculate the relationships between the registered radiances and the biomass densities (the direct problem of atmospheric optics). The next stage is to find solutions of this problem as cross-sections of the related curves in the multi-dimensional space given by the parameters of these models (the inverse problem). The typical solutions may not be mathematically unique, so a computational procedure is undertaken to regularize them by finding minima of the functional called “the energy for the particular class of forests”. The relevant optimization procedures serve to assess the likelihood between any registered set of data and the theoretical distributions, as well as to regularize the solution by employing the derivative functions characterizing the neighborhood of the pixels for the related classes. As a result, we have elaborated a rigorous approach to optimizing spectral channels based on searching for their most informative sets by combining the channels and finding correlations between them. A successive-addition method is used with calculation of the total probability of error. This step-up method consists in fixing the level of the probability of error beyond which adding further channels to the pattern recognition scheme brings no improvement. The best distinguishable classes are recognized at the first stage of this procedure. The analytical technique called “cross-validation” is used at its second stage. This procedure consists in removing some data before classifier training begins, employing, for instance, the well-known “leave-one-out” strategy. This strategy serves to explain the accuracy category additionally to the standard confusion matrix between the modeling approach and the available ground-based observations, since the employed validation map may not be perfect or may need renewal. Such cross-validation carried out for ensembles of airborne data from the imaging spectrometer produced in Russia enables us to conclude that the forest classes on a test area are separated with high accuracy. The proposed approach is recommended to account for the needed set of ground-based measurements during field campaigns for the validation purposes of remote sensing data processing and for the retrieval procedures of such parameters of forests as Net Primary Productivity, with an accuracy ensured by the computational procedures described here.
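
    The successive-addition search for informative channels can be sketched as a greedy forward selection scored by leave-one-out cross-validation. The snippet below is a generic illustration; the classifier choice, stopping rule, and data are assumptions, not the authors' exact scheme.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    def forward_channel_selection(X, y, max_channels=10):
        """Greedy successive addition of spectral channels, scored by leave-one-out
        classification accuracy (X: samples x channels, y: class labels)."""
        selected, remaining, best_score = [], list(range(X.shape[1])), 0.0
        while remaining and len(selected) < max_channels:
            scores = []
            for ch in remaining:
                cols = selected + [ch]
                acc = cross_val_score(LinearDiscriminantAnalysis(), X[:, cols], y,
                                      cv=LeaveOneOut()).mean()
                scores.append((acc, ch))
            acc, ch = max(scores)
            if acc <= best_score:            # stop when adding a channel no longer helps
                break
            best_score = acc
            selected.append(ch)
            remaining.remove(ch)
        return selected, best_score

    # usage with hypothetical data: X (n_pixels x n_channels), y (forest class labels)
    # channels, accuracy = forward_channel_selection(X, y)
    ```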

  5. Comparison of Analytic Hierarchy Process, Catastrophe and Entropy techniques for evaluating groundwater prospect of hard-rock aquifer systems

    NASA Astrophysics Data System (ADS)

    Jenifer, M. Annie; Jha, Madan K.

    2017-05-01

    Groundwater is a treasured underground resource, which plays a central role in sustainable water management. However, it being hidden and dynamic in nature, its sustainable development and management calls for precise quantification of this precious resource at an appropriate scale. This study demonstrates the efficacy of three GIS-based multi-criteria decision analysis (MCDA) techniques, viz., Analytic Hierarchy Process (AHP), Catastrophe and Entropy in evaluating groundwater potential through a case study in hard-rock aquifer systems. Using satellite imagery and relevant field data, eight thematic layers (rainfall, land slope, drainage density, soil, lineament density, geology, proximity to surface water bodies and elevation) of the factors having significant influence on groundwater occurrence were prepared. These thematic layers and their features were assigned suitable weights based on the conceptual frameworks of AHP, Catastrophe and Entropy techniques and then they were integrated in the GIS environment to generate an integrated raster layer depicting groundwater potential index of the study area. The three groundwater prospect maps thus yielded by these MCDA techniques were verified using a novel approach (concept of 'Dynamic Groundwater Potential'). The validation results revealed that the groundwater potential predicted by the AHP technique has a pronounced accuracy of 87% compared to the Catastrophe (46% accuracy) and Entropy techniques (51% accuracy). It is concluded that the AHP technique is the most reliable for the assessment of groundwater resources followed by the Entropy method. The developed groundwater potential maps can serve as a scientific guideline for the cost-effective siting of wells and the effective planning of groundwater development at a catchment or basin scale.
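
    Of the three techniques, AHP has the most standard computational core: priority weights are taken from the principal eigenvector of a pairwise-comparison matrix. The sketch below illustrates this for a hypothetical comparison of three thematic layers; the comparison values are invented for illustration and are not those used in the study.

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Priority weights from an AHP pairwise-comparison matrix: the principal
        right eigenvector normalised to sum to one, plus Saaty's consistency index."""
        vals, vecs = np.linalg.eig(pairwise)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = pairwise.shape[0]
        ci = (vals.real[k] - n) / (n - 1)        # consistency index
        return w, ci

    # hypothetical comparison of three thematic layers (e.g. geology, slope, rainfall)
    A = np.array([[1.0, 3.0, 5.0],
                  [1 / 3, 1.0, 2.0],
                  [1 / 5, 1 / 2, 1.0]])
    weights, ci = ahp_weights(A)
    print(weights, ci)
    ```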

  6. Using comparative genome analysis to identify problems in annotated microbial genomes.

    PubMed

    Poptsova, Maria S; Gogarten, J Peter

    2010-07-01

    Genome annotation is a tedious task that is mostly done by automated methods; however, the accuracy of these approaches has been questioned since the beginning of the sequencing era. Genome annotation is a multilevel process, and errors can emerge at different stages: during sequencing, as a result of gene-calling procedures, and in the process of assigning gene functions. Missed or wrongly annotated genes differentially impact different types of analyses. Here we discuss and demonstrate how the methods of comparative genome analysis can refine annotations by locating missing orthologues. We also discuss possible reasons for errors and show that the second-generation annotation systems, which combine multiple gene-calling programs with similarity-based methods, perform much better than the first annotation tools. Since old errors may propagate to the newly sequenced genomes, we emphasize that the problem of continuously updating popular public databases is an urgent and unresolved one. Due to the progress in genome-sequencing technologies, automated annotation techniques will remain the main approach in the future. Researchers need to be aware of the existing errors in the annotation of even well-studied genomes, such as Escherichia coli, and consider additional quality control for their results.

  7. Effects of different analysis techniques and recording duty cycles on passive acoustic monitoring of killer whales.

    PubMed

    Riera, Amalis; Ford, John K; Ross Chapman, N

    2013-09-01

    Killer whales in British Columbia are at risk, and little is known about their winter distribution. Passive acoustic monitoring of their year-round habitat is a valuable supplemental method to traditional visual and photographic surveys. However, long-term acoustic studies of odontocetes have some limitations, including the generation of large amounts of data that require highly time-consuming processing. There is a need to develop tools and protocols to maximize the efficiency of such studies. Here, two types of analysis, real-time and long term spectral averages, were compared to assess their performance at detecting killer whale calls in long-term acoustic recordings. In addition, two different duty cycles, 1/3 and 2/3, were tested. Both the use of long term spectral averages and a lower duty cycle resulted in a decrease in call detection and positive pod identification, leading to underestimations of the amount of time the whales were present. The impact of these limitations should be considered in future killer whale acoustic surveys. A compromise between a lower resolution data processing method and a higher duty cycle is suggested for maximum methodological efficiency.

  8. New fluorescence techniques for high-throughput drug discovery.

    PubMed

    Jäger, S; Brand, L; Eggeling, C

    2003-12-01

    The rapid increase of compound libraries as well as new targets emerging from the Human Genome Project require constant progress in pharmaceutical research. An important tool is High-Throughput Screening (HTS), which has evolved as an indispensable instrument in the pre-clinical target-to-IND (Investigational New Drug) discovery process. HTS requires machinery, which is able to test more than 100,000 potential drug candidates per day with respect to a specific biological activity. This calls for certain experimental demands especially with respect to sensitivity, speed, and statistical accuracy, which are fulfilled by using fluorescence technology instrumentation. In particular the recently developed family of fluorescence techniques, FIDA (Fluorescence Intensity Distribution Analysis), which is based on confocal single-molecule detection, has opened up a new field of HTS applications. This report describes the application of these new techniques as well as of common fluorescence techniques--such as confocal fluorescence lifetime and anisotropy--to HTS. It gives experimental examples and presents advantages and disadvantages of each method. In addition the most common artifacts (auto-fluorescence or quenching by the drug candidates) emerging from the fluorescence detection techniques are highlighted and correction methods for confocal fluorescence read-outs are presented, which are able to circumvent this deficiency.

  9. Applications of Evolutionary Technology to Manufacturing and Logistics Systems : State-of-the Art Survey

    NASA Astrophysics Data System (ADS)

    Gen, Mitsuo; Lin, Lin

    Many real-world combinatorial optimization problems from industrial engineering and operations research are very complex in nature and quite hard to solve by conventional techniques. Since the 1960s, there has been an increasing interest in imitating living beings to solve such kinds of hard combinatorial optimization problems. Simulating the natural evolutionary process of human beings results in stochastic optimization techniques called evolutionary algorithms (EAs), which can often outperform conventional optimization methods when applied to difficult real-world problems. In this survey paper, we provide a comprehensive survey of the current state-of-the-art in the use of EAs in manufacturing and logistics systems. In order to demonstrate that EAs are powerful and broadly applicable stochastic search and optimization techniques, we deal with the following engineering design problems: transportation planning models, layout design models and two-stage logistics models in logistics systems; job-shop scheduling and resource constrained project scheduling in manufacturing systems.
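
    As a generic illustration of the EA family discussed here (not any specific algorithm from the survey), the sketch below implements a minimal binary-encoded genetic algorithm with truncation selection, one-point crossover, bit-flip mutation, and elitism, applied to a toy maximisation problem.

    ```python
    import random

    def genetic_algorithm(fitness, n_genes, pop_size=60, generations=200,
                          crossover_rate=0.9, mutation_rate=0.02):
        """Minimal binary-encoded GA maximising `fitness` (illustration only)."""
        pop = [[random.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
        for _ in range(generations):
            ranked = sorted(pop, key=fitness, reverse=True)
            next_pop = ranked[:2]                               # elitism: keep the two best
            while len(next_pop) < pop_size:
                p1, p2 = random.sample(ranked[:pop_size // 2], 2)   # truncation selection
                if random.random() < crossover_rate:                # one-point crossover
                    cut = random.randrange(1, n_genes)
                    child = p1[:cut] + p2[cut:]
                else:
                    child = p1[:]
                child = [1 - g if random.random() < mutation_rate else g for g in child]
                next_pop.append(child)
            pop = next_pop
        return max(pop, key=fitness)

    # toy usage: maximise the number of ones in a 30-bit string
    best = genetic_algorithm(sum, 30)
    print(best, sum(best))
    ```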

  10. Multipulse technique exploiting the intermodulation of ultrasound waves in a nonlinear medium.

    PubMed

    Biagi, Elena; Breschi, Luca; Vannacci, Enrico; Masotti, Leonardo

    2009-03-01

    In recent years, the nonlinear properties of materials have attracted much interest in nondestructive testing and in ultrasound diagnostic applications. Acoustic nonlinear parameters represent an opportunity to improve the information that can be extracted from a medium such as structural organization and pathologic status of tissue. In this paper, a method called pulse subtraction intermodulation (PSI), based on a multipulse technique, is presented and investigated both theoretically and experimentally. This method allows separation of the intermodulation products, which arise when two separate frequencies are transmitted in a nonlinear medium, from fundamental and second harmonic components, making them available for improved imaging techniques or signal processing algorithms devoted to tissue characterization. The theory of intermodulation product generation was developed according to the Khokhlov-Zabolotskaya-Kuznetsov (KZK) nonlinear propagation equation, which is consistent with experimental results. The description of the proposed method, characterization of the intermodulation spectral contents, and quantitative results coming from in vitro experimentation are reported and discussed in this paper.
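
    The physical effect exploited by PSI can be illustrated numerically: passing a two-tone signal through a medium with a small quadratic nonlinearity creates intermodulation products at the sum and difference frequencies. The snippet below shows this toy effect only; it does not reproduce the PSI pulse sequence or the subtraction scheme itself, and all frequencies and coefficients are illustrative.

    ```python
    import numpy as np

    fs = 50e6                                  # sampling rate in Hz (illustrative)
    t = np.arange(0, 20e-6, 1 / fs)
    f1, f2 = 2.0e6, 3.0e6                      # the two transmitted frequencies (illustrative)
    tx = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    # toy nonlinear medium: linear response plus a small quadratic term
    rx = tx + 0.05 * tx ** 2

    spectrum = np.abs(np.fft.rfft(rx * np.hanning(len(rx))))
    freqs = np.fft.rfftfreq(len(rx), 1 / fs)
    # intermodulation products appear at f2 - f1 and f1 + f2, alongside the harmonics
    for f in (f2 - f1, f1, f2, 2 * f1, f1 + f2, 2 * f2):
        k = np.argmin(np.abs(freqs - f))
        print(f"{f / 1e6:.1f} MHz -> {spectrum[k]:.3f}")
    ```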

  11. Wind Lidar Edge Technique Shuttle Demonstration Mission: Anemos

    NASA Technical Reports Server (NTRS)

    Leete, Stephen J.; Bundas, David J.; Martino, Anthony J.; Carnahan, Timothy M.; Zukowski, Barbara J.

    1998-01-01

    A NASA mission is planned to demonstrate the technology for a wind lidar. This will implement the direct detection edge technique. The Anemos instrument will fly on the Space Transportation System (STS), or shuttle, aboard a Hitchhiker bridge. The instrument is being managed by the Goddard Space Flight Center as an in-house build, with science leadership from the GSFC Laboratory for Atmospheres, Mesoscale Atmospheric Processes Branch. During a roughly ten-day mission, the instrument will self calibrate and adjust for launch induced mis-alignments, and perform a campaign of measurements of tropospheric winds. The mission is planned for early 2001. The instrument is being developed under the auspices of NASA's New Millennium Program, in parallel with a comparable mission being managed by the Marshall Space Flight Center. That mission, called SPARCLE, will implement the coherent technique. NASA plans to fly the two missions together on the same shuttle flight, to allow synergy of wind measurements and a direct comparison of performance.

  12. PERSISTENCE MAPPING USING EUV SOLAR IMAGER DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, B. J.; Young, C. A., E-mail: barbara.j.thompson@nasa.gov

    We describe a simple image processing technique that is useful for the visualization and depiction of gradually evolving or intermittent structures in solar physics extreme-ultraviolet imagery. The technique is an application of image segmentation, which we call “Persistence Mapping,” to isolate extreme values in a data set, and is particularly useful for the problem of capturing phenomena that are evolving in both space and time. While integration or “time-lapse” imaging uses the full sample (of size N), Persistence Mapping rejects (N − 1)/N of the data set and identifies the most relevant 1/N values using the following rule: if a pixel reaches an extreme value, it retains that value until that value is exceeded. The simplest examples isolate minima and maxima, but any quantile or statistic can be used. This paper demonstrates how the technique has been used to extract the dynamics in long-term evolution of comet tails, erupting material, and EUV dimming regions.
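
    The stated rule maps directly onto a pixel-wise running extreme over the image stack. A minimal sketch (assuming the frames are already loaded as a NumPy array ordered in time) is:

    ```python
    import numpy as np

    def persistence_map(frames, statistic="max"):
        """Pixel-wise running extreme over a time-ordered image stack
        (frames: array of shape [time, ny, nx])."""
        if statistic == "max":
            return np.maximum.accumulate(frames, axis=0)   # each pixel keeps its maximum so far
        if statistic == "min":
            return np.minimum.accumulate(frames, axis=0)
        raise ValueError("statistic must be 'max' or 'min'")

    # the final slice is the persistence map of the whole sequence:
    # persistence = persistence_map(stack)[-1]
    ```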

  13. Model reduction by trimming for a class of semi-Markov reliability models and the corresponding error bound

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Palumbo, Daniel L.

    1991-01-01

    Semi-Markov processes have proved to be an effective and convenient tool to construct models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a model and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement and the error bound easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.

  14. Automatic luminous reflections detector using global threshold with increased luminosity contrast in images

    NASA Astrophysics Data System (ADS)

    Silva, Ricardo Petri; Naozuka, Gustavo Taiji; Mastelini, Saulo Martiello; Felinto, Alan Salvany

    2018-01-01

    The incidence of luminous reflections (LR) in captured images can interfere with the color of the affected regions. These regions tend to oversaturate, becoming whitish and, consequently, losing the original color information of the scene. Decision processes that employ images acquired from digital cameras can be impaired by the incidence of LR. Such applications include real-time video surgeries and facial and ocular recognition. This work proposes an algorithm called contrast enhancement of potential LR regions, a preprocessing step that increases the contrast of potential LR regions in order to improve the performance of automatic LR detectors. In addition, three automatic detectors were compared with and without the employment of our preprocessing method. The first one is a technique already consolidated in the literature called the Chang-Tseng threshold. We propose two automatic detectors called adapted histogram peak and global threshold. We employed four performance metrics to evaluate the detectors, namely, accuracy, precision, exactitude, and root mean square error. The exactitude metric was developed in this work; for it, a manually defined reference model was created. The global threshold detector combined with our preprocessing method presented the best results, with an average exactitude rate of 82.47%.
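
    A toy version of the "enhance then globally threshold" idea is sketched below; the gain and threshold values are placeholders, and the actual contrast-enhancement step of the cited work is more elaborate.

    ```python
    import numpy as np

    def detect_reflections(gray, gain=1.5, threshold=240):
        """Toy luminous-reflection detector: boost image intensity (standing in for
        the contrast-enhancement preprocessing) and apply one global threshold.
        Gain and threshold values are placeholders, not those of the cited work."""
        boosted = np.clip(gray.astype(float) * gain, 0, 255)
        return boosted >= threshold            # boolean mask of candidate LR pixels
    ```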

  15. Toward fidelity between specification and implementation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.; Morrison, Jeff; Wu, Yunqing

    1994-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.

  16. Verification and validation of a reliable multicast protocol

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Montgomery, Todd L.

    1995-01-01

    This paper describes the methods used to specify and implement a complex communications protocol that provides reliable delivery of data in multicast-capable, packet-switching telecommunication networks. The protocol, called the Reliable Multicasting Protocol (RMP), was developed incrementally by two complementary teams using a combination of formal and informal techniques in an attempt to ensure the correctness of the protocol implementation. The first team, called the Design team, initially specified protocol requirements using a variant of SCR requirements tables and implemented a prototype solution. The second team, called the V&V team, developed a state model based on the requirements tables and derived test cases from these tables to exercise the implementation. In a series of iterative steps, the Design team added new functionality to the implementation while the V&V team kept the state model in fidelity with the implementation through testing. Test cases derived from state transition paths in the formal model formed the dialogue between teams during development and served as the vehicles for keeping the model and implementation in fidelity with each other. This paper describes our experiences in developing our process model, details of our approach, and some example problems found during the development of RMP.

  17. A short review of variants calling for single-cell-sequencing data with applications.

    PubMed

    Wei, Zhuohui; Shu, Chang; Zhang, Changsheng; Huang, Jingying; Cai, Hongmin

    2017-11-01

    The field of single-cell sequencing is rapidly expanding, and many techniques have been developed in the past decade. With this technology, biologists can study not only the heterogeneity between two adjacent cells in the same tissue or organ, but also the evolutionary relationships and degenerative processes in a single cell. Calling variants is the main purpose of analyzing single-cell sequencing (SCS) data. Currently, some popular methods used for bulk-cell-sequencing data analysis are applied directly to SCS data. However, SCS requires an extra step of genome amplification to accumulate enough material to satisfy sequencing needs. The amplification yields large biases and thus raises challenges for using the bulk-cell-sequencing methods. This paper aims to bridge that gap, providing guidance both for the development of specialized analysis methods and for using currently available tools on SCS data. We first introduce two popular genome amplification methods and compare their capabilities. We then introduce a few popular models for calling single-nucleotide polymorphisms and copy-number variations. Finally, breakthrough applications of SCS are summarized to demonstrate its potential in researching cell evolution. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. dDocent: a RADseq, variant-calling pipeline designed for population genomics of non-model organisms.

    PubMed

    Puritz, Jonathan B; Hollenbeck, Christopher M; Gold, John R

    2014-01-01

    Restriction-site associated DNA sequencing (RADseq) has become a powerful and useful approach for population genomics. Currently, no software exists that utilizes both paired-end reads from RADseq data to efficiently produce population-informative variant calls, especially for non-model organisms with large effective population sizes and high levels of genetic polymorphism. dDocent is an analysis pipeline with a user-friendly, command-line interface designed to process individually barcoded RADseq data (with double cut sites) into informative SNPs/Indels for population-level analyses. The pipeline, written in BASH, uses data reduction techniques and other stand-alone software packages to perform quality trimming and adapter removal, de novo assembly of RAD loci, read mapping, SNP and Indel calling, and baseline data filtering. Double-digest RAD data from population pairings of three different marine fishes were used to compare dDocent with Stacks, the first generally available, widely used pipeline for analysis of RADseq data. dDocent consistently identified more SNPs shared across greater numbers of individuals and with higher levels of coverage. This is due to the fact that dDocent quality trims instead of filtering, incorporates both forward and reverse reads (including reads with INDEL polymorphisms) in assembly, mapping, and SNP calling. The pipeline and a comprehensive user guide can be found at http://dDocent.wordpress.com.

  19. dDocent: a RADseq, variant-calling pipeline designed for population genomics of non-model organisms

    PubMed Central

    Hollenbeck, Christopher M.; Gold, John R.

    2014-01-01

    Restriction-site associated DNA sequencing (RADseq) has become a powerful and useful approach for population genomics. Currently, no software exists that utilizes both paired-end reads from RADseq data to efficiently produce population-informative variant calls, especially for non-model organisms with large effective population sizes and high levels of genetic polymorphism. dDocent is an analysis pipeline with a user-friendly, command-line interface designed to process individually barcoded RADseq data (with double cut sites) into informative SNPs/Indels for population-level analyses. The pipeline, written in BASH, uses data reduction techniques and other stand-alone software packages to perform quality trimming and adapter removal, de novo assembly of RAD loci, read mapping, SNP and Indel calling, and baseline data filtering. Double-digest RAD data from population pairings of three different marine fishes were used to compare dDocent with Stacks, the first generally available, widely used pipeline for analysis of RADseq data. dDocent consistently identified more SNPs shared across greater numbers of individuals and with higher levels of coverage. This is due to the fact that dDocent quality trims instead of filtering, incorporates both forward and reverse reads (including reads with INDEL polymorphisms) in assembly, mapping, and SNP calling. The pipeline and a comprehensive user guide can be found at http://dDocent.wordpress.com. PMID:24949246

  20. Infrared Database for Process Support Materials

    NASA Technical Reports Server (NTRS)

    Bennett, K. E.; Boothe, R. E.; Burns, H. D.

    2003-01-01

    Process support materials' compatibility with cleaning processes is critical to ensure final hardware cleanliness and that performance requirements are met. Previous discovery of potential contaminants in process materials shows the need for incoming materials testing and establishment of a process materials database. The Contamination Control Team of the Materials, Processes, and Manufacturing (MP&M) Department at Marshall Space Flight Center (MSFC) has initiated the development of such an infrared (IR) database, called the MSFC Process Materials IR database, of the common process support materials used at MSFC. These process support materials include solvents, wiper cloths, gloves, bagging materials, etc. Testing includes evaluation of the potential of gloves, wiper cloths, and other items to transfer contamination to handled articles in the absence of solvent exposure, and the potential for solvent exposure to induce material degradation. This Technical Memorandum (TM) summarizes the initial testing completed through December 2002. It is anticipated that additional testing will be conducted with updates provided in future TMs. Materials were analyzed using two different IR techniques: (1) dry transference and (2) liquid extraction testing. The first of these techniques utilized the Nicolet Magna 750 IR spectrometer outfitted with a horizontal attenuated total reflectance (HATR) crystal accessory. The region from 650 to 4,000 wave numbers was analyzed, and 50 scans were performed per IR spectrum. A dry transference test was conducted by applying each sample with hand pressure to the HATR crystal to first obtain a spectrum of the parent material. The material was then removed from the HATR crystal and analyzed to determine the presence of any residues. If volatile, liquid samples were examined both prior to and following evaporation. The second technique was to perform an extraction test with each sample in five different solvents. Once the scans were complete for both the dry transference and the extraction tests, the residue from each scan was interpreted.

  1. Characterization of welded HP 9-4-30 steel for the advanced solid rocket motor

    NASA Technical Reports Server (NTRS)

    Watt, George William

    1990-01-01

    Solid rocket motor case materials must be high-strength, high-toughness, weldable alloys. The Advanced Solid Rocket Motor (ASRM) cases currently being developed will be made from a 9Ni-4Co quench and temper steel called HP 9-4-30. These ultra high-strength steels must be carefully processed to give a very clean material and a fine-grained microstructure, which ensures excellent ductility and toughness. The HP 9-4-30 steels are vacuum arc remelted and carbon deoxidized to give the cleanliness required. The ASRM case material will be formed into rings and then welded together to form the case segments. Welding is the desired joining technique because it results in a lower weight than other joining techniques. The mechanical and corrosion properties of the weld region material were fully studied.

  2. 3D Lunar Terrain Reconstruction from Apollo Images

    NASA Technical Reports Server (NTRS)

    Broxton, Michael J.; Nefian, Ara V.; Moratto, Zachary; Kim, Taemin; Lundy, Michael; Segal, Alkeksandr V.

    2009-01-01

    Generating accurate three-dimensional planetary models is becoming increasingly important as NASA plans manned missions to return to the Moon in the next decade. This paper describes a 3D surface reconstruction system called the Ames Stereo Pipeline that is designed to produce such models automatically by processing orbital stereo imagery. We discuss two important core aspects of this system: (1) refinement of satellite station positions and pose estimates through least squares bundle adjustment; and (2) a stochastic plane fitting algorithm that generalizes the Lucas-Kanade method for optimal matching between stereo pair images. These techniques allow us to automatically produce seamless, highly accurate digital elevation models from multiple stereo image pairs while significantly reducing the influence of image noise. Our technique is demonstrated on a set of 71 high resolution scanned images from the Apollo 15 mission.

  3. Determining production level under uncertainty using fuzzy simulation and bootstrap technique, a case study

    NASA Astrophysics Data System (ADS)

    Hamidi, Mohammadreza; Shahanaghi, Kamran; Jabbarzadeh, Armin; Jahani, Ehsan; Pousti, Zahra

    2017-12-01

    In every production plant, it is necessary to have an estimation of the production level, and there are often many parameters affecting this estimation. In this paper, we try to find an appropriate estimation of the production level for an industrial factory called Barez in an uncertain environment. We consider a part of the production line that has different production times for different kinds of products, which introduces both environmental and system uncertainty. To solve the problem, we simulate the line, and because of the uncertainty in the times, fuzzy simulation is used. The required fuzzy numbers are estimated using the bootstrap technique. The results were used in the production planning process by factory experts and produced satisfactory outcomes. The opinions of these experts on the efficiency of this methodology are also reported.

  4. Terrestrial Radiodetermination Performance and Cost

    DOT National Transportation Integrated Search

    1977-09-01

    The report summarizes information gathered during a study of the application of electronic techniques to geographical position determination on land and on inland waterways. Systems incorporating such techniques have been called terrestrial radiodete...

  5. Preliminary Analysis of Photoreading

    NASA Technical Reports Server (NTRS)

    McNamara, Danielle S.

    2000-01-01

    The purpose of this project was to provide a preliminary analysis of a reading strategy called PhotoReading. PhotoReading is a technique developed by Paul Scheele that claims to increase reading rate to 25,000 words per minute (Scheele, 1993). PhotoReading itself involves entering a "relaxed state" and looking at, but not reading, each page of a text for a brief moment (about 1 to 2 seconds). While this technique has received attention in the popular press, there had been no objective examinations of the technique's validity. To examine the effectiveness of PhotoReading, the principal investigator (i.e., trainee) participated in a PhotoReading workshop to learn the technique. Parallel versions of two standardized and three experimenter-created reading comprehension tests were administered to the trainee and an expert user of the PhotoReading technique to compare the use of normal reading strategies and the PhotoReading technique by both readers. The results for all measures yielded no benefits of using the PhotoReading technique. The extremely rapid reading rates claimed by PhotoReaders were not observed; indeed, the reading rates were generally comparable to those for normal reading. Moreover, the PhotoReading expert generally showed an increase in reading time when using the PhotoReading technique in comparison to when using normal reading strategies to process text. This increase in reading time with PhotoReading was also accompanied by a decrease in text comprehension.

  6. The physics of a popsicle stick bomb

    NASA Astrophysics Data System (ADS)

    Sautel, Jérémy; Bourges, Andréane; Caussarieu, Aude; Plihon, Nicolas; Taberlet, Nicolas

    2017-10-01

    Popsicle sticks can be interlocked in the so-called "cobra weave" to form a chain under tension. When one end of the chain is released, the sticks rapidly disentangle, forming a traveling wave that propagates down the chain. In this paper, the properties of the traveling front are studied experimentally, and classical results from the theory of elasticity allow for a dimensional analysis of the height and speed of the traveling wave. The study presented here can help undergraduate students familiarize themselves with experimental techniques of image processing, and it also demonstrates the power of dimensional analysis and scaling laws.

  7. Ballistic imaging of the near field in a diesel spray

    NASA Astrophysics Data System (ADS)

    Linne, Mark; Paciaroni, Megan; Hall, Tyler; Parker, Terry

    2006-06-01

    We have developed an optical technique called ballistic imaging to view breakup of the near-field of an atomizing spray. In this paper, we describe the successful use of a time-gated ballistic imaging instrument to obtain single-shot images of core region breakup in a transient, single hole atomizing diesel fuel spray issuing into one atmosphere. We present a sequence of images taken at the nozzle for various times after start of injection, and a sequence taken at various positions downstream of the nozzle exit at a fixed time. These images contain signatures of periodic behavior, voids, and entrainment processes.

  8. An Evaluation of Understandability of Patient Journey Models in Mental Health

    PubMed Central

    2016-01-01

    Background There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. Objectives This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Method Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. Results The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. Conclusions The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers. PMID:27471006

  9. Habituation, Response to Novelty, and Dishabituation in Human Infants: Tests of a Dual-Process Theory of Visual Attention.

    ERIC Educational Resources Information Center

    Kaplan, Peter S.; Werner, John S.

    1986-01-01

    Tests infants' dual-process performance (a process mediating response decrements called habituation and a state-dependent process mediating response increments called sensitization) on visual habituation-dishabituation tasks. (HOD)

  10. Timing and teamwork--an observational pilot study of patients referred to a Rapid Response Team with the aim of identifying factors amenable to re-design of a Rapid Response System.

    PubMed

    Peebles, Emma; Subbe, Christian P; Hughes, Paul; Gemmell, Les

    2012-06-01

    Rapid Response Teams aim to accelerate recognition and treatment of acutely unwell patients. Delays in delivery might undermine efficiency of the intervention. Our understanding of the causes of these delays is, as yet, incomplete. To identify modifiable causes of delays in the treatment of critically ill patients outside intensive care with a focus on factors amenable to system design. Review of care records and direct observation with process mapping of care delivered to 17 acutely unwell patients attended by a Rapid Response Team in a District General Hospital in the United Kingdom. Delays were defined as processes with no added value for patient care. Essential diagnostic and therapeutic procedures accounted for only 31% of time of care processes. Causes for delays could be classified into themes as (1) delays in call-out of the Rapid Response Team, (2) problems with team cohesion including poor communication and team efficiency and (3) lack of resources including lack of first line antibiotics, essential equipment, experienced staff and critical care beds. We identified a number of potentially modifiable causes for delays in care of acutely ill patients. Improved process design could include automated call-outs, a dedicated kit for emergency treatment in relevant clinical areas, increased usage of standard operating procedures and staff training using crew resource management techniques. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  11. Asteroseismic inversions in the Kepler era: application to the Kepler Legacy sample

    NASA Astrophysics Data System (ADS)

    Buldgen, Gaël; Reese, Daniel; Dupret, Marc-Antoine

    2017-10-01

    In the past few years, the CoRoT and Kepler missions have carried out what is now called the space photometry revolution. This revolution is still ongoing thanks to K2 and will be continued by the TESS and PLATO 2.0 missions. However, the photometry revolution must also be followed by progress in stellar modelling, in order to lead to more precise and accurate determinations of fundamental stellar parameters such as masses, radii and ages. In this context, the long-standing problems related to mixing processes in stellar interiors are the main obstacle to further improvements of stellar modelling. In this contribution, we will apply structural asteroseismic inversion techniques to targets from the Kepler Legacy sample and analyse how these can help us constrain the fundamental parameters and mixing processes in these stars. Our approach is based on previous studies using the SOLA inversion technique [1] to determine integrated quantities such as the mean density [2], the acoustic radius, and core condition indicators [3], and has already been successfully applied to the 16Cyg binary system [4]. We will show how this technique can be applied to the Kepler Legacy sample and how new indicators can help us to further constrain the chemical composition profiles of stars as well as provide stringent constraints on stellar ages.

  12. Measurement of Interfacial Profiles of Wavy Film Flow on Inclined Wall

    NASA Astrophysics Data System (ADS)

    Rosli, N.; Amagai, K.

    2016-02-01

    Falling liquid films on inclined walls are present in many industrial processes, such as food processing, seawater desalination and electronic device manufacturing. In order to ensure optimal operating efficiency in these industries, a fundamental study of the interfacial flow profiles of the liquid film is of great importance. However, it is generally difficult to experimentally predict the interfacial profiles of liquid film flow on an inclined wall because of the unstable wavy flow that usually forms on the liquid film surface. In this paper, the liquid film surface velocity was measured using a non-intrusive technique called the photochromic dye marking method. This technique utilizes the color change of a liquid containing photochromic dye when exposed to a UV light source. The movement of the liquid film surface marked by the UV light was analyzed together with the wave passing over the liquid. As a result, the liquid film surface was found to slightly slow its gradual movement when approached by the wave, before gradually moving again after the intersection with the wave.

  13. Electromagnetic Test-Facility characterization: an identification approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zicker, J.E.; Candy, J.V.

    The response of an object subjected to high-energy, transient electromagnetic (EM) fields, sometimes called electromagnetic pulses (EMP), is an important issue in the survivability of electronic systems (e.g., aircraft), especially when the field has been generated by a high-altitude nuclear burst. The characterization of transient response information is a matter of national concern. In this report we discuss techniques to: (1) improve signal processing at a test facility; and (2) parameterize a particular object response. First, we discuss the application of identification-based signal processing techniques to improve signal levels at the Lawrence Livermore National Laboratory (LLNL) EM Transient Test Facility. We identify models of test equipment and then use these models to deconvolve the input/output sequences for the object under test. A parametric model of the object is identified from these data. The model can be used to extrapolate the response to threat-level EMP. Also discussed is the development of a facility simulator (EMSIM) useful for experimental design and calibration and a deconvolution algorithm (DECONV) useful for removing probe effects from the measured data.

  14. High-Rate Assembly of Nanomaterials on Insulating Surfaces Using Electro-Fluidic Directed Assembly.

    PubMed

    Yilmaz, Cihan; Sirman, Asli; Halder, Aditi; Busnaina, Ahmed

    2017-08-22

    Applications based on conductive or semiconducting nanomaterials, such as electronics and sensors, often require direct placement of those nanomaterials on insulating surfaces. Most fluidic-based directed assembly techniques on insulating surfaces utilize capillary force and evaporation but are diffusion limited and slow. Electrophoretic assembly, on the other hand, is fast but can only be utilized for assembly on a conductive surface. Here, we present a directed assembly technique that enables rapid assembly of nanomaterials on insulating surfaces. The approach leverages and combines fluidic and electrophoretic assembly by applying the electric field through an insulating surface via a conductive film underneath. The approach (called electro-fluidic) yields an assembly process that is two orders of magnitude faster than fluidic assembly. By understanding the forces acting on the assembly process, we have demonstrated the controlled assembly of various types of nanomaterials that are conducting, semiconducting, and insulating, including nanoparticles and single-walled carbon nanotubes, on insulating rigid and flexible substrates. The presented approach shows great promise for making practical devices in miniaturized sensors and flexible electronics.

  15. New procedure for extraction of algal lipids from wet biomass: a green clean and scalable process.

    PubMed

    Dejoye Tanzi, Celine; Abert Vian, Maryline; Chemat, Farid

    2013-04-01

    A new procedure, called the Simultaneous Distillation and Extraction Process (SDEP), for lipid extraction from wet microalgae (Nannochloropsis oculata and Dunaliella salina) was reported. This method does not require pre-drying of the biomass and employs alternative solvents such as d-limonene, α-pinene and p-cymene. The procedure has been compared with Soxhlet extraction (Sox) and the Bligh & Dyer method (B&D). For N. oculata, results showed that SDEP-cymene provided lipid yields similar to B&D (21.45% and 23.78%), while SDEP-limonene and SDEP-pinene provided lower yields (18.73% and 18.75%, respectively). For D. salina, SDEP-pinene provided the maximum lipid yield (3.29%) compared to the other solvents, which is quite close to the B&D result (4.03%). No significant differences in the distribution of lipid classes or in fatty acid composition were observed between the different techniques. Evaluation of energy consumption indicates a substantial saving in extraction cost for SDEP compared to the conventional extraction technique, Soxhlet. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Multigrid direct numerical simulation of the whole process of flow transition in 3-D boundary layers

    NASA Technical Reports Server (NTRS)

    Liu, Chaoqun; Liu, Zhining

    1993-01-01

    A new technology was developed in this study which provides a successful numerical simulation of the whole process of flow transition in 3-D boundary layers, including linear growth, secondary instability, breakdown, and transition at relatively low CPU cost. Most other spatial numerical simulations require high CPU cost and blow up at the stage of flow breakdown. A fourth-order finite difference scheme on stretched and staggered grids, a fully implicit time marching technique, a semi-coarsening multigrid based on the so-called approximate line-box relaxation, and a buffer domain for the outflow boundary conditions were all used for high-order accuracy, good stability, and fast convergence. A new fine-coarse-fine grid mapping technique was developed to keep the code running after the laminar flow breaks down. The computational results are in good agreement with linear stability theory, secondary instability theory, and some experiments. The cost for a typical case with 162 x 34 x 34 grid is around 2 CRAY-YMP CPU hours for 10 T-S periods.

  17. Permanent Scatterer InSAR Analysis and Validation in the Gulf of Corinth.

    PubMed

    Elias, Panagiotis; Kontoes, Charalabos; Papoutsis, Ioannis; Kotsis, Ioannis; Marinou, Aggeliki; Paradissis, Dimitris; Sakellariou, Dimitris

    2009-01-01

    The Permanent Scatterers Interferometric SAR technique (PSInSAR) is a method that accurately estimates the near vertical terrain deformation rates, of the order of ∼1 mm year(-1), overcoming the physical and technical restrictions of classic InSAR. In this paper the method is strengthened by creating a robust processing chain, incorporating PSInSAR analysis together with algorithmic adaptations for Permanent Scatterer Candidates (PSCs) and Permanent Scatterers (PSs) selection. The processing chain, called PerSePHONE, was applied and validated in the geophysically active area of the Gulf of Corinth. The analysis indicated a clear subsidence trend in the north-eastern part of the gulf, with the maximum deformation of ∼2.5 mm year(-1) occurring in the region north of the Gulf of Alkyonides. The validity of the results was assessed against geophysical/geological and geodetic studies conducted in the area, which include continuous seismic profiling data and GPS height measurements. All these observations converge to the same deformation pattern as the one derived by the PSInSAR technique.

  18. Permanent Scatterer InSAR Analysis and Validation in the Gulf of Corinth

    PubMed Central

    Elias, Panagiotis; Kontoes, Charalabos; Papoutsis, Ioannis; Kotsis, Ioannis; Marinou, Aggeliki; Paradissis, Dimitris; Sakellariou, Dimitris

    2009-01-01

    The Permanent Scatterers Interferometric SAR technique (PSInSAR) is a method that accurately estimates the near vertical terrain deformation rates, of the order of ∼1 mm year-1, overcoming the physical and technical restrictions of classic InSAR. In this paper the method is strengthened by creating a robust processing chain, incorporating PSInSAR analysis together with algorithmic adaptations for Permanent Scatterer Candidates (PSCs) and Permanent Scatterers (PSs) selection. The processing chain, called PerSePHONE, was applied and validated in the geophysically active area of the Gulf of Corinth. The analysis indicated a clear subsidence trend in the north-eastern part of the gulf, with the maximum deformation of ∼2.5 mm year-1 occurring in the region north of the Gulf of Alkyonides. The validity of the results was assessed against geophysical/geological and geodetic studies conducted in the area, which include continuous seismic profiling data and GPS height measurements. All these observations converge to the same deformation pattern as the one derived by the PSInSAR technique. PMID:22389587

  19. Anomaly-based intrusion detection for SCADA systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-07-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security attention, and there are concerns that they could be the target of international terrorists. With the constantly growing number of Internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimate that malicious online actions may cause $75 billion in damage in 2007. One of the interesting countermeasures for enhancing information system security is called intrusion detection. This paper will briefly discuss the history of research in intrusion detection techniques and introduce the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT) and applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
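
    As a rough illustration of the auto-associative kernel regression idea summarized above (the SPRT decision stage is omitted), the Python sketch below reconstructs each query observation as a Gaussian-kernel weighted average of normal-operation memory vectors; a large residual then flags a possible anomaly. The sensor values, memory size and kernel bandwidth are invented for illustration and are not taken from the paper.

      # Hedged sketch of the AAKR reconstruction step only (no SPRT stage).
      import numpy as np

      rng = np.random.default_rng(0)
      # Toy "normal operation" memory: 200 observations of 3 process variables.
      memory = rng.normal([50.0, 1.2, 300.0], [1.0, 0.05, 5.0], size=(200, 3))

      def aakr_reconstruct(query, memory, bandwidth=1.0):
          # Work in standardized units so all sensors contribute comparably.
          mu, sd = memory.mean(0), memory.std(0)
          z_mem, z_q = (memory - mu) / sd, (query - mu) / sd
          d2 = ((z_mem - z_q) ** 2).sum(axis=1)
          w = np.exp(-d2 / (2 * bandwidth ** 2))          # Gaussian kernel weights
          return (w[:, None] * memory).sum(axis=0) / w.sum()

      normal_query = np.array([50.5, 1.21, 301.0])
      attack_query = np.array([50.5, 1.21, 340.0])          # spoofed third sensor
      for q in (normal_query, attack_query):
          residual = np.abs(q - aakr_reconstruct(q, memory))
          print(residual.round(2))                          # large residual -> anomaly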

  20. Control algorithms for aerobraking in the Martian atmosphere

    NASA Technical Reports Server (NTRS)

    Ward, Donald T.; Shipley, Buford W., Jr.

    1991-01-01

    The Analytic Predictor Corrector (APC) and Energy Controller (EC) atmospheric guidance concepts were adapted to control an interplanetary vehicle aerobraking in the Martian atmosphere. Changes are made to the APC to improve its robustness to density variations. These changes include adaptation of a new exit phase algorithm, an adaptive transition velocity to initiate the exit phase, refinement of the reference dynamic pressure calculation and two improved density estimation techniques. The modified controller with the hybrid density estimation technique is called the Mars Hybrid Predictor Corrector (MHPC), while the modified controller with a polynomial density estimator is called the Mars Predictor Corrector (MPC). A Lyapunov Steepest Descent Controller (LSDC) is adapted to control the vehicle. The LSDC lacked robustness, so a Lyapunov tracking exit phase algorithm is developed to guide the vehicle along a reference trajectory. This algorithm, when using the hybrid density estimation technique to define the reference path, is called the Lyapunov Hybrid Tracking Controller (LHTC). With the polynomial density estimator used to define the reference trajectory, the algorithm is called the Lyapunov Tracking Controller (LTC). These four new controllers are tested using a six degree of freedom computer simulation to evaluate their robustness. The MHPC, MPC, LHTC, and LTC show dramatic improvements in robustness over the APC and EC.

  1. Blind source computer device identification from recorded VoIP calls for forensic investigation.

    PubMed

    Jahanirad, Mehdi; Anuar, Nor Badrul; Wahab, Ainuddin Wahid Abdul

    2017-03-01

    VoIP services provide fertile ground for criminal activity, so identifying the transmitting computer device from a recorded VoIP call may help the forensic investigator to reveal useful information. It also helps prove the authenticity of a call recording submitted to the court as evidence. This paper extended a previous study on the use of recorded VoIP calls for blind source computer device identification. Although initial results were promising, the theoretical reasoning for them has yet to be established. The study suggested computing the entropy of mel-frequency cepstrum coefficients (entropy-MFCC) from near-silent segments as an intrinsic feature set that captures the device response function due to the tolerances in the electronic components of individual computer devices. By applying the supervised learning techniques of naïve Bayes, linear logistic regression, neural networks and support vector machines to the entropy-MFCC features, state-of-the-art identification accuracy of nearly 99.9% has been achieved on different sets of computer devices for both call-recording and microphone-recording scenarios. Furthermore, unsupervised learning techniques, including simple k-means, expectation-maximization and density-based spatial clustering of applications with noise (DBSCAN), provided promising results for the call-recording dataset by assigning the majority of instances to their correct clusters. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
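
    A minimal Python sketch of the feature-extraction idea only is given below. It computes a histogram-based Shannon entropy for each MFCC coefficient over near-silent frames; the exact entropy definition used in the study is not reproduced, librosa and scipy are assumed to be available, and the frame sizes, bin count and "near-silent" energy quantile are illustrative choices.

      # Hedged sketch: entropy-of-MFCC features from near-silent frames of a recording.
      import numpy as np
      import librosa
      from scipy.stats import entropy

      def entropy_mfcc(path, n_mfcc=13, n_bins=32, silence_quantile=0.2):
          y, sr = librosa.load(path, sr=None)
          frames = librosa.util.frame(y, frame_length=1024, hop_length=512)
          rms = np.sqrt((frames ** 2).mean(axis=0))
          quiet = rms <= np.quantile(rms, silence_quantile)   # near-silent frames only
          mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc,
                                      n_fft=1024, hop_length=512)[:, :quiet.size][:, quiet]
          feats = []
          for coeff in mfcc:                                  # one entropy value per coefficient
              hist, _ = np.histogram(coeff, bins=n_bins)
              feats.append(entropy(hist + 1e-12))             # Shannon entropy of the histogram
          return np.array(feats)                              # feature vector for one recording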

  2. Terrestrial Radiodetermination Potential Users and Their Requirements

    DOT National Transportation Integrated Search

    1976-07-01

    The report summarizes information gathered during a preliminary study of the application of electronic techniques to geographical position determination on land and on inland waterways. Systems incorporating such techniques have been called terrestri...

  3. Interactive algebraic grid-generation technique

    NASA Technical Reports Server (NTRS)

    Smith, R. E.; Wiese, M. R.

    1986-01-01

    An algebraic grid generation technique and the use of an associated interactive computer program are described. The technique, called the two-boundary technique, is based on Hermite cubic interpolation between two fixed, nonintersecting boundaries. The boundaries are referred to as the bottom and top, and they are defined by two ordered sets of points. Left and right side boundaries which intersect the bottom and top boundaries may also be specified by two ordered sets of points. When side boundaries are specified, linear blending functions are used to conform the interior interpolation to the side boundaries. Spacing between physical grid coordinates is determined as a function of boundary data and uniformly spaced computational coordinates. Control functions relating computational coordinates to parametric intermediate variables that affect the distance between grid points are embedded in the interpolation formulas. A versatile control function technique with smooth cubic spline functions is presented. The technique works best in an interactive graphics environment where computational displays and user responses are quickly exchanged. An interactive computer program based on the technique, called TBGG (two boundary grid generation), is also described.
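
    A minimal sketch of Hermite cubic blending between two boundary curves, in the spirit of the two-boundary technique, is shown below; the function name, the end-derivative vectors and the example boundaries are illustrative and do not reproduce TBGG's control functions or side-boundary blending.

      # Hedged sketch: Hermite cubic interpolation between a bottom and a top boundary.
      import numpy as np

      def hermite_two_boundary_grid(bottom, top, d_bottom, d_top, nt):
          """bottom, top: (ns, 2) boundary points (matched parametrically);
          d_bottom, d_top: (ns, 2) derivative vectors at the boundaries;
          nt: number of grid lines between the two boundaries."""
          t = np.linspace(0.0, 1.0, nt)[:, None, None]        # computational coordinate
          h00 = 2*t**3 - 3*t**2 + 1                           # Hermite basis functions
          h10 = t**3 - 2*t**2 + t
          h01 = -2*t**3 + 3*t**2
          h11 = t**3 - t**2
          # Blend positions and derivatives for every parametric station at once.
          return h00*bottom + h10*d_bottom + h01*top + h11*d_top   # shape (nt, ns, 2)

      # Example: a flat bottom boundary and a gently curved top boundary.
      s = np.linspace(0.0, 1.0, 21)
      bottom = np.stack([s, np.zeros_like(s)], axis=1)
      top = np.stack([s, 1.0 + 0.2*np.sin(np.pi*s)], axis=1)
      normals = np.stack([np.zeros_like(s), np.ones_like(s)], axis=1)
      grid = hermite_two_boundary_grid(bottom, top, 0.5*normals, 0.5*normals, nt=11)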

  4. Calling depths of baleen whales from single sensor data: development of an autocorrelation method using multipath localization.

    PubMed

    Valtierra, Robert D; Glynn Holt, R; Cholewiak, Danielle; Van Parijs, Sofie M

    2013-09-01

    Multipath localization techniques have not previously been applied to baleen whale vocalizations due to difficulties in application to tonal vocalizations. Here it is shown that an autocorrelation method coupled with the direct reflected time difference of arrival localization technique can successfully resolve location information. A derivation was made to model the autocorrelation of a direct signal and its overlapping reflections to illustrate that an autocorrelation may be used to extract reflection information from longer duration signals containing a frequency sweep, such as some calls produced by baleen whales. An analysis was performed to characterize the difference in behavior of the autocorrelation when applied to call types with varying parameters (sweep rate, call duration). The method's feasibility was tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The method was then used to estimate the depth and range of a single North Atlantic right whale (Eubalaena glacialis) and humpback whale (Megaptera novaeangliae) from two separate experiments.
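
    The following Python sketch illustrates the core idea on synthetic data: the autocorrelation of a frequency-swept call overlapped with a delayed surface reflection shows a secondary peak at the direct-reflected time difference. The chirp parameters, echo delay and noise level are invented for illustration.

      # Hedged sketch: extracting a direct-reflected delay from an autocorrelation.
      import numpy as np

      fs = 8000.0
      t = np.arange(0, 1.0, 1.0/fs)
      chirp = np.sin(2*np.pi*(100*t + 50*t**2))        # simple linear frequency sweep
      delay_s = 0.05                                    # true direct-reflected delay
      d = int(delay_s*fs)
      signal = chirp.copy()
      signal[d:] += 0.6*chirp[:-d]                      # overlapping reflected copy
      signal += 0.05*np.random.randn(signal.size)       # measurement noise

      ac = np.correlate(signal, signal, mode="full")[signal.size - 1:]  # one-sided autocorrelation
      ac[:int(0.01*fs)] = 0.0                           # suppress the zero-lag main lobe
      estimated_delay = np.argmax(ac)/fs                # lag of the strongest echo peak
      print(f"estimated direct-reflected delay: {estimated_delay:.3f} s")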

  5. Developing questionnaires for educational research: AMEE Guide No. 87

    PubMed Central

    La Rochelle, Jeffrey S.; Dezee, Kent J.; Gehlbach, Hunter

    2014-01-01

    In this AMEE Guide, we consider the design and development of self-administered surveys, commonly called questionnaires. Questionnaires are widely employed in medical education research. Unfortunately, the processes used to develop such questionnaires vary in quality and lack consistent, rigorous standards. Consequently, the quality of the questionnaires used in medical education research is highly variable. To address this problem, this AMEE Guide presents a systematic, seven-step process for designing high-quality questionnaires, with particular emphasis on developing survey scales. These seven steps do not address all aspects of survey design, nor do they represent the only way to develop a high-quality questionnaire. Instead, these steps synthesize multiple survey design techniques and organize them into a cohesive process for questionnaire developers of all levels. Addressing each of these steps systematically will improve the probabilities that survey designers will accurately measure what they intend to measure. PMID:24661014

  6. Developing questionnaires for educational research: AMEE Guide No. 87.

    PubMed

    Artino, Anthony R; La Rochelle, Jeffrey S; Dezee, Kent J; Gehlbach, Hunter

    2014-06-01

    In this AMEE Guide, we consider the design and development of self-administered surveys, commonly called questionnaires. Questionnaires are widely employed in medical education research. Unfortunately, the processes used to develop such questionnaires vary in quality and lack consistent, rigorous standards. Consequently, the quality of the questionnaires used in medical education research is highly variable. To address this problem, this AMEE Guide presents a systematic, seven-step process for designing high-quality questionnaires, with particular emphasis on developing survey scales. These seven steps do not address all aspects of survey design, nor do they represent the only way to develop a high-quality questionnaire. Instead, these steps synthesize multiple survey design techniques and organize them into a cohesive process for questionnaire developers of all levels. Addressing each of these steps systematically will improve the probabilities that survey designers will accurately measure what they intend to measure.

  7. Fabrication of large area Si cylindric drift detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, W.; Kraner, H.W.; Li, Z.

    1993-04-01

    The processing steps for an advanced Si drift detector, a large area cylindrical drift detector (CDD), were carried out in the BNL class 100 cleanroom, with the exception of the ion implantation. A double-side planar process technique was developed for the fabrication of the CDD. Important improvements of the double-side planar process in this fabrication are the introduction of an Al implantation protection mask and the retention of a 1000 Angstrom oxide layer in the p-window during implantation. Another important design feature of the CDD is the structure called the "river," which allows the current generated at the Si-SiO₂ interface to "flow" into the guard anode and thus can minimize the leakage current at the signal anode. Test results showed that, for the best detector, most of the signal anodes have a leakage current of about 0.3 nA/cm².

  8. Immobilised lipases in the cosmetics industry.

    PubMed

    Ansorge-Schumacher, Marion B; Thum, Oliver

    2013-08-07

    Commercial products for personal care, generally perceived as cosmetics, have an important impact on everyday life worldwide. Accordingly, the market for both consumer products and specialty chemicals comprising their ingredients is considerable. Lipases have started to play a minor role as active ingredients in so-called 'functional cosmetics' as well as a major role as catalysts for the industrial production of various specialty esters, aroma compounds and active agents. Interestingly, both applications almost always require preparation by appropriate immobilisation techniques. In addition, for catalytic use special reactor concepts often have to be employed due to the mostly limited stability of these preparations. Nevertheless, these processes show distinct advantages based on process simplification, product quality and environmental footprint and are therefore apt to more and more replace traditional chemical processes. Here, for the first time a review on the various aspects of using immobilised lipases in the cosmetics industry is given.

  9. Intimate Debate Technique: Medicinal Use of Marijuana

    ERIC Educational Resources Information Center

    Herreid, Clyde Freeman; DeRei, Kristie

    2007-01-01

    Classroom debates used to be familiar exercises to students schooled in past generations. In this article, the authors describe the technique called "intimate debate". To cooperative learning specialists, the technique is known as "structured debate" or "constructive debate". It is a powerful method for dealing with case topics that involve…

  10. Experiment in Onboard Synthetic Aperture Radar Data Processing

    NASA Technical Reports Server (NTRS)

    Holland, Matthew

    2011-01-01

    Single event upsets (SEUs) are a threat to any computing system running on hardware that has not been physically radiation hardened. In addition to mandating the use of performance-limited, hardened heritage equipment, prior techniques for dealing with the SEU problem often involved hardware-based error detection and correction (EDAC). With limited computing resources, software-based EDAC, or any more elaborate recovery methods, were often not feasible. Synthetic aperture radars (SARs), when operated in the space environment, are interesting due to their relevance to NASA's objectives, but problematic in the sense of producing prodigious amounts of raw data. Prior implementations of the SAR data processing algorithm have been too slow, too computationally intensive, and require too much application memory for onboard execution to be a realistic option when using the type of heritage processing technology described above. This standard C-language implementation of SAR data processing is distributed over many cores of a Tilera Multicore Processor, and employs novel Radiation Hardening by Software (RHBS) techniques designed to protect the component processes (one per core) and their shared application memory from the sort of SEUs expected in the space environment. The source code includes calls to Tilera APIs, and a specialized Tilera compiler is required to produce a Tilera executable. The compiled application reads input data describing the position and orientation of a radar platform, as well as its radar-burst data, over time and writes out processed data in a form that is useful for analysis of the radar observations.

  11. Boostream: a dynamic fluid flow process to assemble nanoparticles at liquid interface

    NASA Astrophysics Data System (ADS)

    Delléa, Olivier; Lebaigue, Olivier

    2017-12-01

    CEA-LITEN develops an original process called Boostream® to manipulate, assemble and connect micro- or nanoparticles of various materials, sizes, shapes and functions to obtain monolayer colloidal crystals (MCCs). This process uses the upper surface of a liquid film flowing down a ramp to assemble particles in a manner that is close to the horizontal configuration of a Langmuir-Blodgett film construction. In the presence of particles at the liquid interface, the film down-flow configuration exhibits an unusual hydraulic jump which results from the fluid flow accommodating the particle monolayer. In order to master the process, the fluid flow has been modeled and experimentally characterized by optical means, such as the moiré technique, which consists of observing the reflection of a succession of periodic black-and-red fringes on the liquid surface mirror. The fringe images are deformed when reflected by the curved liquid surface associated with the hydraulic jump, the fringe deformation being proportional to the local slope of the surface. This original experimental setup allowed us to obtain the surface profile in the jump region and to measure it along with the main process parameters (liquid flow rate, slope angle, temperature-sensitive fluid properties such as dynamic viscosity or surface tension, and particle sizes). This work presents the experimental setup and its simple model, describes the different experimental characterization techniques used, and focuses on the way the hydraulic jump depends on the process parameters.

  12. EMPRESS: A European Project to Enhance Process Control Through Improved Temperature Measurement

    NASA Astrophysics Data System (ADS)

    Pearce, J. V.; Edler, F.; Elliott, C. J.; Rosso, L.; Sutton, G.; Andreu, A.; Machin, G.

    2017-08-01

    A new European project called EMPRESS, funded by the EURAMET program 'European Metrology Program for Innovation and Research,' is described. The 3-year project, which started in the summer of 2015, is intended to substantially augment the efficiency of high-value manufacturing processes by improving temperature measurement techniques at the point of use. The project consortium has 18 partners and 5 external collaborators, from the metrology sector, high-value manufacturing, sensor manufacturing, and academia. Accurate control of temperature is key to ensuring process efficiency and product consistency and is often not achieved to the level required for modern processes. Enhanced efficiency of processes may take several forms including reduced product rejection/waste; improved energy efficiency; increased intervals between sensor recalibration/maintenance; and increased sensor reliability, i.e., reduced amount of operator intervention. Traceability of temperature measurements to the International Temperature Scale of 1990 (ITS-90) is a critical factor in establishing low measurement uncertainty and reproducible, consistent process control. Introducing such traceability in situ (i.e., within the industrial process) is a theme running through this project.

  13. Combining LCT tools for the optimization of an industrial process: material and energy flow analysis and best available techniques.

    PubMed

    Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares

    2011-09-15

    Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included within the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the industrial metabolism definition. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify so-called improvable flows by MEFA, so that appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. Fourteen improvable flows were identified and seven candidate BAT were proposed with the aim of reducing these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and to select the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. A blind source separation approach for humpback whale song separation.

    PubMed

    Zhang, Zhenbin; White, Paul R

    2017-04-01

    Many marine mammal species are highly social and are frequently encountered in groups or aggregations. When conducting passive acoustic monitoring in such circumstances, recordings commonly contain vocalizations of multiple individuals which overlap in time and frequency. This paper considers the use of blind source separation as a method for processing these recordings to separate the calls of individuals. The example problem considered here is that of the songs of humpback whales. The high levels of noise and long impulse responses can make source separation in underwater contexts a challenging proposition. The approach presented here is based on time-frequency masking, allied to a noise reduction process. The technique is assessed using simulated and measured data sets, and the results demonstrate the effectiveness of the method for separating humpback whale songs.
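
    A hedged sketch of the time-frequency masking step alone (not the blind estimation of the mask) is given below: an ideal binary mask built from two known reference signals is applied to the STFT of their mixture and the masked spectrogram is inverted. The synthetic "songs", sample rate and STFT settings are illustrative assumptions.

      # Hedged sketch: ideal binary time-frequency masking of a two-source mixture.
      import numpy as np
      from scipy.signal import stft, istft

      fs = 4000
      t = np.arange(0, 2.0, 1/fs)
      song_a = np.sin(2*np.pi*300*t) * (1 + 0.5*np.sin(2*np.pi*2*t))   # stand-ins for two
      song_b = np.sin(2*np.pi*900*t) * (1 + 0.5*np.cos(2*np.pi*3*t))   # overlapping callers
      mixture = song_a + song_b + 0.05*np.random.randn(t.size)

      _, _, A = stft(song_a, fs, nperseg=512)
      _, _, B = stft(song_b, fs, nperseg=512)
      _, _, X = stft(mixture, fs, nperseg=512)

      mask_a = (np.abs(A) > np.abs(B)).astype(float)    # 1 where caller A dominates a TF bin
      _, est_a = istft(mask_a * X, fs, nperseg=512)     # masked mixture -> estimate of song_a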

  15. Experimental analysis on semi-finishing machining of Ti6Al4V additively manufactured by direct melting laser sintering

    NASA Astrophysics Data System (ADS)

    Imbrogno, Stano; Bordin, Alberto; Bruschi, Stefania; Umbrello, Domenico

    2016-10-01

    Additive Manufacturing (AM) techniques are particularly appealing for titanium aerospace and biomedical components because they permit a strong reduction of the buy-to-fly ratio. However, finishing machining operations are often necessary to reduce the uneven surface roughness and to correct geometric errors arising from locally limited accuracy. This work shows the influence of the cutting parameters, cutting speed and feed rate, on the cutting forces as well as on the thermal field observed in the cutting zone during a turning operation carried out on bars made of Ti6Al4V obtained by the AM process called Direct Metal Laser Sintering (DMLS). Moreover, the sub-surface microstructure alterations due to the process are also shown and discussed.

  16. Additive Manufacturing of Metal Structures at the Micrometer Scale.

    PubMed

    Hirt, Luca; Reiser, Alain; Spolenak, Ralph; Zambelli, Tomaso

    2017-05-01

    Currently, the focus of additive manufacturing (AM) is shifting from simple prototyping to actual production. One driving factor of this process is the ability of AM to build geometries that are not accessible by subtractive fabrication techniques. While these techniques often call for a geometry that is easiest to manufacture, AM enables the geometry required for best performance to be built by freeing the design process from restrictions imposed by traditional machining. At the micrometer scale, the design limitations of standard fabrication techniques are even more severe. Microscale AM thus holds great potential, as confirmed by the rapid success of commercial micro-stereolithography tools as an enabling technology for a broad range of scientific applications. For metals, however, there is still no established AM solution at small scales. To tackle the limited resolution of standard metal AM methods (a few tens of micrometers at best), various new techniques aimed at the micrometer scale and below are presently under development. Here, we review these recent efforts. Specifically, we feature the techniques of direct ink writing, electrohydrodynamic printing, laser-assisted electrophoretic deposition, laser-induced forward transfer, local electroplating methods, laser-induced photoreduction and focused electron or ion beam induced deposition. Although these methods have proven to facilitate the AM of metals with feature sizes in the range of 0.1-10 µm, they are still in a prototype stage and their potential is not fully explored yet. For instance, comprehensive studies of material availability and material properties are often lacking, yet compulsory for actual applications. We address these items while critically discussing and comparing the potential of current microscale metal AM techniques. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Digital computer technique for setup and checkout of an analog computer

    NASA Technical Reports Server (NTRS)

    Ambaruch, R.

    1968-01-01

    Computer program technique, called Analog Computer Check-Out Routine Digitally /ACCORD/, generates complete setup and checkout data for an analog computer. In addition, the correctness of the analog program implementation is validated.

  18. A new measure for gene expression biclustering based on non-parametric correlation.

    PubMed

    Flores, Jose L; Inza, Iñaki; Larrañaga, Pedro; Calvo, Borja

    2013-12-01

    One of the emerging techniques for analysis of DNA microarray data, known as biclustering, is the search for subsets of genes and conditions which are coherently expressed. These subgroups provide clues about the main biological processes. Until now, different approaches to this problem have been proposed. Most of them use the mean squared residue as the quality measure, but relevant and interesting patterns, such as shifting or scaling patterns, cannot be detected. Furthermore, recent papers show that there exist new coherence patterns involved in different kinds of cancers and tumors, such as inverse relationships between genes, which cannot be captured. The proposed measure is called Spearman's biclustering measure (SBM), which estimates the quality of a bicluster based on the non-linear correlation among genes and conditions simultaneously. The search for biclusters is performed using an evolutionary technique called estimation of distribution algorithms, which uses the SBM measure as its fitness function. This approach has been examined from different points of view using artificial and real microarrays. The assessment process has involved the use of quality indexes, a set of reference bicluster patterns including new patterns, and a set of statistical tests. Its performance has also been examined using real microarrays and compared to different algorithmic approaches such as Bimax, CC, OPSM, Plaid and xMotifs. SBM shows several advantages, such as the ability to recognize more complex coherence patterns (shifting, scaling and inversion) and the capability to selectively marginalize genes and conditions depending on statistical significance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
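
    The sketch below illustrates a Spearman-correlation-based bicluster quality score in the spirit of SBM, but it is not the published formula: it simply averages the absolute Spearman correlation over all gene pairs restricted to the bicluster's conditions, so scaled and inverted patterns both score highly. The toy expression matrix is invented.

      # Hedged sketch: a Spearman-based bicluster quality score (illustrative, not SBM itself).
      import itertools
      import numpy as np
      from scipy.stats import spearmanr

      def spearman_bicluster_score(expr, genes, conditions):
          """expr: (n_genes, n_conditions) matrix; genes/conditions: index lists."""
          sub = expr[np.ix_(genes, conditions)]
          pair_scores = [abs(spearmanr(sub[i], sub[j])[0])     # |rho| handles inverted genes
                         for i, j in itertools.combinations(range(len(genes)), 2)]
          return float(np.mean(pair_scores))

      # Toy example: three genes following the same (scaled or inverted) pattern.
      expr = np.array([[1., 2., 3., 4.],
                       [2., 4., 6., 8.],      # scaled version of gene 0
                       [4., 3., 2., 1.],      # inverted version of gene 0
                       [5., 1., 4., 2.]])     # unrelated gene
      print(spearman_bicluster_score(expr, genes=[0, 1, 2], conditions=[0, 1, 2, 3]))  # -> 1.0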

  19. Defining Components of Team Leadership and Membership in Prehospital Emergency Medical Services.

    PubMed

    Crowe, Remle P; Wagoner, Robert L; Rodriguez, Severo A; Bentley, Melissa A; Page, David

    2017-01-01

    Teamwork is critical for patient and provider safety in high-stakes environments, including the setting of prehospital emergency medical services (EMS). We sought to describe the components of team leadership and team membership on a single patient call where multiple EMS providers are present. We conducted a two-day focus group with nine subject matter experts in crew resource management (CRM) and EMS using a structured nominal group technique (NGT). The specific question posed to the group was, "What are the specific components of team leadership and team membership on a single patient call where multiple EMS providers are present?" After round-robin submission of ideas and in-depth discussion of the meaning of each component, participants voted on the most important components of team leadership and team membership. Through the NGT process, we identified eight components of team leadership: a) creates an action plan; b) communicates; c) receives, processes, verifies, and prioritizes information; d) reconciles incongruent information; e) demonstrates confidence, compassion, maturity, command presence, and trustworthiness; f) takes charge; g) is accountable for team actions and outcomes; and h) assesses the situation and resources and modifies the plan. The eight essential components of team membership identified included: a) demonstrates followership, b) maintains situational awareness, c) demonstrates appreciative inquiry, d) does not freelance, e) is an active listener, f) accurately performs tasks in a timely manner, g) is safety conscious and advocates for safety at all times, and h) leaves ego and rank at the door. This study used a highly structured qualitative technique and subject matter experts to identify components of teamwork essential for prehospital EMS providers. These findings may be used to help inform the development of future EMS training and assessment initiatives.

  20. Three-dimensional histology: tools and application to quantitative assessment of cell-type distribution in rabbit heart

    PubMed Central

    Burton, Rebecca A.B.; Lee, Peter; Casero, Ramón; Garny, Alan; Siedlecka, Urszula; Schneider, Jürgen E.; Kohl, Peter; Grau, Vicente

    2014-01-01

    Aims Cardiac histo-anatomical organization is a major determinant of function. Changes in tissue structure are a relevant factor in normal and disease development, and form targets of therapeutic interventions. The purpose of this study was to test tools aimed to allow quantitative assessment of cell-type distribution from large histology and magnetic resonance imaging- (MRI) based datasets. Methods and results Rabbit heart fixation during cardioplegic arrest and MRI were followed by serial sectioning of the whole heart and light-microscopic imaging of trichrome-stained tissue. Segmentation techniques developed specifically for this project were applied to segment myocardial tissue in the MRI and histology datasets. In addition, histology slices were segmented into myocytes, connective tissue, and undefined. A bounding surface, containing the whole heart, was established for both MRI and histology. Volumes contained in the bounding surface (called ‘anatomical volume’), as well as that identified as containing any of the above tissue categories (called ‘morphological volume’), were calculated. The anatomical volume was 7.8 cm3 in MRI, and this reduced to 4.9 cm3 after histological processing, representing an ‘anatomical’ shrinkage by 37.2%. The morphological volume decreased by 48% between MRI and histology, highlighting the presence of additional tissue-level shrinkage (e.g. an increase in interstitial cleft space). The ratio of pixels classified as containing myocytes to pixels identified as non-myocytes was roughly 6:1 (61.6 vs. 9.8%; the remaining fraction of 28.6% was ‘undefined’). Conclusion Qualitative and quantitative differentiation between myocytes and connective tissue, using state-of-the-art high-resolution serial histology techniques, allows identification of cell-type distribution in whole-heart datasets. Comparison with MRI illustrates a pronounced reduction in anatomical and morphological volumes during histology processing. PMID:25362175

  1. Topics in Library Technology: Labeling Techniques *

    PubMed Central

    Truelson, Stanley D.

    1966-01-01

    Labels which do not fit on the spines of books should be placed on the upper rather than lower left corner of the front cover, because the upper corner becomes visible first when a volume is tilted from the shelf. None of the past methods of marking call numbers on the spines or covers of books—direct hand lettering by pen, brush, or stylus; affixing cold release characters; embossing by hot type; or gluing labels which are handlettered, typed, or printed—nor even present automatic data processing systems have offered all the advantages of the relatively new Se-Lin labeling system: legibility, reasonable speed of application, automatic protective covering, permanent bonding, and no need for a skilled letterer. Labels seem unaesthetic to some librarians, but their advantages outweigh this consideration. When only one or a few copies of the same call number are required, Se-Lin is the best system now available for libraries marking over 1,000 books a year. PMID:5901359

  2. T regulatory cells: an overview and intervention techniques to modulate allergy outcome

    PubMed Central

    Nandakumar, Subhadra; Miller, Christopher WT; Kumaraguru, Uday

    2009-01-01

    A dysregulated immune response results in inflammatory symptoms in the respiratory mucosa, leading to asthma and allergy in susceptible individuals. The T helper type 2 (Th2) subsets are primarily involved in this disease process. Nevertheless, there is growing evidence in support of T cells with regulatory potential that operate in non-allergic individuals. These regulatory T cells that occur naturally are called natural T regulatory cells (nTregs) and express the transcription factor Foxp3. They are selected in the thymus and move to the periphery. The CD4 Th cells in the periphery can be induced to become regulatory T cells and are hence called induced or adaptive T regulatory cells. These cells can make IL-10 or TGF-β or both, by which they attain most of their suppressive activity. This review gives an overview of the regulatory T cells, their role in allergic diseases, and explores possible interventionist approaches to manipulate Tregs for achieving therapeutic goals. PMID:19284628

  3. Constraint-based scheduling

    NASA Technical Reports Server (NTRS)

    Zweben, Monte

    1991-01-01

    The GERRY scheduling system developed by NASA Ames with assistance from the Lockheed Space Operations Company, and the Lockheed Artificial Intelligence Center, uses a method called constraint-based iterative repair. Using this technique, one encodes both hard rules and preference criteria into data structures called constraints. GERRY repeatedly attempts to improve schedules by seeking repairs for violated constraints. The system provides a general scheduling framework which is being tested on two NASA applications. The larger of the two is the Space Shuttle Ground Processing problem which entails the scheduling of all the inspection, repair, and maintenance tasks required to prepare the orbiter for flight. The other application involves power allocation for the NASA Ames wind tunnels. Here the system will be used to schedule wind tunnel tests with the goal of minimizing power costs. In this paper, we describe the GERRY system and its application to the Space Shuttle problem. We also speculate as to how the system would be used for manufacturing, transportation, and military problems.
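
    A toy Python sketch of constraint-based iterative repair is shown below. It is not the GERRY implementation: a single hard rule (tasks sharing one crew must not overlap) scores the schedule, and each violated constraint proposes a local repair that pushes one of the conflicting tasks later. Task names, durations and the repair move are invented for illustration.

      # Hedged sketch: constraint-based iterative repair on a tiny scheduling problem.
      import random

      tasks = {"inspect": 3, "repair": 2, "fuel": 1}             # task -> duration
      schedule = {"inspect": 0, "repair": 0, "fuel": 0}          # task -> start time

      def overlap_violations(sched):
          """Hard rule: tasks share one crew, so their intervals must not overlap."""
          names, bad = list(sched), []
          for i, a in enumerate(names):
              for b in names[i+1:]:
                  if sched[a] < sched[b] + tasks[b] and sched[b] < sched[a] + tasks[a]:
                      bad.append((a, b))
          return bad

      random.seed(0)
      for _ in range(100):                                       # iterative repair loop
          violations = overlap_violations(schedule)
          if not violations:
              break
          a, b = random.choice(violations)                       # pick a violated constraint
          mover = random.choice((a, b))                          # repair: push one task later
          other = b if mover == a else a
          schedule[mover] = schedule[other] + tasks[other]
      print(schedule)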

  4. Constraint-based scheduling

    NASA Technical Reports Server (NTRS)

    Zweben, Monte

    1991-01-01

    The GERRY scheduling system developed by NASA Ames with assistance from the Lockheed Space Operations Company, and the Lockheed Artificial Intelligence Center, uses a method called constraint based iterative repair. Using this technique, one encodes both hard rules and preference criteria into data structures called constraints. GERRY repeatedly attempts to improve schedules by seeking repairs for violated constraints. The system provides a general scheduling framework which is being tested on two NASA applications. The larger of the two is the Space Shuttle Ground Processing problem which entails the scheduling of all inspection, repair, and maintenance tasks required to prepare the orbiter for flight. The other application involves power allocations for the NASA Ames wind tunnels. Here the system will be used to schedule wind tunnel tests with the goal of minimizing power costs. In this paper, we describe the GERRY system and its applications to the Space Shuttle problem. We also speculate as to how the system would be used for manufacturing, transportation, and military problems.

  5. Constraint-based scheduling

    NASA Technical Reports Server (NTRS)

    Zweben, Monte

    1993-01-01

    The GERRY scheduling system developed by NASA Ames with assistance from the Lockheed Space Operations Company, and the Lockheed Artificial Intelligence Center, uses a method called constraint-based iterative repair. Using this technique, one encodes both hard rules and preference criteria into data structures called constraints. GERRY repeatedly attempts to improve schedules by seeking repairs for violated constraints. The system provides a general scheduling framework which is being tested on two NASA applications. The larger of the two is the Space Shuttle Ground Processing problem which entails the scheduling of all the inspection, repair, and maintenance tasks required to prepare the orbiter for flight. The other application involves power allocation for the NASA Ames wind tunnels. Here the system will be used to schedule wind tunnel tests with the goal of minimizing power costs. In this paper, we describe the GERRY system and its application to the Space Shuttle problem. We also speculate as to how the system would be used for manufacturing, transportation, and military problems.

  6. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    NASA Astrophysics Data System (ADS)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a CNC turning machine (Colchester Master Tornado T4) under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results of the regression model show that the I-kaz 3D coefficient value decreases as tool wear increases. The result is then used for real-time tool wear monitoring.
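
    The regression step described above can be illustrated with the following minimal sketch, which fits flank wear VB against a summary coefficient of the force signal. The numbers are synthetic and the actual I-kaz 3D coefficient computation is not reproduced; only the reported trend (coefficient decreasing as wear increases) is mimicked.

      # Hedged sketch: fitting a simple linear model between a force-signal coefficient and VB.
      import numpy as np

      coeff = np.array([0.95, 0.80, 0.66, 0.55, 0.41, 0.30])   # synthetic coefficient, decreasing with wear
      vb    = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])   # synthetic flank wear land (mm)

      slope, intercept = np.polyfit(coeff, vb, 1)               # linear model VB = a*coeff + b
      predict_vb = lambda c: slope*c + intercept
      print(f"VB = {slope:.3f} * coeff + {intercept:.3f}; VB at coeff=0.5: {predict_vb(0.5):.3f} mm")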

  7. Categorisation of full waveform data provided by laser scanning devices

    NASA Astrophysics Data System (ADS)

    Ullrich, Andreas; Pfennigbauer, Martin

    2011-11-01

    In 2004, a laser scanner device for commercial airborne laser scanning applications, the RIEGL LMS-Q560, was introduced to the market, making use of a radical alternative approach to the traditional analogue signal detection and processing schemes found in LIDAR instruments so far: digitizing the echo signals received by the instrument for every laser pulse and analysing these echo signals off-line in a so-called full waveform analysis in order to retrieve almost all information contained in the echo signal using transparent algorithms adaptable to specific applications. In the field of laser scanning the somewhat unspecific term "full waveform data" has since been established. We attempt a categorisation of the different types of the full waveform data found in the market. We discuss the challenges in echo digitization and waveform analysis from an instrument designer's point of view and we will address the benefits to be gained by using this technique, especially with respect to the so-called multi-target capability of pulsed time-of-flight LIDAR instruments.
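
    As a rough illustration of the kind of off-line full waveform analysis described above, the sketch below decomposes a digitized echo into Gaussian pulses so that two targets along one laser shot can be separated. The pulse model, the two-echo scene and the use of scipy's curve_fit are illustrative assumptions rather than the instrument's actual processing.

      # Hedged sketch: Gaussian-pulse decomposition of a synthetic digitized echo waveform.
      import numpy as np
      from scipy.optimize import curve_fit

      t = np.linspace(0.0, 100.0, 1000)                               # time in ns
      pulse = lambda t, a, mu, s: a*np.exp(-(t - mu)**2/(2*s**2))
      waveform = pulse(t, 1.0, 30.0, 2.5) + pulse(t, 0.4, 55.0, 2.5)  # e.g. canopy + ground echoes
      waveform += 0.02*np.random.default_rng(0).normal(size=t.size)

      two_pulses = lambda t, a1, m1, s1, a2, m2, s2: pulse(t, a1, m1, s1) + pulse(t, a2, m2, s2)
      p0 = [1.0, 25.0, 3.0, 0.5, 60.0, 3.0]                           # rough initial guesses
      popt, _ = curve_fit(two_pulses, t, waveform, p0=p0)
      ranges = 0.15*np.array([popt[1], popt[4]])                      # ns -> m (c/2 ~ 0.15 m/ns)
      print(ranges.round(2))                                          # estimated target ranges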

  8. A community long-term hotline therapeutic intervention model for coping with the threat and trauma of war and terror.

    PubMed

    Gelkopf, Marc; Haimov, Sigal; Lapid, Liron

    2015-02-01

    Long-term tele-counseling can potentially be a potent intervention mode in war- and terror-related community crisis situations. We aimed to examine a unique long-term telephone-administered intervention, targeting community trauma-related crisis situations by use of various techniques and approaches. 142 participants were evaluated using a non-intrusive by-proxy methodology appraising counselors' standard verbatim reports. Various background measures and elements in the intervention were quantitatively assessed, along with symptomatology and functioning at the onset and end of intervention. About 1/4 of the wide variety of clients called for someone else in addition to themselves, and most called due to a past event rather than a present crisis situation. The intervention successfully reduced posttraumatic stress symptoms and improved functioning. Most interventions included psychosocial education with additional elements, e.g., self-help tools, and almost 60% included also in-depth processes. In sum, tele-counseling might be a viable and effective intervention model for community-related traumatic stress.

  9. Non-song social call bouts of migrating humpback whales

    PubMed Central

    Rekdahl, Melinda L.; Dunlop, Rebecca A.; Goldizen, Anne W.; Garland, Ellen C.; Biassoni, Nicoletta; Miller, Patrick; Noad, Michael J.

    2015-01-01

    The use of stereotyped calls within structured bouts has been described for a number of species and may increase the information potential of call repertoires. Humpback whales produce a repertoire of social calls, although little is known about the complexity or function of these calls. In this study, digital acoustic tag recordings were used to investigate social call use within bouts, the use of bouts across different social contexts, and whether particular call type combinations were favored. Call order within bouts was investigated using call transition frequencies and information theory techniques. Call bouts were defined through analysis of inter-call intervals, as any calls within 3.9 s of each other. Bouts were produced significantly more when new whales joined a group compared to groups that did not change membership, and in groups containing multiple adults escorting a female and calf compared to adult only groups. Although social calls tended to be produced in bouts, there were few repeated bout types. However, the order in which most call types were produced within bouts was non-random and dependent on the preceding call type. These bouts appear to be at least partially governed by rules for how individual components are combined. PMID:26093396
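
    The bout criterion used in this study (calls within 3.9 s of one another belong to the same bout) can be expressed with a short sketch like the one below; the call times in the example are invented.

      # Hedged sketch: grouping call onset times into bouts using a maximum inter-call interval.
      def group_into_bouts(call_times, max_gap=3.9):
          """call_times: sorted call onset times (s); returns a list of index lists, one per bout."""
          if not call_times:
              return []
          bouts, current = [], [0]
          for i in range(1, len(call_times)):
              if call_times[i] - call_times[i-1] <= max_gap:
                  current.append(i)                 # same bout: gap within threshold
              else:
                  bouts.append(current)             # gap too long: start a new bout
                  current = [i]
          bouts.append(current)
          return bouts

      times = [0.0, 1.2, 4.5, 20.0, 21.0, 22.5, 60.0]
      print(group_into_bouts(times))                # -> [[0, 1, 2], [3, 4, 5], [6]]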

  10. Non-song social call bouts of migrating humpback whales.

    PubMed

    Rekdahl, Melinda L; Dunlop, Rebecca A; Goldizen, Anne W; Garland, Ellen C; Biassoni, Nicoletta; Miller, Patrick; Noad, Michael J

    2015-06-01

    The use of stereotyped calls within structured bouts has been described for a number of species and may increase the information potential of call repertoires. Humpback whales produce a repertoire of social calls, although little is known about the complexity or function of these calls. In this study, digital acoustic tag recordings were used to investigate social call use within bouts, the use of bouts across different social contexts, and whether particular call type combinations were favored. Call order within bouts was investigated using call transition frequencies and information theory techniques. Call bouts were defined through analysis of inter-call intervals, as any calls within 3.9 s of each other. Bouts were produced significantly more when new whales joined a group compared to groups that did not change membership, and in groups containing multiple adults escorting a female and calf compared to adult only groups. Although social calls tended to be produced in bouts, there were few repeated bout types. However, the order in which most call types were produced within bouts was non-random and dependent on the preceding call type. These bouts appear to be at least partially governed by rules for how individual components are combined.

  11. Fluence map optimization (FMO) with dose-volume constraints in IMRT using the geometric distance sorting method.

    PubMed

    Lan, Yihua; Li, Cunhua; Ren, Haozheng; Zhang, Yong; Min, Zhifang

    2012-10-21

    A new heuristic algorithm based on the so-called geometric distance sorting technique is proposed for solving the fluence map optimization with dose-volume constraints, which is one of the most essential tasks for inverse planning in IMRT. The framework of the proposed method is basically an iterative process which begins with a simple linearly constrained quadratic optimization model without considering any dose-volume constraints; the dose constraints for the voxels violating the dose-volume constraints are then gradually added into the quadratic optimization model step by step until all the dose-volume constraints are satisfied. In each iteration step, an interior point method is adopted to solve each new linearly constrained quadratic program. For choosing the proper candidate voxels for the current round of constraint adding, a so-called geometric distance defined in the transformed standard quadratic form of the fluence map optimization model is used to guide the selection of the voxels. The new geometric distance sorting technique can largely reduce the unexpected increase in the objective function value that is inevitably caused by constraint adding, and it can be regarded as an upgrade of the traditional dose sorting technique. The geometric explanation for the proposed method is also given, and a proposition is proved to support our heuristic idea. In addition, a smart constraint adding/deleting strategy is designed to ensure stable iteration convergence. The new algorithm is tested on four cases (head-and-neck, prostate, lung and oropharyngeal) and compared with the algorithm based on the traditional dose sorting technique. Experimental results showed that the proposed method is more suitable for guiding the selection of new constraints than the traditional dose sorting method, especially for cases whose target regions have non-convex shapes, and is to some extent a more efficient technique for choosing constraints. By integrating a smart constraint adding/deleting scheme within the iteration framework, the new technique builds up an improved algorithm for solving the fluence map optimization with dose-volume constraints.
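    The iterative framework described above can be sketched on a toy problem. The Python fragment below (an illustration under synthetic data, not the authors' implementation) solves a small constrained quadratic model, finds organ-at-risk voxels violating a dose-volume limit, and adds hard constraints for the worst offenders until the dose-volume constraint is met; for simplicity the offenders are ranked by dose excess, i.e. the traditional dose sorting that the paper's geometric distance criterion is designed to replace, and SciPy's SLSQP solver stands in for the interior point method.

```python
# Sketch of the iterative constraint-adding framework on a toy fluence-map
# problem.  All matrices and limits below are synthetic illustration data,
# not clinical values.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_beamlets = 8
D_tgt = rng.uniform(0.5, 1.0, (20, n_beamlets))   # target dose matrix
D_oar = rng.uniform(0.0, 0.5, (30, n_beamlets))   # organ-at-risk dose matrix
d_presc, d_limit, max_frac = 10.0, 2.0, 0.3       # prescription / DV constraint

def objective(x):
    return np.sum((D_tgt @ x - d_presc) ** 2)

active = []                       # indices of OAR voxels with hard constraints
x = np.zeros(n_beamlets)
for _ in range(20):
    cons = []
    if active:
        D_act = D_oar[active]
        cons = [{"type": "ineq", "fun": lambda x, D=D_act: d_limit - D @ x}]
    res = minimize(objective, x, bounds=[(0, None)] * n_beamlets,
                   constraints=cons, method="SLSQP")
    x = res.x
    oar_dose = D_oar @ x
    violators = np.where(oar_dose > d_limit)[0]
    if len(violators) <= max_frac * len(oar_dose):
        break                     # dose-volume constraint satisfied
    # add constraints for the worst offenders not yet constrained
    new = [v for v in np.argsort(-(oar_dose - d_limit))
           if oar_dose[v] > d_limit and v not in active][:3]
    active.extend(new)

print("objective:", objective(x))
print("fraction of OAR voxels above limit:", np.mean(D_oar @ x > d_limit))
```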

  12. Inverse Function: Pre-Service Teachers' Techniques and Meanings

    ERIC Educational Resources Information Center

    Paoletti, Teo; Stevens, Irma E.; Hobson, Natalie L. F.; Moore, Kevin C.; LaForest, Kevin R.

    2018-01-01

    Researchers have argued teachers and students are not developing connected meanings for function inverse, thus calling for a closer examination of teachers' and students' inverse function meanings. Responding to this call, we characterize 25 pre-service teachers' inverse function meanings as inferred from our analysis of clinical interviews. After…

  13. Perspectives: A Challenging Patriotism

    ERIC Educational Resources Information Center

    Boyte, Harry C.

    2012-01-01

    In a time of alarm about the poisoning of electoral politics, public passions inflamed by sophisticated techniques of mass polarization, and fears that the country is losing control of its collective future, higher education is called upon to take leadership in "reinventing citizenship." It needs to respond to that call on a scale unprecedented in…

  14. Teaching Free Expression in Word and Example (Commentary).

    ERIC Educational Resources Information Center

    Merrill, John

    1991-01-01

    Suggests that the teaching of free expression may be the highest calling of a communications or journalism professor. Argues that freedom must be tempered by a sense of ethics. Calls upon teachers to encourage students to analyze the questions surrounding free expression. Describes techniques for scrutinizing journalistic myths. (SG)

  15. Strategic planning for the International Space Station

    NASA Technical Reports Server (NTRS)

    Griner, Carolyn S.

    1990-01-01

    The concept for utilization and operations planning for the International Space Station Freedom was developed in a NASA Space Station Operations Task Force in 1986. Since that time the concept has been further refined to definitize the process and products required to integrate the needs of the international user community with the operational capabilities of the Station in its evolving configuration. The keystone to the process is the development of individual plans by the partners, with the parameters and formats common to the degree that electronic communications techniques can be effectively utilized, while maintaining the proper level and location of configuration control. The integration, evaluation, and verification of the integrated plan, called the Consolidated Operations and Utilization Plan (COUP), is being tested in a multilateral environment to prove out the parameters, interfaces, and process details necessary to produce the first COUP for Space Station in 1991. This paper will describe the concept, process, and the status of the multilateral test case.

  16. Optical coherence tomography imaging based on non-harmonic analysis

    NASA Astrophysics Data System (ADS)

    Cao, Xu; Hirobayashi, Shigeki; Chong, Changho; Morosawa, Atsushi; Totsuka, Koki; Suzuki, Takuya

    2009-11-01

    A new processing technique called Non-Harmonic Analysis (NHA) is proposed for OCT imaging. Conventional Fourier-domain OCT relies on the FFT, whose result depends on the window function and frame length. Axial resolution is inversely proportional to the FFT frame length, which is limited by the swept range of the source in SS-OCT or by the pixel count of the CCD in SD-OCT. The NHA process is intrinsically free from this trade-off: it can resolve high frequencies without being influenced by the window function or the frame length of the sampled data. In this study, the NHA process is explained, applied to OCT imaging, and compared with FFT-based OCT images. To validate the benefit of NHA in OCT, we carried out NHA-based OCT imaging on three different samples: onion skin, human skin and pig eye. The results show that NHA can achieve a practical image resolution equivalent to that of a 100 nm swept range while using a wavelength range reduced by more than half.
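    The core idea, that frequency estimation need not be tied to the FFT's 1/N bin grid, can be illustrated with a short Python sketch (a generic least-squares sinusoid fit on synthetic data, not the NHA algorithm itself): a fine grid of candidate frequencies is scanned and the one with the smallest residual is kept, recovering a frequency that falls between FFT bins.

```python
# Illustration of the general idea behind non-harmonic analysis:
# fit amplitude and phase of a sinusoid on a frequency grid much finer
# than the FFT's multiples of 1/N and keep the best-fitting frequency.
import numpy as np

N = 64
n = np.arange(N)
f_true = 10.37 / N                      # deliberately between FFT bins
signal = np.cos(2 * np.pi * f_true * n + 0.4)

# FFT can only report the nearest bin, 10/N
fft_peak = np.argmax(np.abs(np.fft.rfft(signal))) / N

# Least-squares fit over a fine frequency grid
best_f, best_res = None, np.inf
for f in np.linspace(1 / N, 0.5, 5000):
    A = np.column_stack([np.cos(2 * np.pi * f * n), np.sin(2 * np.pi * f * n)])
    coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
    res = np.sum((signal - A @ coef) ** 2)
    if res < best_res:
        best_f, best_res = f, res

print(f"true f = {f_true:.5f}, FFT bin = {fft_peak:.5f}, fitted f = {best_f:.5f}")
```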

  17. Managing meat tenderness.

    PubMed

    Thompson, John

    2002-11-01

    This paper discusses the management of meat tenderness using a carcass grading scheme which utilizes the concept of total quality management of those factors which impact on beef palatability. The scheme called Meat Standards Australia (MSA) has identified the Critical Control Points (CCPs) from the production, pre-slaughter, processing and value adding sectors of the beef supply chain and quantified their relative importance using large-scale consumer testing. These CCPs have been used to manage beef palatability in two ways. Firstly, CCPs from the pre-slaughter and processing sectors have been used as mandatory criteria for carcasses to be graded. Secondly, other CCPs from the production and processing sectors have been incorporated into a model to predict palatability for individual muscles. The evidence for the importance of CCPs from the production (breed, growth path and HGP implants), pre-slaughter and processing (pH/temperature window, alternative carcass suspension, marbling and ageing) sectors are reviewed and the accuracy of the model to predict palatability for specific muscle×cooking techniques is presented.

  18. Rain volume estimation over areas using satellite and radar data

    NASA Technical Reports Server (NTRS)

    Doneaud, A. A.; Vonderhaar, T. H.

    1985-01-01

    An investigation of the feasibility of rain volume estimation using satellite data, following a technique recently developed with radar data called the Area Time Integral, was undertaken. Case studies were selected on the basis of existing radar and satellite data sets which match in space and time. Four multicell clusters were analyzed. Routines for navigation, remapping and smoothing of satellite images were performed. Visible counts were normalized for solar zenith angle. A radar sector of interest was defined to delineate specific radar echo clusters for each radar time throughout the radar echo cluster lifetime. A satellite sector of interest was defined by applying small adjustments to the radar sector using a manual processing technique. The radar echo area, the IR maximum counts and the IR counts matching radar echo areas were found to evolve similarly, except for the decaying phase of the cluster, where the cirrus debris keeps the IR counts high.

  19. Fundamental limits of reconstruction-based superresolution algorithms under local translation.

    PubMed

    Lin, Zhouchen; Shum, Heung-Yeung

    2004-01-01

    Superresolution is a technique that can produce images of a higher resolution than that of the originally captured ones. Nevertheless, improvement in resolution using such a technique is very limited in practice. This makes it significant to study the problem: "Do fundamental limits exist for superresolution?" In this paper, we focus on a major class of superresolution algorithms, called the reconstruction-based algorithms, which compute high-resolution images by simulating the image formation process. Assuming local translation among low-resolution images, this paper is the first attempt to determine the explicit limits of reconstruction-based algorithms, under both real and synthetic conditions. Based on the perturbation theory of linear systems, we obtain the superresolution limits from the conditioning analysis of the coefficient matrix. Moreover, we determine the number of low-resolution images that are sufficient to achieve the limit. Both real and synthetic experiments are carried out to verify our analysis.

  20. Enhanced Labeling Techniques to Study the Cytoskeleton During Root Growth and Gravitropism

    NASA Technical Reports Server (NTRS)

    Blancaflor, Elison B.

    2005-01-01

    Gravity affects the growth and development of all living organisms. One of the most obvious manifestations of gravity's effects on biological systems lies in the ability of plants to direct their growth along a path that is dictated by the gravity vector (called gravitropism). When positioned horizontally, inflorescence stems and hypocotyls in dicots, and pulvini in monocots, respond by bending upward, whereas roots typically bend downward. Gravitropism allows plants to readjust their growth to maximize light absorption for photosynthesis and to more efficiently acquire water and nutrients from the soil. Despite its significance for plant survival, there are still major gaps in understanding the cellular and molecular processes by which plants respond to gravity. The major aim of this proposal was to develop improved fluorescence labeling techniques to aid in understanding how the cytoskeleton modulates plant responses to gravity.

  1. MONTE CARLO SIMULATION OF THE BREMSSTRAHLUNG RADIATION FOR THE MEASUREMENT OF AN INTERNAL CONTAMINATION WITH PURE-BETA EMITTERS IN VIVO.

    PubMed

    Fantínová, K; Fojtík, P; Malátová, I

    2016-09-01

    Rapid measurement techniques are required for large-scale emergency monitoring of people. In vivo measurement of the bremsstrahlung radiation produced by incorporated pure-beta emitters can offer a rapid technique for the determination of such radionuclides in the human body. This work presents a method for the calibration of spectrometers, based on the use of the UPh-02T (so-called IGOR) phantom and specific (90)Sr/(90)Y sources, which can account for recent as well as previous contaminations. The process of whole- and partial-body counter calibration, in combination with the application of a Monte Carlo code, is readily extendable to other pure-beta emitters and various exposure scenarios.

  2. Application of gamma imaging techniques for the characterisation of position sensitive gamma detectors

    NASA Astrophysics Data System (ADS)

    Habermann, T.; Didierjean, F.; Duchêne, G.; Filliger, M.; Gerl, J.; Kojouharov, I.; Li, G.; Pietralla, N.; Schaffner, H.; Sigward, M.-H.

    2017-11-01

    A device to characterize position-sensitive germanium detectors has been implemented at GSI. The main component of this so-called scanning table is a gamma camera that is capable of producing online 2D images of the scanned detector by means of a PET technique. To calibrate the gamma camera, Compton imaging is employed. The 2D data can be processed further offline to obtain depth information. Of main interest is the response of the scanned detector in terms of the digitized pulse shapes from the preamplifier. This is an important input for pulse-shape analysis algorithms as they are in use for gamma tracking arrays in gamma spectroscopy. To validate the scanning table, a comparison of its results with those of a second scanning table implemented at the IPHC Strasbourg is envisaged. For this purpose a pixelated germanium detector has been scanned.

  3. Parallel computation in a three-dimensional elastic-plastic finite-element analysis

    NASA Technical Reports Server (NTRS)

    Shivakumar, K. N.; Bigelow, C. A.; Newman, J. C., Jr.

    1992-01-01

    A CRAY parallel processing technique called autotasking was implemented in a three-dimensional elasto-plastic finite-element code. The technique was evaluated on two CRAY supercomputers, a CRAY 2 and a CRAY Y-MP. Autotasking was implemented in all major portions of the code, except the matrix equations solver. Compiler directives alone were not able to properly multitask the code; user-inserted directives were required to achieve better performance. It was noted that the connect time, rather than wall-clock time, was more appropriate to determine speedup in multiuser environments. For a typical example problem, a speedup of 2.1 (1.8 when the solution time was included) was achieved in a dedicated environment and 1.7 (1.6 with solution time) in a multiuser environment on a four-processor CRAY 2 supercomputer. The speedup on a three-processor CRAY Y-MP was about 2.4 (2.0 with solution time) in a multiuser environment.

  4. Multiresolutional schemata for unsupervised learning of autonomous robots for 3D space operation

    NASA Technical Reports Server (NTRS)

    Lacaze, Alberto; Meystel, Michael; Meystel, Alex

    1994-01-01

    This paper describes a novel approach to the development of a learning control system for an autonomous space robot (ASR) which presents the ASR as a 'baby' -- that is, a system with no a priori knowledge of the world in which it operates, but with behavior acquisition techniques that allow it to build this knowledge from the experiences of actions within a particular environment (we will call it an Astro-baby). The learning techniques are rooted in the recursive algorithm for inductive generation of nested schemata molded from processes of early cognitive development in humans. The algorithm extracts data from the environment and, by means of correlation and abduction, creates schemata that are used for control. This system is robust enough to deal with a constantly changing environment because such changes provoke the creation of new schemata by generalizing from experiences, while still maintaining minimal computational complexity, thanks to the system's multiresolutional nature.

  5. Computer-aided boundary delineation of agricultural lands

    NASA Technical Reports Server (NTRS)

    Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt

    1989-01-01

    The National Agricultural Statistics Service of the United States Department of Agriculture (USDA) presently uses labor-intensive aerial photographic interpretation techniques to divide large geographical areas into manageable-sized units for estimating domestic crop and livestock production. Prototype software, the computer-aided stratification (CAS) system, was developed to automate the procedure, and currently runs on a Sun-based image processing system. With a background display of LANDSAT Thematic Mapper and United States Geological Survey Digital Line Graph data, the operator uses a cursor to delineate agricultural areas, called sampling units, which are assigned to strata of land-use and land-cover types. The resultant stratified sampling units are used as input into subsequent USDA sampling procedures. As a test, three counties in Missouri were chosen for application of the CAS procedures. Subsequent analysis indicates that CAS was five times faster in creating sampling units than the manual techniques were.

  6. The MOF+ Technique: A Significant Synergic Effect Enables High Performance Chromate Removal.

    PubMed

    Luo, Ming Biao; Xiong, Yang Yang; Wu, Hui Qiong; Feng, Xue Feng; Li, Jian Qiang; Luo, Feng

    2017-12-18

    A significant synergic effect between a metal-organic framework (MOF) and Fe 2 SO 4, the so-called MOF+ technique, is exploited for the first time to remove toxic chromate from aqueous solutions. The results show that, relative to the pristine MOF samples (no detectable chromate removal), the MOF+ method enables superior performance, giving an adsorption capacity of 796 mg Cr g−1. This value is almost eight-fold higher than the best value of established MOF adsorbents, and the highest value of all reported porous adsorbents for such use. The adsorption mechanism, unlike the anion-exchange process that dominates chromate removal in all other MOF adsorbents, as unveiled by X-ray photoelectron spectroscopy (XPS), scanning electron microscopy (SEM), and transmission electron microscopy (TEM), is due to the surface formation of Fe0.75Cr0.25(OH)3 nanospheres on the MOF samples.

  7. Underwater Photo-Elicitation: A New Experiential Marine Education Technique

    ERIC Educational Resources Information Center

    Andrews, Steve; Stocker, Laura; Oechel, Walter

    2018-01-01

    Underwater photo-elicitation is a novel experiential marine education technique that combines direct experience in the marine environment with the use of digital underwater cameras. A program called Show Us Your Ocean! (SUYO!) was created, utilising a mixed methodology (qualitative and quantitative methods) to test the efficacy of this technique.…

  8. Q-Technique and Graphics Research.

    ERIC Educational Resources Information Center

    Kahle, Roger R.

    Because Q-technique is as appropriate for use with visual and design items as for use with words, it is not stymied by the topics one is likely to encounter in graphics research. In particular Q-technique is suitable for studying the so-called "congeniality" of typography, for various copytesting usages, and for multivariate graphics research. The…

  9. Writing with Basals: A Sentence Combining Approach to Comprehension.

    ERIC Educational Resources Information Center

    Reutzel, D. Ray; Merrill, Jimmie D.

    Sentence combining techniques can be used with basal readers to help students develop writing skills. The first technique is addition, characterized by using the connecting word "and" to join two or more base sentences together. The second technique is called "embedding," and is characterized by putting parts of two or more base sentences together…

  10. Design and fabrication of a glovebox for the Plasma Hearth Process radioactive bench-scale system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahlquist, D.R.

    This paper presents some of the design considerations and fabrication techniques for building a glovebox for the Plasma Hearth Process (PHP) radioactive bench-scale system. The PHP radioactive bench-scale system uses a plasma torch to process a variety of radioactive materials into a final vitrified waste form. The processed waste will contain plutonium and trace amounts of other radioactive materials. The glovebox used in this system is located directly below the plasma chamber and is called the Hearth Handling Enclosure (HHE). The HHE is designed to maintain a confinement boundary between the processed waste and the operator. Operations that take place inside the HHE include raising and lowering the hearth using a hydraulic lift table, transporting the hearth within the HHE using an overhead monorail and hoist system, sampling and disassembly of the processed waste and hearth, weighing the hearth, rebuilding a hearth, and sampling HEPA filters. The PHP radioactive bench-scale system is located at the TREAT facility at Argonne National Laboratory-West in Idaho Falls, Idaho.

  11. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    NASA Astrophysics Data System (ADS)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL in order to verify interoperability requirements for the given collaborative process model. At first, this step entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise interoperability requirements into properties with the dedicated UPPAAL language, i.e. the temporal logic TCTL.

  12. A reliable control system for measurement on film thickness in copper chemical mechanical planarization system

    NASA Astrophysics Data System (ADS)

    Li, Hongkai; Qu, Zilian; Zhao, Qian; Tian, Fangxin; Zhao, Dewen; Meng, Yonggang; Lu, Xinchun

    2013-12-01

    In recent years, a variety of film thickness measurement techniques for copper chemical mechanical planarization (CMP) are subsequently proposed. In this paper, the eddy-current technique is used. In the control system of the CMP tool developed in the State Key Laboratory of Tribology, there are in situ module and off-line module for measurement subsystem. The in situ module can get the thickness of copper film on wafer surface in real time, and accurately judge when the CMP process should stop. This is called end-point detection. The off-line module is used for multi-points measurement after CMP process, in order to know the thickness of remained copper film. The whole control system is structured with two levels, and the physical connection between the upper and the lower is achieved by the industrial Ethernet. The process flow includes calibration and measurement, and there are different algorithms for two modules. In the process of software development, C++ is chosen as the programming language, in combination with Qt OpenSource to design two modules' GUI and OPC technology to implement the communication between the two levels. In addition, the drawing function is developed relying on Matlab, enriching the software functions of the off-line module. The result shows that the control system is running stably after repeated tests and practical operations for a long time.

  13. A reliable control system for measurement on film thickness in copper chemical mechanical planarization system.

    PubMed

    Li, Hongkai; Qu, Zilian; Zhao, Qian; Tian, Fangxin; Zhao, Dewen; Meng, Yonggang; Lu, Xinchun

    2013-12-01

    In recent years, a variety of film thickness measurement techniques for copper chemical mechanical planarization (CMP) are subsequently proposed. In this paper, the eddy-current technique is used. In the control system of the CMP tool developed in the State Key Laboratory of Tribology, there are in situ module and off-line module for measurement subsystem. The in situ module can get the thickness of copper film on wafer surface in real time, and accurately judge when the CMP process should stop. This is called end-point detection. The off-line module is used for multi-points measurement after CMP process, in order to know the thickness of remained copper film. The whole control system is structured with two levels, and the physical connection between the upper and the lower is achieved by the industrial Ethernet. The process flow includes calibration and measurement, and there are different algorithms for two modules. In the process of software development, C++ is chosen as the programming language, in combination with Qt OpenSource to design two modules' GUI and OPC technology to implement the communication between the two levels. In addition, the drawing function is developed relying on Matlab, enriching the software functions of the off-line module. The result shows that the control system is running stably after repeated tests and practical operations for a long time.
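    The end-point detection idea described in this record reduces to a simple control loop: convert each eddy-current reading to a copper thickness and stop the polish when the target is reached. The Python sketch below illustrates that loop with a simulated sensor stream; the target value, polling rate and removal rate are hypothetical.

```python
# Minimal sketch of CMP end-point detection: poll a (simulated) calibrated
# eddy-current thickness reading and signal the tool to stop when the
# remaining copper reaches the target.  All numbers are illustrative.
import itertools
import time

TARGET_THICKNESS_NM = 200.0

def simulated_readings(start=1000.0, removal_per_poll=35.0):
    """Stand-in for the calibrated eddy-current measurement stream."""
    for k in itertools.count():
        yield start - k * removal_per_poll

def end_point_detection(readings, poll_interval_s=0.01):
    for t in readings:
        print(f"copper thickness: {t:7.1f} nm")
        if t <= TARGET_THICKNESS_NM:
            return "STOP_CMP"   # signal the tool that the end point is reached
        time.sleep(poll_interval_s)

if __name__ == "__main__":
    print(end_point_detection(simulated_readings()))
```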

  14. Local residue coupling strategies by neural network for InSAR phase unwrapping

    NASA Astrophysics Data System (ADS)

    Refice, Alberto; Satalino, Giuseppe; Chiaradia, Maria T.

    1997-12-01

    Phase unwrapping is one of the toughest problems in interferometric SAR processing. The main difficulties arise from the presence of point-like error sources, called residues, which occur mainly in close couples due to phase noise. We present an assessment of a local approach to the resolution of these problems by means of a neural network. Using a multi-layer perceptron trained with the back-propagation scheme on a series of simulated phase images, we derive the best pairing strategies for close residue couples. Results show that good efficiencies and accuracies can be obtained, provided a sufficient number of training examples is supplied. The technique is also tested on real SAR ERS-1/2 tandem interferometric images of the Matera test site, showing a good reduction of the residue density. The better results obtained by the neural network when local criteria are adopted appear justified given the probabilistic nature of the noise process on SAR interferometric phase fields, and they allow us to outline a specifically tailored implementation of the neural network approach as a very fast pre-processing step intended to decrease the residue density and give sufficiently clean images to be processed further by more conventional techniques.

  15. A reliable control system for measurement on film thickness in copper chemical mechanical planarization system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Hongkai; Qu, Zilian; Zhao, Qian

    In recent years, a variety of film thickness measurement techniques for copper chemical mechanical planarization (CMP) are subsequently proposed. In this paper, the eddy-current technique is used. In the control system of the CMP tool developed in the State Key Laboratory of Tribology, there are in situ module and off-line module for measurement subsystem. The in situ module can get the thickness of copper film on wafer surface in real time, and accurately judge when the CMP process should stop. This is called end-point detection. The off-line module is used for multi-points measurement after CMP process, in order to know the thickness of remained copper film. The whole control system is structured with two levels, and the physical connection between the upper and the lower is achieved by the industrial Ethernet. The process flow includes calibration and measurement, and there are different algorithms for two modules. In the process of software development, C++ is chosen as the programming language, in combination with Qt OpenSource to design two modules' GUI and OPC technology to implement the communication between the two levels. In addition, the drawing function is developed relying on Matlab, enriching the software functions of the off-line module. The result shows that the control system is running stably after repeated tests and practical operations for a long time.

  16. New GMO regulations for old: Determining a new future for EU crop biotechnology.

    PubMed

    Davison, John; Ammann, Klaus

    2017-01-02

    In this review, current EU GMO regulations are subjected to a point-by-point analysis to determine their suitability for agriculture in modern Europe. Our analysis concerns present GMO regulations as well as suggestions for possible new regulations for genome editing and New Breeding Techniques (for which no regulations presently exist). Firstly, the present GMO regulations stem from the early days of recombinant DNA and are not adapted to current scientific understanding of this subject. Scientific understanding of GMOs has changed, and these regulations are now not only unfit for their original purpose, but the purpose itself is no longer scientifically valid. Indeed, they defy scientific, economic, and even common, sense. A major EU regulatory preconception is that GM crops are basically different from their parent crops. Thus, the EU regulations are "process based" regulations that discriminate against GMOs simply because they are GMOs. However, current scientific evidence shows a blending of classical crops and their GMO counterparts with no clear demarcation line between them. Canada has a "product based" approach and determines the safety of each new crop variety independently of the process used to obtain it. We advise that the EC re-write its outdated regulations and move toward such a product based approach. Secondly, over the last few years new genome editing techniques (sometimes called New Breeding Techniques) have evolved. These techniques are basically mutagenesis techniques that can generate genomic diversity and have vast potential for crop improvement. They are not GMO based techniques (any more than mutagenesis is a GMO technique), since in many cases no new DNA is introduced. Thus they cannot simply be lumped together with GMOs (as many anti-GMO NGOs would prefer). The EU currently has no regulations to cover these new techniques. In this review, we make suggestions as to how these new gene-edited crops may be regulated. The EU is at a turning point where the wrong decision could destroy European agricultural competitiveness for decades to come.

  17. DNA Cryptography and Deep Learning using Genetic Algorithm with NW algorithm for Key Generation.

    PubMed

    Kalsi, Shruti; Kaur, Harleen; Chang, Victor

    2017-12-05

    Cryptography is not only the science of applying complex mathematics and logic to design strong methods for hiding data, called encryption, but also of retrieving the original data back, called decryption. The purpose of cryptography is to transmit a message between a sender and receiver such that an eavesdropper is unable to comprehend it. To accomplish this, we need not only a strong algorithm, but also a strong key and a strong concept for the encryption and decryption process. We have introduced the concept of DNA Deep Learning Cryptography, which is defined as a technique of concealing data in terms of DNA sequences and deep learning. In the cryptographic technique, each letter of the alphabet is converted into a different combination of the four bases, namely Adenine (A), Cytosine (C), Guanine (G) and Thymine (T), which make up human deoxyribonucleic acid (DNA). Actual implementations with DNA do not go beyond the laboratory level and are expensive. To bring DNA computing to a digital level, easy and effective algorithms are proposed in this paper. In the proposed work we introduce, firstly, a method and its implementation for key generation based on the theory of natural selection, using a Genetic Algorithm with the Needleman-Wunsch (NW) algorithm, and, secondly, a method for the implementation of encryption and decryption based on DNA computing, using the biological operations transcription, translation, DNA sequencing and deep learning.
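    Only the letter-to-base encoding step lends itself to a compact illustration. The Python sketch below assumes a simple 2-bits-per-base mapping (an assumption for illustration; the paper's key generation with a genetic algorithm, the Needleman-Wunsch step and the deep-learning stage are not shown).

```python
# Illustration of the letter-to-DNA encoding step only (not the paper's
# full scheme): each byte of the message is mapped to four DNA bases
# using an assumed 2-bits-per-base table.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {b: k for k, b in BITS_TO_BASE.items()}

def text_to_dna(text: str) -> str:
    bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_text(dna: str) -> str:
    bits = "".join(BASE_TO_BITS[b] for b in dna)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

if __name__ == "__main__":
    seq = text_to_dna("key")
    print(seq)                    # CGGTCGCCCTGC
    print(dna_to_text(seq))       # "key"
```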

  18. Novel MRF fluid for ultra-low roughness optical surfaces

    NASA Astrophysics Data System (ADS)

    Dumas, Paul; McFee, Charles

    2014-08-01

    Over the past few years there have been an increasing number of applications calling for ultra-low roughness (ULR) surfaces. A critical demand has been driven by EUV optics, EUV photomasks, X-Ray, and high energy laser applications. Achieving ULR results on complex shapes like aspheres and X-Ray mirrors is extremely challenging with conventional polishing techniques. To achieve both tight figure and roughness specifications, substrates typically undergo iterative global and local polishing processes. Typically the local polishing process corrects the figure or flatness but cannot achieve the required surface roughness, whereas the global polishing process produces the required roughness but degrades the figure. Magnetorheological Finishing (MRF) is a local polishing technique based on a magnetically-sensitive fluid that removes material through a shearing mechanism with minimal normal load, thus removing sub-surface damage. The lowest surface roughness produced by current MRF is close to 3 Å RMS. A new ULR MR fluid uses a nano-based cerium as the abrasive in a proprietary aqueous solution, the combination of which reliably produces under 1.5Å RMS roughness on Fused Silica as measured by atomic force microscopy. In addition to the highly convergent figure correction achieved with MRF, we show results of our novel MR fluid achieving <1.5Å RMS roughness on fused silica and other materials.

  19. Optimizing Instruction Scheduling and Register Allocation for Register-File-Connected Clustered VLIW Architectures

    PubMed Central

    Tang, Haijing; Wang, Siye; Zhang, Yanjun

    2013-01-01

    Clustering has become a common trend in very long instruction words (VLIW) architecture to solve the problem of area, energy consumption, and design complexity. Register-file-connected clustered (RFCC) VLIW architecture uses the mechanism of global register file to accomplish the inter-cluster data communications, thus eliminating the performance and energy consumption penalty caused by explicit inter-cluster data move operations in traditional bus-connected clustered (BCC) VLIW architecture. However, the limit number of access ports to the global register file has become an issue which must be well addressed; otherwise the performance and energy consumption would be harmed. In this paper, we presented compiler optimization techniques for an RFCC VLIW architecture called Lily, which is designed for encryption systems. These techniques aim at optimizing performance and energy consumption for Lily architecture, through appropriate manipulation of the code generation process to maintain a better management of the accesses to the global register file. All the techniques have been implemented and evaluated. The result shows that our techniques can significantly reduce the penalty of performance and energy consumption due to access port limitation of global register file. PMID:23970841

  20. NeuroPhysics: Studying how neurons create the perception of space-time using Physics' tools and techniques

    NASA Astrophysics Data System (ADS)

    Dhingra, Shonali; Sandler, Roman; Rios, Rodrigo; Vuong, Cliff; Mehta, Mayank

    All animals naturally perceive the abstract concept of space-time. A brain region called the Hippocampus is known to be important in creating these perceptions, but the underlying mechanisms are unknown. In our lab we employ several experimental and computational techniques from Physics to tackle this fundamental puzzle. Experimentally, we use ideas from Nanoscience and Materials Science to develop techniques to measure the activity of hippocampal neurons, in freely-behaving animals. Computationally, we develop models to study neuronal activity patterns, which are point processes that are highly stochastic and multidimensional. We then apply these techniques to collect and analyze neuronal signals from rodents while they're exploring space in Real World or Virtual Reality with various stimuli. Our findings show that under these conditions neuronal activity depends on various parameters, such as sensory cues including visual and auditory, and behavioral cues including, linear and angular, position and velocity. Further, neuronal networks create internally-generated rhythms, which influence perception of space and time. In totality, these results further our understanding of how the brain develops a cognitive map of our surrounding space, and keep track of time.

  1. Development of Super-Ensemble techniques for ocean analyses: the Mediterranean Sea case

    NASA Astrophysics Data System (ADS)

    Pistoia, Jenny; Pinardi, Nadia; Oddo, Paolo; Collins, Matthew; Korres, Gerasimos; Drillet, Yann

    2017-04-01

    Short-term ocean analyses for sea surface temperature (SST) in the Mediterranean Sea can be improved by a statistical post-processing technique called super-ensemble. This technique consists of a multi-linear regression algorithm applied to a Multi-Physics Multi-Model Super-Ensemble (MMSE) dataset, a collection of different operational forecasting analyses together with ad hoc simulations produced by modifying selected numerical model parameterizations. A new linear regression algorithm based on Empirical Orthogonal Function filtering techniques is capable of preventing overfitting problems, although the best performances are achieved when we add correlation to the super-ensemble structure using a simple spatial filter applied after the linear regression. Our outcomes show that super-ensemble performance depends on the selection of an unbiased operator and on the length of the learning period, but the quality of the generating MMSE dataset has the largest impact on the MMSE analysis Root Mean Square Error (RMSE) evaluated with respect to observed satellite SST. Lower RMSE analysis estimates result from the following choices: a 15-day training period, an overconfident MMSE dataset (a subset with the higher-quality ensemble members), and the least squares algorithm being filtered a posteriori.
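    The regression step at the heart of the super-ensemble can be illustrated with a toy example: during a training window, weights are fitted so that a linear combination of ensemble members best matches the observed SST, and the fitted weights are then applied elsewhere. The Python sketch below uses synthetic data and omits the EOF filtering and spatial smoothing discussed in the abstract.

```python
# Toy sketch of the multi-linear-regression super-ensemble idea on
# synthetic single-point SST time series.
import numpy as np

rng = np.random.default_rng(1)
n_days, n_members = 30, 4
truth = 20 + np.sin(np.linspace(0, 3, n_days))           # "observed" SST
members = np.column_stack([truth + rng.normal(b, 0.3, n_days)
                           for b in (0.5, -0.4, 0.2, 1.0)])  # biased models

train = slice(0, 15)                                      # 15-day learning period
X = np.column_stack([np.ones(15), members[train]])        # intercept + members
weights, *_ = np.linalg.lstsq(X, truth[train], rcond=None)

X_all = np.column_stack([np.ones(n_days), members])
sse = X_all @ weights                                     # super-ensemble analysis
rmse = lambda a: np.sqrt(np.mean((a - truth) ** 2))
print("best single member RMSE:", min(rmse(members[:, j]) for j in range(n_members)))
print("super-ensemble RMSE:    ", rmse(sse))
```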

  2. ProteinAC: a frequency domain technique for analyzing protein dynamics

    NASA Astrophysics Data System (ADS)

    Bozkurt Varolgunes, Yasemin; Demir, Alper

    2018-03-01

    It is widely believed that the interactions of proteins with ligands and other proteins are determined by their dynamic characteristics as opposed to only static, time-invariant processes. We propose a novel computational technique, called ProteinAC (PAC), that can be used to analyze small scale functional protein motions as well as interactions with ligands directly in the frequency domain. PAC was inspired by a frequency domain analysis technique that is widely used in electronic circuit design, and can be applied to both coarse-grained and all-atom models. It can be considered as a generalization of previously proposed static perturbation-response methods, where the frequency of the perturbation becomes the key. We discuss the precise relationship of PAC to static perturbation-response schemes. We show that the frequency of the perturbation may be an important factor in protein dynamics. Perturbations at different frequencies may result in completely different response behavior while magnitude and direction are kept constant. Furthermore, we introduce several novel frequency dependent metrics that can be computed via PAC in order to characterize response behavior. We present results for the ferric binding protein that demonstrate the potential utility of the proposed techniques.

  3. Electronic Devices Based on Oxide Thin Films Fabricated by Fiber-to-Film Process.

    PubMed

    Meng, You; Liu, Ao; Guo, Zidong; Liu, Guoxia; Shin, Byoungchul; Noh, Yong-Young; Fortunato, Elvira; Martins, Rodrigo; Shan, Fukai

    2018-05-30

    Technical development for thin-film fabrication is essential for emerging metal-oxide (MO) electronics. Although impressive progress has been achieved in fabricating MO thin films, the challenges still remain. Here, we report a versatile and general thermal-induced nanomelting technique for fabricating MO thin films from fiber networks, briefly called the fiber-to-film (FTF) process. The high quality of the FTF-processed MO thin films was confirmed by various investigations. The FTF process is generally applicable to numerous technologically relevant MO thin films, including semiconducting thin films (e.g., In2O3, InZnO, and InZrZnO), conducting thin films (e.g., InSnO), and insulating thin films (e.g., AlOx). By optimizing the fabrication process, In2O3/AlOx thin-film transistors (TFTs) were successfully integrated by fully FTF processes. High-performance TFTs were achieved with an average mobility of ∼25 cm2/(V s), an on/off current ratio of ∼10^7, a threshold voltage of ∼1 V, and a device yield of 100%. As a proof of concept, a one-transistor-driven pixel circuit was constructed, which exhibited high controllability over the light-emitting diodes. Logic gates based on fully FTF-processed In2O3/AlOx TFTs were further realized, which exhibited good dynamic logic responses and voltage amplification by a factor of ∼4. The FTF technique presented here offers great potential in large-area and low-cost manufacturing for flexible oxide electronics.

  4. An implementation and performance measurement of the progressive retry technique

    NASA Technical Reports Server (NTRS)

    Suri, Gaurav; Huang, Yennun; Wang, Yi-Min; Fuchs, W. Kent; Kintala, Chandra

    1995-01-01

    This paper describes a recovery technique called progressive retry for bypassing software faults in message-passing applications. The technique is implemented as reusable modules to provide application-level software fault tolerance. The paper describes the implementation of the technique and presents results from the application of progressive retry to two telecommunications systems. the results presented show that the technique is helpful in reducing the total recovery time for message-passing applications.

  5. Gene Profiling Technique to Accelerate Stem Cell Therapies for Eye Diseases

    MedlinePlus

    ... like RPE. They also use a technique called quantitative RT-PCR to measure the expression of genes ... higher in iPS cells than mature RPE. But quantitative RT-PCR only permits the simultaneous measurement of ...

  6. Towards Using Reo for Compliance-Aware Business Process Modeling

    NASA Astrophysics Data System (ADS)

    Arbab, Farhad; Kokash, Natallia; Meng, Sun

    Business process modeling and implementation of process supporting infrastructures are two challenging tasks that are not fully aligned. On the one hand, languages such as the Business Process Modeling Notation (BPMN) exist to capture business processes at the level of domain analysis. On the other hand, programming paradigms and technologies such as Service-Oriented Computing (SOC) and web services have emerged to simplify the development of the distributed web systems that underlie business processes. BPMN is the most recognized language for specifying process workflows at the early design steps. However, it is rather declarative and may lead to executable models which are incomplete or semantically erroneous. Therefore, an approach for expressing and analyzing BPMN models in a formal setting is required. In this paper we describe how BPMN diagrams can be represented by means of a semantically precise channel-based coordination language called Reo which admits formal analysis using model checking and bisimulation techniques. Moreover, since additional requirements may come from various regulatory/legislative documents, we discuss the opportunities offered by Reo and its mathematical abstractions for expressing process-related constraints such as Quality of Service (QoS) or time-aware conditions on process states.

  7. ToTem: a tool for variant calling pipeline optimization.

    PubMed

    Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka

    2018-06-26

    High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are presented as interactive graphs and tables allowing an optimal pipeline to be selected, based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
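    The benchmarking arithmetic that such a pipeline optimizer depends on is straightforward: called variants are intersected with a truth set and precision, recall and F-measure are computed. The Python sketch below shows that calculation on toy variant lists; it is not part of ToTem itself.

```python
# Sketch of precision/recall/F-measure benchmarking of variant calls
# against a truth set.  Variants are simplified to (chrom, pos, ref, alt).
def benchmark(called, truth):
    called, truth = set(called), set(truth)
    tp = len(called & truth)            # true positives
    fp = len(called - truth)            # false positives
    fn = len(truth - called)            # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    return precision, recall, f_measure

truth = [("chr1", 100, "A", "T"), ("chr1", 250, "G", "C"), ("chr2", 42, "T", "G")]
calls = [("chr1", 100, "A", "T"), ("chr2", 42, "T", "G"), ("chr2", 99, "C", "A")]
print("precision=%.2f recall=%.2f F=%.2f" % benchmark(calls, truth))
```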

  8. Overlay leaves litho: impact of non-litho processes on overlay and compensation

    NASA Astrophysics Data System (ADS)

    Ruhm, Matthias; Schulz, Bernd; Cotte, Eric; Seltmann, Rolf; Hertzsch, Tino

    2014-10-01

    According to the ITRS roadmap [1], the overlay requirement for the 28nm node is 8nm. If we compare this number with the performance given by tool vendors for their most advanced immersion systems (which is < 3nm), there seems to remain a large margin. Does that mean that today's leading edge Fab has an easy life? Unfortunately not, as other contributors affecting overlay are emerging. Mask contributions and so-called non-linear wafer distortions are known effects that can impact overlay quite significantly. Furthermore, it is often forgotten that downstream (post-litho) processes can impact the overlay as well. Thus, it can be required to compensate for the effects of subsequent processes already at the lithography operation. Within our paper, we will briefly touch on the wafer distortion topic and discuss the limitations of lithography compensation techniques such as higher order corrections versus solving the root cause of the distortions. The primary focus will be on the impact of the etch processes on the pattern placement error. We will show how individual layers can get affected differently by showing typical wafer signatures. However, in contrast to the above-mentioned wafer distortion topic, lithographic compensation techniques can be highly effective to reduce the placement error significantly towards acceptable levels (see Figure 1). Finally we will discuss the overall overlay budget for a 28nm contact to gate case by taking the impact of the individual process contributors into account.
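    The kind of lithographic compensation referred to above can be pictured as a least-squares fit of correction terms to measured overlay errors. The Python sketch below fits a simple 6-parameter linear model (translation, magnification and rotation terms) to synthetic post-etch overlay data; the model form and all numbers are illustrative assumptions, and production schemes use higher-order per-field models.

```python
# Hedged sketch: fit linear overlay correctables to synthetic measurements.
import numpy as np

rng = np.random.default_rng(2)
n = 60
x, y = rng.uniform(-150, 150, n), rng.uniform(-150, 150, n)        # position, mm
# synthetic "etch-induced" signature: translation + scaling + rotation + noise (nm)
dx = 1.0 + 0.05 * x - 0.03 * y + rng.normal(0, 0.5, n)
dy = -0.5 + 0.05 * y + 0.03 * x + rng.normal(0, 0.5, n)

# 6-parameter model:  dx = Tx + Mx*x - R1*y ;  dy = Ty + My*y + R2*x
A = np.zeros((2 * n, 6))
A[:n, 0], A[:n, 2], A[:n, 4] = 1.0, x, -y
A[n:, 1], A[n:, 3], A[n:, 5] = 1.0, y, x
coef, *_ = np.linalg.lstsq(A, np.concatenate([dx, dy]), rcond=None)
print("correctables [Tx, Ty, Mx, My, R1, R2]:", np.round(coef, 3))

residual = np.concatenate([dx, dy]) - A @ coef
print("residual 3-sigma after correction: %.2f nm" % (3 * residual.std()))
```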

  9. Entering dubious realms: Grover Krantz, science, and Sasquatch.

    PubMed

    Regal, Brian

    2009-01-01

    Physical anthropologist Grover Krantz (1931-2002) spent his career arguing that the anomalous North American primate called Sasquatch was a living animal. He attempted to prove the creature's existence by applying to the problem the techniques of physical anthropology: methodologies and theoretical models that were outside the experience of the amateur enthusiasts who dominated the field of anomalous primate studies. For his efforts, he was dismissed or ignored by academics who viewed the Sasquatch, also commonly called Bigfoot, as at best a relic of folklore and at worst a hoax, and Krantz's project as having dubious value. Krantz also received a negative reaction from amateur Sasquatch researchers, some of whom threatened and abused him. His career is best situated therefore as part of the discussion about the historical relationship between amateur naturalists and professional scientists. The literature on this relationship articulates a combining/displacement process: when a knowledge domain that has potential for contributions to science is created by amateurs, it will eventually combine with and then be taken over by professionals, with the result that amateur leadership is displaced. This paper contributes to that discussion by showing the process at work in Krantz's failed attempt to legitimize Bigfoot research by removing it from the amateur sphere and repositioning it in the professional world of anthropology.

  10. Insight on how fishing bats discern prey and adjust their mechanic and sensorial features during the attack sequence

    PubMed Central

    Aizpurua, Ostaizka; Alberdi, Antton; Aihartza, Joxerra; Garin, Inazio

    2015-01-01

    Several insectivorous bats have included fish in their diet, yet little is known about the processes underlying this trophic shift. We performed three field experiments with wild fishing bats to address how they manage to discern fish from insects and adapt their hunting technique to capture fish. We show that bats react only to targets protruding above the water and discern fish from insects based on prey disappearance patterns. Stationary fish trigger short and shallow dips and a terminal echolocation pattern with an important component of the narrowband and low frequency calls. When the fish disappears during the attack process, bats regulate their attack increasing the number of broadband and high frequency calls in the last phase of the echolocation as well as by lengthening and deepening their dips. These adjustments may allow bats to obtain more valuable sensorial information and to perform dips adjusted to the level of uncertainty on the location of the submerged prey. The observed ultrafast regulation may be essential for enabling fishing to become cost-effective in bats, and demonstrates the ability of bats to rapidly modify and synchronise their sensorial and motor features as a response to last minute stimulus variations. PMID:26196094

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, S.; Arend, N.; Lunkenheimer, P.

    The relaxational dynamics in glass-forming glycerol and glycerol mixed with LiCl is investigated using different neutron scattering techniques. The performed neutron spin echo experiments, which extend up to relatively long relaxation time scales of the order of 10 ns, should allow for the detection of contributions from the so-called excess wing. This phenomenon, whose microscopic origin is controversially discussed, arises in a variety of glass formers and, until now, was almost exclusively investigated by dielectric spectroscopy and light scattering. In conclusion, we show here that the relaxational process causing the excess wing can also be detected by neutron scattering, which directly couples to density fluctuations.

  12. Damage Detection Using Holography and Interferometry

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J.

    2003-01-01

    This paper reviews classical approaches to damage detection using laser holography and interferometry. The paper then details the modern uses of electronic holography and neural-net-processed characteristic patterns to detect structural damage. The design of the neural networks and the preparation of the training sets are discussed. The use of a technique to optimize the training sets, called folding, is explained. Then a training procedure is detailed that uses the holography-measured vibration modes of the undamaged structures to impart damage-detection sensitivity to the neural networks. The inspections of an optical strain gauge mounting plate and an International Space Station cold plate are presented as examples.

  13. Problem solving with genetic algorithms and Splicer

    NASA Technical Reports Server (NTRS)

    Bayer, Steven E.; Wang, Lui

    1991-01-01

    Genetic algorithms are highly parallel, adaptive search procedures (i.e., problem-solving methods) loosely based on the processes of population genetics and Darwinian survival of the fittest. Genetic algorithms have proven useful in domains where other optimization techniques perform poorly. The main purpose of the paper is to discuss a NASA-sponsored software development project to develop a general-purpose tool for using genetic algorithms. The tool, called Splicer, can be used to solve a wide variety of optimization problems and is currently available from NASA and COSMIC. This discussion is preceded by an introduction to basic genetic algorithm concepts and a discussion of genetic algorithm applications.
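    For readers unfamiliar with the basic mechanics the paragraph refers to, the Python sketch below shows a minimal genetic algorithm (tournament selection, one-point crossover, bit-flip mutation) on a toy fitness function; it is a generic illustration, not the Splicer tool.

```python
# Minimal genetic algorithm: the fitness function counts ones in a bit
# string, so the optimum is the all-ones string.
import random

random.seed(0)
GENOME_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 30, 40, 60, 0.02
fitness = lambda g: sum(g)

def tournament(pop):
    """Pick the fittest of three randomly chosen individuals."""
    return max(random.sample(pop, 3), key=fitness)

pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    new_pop = []
    while len(new_pop) < POP_SIZE:
        p1, p2 = tournament(pop), tournament(pop)
        cut = random.randrange(1, GENOME_LEN)             # one-point crossover
        child = p1[:cut] + p2[cut:]
        child = [1 - b if random.random() < MUT_RATE else b for b in child]
        new_pop.append(child)
    pop = new_pop

best = max(pop, key=fitness)
print("best fitness:", fitness(best), "of", GENOME_LEN)
```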

  14. Discover binding pathways using the sliding binding-box docking approach: application to binding pathways of oseltamivir to avian influenza H5N1 neuraminidase

    NASA Astrophysics Data System (ADS)

    Tran, Diem-Trang T.; Le, Ly T.; Truong, Thanh N.

    2013-08-01

    Drug binding and unbinding are transient processes which are hard to observe experimentally and difficult to analyze with computational techniques. In this paper, we employed a cost-effective method called "pathway docking," in which molecular docking was used to screen the ligand-receptor binding free energy surface to reveal possible paths by which a ligand approaches the protein binding pocket. A case study was performed on oseltamivir, the key drug against influenza A virus. The equilibrium pathways identified by this method are found to be similar to those identified in prior studies using highly expensive computational approaches.

  15. The robot's eyes - Stereo vision system for automated scene analysis

    NASA Technical Reports Server (NTRS)

    Williams, D. S.

    1977-01-01

    Attention is given to the robot stereo vision system, which maintains the image produced by solid-state detector television cameras in a dynamic random access memory called RAPID. The imaging hardware consists of sensors (two solid-state image arrays using a charge injection technique), a video-rate analog-to-digital converter, the RAPID memory, various types of computer-controlled displays, and preprocessing equipment (for reflexive actions, processing aids, and object detection). The software is aimed at locating objects and determining traversability. An object-tracking algorithm is discussed, and it is noted that tracking speed is in the 50-75 pixels/s range.

  16. Interdisciplinary Research at the Intersection of CALL, NLP, and SLA: Methodological Implications from an Input Enhancement Project

    ERIC Educational Resources Information Center

    Ziegler, Nicole; Meurers, Detmar; Rebuschat, Patrick; Ruiz, Simón; Moreno-Vega, José L.; Chinkina, Maria; Li, Wenjing; Grey, Sarah

    2017-01-01

    Despite the promise of research conducted at the intersection of computer-assisted language learning (CALL), natural language processing, and second language acquisition, few studies have explored the potential benefits of using intelligent CALL systems to deepen our understanding of the process and products of second language (L2) learning. The…

  17. The Relationships among Calling, Religiousness, and Dysfunctional Career Thoughts in Public University Students

    ERIC Educational Resources Information Center

    Rodriguez, Stefanie Josephine

    2011-01-01

    The purpose of the study was to examine the relationships among calling, religiousness, and dysfunctional career thoughts. Though the cognitive processes in the career decision-making process have been a focus of research in recent years, the relationship between career thoughts and calling has only been studied once and career thoughts'…

  18. Decomposition-Based Decision Making for Aerospace Vehicle Design

    NASA Technical Reports Server (NTRS)

    Borer, Nicholas K.; Mavris, Dimitri N.

    2005-01-01

    Most practical engineering systems design problems have multiple and conflicting objectives. Furthermore, the satisfactory attainment level for each objective (requirement) is likely uncertain early in the design process. Systems with long design cycle times will exhibit more of this uncertainty throughout the design process. This is further complicated if the system is expected to perform for a relatively long period of time, as now it will need to grow as new requirements are identified and new technologies are introduced. These points identify a need for a systems design technique that enables decision making amongst multiple objectives in the presence of uncertainty. Traditional design techniques deal with a single objective or a small number of objectives that are often aggregates of the overarching goals sought through the generation of a new system. Other requirements, although uncertain, are viewed as static constraints to this single or multiple objective optimization problem. With either of these formulations, enabling tradeoffs between the requirements, objectives, or combinations thereof is a slow, serial process that becomes increasingly complex as more criteria are added. This research proposal outlines a technique that attempts to address these and other idiosyncrasies associated with modern aerospace systems design. The proposed formulation first recasts systems design into a multiple criteria decision making problem. The now multiple objectives are decomposed to discover the critical characteristics of the objective space. Tradeoffs between the objectives are considered amongst these critical characteristics by comparison to a probabilistic ideal tradeoff solution. The proposed formulation represents a radical departure from traditional methods. A pitfall of this technique is in the validation of the solution: in a multi-objective sense, how can a decision maker justify a choice between non-dominated alternatives? A series of examples help the reader to observe how this technique can be applied to aerospace systems design and compare the results of this so-called Decomposition-Based Decision Making to more traditional design approaches.
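
    The comparison to an "ideal tradeoff solution" mentioned above can be illustrated, very loosely, with a generic ideal-point (TOPSIS-style) calculation. The sketch below is not the authors' Decomposition-Based Decision Making formulation; the score matrix, weights, and normalization choice are illustrative assumptions.

      import numpy as np  # generic multi-criteria sketch, not the authors' formulation

      # Rows = candidate designs, columns = objectives (here all "larger is better").
      scores = np.array([[0.8, 0.6, 0.7],
                         [0.9, 0.4, 0.8],
                         [0.7, 0.9, 0.5]], dtype=float)
      weights = np.array([0.5, 0.3, 0.2])              # illustrative objective weights

      norm = scores / np.linalg.norm(scores, axis=0)   # vector-normalize each objective
      weighted = norm * weights
      ideal = weighted.max(axis=0)                     # best attainable value per objective
      nadir = weighted.min(axis=0)                     # worst value per objective

      d_ideal = np.linalg.norm(weighted - ideal, axis=1)
      d_nadir = np.linalg.norm(weighted - nadir, axis=1)
      closeness = d_nadir / (d_ideal + d_nadir)        # 1 = at the ideal tradeoff point

      print("closeness to ideal:", closeness, "-> preferred design:", int(closeness.argmax()))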

  19. Validation and application of Acoustic Mapping Velocimetry

    NASA Astrophysics Data System (ADS)

    Baranya, Sandor; Muste, Marian

    2016-04-01

    The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instruments: acoustic and image-based. The bedform mapping is conducted with acoustic surveys while the estimation of the velocity of the bedforms is obtained with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport estimation is done using the Exner equation. A proof-of-concept experiment was performed to validate the AMV-based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings, and acceptable agreement was found. As a first field implementation of the AMV, an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps resulting from repeated multibeam echo sounder (MBES) surveys served as input data. Cross-sectional distributions of bedload transport rates from the AMV-based method were compared with the ones obtained from another non-intrusive technique (due to the lack of direct samplings), ISSDOTv2, developed by the US Army Corps of Engineers. The good agreement between the results from the two different methods is encouraging and suggests further field tests in varying hydro-morphological situations.
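
    The final step, converting the bedform migration velocity into a bedload transport rate via the Exner equation, is often approximated with a simple dune-tracking relation, q_b ≈ (1 - p)·α·H·V_c. The sketch below uses that simplified relation with made-up numbers; it is not the AMV processing chain, and the parameter values are illustrative assumptions.

      # Simplified dune-tracking estimate of bedload transport from bedform
      # migration, in the spirit of the Exner-based approach described above.
      # All values are illustrative, not taken from the paper.

      porosity = 0.4        # bed sediment porosity (-)
      shape_factor = 0.5    # bedform shape factor (~0.5 for triangular dunes)
      heights = [0.12, 0.10, 0.15, 0.09]              # local bedform heights (m)
      velocities = [2.0e-4, 1.5e-4, 2.5e-4, 1.8e-4]   # streamwise migration speeds (m/s)

      # Unit bedload rate per location: q_b = (1 - p) * alpha * H * Vc  (m^2/s)
      rates = [(1.0 - porosity) * shape_factor * h * v
               for h, v in zip(heights, velocities)]
      mean_rate = sum(rates) / len(rates)
      print("mean unit bedload transport rate: %.2e m^2/s" % mean_rate)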

  20. Principle of the electrically induced Transient Current Technique

    NASA Astrophysics Data System (ADS)

    Bronuzzi, J.; Moll, M.; Bouvet, D.; Mapelli, A.; Sallese, J. M.

    2018-05-01

    In the field of detector development for High Energy Physics, the so-called Transient Current Technique (TCT) is used to characterize the electric field profile and the charge trapping inside silicon radiation detectors, in which particles or photons create electron-hole pairs in the bulk of a semiconductor device such as a PiN diode. In the standard approach, the TCT signal originates from the free carriers generated close to the surface of a silicon detector, by short pulses of light or by alpha particles. This work proposes a new principle of charge injection by means of lateral PN junctions implemented in one of the detector electrodes, called the electrical TCT (el-TCT). This technique is fully compatible with CMOS technology and therefore opens new perspectives for the assessment of radiation detector performance.

  1. Calculating phase equilibrium properties of plasma pseudopotential model using hybrid Gibbs statistical ensemble Monte-Carlo technique

    NASA Astrophysics Data System (ADS)

    Butlitsky, M. A.; Zelener, B. B.; Zelener, B. V.

    2015-11-01

    Earlier, a two-component pseudopotential plasma model, which we call the “shelf Coulomb” model, was developed. A Monte-Carlo study of the canonical NVT ensemble with periodic boundary conditions was undertaken to calculate equations of state, pair distribution functions, internal energies, and other thermodynamic properties of the model. In the present work, an attempt is made to apply the so-called hybrid Gibbs statistical ensemble Monte-Carlo technique to this model. First simulation results show qualitatively similar behavior in the critical point region for both methods. The Gibbs ensemble technique also lets us estimate the melting curve position and the triple point of the model (in reduced temperature and specific volume coordinates): T* ≈ 0.0476, v* ≈ 6 × 10^-4.
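
    For context, the textbook NVT Gibbs-ensemble acceptance rules for the particle-transfer and volume-exchange trial moves can be written as small functions, as sketched below. This is not the authors' hybrid scheme or code; the numbers in the example are illustrative reduced units only.

      import math  # standard NVT Gibbs-ensemble acceptance rules (textbook form),
                   # shown as plain functions; not the authors' implementation.

      def acc_particle_transfer(n_from, v_from, n_to, v_to, dU, beta):
          # Transfer one particle from the "from" box to the "to" box.
          if n_from == 0:
              return 0.0
          arg = (n_from * v_to) / ((n_to + 1) * v_from) * math.exp(-beta * dU)
          return min(1.0, arg)

      def acc_volume_exchange(n1, v1, n2, v2, dv, dU, beta):
          # Exchange a volume increment dv between the boxes at constant total volume.
          arg = (((v1 + dv) / v1) ** n1) * (((v2 - dv) / v2) ** n2) * math.exp(-beta * dU)
          return min(1.0, arg)

      # Illustrative numbers only (reduced units).
      print(acc_particle_transfer(n_from=100, v_from=500.0, n_to=50, v_to=300.0,
                                  dU=0.2, beta=1.0))
      print(acc_volume_exchange(n1=100, v1=500.0, n2=50, v2=300.0,
                                dv=5.0, dU=-0.1, beta=1.0))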

  2. Designing a more efficient, effective and safe Medical Emergency Team (MET) service using data analysis

    PubMed Central

    Bilgrami, Irma; Bain, Christopher; Webb, Geoffrey I.; Orosz, Judit; Pilcher, David

    2017-01-01

    Introduction Hospitals have seen a rise in Medical Emergency Team (MET) reviews. We hypothesised that the commonest MET calls result in similar treatments. Our aim was to design a pre-emptive management algorithm that allowed direct institution of treatment to patients without having to wait for attendance of the MET team and to model its potential impact on MET call incidence and patient outcomes. Methods Data was extracted for all MET calls from the hospital database. Association rule data mining techniques were used to identify the most common combinations of MET call causes, outcomes and therapies. Results There were 13,656 MET calls during the 34-month study period in 7936 patients. The most common MET call was for hypotension [31%, (2459/7936)]. These MET calls were strongly associated with the immediate administration of intra-venous fluid (70% [1714/2459] v 13% [739/5477] p<0.001), unless the patient was located on a respiratory ward (adjusted OR 0.41 [95%CI 0.25–0.67] p<0.001), had a cardiac cause for admission (adjusted OR 0.61 [95%CI 0.50–0.75] p<0.001) or was under the care of the heart failure team (adjusted OR 0.29 [95%CI 0.19–0.42] p<0.001). Modelling the effect of a pre-emptive management algorithm for immediate fluid administration without MET activation on data from a test period of 24 months following the study period, suggested it would lead to a 68.7% (2541/3697) reduction in MET calls for hypotension and a 19.6% (2541/12938) reduction in total METs without adverse effects on patients. Conclusion Routinely collected data and analytic techniques can be used to develop a pre-emptive management algorithm to administer intravenous fluid therapy to a specific group of hypotensive patients without the need to initiate a MET call. This could both lead to earlier treatment for the patient and less total MET calls. PMID:29281665
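
    The association-rule measures used in such analyses (support and confidence of rules like "hypotension → intravenous fluid") can be computed directly, as in the minimal sketch below. The records shown are synthetic; this is not the study's data or mining pipeline.

      # Minimal support/confidence calculation of the kind used in association rule
      # mining, on synthetic MET-call records (not the study's data or pipeline).

      records = [
          {"hypotension", "iv_fluid"},
          {"hypotension", "iv_fluid"},
          {"hypotension"},
          {"tachycardia", "ecg"},
          {"hypotension", "iv_fluid", "ecg"},
          {"desaturation", "oxygen"},
      ]

      def support(itemset):
          return sum(itemset <= r for r in records) / len(records)

      def confidence(antecedent, consequent):
          return support(antecedent | consequent) / support(antecedent)

      rule = ({"hypotension"}, {"iv_fluid"})
      print("support(hypotension) = %.2f" % support(rule[0]))
      print("confidence(hypotension -> iv_fluid) = %.2f" % confidence(*rule))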

  3. Teaching Business Management to Engineers: The Impact of Interactive Lectures

    ERIC Educational Resources Information Center

    Rambocas, Meena; Sastry, Musti K. S.

    2017-01-01

    Some education specialists are challenging the use of traditional strategies in classrooms and are calling for the use of contemporary teaching and learning techniques. In response to these calls, many field experiments that compare different teaching and learning strategies have been conducted. However, to date, little is known on the outcomes of…

  4. 78 FR 69705 - 60-Day Notice of Proposed Information Collection: Mortgagee's Application for Partial Settlement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... calling the toll-free Federal Relay Service at (800) 877-8339. FOR FURTHER INFORMATION CONTACT: Steve... through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available... techniques or other forms of information technology, e.g., permitting electronic submission of responses. HUD...

  5. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  6. Enhanced rearrangement technique for secure data transmission: case study credit card process

    NASA Astrophysics Data System (ADS)

    Vyavahare, Tushar; Tekade, Darshana; Nayak, Saurabh; kumar, N. Suresh; Blessy Trencia Lincy, S. S.

    2017-11-01

    Encrypting data is essential for keeping it secure and for supporting secure transactions and transmissions, such as online shopping: whenever card details are submitted, there is a possibility of the data being hacked or intercepted. The data must therefore be encrypted, and the decryption strategy should be known only to the receiving bank. The RSA algorithm can be used to achieve this objective, since only the intended sender and receiver know how to encrypt and decrypt the data. To make the RSA technique more secure, this paper proposes a technique we call Modified RSA, for which a transposition module based on the row transposition method is designed to encrypt the data. Before the card details are passed to RSA, the input is given to this transposition module, which scrambles and rearranges the data. The output of the transposition is then provided to the modified RSA, which produces the ciphertext to be sent over the network. The combination of RSA and the transposition module provides dual security for the whole system.
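
    A toy sketch of the two stages described (row transposition followed by RSA) is given below. The column key, padding character, and the tiny textbook RSA key (p=61, q=53) are illustrative assumptions only and are far too small for real card-data security.

      # Toy sketch of the two stages described above: a row-transposition scramble
      # followed by textbook RSA. Keys and details are illustrative only.

      def row_transpose(text, key=(2, 0, 3, 1)):
          # Write the text into rows of len(key), then read columns in key order.
          width = len(key)
          text = text.ljust(-(-len(text) // width) * width, "X")   # pad last row
          rows = [text[i:i + width] for i in range(0, len(text), width)]
          return "".join(row[k] for k in key for row in rows)

      def rsa_encrypt(message, e=17, n=3233):          # toy keys: p=61, q=53, d=2753
          return [pow(ord(ch), e, n) for ch in message]

      def rsa_decrypt(cipher, d=2753, n=3233):
          return "".join(chr(pow(c, d, n)) for c in cipher)

      card = "4111222233334444"
      scrambled = row_transpose(card)
      cipher = rsa_encrypt(scrambled)
      print("scrambled:", scrambled)
      print("recovered:", rsa_decrypt(cipher))         # matches the scrambled input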

  7. Three-Dimensional Shape Measurements of Specular Objects Using Phase-Measuring Deflectometry

    PubMed Central

    Wang, Yuemin; Huang, Shujun; Liu, Yue; Chang, Caixia; Gao, Feng; Jiang, Xiangqian

    2017-01-01

    The fast development in the fields of integrated circuits, photovoltaics, the automobile industry, advanced manufacturing, and astronomy has led to the importance and necessity of quickly and accurately obtaining three-dimensional (3D) shape data of specular surfaces for quality control and function evaluation. Owing to the advantages of a large dynamic range, non-contact operation, full-field and fast acquisition, high accuracy, and automatic data processing, phase-measuring deflectometry (PMD, also called fringe reflection profilometry) has been widely studied and applied in many fields. Phase information coded in the reflected fringe patterns relates to the local slope and height of the measured specular objects. The 3D shape is obtained by integrating the local gradient data or directly calculating the depth data from the phase information. We present a review of the relevant techniques regarding classical PMD. The improved PMD technique is then used to measure specular objects having discontinuous and/or isolated surfaces. Some factors that influence the measured results are presented. The challenges and future research directions are discussed to further advance PMD techniques. Finally, the application fields of PMD are briefly introduced. PMID:29215600
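
    The step "the 3D shape is obtained by integrating the local gradient data" can be illustrated with a naive path integration of a synthetic slope field, as below. Practical PMD systems use more robust least-squares or Fourier-domain integration; the surface, grid, and step size here are illustrative assumptions.

      import numpy as np  # schematic slope-to-height integration; production PMD systems
                          # typically use more robust least-squares or Fourier integration.

      # Synthetic surface and its analytic slopes (what PMD recovers from fringe phase).
      step = 0.1
      y, x = np.mgrid[0:64, 0:64] * step
      z_true = 0.05 * (x - 3.2) ** 2 + 0.02 * (y - 3.2) ** 2
      dzdx = 0.10 * (x - 3.2)
      dzdy = 0.04 * (y - 3.2)

      # Naive path integration: along the top row in x, then down each column in y.
      z_rec = np.cumsum(dzdx[0:1, :], axis=1) * step + np.cumsum(dzdy, axis=0) * step
      z_rec -= z_rec.mean() - z_true.mean()       # height is only known up to a constant

      print("RMS reconstruction error: %.4f" % np.sqrt(np.mean((z_rec - z_true) ** 2)))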

  8. Real-time estimation of wildfire perimeters from curated crowdsourcing

    NASA Astrophysics Data System (ADS)

    Zhong, Xu; Duckham, Matt; Chong, Derek; Tolhurst, Kevin

    2016-04-01

    Real-time information about the spatial extents of evolving natural disasters, such as wildfire or flood perimeters, can assist both emergency responders and the general public during an emergency. However, authoritative information sources can suffer from bottlenecks and delays, while user-generated social media data usually lacks the necessary structure and trustworthiness for reliable automated processing. This paper describes and evaluates an automated technique for real-time tracking of wildfire perimeters based on publicly available “curated” crowdsourced data about telephone calls to the emergency services. Our technique is based on established data mining tools, and can be adjusted using a small number of intuitive parameters. Experiments using data from the devastating Black Saturday wildfires (2009) in Victoria, Australia, demonstrate the potential for the technique to detect and track wildfire perimeters automatically, in real time, and with moderate accuracy. Accuracy can be further increased through combination with other authoritative demographic and environmental information, such as population density and dynamic wind fields. These results are also independently validated against data from the more recent 2014 Mickleham-Dalrymple wildfires.
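
    As a rough illustration of turning point data about emergency calls into a perimeter estimate, the sketch below clusters synthetic call coordinates with DBSCAN and takes the convex hull of the densest cluster. The tools, parameters, and coordinates are illustrative assumptions, not the authors' pipeline.

      import numpy as np
      from sklearn.cluster import DBSCAN          # illustrative tooling choice, not
      from scipy.spatial import ConvexHull        # necessarily what the authors used

      rng = np.random.default_rng(0)
      # Synthetic "curated" call coordinates: one dense fire-affected cluster plus noise.
      fire_calls = rng.normal(loc=[145.0, -37.5], scale=0.02, size=(80, 2))
      noise_calls = rng.uniform(low=[144.5, -38.0], high=[145.5, -37.0], size=(20, 2))
      points = np.vstack([fire_calls, noise_calls])

      labels = DBSCAN(eps=0.03, min_samples=5).fit(points).labels_
      cluster = points[labels == 0]               # densest cluster; -1 marks noise
      hull = ConvexHull(cluster)                  # rough perimeter estimate
      perimeter = cluster[hull.vertices]

      print("calls in cluster:", len(cluster))
      print("perimeter vertices:\n", np.round(perimeter, 3))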

  9. Three-Dimensional Shape Measurements of Specular Objects Using Phase-Measuring Deflectometry.

    PubMed

    Zhang, Zonghua; Wang, Yuemin; Huang, Shujun; Liu, Yue; Chang, Caixia; Gao, Feng; Jiang, Xiangqian

    2017-12-07

    The fast development in the fields of integrated circuits, photovoltaics, the automobile industry, advanced manufacturing, and astronomy has led to the importance and necessity of quickly and accurately obtaining three-dimensional (3D) shape data of specular surfaces for quality control and function evaluation. Owing to the advantages of a large dynamic range, non-contact operation, full-field and fast acquisition, high accuracy, and automatic data processing, phase-measuring deflectometry (PMD, also called fringe reflection profilometry) has been widely studied and applied in many fields. Phase information coded in the reflected fringe patterns relates to the local slope and height of the measured specular objects. The 3D shape is obtained by integrating the local gradient data or directly calculating the depth data from the phase information. We present a review of the relevant techniques regarding classical PMD. The improved PMD technique is then used to measure specular objects having discontinuous and/or isolated surfaces. Some factors that influence the measured results are presented. The challenges and future research directions are discussed to further advance PMD techniques. Finally, the application fields of PMD are briefly introduced.

  10. An explicit approach to detecting and characterizing submersed aquatic vegetation using a single-beam digital echosounder

    NASA Astrophysics Data System (ADS)

    Sabol, Bruce M.

    2005-09-01

    There has been a longstanding need for an objective and cost-effective technique to detect, characterize, and quantify submersed aquatic vegetation at spatial scales between direct physical sampling and remote aerial-based imaging. Acoustic-based approaches for doing so are reviewed and an explicit approach, using a narrow, single-beam echosounder, is described in detail. This heuristic algorithm is based on the spatial distribution of a thresholded signal generated from a high-frequency, narrow-beam echosounder operated in a vertical orientation from a survey boat. The physical basis, rationale, and implementation of this algorithm are described, and data documenting performance are presented. Using this technique, it is possible to generate orders of magnitude more data than would be available using previous techniques with a comparable level of effort. Thus, new analysis and interpretation approaches are called for which can make full use of these data. Several example analyses are shown for environmental-effects application studies. The current operational window and performance limitations are identified, and thoughts on potential processing approaches to improve performance are discussed.

  11. Real-time estimation of wildfire perimeters from curated crowdsourcing

    PubMed Central

    Zhong, Xu; Duckham, Matt; Chong, Derek; Tolhurst, Kevin

    2016-01-01

    Real-time information about the spatial extents of evolving natural disasters, such as wildfire or flood perimeters, can assist both emergency responders and the general public during an emergency. However, authoritative information sources can suffer from bottlenecks and delays, while user-generated social media data usually lacks the necessary structure and trustworthiness for reliable automated processing. This paper describes and evaluates an automated technique for real-time tracking of wildfire perimeters based on publicly available “curated” crowdsourced data about telephone calls to the emergency services. Our technique is based on established data mining tools, and can be adjusted using a small number of intuitive parameters. Experiments using data from the devastating Black Saturday wildfires (2009) in Victoria, Australia, demonstrate the potential for the technique to detect and track wildfire perimeters automatically, in real time, and with moderate accuracy. Accuracy can be further increased through combination with other authoritative demographic and environmental information, such as population density and dynamic wind fields. These results are also independently validated against data from the more recent 2014 Mickleham-Dalrymple wildfires. PMID:27063569

  12. Real-time estimation of wildfire perimeters from curated crowdsourcing.

    PubMed

    Zhong, Xu; Duckham, Matt; Chong, Derek; Tolhurst, Kevin

    2016-04-11

    Real-time information about the spatial extents of evolving natural disasters, such as wildfire or flood perimeters, can assist both emergency responders and the general public during an emergency. However, authoritative information sources can suffer from bottlenecks and delays, while user-generated social media data usually lacks the necessary structure and trustworthiness for reliable automated processing. This paper describes and evaluates an automated technique for real-time tracking of wildfire perimeters based on publicly available "curated" crowdsourced data about telephone calls to the emergency services. Our technique is based on established data mining tools, and can be adjusted using a small number of intuitive parameters. Experiments using data from the devastating Black Saturday wildfires (2009) in Victoria, Australia, demonstrate the potential for the technique to detect and track wildfire perimeters automatically, in real time, and with moderate accuracy. Accuracy can be further increased through combination with other authoritative demographic and environmental information, such as population density and dynamic wind fields. These results are also independently validated against data from the more recent 2014 Mickleham-Dalrymple wildfires.

  13. Non-uniform sampling: post-Fourier era of NMR data collection and processing.

    PubMed

    Kazimierczuk, Krzysztof; Orekhov, Vladislav

    2015-11-01

    The invention of multidimensional techniques in the 1970s revolutionized NMR, making it the general tool of structural analysis of molecules and materials. In the most straightforward approach, the signal sampling in the indirect dimensions of a multidimensional experiment is performed in the same manner as in the direct dimension, i.e. with a grid of equally spaced points. This results in lengthy experiments with a resolution often far from optimum. To circumvent this problem, numerous sparse-sampling techniques have been developed in the last three decades, including two traditionally distinct approaches: the radial sampling and non-uniform sampling. This mini review discusses the sparse signal sampling and reconstruction techniques from the point of view of an underdetermined linear algebra problem that arises when a full, equally spaced set of sampled points is replaced with sparse sampling. Additional assumptions that are introduced to solve the problem, as well as the shape of the undersampled Fourier transform operator (visualized as so-called point spread function), are shown to be the main differences between various sparse-sampling methods. Copyright © 2015 John Wiley & Sons, Ltd.
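
    The "point spread function" of a sampling schedule mentioned above is simply the Fourier transform of its 0/1 sampling mask, as the short sketch below illustrates; the grid size and sampling fraction are illustrative assumptions.

      import numpy as np  # minimal sketch: the point spread function of a sampling
                          # schedule is the Fourier transform of its 0/1 sampling mask.

      n_grid = 256                     # full equally spaced grid in the indirect dimension
      n_sampled = 64                   # only 25% of the points are actually measured

      rng = np.random.default_rng(1)
      mask = np.zeros(n_grid)
      mask[rng.choice(n_grid, size=n_sampled, replace=False)] = 1.0

      psf = np.abs(np.fft.fft(mask))   # artifact pattern convolved with every true peak
      psf /= psf[0]                    # normalize so the main peak has height 1

      print("largest sampling artifact relative to the main peak: %.3f" % psf[1:].max())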

  14. Parallel halftoning technique using dot diffusion optimization

    NASA Astrophysics Data System (ADS)

    Molina-Garcia, Javier; Ponomaryov, Volodymyr I.; Reyes-Reyes, Rogelio; Cruz-Ramos, Clara

    2017-05-01

    In this paper, a novel approach to halftone imaging is proposed and implemented for images obtained by the dot diffusion (DD) method. The designed technique is based on an optimization of the so-called class matrix used in the DD algorithm; it consists of generating new versions of the class matrix that contain no baron or near-baron entries, in order to minimize inconsistencies during the distribution of the error. Two class matrices with different properties are proposed, each designed for a different application: applications where inverse halftoning is necessary, and applications where this step is not required. The proposed method has been implemented on a GPU (NVIDIA GeForce GTX 750 Ti) and on multicore processors (an AMD FX(tm)-6300 six-core processor and an Intel Core i5-4200U), using CUDA and OpenCV on a Linux PC. Experimental results have shown that the novel framework generates good-quality halftone images and inverse-halftone images. The simulation results using parallel architectures have demonstrated the efficiency of the novel technique when implemented for real-time processing.
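
    A minimal (Knuth-style) dot-diffusion pass driven by a class matrix is sketched below to illustrate the role the class matrix plays. The 8x8 matrix used here is an arbitrary ordering, not the optimized baron-free matrix proposed in the paper, and the sketch is sequential rather than parallel.

      import numpy as np  # minimal dot-diffusion sketch (Knuth-style); the 8x8 class
                          # matrix below is an arbitrary ordering, not the optimized
                          # baron-free matrix proposed in the paper.

      CLASS = np.arange(64).reshape(8, 8)              # placeholder class matrix (0..63)

      def dot_diffusion(img):
          # img: float array in [0, 1]; returns a binary halftone of the same shape.
          h, w = img.shape
          work = img.astype(float).copy()
          out = np.zeros_like(work)
          order = sorted(((i, j) for i in range(h) for j in range(w)),
                         key=lambda ij: CLASS[ij[0] % 8, ij[1] % 8])
          for i, j in order:
              out[i, j] = 1.0 if work[i, j] >= 0.5 else 0.0
              err = work[i, j] - out[i, j]
              nbrs = [(i + di, j + dj, 1.0 if di and dj else 2.0)
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)
                      if (di or dj) and 0 <= i + di < h and 0 <= j + dj < w
                      and CLASS[(i + di) % 8, (j + dj) % 8] > CLASS[i % 8, j % 8]]
              total = sum(wgt for _, _, wgt in nbrs)
              for ni, nj, wgt in nbrs:                 # push error to unprocessed neighbors
                  work[ni, nj] += err * wgt / total
          return out

      gradient = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
      print("mean output level:", dot_diffusion(gradient).mean())   # ~0.5 for this ramp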

  15. Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls

    NASA Astrophysics Data System (ADS)

    Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.

    2015-10-01

    Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.

  16. Leveraging Call Center Logs for Customer Behavior Prediction

    NASA Astrophysics Data System (ADS)

    Parvathy, Anju G.; Vasudevan, Bintu G.; Kumar, Abhishek; Balakrishnan, Rajesh

    Most major businesses use business process outsourcing for performing a process or a part of a process, including financial services such as mortgage processing, loan origination, finance and accounting, and transaction processing. Call centers are used for receiving and transmitting a large volume of requests through outbound and inbound calls to customers on behalf of a business. In this paper we deal specifically with the call center notes from banks. Banks, as financial institutions, provide loans to non-financial businesses and individuals. Their call centers act as the nuclei of their client service operations and log the transactions between the customer and the bank. This crucial conversational information can be exploited for predicting a customer’s behavior, which will in turn help these businesses decide on the next action to be taken. The banks thus save considerable time and effort in tracking delinquent customers and minimizing subsequent defaults. The majority of the time the call center notes are very concise and brief, and often the notes are misspelled and use many domain-specific acronyms. In this paper we introduce a novel domain-specific spelling correction algorithm which corrects the misspelled words in the call center logs to meaningful ones. We also discuss a procedure that builds behavioral history sequences for the customers by categorizing the logs into one of the predefined behavioral states. We then describe a pattern-based predictive algorithm that uses temporal behavioral patterns mined from these sequences to predict the customer’s next behavioral state.
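
    A lexicon-based correction of noisy call-center tokens can be illustrated as below; the lexicon, acronym map, and similarity cutoff are made-up assumptions, and this is not the paper's domain-specific algorithm.

      import difflib  # minimal lexicon-based correction sketch; the paper's algorithm
                      # and domain lexicon are not reproduced here.

      LEXICON = ["payment", "promise", "delinquent", "mortgage", "statement", "balance"]
      ACRONYMS = {"ptp": "promise to pay", "dlq": "delinquent"}   # made-up examples

      def correct_token(token):
          token = token.lower()
          if token in ACRONYMS:
              return ACRONYMS[token]
          match = difflib.get_close_matches(token, LEXICON, n=1, cutoff=0.75)
          return match[0] if match else token

      note = "cust made ptp on mortage paymnt for dlq balence"
      print(" ".join(correct_token(t) for t in note.split()))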

  17. Critical incident technique: an innovative participatory approach to examine and document racial disparities in breast cancer healthcare services

    PubMed Central

    Yonas, Michael A.; Aronson, Robert; Schaal, Jennifer; Eng, Eugenia; Hardy, Christina; Jones, Nora

    2013-01-01

    Disproportionate and persistent inequities in quality of healthcare have been observed among persons of color in the United States. To understand and ultimately eliminate such inequities, several public health institutions have issued calls for innovative methods and approaches that examine determinants from the social, organizational and public policy contexts to inform the design of systems change interventions. The authors, including academic and community research partners in a community-based participatory research (CBPR) study, reflected together on the use and value of the critical incident technique (CIT) for exploring racial disparities in healthcare for women with breast cancer. Academic and community partners used an initial large group discussion involving a large partnership of 35 academic and community researchers guided by principles of CBPR, followed by the efforts of a smaller interdisciplinary manuscript team of academic and community researchers to reflect on, document, summarize, and translate this participatory research process, lessons learned, and value added from using the CIT with principles of CBPR and Undoing Racism. The result is a discussion of the process, strengths, and challenges of utilizing CIT with CBPR. The participation of community members at all levels of the research process, including development, collection of the data, and analysis of the data, was enhanced by the CIT process. As the field of CBPR continues to mature, innovative processes which combine the expertise of community and academic partners can enhance the success of such partnerships. This report contributes to the existing literature by illustrating a unique and participatory research application of CIT with principles of CBPR and Undoing Racism. Findings highlight the collaborative process used to identify and implement this novel method and the adaptability of this technique in the interdisciplinary exploration of system-level changes to understand and address disparities in breast cancer and cancer care. PMID:24000307

  18. Atomic Detail Visualization of Photosynthetic Membranes with GPU-Accelerated Ray Tracing

    PubMed Central

    Vandivort, Kirby L.; Barragan, Angela; Singharoy, Abhishek; Teo, Ivan; Ribeiro, João V.; Isralewitz, Barry; Liu, Bo; Goh, Boon Chong; Phillips, James C.; MacGregor-Chatwin, Craig; Johnson, Matthew P.; Kourkoutis, Lena F.; Hunter, C. Neil

    2016-01-01

    The cellular process responsible for providing energy for most life on Earth, namely photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. We present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. We describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers. PMID:27274603

  19. Decorin content and near infrared spectroscopy analysis of dried collagenous biomaterial samples.

    PubMed

    Aldema-Ramos, Mila L; Castell, Joan Carles; Muir, Zerlina E; Adzet, Jose Maria; Sabe, Rosa; Schreyer, Suzanne

    2012-12-14

    The efficient removal of proteoglycans, such as decorin, from the hide when processing it to leather by traditional means is generally acceptable and beneficial for leather quality, especially for softness and flexibility. A patented waterless or acetone dehydration method that can generate a product similar to leather called Dried Collagenous Biomaterial (known as BCD) was developed but has no effect on decorin removal efficiency. The Alcian Blue colorimetric technique was used to assay the sulfated glycosaminoglycan (sGAG) portion of decorin. The corresponding residual decorin content was correlated to the mechanical properties of the BCD samples and was comparable to the control leather made traditionally. The waterless dehydration and instantaneous chrome tanning process is a good eco-friendly alternative to transforming hides to leather because no additional effects were observed after examination using NIR spectroscopy and additional chemometric analysis.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khalil, C.N.

    Formation of paraffin (wax) in cold deepwater flowlines is a major problem for offshore operators of such facilities. Petrobras faces this problem continuously in its deepwater operations in the Campos basin, offshore Brazil. Since 1990, through its Petrobras Research Center (CENPES), the company has developed, extensively field tested, and recently commercialized, a novel technique for chemically removing such wax depositions. The process involves mixing and introducing to the line, two inorganic salts and organic solvents. The ensuing chemical reaction--which both generates nitrogen and heats the inside of the blocked flowline--allows the solvent to dissolve and dislodge the buildup, which is then flushed from the line. The process is called the Nitrogen Generation System (SGN). Petrobras/CENPES has recently formed a joint venture with the Brazilian service company Maritima Navegacao e Engenharia Ltda. to offer SGN services worldwide.

  1. The neural basis of body form and body action agnosia.

    PubMed

    Moro, Valentina; Urgesi, Cosimo; Pernigo, Simone; Lanteri, Paola; Pazzaglia, Mariella; Aglioti, Salvatore Maria

    2008-10-23

    Visual analysis of faces and nonfacial body stimuli brings about neural activity in different cortical areas. Moreover, processing body form and body action relies on distinct neural substrates. Although brain lesion studies show specific face processing deficits, neuropsychological evidence for defective recognition of nonfacial body parts is lacking. By combining psychophysics studies with lesion-mapping techniques, we found that lesions of ventromedial, occipitotemporal areas induce face and body recognition deficits while lesions involving extrastriate body area seem causatively associated with impaired recognition of body but not of face and object stimuli. We also found that body form and body action recognition deficits can be double dissociated and are causatively associated with lesions to extrastriate body area and ventral premotor cortex, respectively. Our study reports two category-specific visual deficits, called body form and body action agnosia, and highlights their neural underpinnings.

  2. Developing tools for digital radar image data evaluation

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.; Raggam, J.

    1986-01-01

    The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment to manipulate and analyze airborne and satellite radar images. One aim is to create radar products for the user from the original data to enhance the ease of understanding the contents. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.

  3. Costing bias in economic evaluations.

    PubMed

    Frappier, Julie; Tremblay, Gabriel; Charny, Mark; Cloutier, L Martin

    2015-01-01

    Determining the cost-effectiveness of healthcare interventions is key to the decision-making process in healthcare. Cost comparisons are used to demonstrate the economic value of treatment options, to evaluate the impact on the insurer budget, and are often used as a key criterion in treatment comparison and comparative effectiveness; however, little guidance is available to researchers for establishing the costing of clinical events and resource utilization. Different costing methods exist, and the choice of underlying assumptions appears to have a significant impact on the results of the costing analysis. This editorial describes the importance of the choice of the costing technique and its potential impact on the relative cost of treatment options. This editorial also calls for a more efficient approach to healthcare intervention costing in order to ensure the use of consistent costing in the decision-making process.

  4. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Stoughton, John W.; Mielke, Roland R.; Som, Sukhamony

    1990-01-01

    The performance modeling and enhancement for periodic execution of large-grain, decision-free algorithms in data flow architectures is examined. Applications include real-time implementation of control and signal processing algorithms where performance is required to be highly predictable. The mapping of algorithms onto the specified class of data flow architectures is realized by a marked graph model called ATAMM (Algorithm To Architecture Mapping Model). Performance measures and bounds are established. Algorithm transformation techniques are identified for performance enhancement and reduction of resource (computing element) requirements. A systematic design procedure is described for generating operating conditions for predictable performance both with and without resource constraints. An ATAMM simulator is used to test and validate the performance prediction by the design procedure. Experiments on a three resource testbed provide verification of the ATAMM model and the design procedure.

  5. Strategies for concurrent processing of complex algorithms in data driven architectures

    NASA Technical Reports Server (NTRS)

    Som, Sukhamoy; Stoughton, John W.; Mielke, Roland R.

    1990-01-01

    Performance modeling and performance enhancement for periodic execution of large-grain, decision-free algorithms in data flow architectures are discussed. Applications include real-time implementation of control and signal processing algorithms where performance is required to be highly predictable. The mapping of algorithms onto the specified class of data flow architectures is realized by a marked graph model called algorithm to architecture mapping model (ATAMM). Performance measures and bounds are established. Algorithm transformation techniques are identified for performance enhancement and reduction of resource (computing element) requirements. A systematic design procedure is described for generating operating conditions for predictable performance both with and without resource constraints. An ATAMM simulator is used to test and validate the performance prediction by the design procedure. Experiments on a three resource testbed provide verification of the ATAMM model and the design procedure.

  6. Directional frequency and recording (DIFAR) sensors in seafloor recorders to locate calling bowhead whales during their fall migration.

    PubMed

    Greene, Charles R; McLennan, Miles Wm; Norman, Robert G; McDonald, Trent L; Jakubczak, Ray S; Richardson, W John

    2004-08-01

    Bowhead whales, Balaena mysticetus, migrate west during fall approximately 10-75 km off the north coast of Alaska, passing the petroleum developments around Prudhoe Bay. Oil production operations on an artificial island 5 km offshore create sounds heard by some whales. As part of an effort to assess whether migrating whales deflect farther offshore at times with high industrial noise, an acoustical approach was selected for localizing calling whales. The technique incorporated DIFAR (directional frequency and recording) sonobuoy techniques. An array of 11 DASARs (directional autonomous seafloor acoustic recorders) was built and installed with unit-to-unit separation of 5 km. When two or more DASARs detected the same call, the whale location was determined from the bearing intersections. This article describes the acoustic methods used to determine the locations of the calling bowhead whales and shows the types and precision of the data acquired. Calibration transmissions at GPS-measured times and locations provided measures of the individual DASAR clock drift and directional orientation. The standard error of the bearing measurements at distances of 3-4 km was approximately 1.35 degrees after corrections for gain imbalance in the two directional sensors. During 23 days in 2002, 10,587 bowhead calls were detected and 8383 were localized.
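
    The core bearing-intersection step can be illustrated with a small flat-earth calculation (x = east, y = north, bearings clockwise from north), as below. The sensor positions and bearings are illustrative, and the study's actual localization also handles clock drift, bearing corrections, and more than two sensors.

      import numpy as np  # two-bearing intersection on a local flat-earth x/y grid
                          # (x = east, y = north, bearings clockwise from north);
                          # a minimal illustration, not the study's localization code.

      def intersect_bearings(p1, brg1_deg, p2, brg2_deg):
          d1 = np.array([np.sin(np.radians(brg1_deg)), np.cos(np.radians(brg1_deg))])
          d2 = np.array([np.sin(np.radians(brg2_deg)), np.cos(np.radians(brg2_deg))])
          # Solve p1 + t1*d1 = p2 + t2*d2 for the ranges t1, t2.
          A = np.column_stack([d1, -d2])
          t = np.linalg.solve(A, np.array(p2) - np.array(p1))
          return np.array(p1) + t[0] * d1

      # Two recorders 5 km apart, each reporting a bearing to the same call (illustrative).
      dasar_a, dasar_b = (0.0, 0.0), (5000.0, 0.0)
      print(intersect_bearings(dasar_a, 45.0, dasar_b, 315.0))   # -> [2500. 2500.]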

  7. Variable horizon in a peridynamic medium

    DOE PAGES

    Silling, Stewart A.; Littlewood, David J.; Seleson, Pablo

    2015-12-10

    Here, a notion of material homogeneity is proposed for peridynamic bodies with variable horizon but constant bulk properties. A relation is derived that scales the force state according to the position-dependent horizon while keeping the bulk properties unchanged. Using this scaling relation, if the horizon depends on position, artifacts called ghost forces may arise in a body under a homogeneous deformation. These artifacts depend on the second derivative of the horizon and can be reduced by employing a modified equilibrium equation using a new quantity called the partial stress. Bodies with piecewise constant horizon can be modeled without ghost forces by using a simpler technique called a splice. As a limiting case of zero horizon, both the partial stress and splice techniques can be used to achieve local-nonlocal coupling. Computational examples, including dynamic fracture in a one-dimensional model with local-nonlocal coupling, illustrate the methods.

  8. Creating a web-enhanced interactive preclinic technique manual: case report and student response.

    PubMed

    Boberick, Kenneth G

    2004-12-01

    This article describes the development, use, and student response to an online manual developed with off-the-shelf software and made available using a web-based course management system (Blackboard) that was used to transform a freshman restorative preclinical technique course from a lecture-only course into an interactive web-enhanced course. The goals of the project were to develop and implement a web-enhanced interactive learning experience in a preclinical restorative technique course and shift preclinical education from a teacher-centered experience to a student-driven experience. The project was evaluated using an anonymous post-course survey (95 percent response rate) of 123 freshman students that assessed enabling (technical support and access to the technology), process (the actual experience and usability), and outcome criteria (acquisition and successful use of the knowledge gained and skills learned) of the online manual. Students responded favorably to sections called "slide galleries" where ideal and non-ideal examples of projects could be viewed. Causes, solutions, and preventive measures were provided for the errors shown. Sections called "slide series" provided cookbook directions allowing for self-paced and student-directed learning. Virtually all of the students, 99 percent, found the quality of the streaming videos adequate to excellent. Regarding Internet connections and video viewing, 65 percent of students successfully viewed the videos from a remote site; cable connections were the most reliable, dial-up connections were inadequate, and DSL connections were variable. Seventy-three percent of the students felt the videos were an effective substitute for in-class demonstrations. Students preferred video with sound over video with subtitles and preferred short video clips embedded in the text over compilation videos. The results showed it is possible to develop and implement web-enhanced and interactive dental education in a preclinical restorative technique course that successfully delivered information beyond the textual format.

  9. Product, not process! Explaining a basic concept in agricultural biotechnologies and food safety.

    PubMed

    Tagliabue, Giovanni

    2017-12-01

    Most life scientists have relentlessly recommended any evaluative approach of agri-food products to be based on examination of the phenotype, i.e. the actual characteristics of the food, feed and fiber varieties: the effects of any new cultivar (or micro-organism, animal) on our health are not dependent on the process(es), the techniques used to obtain it. The so-called "genetically modified organisms" ("GMOs"), on the other hand, are commonly framed as a group with special properties - most frequently seen as dubious, or even harmful. Some social scientists still believe that considering the process is a correct background for science-based understanding and regulation. To show that such an approach is utterly wrong, and to invite scientists, teachers and science communicators to explain this mistake to students, policy-makers and the public at large, we imagined a dialogue between a social scientist, who has a positive opinion about a certain weight that a process-based orientation should have in the risk assessment, and a few experts who offer plenty of arguments against that view. The discussion focuses on new food safety.

  10. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using a think-aloud technique and video recording, we captured their on-screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale of their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found the experts' modeling processes followed the linear sequence built into the modeling program with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered and represented by specialized technical terms. Based on the above findings, we made suggestions for improving model-based science teaching and learning using Model-It.

  11. Refrigerated fruit juices: quality and safety issues.

    PubMed

    Esteve, Maria Jose; Frígola, Ana

    2007-01-01

    Fruit juices are an important source of bioactive compounds, but techniques used for their processing and subsequent storage may cause alterations in their contents, so that they do not provide the benefits expected by the consumer. In recent years consumers have increasingly sought so-called "fresh" (fresh-like) products, stored under refrigeration. This has led the food industry to develop alternative processing technologies to produce foods with a minimum of nutritional, physicochemical, or organoleptic changes induced by the technologies themselves. Attention has also focused on evaluating the microbiological or toxicological risks that may be involved in applying these processes, and their effect on food safety, in order to obtain safe products that do not present health risks. This concept of minimal processing is currently becoming a reality with conventional technologies (mild pasteurization) and nonthermal technologies, some recently introduced (pasteurization by high hydrostatic pressure) and some perhaps with a more important role in the future (pulsed electric fields). Nevertheless, processing is not the only factor that affects the quality of these products. It is also necessary to consider the conditions for refrigerated storage and to control time and temperature.

  12. Real-time high-velocity resolution color Doppler OCT

    NASA Astrophysics Data System (ADS)

    Westphal, Volker; Yazdanfar, Siavash; Rollins, Andrew M.; Izatt, Joseph A.

    2001-05-01

    Color Doppler optical coherence tomography (CDOCT, also called Optical Doppler Tomography) is a noninvasive optical imaging technique which allows for micron-scale physiological flow mapping simultaneous with morphological OCT imaging. Current systems for real-time endoscopic optical coherence tomography (EOCT) would be enhanced by the capability to visualize sub-surface blood flow for applications in early cancer diagnosis and the management of bleeding ulcers. Unfortunately, previous implementations of CDOCT have either been sufficiently computationally expensive (employing Fourier or Hilbert transform techniques) to rule out real-time imaging of flow, or have been restricted to imaging of excessively high flow velocities when used in real time. We have developed a novel Doppler OCT signal-processing strategy capable of imaging physiological flow rates in real time. This strategy employs cross-correlation processing of sequential A-scans in an EOCT image, as opposed to autocorrelation processing as described previously. To measure Doppler shifts in the kHz range using this technique, it was necessary to stabilize the EOCT interferometer center frequency, eliminate parasitic phase noise, and construct a digital cross-correlation unit able to correlate signals of megahertz bandwidth at a fixed lag of up to a few ms. The performance of the color Doppler OCT system was evaluated in a flow phantom, demonstrating a minimum detectable flow velocity of ~0.8 mm/s at a data acquisition rate of 8 images/second (with 480 A-scans/image) using a handheld probe. Dynamic flow was imaged, including freehand use of the probe. Flow was also detectable in a phantom when the system was combined with a clinically usable endoscopic probe.

  13. Uncovering Pompeii: Examining Evidence.

    ERIC Educational Resources Information Center

    Yell, Michael M.

    2001-01-01

    Presents a lesson plan on Pompeii (Italy) for middle school students that utilizes a teaching technique called interactive presentation. Describes the technique's five phases: (1) discrepant event inquiry; (2) discussion/presentation; (3) cooperative learning activity; (4) writing for understanding activity; and (5) whole-class discussion and…

  14. A New Approach to Micro-arcsecond Astrometry with SIM Allowing Early Mission Narrow Angle Measurements of Compelling Astronomical Targets

    NASA Technical Reports Server (NTRS)

    Shaklan, Stuart; Pan, Xiaopei

    2004-01-01

    The Space Interferometry Mission (SIM) is capable of detecting and measuring the mass of terrestrial planets around stars other than our own. It can measure the mass of black holes and the visual orbits of radio and x-ray binary sources. SIM makes possible a new level of understanding of complex astrophysical processes. SIM achieves its high precision in the so-called narrow-angle regime. This is defined by a 1 degree diameter field in which the position of a target star is measured with respect to a set of reference stars. The observation is performed in two parts: first, SIM observes a grid of stars that spans the full sky. After a few years, repeated observations of the grid allow one to determine the orientation of the interferometer baseline. Second, throughout the mission, SIM periodically observes in the narrow-angle mode. Every narrow-angle observation is linked to the grid to determine the precise attitude and length of the baseline. The narrow angle process demands patience. It is not until five years after launch that SIM achieves its ultimate accuracy of 1 microarcsecond. The accuracy is degraded by a factor of approx. 2 at mid-mission. Our work proposes a technique for narrow angle astrometry that does not rely on the measurement of grid stars. This technique, called Gridless Narrow Angle Astrometry (GNAA) can obtain microarcsecond accuracy and can detect extra-solar planets and other exciting objects with a few days of observation. It can be applied as early as during the first six months of in-orbit calibration (IOC). The motivations for doing this are strong. First, and obviously, it is an insurance policy against a catastrophic mid-mission failure. Second, at the start of the mission, with several space-based interferometers in the planning or implementation phase, NASA will be eager to capture the public's imagination with interferometric science. Third, early results and a technique that can duplicate those results throughout the mission will give the analysts important experience in the proper use and calibration of SIM.

  15. Alarm Fatigue vs User Expectations Regarding Context-Aware Alarm Handling in Hospital Environments Using CallMeSmart.

    PubMed

    Solvoll, Terje; Arntsen, Harald; Hartvigsen, Gunnar

    2017-01-01

    Surveys and research show that mobile communication systems in hospital settings are old and cause frequent interruptions. In the quest to remedy this, an Android-based communication system called CallMeSmart tries to encapsulate most of the frequent communication into one handheld device, focusing on reducing interruptions while at the same time making the workday easier for healthcare workers. The objective of CallMeSmart is to use context-awareness techniques to automatically monitor the availability of physicians and nurses, and to use this information to prevent or route phone calls, text messages, pages and alarms that would otherwise compromise patient care. In this paper, we present the results from interviewing nurses on alarm fatigue and their expectations regarding context-aware alarm handling using CallMeSmart.

  16. Sex differences in the representation of call stimuli in a songbird secondary auditory area

    PubMed Central

    Giret, Nicolas; Menardy, Fabien; Del Negro, Catherine

    2015-01-01

    Understanding how communication sounds are encoded in the central auditory system is critical to deciphering the neural bases of acoustic communication. Songbirds use learned or unlearned vocalizations in a variety of social interactions. They have telencephalic auditory areas specialized for processing natural sounds and considered as playing a critical role in the discrimination of behaviorally relevant vocal sounds. The zebra finch, a highly social songbird species, forms lifelong pair bonds. Only male zebra finches sing. However, both sexes produce the distance call when placed in visual isolation. This call is sexually dimorphic, is learned only in males and provides support for individual recognition in both sexes. Here, we assessed whether auditory processing of distance calls differs between paired males and females by recording spiking activity in a secondary auditory area, the caudolateral mesopallium (CLM), while presenting the distance calls of a variety of individuals, including the bird itself, the mate, familiar and unfamiliar males and females. In males, the CLM is potentially involved in auditory feedback processing important for vocal learning. Based on both the analyses of spike rates and temporal aspects of discharges, our results clearly indicate that call-evoked responses of CLM neurons are sexually dimorphic, being stronger, lasting longer, and conveying more information about calls in males than in females. In addition, how auditory responses vary among call types differ between sexes. In females, response strength differs between familiar male and female calls. In males, temporal features of responses reveal a sensitivity to the bird's own call. These findings provide evidence that sexual dimorphism occurs in higher-order processing areas within the auditory system. They suggest a sexual dimorphism in the function of the CLM, contributing to transmit information about the self-generated calls in males and to storage of information about the bird's auditory experience in females. PMID:26578918

  17. Sex differences in the representation of call stimuli in a songbird secondary auditory area.

    PubMed

    Giret, Nicolas; Menardy, Fabien; Del Negro, Catherine

    2015-01-01

    Understanding how communication sounds are encoded in the central auditory system is critical to deciphering the neural bases of acoustic communication. Songbirds use learned or unlearned vocalizations in a variety of social interactions. They have telencephalic auditory areas specialized for processing natural sounds and considered as playing a critical role in the discrimination of behaviorally relevant vocal sounds. The zebra finch, a highly social songbird species, forms lifelong pair bonds. Only male zebra finches sing. However, both sexes produce the distance call when placed in visual isolation. This call is sexually dimorphic, is learned only in males and provides support for individual recognition in both sexes. Here, we assessed whether auditory processing of distance calls differs between paired males and females by recording spiking activity in a secondary auditory area, the caudolateral mesopallium (CLM), while presenting the distance calls of a variety of individuals, including the bird itself, the mate, familiar and unfamiliar males and females. In males, the CLM is potentially involved in auditory feedback processing important for vocal learning. Based on both the analyses of spike rates and temporal aspects of discharges, our results clearly indicate that call-evoked responses of CLM neurons are sexually dimorphic, being stronger, lasting longer, and conveying more information about calls in males than in females. In addition, how auditory responses vary among call types differ between sexes. In females, response strength differs between familiar male and female calls. In males, temporal features of responses reveal a sensitivity to the bird's own call. These findings provide evidence that sexual dimorphism occurs in higher-order processing areas within the auditory system. They suggest a sexual dimorphism in the function of the CLM, contributing to transmit information about the self-generated calls in males and to storage of information about the bird's auditory experience in females.

  18. White blood cell segmentation by circle detection using electromagnetism-like optimization.

    PubMed

    Cuevas, Erik; Oliva, Diego; Díaz, Margarita; Zaldivar, Daniel; Pérez-Cisneros, Marco; Pajares, Gonzalo

    2013-01-01

    Medical imaging is a relevant field of application of image processing algorithms. In particular, the analysis of white blood cell (WBC) images has engaged researchers from the fields of medicine and computer vision alike. Since WBCs can be approximated by a quasicircular form, a circular detector algorithm may be successfully applied. This paper presents an algorithm for the automatic detection of white blood cells embedded into complicated and cluttered smear images that considers the complete process as a circle detection problem. The approach is based on a nature-inspired technique called the electromagnetism-like optimization (EMO) algorithm which is a heuristic method that follows electromagnetism principles for solving complex optimization problems. The proposed approach uses an objective function which measures the resemblance of a candidate circle to an actual WBC. Guided by the values of this objective function, the set of encoded candidate circles is evolved using EMO, so that they can fit into the actual blood cells contained in the edge map of the image. Experimental results from blood cell images with a varying range of complexity are included to validate the efficiency of the proposed technique regarding detection, robustness, and stability.
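
    The heart of this approach is the objective function that scores a candidate circle against the edge map. As a rough, hedged illustration of that idea (this is not the authors' EMO implementation; the function name and circumference-sampling scheme are hypothetical), a candidate circle can be scored by the fraction of points sampled along its circumference that land on edge pixels:

```python
import numpy as np

def circle_match_score(edge_map, cx, cy, r, n_points=180):
    """Score a candidate circle (cx, cy, r) against a binary edge map.

    Returns the fraction of points sampled on the circumference that
    coincide with edge pixels: 1.0 is a perfect fit, 0.0 means no overlap.
    Points falling outside the image count as misses.
    """
    angles = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    xs = np.round(cx + r * np.cos(angles)).astype(int)
    ys = np.round(cy + r * np.sin(angles)).astype(int)
    h, w = edge_map.shape
    inside = (xs >= 0) & (xs < w) & (ys >= 0) & (ys < h)
    hits = edge_map[ys[inside], xs[inside]] > 0
    return hits.sum() / n_points

# An optimizer such as EMO would evolve (cx, cy, r) triples to maximize this score.
```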

  19. Adapting hierarchical bidirectional inter prediction on a GPU-based platform for 2D and 3D H.264 video coding

    NASA Astrophysics Data System (ADS)

    Rodríguez-Sánchez, Rafael; Martínez, José Luis; Cock, Jan De; Fernández-Escribano, Gerardo; Pieters, Bart; Sánchez, José L.; Claver, José M.; de Walle, Rik Van

    2013-12-01

    The H.264/AVC video coding standard introduces some improved tools in order to increase compression efficiency. Moreover, the multi-view extension of H.264/AVC, called H.264/MVC, adopts many of them. Among the new features, variable block-size motion estimation is one which contributes to high coding efficiency. Furthermore, it defines a different prediction structure that includes hierarchical bidirectional pictures, outperforming traditional Group of Pictures patterns in both scenarios: single-view and multi-view. However, these video coding techniques have high computational complexity. Several techniques have been proposed in the literature over the last few years which are aimed at accelerating the inter prediction process, but there are no works focusing on bidirectional prediction or hierarchical prediction. In this article, with the emergence of many-core processors or accelerators, a step forward is taken towards an implementation of an H.264/AVC and H.264/MVC inter prediction algorithm on a graphics processing unit. The results show a negligible rate distortion drop with a time reduction of up to 98% for the complete H.264/AVC encoder.
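
    The computational bottleneck targeted here is the inter prediction (motion estimation) stage, which for every block searches a reference picture for the best match. As a hedged, purely illustrative sketch of what that inner loop computes (a brute-force sum-of-absolute-differences search over a small window, not the paper's GPU kernel or the full H.264 rate-distortion search), consider:

```python
import numpy as np

def best_motion_vector(cur_block, ref_frame, bx, by, search_range=8):
    """Full-search block matching: find the motion vector minimizing SAD.

    cur_block : the block from the current frame (e.g., 16x16 or 8x8)
    ref_frame : the reference frame as a 2D array
    (bx, by)  : top-left position of the block in the current frame
    """
    h, w = cur_block.shape
    best_sad, best_mv = np.inf, (0, 0)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            x, y = bx + dx, by + dy
            # skip candidate positions that fall outside the reference frame
            if x < 0 or y < 0 or x + w > ref_frame.shape[1] or y + h > ref_frame.shape[0]:
                continue
            candidate = ref_frame[y:y + h, x:x + w].astype(int)
            sad = np.abs(cur_block.astype(int) - candidate).sum()
            if sad < best_sad:
                best_sad, best_mv = sad, (dx, dy)
    return best_mv, best_sad
```

    Because this search is independent for every block, displacement and block size, it maps naturally onto many-core hardware, which is the property the article exploits.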

  20. White Blood Cell Segmentation by Circle Detection Using Electromagnetism-Like Optimization

    PubMed Central

    Oliva, Diego; Díaz, Margarita; Zaldivar, Daniel; Pérez-Cisneros, Marco; Pajares, Gonzalo

    2013-01-01

    Medical imaging is a relevant field of application of image processing algorithms. In particular, the analysis of white blood cell (WBC) images has engaged researchers from the fields of medicine and computer vision alike. Since WBCs can be approximated by a quasicircular form, a circular detector algorithm may be successfully applied. This paper presents an algorithm for the automatic detection of white blood cells embedded into complicated and cluttered smear images that considers the complete process as a circle detection problem. The approach is based on a nature-inspired technique called the electromagnetism-like optimization (EMO) algorithm which is a heuristic method that follows electromagnetism principles for solving complex optimization problems. The proposed approach uses an objective function which measures the resemblance of a candidate circle to an actual WBC. Guided by the values of this objective function, the set of encoded candidate circles is evolved using EMO, so that they can fit into the actual blood cells contained in the edge map of the image. Experimental results from blood cell images with a varying range of complexity are included to validate the efficiency of the proposed technique regarding detection, robustness, and stability. PMID:23476713

  1. Single-Molecule Imaging of Cellular Signaling

    NASA Astrophysics Data System (ADS)

    De Keijzer, Sandra; Snaar-Jagalska, B. Ewa; Spaink, Herman P.; Schmidt, Thomas

    Single-molecule microscopy is an emerging technique to understand the function of a protein in the context of its natural environment. In our laboratory this technique has been used to study the dynamics of signal transduction in vivo. A multitude of signal transduction cascades are initiated by interactions between proteins in the plasma membrane. These cascades start with the binding of a ligand to its receptor, thereby activating downstream signaling pathways which finally result in complex cellular responses. To fully understand these processes it is important to study the initial steps of the signaling cascades. Standard biological assays mostly call for overexpression of the proteins and high concentrations of ligand. This sets severe limits to the interpretation of, for instance, the time-course of the observations, given the large temporal spread caused by the diffusion-limited binding processes. Methods and limitations of single-molecule microscopy for the study of cell signaling are discussed using the example of the chemotactic signaling of the slime mold Dictyostelium discoideum. Single-molecule studies, as reviewed in this chapter, appear to be one of the essential methodologies for the full spatiotemporal clarification of cellular signaling, one of the ultimate goals in cell biology.

  2. Manufacturability: from design to SPC limits through "corner-lot" characterization

    NASA Astrophysics Data System (ADS)

    Hogan, Timothy J.; Baker, James C.; Wesneski, Lisa; Black, Robert S.; Rothenbury, Dave

    2004-12-01

    Texas Instruments' Digital Micro-mirror Device (DMD) is used in a wide variety of optical display applications ranging from fixed and portable projectors to high-definition television (HDTV) to digital cinema projection systems. A new DMD pixel architecture, called "FTP", was designed and qualified by Texas Instruments' DLP™ Group in 2003 to meet increased performance objectives for brightness and contrast ratio. Coordination between design, test and fabrication groups was required to balance pixel performance requirements and manufacturing capability. "Corner Lot" designed experiments (DOE) were used to verify the "fabrication space" available for the pixel design. The corner lot technique allows confirmation of manufacturability projections early in the design/qualification cycle. Through careful design and analysis of the corner-lot DOE, a balance of critical dimension (CD) "budgets" is possible so that specification and process control limits can be established that meet both customer and factory requirements. The application of corner-lot DOE is illustrated in a case history of the DMD "FTP" pixel. The process for balancing test parameter requirements with multiple critical dimension budgets is shown. MEMS/MOEMS device design and fabrication can use similar techniques to achieve aggressive design-to-qualification goals.

  3. Manufacturability: from design to SPC limits through "corner-lot" characterization

    NASA Astrophysics Data System (ADS)

    Hogan, Timothy J.; Baker, James C.; Wesneski, Lisa; Black, Robert S.; Rothenbury, Dave

    2005-01-01

    Texas Instruments' Digital Micro-mirror Device (DMD) is used in a wide variety of optical display applications ranging from fixed and portable projectors to high-definition television (HDTV) to digital cinema projection systems. A new DMD pixel architecture, called "FTP", was designed and qualified by Texas Instruments' DLP™ Group in 2003 to meet increased performance objectives for brightness and contrast ratio. Coordination between design, test and fabrication groups was required to balance pixel performance requirements and manufacturing capability. "Corner Lot" designed experiments (DOE) were used to verify the "fabrication space" available for the pixel design. The corner lot technique allows confirmation of manufacturability projections early in the design/qualification cycle. Through careful design and analysis of the corner-lot DOE, a balance of critical dimension (CD) "budgets" is possible so that specification and process control limits can be established that meet both customer and factory requirements. The application of corner-lot DOE is illustrated in a case history of the DMD "FTP" pixel. The process for balancing test parameter requirements with multiple critical dimension budgets is shown. MEMS/MOEMS device design and fabrication can use similar techniques to achieve aggressive design-to-qualification goals.

  4. 78 FR 70957 - 60-Day Notice of Proposed Information Collection: HUD-Owned Real Estate Good Neighbor Next Door...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ... calling the toll-free Federal Relay Service at (800) 877-8339. FOR FURTHER INFORMATION CONTACT: Ivery W... number through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available... automated collection techniques or other forms of information technology, e.g., permitting electronic...

  5. 78 FR 67384 - 60-Day Notice of Proposed Information Collection: FHA-Insured Mortgage Loan Servicing Involving...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... hearing or speech impairments may access this number through TTY by calling the toll-free Federal Relay... calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available documents submitted to... techniques or other forms of information technology, e.g., permitting electronic submission of responses. HUD...

  6. 78 FR 75364 - 60-Day Notice of Proposed Information Collection: Application for FHA Insured Mortgages

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-11

    ... through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. FOR FURTHER INFORMATION... through TTY by calling the toll-free Federal Relay Service at (800) 877-8339. Copies of available... techniques or other forms of information technology, e.g., permitting electronic submission of responses. HUD...

  7. Encourage Students to Read through the Use of Data Visualization

    ERIC Educational Resources Information Center

    Bandeen, Heather M.; Sawin, Jason E.

    2012-01-01

    Instructors are always looking for new ways to engage students in reading assignments. The authors present a few techniques that rely on a web-based data visualization tool called Wordle (wordle.net). Wordle creates word frequency representations called word clouds. The larger a word appears within a cloud, the more frequently it occurs within a…

  8. Individual identity and affective valence in marmoset calls: in vivo brain imaging with vocal sound playback.

    PubMed

    Kato, Masaki; Yokoyama, Chihiro; Kawasaki, Akihiro; Takeda, Chiho; Koike, Taku; Onoe, Hirotaka; Iriki, Atsushi

    2018-05-01

    As with humans, vocal communication is an important social tool for nonhuman primates. Common marmosets (Callithrix jacchus) often produce whistle-like 'phee' calls when they are visually separated from conspecifics. The neural processes specific to phee call perception, however, are largely unknown, despite the possibility that these processes involve social information. Here, we examined behavioral and whole-brain mapping evidence regarding the detection of individual conspecific phee calls using an audio playback procedure. Phee calls evoked sound exploratory responses when the caller changed, indicating that marmosets can discriminate between caller identities. Positron emission tomography with [18F]fluorodeoxyglucose revealed that perception of phee calls from a single subject was associated with activity in the dorsolateral prefrontal, medial prefrontal, orbitofrontal cortices, and the amygdala. These findings suggest that these regions are implicated in cognitive and affective processing of salient social information. However, phee calls from multiple subjects induced brain activation in only some of these regions, such as the dorsolateral prefrontal cortex. We also found distinctive brain deactivation and functional connectivity associated with phee call perception depending on the caller change. According to changes in pupillary size, phee calls from a single subject induced a higher arousal level compared with those from multiple subjects. These results suggest that marmoset phee calls convey information about individual identity and affective valence depending on the consistency or variability of the caller. Given the flexible perception of calls based on individual recognition, humans and marmosets may share some neural mechanisms underlying conspecific vocal perception.

  9. Comparison of fMRI paradigms assessing visuospatial processing: Robustness and reproducibility

    PubMed Central

    Herholz, Peer; Zimmermann, Kristin M.; Westermann, Stefan; Frässle, Stefan; Jansen, Andreas

    2017-01-01

    The development of brain imaging techniques, in particular functional magnetic resonance imaging (fMRI), made it possible to non-invasively study the hemispheric lateralization of cognitive brain functions in large cohorts. Comprehensive models of hemispheric lateralization are, however, still missing and should not only account for the hemispheric specialization of individual brain functions, but also for the interactions among different lateralized cognitive processes (e.g., language and visuospatial processing). This calls for robust and reliable paradigms to study hemispheric lateralization for various cognitive functions. While numerous reliable imaging paradigms have been developed for language, which represents the most prominent left-lateralized brain function, the reliability of imaging paradigms investigating typically right-lateralized brain functions, such as visuospatial processing, has received comparatively less attention. In the present study, we aimed to establish an fMRI paradigm that robustly and reliably identifies right-hemispheric activation evoked by visuospatial processing in individual subjects. In a first study, we therefore compared three frequently used paradigms for assessing visuospatial processing and evaluated their utility to robustly detect right-lateralized brain activity on a single-subject level. In a second study, we then assessed the test-retest reliability of the so-called Landmark task, the paradigm that yielded the most robust results in study 1. At the single-voxel level, we found poor reliability of the brain activation underlying visuospatial attention. This suggests that poor signal-to-noise ratios can become a limiting factor for test-retest reliability. This represents a common detriment of fMRI paradigms investigating visuospatial attention in general and therefore highlights the need for careful consideration of both the possibilities and limitations of the respective fMRI paradigm, in particular when one is interested in effects at the single-voxel level. Notably, however, when focusing on the reliability of measures of hemispheric lateralization (which was the main goal of study 2), we show that hemispheric dominance (quantified by the lateralization index, LI, with |LI| > 0.4) of the evoked activation could be robustly determined in more than 62% and, if considering only two categories (i.e., left, right), in more than 93% of our subjects. Furthermore, the reliability of the lateralization strength (LI) was “fair” to “good”. In conclusion, our results suggest that the degree of right-hemispheric dominance during visuospatial processing can be reliably determined using the Landmark task, both at the group and single-subject level, while at the same time stressing the need for future refinements of experimental paradigms and more sophisticated fMRI data acquisition techniques. PMID:29059201
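
    The lateralization index (LI) mentioned above is conventionally computed from activation measures in homologous left- and right-hemisphere regions as LI = (L - R) / (L + R); the exact activation measure varies between studies, so the sketch below is a generic illustration of the convention rather than the authors' pipeline:

```python
def lateralization_index(left_activation, right_activation):
    """Standard lateralization index: LI = (L - R) / (L + R).

    Positive values indicate left-hemispheric dominance, negative values
    right-hemispheric dominance. |LI| > 0.4 is the dominance threshold
    quoted in the abstract above.
    """
    total = left_activation + right_activation
    if total == 0:
        return 0.0
    return (left_activation - right_activation) / total

def classify_dominance(li, threshold=0.4):
    """Map an LI value onto the three-way dominance categories."""
    if li > threshold:
        return "left-dominant"
    if li < -threshold:
        return "right-dominant"
    return "bilateral"
```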

  10. Adiabatic reduction of a model of stochastic gene expression with jump Markov process.

    PubMed

    Yvinec, Romain; Zhuge, Changjing; Lei, Jinzhi; Mackey, Michael C

    2014-04-01

    This paper considers adiabatic reduction in a model of stochastic gene expression with bursting transcription considered as a jump Markov process. In this model, the process of gene expression with auto-regulation is described by fast/slow dynamics. The production of mRNA is assumed to follow a compound Poisson process occurring at a rate depending on protein levels (the phenomenon called bursting in molecular biology) and the production of protein is a linear function of mRNA numbers. When the dynamics of mRNA is assumed to be a fast process (due to faster mRNA degradation than that of protein) we prove that, with appropriate scalings in the burst rate, jump size or translational rate, the bursting phenomenon can be transmitted to the slow variable. We show that, depending on the scaling, the reduced equation is either a stochastic differential equation with a jump Poisson process or a deterministic ordinary differential equation. These results are significant because adiabatic reduction techniques seem not to have been rigorously justified for a stochastic differential system containing a jump Markov process. We expect that the results can be generalized to adiabatic methods in more general stochastic hybrid systems.
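
    A minimal simulation sketch of the fast/slow bursting model described above (the parameter names and the constant burst rate are illustrative assumptions; in the auto-regulated model of the paper the burst rate would be a function of the protein level): mRNA arrives in geometrically sized bursts via a compound Poisson process, degrades quickly, and drives protein production linearly.

```python
import numpy as np

def simulate_bursting(t_end=100.0, k_burst=1.0, mean_burst=10.0,
                      gamma_m=10.0, k_p=1.0, gamma_p=0.1, seed=0):
    """Minimal jump-process model of bursting gene expression.

    mRNA (m) is created in geometrically distributed bursts and decays at
    rate gamma_m * m; protein (p) obeys dp/dt = k_p * m - gamma_p * p
    between jumps. For auto-regulation, k_burst would become a function of p.
    """
    rng = np.random.default_rng(seed)
    t, m, p = 0.0, 0, 0.0
    history = [(t, m, p)]
    while t < t_end:
        burst_rate, decay_rate = k_burst, gamma_m * m
        total = burst_rate + decay_rate
        dt = rng.exponential(1.0 / total)
        # protein evolves deterministically between jumps (m constant over dt)
        p = p * np.exp(-gamma_p * dt) + (k_p * m / gamma_p) * (1 - np.exp(-gamma_p * dt))
        t += dt
        if rng.random() < burst_rate / total:
            m += rng.geometric(1.0 / mean_burst)   # compound-Poisson burst of mRNA
        else:
            m -= 1                                  # single mRNA degradation
        history.append((t, m, p))
    return history
```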

  11. Communication and cooperation in underwater acoustic networks

    NASA Astrophysics Data System (ADS)

    Yerramalli, Srinivas

    In this thesis, we present a study of several problems related to underwater point-to-point communications and network formation. We explore techniques to improve the achievable data rate on a point-to-point link using better physical layer techniques and then study sensor cooperation, which improves the throughput and reliability in an underwater network. Robust point-to-point communication in underwater networks has become increasingly critical in several military and civilian applications. We present several physical layer signaling and detection techniques tailored to the underwater channel model to improve the reliability of data detection. First, we consider a simplified underwater channel model in which the time scale distortion on each path is assumed to be the same (a single-scale channel model, in contrast to the more general multi-scale model). A novel technique called Partial FFT Demodulation, which exploits the nature of OFDM signaling and the time scale distortion, is derived. It is observed that this new technique has some unique interference suppression properties and performs better than traditional equalizers in several scenarios of interest. Next, we consider the multi-scale model for the underwater channel and assume that single-scale processing is performed at the receiver. We then derive optimized front-end pre-processing techniques to reduce the interference caused during single-scale processing of signals transmitted on a multi-scale channel. We then propose an improved channel estimation technique using dictionary optimization methods for compressive sensing and show that significant performance gains can be obtained using this technique. In the next part of this thesis, we consider the problem of sensor node cooperation among rational nodes whose objective is to improve their individual data rates. We first consider the problem of transmitter cooperation in a multiple access channel, investigate the stability of the grand coalition of transmitters using tools from cooperative game theory, and show that the grand coalition is stable in both the asymptotic regimes of high and low SNR. Towards studying the problem of receiver cooperation for a broadcast channel, we propose a game theoretic model for the broadcast channel, derive a game theoretic duality between the multiple access and the broadcast channel, and show how the equilibria of the broadcast channel are related to those of the multiple access channel and vice versa.

  12. Comparison of VRX CT scanners geometries

    NASA Astrophysics Data System (ADS)

    DiBianca, Frank A.; Melnyk, Roman; Duckworth, Christopher N.; Russ, Stephan; Jordan, Lawrence M.; Laughter, Joseph S.

    2001-06-01

    A technique called Variable-Resolution X-ray (VRX) detection greatly increases the spatial resolution in computed tomography (CT) and digital radiography (DR) as the field size decreases. The technique is based on a principle called 'projective compression' that allows both the resolution element and the sampling distance of a CT detector to scale with the subject or field size. For very large (40 - 50 cm) field sizes, resolution exceeding 2 cy/mm is possible and for very small fields, microscopy is attainable with resolution exceeding 100 cy/mm. This paper compares the benefits obtainable with two different VRX detector geometries: the single-arm geometry and the dual-arm geometry. The analysis is based on Monte Carlo simulations and direct calculations. The results of this study indicate that the dual-arm system appears to have more advantages than the single-arm technique.

  13. Spectrum transformation for divergent iterations

    NASA Technical Reports Server (NTRS)

    Gupta, Murli M.

    1991-01-01

    Certain spectrum transformation techniques are described that can be used to transform a diverging iteration into a converging one. Two techniques, called spectrum scaling and spectrum enveloping, are considered, and how to obtain the optimum values of the transformation parameters is discussed. Numerical examples are given to show how these techniques can be used to transform diverging iterations into converging ones; they can also be used to accelerate the convergence of otherwise convergent iterations.
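
    As a hedged illustration of the general idea behind spectrum scaling (not necessarily the exact transformation in the report), a linear fixed-point iteration x_{k+1} = G x_k + b can be relaxed to x_{k+1} = (1 - w) x_k + w (G x_k + b); the iteration matrix becomes (1 - w) I + w G, so a suitable w can pull the spectral radius below 1 while leaving the fixed point unchanged.

```python
import numpy as np

def relaxed_iteration(G, b, omega, x0, n_iter=200):
    """Apply x_{k+1} = (1 - omega) x_k + omega (G x_k + b).

    The eigenvalues of the new iteration matrix are 1 - omega + omega * lambda_i,
    so a suitable omega can make the scheme converge even when the original
    iteration (omega = 1) diverges; the fixed point x = G x + b is unchanged.
    """
    x = np.array(x0, dtype=float)
    for _ in range(n_iter):
        x = (1 - omega) * x + omega * (G @ x + b)
    return x

# Example: G has an eigenvalue -1.5, so the plain iteration diverges,
# but omega = 0.5 maps the eigenvalues to 0.5 + 0.5*lambda_i, all inside (-1, 1).
G = np.array([[-1.5, 0.0], [0.0, 0.5]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(np.eye(2) - G, b)            # exact fixed point of x = G x + b
x_hat = relaxed_iteration(G, b, omega=0.5, x0=[0.0, 0.0])
print(np.allclose(x_hat, x_star))                      # True
```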

  14. Optimization of Online Searching by Pre-Recording the Search Statements: A Technique for the HP-2645A Terminal.

    ERIC Educational Resources Information Center

    Oberhauser, O. C.; Stebegg, K.

    1982-01-01

    Describes the terminal's capabilities, ways to store and call up lines of statements, cassette tapes needed during searches, and master tape's use for login storage. Advantages of the technique and two sources are listed. (RBF)

  15. Classification by diagnosing all absorption features (CDAF) for the most abundant minerals in airborne hyperspectral images

    NASA Astrophysics Data System (ADS)

    Mobasheri, Mohammad Reza; Ghamary-Asl, Mohsen

    2011-12-01

    Imaging through hyperspectral technology is a powerful tool that can be used to spectrally identify and spatially map materials based on their specific absorption characteristics in the electromagnetic spectrum. A robust method called Tetracorder has shown its effectiveness at material identification and mapping, using a set of algorithms within an expert system decision-making framework. In this study, using some stages of Tetracorder, a technique called classification by diagnosing all absorption features (CDAF) is introduced. This technique enables one to assign a class to the most abundant mineral in each pixel with high accuracy. The technique is based on the derivation of information from reflectance spectra of the image. This can be done through extraction of the spectral absorption features of any minerals from their respective laboratory-measured reflectance spectra and comparing them with those extracted from the pixels in the image. The CDAF technique was executed on an AVIRIS image, where the results show an overall accuracy of better than 96%.

  16. Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Brune, Ryan Carl

    Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved with respect to output pressure events in both magnitude and distribution. In order to address this need, a novel pressure measurement method, called the Profile Indentation Pressure Evaluation (PIPE) method, has been developed that systematically analyzes indentation patterns created with impulse events. Correlation with quasi-static test data and use of software-assisted analysis techniques allows for colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided the introduction of a design method for electromagnetic path actuator systems, where key geometrical variables are considered using a newly developed analysis method, which is called the Path Actuator Proximal Array (PAPA) pressure model. This model considers key current distribution and proximity effects and interprets generated pressure by considering the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow for pressure requirements to be calculated for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, the effect of geometry is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram. Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Overall, these studies comprehensively explore the effects of geometrical parameters on the magnitude and distribution of the pressure generated in impulse manufacturing, establishing key guidelines and models for continued development and implementation in commercial applications.

  17. Real-time processing of radar return on a parallel computer

    NASA Technical Reports Server (NTRS)

    Aalfs, David D.

    1992-01-01

    NASA is working with the FAA to demonstrate the feasibility of pulse Doppler radar as a candidate airborne sensor to detect low altitude windshears. The need to provide the pilot with timely information about possible hazards has motivated a demand for real-time processing of a radar return. Investigated here is parallel processing as a means of accommodating the high data rates required. A PC based parallel computer, called the transputer, is used to investigate issues in real time concurrent processing of radar signals. A transputer network is made up of an array of single instruction stream processors that can be networked in a variety of ways. They are easily reconfigured and software development is largely independent of the particular network topology. The performance of the transputer is evaluated in light of the computational requirements. A number of algorithms have been implemented on the transputers in OCCAM, a language specially designed for parallel processing. These include signal processing algorithms such as the Fast Fourier Transform (FFT), pulse-pair, and autoregressive modelling, as well as routing software to support concurrency. The most computationally intensive task is estimating the spectrum. Two approaches have been taken on this problem, the first and most conventional of which is to use the FFT. By using table look-ups for the basis function and other optimizing techniques, an algorithm has been developed that is sufficient for real time. The other approach is to model the signal as an autoregressive process and estimate the spectrum based on the model coefficients. This technique is attractive because it does not suffer from the spectral leakage problem inherent in the FFT. Benchmark tests indicate that autoregressive modeling is feasible in real time.
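
    Of the two spectral estimators mentioned, the autoregressive approach can be sketched compactly. The following is a generic Yule-Walker AR spectrum estimate in Python (the report's implementation ran in OCCAM on transputers, so this is illustrative only and the model order and grid size are arbitrary choices):

```python
import numpy as np

def ar_spectrum(x, order=4, n_freqs=256):
    """Estimate a power spectrum by fitting an AR model via Yule-Walker.

    Solves the Yule-Walker equations built from biased autocorrelation
    estimates, then evaluates sigma^2 / |1 - sum_k a_k e^{-i 2 pi f k}|^2
    on a grid of normalized frequencies (cycles per sample, 0 to 0.5).
    """
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])          # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:order + 1])       # driving-noise variance
    freqs = np.linspace(0.0, 0.5, n_freqs)
    k = np.arange(1, order + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
    return freqs, sigma2 / denom
```

    Because the spectrum is parameterized by only a handful of coefficients, this estimator avoids the spectral leakage of a windowed FFT, which is the attraction noted in the abstract.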

  18. Moditored unsaturated soil transport processes as a support for large scale soil and water management

    NASA Astrophysics Data System (ADS)

    Vanclooster, Marnik

    2010-05-01

    The current societal demand for sustainable soil and water management is very large. The drivers of global and climate change exert many pressures on the soil and water ecosystems, endangering appropriate ecosystem functioning. The unsaturated soil transport processes play a key role in soil-water system functioning as they control the fluxes of water and nutrients from the soil to plants (the pedo-biosphere link), the infiltration flux of precipitated water to groundwater and the evaporative flux, and hence the feedback from the soil to the climate system. Yet, unsaturated soil transport processes are difficult to quantify since they are affected by huge variability of the governing properties at different space-time scales and the intrinsic non-linearity of the transport processes. The incompatibility between the scale at which processes can reasonably be characterized, the scale at which the theoretical process can correctly be described, and the scale at which the soil and water system needs to be managed calls for further development of scaling procedures in unsaturated zone science. It also calls for a better integration of theoretical and modelling approaches to elucidate transport processes at the appropriate scales, compatible with the sustainable soil and water management objective. Moditoring science, i.e. the interdisciplinary research domain where modelling and monitoring science are linked, is currently evolving significantly in the unsaturated zone hydrology area. In this presentation, a review of current moditoring strategies/techniques will be given and illustrated for solving large scale soil and water management problems. This will also help to identify research needs in the interdisciplinary domain of modelling and monitoring and to improve the integration of unsaturated zone science in solving soil and water management issues. A focus will be placed on examples of large scale soil and water management problems in Europe.

  19. An empirical analysis of the corporate call decision

    NASA Astrophysics Data System (ADS)

    Carlson, Murray Dean

    1998-12-01

    In this thesis we provide insights into the behavior of financial managers of utility companies by studying their decisions to redeem callable preferred shares. In particular, we investigate whether or not an option pricing based model of the call decision, with managers who maximize shareholder value, does a better job of explaining callable preferred share prices and call decisions than do other models of the decision. In order to perform these tests, we extend an empirical technique introduced by Rust (1987) to include the use of information from preferred share prices in addition to the call decisions. The model we develop to value the option embedded in a callable preferred share differs from standard models in two ways. First, as suggested in Kraus (1983), we explicitly account for transaction costs associated with a redemption. Second, we account for state variables that are observed by the decision makers but not by the preferred shareholders. We interpret these unobservable state variables as the benefits and costs associated with a change in capital structure that can accompany a call decision. When we add this variable, our empirical model changes from one which predicts exactly when a share should be called to one which predicts the probability of a call as the function of the observable state. These two modifications of the standard model result in predictions of calls, and therefore of callable preferred share prices, that are consistent with several previously unexplained features of the data; we show that the predictive power of the model is improved in a statistical sense by adding these features to the model. The pricing and call probability functions from our model do a good job of describing call decisions and preferred share prices for several utilities. Using data from shares of the Pacific Gas and Electric Co. (PGE) we obtain reasonable estimates for the transaction costs associated with a call. Using a formal empirical test, we are able to conclude that the managers of the Pacific Gas and Electric Company clearly take into account the value of the option to delay the call when making their call decisions. Overall, the model seems to be robust to tests of its specification and does a better job of describing the data than do simpler models of the decision making process. Limitations in the data do not allow us to perform the same tests in a larger cross-section of utility companies. However, we are able to estimate transaction cost parameters for many firms and these do not seem to vary significantly from those of PGE. This evidence does not cause us to reject our hypothesis that managerial behavior is consistent with a model in which managers maximize shareholder value.

  20. Genetics Home Reference: peroxisomal acyl-CoA oxidase deficiency

    MedlinePlus

    ... of certain fat molecules called very long-chain fatty acids (VLCFAs). Specifically, it is involved in the first step of a process called the peroxisomal fatty acid beta-oxidation pathway. This process shortens the VLCFA ...

  1. Coarse-grained computation for particle coagulation and sintering processes by linking Quadrature Method of Moments with Monte-Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou Yu, E-mail: yzou@Princeton.ED; Kavousanakis, Michail E., E-mail: mkavousa@Princeton.ED; Kevrekidis, Ioannis G., E-mail: yannis@Princeton.ED

    2010-07-20

    The study of particle coagulation and sintering processes is important in a variety of research studies ranging from cell fusion and dust motion to aerosol formation applications. These processes are traditionally simulated using either Monte-Carlo methods or integro-differential equations for particle number density functions. In this paper, we present a computational technique for cases where we believe that accurate closed evolution equations for a finite number of moments of the density function exist in principle, but are not explicitly available. The so-called equation-free computational framework is then employed to numerically obtain the solution of these unavailable closed moment equations by exploiting (through intelligent design of computational experiments) the corresponding fine-scale (here, Monte-Carlo) simulation. We illustrate the use of this method by accelerating the computation of evolving moments of uni- and bivariate particle coagulation and sintering through short simulation bursts of a constant-number Monte-Carlo scheme.

  2. Using Rapid Improvement Events for Disaster After-Action Reviews: Experience in a Hospital Information Technology Outage and Response.

    PubMed

    Little, Charles M; McStay, Christopher; Oeth, Justin; Koehler, April; Bookman, Kelly

    2018-02-01

    The use of after-action reviews (AARs) following major emergency events, such as a disaster, is common and mandated for hospitals and similar organizations. A recurrent challenge is that identified problems are not resolved and are repeated in subsequent events. A process improvement technique called a rapid improvement event (RIE) was used to conduct an AAR following a complete information technology (IT) outage at a large urban hospital. Using RIE methodology to conduct the AAR allowed for the rapid development and implementation of major process improvements to prepare for future IT downtime events. Thus, process improvement methodology, particularly the RIE, is suited for conducting AARs following disasters and holds promise for improving outcomes in emergency management. Little CM, McStay C, Oeth J, Koehler A, Bookman K. Using rapid improvement events for disaster after-action reviews: experience in a hospital information technology outage and response. Prehosp Disaster Med. 2018;33(1):98-100.

  3. Rapid prototyping of microstructures in polydimethylsiloxane (PDMS) by direct UV-lithography.

    PubMed

    Scharnweber, Tim; Truckenmüller, Roman; Schneider, Andrea M; Welle, Alexander; Reinhardt, Martina; Giselbrecht, Stefan

    2011-04-07

    Microstructuring of polydimethylsiloxane (PDMS) is a key step for many lab-on-a-chip (LOC) applications. In general, the structure is generated by casting the liquid prepolymer against a master. The production of the master in turn calls for special equipment and know how. Furthermore, a given master only allows the reproduction of the defined structure. We report on a simple, cheap and practical method to produce microstructures in already cured PDMS by direct UV-lithography followed by chemical development. Due to the available options during the lithographic process like multiple exposures, the method offers a high design flexibility granting easy access to complex and stepped structures. Furthermore, no master is needed and the use of pre-cured PDMS allows processing at ambient (light) conditions. Features down to approximately 5 µm and a depth of 10 µm can be realised. As a proof of principle, we demonstrate the feasibility of the process by applying the structures to various established soft lithography techniques.

  4. On the hydrophilicity of electrodes for capacitive energy extraction

    NASA Astrophysics Data System (ADS)

    Lian, Cheng; Kong, Xian; Liu, Honglai; Wu, Jianzhong

    2016-11-01

    The so-called Capmix technique for energy extraction is based on the cyclic expansion of electrical double layers to harvest dissipative energy arising from the salinity difference between freshwater and seawater. Its optimal performance requires a careful selection of the electrical potentials for the charging and discharging processes, which must be matched with the pore characteristics of the electrode materials. While a number of recent studies have examined the effects of the electrode pore size and geometry on the capacitive energy extraction processes, there is little knowledge on how the surface properties of the electrodes affect the thermodynamic efficiency. In this work, we investigate the Capmix processes using the classical density functional theory for a realistic model of electrolyte solutions. The theoretical predictions allow us to identify optimal operation parameters for capacitive energy extraction with porous electrodes of different surface hydrophobicity. In agreement with recent experiments, we find that the thermodynamic efficiency can be much improved by using most hydrophilic electrodes.

  5. On the hydrophilicity of electrodes for capacitive energy extraction

    DOE PAGES

    Lian, Cheng; East China Univ. of Science and Technology, Shanghai; Kong, Xian; ...

    2016-09-14

    The so-called Capmix technique for energy extraction is based on the cyclic expansion of electrical double layers to harvest dissipative energy arising from the salinity difference between freshwater and seawater. Its optimal performance requires a careful selection of the electrical potentials for the charging and discharging processes, which must be matched with the pore characteristics of the electrode materials. While a number of recent studies have examined the effects of the electrode pore size and geometry on the capacitive energy extraction processes, there is little knowledge on how the surface properties of the electrodes affect the thermodynamic efficiency. In this paper, we investigate the Capmix processes using the classical density functional theory for a realistic model of electrolyte solutions. The theoretical predictions allow us to identify optimal operation parameters for capacitive energy extraction with porous electrodes of different surface hydrophobicity. Finally, in agreement with recent experiments, we find that the thermodynamic efficiency can be much improved by using most hydrophilic electrodes.

  6. Document segmentation for high-quality printing

    NASA Astrophysics Data System (ADS)

    Ancin, Hakan

    1997-04-01

    A technique to segment dark text on the light background of mixed-mode color documents is presented. This process does not perceptually change graphics and photo regions. Color documents are scanned and printed from various media which usually do not have a clean background. This is especially the case for printouts generated from thin magazine samples; these printouts usually include text and figures from the back of the page, which is called bleeding. Removal of bleeding artifacts improves the perceptual quality of the printed document and reduces color ink usage. By detecting the light background of the document, these artifacts are removed from background regions. Detection of dark text regions also enables the halftoning algorithms to use true black ink for the black text pixels instead of composite black. The processed document contains sharp black text on a white background, resulting in improved perceptual quality and better ink utilization. The described method is memory efficient and requires only a small number of scan lines of high-resolution color documents during processing.
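
    A much-simplified sketch of the core idea (luminance thresholding to separate the light background, the dark text, and everything else; the actual algorithm in the paper is more involved, and the function names and threshold values below are illustrative assumptions):

```python
import numpy as np

def segment_text_and_background(rgb, bg_thresh=200, text_thresh=80):
    """Classify pixels of a scanned page into background, dark text, and other.

    rgb : uint8 array of shape (H, W, 3)
    Returns a label map: 0 = light background, 1 = dark text, 2 = graphics/photo.
    """
    luminance = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    labels = np.full(luminance.shape, 2, dtype=np.uint8)
    labels[luminance >= bg_thresh] = 0
    labels[luminance <= text_thresh] = 1
    return labels

def clean_background(rgb, labels):
    """Flush background pixels to pure white, removing faint bleed-through."""
    out = rgb.copy()
    out[labels == 0] = 255
    return out
```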

  7. A comparison of field-dependent rheological properties between spherical and plate-like carbonyl iron particles-based magneto-rheological fluids

    NASA Astrophysics Data System (ADS)

    Tan Shilan, Salihah; Amri Mazlan, Saiful; Ido, Yasushi; Hajalilou, Abdollah; Jeyadevan, Balachandran; Choi, Seung-Bok; Azhani Yunus, Nurul

    2016-09-01

    This work produces plate-like particles of different sizes from conventional spherical carbonyl iron (CI) particles by adjusting the milling time in the ball mill process. The ball mill process used to make the plate-like particles is a solid-state powder processing technique that involves repeated welding, fracturing and re-welding of powder particles in a high-energy ball mill. The effect of the ball milling process on the magnetic behavior of CI particles is first investigated by vibrating sample magnetometry. It is found from this investigation that the plate-like particles have a higher saturation magnetization (by about 8%) than the spherical particles. Subsequently, for the investigation of the sedimentation behavior, a cylindrical measurement technique is used. It is observed from this measurement that the plate-like particles show a slower sedimentation rate than the spherical particles, indicating higher stability of the MR fluid. The field-dependent rheological properties of MR fluids based on the plate-like particles are then investigated with respect to the milling time, which is directly connected to the size of the plate-like particles. In addition, the field-dependent rheological properties such as the yield stress are evaluated and compared between the plate-like particle based MR fluids and the spherical particle based MR fluid. It is found that the yield shear stress of the plate-like particle based MR fluid is increased by up to 270% compared to the spherical particle based MR fluid.

  8. Metal-assisted exfoliation (MAE): green process for transferring graphene to flexible substrates and templating of sub-nanometer plasmonic gaps (Presentation Recording)

    NASA Astrophysics Data System (ADS)

    Zaretski, Aliaksandr V.; Marin, Brandon C.; Moetazedi, Herad; Dill, Tyler J.; Jibril, Liban; Kong, Casey; Tao, Andrea R.; Lipomi, Darren J.

    2015-09-01

    This paper describes a new technique, termed "metal-assisted exfoliation," for the scalable transfer of graphene from catalytic copper foils to flexible polymeric supports. The process is amenable to roll-to-roll manufacturing, and the copper substrate can be recycled. We then demonstrate the use of single-layer graphene as a template for the formation of sub-nanometer plasmonic gaps using a scalable fabrication process called "nanoskiving." These gaps are formed between parallel gold nanowires in a process that first produces three-layer thin films with the architecture gold/single-layer graphene/gold, and then sections the composite films with an ultramicrotome. The structures produced can be treated as two gold nanowires separated along their entire lengths by an atomically thin graphene nanoribbon. Oxygen plasma etches the sandwiched graphene to a finite depth; this action produces a sub-nanometer gap near the top surface of the junction between the wires that is capable of supporting highly confined optical fields. The confinement of light is confirmed by surface-enhanced Raman spectroscopy measurements, which indicate that the enhancement of the electric field arises from the junction between the gold nanowires. These experiments demonstrate nanoskiving as a unique and easy-to-implement fabrication technique that is capable of forming sub-nanometer plasmonic gaps between parallel metallic nanostructures over long, macroscopic distances. These structures could be valuable for fundamental investigations as well as applications in plasmonics and molecular electronics.

  9. Addressee Errors in ATC Communications: The Call Sign Problem

    NASA Technical Reports Server (NTRS)

    Monan, W. P.

    1983-01-01

    Communication errors involving aircraft call signs were portrayed in reports of 462 hazardous incidents voluntarily submitted to the ASRS during an approximate four-year period. These errors resulted in confusion, disorder, and uncoordinated traffic conditions and produced the following types of operational anomalies: altitude deviations, wrong-way headings, aborted takeoffs, go-arounds, runway incursions, missed crossing altitude restrictions, descents toward high terrain, and traffic conflicts in flight and on the ground. Analysis of the report set resulted in identification of five categories of errors involving call signs: (1) faulty radio usage techniques, (2) call sign loss or smearing due to frequency congestion, (3) confusion resulting from similar-sounding call signs, (4) airmen misses of call signs leading to failures to acknowledge or readback, and (5) controller failures regarding confirmation of acknowledgements or readbacks. These error categories are described in detail and several associated hazard mitigating measures that might be taken are considered.

  10. Adaptation of warrant price with Black Scholes model and historical volatility

    NASA Astrophysics Data System (ADS)

    Aziz, Khairu Azlan Abd; Idris, Mohd Fazril Izhar Mohd; Saian, Rizauddin; Daud, Wan Suhana Wan

    2015-05-01

    This project discusses the pricing of warrants in Malaysia. The Black Scholes model with a non-dividend approach and a linear interpolation technique was applied to pricing the call warrants. Three call warrants listed on Bursa Malaysia were selected randomly from UiTM's datastream. The findings show that the volatility of each call warrant differs from the others. We used the historical volatility, which describes the price movement by which an underlying share is expected to fluctuate within a period. The price obtained from the Black Scholes model is compared with the actual market price. Mispricing the call warrants contributes to under- or overvaluation. Other variables such as the interest rate, time to maturity, exercise price and underlying stock price are also involved in pricing call warrants as well as in measuring the moneyness of call warrants.
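
    For reference, a minimal sketch of the two ingredients named in the abstract: annualized historical volatility from log returns, and the non-dividend Black-Scholes call price (a listed call warrant quote would additionally be scaled by its exercise ratio, which is omitted here; parameter values in any usage are hypothetical):

```python
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def historical_volatility(prices, periods_per_year=252):
    """Annualized volatility estimated from daily closing prices."""
    log_returns = np.diff(np.log(np.asarray(prices, dtype=float)))
    return np.std(log_returns, ddof=1) * sqrt(periods_per_year)

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call on a non-dividend-paying stock."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Example (hypothetical numbers): compare a model price with a market quote.
# sigma = historical_volatility(closing_prices)
# model_price = bs_call(S=1.50, K=1.60, r=0.03, sigma=0.35, T=0.5)
```

    Comparing such a model price with the quoted warrant price is what flags the under- or overvaluation discussed in the abstract.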

  11. Trick or Technique?

    ERIC Educational Resources Information Center

    Sheard, Michael

    2009-01-01

    More often than one might at first imagine, a simple trick involving integration by parts can be used to compute indefinite integrals in unexpected and amusing ways. A systematic look at the trick illuminates the question of whether the trick is useful enough to be called an actual technique of integration.
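
    A standard instance of the trick (an illustrative example, not necessarily the one used in the article): integrating by parts twice and then solving algebraically for the unknown integral,

```latex
I = \int e^{x}\cos x \, dx
  = e^{x}\cos x + \int e^{x}\sin x \, dx
  = e^{x}\cos x + e^{x}\sin x - I
\quad\Longrightarrow\quad
I = \tfrac{1}{2} e^{x}\left(\sin x + \cos x\right) + C .
```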

  12. Big pharma and the problem of disease inflation.

    PubMed

    Gabriel, Joseph M; Goldberg, Daniel S

    2014-01-01

    Over the course of the past decade, critics have increasingly called attention to the corrosive influence of the pharmaceutical industry on both biomedical research and the practice of medicine. Critics describe the industry's use of ghostwriting and other unethical techniques to expand their markets as evidence that medical science is all-too-frequently subordinated to the goals of corporate profit. While we do not dispute this perspective, we argue that it is imperative to also recognize that the goals of medical science and industry profit are now tightly wed to one another. As a result, medical science now operates to expand disease definitions, lower diagnostic thresholds, and otherwise advance the goals of corporate profit through the redefinition and expansion of what it means to be ill. We suggest that this process has led to a variety of ethical problems that are not fully captured by current critiques of ghostwriting and other troubling practices by the pharmaceutical industry. In our conclusion, we call for physicians, ethicists, and other concerned observers to embrace a more fundamental critique of the relationship between biomedical science and corporate profit.

  13. Cold Calling and Web Postings: Do They Improve Students' Preparation and Learning in Statistics?

    ERIC Educational Resources Information Center

    Levy, Dan

    2014-01-01

    Getting students to prepare well for class is a common challenge faced by instructors all over the world. This study investigates the effects that two frequently used techniques to increase student preparation--web postings and cold calling--have on student outcomes. The study is based on two experiments and a qualitative study conducted in a…

  14. Supporting the Growing Needs of the GIS Industry

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Visual Learning Systems, Inc. (VLS), of Missoula, Montana, has developed a commercial software application called Feature Analyst. Feature Analyst was conceived under a Small Business Innovation Research (SBIR) contract with NASA's Stennis Space Center, and through the Montana State University TechLink Center, an organization funded by NASA and the U.S. Department of Defense to link regional companies with Federal laboratories for joint research and technology transfer. The software provides a paradigm shift to automated feature extraction, as it utilizes spectral, spatial, temporal, and ancillary information to model the feature extraction process; presents the ability to remove clutter; incorporates advanced machine learning techniques to supply unparalleled levels of accuracy; and includes an exceedingly simple interface for feature extraction.

  15. Mechanisms of Virus Assembly

    PubMed Central

    Perlmutter, Jason D.; Hagan, Michael F.

    2015-01-01

    Viruses are nanoscale entities containing a nucleic acid genome encased in a protein shell called a capsid, and in some cases surrounded by a lipid bilayer membrane. This review summarizes the physics that govern the processes by which capsids assemble within their host cells and in vitro. We describe the thermodynamics and kinetics for assembly of protein subunits into icosahedral capsid shells, and how these are modified in cases where the capsid assembles around a nucleic acid or on a lipid bilayer. We present experimental and theoretical techniques that have been used to characterize capsid assembly, and we highlight aspects of virus assembly which are likely to receive significant attention in the near future. PMID:25532951

  16. Time-frequency representation of a highly nonstationary signal via the modified Wigner distribution

    NASA Technical Reports Server (NTRS)

    Zoladz, T. F.; Jones, J. H.; Jong, J.

    1992-01-01

    A new signal analysis technique called the modified Wigner distribution (MWD) is presented. The new signal processing tool has been very successful in determining time frequency representations of highly non-stationary multicomponent signals in both simulations and trials involving actual Space Shuttle Main Engine (SSME) high frequency data. The MWD departs from the classic Wigner distribution (WD) in that it effectively eliminates the cross coupling among positive frequency components in a multiple component signal. This attribute of the MWD, which prevents the generation of 'phantom' spectral peaks, will undoubtedly increase the utility of the WD for real world signal analysis applications which more often than not involve multicomponent signals.
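
    For orientation, the classic discrete (pseudo-)Wigner distribution on which the MWD builds can be computed as a DFT over lag of the instantaneous autocorrelation. The sketch below is that plain version only; it still exhibits the cross-terms between positive-frequency components that the MWD is designed to suppress, and normalization and frequency-axis scaling conventions are glossed over:

```python
import numpy as np

def pseudo_wigner(x):
    """Discrete pseudo-Wigner distribution of a (preferably analytic) signal x.

    W[n, k] is the DFT over lag m of x[n + m] * conj(x[n - m]); the result is
    real because the lag kernel is conjugate-symmetric. Cross-terms between
    signal components are NOT suppressed here (that is what the MWD adds).
    """
    x = np.asarray(x, dtype=complex)
    n_samples = len(x)
    W = np.zeros((n_samples, n_samples), dtype=float)
    for n in range(n_samples):
        max_lag = min(n, n_samples - 1 - n)
        kernel = np.zeros(n_samples, dtype=complex)
        for m in range(-max_lag, max_lag + 1):
            kernel[m % n_samples] = x[n + m] * np.conj(x[n - m])
        W[n] = np.real(np.fft.fft(kernel))
    return W
```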

  17. A discrete trinomial model for the birth and death of stock financial bubbles

    NASA Astrophysics Data System (ADS)

    Di Persio, Luca; Guida, Francesco

    2017-11-01

    The present work proposes a novel way to model the dynamics of financial bubbles. In particular, we exploit the so-called trinomial tree technique, which is mainly inspired by the typical market order book (MOB) structure. According to the typical MOB rules, we exploit a bottom-up approach to derive the relevant generator process for the financial quantities characterizing the market under consideration. Our proposal pays particular attention to real-world changes in the probability levels characterizing bid-ask preferences, focusing on market movements. In particular, we show that financial bubbles originate from these movements, which also act to amplify their growth.
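
    As a hedged sketch of the underlying machinery, the following simulates price paths on a standard recombining trinomial lattice with Kamrad-Ritchken-style probabilities; the paper's MOB-driven, bubble-generating transition probabilities would replace the geometric-Brownian-motion ones assumed here, and all parameter values are illustrative:

```python
import numpy as np

def simulate_trinomial_paths(s0=100.0, r=0.01, sigma=0.2, T=1.0,
                             n_steps=250, n_paths=5, seed=0):
    """Simulate asset price paths on a recombining trinomial lattice.

    Each step the log-price moves up by lam*sigma*sqrt(dt), stays put, or
    moves down, with probabilities matching the drift and variance of
    geometric Brownian motion (Kamrad-Ritchken choice with lam = sqrt(3)).
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    lam = np.sqrt(3.0)
    dx = lam * sigma * np.sqrt(dt)
    mu = r - 0.5 * sigma ** 2
    p_u = 1.0 / (2 * lam ** 2) + mu * np.sqrt(dt) / (2 * lam * sigma)
    p_d = 1.0 / (2 * lam ** 2) - mu * np.sqrt(dt) / (2 * lam * sigma)
    p_m = 1.0 - p_u - p_d
    moves = rng.choice([1, 0, -1], size=(n_paths, n_steps), p=[p_u, p_m, p_d])
    log_paths = np.log(s0) + np.concatenate(
        [np.zeros((n_paths, 1)), np.cumsum(moves * dx, axis=1)], axis=1)
    return np.exp(log_paths)
```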

  18. A Robust and Engineerable Self-Assembling Protein Template for the Synthesis and Patterning of Ordered Nanoparticle Arrays

    NASA Technical Reports Server (NTRS)

    McMillan, R. Andrew; Howard, Jeanie; Zaluzec, Nestor J.; Kagawa, Hiromi K.; Li, Yi-Fen; Paavola, Chad D.; Trent, Jonathan D.

    2004-01-01

    Self-assembling biomolecules that form highly ordered structures have attracted interest as potential alternatives to conventional lithographic processes for patterning materials. Here we introduce a general technique for patterning materials on the nanoscale using genetically modified protein cage structures called chaperonins that self-assemble into crystalline templates. Constrained chemical synthesis of transition metal nanoparticles is specific to templates genetically functionalized with poly-Histidine sequences. These arrays of materials are ordered by the nanoscale structure of the crystallized protein. This system may be easily adapted to pattern a variety of materials given the rapidly growing list of peptide sequences selected by screening for specificity for inorganic materials.

  19. Basic elements of light water reactor fuel rod design. [FUELROD code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisman, J.; Eckart, R.

    1981-06-01

    Basic design techniques and equations are presented to allow students to understand and perform preliminary fuel design for normal reactor conditions. Each of the important design considerations is presented and discussed in detail. These include the interaction between fuel pellets and cladding and the changes in fuel and cladding that occur during the operating lifetime of the fuel. A simple, student-oriented, fuel rod design computer program, called FUELROD, is described. The FUELROD program models the in-pile pellet cladding interaction and allows a realistic exploration of the effect of various design parameters. By use of FUELROD, the student can gain an appreciation of the fuel rod design process. 34 refs.

  20. EMGAN: A computer program for time and frequency domain reduction of electromyographic data

    NASA Technical Reports Server (NTRS)

    Hursta, W. N.

    1975-01-01

    An experiment in electromyography utilizing surface electrode techniques was developed for the Apollo-Soyuz test project. This report describes the computer program, EMGAN, which was written to provide first order data reduction for the experiment. EMG signals are produced by the membrane depolarization of muscle fibers during a muscle contraction. Surface electrodes detect a spatially summated signal from a large number of muscle fibers commonly called an interference pattern. An interference pattern is usually so complex that analysis through signal morphology is extremely difficult if not impossible. It has become common to process EMG interference patterns in the frequency domain. Muscle fatigue and certain myopathic conditions are recognized through changes in muscle frequency spectra.
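
    A minimal sketch of the kind of frequency-domain reduction described (one-sided power spectrum plus mean and median frequency, the latter being a common indicator of the spectral shift toward lower frequencies seen with muscle fatigue); this is illustrative only and is not the EMGAN program itself:

```python
import numpy as np

def emg_spectrum_features(emg, fs):
    """Reduce a surface-EMG interference pattern to frequency-domain features.

    emg : 1D array of EMG samples; fs : sampling rate in Hz.
    Returns the frequency grid, the one-sided power spectrum, and the
    mean and median frequencies of that spectrum.
    """
    emg = np.asarray(emg, dtype=float) - np.mean(emg)
    spectrum = np.abs(np.fft.rfft(emg)) ** 2
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
    total_power = spectrum.sum()
    mean_freq = (freqs * spectrum).sum() / total_power
    cumulative = np.cumsum(spectrum)
    median_freq = freqs[np.searchsorted(cumulative, 0.5 * total_power)]
    return freqs, spectrum, mean_freq, median_freq
```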
