Sample records for process characterization tool

  1. Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8213 ● NOV 2017. US Army Research Laboratory. Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures.

  2. Launch Vehicle Design Process Characterization Enables Design/Project Tool

    NASA Technical Reports Server (NTRS)

    Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Robinson, Nancy (Technical Monitor)

    2001-01-01

    The objectives of the project described in this viewgraph presentation included the following: (1) Provide an overview characterization of the launch vehicle design process; and (2) Delineate design/project tool to identify, document, and track pertinent data.

  3. Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding

    NASA Astrophysics Data System (ADS)

    Güpner, Michael; Patschger, Andreas; Bliedtner, Jens

    Conventionally manufactured tools are often constructed entirely of an expensive, high-alloy tool steel. An alternative way to manufacture tools is to combine a cost-efficient mild steel with a functional coating in the interaction zone of the tool. Thermal processing methods, such as laser metal deposition, are always accompanied by thermal distortion, and resistance to this distortion decreases as the material thickness is reduced. Consequently, the laser-based coating of thin parts or tools requires special process management. The experimental approach in the present paper is to keep the energy and the mass per unit length constant while varying the laser power, the feed rate, and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits, and evaluate the process efficiency. Ways to optimize dilution, angular distortion, and clad height are presented.

  4. [Present-day metal-cutting tools and working conditions].

    PubMed

    Kondratiuk, V P

    1990-01-01

    Polyfunctional machine tools of the machining-centre type offer a number of hygienic advantages compared to universal machine tools. However, the low degree of mechanization and automation of some auxiliary processes, together with design defects that degrade the tools' ergonomic characteristics, makes multi-machine operation labour-intensive. The article specifies techniques for assessing allowable noise levels and proposes hygienic recommendations, some of which have been introduced into practice.

  5. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization. This can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs are clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes such as average, standard deviation, minimum, maximum, frequency, as well as average transfer time. These statistics are given on a per-cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
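
    The clustering step described in this abstract can be sketched in a few lines. The snippet below is a hypothetical illustration, not the paper's implementation: it clusters synthetic (log10 file size, transfer time) log records with plain k-means and reports per-cluster file statistics of the kind the tool computes; the paper's own clustering algorithms and tightness measures are not reproduced here.

```python
import math
import random

def kmeans(points, k, iters=50):
    """Plain k-means over 2-D points (log10 file size, transfer seconds).
    Deterministic init: k evenly spaced points of the sorted input."""
    pts = sorted(points)
    centers = [pts[(len(pts) * i) // k] for i in range(k)]
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        centers = [(sum(p[0] for p in cl) / len(cl),
                    sum(p[1] for p in cl) / len(cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

def cluster_stats(cluster):
    """Per-cluster file statistics like those the tool reports."""
    sizes = [10 ** p[0] for p in cluster]   # undo the log10 scaling
    times = [p[1] for p in cluster]
    n = len(sizes)
    mean = sum(sizes) / n
    std = math.sqrt(sum((s - mean) ** 2 for s in sizes) / n)
    return {"count": n, "avg_size": mean, "std_size": std,
            "min_size": min(sizes), "max_size": max(sizes),
            "avg_xfer_time": sum(times) / n}

# Synthetic log records: a small-file and a large-file workload.
rng = random.Random(0)
small = [(3 + rng.random(), 0.1) for _ in range(20)]
large = [(9 + rng.random(), 30.0) for _ in range(20)]
centers, clusters = kmeans(small + large, k=2)
```

    With two well-separated workloads the two clusters recover the small-file and large-file populations, and the per-cluster statistics summarize each one.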

  6. Cognitive learning: a machine learning approach for automatic process characterization from design

    NASA Astrophysics Data System (ADS)

    Foucher, J.; Baderot, J.; Martinez, S.; Dervilllé, A.; Bernard, G.

    2018-03-01

    Cutting-edge innovation requires accurate and fast process control to achieve fast learning rates and industry adoption. The tools currently available for this task are mainly manual and user-dependent. In this paper we present cognitive learning, a new machine-learning-based technique that facilitates and speeds up complex characterization by using the design as input, providing fast training and detection times. We focus on the machine learning framework that enables object detection, defect traceability, and automatic measurement tools.

  7. Setting up a proper power spectral density (PSD) and autocorrelation analysis for material and process characterization

    NASA Astrophysics Data System (ADS)

    Rutigliani, Vito; Lorusso, Gian Francesco; De Simone, Danilo; Lazzarino, Frederic; Rispens, Gijsbert; Papavieros, George; Gogolides, Evangelos; Constantoudis, Vassilios; Mack, Chris A.

    2018-03-01

    Power spectral density (PSD) analysis is playing an increasingly critical role in the understanding of line-edge roughness (LER) and linewidth roughness (LWR) in a variety of applications across the industry. It is an essential step in obtaining an unbiased LWR estimate, as well as an extremely useful tool for process and material characterization. However, the PSD estimate can be affected by both random and systematic artifacts caused by image acquisition and measurement settings, which could irremediably alter its information content. In this paper, we report on the impact of various setting parameters (smoothing image processing filters, pixel size, and SEM noise levels) on the PSD estimate. We also discuss the use of the PSD analysis tool in a variety of cases. Looking beyond the basic roughness estimate, we use PSD and autocorrelation analysis to characterize resist blur [1], as well as low- and high-frequency roughness content, and we apply this technique to guide EUV material stack selection. Our results clearly indicate that, if properly used, the PSD methodology is a very sensitive tool for investigating material and process variations.
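
    As a hedged illustration of the PSD methodology this abstract discusses, the sketch below computes a plain periodogram of a synthetic edge profile with a pure-stdlib DFT. The paper's corrections for pixel size effects, SEM noise, and estimator bias are deliberately omitted; the Parseval check at the end ties the integrated PSD back to the edge variance (the squared LER sigma).

```python
import cmath
import math
import random

def psd(edge, pixel_nm):
    """Periodogram PSD of an edge-position profile, in nm^3
    (nm^2 of roughness power per unit spatial frequency)."""
    n = len(edge)
    mean = sum(edge) / n
    x = [e - mean for e in edge]          # remove the mean edge position
    out = []
    for k in range(n):
        xk = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n)
                 for j in range(n))
        out.append(pixel_nm * abs(xk) ** 2 / n)
    return out

# Synthetic rough edge: Gaussian white noise sampled on 2 nm pixels.
rng = random.Random(1)
edge = [rng.gauss(0.0, 1.5) for _ in range(256)]
spectrum = psd(edge, pixel_nm=2.0)

# Parseval check: integrating the PSD over spatial frequency
# (bin width 1/(n*pixel)) recovers the biased edge variance.
mean = sum(edge) / len(edge)
variance = sum((e - mean) ** 2 for e in edge) / len(edge)
```

    For white noise the spectrum is flat on average; for a real SEM edge, the low- and high-frequency content would separate the roughness contributions the paper analyzes.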

  8. Friction stir weld tools having fine grain structure

    DOEpatents

    Grant, Glenn J.; Frye, John G.; Kim, Jin Yong; Lavender, Curt A.; Weil, Kenneth Scott

    2016-03-15

    Tools for friction stir welding can be made with fewer process steps, lower cost techniques, and/or lower cost ingredients than other state-of-the-art processes by utilizing improved compositions and processes of fabrication. Furthermore, the tools resulting from the improved compositions and processes of fabrication can exhibit better distribution and homogeneity of chemical constituents, greater strength, and/or increased durability. In one example, a friction stir weld tool includes tungsten and rhenium and is characterized by carbide and oxide dispersoids, by carbide particulates, and by grains that comprise a solid solution of the tungsten and rhenium. The grains do not exceed 10 micrometers in diameter.

  9. Using 3D Printing for Rapid Prototyping of Characterization Tools for Investigating Powder Blend Behavior.

    PubMed

    Hirschberg, Cosima; Boetker, Johan P; Rantanen, Jukka; Pein-Hackelbusch, Miriam

    2018-02-01

    There is an increasing need to provide more detailed insight into the behavior of particulate systems. Current powder characterization tools were developed empirically, and in many cases modification of existing equipment is difficult. More flexible tools are needed to provide understanding of complex powder behavior, such as mixing processes and segregation phenomena. An approach based on the fast prototyping of new powder handling geometries and interfacing solutions for process analytical tools is reported. This study utilized 3D printing for rapid prototyping of customized geometries; the overall goal was to assess the mixing process of powder blends at small scale with a combination of spectroscopic and mechanical monitoring. As part of the segregation evaluation studies, the flowability of three different paracetamol/filler blends at different ratios was investigated, inter alia to define the percolation thresholds. Blends with a paracetamol wt% above the percolation threshold were subsequently investigated with respect to their segregation behavior. Rapid prototyping using 3D printing allowed the design of two funnels with tailored flow behavior (funnel flow) of model formulations, which could be monitored with an in-line near-infrared (NIR) spectrometer. Calculating the root mean square (RMS) of the scores of the first two principal components of the NIR spectra visualized spectral variation as a function of process time. In the same setup, mechanical properties (basic flow energy) of the powder blend were monitored during blending. Rapid prototyping allowed fast modification of powder testing geometries and easy interfacing with process analytical tools, opening new possibilities for more detailed powder characterization.
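
    The RMS-of-PCA-scores trend mentioned in this abstract is simple to sketch. In the hypothetical snippet below, the PCA itself is assumed to have been done already; `t1` and `t2` are made-up scores on the first two principal components of successive NIR spectra, and a windowed RMS of the scores is used as the homogeneity trend (this is an illustration of the idea, not the paper's exact computation).

```python
import math

def blend_rms(t1, t2, window=8):
    """Windowed RMS of the first two PC scores vs. process time."""
    out = []
    for i in range(0, len(t1) - window + 1, window):
        chunk = t1[i:i+window] + t2[i:i+window]
        out.append(math.sqrt(sum(v * v for v in chunk) / len(chunk)))
    return out

# Hypothetical scores that shrink toward zero as the blend homogenizes,
# so the RMS trend drops toward a plateau near zero.
t1 = [5.0 * 0.9 ** k for k in range(64)]
t2 = [-3.0 * 0.9 ** k for k in range(64)]
trend = blend_rms(t1, t2)
```

    A flat, low RMS trend signals that consecutive spectra have stopped changing, i.e. the blend end-point.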

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunshah, R.F.; Shabaik, A.H.

    The process of Activated Reactive Evaporation is used to synthesize superhard materials such as carbides, oxides, nitrides, and ultrafine-grain cermets. The deposits are characterized by hardness, microstructure, and lattice parameter measurements. The synthesis and characterization of TiC-Ni cermets, Al2O3, and VC-TiC alloy carbides are given. Tools with different coating characteristics are tested for machining performance at different speeds and feeds. The machining evaluation and the selection of coatings are based on the rate of deterioration of the coating, tool temperature, and cutting forces. Tool life tests show that coated high-speed steel tools exhibit a 300% improvement in tool life. (Author) (GRA)

  11. Machinability of Green Powder Metallurgy Components: Part I. Characterization of the Influence of Tool Wear

    NASA Astrophysics Data System (ADS)

    Robert-Perron, Etienne; Blais, Carl; Pelletier, Sylvain; Thomas, Yannig

    2007-06-01

    The green machining process is an interesting approach for overcoming the mediocre machining behavior of high-performance powder metallurgy (PM) steels. This process appears to be a promising method for extending tool life and reducing machining costs. Recent improvements in binder/lubricant technologies have led to high-green-strength systems that enable green machining. So far, tool wear has been considered negligible when characterizing the machinability of green PM specimens. This inaccurate assumption may lead to the selection of suboptimal cutting conditions. The first part of this study involves the optimization of the machining parameters to minimize the effects of tool wear on machinability in the turning of green PM components. The second part of our work compares the sintered mechanical properties of components machined in the green state with those of components machined after sintering.

  12. New methodology to baseline and match AME polysilicon etcher using advanced diagnostic tools

    NASA Astrophysics Data System (ADS)

    Poppe, James; Shipman, John; Reinhardt, Barbara E.; Roussel, Myriam; Hedgecock, Raymond; Fonda, Arturo

    1999-09-01

    As process controls tighten in the semiconductor industry, the need to understand the variables that determine system performance becomes more important. For plasma etch systems, process success depends on the control of key parameters such as vacuum integrity, pressure, gas flows, and RF power. It is imperative to baseline, monitor, and control these variables. This paper presents an overview of the methods and tools used by the Motorola BMC fabrication facility to characterize an Applied Materials polysilicon etcher. Tool performance data obtained from our traditional measurement techniques are limited in scope and do not provide a complete picture of ultimate tool performance. Presently the BMC traditional characterization tools provide a snapshot of the static operation of the equipment under test (EUT); however, the dynamic performance cannot be completely evaluated without the aid of specialized diagnostic equipment. To provide a complete system baseline evaluation of the polysilicon etcher, three diagnostic tools were utilized: a Lucas Labs Vacuum Diagnostic System, a Residual Gas Analyzer, and an ENI Voltage/Impedance Probe. The diagnostic methodology used to baseline and match key parameters of qualified production equipment has had an immense impact on other equipment characterization in the facility, and has also resulted in reduced cycle time for new equipment introduction.

  13. Cluster Tool for In Situ Processing and Comprehensive Characterization of Thin Films at High Temperatures.

    PubMed

    Wenisch, Robert; Lungwitz, Frank; Hanf, Daniel; Heller, René; Zscharschuch, Jens; Hübner, René; von Borany, Johannes; Abrasonis, Gintautas; Gemming, Sibylle; Escobar-Galindo, Ramon; Krause, Matthias

    2018-06-13

    A new cluster tool for in situ real-time processing and depth-resolved compositional, structural and optical characterization of thin films at temperatures from -100 to 800 °C is described. The implemented techniques comprise magnetron sputtering, ion irradiation, Rutherford backscattering spectrometry, Raman spectroscopy, and spectroscopic ellipsometry. The capability of the cluster tool is demonstrated for a layer stack MgO/amorphous Si (∼60 nm)/Ag (∼30 nm), deposited at room temperature and crystallized with partial layer exchange by heating up to 650 °C. Its initial and final composition, stacking order, and structure were monitored in situ in real time and a reaction progress was defined as a function of time and temperature.

  14. Evolution of a Benthic Imaging System From a Towed Camera to an Automated Habitat Characterization System

    DTIC Science & Technology

    2008-09-01

    ...element in the development of HabCam as a tool for habitat characterization is the automated processing of images for color correction, segmentation of foreground targets from sediment, and classification of targets to taxonomic category.

  15. Characterization of the interfacial heat transfer coefficient for hot stamping processes

    NASA Astrophysics Data System (ADS)

    Luan, Xi; Liu, Xiaochuan; Fang, Haomiao; Ji, Kang; El Fakir, Omer; Wang, LiLiang

    2016-08-01

    In hot stamping processes, the interfacial heat transfer coefficient (IHTC) between the forming tools and the hot blank is an essential parameter that determines the quenching rate of the process and hence the resulting material microstructure. The present work focuses on the characterization of the IHTC between an aluminium alloy 7075-T6 blank and two different die materials, cast iron (G3500) and H13 die steel, at various contact pressures. It was found that the IHTC between AA7075 and cast iron was 78.6% higher than that between AA7075 and H13 die steel. Die materials and contact pressures had pronounced effects on the IHTC, suggesting that the IHTC can be used to guide the selection of stamping tool materials and the precise control of processing parameters.
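
    The role the IHTC plays in quenching can be illustrated with a textbook lumped-capacitance cooling model, which is a simplification of the experimental characterization the paper performs. All blank and die property values below are illustrative assumptions (approximate AA7075 density and heat capacity), not the paper's data.

```python
import math

def ihtc_from_cooling(T1, T2, dt, T_die, thickness_m, rho, cp):
    """IHTC (W/m^2.K) inferred from two blank temperatures dt seconds
    apart, using a lumped model per unit contact area (single-face
    contact assumed):  dT/dt = -h (T - T_die) / (rho * cp * thickness)
    =>  h = rho*cp*thickness/dt * ln((T1 - T_die)/(T2 - T_die))."""
    return rho * cp * thickness_m / dt * math.log((T1 - T_die) / (T2 - T_die))

# Illustrative 2 mm AA7075 blank cooling from 450 C to 300 C in 1 s
# against a die face held at 100 C.
h = ihtc_from_cooling(450.0, 300.0, 1.0, 100.0,
                      thickness_m=2e-3, rho=2810.0, cp=960.0)
```

    The resulting value of roughly 3000 W/m².K is of the order reported for hot stamping contacts, which is why a higher IHTC (e.g. with cast iron dies) translates directly into a faster quench.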

  16. EPA Tools and Resources Webinar: EPA’s Environmental Sampling and Analytical Methods for Environmental Remediation and Recovery

    EPA Pesticide Factsheets

    EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.

  17. Recent advances in synthetic biology of cyanobacteria.

    PubMed

    Sengupta, Annesha; Pakrasi, Himadri B; Wangikar, Pramod P

    2018-05-09

    Cyanobacteria are attractive hosts that can be engineered for the photosynthetic production of fuels, fine chemicals, and proteins from CO2. Moreover, the responsiveness of these photoautotrophs towards different environmental signals, such as light, CO2, the diurnal cycle, and metals, makes them potential hosts for the development of biosensors. However, engineering these hosts proves to be a challenging and lengthy process. Synthetic biology can make the process of biological engineering more predictable through the use of standardized biological parts that are well characterized, and tools to assemble them. While significant progress has been made with model heterotrophic organisms, many of the parts and tools are not portable to cyanobacteria. Therefore, efforts are underway to develop and characterize parts derived from cyanobacteria. In this review, we discuss the reported parts and tools with the objective of developing cyanobacteria as cell factories or biosensors. We also discuss the issues related to characterization, tunability, portability, and the need to develop enabling technologies to engineer this "green" chassis.

  18. Automatically Detecting Failures in Natural Language Processing Tools for Online Community Text.

    PubMed

    Park, Albert; Hartzler, Andrea L; Huh, Jina; McDonald, David W; Pratt, Wanda

    2015-08-31

    The prevalence and value of patient-generated health text are increasing, but processing such text remains problematic. Although existing biomedical natural language processing (NLP) tools are appealing, most were developed to process clinician- or researcher-generated text, such as clinical notes or journal articles. In addition to being constructed for different types of text, other challenges of using existing NLP tools include constantly changing technologies, source vocabularies, and characteristics of text. These continuously evolving challenges warrant the need for low-cost, systematic assessment. However, the primarily accepted evaluation method in NLP, manual annotation, requires tremendous effort and time. The primary objective of this study is to explore an alternative approach: using low-cost, automated methods to detect failures (eg, incorrect boundaries, missed terms, mismapped concepts) when processing patient-generated text with existing biomedical NLP tools. We first characterize common failures that NLP tools can make in processing online community text. We then demonstrate the feasibility of our automated approach in detecting these common failures using one of the most popular biomedical NLP tools, MetaMap. Using 9657 posts from an online cancer community, we explored our automated failure detection approach in two steps: (1) to characterize the failure types, we first manually reviewed MetaMap's commonly occurring failures, grouped the inaccurate mappings into failure types, and then identified causes of the failures through iterative rounds of manual review using open coding; and (2) to automatically detect these failure types, we then explored combinations of existing NLP techniques and dictionary-based matching for each failure cause. Finally, we manually evaluated the automatically detected failures.
    From our manual review, we characterized three types of failure: (1) boundary failures, (2) missed term failures, and (3) word ambiguity failures. Within these three failure types, we discovered 12 causes of inaccurate concept mappings. Our automated methods flagged almost half of MetaMap's 383,572 mappings as problematic. Word ambiguity failure was the most widely occurring, comprising 82.22% of failures. Boundary failure was the second most frequent, amounting to 15.90% of failures, while missed term failures were the least common, making up 1.88% of failures. The automated failure detection achieved precision, recall, accuracy, and F1 score of 83.00%, 92.57%, 88.17%, and 87.52%, respectively. We illustrate the challenges of processing patient-generated online health community text and characterize failures of NLP tools on this patient-generated health text, demonstrating the feasibility of our low-cost approach to automatically detect those failures. Our approach shows the potential for scalable and effective solutions to automatically assess the constantly evolving NLP tools and source vocabularies used to process patient-generated text.
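
    The evaluation metrics quoted in this abstract follow the standard confusion-matrix definitions, sketched below. The counts are made up for illustration; they are not the paper's data.

```python
def metrics(tp, fp, tn, fn):
    """Precision, recall, accuracy, and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp)                     # flagged failures that are real
    recall = tp / (tp + fn)                        # real failures that were flagged
    accuracy = (tp + tn) / (tp + fp + tn + fn)     # all correct decisions
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, accuracy, f1

# Illustrative counts only (hypothetical, not from the study).
p, r, a, f1 = metrics(tp=83, fp=17, tn=80, fn=20)
```

    With these counts, precision is 0.83 and accuracy 0.815; the paper's reported 83.00% precision and 92.57% recall would correspond to a different, much larger confusion matrix.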

  19. Characterization of Decision Making Behaviors Associated with Human Systems Integration (HSI) Design Tradeoffs: Subject Matter Expert Interviews

    DTIC Science & Technology

    2014-11-18

    ...this research was to characterize the naturalistic decision making process used in Naval Aviation acquisition to assess cost, schedule and... Naval Aviation acquisitions can be identified, which can support the future development of new processes and tools for training and decision making... As part of Department of Defense acquisition processes, HSI ensures that operator, maintainer and sustainer considerations are incorporated into...

  20. Nonlinear digital signal processing in mental health: characterization of major depression using instantaneous entropy measures of heartbeat dynamics.

    PubMed

    Valenza, Gaetano; Garcia, Ronald G; Citi, Luca; Scilingo, Enzo P; Tomaz, Carlos A; Barbieri, Riccardo

    2015-01-01

    Nonlinear digital signal processing methods that address system complexity have provided useful computational tools for helping in the diagnosis and treatment of a wide range of pathologies. More specifically, nonlinear measures have been successful in characterizing patients with mental disorders such as Major Depression (MD). In this study, we propose the use of instantaneous measures of entropy, namely the inhomogeneous point-process approximate entropy (ipApEn) and the inhomogeneous point-process sample entropy (ipSampEn), to describe a novel characterization of MD patients undergoing affective elicitation. Because these measures are built within a nonlinear point-process model, they allow for the assessment of complexity in cardiovascular dynamics at each moment in time. Heartbeat dynamics were characterized from 48 healthy controls and 48 patients with MD while emotionally elicited through either neutral or arousing audiovisual stimuli. Experimental results coming from the arousing tasks show that ipApEn measures are able to instantaneously track heartbeat complexity as well as discern between healthy subjects and MD patients. Conversely, standard heart rate variability (HRV) analysis performed in both time and frequency domains did not show any statistical significance. We conclude that measures of entropy based on nonlinear point-process models might contribute to devising useful computational tools for care in mental health.
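
    A static approximate entropy (ApEn) over an RR-interval series, sketched below, is a classical stand-in for the paper's instantaneous ipApEn, which is defined within a time-varying point-process model rather than over a fixed window. The series and tolerance here are illustrative.

```python
import math

def apen(x, m=2, r=0.2):
    """Approximate entropy of series x with template length m and
    tolerance r (used as given, not scaled by the series SD here)."""
    def phi(mm):
        n = len(x) - mm + 1
        templates = [x[i:i+mm] for i in range(n)]
        total = 0.0
        for a in templates:
            # Count templates within Chebyshev distance r (self-match included).
            matches = sum(
                1 for b in templates
                if max(abs(u - v) for u, v in zip(a, b)) <= r
            )
            total += math.log(matches / n)
        return total / n
    return phi(m) - phi(m + 1)

# A strictly periodic "heartbeat" series is highly regular, so its
# approximate entropy is near zero.
regular = [0.8, 1.0] * 50
apen_regular = apen(regular, m=2, r=0.05)
```

    Higher ApEn indicates more complex, less predictable dynamics, which is the property the paper exploits to separate MD patients from controls.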

  1. LANDSCAPE ASSESSMENT TOOLS FOR WATERSHED CHARACTERIZATION

    EPA Science Inventory

    A combination of process-based, empirical and statistical models has been developed to assist states in their efforts to assess water quality, locate impairments over large areas, and calculate TMDL allocations. By synthesizing outputs from a number of these tools, LIPS demonstr...

  2. Exposure Estimation and Interpretation of Occupational Risk: Enhanced Information for the Occupational Risk Manager

    PubMed Central

    Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa

    2015-01-01

    The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, because the probability of health effects reflects variability in the exposure estimate as well as in the dose-response curve, integrated consideration of the variability surrounding both components of the risk characterization provides greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure than discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment and focusing on important sources of variability and uncertainty enable characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are affected by both variability and uncertainty, and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health practice. PMID:26302336
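
    The probabilistic view this abstract advocates can be sketched with a Monte Carlo exceedance calculation: treat exposure as a lognormal distribution rather than a point estimate, and report the probability of exceeding an occupational exposure limit (OEL). The geometric mean, geometric standard deviation, and OEL below are illustrative assumptions, not values from any real assessment.

```python
import math
import random

def exceedance_probability(gm, gsd, oel, n=100_000, seed=42):
    """P(exposure > OEL) for a lognormal exposure with geometric mean gm
    and geometric standard deviation gsd, by Monte Carlo sampling."""
    rng = random.Random(seed)
    mu, sigma = math.log(gm), math.log(gsd)
    hits = sum(1 for _ in range(n) if rng.lognormvariate(mu, sigma) > oel)
    return hits / n

# A point estimate at the GM (half the OEL) would look "acceptable",
# yet a sizeable fraction of sampled shifts still exceeds the limit.
p_exceed = exceedance_probability(gm=0.05, gsd=2.5, oel=0.1)
```

    For these assumed parameters the analytic answer is about 0.22, i.e. roughly one shift in five exceeds the OEL even though the central estimate sits comfortably below it, which is exactly the distinction between a distributional and a binary risk characterization.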

  3. ESH assessment of advanced lithography materials and processes

    NASA Astrophysics Data System (ADS)

    Worth, Walter F.; Mallela, Ram

    2004-05-01

    The ESH Technology group at International SEMATECH is conducting environment, safety, and health (ESH) assessments in collaboration with lithography technologists evaluating the performance of an increasing number of new materials and technologies being considered for advanced lithography, such as 157 nm photoresist and extreme ultraviolet (EUV) lithography. By performing data searches for 75 critical data types, emissions characterizations, and industrial hygiene (IH) monitoring during use of the resist candidates, it has been shown that the best-performing resist formulations so far appear to be free of potential ESH concerns. The ESH assessment of the EUV lithography tool being developed for SEMATECH has identified several features of the tool that are of ESH concern: high energy consumption, poor energy conversion efficiency, tool complexity, potential ergonomic and safety interlock issues, use of high-powered laser(s), generation of ionizing radiation (soft X-rays), need for adequate shielding, and characterization of the debris formed by the extreme temperature of the plasma. By bringing these ESH challenges to the attention of the technologists and tool designers, it is hoped that the processes and tools can be made more ESH friendly.

  4. Using natural language processing techniques to inform research on nanotechnology.

    PubMed

    Lewinski, Nastassja A; McInnes, Bridget T

    2015-01-01

    Literature in the field of nanotechnology is exponentially increasing with more and more engineered nanomaterials being created, characterized, and tested for performance and safety. With the deluge of published data, there is a need for natural language processing approaches to semi-automate the cataloguing of engineered nanomaterials and their associated physico-chemical properties, performance, exposure scenarios, and biological effects. In this paper, we review the different informatics methods that have been applied to patent mining, nanomaterial/device characterization, nanomedicine, and environmental risk assessment. Nine natural language processing (NLP)-based tools were identified: NanoPort, NanoMapper, TechPerceptor, a Text Mining Framework, a Nanodevice Analyzer, a Clinical Trial Document Classifier, Nanotoxicity Searcher, NanoSifter, and NEIMiner. We conclude with recommendations for sharing NLP-related tools through online repositories to broaden participation in nanoinformatics.

  5. Tool path strategy and cutting process monitoring in intelligent machining

    NASA Astrophysics Data System (ADS)

    Chen, Ming; Wang, Chengdong; An, Qinglong; Ming, Weiwei

    2018-06-01

    Intelligent machining is a current focus in advanced manufacturing technology and is characterized by high accuracy and efficiency. A central technology of intelligent machining, online monitoring and optimization of the cutting process, is urgently needed for mass production. In this research, cutting process online monitoring and optimization in jet engine impeller machining, cranio-maxillofacial surgery, and hydraulic servo valve deburring are introduced as examples of intelligent machining. Results show that intelligent tool path optimization and cutting process online monitoring are efficient techniques for improving the efficiency, quality, and reliability of machining.

  6. DATA QUALITY OBJECTIVES-FOUNDATION OF A SUCCESSFUL MONITORING PROGRAM

    EPA Science Inventory

    The data quality objectives (DQO) process is a fundamental site characterization tool and the foundation of a successful monitoring program. The DQO process is a systematic planning approach based on the scientific method of inquiry. The process identifies the goals of data col...

  7. Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing

    NASA Astrophysics Data System (ADS)

    Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander

    2005-09-01

    The use of infrared spectroscopy for production semiconductor process monitoring has recently evolved from primarily unpatterned (i.e., blanket test wafer) measurements, in a limited historical application space of blanket epitaxial, BPSG, and FSG layers, to new applications involving patterned product wafer measurements and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates, and high-aspect-ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address these challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition, and high-throughput ultra-clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model-based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high-density dynamic random access memory (DRAM) chips.
We will explore the capability of the tool for characterizing multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.
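The dispersion-model analysis described above amounts to fitting a parameterized optical model to a measured reflectance spectrum. Below is a minimal sketch for a single transparent layer at normal incidence; the refractive indices, wavelength grid, and brute-force thickness search are illustrative assumptions, not the tool's actual algorithm:

```python
import numpy as np

def film_reflectance(n_film, d_nm, wavelength_nm, n_ambient=1.0, n_substrate=3.42):
    """Normal-incidence reflectance of a single film on a thick substrate,
    via Fresnel coefficients and the film's phase thickness."""
    r01 = (n_ambient - n_film) / (n_ambient + n_film)
    r12 = (n_film - n_substrate) / (n_film + n_substrate)
    phase = np.exp(-2j * (2 * np.pi * n_film * d_nm / wavelength_nm))
    r = (r01 + r12 * phase) / (1 + r01 * r12 * phase)
    return np.abs(r) ** 2

def fit_thickness(wavelength_nm, measured_R, n_film, candidates_nm):
    """Pick the candidate thickness whose modeled spectrum best matches
    the measurement (brute-force least squares)."""
    errors = [np.sum((film_reflectance(n_film, d, wavelength_nm) - measured_R) ** 2)
              for d in candidates_nm]
    return candidates_nm[int(np.argmin(errors))]

# Recover a known 500 nm thickness from a synthetic spectrum.
wl = np.linspace(2000.0, 10000.0, 200)          # mid-IR wavelengths, nm
target = film_reflectance(1.45, 500.0, wl)      # synthetic "measured" spectrum
print(fit_thickness(wl, target, 1.45, np.arange(100.0, 1001.0, 50.0)))  # 500.0
```

A production analysis would instead use dispersion models (wavelength-dependent complex indices), full multilayer stacks, and a nonlinear optimizer rather than a grid search.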

  8. Engineering Property Prediction Tools for Tailored Polymer Composite Structures (49465)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Kunc, Vlastimil

    2009-12-29

    Process and constitutive models as well as characterization tools and testing methods were developed to determine stress-strain responses, damage development, strengths and creep of long-fiber thermoplastics (LFTs). The developed models were implemented in Moldflow and ABAQUS and have been validated against LFT data obtained experimentally.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunshah, R.F.; Shabaik, A.H.

The process of Activated Reactive Evaporation is used to synthesize superhard materials like carbides, oxides, nitrides, and ultrafine-grain cermets. The deposits are characterized by hardness, microstructure, microprobe analysis for chemistry, and lattice parameter measurements. The synthesis and characterization of TiC-Ni cermets and Al/sub 2/O/sub 3/ are given. High speed steel tools coated with TiC, TiC-Ni, and TaC are tested for machining performance at different speeds and feeds. The machining evaluation and the selection of coatings are based on the rate of deterioration of the coating, tool temperature, and cutting forces. Tool life tests show coated high speed steel tools having 150 to 300% improvement in tool life compared to uncoated tools. Variability in the quality of the ground edge on high speed steel inserts produces a great scatter in the machining evaluation data.

  10. Non Destructive Analysis of Fsw Welds using Ultrasonic Signal Analysis

    NASA Astrophysics Data System (ADS)

    Pavan Kumar, T.; Prabhakar Reddy, P.

    2017-08-01

Friction stir welding is an evolving metal joining technique, mostly used to join materials that cannot easily be joined by other available welding techniques; it can also be used to weld dissimilar materials. Because no filler material is used, the strength of the weld joint is determined by how the materials intermix, so the intermixing is of significant importance. The complication with the friction stir welding process is that many process parameters affect this intermixing, such as tool geometry, tool rotating speed, and traverse speed. In this study an attempt is made to compare the material flow and weld quality of various weldments by changing these parameters. Ultrasonic signal analysis is used to characterize the microstructure of the weldments; the use of ultrasonic waves is a nondestructive, accurate, and fast way of characterizing microstructure. In this method the relationships between the ultrasonically measured parameters and the microstructures are evaluated using background-echo and backscattered-signal processing techniques. The ultrasonic velocity and attenuation measurements depend on the elastic modulus, and any change in the microstructure is reflected in the ultrasonic velocity. An insight into material flow is essential to determine the quality of the weld; hence an attempt is made in this study to relate tool geometry to the pattern of material flow and the resulting weld quality. Experiments are conducted to weld dissimilar aluminum alloys, and the weldments are characterized using ultrasonic signal processing as well as scanning electron microscopy. A good correlation is observed between the ultrasonic signal processing results and the scanning electron microscopy observations of the precipitates. Tensile tests and hardness tests are conducted on the weldments and compared to determine the weld quality.

  11. Investigation of fatigue strength of tool steels in sheet-bulk metal forming

    NASA Astrophysics Data System (ADS)

    Pilz, F.; Gröbel, D.; Merklein, M.

    2018-05-01

To encounter trends regarding the efficient production of complex functional components in forming technology, the process class of sheet-bulk metal forming (SBMF) can be applied. SBMF is characterized by the application of bulk forming operations on sheet metal, often in combination with sheet forming operations [1]. The combination of these conventional process classes leads to locally varying load conditions, which cause high tool loads, leading to reduced tool life, and an uncontrolled material flow. Several studies have shown that locally modified tool surfaces, so-called tailored surfaces, have the potential to control the material flow and thus to increase the die filling of functional elements [2]. The combination of these modified tool surfaces and the high tool loads in SBMF is, however, critical for tool life and leads to fatigue. Tool fatigue is hardly predictable and, due to a lack of data [3], remains a challenge in tool design. Thus, it is necessary to provide such data for tool steels used in SBMF. The aim of this study is to investigate the influence of tailored surfaces on the fatigue strength of the powder-metallurgical tool steel ASP2023 (1.3344, AISI M3:2), which is typically used in cold forging applications, with a hardness of 60 ± 1 HRC. The rotating bending test is chosen for this investigation. As tailored surfaces, a DLC coating and a surface manufactured by a high-feed milling process are chosen; a polished surface, typical of cold forging tools, serves as reference. Before the rotating bending test, the surface integrity is characterized by measuring topography and residual stresses. After testing, the determined values of the surface integrity are correlated with the fracture load cycles reached in order to derive functional relations. Based on the results, the investigated tailored surfaces are evaluated regarding their feasibility for modifying tool surfaces within SBMF.

  12. Using natural language processing techniques to inform research on nanotechnology

    PubMed Central

    Lewinski, Nastassja A

    2015-01-01

    Summary Literature in the field of nanotechnology is exponentially increasing with more and more engineered nanomaterials being created, characterized, and tested for performance and safety. With the deluge of published data, there is a need for natural language processing approaches to semi-automate the cataloguing of engineered nanomaterials and their associated physico-chemical properties, performance, exposure scenarios, and biological effects. In this paper, we review the different informatics methods that have been applied to patent mining, nanomaterial/device characterization, nanomedicine, and environmental risk assessment. Nine natural language processing (NLP)-based tools were identified: NanoPort, NanoMapper, TechPerceptor, a Text Mining Framework, a Nanodevice Analyzer, a Clinical Trial Document Classifier, Nanotoxicity Searcher, NanoSifter, and NEIMiner. We conclude with recommendations for sharing NLP-related tools through online repositories to broaden participation in nanoinformatics. PMID:26199848
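To illustrate the kind of semi-automated cataloguing such NLP tools perform, here is a toy extraction pass over one abstract sentence; the regex patterns are hypothetical stand-ins for the trained named-entity recognizers that tools like NanoSifter actually use:

```python
import re

# Toy patterns: real nanoinformatics pipelines use trained NER models;
# these regexes only illustrate the cataloguing idea.
MATERIAL = re.compile(r"\b(TiO2|ZnO|silver|gold|carbon nanotube)s?\b", re.IGNORECASE)
SIZE = re.compile(r"(\d+(?:\.\d+)?)\s*nm\b")

def extract_record(sentence):
    """Pull (material, size-in-nm) mentions out of one abstract sentence."""
    return {
        "materials": [m.group(1).lower() for m in MATERIAL.finditer(sentence)],
        "sizes_nm": [float(m.group(1)) for m in SIZE.finditer(sentence)],
    }

print(extract_record("Spherical gold nanoparticles of 15 nm were characterized."))
# {'materials': ['gold'], 'sizes_nm': [15.0]}
```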

  13. Real time software tools and methodologies

    NASA Technical Reports Server (NTRS)

    Christofferson, M. J.

    1981-01-01

Real time systems are characterized by high-speed processing and throughput as well as asynchronous event processing requirements. These requirements give rise to particular implementations of parallel or pipeline multitasking structures, of intertask or interprocess communications mechanisms, and finally of message (buffer) routing or switching mechanisms. These mechanisms or structures, along with the data structure, describe the essential character of the system. These common structural elements and mechanisms are identified, and their implementation in the form of routines, tasks, or macros - in other words, as tools - is formalized. The tools developed support or make available the following: reentrant task creation, generalized message routing techniques, generalized task structures/task families, standardized intertask communications mechanisms, and pipeline and parallel processing architectures in a multitasking environment. Tool development raises some interesting prospects in the areas of software instrumentation and software portability. These issues are discussed following the description of the tools themselves.
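The pipeline multitasking and intertask communication mechanisms described above can be sketched in modern terms as worker tasks connected by message queues (a Python analogy, not the original 1981 implementation):

```python
import queue
import threading

def stage(fn, inbox, outbox):
    """A generic pipeline stage: transform each message and route it onward.
    A None message is the shutdown sentinel and is propagated downstream."""
    while True:
        msg = inbox.get()
        if msg is None:
            outbox.put(None)
            return
        outbox.put(fn(msg))

# Two stages wired together by intertask message queues.
q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
threading.Thread(target=stage, args=(lambda x: x * 2, q_in, q_mid), daemon=True).start()
threading.Thread(target=stage, args=(lambda x: x + 1, q_mid, q_out), daemon=True).start()

for item in [1, 2, 3]:
    q_in.put(item)
q_in.put(None)

results = []
while (msg := q_out.get()) is not None:
    results.append(msg)
print(results)  # [3, 5, 7]
```

The queues play the role of the standardized intertask communications mechanism; the sentinel propagation is one simple message-routing convention among many possible.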

  14. An Aerodynamic Simulation Process for Iced Lifting Surfaces and Associated Issues

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Vickerman, Mary B.; Hackenberg, Anthony W.; Rigby, David L.

    2003-01-01

    This paper discusses technologies and software tools that are being implemented in a software toolkit currently under development at NASA Glenn Research Center. Its purpose is to help study the effects of icing on airfoil performance and assist with the aerodynamic simulation process which consists of characterization and modeling of ice geometry, application of block topology and grid generation, and flow simulation. Tools and technologies for each task have been carefully chosen based on their contribution to the overall process. For the geometry characterization and modeling, we have chosen an interactive rather than automatic process in order to handle numerous ice shapes. An Appendix presents features of a software toolkit developed to support the interactive process. Approaches taken for the generation of block topology and grids, and flow simulation, though not yet implemented in the software, are discussed with reasons for why particular methods are chosen. Some of the issues that need to be addressed and discussed by the icing community are also included.

  15. Modeling and evaluating of surface roughness prediction in micro-grinding on soda-lime glass considering tool characterization

    NASA Astrophysics Data System (ADS)

    Cheng, Jun; Gong, Yadong; Wang, Jinsheng

    2013-11-01

Current research on micro-grinding mainly focuses on the optimal processing technology for different materials. However, the material removal mechanism in micro-grinding is the basis for achieving a high-quality processed surface. Therefore, a novel method for predicting surface roughness in micro-grinding of hard brittle materials, considering the grain protrusion topography of the micro-grinding tool, is proposed in this paper. The differences in material removal mechanism between the conventional grinding process and the micro-grinding process are analyzed. Topography characterization has been performed on micro-grinding tools fabricated by electroplating. Models of grain density generation and grain interval are built, and a new predictive model of micro-grinding surface roughness is developed. To verify the precision and applicability of the proposed surface roughness prediction model, an orthogonal micro-grinding experiment on soda-lime glass is designed and conducted. A series of micro-machined surfaces of the brittle material with roughness ranging from 78 nm to 0.98 μm is achieved. The experimental roughness results and the predicted roughness data coincide closely, and the component variable describing the size effect in the predictive model is calculated to be 1.5×10⁷ by an inverse method based on the experimental results. The proposed model builds a set of distributions to account for grain densities at different protrusion heights. Finally, the micro-grinding tools used in the experiment are characterized based on this distribution set. The significant coincidence between the surface prediction data from the proposed model and the experimental measurements demonstrates the effectiveness of the model, providing a theoretical and experimental reference for the material removal mechanism in micro-grinding of soda-lime glass.
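As a rough illustration of how a grain-protrusion distribution can feed a roughness estimate, the sketch below draws grain heights from a normal distribution and measures the residual profile; it is a hypothetical simplification, not the paper's calibrated model (which includes the 1.5×10⁷ size-effect constant and measured electroplated-tool topography):

```python
import numpy as np

rng = np.random.default_rng(0)

def predicted_ra(mean_protrusion_um, std_um, depth_of_cut_um, n_grains=10000):
    """Estimate arithmetic roughness Ra from a grain-protrusion distribution:
    draw protrusion heights, keep the grains that engage the surface beyond
    the nominal cutting plane, and take the mean absolute deviation of the
    residual profile they leave behind."""
    heights = rng.normal(mean_protrusion_um, std_um, n_grains)
    residual = heights[heights > depth_of_cut_um] - depth_of_cut_um
    return float(np.mean(np.abs(residual - residual.mean())))

# A narrower protrusion-height distribution predicts a smoother surface.
print(predicted_ra(5.0, 1.0, 4.0) > predicted_ra(5.0, 0.2, 4.0))  # True
```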

  16. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selecting informative characteristics from the signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and of nonlinear time series analysis are used. With the aim of modeling the relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning process, such as chip form, tool wear, and the onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The characterization can be further improved by joining characteristics from multiple sensors and by applying chaotic characteristics.
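A minimal example of extracting a few informative characteristics from one sensor record, in the spirit of the spectral-analysis stage described above (the sampling rate, synthetic test tone, and chosen features are illustrative assumptions):

```python
import numpy as np

def spectral_features(signal, fs):
    """A few informative characteristics of one sensor record: dominant
    frequency, spectral centroid, and RMS amplitude."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    rms = np.sqrt(np.mean(np.square(signal)))
    return {"dominant_hz": float(dominant), "centroid_hz": float(centroid), "rms": float(rms)}

# A chatter-like record: a 1.2 kHz tone buried in noise.
fs = 20000
t = np.arange(0, 0.1, 1.0 / fs)
sig = np.sin(2 * np.pi * 1200 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(spectral_features(sig, fs)["dominant_hz"])
```

Characteristics like these would then feed the adaptive empirical modeler that maps them to process states such as tool wear or chatter onset.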

  17. Measurement technique for in situ characterizing aberrations of projection optics in lithographic tools.

    PubMed

    Wang, Fan; Wang, Xiangzhao; Ma, Mingying

    2006-08-20

    As the feature size decreases, degradation of image quality caused by wavefront aberrations of projection optics in lithographic tools has become a serious problem in the low-k1 process. We propose a novel measurement technique for in situ characterizing aberrations of projection optics in lithographic tools. Considering the impact of the partial coherence illumination, we introduce a novel algorithm that accurately describes the pattern displacement and focus shift induced by aberrations. Employing the algorithm, the measurement condition is extended from three-beam interference to two-, three-, and hybrid-beam interferences. The experiments are performed to measure the aberrations of projection optics in an ArF scanner.

  18. Developing Decontamination Tools and Approaches to ...

    EPA Pesticide Factsheets

Developing Decontamination Tools and Approaches to Address Indoor Pesticide Contamination from Improper Bed Bug Treatments. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA strategic plan. More specifically, the division conducts research to characterize the movement of pollutants from the source to contact with humans. This multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  19. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
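The RMSECV figures quoted above come from cross-validation. A sketch of leave-one-out RMSECV for a simple least-squares fit illustrates the metric; the authors' actual models are multivariate calibrations on CT data, not this toy:

```python
import numpy as np

def loo_rmsecv(x, y):
    """Leave-one-out root-mean-square error of cross-validation for an
    ordinary least-squares line fit (illustrates the RMSECV metric only)."""
    n = len(y)
    sq_errors = []
    for i in range(n):
        mask = np.arange(n) != i
        A = np.column_stack([x[mask], np.ones(n - 1)])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        pred = coef[0] * x[i] + coef[1]          # predict the held-out sample
        sq_errors.append((y[i] - pred) ** 2)
    return float(np.sqrt(np.mean(sq_errors)))

def r_squared(y, y_pred):
    """Coefficient of determination, as quoted for each predictive model."""
    ss_res = np.sum((y - y_pred) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# A perfectly linear relationship cross-validates with essentially zero error.
x = np.arange(10.0)
y = 2.0 * x + 3.0
print(loo_rmsecv(x, y) < 1e-8)  # True
```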

  20. Tribological investigations of the applicability of surface functionalization for dry extrusion processes

    NASA Astrophysics Data System (ADS)

    Teller, Marco; Prünte, Stephan; Ross, Ingo; Temmler, André; Schneider, Jochen M.; Hirt, Gerhard

    2017-10-01

Cold extrusion processes are characterized by large relative contact stresses combined with severe surface enlargement of the workpiece. Under these process conditions there is a high risk of workpiece material galling onto the tool steel, especially in the processing of aluminum and aluminum alloys. To reduce adhesive wear, lubricants are used to separate the workpiece and tool surfaces; as a consequence, additional process steps (e.g., preparation and cleaning of workpieces) are necessary. Thus, the realization of a dry forming process is desirable from an environmental and economic perspective. In this paper a surface functionalization with self-assembled monolayers (SAMs) of the tool steels AISI D2 (DIN 1.2379) and AISI H11 (DIN 1.2343) is evaluated by a process-oriented tribological test. The tribological experiment resembles and scales the process conditions of cold extrusion with respect to relative contact stress and surface enlargement for the forming of pure aluminum (Al99.5). The effect of reduced relative contact stress, surface enlargement, and relative velocity on adhesive wear and tool lifetime is evaluated. Similar process conditions are achievable with different die designs having decreased extrusion ratios and adjusted die angles. The effect of the surface functionalization depends critically on the substrate material: the different microstructures and the resulting differences in surface chemistry of the two tested tool steels appear to affect the performance of the tool surface functionalization with SAMs.

  1. Study of the joining of polycarbonate panels in butt joint configuration through friction stir welding

    NASA Astrophysics Data System (ADS)

    Astarita, Antonello; Boccarusso, Luca; Carrino, Luigi; Durante, Massimo; Minutolo, Fabrizio Memola Capece; Squillace, Antonino

    2018-05-01

Polycarbonate sheets, 3 mm thick, were successfully friction stir welded in butt joint configuration. To study the feasibility of the process and the influence of the process parameters, joints were produced under different processing conditions obtained by varying the tool rotational speed and the tool travel speed. Tensile tests were carried out to characterize the joints, and the forces arising during the process were recorded and carefully studied. The experimental outcomes proved the feasibility of the process when the process parameters are properly set: joints retaining more than 70% of the UTS of the base material were produced. The trend of the forces was described and explained, and the influence of the process parameters was also discussed.

  2. Recycling-Oriented Product Characterization for Electric and Electronic Equipment as a Tool to Enable Recycling of Critical Metals

    NASA Astrophysics Data System (ADS)

    Rotter, Vera Susanne; Chancerel, Perrine; Ueberschaar, Maximilian

To establish a knowledge base for new recycling processes for critical elements, recycling-oriented product characterization of Electric and Electronic Equipment (EEE) can be used as a tool. This paper focuses on the data and procedures necessary for a successful characterization and provides information about existing scientific work. The use of this tool is illustrated for two applications: hard disk drives (HDDs) and liquid crystal display (LCD) panels. In the first case it is shown that neodymium and other rare earth elements are concentrated in the magnets (25% by weight) and contribute largely to the end demand for neodymium; nevertheless, recycling is limited by the difficult liberation and by competing target metals contained in HDDs. In the second case it is shown that the indium used is likewise concentrated in the LCDs, but unlike in the magnets the concentration is low (200 ppm). The design of LCDs, with two glued glass layers and the indium tin oxide layer in between, makes the indium inaccessible for hydrometallurgical recovery, while the glass content places energetic limitations on pyrometallurgical processes. Future technical development of recycling infrastructure requires an in-depth understanding of product design and of the recycling-relevant parameters for product characterization, focusing on new target metals. This product-centered approach also allows traditional "design for recycling" approaches to be rethought.

  3. High-Resolution Characterization of UMo Alloy Microstructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devaraj, Arun; Kovarik, Libor; Joshi, Vineet V.

    2016-11-30

This report highlights the capabilities and procedures for high-resolution characterization of UMo fuels at PNNL. Uranium-molybdenum (UMo) fuel processing steps, from casting to forming the final fuel, directly affect the microstructure of the fuel, which in turn dictates the in-reactor performance of the fuel under irradiation. In order to understand the influence of processing on UMo microstructure, microstructure characterization techniques are necessary. Higher-resolution techniques like transmission electron microscopy (TEM) and atom probe tomography (APT) are needed to interrogate the details of the microstructure. The findings from TEM and APT are also directly beneficial for developing predictive multiscale modeling tools that can predict the microstructure as a function of process parameters. This report provides background on focused-ion-beam-based TEM and APT sample preparation, TEM and APT analysis procedures, and the unique information achievable through such advanced characterization capabilities for UMo fuels, from a fuel fabrication capability viewpoint.

  4. Tribological characterization of the drill pipe tool joints reconditioned by using welding technologies

    NASA Astrophysics Data System (ADS)

    Caltaru, M.; Badicioiu, M.; Ripeanu, R. G.; Dinita, A.; Minescu, M.; Laudacescu, E.

    2018-01-01

A drill pipe is a seamless steel pipe with upset ends fitted with special threaded ends known as tool joints. During drilling operations, the wall thickness of the drill pipe and the outside diameter of the tool joints are gradually reduced by wear. The present research work investigates the possibility of reconditioning drill pipe tool joints by hardbanding with a new metal-cored, coppered, Cr-Mo-alloyed flux-cored wire using the gas metal active welding process, taking into consideration two different hardbanding technologies: (A) hardbanding the drill pipe tool joints after removing the old hardbanding material and reconstructing the surface with a compensation material, and (B) hardbanding the drill pipe tool joints without removing the old hardbanding material. The paper presents experimental research on the tribological characterization of the reconditioned drill pipe tool joints, comprising macroscopic analyses, metallographic analyses, Vickers hardness measurements, chemical composition measurements, and wear tests conducted on ball-on-disk friction couples, in order to certify the quality of the hardbanding obtained by the different technological approaches and to validate the optimum technology.

  5. Validating Signs and Symptoms From An Actual Mass Casualty Incident to Characterize An Irritant Gas Syndrome Agent (IGSA) Exposure: A First Step in The Development of a Novel IGSA Triage Algorithm.

    PubMed

    Culley, Joan M; Richter, Jane; Donevant, Sara; Tavakoli, Abbas; Craig, Jean; DiNardi, Salvatore

    2017-07-01

• Chemical exposures pose a significant daily threat to life. Rapid assessment by first responders/emergency nurses is required to reduce death and disability. Currently, no informatics tools for irritant gas syndrome agent (IGSA) exposures exist to process victims efficiently, continuously monitor for latent signs/symptoms, or make triage recommendations. • This study uses actual patient data from a chemical incident to characterize and validate the signs/symptoms of an IGSA syndrome. Validating signs/symptoms is the first step in developing new emergency department (ED) informatics tools with the potential to revolutionize the process by which emergency nurses triage victims of chemical incidents. This study describes that first step: validation of the signs/symptoms that characterize an IGSA syndrome. Data abstracted from 146 patients treated for chlorine exposure in one emergency department during a 2005 train derailment, and from 152 patients not exposed to chlorine (a comparison group), were mapped to 93 possible signs/symptoms within 2 tools (WISER and CHEMM-IST) designed to assist emergency responders/emergency nurses with managing hazardous material exposures. Inferential statistics (χ²/Fisher's exact test) and diagnostic tests were used to examine the mapped signs/symptoms of persons who were and were not exposed to chlorine.
Three clusters of signs/symptoms are statistically associated with an IGSA syndrome (P < .01): respiratory (shortness of breath, wheezing, coughing, and choking); chest discomfort (tightness, pain, and burning); and eye, nose, and/or throat (pain, irritation, and burning). The syndrome requires the presence of signs/symptoms from at least 2 of these clusters, and the latency period must also be considered for exposed/potentially exposed persons. Copyright © 2017 Emergency Nurses Association. Published by Elsevier Inc. All rights reserved.
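The two-of-three-cluster rule stated above translates directly into a triage check; the symptom strings below are paraphrases, not the study's exact coded terms:

```python
# Sign/symptom clusters from the validated IGSA syndrome; the member
# strings are paraphrases, not the study's exact coded terms.
CLUSTERS = {
    "respiratory": {"shortness of breath", "wheezing", "coughing", "choking"},
    "chest discomfort": {"chest tightness", "chest pain", "chest burning"},
    "eye/nose/throat": {"ent pain", "ent irritation", "ent burning"},
}

def meets_igsa_syndrome(reported):
    """Syndrome rule: signs/symptoms present from at least 2 of the 3 clusters."""
    reported = set(reported)
    hits = sum(1 for members in CLUSTERS.values() if members & reported)
    return hits >= 2

print(meets_igsa_syndrome({"coughing", "chest pain"}))  # True
print(meets_igsa_syndrome({"wheezing", "choking"}))     # False: one cluster only
```

A deployed tool would also track latency, since symptoms may appear after initial assessment.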

  6. Chromatography process development in the quality by design paradigm I: Establishing a high-throughput process development platform as a tool for estimating "characterization space" for an ion exchange chromatography step.

    PubMed

    Bhambure, R; Rathore, A S

    2013-01-01

This article describes the development of a high-throughput process development (HTPD) platform for developing chromatography steps. An assessment of the platform as a tool for establishing the "characterization space" for an ion exchange chromatography step has been performed using design of experiments. Case studies involving a biotech therapeutic, granulocyte colony-stimulating factor, are used to demonstrate the performance of the platform. We discuss the various challenges that arise when working at such small volumes, along with the solutions we propose to make the HTPD data suitable for empirical modeling. Further, we have validated the scalability of this platform by comparing the results from the HTPD platform (2 and 6 μL resin volumes) against those obtained at the traditional laboratory scale (0.5 mL resin volume). We find that, after integration of the proposed correction factors, the HTPD platform is capable of performing process optimization studies at 170-fold higher productivity. The platform provides a semi-quantitative assessment of the effects of the various input parameters under consideration. We think that a platform such as the one presented is an excellent tool for examining the "characterization space" and reducing the extensive experimentation at the traditional lab scale that is otherwise required for establishing the "design space." Thus, this platform will specifically aid in the successful implementation of quality by design in biotech process development, which is especially significant in view of the constraints on time and resources that the biopharma industry faces today. Copyright © 2013 American Institute of Chemical Engineers.
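The design-of-experiments assessment mentioned above starts from an experiment grid. A sketch of a two-level full-factorial design over hypothetical chromatography inputs (factor names and levels are invented for illustration, not the study's actual ranges):

```python
from itertools import product

# Hypothetical inputs for an ion exchange step; real studies would use
# the characterized ranges and often a fractional or response-surface design.
factors = {
    "pH": [6.0, 8.0],
    "load_g_per_L": [20, 40],
    "gradient_cv": [10, 20],
}

# One run per combination of factor levels.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for run in design:
    print(run)
print(len(design))  # 2**3 = 8 runs
```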

  7. A flexible system for the estimation of infiltration and hydraulic resistance parameters in surface irrigation

    USDA-ARS?s Scientific Manuscript database

    Critical to the use of modeling tools for the hydraulic analysis of surface irrigation systems is characterizing the infiltration and hydraulic resistance process. Since those processes are still not well understood, various formulations are currently used to represent them. A software component h...

  8. Biology Diagrams: Tools To Think With.

    ERIC Educational Resources Information Center

    Kindfield, Ann C. H.

    Subcellular processes like meiosis are frequently problematic for learners because they are complex and, except for the extent that they can be observed under a light microscope, occur outside of our direct experience. More detailed characterization of what underlies various degrees of student understanding of a process is required to more fully…

  9. Manufacturing and advanced characterization of sub-25nm diameter CD-AFM probes with sub-10nm tip edges radius

    NASA Astrophysics Data System (ADS)

    Foucher, Johann; Filippov, Pavel; Penzkofer, Christian; Irmer, Bernd; Schmidt, Sebastian W.

    2013-04-01

Atomic force microscopy (AFM) is increasingly used in the semiconductor industry as a versatile monitoring tool for highly critical lithography and etching process steps. Applications range from inspection of the surface roughness of new materials, over accurate depth measurements, to the determination of critical dimension structures. The aim of addressing the rapidly growing demands on measurement uncertainty and throughput shifts the focus of attention more and more to the AFM tip, which represents the crucial link between the AFM tool and the sample to be monitored. Consequently, in order to reach the AFM tool's full potential, the performance of the AFM tip has to be considered a determining parameter. Currently available AFM tips made from silicon are generally limited by their diameter, radius, and sharpness, considerably restricting AFM measurement capabilities on sub-30nm spaces. In addition, there is a lack of adequate characterization structures to accurately characterize sub-25nm tip diameters. Here we present and discuss a recently introduced AFM tip design (a T-shaped design) with precise tip diameters down to 15 nm and tip radii down to 5 nm, fabricated from amorphous, high-density diamond-like carbon (HDC/DLC) using electron beam induced processing (EBIP). In addition to that advanced design, we propose a new characterizer structure, which allows for accurate characterization and design control of sub-25nm tip diameters and sub-10nm tip edge radii. We demonstrate the potential advantages of combining a small tip design, i.e. tip diameter and tip edge radius, with an advanced tip characterizer for the semiconductor industry by measuring advanced lithography patterns.

  10. How To Produce and Characterize Transgenic Plants.

    ERIC Educational Resources Information Center

    Savka, Michael A.; Wang, Shu-Yi; Wilson, Mark

    2002-01-01

    Explains the process of establishing transgenic plants which is a very important tool in plant biology and modern agriculture. Produces transgenic plants with the ability to synthesize opines. (Contains 17 references.) (YDS)

  11. Characterization of delamination and transverse cracking in graphite/epoxy laminates by acoustic emission

    NASA Technical Reports Server (NTRS)

    Garg, A.; Ishai, O.

    1983-01-01

    Efforts to characterize and differentiate between two major failure processes in graphite/epoxy composites, transverse cracking and Mode I delamination, are described. Representative laminates were tested in uniaxial tension and flexure. The failure processes were monitored and identified by acoustic emission (AE). The effect of moisture on AE was also investigated. Each damage process was found to have a distinctive AE output that is significantly affected by moisture conditions. It is concluded that AE can serve as a useful tool for detecting and identifying failure modes in composite structures in laboratory and in-service environments.

  12. Doing accelerator physics using SDDS, UNIX, and EPICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borland, M.; Emery, L.; Sereno, N.

    1995-12-31

    The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Control System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization.
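
    The tool-oriented approach hinges on data files that describe their own structure, so generic programs work across experiments. A minimal toy stand-in for such a self-describing table (not the actual SDDS syntax) might look like:

```python
# Minimal self-describing table: '# column <name>' header lines declare the
# schema, then whitespace-separated numeric rows follow. A generic tool can
# process any such file without per-experiment code. (Toy format only --
# the real SDDS protocol is considerably richer.)

def parse_table(text):
    columns, rows = [], []
    for line in text.strip().splitlines():
        if line.startswith("# column "):
            columns.append(line.split()[2])
        elif line and not line.startswith("#"):
            values = [float(v) for v in line.split()]
            rows.append(dict(zip(columns, values)))
    return columns, rows

sample = """\
# column s
# column x_orbit
0.0  0.12
1.5  -0.07
3.0  0.03
"""
columns, rows = parse_table(sample)
print(columns, rows[0])
```

    Because each record carries its column names, downstream tools (plotting, fitting, trending) need no knowledge of which experiment produced the file.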

  13. Intentional defect array wafers: their practical use in semiconductor control and monitoring systems

    NASA Astrophysics Data System (ADS)

    Emami, Iraj; McIntyre, Michael; Retersdorf, Michael

    2003-07-01

    In the competitive world of semiconductor manufacturing today, control of the process and manufacturing equipment is paramount to the success of the business. Consistent with the need for rapid development of process technology is a need for development with respect to equipment control, including defect metrology tools. Historical control methods for defect metrology tools included a raw count of defects detected on a characterized production or test wafer, with little or no regard to the attributes of the detected defects. Over time, these characterized wafers degrade with multiple passes on the tools and handling, requiring the tool owner to create and characterize new samples periodically. With the complex engineering software analysis systems used today, there is a strong reliance on the accuracy of defect size, location, and classification in order to provide the best value when correlating in-line data to sort-type data. Intentional Defect Array (IDA) wafers were designed and manufactured at International Sematech (ISMT) in Austin, Texas, and are a product of collaboration between ISMT member companies and suppliers of advanced defect inspection equipment. These wafers provide the user with known defect types and sizes in predetermined locations across the entire wafer. The wafers are designed to incorporate several desired flows and use critical dimensions consistent with current and future technology nodes. This paper briefly describes the design of the IDA wafer and details many practical applications in the control of advanced defect inspection equipment.

  14. Automated Tow Placement Processing and Characterization of Composites

    NASA Technical Reports Server (NTRS)

    Prabhakaran, R.

    2004-01-01

    One of the initial objectives of the project was automated tow placement (ATP), in which a robot is used to place a collimated band of pre-impregnated ribbons or a wide preconsolidated tape onto a tool surface. It was proposed to utilize the Automated Tow Placement machine that was already available and to fabricate carbon fiber reinforced PEEK (polyether-ether-ketone) matrix composites. After initial experiments with the fabrication of flat plates, composite cylinders were to be fabricated. Specimens from the fabricated parts were to be tested for mechanical characterization. A second objective was to conduct various types of tests for characterizing composite specimens cured by different fabrication processes.

  15. Microstructural Evolution in Friction Stir Welding of Ti-6Al-4V

    NASA Technical Reports Server (NTRS)

    Rubisoff, H.; Querin, J.; Magee, D.; Schneider, J.

    2008-01-01

    Friction stir welding (FSW) is a thermo-mechanical process that utilizes a nonconsumable rotating pin tool to consolidate a weld joint. In the conventional FSW process, the pin tool is responsible for generating both the heat required to soften the material and the forces necessary to deform and combine the weld seam. As such, the geometry of the pin tool is important to the quality of the weld and the process parameters required to produce the weld. Because the geometry of the pin tool is limitless, a reduced set of pin tools was formed to systematically study their effect on the weldment with respect to mechanical properties and resultant microstructure. In this study 0°, 15°, 30°, 45°, and 60° tapered, microwave sintered, tungsten carbide (WC) pin tools were used to FSW Ti-6Al-4V. Transverse sections of the weld were used to test for mechanical properties and to document the microstructure using optical microscopy. X-ray diffraction (XRD) was also used to characterize the microstructure in the welds. FSW results for the 45° and 60° pin tools are reported in this paper.

  16. Tools for understanding landscapes: combining large-scale surveys to characterize change. Chapter 9.

    Treesearch

    W. Keith Moser; Janine Bolliger; Don C. Bragg; Mark H. Hansen; Mark A. Hatfield; Timothy A. Nigh; Lisa A. Schulte

    2008-01-01

    All landscapes change continuously. Since change is perceived and interpreted through measures of scale, any quantitative analysis of landscapes must identify and describe the spatiotemporal mosaics shaped by large-scale structures and processes. This process is controlled by core influences, or "drivers," that shape the change and affect the outcome...

  17. Wavelet Applications for Flight Flutter Testing

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty; Freudinger, Lawrence C.

    1999-01-01

    Wavelets present a method for signal processing that may be useful for analyzing responses of dynamical systems. This paper describes several wavelet-based tools that have been developed to improve the efficiency of flight flutter testing. One of the tools uses correlation filtering to identify properties of several modes throughout a flight test for envelope expansion. Another tool uses features in time-frequency representations of responses to characterize nonlinearities in the system dynamics. A third tool uses modulus and phase information from a wavelet transform to estimate modal parameters that can be used to update a linear model and reduce conservatism in robust stability margins.

  18. Machining of Fibre Reinforced Plastic Composite Materials.

    PubMed

    Caggiano, Alessandra

    2018-03-18

    Fibre reinforced plastic composite materials are difficult to machine because of the anisotropy and inhomogeneity characterizing their microstructure and the abrasiveness of their reinforcement components. During machining, very rapid cutting tool wear development is experienced, and surface integrity damage is often produced in the machined parts. An accurate selection of the proper tool and machining conditions is therefore required, taking into account that the phenomena responsible for material removal in cutting of fibre reinforced plastic composite materials are fundamentally different from those of conventional metals and their alloys. To date, composite materials are increasingly used in several manufacturing sectors, such as the aerospace and automotive industry, and several research efforts have been spent to improve their machining processes. In the present review, the key issues concerning the machining of fibre reinforced plastic composite materials are discussed with reference to the main recent research works in the field, considering both conventional and unconventional machining processes and reporting the most recent research achievements. For the different machining processes, the main results characterizing the recent research works and the trends for process developments are presented.

  19. Machining of Fibre Reinforced Plastic Composite Materials

    PubMed Central

    2018-01-01

    Fibre reinforced plastic composite materials are difficult to machine because of the anisotropy and inhomogeneity characterizing their microstructure and the abrasiveness of their reinforcement components. During machining, very rapid cutting tool wear development is experienced, and surface integrity damage is often produced in the machined parts. An accurate selection of the proper tool and machining conditions is therefore required, taking into account that the phenomena responsible for material removal in cutting of fibre reinforced plastic composite materials are fundamentally different from those of conventional metals and their alloys. To date, composite materials are increasingly used in several manufacturing sectors, such as the aerospace and automotive industry, and several research efforts have been spent to improve their machining processes. In the present review, the key issues concerning the machining of fibre reinforced plastic composite materials are discussed with reference to the main recent research works in the field, considering both conventional and unconventional machining processes and reporting the most recent research achievements. For the different machining processes, the main results characterizing the recent research works and the trends for process developments are presented. PMID:29562635

  20. Achieving optimum diffraction based overlay performance

    NASA Astrophysics Data System (ADS)

    Leray, Philippe; Laidler, David; Cheng, Shaunee; Coogans, Martyn; Fuchs, Andreas; Ponomarenko, Mariya; van der Schaar, Maurits; Vanoppen, Peter

    2010-03-01

    Diffraction Based Overlay (DBO) metrology has been shown to have significantly reduced Total Measurement Uncertainty (TMU) compared to Image Based Overlay (IBO), primarily due to having no measurable Tool Induced Shift (TIS). However, the advantages of having no measurable TIS can be outweighed by increased susceptibility to WIS (Wafer Induced Shift) caused by target damage, process non-uniformities and variations. The path to optimum DBO performance lies in having well characterized metrology targets, which are insensitive to process non-uniformities and variations, in combination with optimized recipes which take advantage of advanced DBO designs. In this work we examine the impact of different degrees of process non-uniformity and target damage on DBO measurement gratings and study their impact on overlay measurement accuracy and precision. Multiple wavelength and dual polarization scatterometry are used to characterize the DBO design performance over the range of process variation. In conclusion, we describe the robustness of DBO metrology to target damage and show how to exploit the measurement capability of a multiple wavelength, dual polarization scatterometry tool to ensure the required measurement accuracy for current and future technology nodes.
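
    The idea behind DBO can be sketched with the common first-order two-bias model; the sensitivity K, bias d, and numbers below are textbook-style assumptions, not the recipe used in this work:

```python
# First-order DBO sketch: two grating pads carry programmed offsets +d and -d,
# and each measured intensity asymmetry is assumed linear in the total shift,
#   A(+/-) = K * (OV +/- d),
# so the unknown sensitivity K cancels when solving for the overlay OV.
# Simplified textbook model; real recipes also optimize wavelength/polarization.

def overlay_from_asymmetry(a_plus, a_minus, bias):
    return bias * (a_plus + a_minus) / (a_plus - a_minus)

K, true_ov, d = 2.0, 3.0, 20.0          # arbitrary demo values (nm)
a_plus = K * (true_ov + d)
a_minus = K * (true_ov - d)
print(overlay_from_asymmetry(a_plus, a_minus, d))
```

    Because K drops out, the extraction is nominally tool-shift-free, which mirrors the "no measurable TIS" advantage discussed above; target damage breaks the linearity assumption and shows up as WIS.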

  1. Landsat 8 on-orbit characterization and calibration system

    USGS Publications Warehouse

    Micijevic, Esad; Morfitt, Ron; Choate, Michael J.

    2011-01-01

    The Landsat Data Continuity Mission (LDCM) is planning to launch the Landsat 8 satellite in December 2012, which continues an uninterrupted record of consistently calibrated globally acquired multispectral images of the Earth started in 1972. The satellite will carry two imaging sensors: the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). The OLI will provide visible, near-infrared and short-wave infrared data in nine spectral bands while the TIRS will acquire thermal infrared data in two bands. Both sensors have a pushbroom design and consequently, each has a large number of detectors to be characterized. Image and calibration data downlinked from the satellite will be processed by the U.S. Geological Survey (USGS) Earth Resources Observation and Science (EROS) Center using the Landsat 8 Image Assessment System (IAS), a component of the Ground System. In addition to extracting statistics from all Earth images acquired, the IAS will process and trend results from analysis of special calibration acquisitions, such as solar diffuser, lunar, shutter, night, lamp and blackbody data, and preselected calibration sites. The trended data will be systematically processed and analyzed, and calibration and characterization parameters will be updated using both automatic and customized manual tools. This paper describes the analysis tools and the system developed to monitor and characterize on-orbit performance and calibrate the Landsat 8 sensors and image data products.

  2. Characterization of photosynthetically active duckweed (Wolffia australiana) in vitro culture by Respiration Activity Monitoring System (RAMOS).

    PubMed

    Rechmann, Henrik; Friedrich, Andrea; Forouzan, Dara; Barth, Stefan; Schnabl, Heide; Biselli, Manfred; Boehm, Robert

    2007-06-01

    The feasibility of oxygen transfer rate (OTR) measurement to non-destructively monitor plant propagation and vitality of a photosynthetically active plant in vitro culture of duckweed (Wolffia australiana, Lemnaceae) was tested using the Respiration Activity Monitoring System (RAMOS). As a result, OTR proved to be a sensitive indicator of plant vitality. The culture characterization under day/night light conditions, however, revealed a complex interaction between oxygen production and consumption, rendering OTR measurement an unsuitable tool to track plant propagation. However, RAMOS was found to be a useful tool in preliminary studies for process development of photosynthetically active plant in vitro cultures.

  3. Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Doll, William E.; Beard, Les P.

    2009-01-01

    Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process and the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods needed to address the estimation problems are known, but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.

  4. Implementation of Systematic Review Tools in IRIS | Science ...

    EPA Pesticide Factsheets

    Currently, the number of chemicals present in the environment exceeds the ability of public health scientists to efficiently screen the available data in order to produce well-informed human health risk assessments in a timely manner. For this reason, the US EPA’s Integrated Risk Information System (IRIS) program has started implementing new software tools into the hazard characterization workflow. These automated tools aid in multiple phases of the systematic review process, including scoping and problem formulation, literature search, and identification and screening of available published studies. The increased availability of these tools lays the foundation for automating or semi-automating multiple phases of the systematic review process. Some of these software tools include modules to facilitate a structured approach to study quality evaluation of human and animal data, although approaches are generally lacking for assessing complex mechanistic information, in particular “omics”-based evidence; tools are starting to become available to evaluate these types of studies. We will highlight how new software programs, online tools, and approaches for assessing study quality can be better integrated to allow for a more efficient and transparent workflow of the risk assessment process, as well as identify tool gaps that would benefit future risk assessments. Disclaimer: The views expressed here are those of the authors and do not necessarily represent the view

  5. A study of the relationship between the performance and dependability of a fault-tolerant computer

    NASA Technical Reports Server (NTRS)

    Goswami, Kumar K.

    1994-01-01

    This thesis studies the relationship between performance and dependability by creating a tool (FTAPE) that integrates a high-stress workload generator with fault injection, and by using the tool to evaluate system performance under error conditions. The workloads are comprised of processes which are formed from atomic components that represent CPU, memory, and I/O activity. The fault injector is software-implemented and is capable of injecting faults into any memory-addressable location, including special registers and caches. This tool has been used to study a Tandem Integrity S2 computer. Workloads with varying numbers of processes and varying compositions of CPU, memory, and I/O activity are first characterized in terms of performance. Then faults are injected into these workloads. The results show that as the number of concurrent processes increases, the mean fault latency initially increases due to increased contention for the CPU. However, for even higher numbers of processes (more than 3 processes), the mean latency decreases because long-latency faults are paged out before they can be activated.
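
    A minimal sketch of this style of software-implemented fault injection, with illustrative names and structure rather than FTAPE's actual design:

```python
import random

# Toy sketch of software-implemented fault injection: flip one bit in a
# simulated memory, then drive a synthetic access pattern and count reads
# until the corrupted word is touched (the fault latency). All names and
# structure here are illustrative assumptions, not the actual FTAPE tool.

def inject_fault(memory, rng):
    addr = rng.randrange(len(memory))
    memory[addr] ^= 1 << rng.randrange(16)   # single-bit flip in one word
    return addr

def fault_latency(addr, accesses):
    for latency, a in enumerate(accesses, start=1):
        if a == addr:
            return latency
    return None                              # never activated (e.g. paged out)

rng = random.Random(7)
memory = [0] * 64
addr = inject_fault(memory, rng)
accesses = [rng.randrange(64) for _ in range(500)]
print(addr, fault_latency(addr, accesses))
```

    Varying the length and locality of the access pattern plays the role of varying workload composition: heavier contention delays activation, while words evicted before use yield latent (never-activated) faults.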

  6. Digital modeling of end-mill cutting tools for FEM applications from the active cutting contour

    NASA Astrophysics Data System (ADS)

    Salguero, Jorge; Marcos, M.; Batista, M.; Gómez, A.; Mayuet, P.; Bienvenido, R.

    2012-04-01

    A very current technique in the field of machining by material removal is the use of simulations based on the Finite Element Method (FEM). Nevertheless, although it is widely used in processes that allow approximations to orthogonal cutting, such as shaping, it is scarcely used in more complex processes, such as milling. This is due principally to the complex geometry of the cutting tools in these processes and the need to carry out the studies in an oblique cutting configuration. This paper shows a methodology for the geometrical characterization of commercial end-mill cutting tools by extraction of the cutting tool contour, making use of optical metrology, and using this geometry to model the active cutting zone with 3D CAD software. This model is easily exportable to different CAD formats, such as IGES or STEP, and importable from FEM software, where it is possible to study the in-service behavior of the tools.

  7. Improved maize reference genome with single-molecule technologies

    USDA-ARS?s Scientific Manuscript database

    Complete and accurate reference genomes and annotations provide fundamental tools for characterization of genetic and functional variation. These resources facilitate elucidation of biological processes and support translation of research findings into improved and sustainable agricultural technolog...

  8. Soil chemical insights provided through vibrational spectroscopy

    USDA-ARS?s Scientific Manuscript database

    Vibrational spectroscopy techniques provide a powerful approach to study environmental materials and processes. These multifunctional analysis tools can be used to probe molecular vibrations of solid, liquid, and gaseous samples for characterizing materials, elucidating reaction mechanisms, and exam...

  9. Surface plasmon resonance as a tool for ligand-binding assay reagent characterization in bioanalysis of biotherapeutics.

    PubMed

    Duo, Jia; Bruno, JoAnne; Kozhich, Alexander; David-Brown, Donata; Luo, Linlin; Kwok, Suk; Santockyte, Rasa; Haulenbeek, Jonathan; Liu, Rong; Hamuro, Lora; Peterson, Jon E; Piccoli, Steven; DeSilva, Binodh; Pillutla, Renuka; Zhang, Yan J

    2018-04-01

    Ligand-binding assay (LBA) performance depends on quality reagents. Strategic reagent screening and characterization is critical to LBA development, optimization and validation. Application of advanced technologies expedites the reagent screening and assay development process. By evaluating surface plasmon resonance technology that offers high-throughput kinetic information, this article aims to provide perspectives on applying the surface plasmon resonance technology to strategic LBA critical reagent screening and characterization supported by a number of case studies from multiple biotherapeutic programs.

  10. Characterization and measurement of polymer wear

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.; Aron, P. R.

    1984-01-01

    Analytical tools which characterize the polymer wear process are discussed. These include visual observation of polymer wear with SEM; quantification of wear with surface profilometry and ellipsometry; study of wear chemistry with AES, XPS, and SIMS; establishment of interfacial polymer orientation, and accordingly bonding, with QUARTIR; characterization of polymer state with Raman spectroscopy; and measurement of the stresses that develop in polymer films using an X-ray double-crystal camera technique.

  11. Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: SmaggIce 2D

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.

    2001-01-01

    Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary discretization, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work-in-progress, and planned features of the software toolkit are presented here.

  12. Power-law statistics of neurophysiological processes analyzed using short signals

    NASA Astrophysics Data System (ADS)

    Pavlova, Olga N.; Runnova, Anastasiya E.; Pavlov, Alexey N.

    2018-04-01

    We discuss the problem of quantifying power-law statistics of complex processes from short signals. Based on the analysis of electroencephalograms (EEG), we compare three interrelated approaches which enable characterization of the power spectral density (PSD) and show that application of detrended fluctuation analysis (DFA) or the wavelet-transform modulus maxima (WTMM) method represents a useful way of indirectly characterizing the PSD features from short data sets. We conclude that although DFA- and WTMM-based measures can be obtained from the estimated PSD, these tools outperform standard spectral analysis when characterization of the analyzed regime must be provided from a very limited amount of data.
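
    A compact pure-Python sketch of DFA, one of the approaches compared above (the implementation details are generic, not those of the paper):

```python
import math
import random

# Sketch of detrended fluctuation analysis (DFA): integrate the signal,
# detrend it linearly in windows of size n, and read the scaling exponent
# alpha from the slope of log F(n) vs log n. For white noise alpha should
# come out near 0.5; for 1/f noise, near 1.

def dfa_alpha(x, scales):
    mean = sum(x) / len(x)
    profile, s = [], 0.0
    for v in x:                              # integrate mean-subtracted signal
        s += v - mean
        profile.append(s)
    log_n, log_f = [], []
    for n in scales:
        sq_sum, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            # linear least-squares detrend over t = 0 .. n-1
            t_mean = (n - 1) / 2.0
            s_mean = sum(seg) / n
            den = sum((t - t_mean) ** 2 for t in range(n))
            slope = sum((t - t_mean) * (v - s_mean)
                        for t, v in enumerate(seg)) / den
            intercept = s_mean - slope * t_mean
            sq_sum += sum((v - (intercept + slope * t)) ** 2
                          for t, v in enumerate(seg))
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(sq_sum / count))
    xm = sum(log_n) / len(log_n)             # alpha = slope of the log-log fit
    ym = sum(log_f) / len(log_f)
    return (sum((a - xm) * (b - ym) for a, b in zip(log_n, log_f))
            / sum((a - xm) ** 2 for a in log_n))

rng = random.Random(1)
noise = [rng.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_alpha(noise, [8, 16, 32, 64, 128])
print(round(alpha, 2))
```

    The appeal for short records, as the abstract notes, is that alpha is fitted from window statistics rather than from a full spectral estimate.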

  13. Fabrication and Characterization of High Temperature Resin/Carbon Nanofiber Composites

    NASA Technical Reports Server (NTRS)

    Ghose, Sayata; Watson, Kent A.; Working, Dennis C.; Criss, Jim M.; Siochi, Emilie J.; Connell, John W.

    2005-01-01

    As part of ongoing efforts to develop multifunctional advanced composites, blends of PETI-330 and carbon nanofibers (CNF) were prepared and characterized. Dry mixing techniques were employed and the effect of CNF loading level on melt viscosity was determined. The resulting powders were characterized for degree of mixing and for thermal and rheological properties. Based on the characterization results, samples containing 30 and 40 wt% CNF were scaled up to approximately 300 g and used to fabricate moldings 10.2 cm x 15.2 cm x 0.32 cm thick. The moldings were fabricated by injecting the mixtures at 260-280 °C into a stainless steel tool followed by curing for 1 h at 371 °C. The tool was designed to impart high shear during the injection process in an attempt to achieve some alignment of CNFs in the flow direction. Moldings were obtained that were subsequently characterized for thermal, mechanical, and electrical properties. The degree of dispersion and alignment of CNFs were investigated using high-resolution scanning electron microscopy. The preparation and preliminary characterization of PETI-330/CNF composites will be discussed.

  14. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Bjorkman, Gerry; Kooney, Alex; Russell, Carolyn

    2003-01-01

    The weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is then characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.
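
    The mapping procedure can be sketched as a small search over trial schedules; all weld-schedule numbers below are invented placeholders, not data from the study:

```python
# Sketch of a 2-D process map: each trial weld is a (rpm, ipm, acceptable)
# record; the acceptable region forms the process envelope, and its centroid
# is taken here as the nominal schedule ("sweet spot"). Hypothetical numbers.

trials = [
    (200, 4, False), (200, 6, False), (200, 8, False),
    (300, 4, True),  (300, 6, True),  (300, 8, False),
    (400, 4, True),  (400, 6, True),  (400, 8, True),
]
envelope = [(rpm, ipm) for rpm, ipm, ok in trials if ok]
sweet_spot = (sum(r for r, _ in envelope) / len(envelope),
              sum(i for _, i in envelope) / len(envelope))
print(envelope)
print(sweet_spot)
```

    Picking an interior point of the envelope, rather than a point near its boundary, is what makes the nominal schedule robust to small shifts in the manufacturing environment.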

  15. Friction stir processing on high carbon steel U12

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarasov, S. Yu., E-mail: tsy@ispms.ru; Rubtsov, V. E., E-mail: rvy@ispms.ru; National Research Tomsk Polytechnic University, Tomsk, 634050

    2015-10-27

    Friction stir processing (FSP) of high carbon steel (U12) samples has been carried out using a milling machine and tools made of cemented tungsten carbide. The FSP tool tip was made with dimensions of 5×5×1.5 mm. Microstructural characterization of the obtained stir zone and heat-affected zone has been carried out. Microhardness at the level of 700 MPa has been obtained in the stir zone, with a microstructure consisting of large grains and a cementite network. This high level of microhardness is explained by a bainitic reaction developing from decarburization of austenitic grains during cementite network formation.

  16. Joint properties of a tool machining process to guarantee fluid-proof abilities

    NASA Astrophysics Data System (ADS)

    Bataille, C.; Deltombe, R.; Jourani, A.; Bigerelle, M.

    2017-12-01

    This study addressed the impact of rod surface topography in contact with reciprocating seals. Rods were tooled with and without centreless grinding. All rods tooled with centreless grinding were fluid-proof, in contrast to rods tooled without centreless grinding, which either had leaks or were fluid-proof. A method was developed to analyse the machining signature, and the software Mesrug™ was used in order to discriminate roughness parameters that can be used to characterize the sealing functionality. According to this surface roughness analysis, a fluid-proof rod tooled without centreless grinding presents aperiodic large plateaus, and the relevant roughness parameter for characterizing the sealing functionality is the density of summits Sds. Increasing the density of summits counteracts leakage, which may be because motif decomposition integrates three topographical components: circularity (perpendicular long-wave roughness), longitudinal waviness, and roughness, extracted via the Wolf pruning algorithm. A 3D analytical contact model was applied to analyse the contact area of each type of sample with the seal surface. This model provides a leakage probability, and the results were consistent with the interpretation of the topographical analysis.
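
    The density-of-summits parameter can be approximated on a discrete height map by counting strict 8-neighbour local maxima per unit area; this generic sketch is an assumption, not the exact definition used by Mesrug™:

```python
# Sketch of a density-of-summits estimate (an Sds-style parameter): count
# strict 8-neighbour local maxima of a sampled height map and divide by the
# measured area. Generic textbook definition with illustrative data.

def summit_density(z, dx=1.0, dy=1.0):
    rows, cols = len(z), len(z[0])
    summits = 0
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            h = z[i][j]
            if all(h > z[i + di][j + dj]
                   for di in (-1, 0, 1) for dj in (-1, 0, 1)
                   if (di, dj) != (0, 0)):
                summits += 1
    area = (rows - 1) * dx * (cols - 1) * dy
    return summits / area

z = [[0.0] * 5 for _ in range(5)]
z[1][1] = 1.0                    # two isolated peaks on a flat field
z[3][3] = 1.0
print(summit_density(z))
```

    On real data the sampling steps dx and dy come from the profilometer, and a height threshold is usually added so measurement noise does not register as summits.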

  17. Nondestructive SEM for surface and subsurface wafer imaging

    NASA Technical Reports Server (NTRS)

    Propst, Roy H.; Bagnell, C. Robert; Cole, Edward I., Jr.; Davies, Brian G.; Dibianca, Frank A.; Johnson, Darryl G.; Oxford, William V.; Smith, Craig A.

    1987-01-01

    The scanning electron microscope (SEM) is considered as a tool for both failure analysis as well as device characterization. A survey is made of various operational SEM modes and their applicability to image processing methods on semiconductor devices.

  18. Process Definition and Modeling Guidebook. Version 01.00.02

    DTIC Science & Technology

    1992-12-01

    material (and throughout the guidebook) process definition is considered to be the act of representing the important characteristics of a process in a...characterized by software standards and guidelines, software inspections and reviews, and more formalized testing (including test plans, test support tools...paper-based approach works well for training, examples, and possibly even small pilot projects and case studies. However, large projects will benefit from

  19. Machine tools error characterization and compensation by on-line measurement of artifact

    NASA Astrophysics Data System (ADS)

    Wahid Khan, Abdul; Chen, Wuyi; Wu, Lili

    2009-11-01

    Most manufacturing machine tools are utilized for mass production or batch production with high accuracy under a deterministic manufacturing principle. The volumetric accuracy of machine tools depends on the positional accuracy of the cutting tool, probe, or end effector relative to the workpiece in the workspace volume. In this research paper, a methodology is presented for volumetric calibration of machine tools by on-line measurement of an artifact or an object of a similar type. The machine tool geometric error characterization was carried out with a standard or an artifact having geometry similar to the mass-production or batch-production product. The artifact was measured at an arbitrary position in the volumetric workspace with a calibrated Renishaw touch-trigger probe system. Positional errors were stored in a computer for compensation purposes, so that the manufacturing batch could then be run through compensated codes. This methodology was found quite effective for manufacturing high-precision components with greater dimensional accuracy and reliability. Calibration by on-line measurement offers the advantage of improving the manufacturing process by use of the deterministic manufacturing principle and was found efficient and economical, but is limited to the workspace or envelope surface of the measured artifact's geometry or profile.
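
    The compensation step can be sketched as a piecewise-linear error map built from artifact measurements; the positions, errors, and function names below are hypothetical:

```python
# Sketch of error-map compensation: measure an artifact's features, tabulate
# the positional error (measured - nominal), interpolate it, and pre-subtract
# it from commanded positions. Valid only inside the calibrated envelope,
# mirroring the limitation noted in the abstract. Numbers are hypothetical.

nominal  = [0.0, 50.0, 100.0, 150.0, 200.0]       # mm, artifact feature positions
measured = [0.0, 50.02, 100.05, 150.03, 200.08]   # mm, probe readings
error_map = list(zip(nominal, [m - n for n, m in zip(nominal, measured)]))

def error_at(x):
    """Piecewise-linear interpolation of the stored positional error."""
    for (x0, e0), (x1, e1) in zip(error_map, error_map[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return e0 + t * (e1 - e0)
    raise ValueError("position outside the calibrated envelope")

def compensate(x):
    """Commanded position that should land the tool at nominal x."""
    return x - error_at(x)

print(compensate(75.0))
```

    In practice the map is three-dimensional over the workspace volume, and the compensated coordinates are written back into the part program for the whole batch.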

  20. Filtration Characterization Method as Tool to Assess Membrane Bioreactor Sludge Filterability—The Delft Experience

    PubMed Central

    Lousada-Ferreira, Maria; Krzeminski, Pawel; Geilvoet, Stefan; Moreau, Adrien; Gil, Jose A.; Evenblij, Herman; van Lier, Jules B.; van der Graaf, Jaap H. J. M.

    2014-01-01

    Prevention and removal of fouling is often the most energy-intensive process in Membrane Bioreactors (MBRs), responsible for 40% to 50% of the total specific energy consumed in submerged MBRs. In the past decade, methods were developed to quantify and qualify fouling, aiming to support optimization of MBR operation. There is therefore a need to evaluate the lessons learned and decide how to proceed. In this article, five different methods for measuring MBR activated sludge filterability and critical flux are described, discussed and evaluated. Both parameters characterize the fouling potential in full-scale MBRs. The article focuses on the Delft Filtration Characterization method (DFCm) as a convenient tool to characterize sludge properties, namely on data processing, accuracy, reproducibility, reliability, and applicability, defining the boundaries of the DFCm. Significant progress was made concerning fouling measurements, in particular by using straightforward approaches focused on the applicability of the obtained results. Nevertheless, a fouling measurement method has yet to be defined that is unequivocal in its definition of fouling parameters; practical and simple in terms of set-up and operation; and broad and useful in terms of the results obtained. A step forward would be the standardization of such a method to assess sludge filtration quality. PMID:24957174
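    The resistance bookkeeping underlying such filterability measurements follows Darcy's law, with the added (fouling) resistance obtained as the difference between the total and clean-membrane resistances. A minimal sketch follows; the TMP, flux and viscosity values are invented for illustration and are not DFCm data.

    ```python
    MU = 1.0e-3   # Pa*s, approximate permeate (water) viscosity near 20 degC

    def total_resistance(tmp_pa, flux_m_per_s):
        """Total filtration resistance from Darcy's law: R = TMP / (mu * J), in 1/m."""
        return tmp_pa / (MU * flux_m_per_s)

    # Same driving pressure, but flux halves once sludge is filtered:
    R_membrane = total_resistance(tmp_pa=10_000.0, flux_m_per_s=2.0e-5)  # clean water
    R_total    = total_resistance(tmp_pa=10_000.0, flux_m_per_s=1.0e-5)  # with sludge
    delta_R = R_total - R_membrane   # added cake/fouling resistance, 1/m
    ```

    Filterability methods of this kind typically report how fast such an added resistance builds up per unit of permeate produced.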

  1. Stand-Alone Measurements and Characterization | Photovoltaic Research |

    Science.gov Websites

    The Science and Technology Facility cluster tools offer powerful capabilities for measuring and characterizing samples. The stand-alone measurement and characterization tool suite is supplemented by the Integrated Measurements and Characterization (M&C) cluster tool. Samples can be moved to the Integrated M&C cluster tool using a mobile transport pod, which can keep them under vacuum.

  2. Friction Stir Process Mapping Methodology

    NASA Technical Reports Server (NTRS)

    Kooney, Alex; Bjorkman, Gerry; Russell, Carolyn; Smelser, Jerry (Technical Monitor)

    2002-01-01

    In FSW (friction stir welding), the weld process performance for a given weld joint configuration and tool setup is summarized on a 2-D plot of RPM vs. IPM. A process envelope is drawn within the map to identify the range of acceptable welds. The sweet spot is selected as the nominal weld schedule. The nominal weld schedule is characterized in the expected manufacturing environment. The nominal weld schedule in conjunction with process control ensures a consistent and predictable weld performance.
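    The map-and-envelope procedure above can be sketched as follows. The RPM/IPM grid and the pass/fail grades are invented for illustration, and the centroid of the acceptable region stands in for the "sweet spot" selection (in practice the nominal schedule is chosen by engineering judgment, not a centroid).

    ```python
    import numpy as np

    rpm = np.array([200, 300, 400, 500])   # spindle speeds tested
    ipm = np.array([2, 4, 6, 8])           # travel speeds tested

    # quality[i, j] is True if the weld at (rpm[i], ipm[j]) passed inspection.
    quality = np.array([
        [False, False, False, False],
        [False, True,  True,  False],
        [False, True,  True,  True ],
        [False, False, True,  False],
    ])

    ii, jj = np.nonzero(quality)       # schedules inside the process envelope
    sweet_rpm = rpm[ii].mean()         # centroid of the acceptable region
    sweet_ipm = ipm[jj].mean()
    ```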

  3. Emerging developments in the standardized chemical characterization of indoor air quality.

    PubMed

    Nehr, Sascha; Hösen, Elisabeth; Tanabe, Shin-Ichi

    2017-01-01

    Despite the fact that the special characteristics of indoor air pollution make closed environments quite different from outdoor environments, the conceptual ideas for assessing air quality indoors and outdoors are similar. Therefore, the elaboration of International Standards for air quality characterization, in view of controlling indoor air quality, should build on this common basis. In this short review we describe the possibilities for standardizing tools dedicated to indoor air quality characterization, with a focus on the tools that permit the study of indoor air chemistry. The link between indoor exposure and health, as well as the critical processes driving indoor air quality, are introduced. Available International Standards for the assessment of indoor air quality are described. The standards comprise requirements for on-site sampling, the analytical procedures, and the determination of material emissions. To date, these standardized procedures assure that indoor air, settled dust and material samples are analyzed in a comparable manner. However, existing International Standards exclusively specify conventional, event-driven target screening using discontinuous measurement methods for long-lived pollutants. Therefore, this review draws a parallel between physico-chemical processes in indoor and outdoor environments. The achievements of atmospheric sciences also improve our understanding of indoor environments, and the community of atmospheric scientists can serve as both a model and a source of support for researchers in the area of indoor air quality characterization. This short review concludes with propositions for future standardization activities for the chemical characterization of indoor air quality.
Future standardization efforts should focus on: (i) the elaboration of standardized measurement methods and measurement strategies for online monitoring of long-lived and short-lived pollutants, (ii) the assessment of the potential and the limitations of non-target screening, (iii) the paradigm shift from event-driven investigations to systematic approaches to characterize indoor environments, and (iv) the development of tools for policy implementation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. NEW GIS WATERSHED ANALYSIS TOOLS FOR SOIL CHARACTERIZATION AND EROSION AND SEDIMENTATION MODELING

    EPA Science Inventory

    A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed which utilizes a suite of automated scripts and a pair of processing-intensive executable programs operating on a personal computer platform.
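    Erosion tools of this kind typically build on the Universal Soil Loss Equation, A = R * K * LS * C * P. A minimal sketch with illustrative factor values follows; the values are not tied to the EPA procedure described above.

    ```python
    def usle_soil_loss(R, K, LS, C, P):
        """Annual soil loss A (t/ha/yr) from rainfall erosivity R, soil
        erodibility K, slope length-steepness factor LS, cover-management
        factor C, and support-practice factor P."""
        return R * K * LS * C * P

    # Illustrative factor values for a single hillslope cell:
    A = usle_soil_loss(R=1200.0, K=0.3, LS=1.5, C=0.2, P=0.5)
    ```

    GIS implementations evaluate this cell by cell over a raster watershed and then route the resulting sediment toward the outlet with a delivery ratio.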

  5. Spray-formed tooling

    NASA Astrophysics Data System (ADS)

    McHugh, K. M.; Key, J. F.

    The United States Council for Automotive Research (USCAR) has formed a partnership with the Idaho National Engineering Laboratory (INEL) to develop a process for the rapid production of low-cost tooling based on spray forming technology developed at the INEL. Phase 1 of the program will involve bench-scale system development, materials characterization, and process optimization. In Phase 2, prototype systems will be designed, constructed, evaluated, and optimized. Process control and other issues that influence commercialization will be addressed during this phase of the project. Technology transfer to USCAR, or a tooling vendor selected by USCAR, will be accomplished during Phase 3. The approach INEL is using to produce tooling, such as plastic injection molds and stamping dies, combines rapid solidification processing and net-shape materials processing into a single step. A bulk liquid metal is pressure-fed into a de Laval spray nozzle transporting a high velocity, high temperature inert gas. The gas jet disintegrates the metal into fine droplets and deposits them onto a tool pattern made from materials such as plastic, wax, clay, ceramics, and metals. The approach is compatible with solid freeform fabrication techniques such as stereolithography, selective laser sintering, and laminated object manufacturing. Heat is extracted rapidly, in-flight, by convection as the spray jet entrains cool inert gas to produce undercooled and semi-solid droplets. At the pattern, the droplets weld together while replicating the shape and surface features of the pattern. Tool formation is rapid; deposition rates in excess of 1 ton/h have been demonstrated for bench-scale nozzles.

  6. An open data mining framework for the analysis of medical images: application on obstructive nephropathy microscopy images.

    PubMed

    Doukas, Charalampos; Goudas, Theodosis; Fischer, Simon; Mierswa, Ingo; Chatziioannou, Aristotle; Maglogiannis, Ilias

    2010-01-01

    This paper presents an open image-mining framework that provides access to tools and methods for the characterization of medical images. Several image processing and feature extraction operators have been implemented and exposed through Web Services. RapidMiner, an open-source data mining system, has been utilized for applying classification operators and creating the essential processing workflows. The proposed framework has been applied to the detection of salient objects in obstructive nephropathy microscopy images. Initial classification results are quite promising, demonstrating the feasibility of automated characterization of kidney biopsy images.

  7. Genomics screens for metastasis genes

    PubMed Central

    Yan, Jinchun; Huang, Qihong

    2014-01-01

    Metastasis is responsible for most cancer mortality. The process of metastasis is complex, requiring the coordinated expression and fine regulation of many genes in multiple pathways in both the tumor and host tissues. Identification and characterization of the genetic programs that regulate metastasis is critical to understanding the metastatic process and discovering molecular targets for the prevention and treatment of metastasis. Genomic approaches and functional genomic analyses can systemically discover metastasis genes. In this review, we summarize the genetic tools and methods that have been used to identify and characterize the genes that play critical roles in metastasis. PMID:22684367

  8. Electromagnetic nondestructive evaluation of tempering process in AISI D2 tool steel

    NASA Astrophysics Data System (ADS)

    Kahrobaee, Saeed; Kashefi, Mehrdad

    2015-05-01

    The present paper investigates the potential of using eddy current technique as a reliable nondestructive tool to detect microstructural changes during the different stages of tempering treatment in AISI D2 tool steel. Five stages occur in tempering of the steel: precipitation of ɛ carbides, formation of cementite, retained austenite decomposition, secondary hardening effect and spheroidization of carbides. These stages were characterized by destructive methods, including dilatometry, differential scanning calorimetry, X-ray diffraction, scanning electron microscopic observations, and hardness measurements. The microstructural changes alter the electrical resistivity/magnetic saturation, which, in turn, influence the eddy current signals. Two EC parameters, induced voltage sensed by pickup coil and impedance point detected by excitation coil, were evaluated as a function of tempering temperature to characterize the microstructural features, nondestructively. The study revealed that a good correlation exists between the EC parameters and the microstructural changes.

  9. The Linguistic Correlates of Conversational Deception: Comparing Natural Language Processing Technologies

    ERIC Educational Resources Information Center

    Duran, Nicholas D.; Hall, Charles; McCarthy, Philip M.; McNamara, Danielle S.

    2010-01-01

    The words people use and the way they use them can reveal a great deal about their mental states when they attempt to deceive. The challenge for researchers is how to reliably distinguish the linguistic features that characterize these hidden states. In this study, we use a natural language processing tool called Coh-Metrix to evaluate deceptive…

  10. State-of-the-art characterization techniques for advanced lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Wu, Tianpin; Amine, Khalil

    2017-03-01

    To meet future needs for industries from personal devices to automobiles, state-of-the-art rechargeable lithium-ion batteries will require both improved durability and lowered costs. To enhance battery performance and lifetime, understanding electrode degradation mechanisms is of critical importance. Various advanced in situ and operando characterization tools developed during the past few years have proven indispensable for optimizing battery materials, understanding cell degradation mechanisms, and ultimately improving the overall battery performance. Here we review recent progress in the development and application of advanced characterization techniques such as in situ transmission electron microscopy for high-performance lithium-ion batteries. Using three representative electrode systems—layered metal oxides, Li-rich layered oxides and Si-based or Sn-based alloys—we discuss how these tools help researchers understand the battery process and design better battery systems. We also summarize the application of the characterization techniques to lithium-sulfur and lithium-air batteries and highlight the importance of those techniques in the development of next-generation batteries.

  11. Detection and characterization of exercise induced muscle damage (EIMD) via thermography and image processing

    NASA Astrophysics Data System (ADS)

    Avdelidis, N. P.; Kappatos, V.; Georgoulas, G.; Karvelis, P.; Deli, C. K.; Theodorakeas, P.; Giakas, G.; Tsiokanos, A.; Koui, M.; Jamurtas, A. Z.

    2017-04-01

    Exercise induced muscle damage (EIMD) is usually experienced by i) people who have been physically inactive for prolonged periods of time and then begin sudden training trials, and ii) athletes who train beyond their normal limits. EIMD is not easy to detect and quantify with common measurement tools and methods. Thermography has been used successfully as a research detection tool in medicine for the last six decades, but very limited work has been reported in the EIMD area. The main purpose of this research is to assess and characterize EIMD using thermography and image processing techniques. The first step towards that goal is to develop a reliable segmentation technique to isolate the region of interest (ROI). A semi-automatic image processing software tool was designed, and regions of the left and right leg were segmented based on superpixels. The image is segmented into a number of regions, and the user is able to intervene by indicating which regions belong to each of the two legs. In order to validate the image processing software, an extensive experimental investigation was carried out, acquiring thermographic images of the rectus femoris muscle before, immediately post, and 24, 48 and 72 hours after an acute bout of eccentric exercise (5 sets of 15 maximum repetitions) in males and females (20-30 years old). Results indicate that the semi-automated approach provides an excellent benchmark that can be used as a reliable clinical tool.

  12. Dry rotary swaging with structured and coated tools

    NASA Astrophysics Data System (ADS)

    Herrmann, Marius; Schenck, Christian; Kuhfuss, Bernd

    2018-05-01

    Rotary swaging is a cold bulk forming process for manufacturing complex bar and tube profiles, such as axles and gear shafts in the automotive industry. Conventional rotary swaging is carried out under intense use of lubricant, usually mineral-oil based. Besides lubrication itself, the lubricant fulfills further necessary functions such as flushing and cooling, but it generates costs for recycling, replacement and cleaning of the workpieces. Hence, the development of a dry process design is highly desirable from both economic and ecological points of view, and requires the functions of the lubricant to be substituted. This was realized by applying newly developed a-C:H:W coating systems to the tools to minimize friction and avoid adhesion effects. With the application of a deterministic structure in the forging zone of the tools, the friction conditions are modified to control the axial process forces. In this study, infeed rotary swaging with functionalized tools was experimentally investigated: steel and aluminum tubes were formed with and without lubricant, and different structures, both coated and uncoated, were implemented in the reduction zone of the tools. The antagonistic effects of coating and structuring were characterized by measuring the axial process force and the resulting workpiece quality in terms of roundness and surface roughness. The presented results thus allow for further developments towards a dry rotary swaging process.

  13. High Temperature Resin/Carbon Nanotube Composite Fabrication

    NASA Technical Reports Server (NTRS)

    Ghose, Sayata; Watson, Kent A.; Sun, Keun J.; Criss, Jim M.; Siochi, Emilie J.; Connell, John W.

    2006-01-01

    For the purpose of incorporating multifunctionality into advanced composites, blends of phenylethynyl terminated imides-330 (PETI-330) and multi-walled carbon nanotubes (MWCNTs) were prepared, characterized and fabricated into moldings. PETI-330/MWCNT mixtures were prepared at concentrations ranging from 3 to 25 weight percent by dry mixing the components in a ball mill. The resulting powders were characterized for degree of mixing, thermal and rheological properties. Based on the characterization results, PETI-330/MWCNT samples were scaled up to approximately 300 g and used to fabricate moldings by injecting the mixtures at 260-280 deg C into a stainless steel tool followed by curing for 1 h at 371 deg C. The tool was designed to impart a degree of shear during the injection process in an attempt to achieve some alignment of the MWCNTs in the flow direction. Obtained moldings were subsequently characterized for thermal, mechanical, and electrical properties. The degree of dispersion and alignment of MWCNTs were investigated using high-resolution scanning electron microscopy. The preparation and preliminary characterization of PETI-330/MWCNT composites will be discussed.

  14. Robot based deposition of WC-Co HVOF coatings on HSS cutting tools as a substitution for solid cemented carbide cutting tools

    NASA Astrophysics Data System (ADS)

    Tillmann, W.; Schaak, C.; Biermann, D.; Aßmuth, R.; Goeke, S.

    2017-03-01

    Cemented carbide (hard metal) cutting tools are the first choice to machine hard materials or to conduct high performance cutting processes. The main advantages of cemented carbide cutting tools are their high wear resistance (hardness) and good high-temperature strength. In contrast, cemented carbide cutting tools are characterized by low toughness and generate higher production costs, especially due to limited resources. Usually, cemented carbide cutting tools are produced by means of powder metallurgical processes. Compared to conventional manufacturing routes, these processes are more expensive and only a limited number of geometries can be realized. Furthermore, post-processing and preparing the cutting edges in order to achieve high performance tools is often required. In the present paper, an alternative method to substitute solid cemented carbide cutting tools is presented. Cutting tools made of conventional high speed steel (HSS) were coated with thick WC-Co (88/12) layers by means of thermal spraying (HVOF). The challenge is to obtain a dense, homogeneous, and near-net-shape coating on the flanks and the cutting edge. For this purpose, different coating strategies were realized using an industrial robot. The coating properties were subsequently investigated. After this initial step, the surfaces of the cutting tools were ground and selected cutting edges were prepared by means of wet abrasive jet machining to achieve a smooth and round micro shape. Machining tests were conducted with these coated, ground and prepared cutting tools. The occurring wear phenomena were analyzed and compared to conventional HSS cutting tools. Overall, the results of the experiments proved that the coating withstands mechanical stresses during machining. In the conducted experiments, the coated cutting tools showed less wear than conventional HSS cutting tools.
With respect to the initial wear resistance, additional benefits can be obtained by preparing the cutting edge by means of wet abrasive jet machining.

  15. Tool Wear Feature Extraction Based on Hilbert Marginal Spectrum

    NASA Astrophysics Data System (ADS)

    Guan, Shan; Song, Weijie; Pang, Hongyang

    2017-09-01

    In the metal cutting process, the signal contains a wealth of tool wear state information. An analysis and feature extraction method for tool wear signals, based on the Hilbert marginal spectrum, is proposed. Firstly, the tool wear signal was decomposed by the empirical mode decomposition algorithm, and the intrinsic mode functions carrying the main information were screened out by the correlation coefficient and the variance contribution rate. Secondly, the Hilbert transform was performed on the main intrinsic mode functions, yielding the Hilbert time-frequency spectrum and the Hilbert marginal spectrum. Finally, amplitude-domain indices were extracted on the basis of the Hilbert marginal spectrum and used to construct the recognition feature vector of the tool wear state. The research results show that the extracted features can effectively characterize the different wear states of the tool, which provides a basis for monitoring tool wear condition.
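    The Hilbert-transform stage of such a pipeline can be sketched as follows. The EMD screening step is omitted here; a synthetic single-component signal stands in for a screened intrinsic mode function, and the analytic signal is computed with an FFT (equivalent to `scipy.signal.hilbert`). The sampling rate and test frequency are invented.

    ```python
    import numpy as np

    fs = 1000.0                           # sampling rate, Hz
    n = 1000
    t = np.arange(n) / fs
    imf = np.sin(2 * np.pi * 50 * t)      # stand-in intrinsic mode function

    # Analytic signal via FFT: double positive frequencies, zero negative ones.
    X = np.fft.fft(imf)
    X[1:n // 2] *= 2.0
    X[n // 2 + 1:] = 0.0
    analytic = np.fft.ifft(X)

    amplitude = np.abs(analytic)                      # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)     # instantaneous frequency, Hz

    # Marginal spectrum: total amplitude contributed per frequency bin.
    bins = np.arange(0.0, fs / 2, 5.0)
    marginal, _ = np.histogram(inst_freq, bins=bins, weights=amplitude[:-1])
    peak_freq = bins[np.argmax(marginal)]             # lands near 50 Hz
    ```

    Amplitude-domain indices (mean, RMS, kurtosis, etc.) of `marginal` would then form the feature vector for wear-state recognition.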

  16. REACTIVE MINERALS IN AQUIFERS: FORMATION PROCESSES AND QUANTITATIVE ANALYSIS

    EPA Science Inventory

    The presentation will focus on the occurrence, form, and characterization of reactive iron minerals in aquifers and soils. The potential for abiotic reductive transformations of contaminants at the mineral-water interface will be discussed along with available tools for site min...

  17. Reducing inherent biases introduced during DNA viral metagenome analyses of municipal wastewater

    EPA Science Inventory

    Metagenomics is a powerful tool for characterizing viral composition within environmental samples, but sample and molecular processing steps can bias the estimation of viral community structure. The objective of this study is to understand the inherent variability introduced when...

  18. Energy geotechnics: Advances in subsurface energy recovery, storage, exchange, and waste management

    DOE PAGES

    McCartney, John S.; Sanchez, Marcelo; Tomac, Ingrid

    2016-02-17

    Energy geotechnics involves the use of geotechnical principles to understand and engineer the coupled thermo-hydro-chemo-mechanical processes encountered in collecting, exchanging, storing, and protecting energy resources in the subsurface. In addition to research on these fundamental coupled processes and characterization of relevant material properties, applied research is being performed to develop analytical tools for the design and analysis of different geo-energy applications. The aims of this paper are to discuss the fundamental physics and constitutive models that are common to these different applications, and to summarize recent advances in the development of relevant analytical tools.

  19. 3-D interactive visualisation tools for Hi spectral line imaging

    NASA Astrophysics Data System (ADS)

    van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.

    2017-06-01

    Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.
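    As a toy example of the quantitative inspection such tools enable, a moment-0 (velocity-integrated intensity) map can be computed from a spectral-line data cube with plain NumPy. The cube, source position, and channel width below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy HI cube with axes (velocity channel, y, x): noise plus one source.
    cube = rng.normal(0.0, 0.01, size=(64, 32, 32))
    cube[20:30, 10:20, 10:20] += 0.5          # synthetic emission region

    dv = 5.0                                  # channel width, km/s
    moment0 = cube.sum(axis=0) * dv           # integrate over the velocity axis

    # The source region stands out clearly against the noise floor.
    source_peak = moment0[10:20, 10:20].mean()
    ```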

  20. Energy geotechnics: Advances in subsurface energy recovery, storage, exchange, and waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCartney, John S.; Sanchez, Marcelo; Tomac, Ingrid

    Energy geotechnics involves the use of geotechnical principles to understand and engineer the coupled thermo-hydro-chemo-mechanical processes encountered in collecting, exchanging, storing, and protecting energy resources in the subsurface. In addition to research on these fundamental coupled processes and characterization of relevant material properties, applied research is being performed to develop analytical tools for the design and analysis of different geo-energy applications. The aims of this paper are to discuss the fundamental physics and constitutive models that are common to these different applications, and to summarize recent advances in the development of relevant analytical tools.

  1. Analyzing the Evolution of Membrane Fouling via a Novel Method Based on 3D Optical Coherence Tomography Imaging.

    PubMed

    Li, Weiyi; Liu, Xin; Wang, Yi-Ning; Chong, Tzyy Haur; Tang, Chuyang Y; Fane, Anthony G

    2016-07-05

    The development of novel tools for studying the fouling behavior during membrane processes is critical. This work explored optical coherence tomography (OCT) to quantitatively interpret the formation of a cake layer during a membrane process; the quantitative analysis was based on a novel image processing method that was able to precisely resolve the 3D structure of the cake layer on a micrometer scale. Fouling experiments were carried out with foulants having different physicochemical characteristics (silica nanoparticles and bentonite particles). The cake layers formed at a series of times were digitalized using the OCT-based characterization. The specific deposit (cake volume/membrane surface area) and surface coverage were evaluated as a function of time, which for the first time provided direct experimental evidence for the transition of various fouling mechanisms. Axial stripes were observed in the grayscale plots showing the deposit distribution in the scanned area; this interesting observation was in agreement with the instability analysis that correlated the polarized particle groups with the small disturbances in the boundary layer. This work confirms that the OCT-based characterization is able to provide deep insights into membrane fouling processes and offers a powerful tool for exploring membrane processes with enhanced performance.
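    The two metrics named above (specific deposit and surface coverage) can be sketched from a binary OCT volume in which foulant voxels have already been segmented. The toy volume and voxel dimensions below are invented for illustration.

    ```python
    import numpy as np

    voxel = (2e-6, 5e-6, 5e-6)      # hypothetical voxel size (z, y, x) in metres
    vol = np.zeros((50, 100, 100), dtype=bool)   # binary cake-layer volume (z, y, x)
    vol[:10, :, :50] = True         # 10-voxel-thick cake covering half the area

    voxel_volume = voxel[0] * voxel[1] * voxel[2]
    membrane_area = (vol.shape[1] * voxel[1]) * (vol.shape[2] * voxel[2])

    # Specific deposit: cake volume per unit membrane area (m^3/m^2, i.e. metres).
    specific_deposit = vol.sum() * voxel_volume / membrane_area
    # Surface coverage: fraction of (y, x) columns containing any foulant.
    surface_coverage = vol.any(axis=0).mean()
    ```

    Tracking both quantities over time is what distinguishes pore-scale deposition (coverage grows) from cake growth (deposit grows at constant coverage).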

  2. Prediction of composites behavior undergoing an ATP process through data-mining

    NASA Astrophysics Data System (ADS)

    Martin, Clara Argerich; Collado, Angel Leon; Pinillo, Rubén Ibañez; Barasinski, Anaïs; Abisset-Chavanne, Emmanuelle; Chinesta, Francisco

    2018-05-01

    The need to characterize composite surfaces for distinct mechanical or physical processes leads to different ways of evaluating the state of the surface. During many manufacturing processes deformation occurs, hindering the classification of composites for fabrication processes. In this work we focus on the challenge of identifying the surface's behavior a priori in order to optimize manufacturing. We propose and validate the curvature of the surface as a reliable parameter and develop a tool that allows the prediction of the surface behavior.

  3. Intermediate Experimental Vehicle, ESA Program IXV ATDB Tool and Aerothermodynamic Characterization

    NASA Astrophysics Data System (ADS)

    Mareschi, Vincenzo; Ferrarella, Daniela; Zaccagnino, Elio; Tribot, Jean-Pierre; Vallee, Jean-Jacques; Haya-Ramos, Rodrigo; Rufolo, Giuseppe; Mancuso, Salvatore

    2011-05-01

    In the complex domain of space technologies, and among the different applications available in Europe, great interest has been placed for several years in the development of re-entry technologies. Among the achievements in this field, the experience of the Atmospheric Re-entry Demonstrator (ARD) flight in 1998 is worth recalling, along with a number of important investments performed at Agency and national levels such as Hermes, MSTP, Festip, X-38, FLPP, TRP, GSTP, HSTS, AREV and Pre-X. IXV (Intermediate eXperimental Vehicle) builds on these past experiences and studies, and is conceived as the next technological step forward with respect to ARD. Compared with previous European ballistic or quasi-ballistic demonstrators, IXV will have increased in-flight manoeuvrability, and the planned mission will allow the performance of the required technologies to be verified against a wider re-entry corridor. From a purely technological standpoint, this implies an increased level of engagement with critical technologies and disciplines such as aerodynamics/aerothermodynamics, guidance, navigation and control, thermal protection materials, and in-flight measurements. In order to support the TPS design and the other sub-systems, an AeroThermodynamic DataBase (ATDB) tool has been developed by Dassault Aviation and integrated by Thales Alenia Space with the Functional Engineering Simulator (used for GNC performance evaluation) in order to characterize the aerothermodynamic behaviour of the vehicle. This paper describes: the methodology used to develop the ATDB tool, based on the processing of CFD computations and WTT campaign results; the utilization of the ATDB tool, by means of its integration into the System process; and the methodology used for the aerothermal characterization of IXV.

  4. Experimental investigation and optimization of welding process parameters for various steel grades using NN tool and Taguchi method

    NASA Astrophysics Data System (ADS)

    Soni, Sourabh Kumar; Thomas, Benedict

    2018-04-01

    The term "weldability" has been used to describe a wide variety of characteristics when a material is subjected to welding. In our analysis we perform an experimental investigation to estimate the tensile strength of welded joints, followed by optimization of the welding process parameters using the Taguchi method and an Artificial Neural Network (ANN) tool, in MINITAB and MATLAB software respectively. The study reveals, through mechanical characterization, the influence of varying steel composition on weldability. First, samples of different grades of steel (EN8, EN19, EN24) were prepared. The samples were welded together by the metal inert gas welding process, and tensile testing was then conducted on a universal testing machine (UTM) to evaluate the tensile strength of the welded steel specimens. A further comparative study was performed to find the effects of the welding parameters on weld strength by employing the Taguchi method and the neural network tool. Finally, we conclude that the Taguchi method and the neural network tool are efficient techniques for this optimization.
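    Taguchi analysis ranks parameter settings by a signal-to-noise ratio; for tensile strength, the "larger-is-better" form applies. A minimal sketch follows, with invented strength values rather than the paper's data.

    ```python
    import numpy as np

    def sn_larger_is_better(y):
        """Taguchi larger-is-better S/N ratio: -10 * log10(mean(1 / y^2)).
        Higher values indicate a better (stronger, more consistent) setting."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y**2))

    # Hypothetical replicate tensile strengths (MPa) for two parameter settings:
    trial_a = [610.0, 605.0, 615.0]
    trial_b = [580.0, 560.0, 570.0]
    best = 'A' if sn_larger_is_better(trial_a) > sn_larger_is_better(trial_b) else 'B'
    ```

    In a full Taguchi study, the S/N ratio is computed for every row of an orthogonal array, and factor effects are read off from the mean S/N at each factor level.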

  5. Preparation and Characterization of PETI-330/Multiwalled Carbon Nanotube

    NASA Technical Reports Server (NTRS)

    Ghose, Sayata; Watson, Kent A.; Working, Dennis C.; Criss, Jim M.; Siochi, Emilie J.; Connell, John W.

    2005-01-01

    As part of an ongoing effort to incorporate multifunctionality into advanced composites, blends of PETI-330 and multi-walled carbon nanotubes (MWCNTs) were prepared, characterized and fabricated into moldings. The PETI-330/MWCNT mixtures were prepared at concentrations ranging from 3 to 25 weight percent by dry mixing the components in a ball mill. The resulting powders were characterized for degree of mixing, thermal and rheological properties. Based on the characterization results, PETI-330/MWCNT samples were scaled up to 300 g and used to fabricate moldings 10.2 cm x 15.2 cm x 0.32 cm thick. The moldings were made by injecting the mixtures at 260-280 C into an Invar tool followed by curing for 1 h at 371 C. The tool was designed to impart shear during the injection process in an attempt to achieve some alignment of the MWCNTs in the flow direction. Good quality moldings were obtained that were subsequently characterized for thermal, mechanical and electrical properties. The degree of dispersion and alignment of the MWCNTs were investigated using high-resolution scanning electron microscopy. The preparation and preliminary characterization of PETI-330/MWCNT composites will be discussed. Keywords: phenylethynyl terminated imides, high temperature polymers, nanocomposites, moldings

  6. Automatic 3D segmentation of multiphoton images: a key step for the quantification of human skin.

    PubMed

    Decencière, Etienne; Tancrède-Bohin, Emmanuelle; Dokládal, Petr; Koudoro, Serge; Pena, Ana-Maria; Baldeweck, Thérèse

    2013-05-01

    Multiphoton microscopy has emerged in the past decade as a useful noninvasive imaging technique for in vivo human skin characterization. However, it has not been used until now in evaluation clinical trials, mainly because of the lack of specific image processing tools that would allow the investigator to extract pertinent quantitative three-dimensional (3D) information from the different skin components. We propose a 3D automatic segmentation method of multiphoton images which is a key step for epidermis and dermis quantification. This method, based on the morphological watershed and graph cuts algorithms, takes into account the real shape of the skin surface and of the dermal-epidermal junction, and allows separating in 3D the epidermis and the superficial dermis. The automatic segmentation method and the associated quantitative measurements have been developed and validated on a clinical database designed for aging characterization. The segmentation achieves its goals for epidermis-dermis separation and allows quantitative measurements inside the different skin compartments with sufficient relevance. This study shows that multiphoton microscopy associated with specific image processing tools provides access to new quantitative measurements on the various skin components. The proposed 3D automatic segmentation method will contribute to build a powerful tool for characterizing human skin condition. To our knowledge, this is the first 3D approach to the segmentation and quantification of these original images. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
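    The segmentation method combines morphological watershed with graph cuts. The core flooding idea behind a marker-based watershed can be sketched with a priority queue, here in plain Python on a toy 2D array (a simplification for illustration, not the paper's 3D implementation):

```python
import heapq

def watershed(image, markers):
    """Minimal marker-based watershed: flood outward from labeled seeds
    in order of increasing intensity, so labels meet along ridges."""
    h, w = len(image), len(image[0])
    labels = [[markers[y][x] for x in range(w)] for y in range(h)]
    heap = []
    for y in range(h):
        for x in range(w):
            if markers[y][x]:
                heapq.heappush(heap, (image[y][x], y, x))
    while heap:
        _, y, x = heapq.heappop(heap)
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and labels[ny][nx] == 0:
                labels[ny][nx] = labels[y][x]  # claim pixel for this basin
                heapq.heappush(heap, (image[ny][nx], ny, nx))
    return labels

# Two seeds flood toward the central intensity ridge
labels = watershed([[0, 1, 5, 5, 1, 0]], [[1, 0, 0, 0, 0, 2]])
```

    In the paper's setting the "image" would be a 3D multiphoton volume and the seeds would come from detected epidermis/dermis landmarks; graph cuts then refine the boundary.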

  7. Hydrological modeling in forested systems

    Treesearch

    H.E. Golden; G.R. Evenson; S. Tian; Devendra Amatya; Ge Sun

    2015-01-01

    Characterizing and quantifying interactions among components of the forest hydrological cycle is complex and usually requires a combination of field monitoring and modelling approaches (Weiler and McDonnell, 2004; National Research Council, 2008). Models are important tools for testing hypotheses, understanding hydrological processes and synthesizing experimental data...

  8. Modeling and Characterization of Damage Processes in Metallic Materials

    NASA Technical Reports Server (NTRS)

    Glaessgen, E. H.; Saether, E.; Smith, S. W.; Hochhalter, J. D.; Yamakov, V. I.; Gupta, V.

    2011-01-01

    This paper describes a broad effort that is aimed at understanding the fundamental mechanisms of crack growth and using that understanding as a basis for designing materials and enabling predictions of fracture in materials and structures that have small characteristic dimensions. This area of research, herein referred to as Damage Science, emphasizes the length scale regimes of the nanoscale and the microscale for which analysis and characterization tools are being developed to predict the formation, propagation, and interaction of fundamental damage mechanisms. Examination of nanoscale processes requires atomistic and discrete dislocation plasticity simulations, while microscale processes can be examined using strain gradient plasticity, crystal plasticity and microstructure modeling methods. Concurrent and sequential multiscale modeling methods are being developed to analytically bridge between these length scales. Experimental methods for characterization and quantification of near-crack tip damage are also being developed. This paper focuses on several new methodologies in these areas and their application to understanding damage processes in polycrystalline metals. On-going and potential applications are also discussed.

  9. Modeling and Advanced Control for Sustainable Process ...

    EPA Pesticide Factsheets

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. E.P.A.’s GREENSCOPE assessment tool that provides scores for the selected economic, material management, environmental and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This book chapter contribution demonstrates the application of novel process control strategies for sustainability by increasing material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Weizhao; Ren, Huaqing; Lu, Jie

    This paper reports several methods for characterizing the properties of uncured woven prepreg during the preforming process. Uniaxial tension, bias-extension, and bending tests are conducted to measure the in-plane properties of the material. Friction tests are utilized to reveal the prepreg-prepreg and prepreg-forming tool interactions. All tests are performed within the temperature range of the real manufacturing process. The results serve as inputs to numerical simulation for product prediction and preforming process parameter optimization.
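    In the bias-extension test mentioned above, the shear angle in the specimen's central zone is commonly inferred from the crosshead displacement under a pin-jointed-net (pure kinematics) assumption. A sketch of that textbook relation (the dimensions below are illustrative, not from this study):

```python
import math

def shear_angle(L0, w0, d):
    """Idealized shear angle (radians) in the central zone of a
    bias-extension specimen of initial length L0 and width w0,
    at crosshead displacement d. Pin-jointed-net assumption:
    gamma = pi/2 - 2*acos((D + d) / (sqrt(2) * D)), D = L0 - w0."""
    D = L0 - w0
    return math.pi / 2.0 - 2.0 * math.acos((D + d) / (math.sqrt(2.0) * D))

# Hypothetical 200 mm x 100 mm specimen pulled 10 mm
gamma = shear_angle(200.0, 100.0, 10.0)
```

    Real prepregs deviate from this ideal (yarn slippage, tension coupling), which is why the measured force-displacement curve, not just the kinematic angle, feeds the simulation inputs.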

  11. Automated riverine landscape characterization: GIS-based tools for watershed-scale research, assessment, and management.

    PubMed

    Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C

    2013-09-01

    River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.

  12. Pin Tool Geometry Effects in Friction Stir Welding

    NASA Technical Reports Server (NTRS)

    Querin, J. A.; Rubisoff, H. A.; Schneider, J. A.

    2009-01-01

    In friction stir welding (FSW) there is significant evidence that material can take one of two different flow paths when being displaced from its original position in front of the pin tool to its final position in the wake of the weld. The geometry of the pin tool, along with the process parameters, plays an important role in dictating the path that the material takes. Each flow path will impart a different thermomechanical history on the material, consequently altering the material microstructure and subsequent weld properties. The intention of this research is to isolate the effect that different pin tool attributes have on the flow paths imparted on the FSWed material. Based on published weld tool geometries, a variety of weld tools were fabricated and used to join AA2219. Results from the tensile properties and microstructural characterization will be presented.

  13. Eddy current characterization of magnetic treatment of nickel 200

    NASA Technical Reports Server (NTRS)

    Chern, E. J.

    1993-01-01

    Eddy current methods have been applied to characterize the effect of magnetic treatments on component service-life extension. Coil impedance measurements were acquired and analyzed on nickel 200 specimens that have been subjected to many mechanical and magnetic engineering processes: annealing, applied strain, magnetic field, shot peening, and magnetic field after peening. Experimental results have demonstrated a functional relationship between coil impedance, resistance and reactance, and specimens subjected to various engineering processes. It has shown that magnetic treatment does induce changes in electromagnetic properties of nickel 200 that then exhibit evidence of stress relief. However, further fundamental studies are necessary for a thorough understanding of the exact mechanism of the magnetic field processing effect on machine-tool service life.

  14. Measurement of Oxidative Stress: Mitochondrial Function Using the Seahorse System.

    PubMed

    Leung, Dilys T H; Chu, Simon

    2018-01-01

    The Seahorse XFp Analyzer is a powerful tool for the assessment of various parameters of cellular respiration. Here we describe the process of the Seahorse Cell Phenotype Test using the Seahorse XFp Analyzer to characterize the metabolic phenotype of live cells. The Seahorse XFp Analyzer can also be coupled with other assays to measure cellular energetics. Given that mitochondrial dysfunction is implicated in preeclampsia, the Seahorse XFp Analyzer will serve as a useful tool for the understanding of pathological metabolism in this disorder.

  15. Developing Rapid and Cost-Effective Tools for Assessing Groundwater Impacts on Contaminated Sediments

    EPA Science Inventory

    This research developed quick and inexpensive methods that can be useful in characterizing the interaction of water and solids within the GW/SW transition zone to explain processes that occur during physical contact between groundwater and sediments. The research used self-conta...

  16. Automated riverine landscape characterization: GIS-based tools for watershed-scale research, assessment, and management

    EPA Science Inventory

    River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchi...

  17. Characterization of Intraventricular and Intracerebral Hematomas in Non-Contrast CT

    PubMed Central

    Nowinski, Wieslaw L; Gomolka, Ryszard S; Qian, Guoyu; Gupta, Varsha; Ullman, Natalie L; Hanley, Daniel F

    2014-01-01

    Characterization of hematomas is essential in scan reading, manual delineation, and designing automatic segmentation algorithms. Our purpose is to characterize the distribution of intraventricular (IVH) and intracerebral hematomas (ICH) in NCCT scans, study their relationship to gray matter (GM), and to introduce a new tool for quantitative hematoma delineation. We used 289 serial retrospective scans of 51 patients. Hematomas were manually delineated in a two-stage process. Hematoma contours generated in the first stage were quantified and enhanced in the second stage. Delineation was based on new quantitative rules and hematoma profiling, and assisted by a dedicated tool superimposing quantitative information on scans with 3D hematoma display. The tool provides: density maps (40-85 HU), contrast maps (8/15 HU), mean horizontal/vertical contrasts for hematoma contours, and hematoma contours below a specified mean contrast (8 HU). White matter (WM) and GM were segmented automatically. IVH/ICH on serial NCCT is characterized by a 59.0 HU mean, 60.0 HU median, 11.6 HU standard deviation, 23.9 HU mean contrast, -0.99 HU/day slope, and -0.24 skewness (changing over time from negative to positive). Its 0.1st-99.9th percentile range corresponds to the 25-88 HU range. WM and GM are highly correlated (R^2 = 0.88; p < 10^-10) whereas the GM-GS correlation is weak (R^2 = 0.14; p < 10^-10). The intersection point of the mean GM-hematoma density distributions is at 55.6±5.8 HU, with corresponding GM/hematoma percentiles of 88th/40th. Objective characterization of IVH/ICH and stating the rules quantitatively will aid raters in delineating hematomas more robustly and facilitate the design of algorithms for automatic hematoma segmentation. Our two-stage process is general and potentially applicable to delineating other pathologies on various modalities more robustly and quantitatively. PMID:24976197

  18. Characterization of intraventricular and intracerebral hematomas in non-contrast CT.

    PubMed

    Nowinski, Wieslaw L; Gomolka, Ryszard S; Qian, Guoyu; Gupta, Varsha; Ullman, Natalie L; Hanley, Daniel F

    2014-06-01

    Characterization of hematomas is essential in scan reading, manual delineation, and designing automatic segmentation algorithms. Our purpose is to characterize the distribution of intraventricular (IVH) and intracerebral hematomas (ICH) in NCCT scans, study their relationship to gray matter (GM), and to introduce a new tool for quantitative hematoma delineation. We used 289 serial retrospective scans of 51 patients. Hematomas were manually delineated in a two-stage process. Hematoma contours generated in the first stage were quantified and enhanced in the second stage. Delineation was based on new quantitative rules and hematoma profiling, and assisted by a dedicated tool superimposing quantitative information on scans with 3D hematoma display. The tool provides: density maps (40-85 HU), contrast maps (8/15 HU), mean horizontal/vertical contrasts for hematoma contours, and hematoma contours below a specified mean contrast (8 HU). White matter (WM) and GM were segmented automatically. IVH/ICH on serial NCCT is characterized by a 59.0 HU mean, 60.0 HU median, 11.6 HU standard deviation, 23.9 HU mean contrast, -0.99 HU/day slope, and -0.24 skewness (changing over time from negative to positive). Its 0.1st-99.9th percentile range corresponds to the 25-88 HU range. WM and GM are highly correlated (R^2 = 0.88; p < 10^-10) whereas the GM-GS correlation is weak (R^2 = 0.14; p < 10^-10). The intersection point of the mean GM-hematoma density distributions is at 55.6±5.8 HU, with corresponding GM/hematoma percentiles of 88th/40th. Objective characterization of IVH/ICH and stating the rules quantitatively will aid raters in delineating hematomas more robustly and facilitate the design of algorithms for automatic hematoma segmentation. Our two-stage process is general and potentially applicable to delineating other pathologies on various modalities more robustly and quantitatively.
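    The delineation rules above rest on profiling the hematoma's Hounsfield-unit distribution (mean, median, standard deviation, percentile range). A minimal sketch of such profiling (the nearest-rank percentile and the sample values are illustrative, not the authors' code):

```python
import statistics

def profile_hu(hu_values):
    """Summary statistics of a region's Hounsfield-unit (HU) samples,
    mirroring the kind of profiling used to set delineation rules."""
    s = sorted(hu_values)
    def pct(p):
        # nearest-rank percentile on the sorted samples
        k = max(0, min(len(s) - 1, round(p / 100.0 * (len(s) - 1))))
        return s[k]
    return {
        "mean": statistics.fmean(hu_values),
        "median": statistics.median(hu_values),
        "stdev": statistics.pstdev(hu_values),
        "p0.1": pct(0.1),
        "p99.9": pct(99.9),
    }

profile = profile_hu([50, 55, 60, 65, 70])  # hypothetical HU samples
```

    On real data, comparing such a profile against the GM density distribution is what yields cut points like the 55.6 HU intersection reported above.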

  19. Electronic and software systems of an automated portable static mass spectrometer

    NASA Astrophysics Data System (ADS)

    Chichagov, Yu. V.; Bogdanov, A. A.; Lebedev, D. S.; Kogan, V. T.; Tubol'tsev, Yu. V.; Kozlenok, A. V.; Moroshkin, V. S.; Berezina, A. V.

    2017-01-01

    The electronic systems of a small high-sensitivity static mass spectrometer and software and hardware tools, which allow one to determine trace concentrations of gases and volatile compounds in air and water samples in real time, have been characterized. These systems and tools have been used to set up the device, control the process of measurement, synchronize this process with accompanying measurements, maintain reliable operation of the device, process the obtained results automatically, and visualize and store them. The developed software and hardware tools allow one to conduct continuous measurements for up to 100 h and provide an opportunity for personnel with no special training to perform maintenance on the device. The test results showed that mobile mass spectrometers for geophysical and medical research, which were fitted with these systems, had a determination limit for target compounds as low as several ppb(m) and a mass resolving power (depending on the current task) as high as 250.

  20. CÆLIS: software for assimilation, management and processing data of an atmospheric measurement network

    NASA Astrophysics Data System (ADS)

    Fuertes, David; Toledano, Carlos; González, Ramiro; Berjón, Alberto; Torres, Benjamín; Cachorro, Victoria E.; de Frutos, Ángel M.

    2018-02-01

    Given the importance of the atmospheric aerosol, the number of instruments and measurement networks which focus on its characterization are growing. Many challenges are derived from standardization of protocols, monitoring of the instrument status to evaluate the network data quality and manipulation and distribution of large volume of data (raw and processed). CÆLIS is a software system which aims at simplifying the management of a network, providing tools by monitoring the instruments, processing the data in real time and offering the scientific community a new tool to work with the data. Since 2008 CÆLIS has been successfully applied to the photometer calibration facility managed by the University of Valladolid, Spain, in the framework of Aerosol Robotic Network (AERONET). Thanks to the use of advanced tools, this facility has been able to analyze a growing number of stations and data in real time, which greatly benefits the network management and data quality control. The present work describes the system architecture of CÆLIS and some examples of applications and data processing.

  1. Understanding and reduction of defects on finished EUV masks

    NASA Astrophysics Data System (ADS)

    Liang, Ted; Sanchez, Peter; Zhang, Guojing; Shu, Emily; Nagpal, Rajesh; Stivers, Alan

    2005-05-01

    To reduce the risk of EUV lithography adoption for the 32nm technology node in 2009, Intel has operated an EUV mask Pilot Line since early 2004. The Pilot Line integrates all the necessary process modules, including common tool sets shared with current photomask production as well as EUV-specific tools. This integrated endeavor ensures a comprehensive understanding of any issues and the development of solutions for the eventual fabrication of defect-free EUV masks. Two enabling modules for "defect-free" masks are pattern inspection and repair, which have been integrated into the Pilot Line. This is the first time we are able to look at real defects originating from multilayer blanks and the patterning process on finished masks over the entire mask area. In this paper, we describe our efforts in the qualification of DUV pattern inspection and electron beam mask repair tools for Pilot Line operation, including inspection tool sensitivity, defect classification and characterization, and defect repair. We discuss the origins of each of the five classes of defects seen by the DUV pattern inspection tool on finished masks, and present solutions for eliminating and mitigating them.

  2. An evolutionary computation based algorithm for calculating solar differential rotation by automatic tracking of coronal bright points

    NASA Astrophysics Data System (ADS)

    Shahamatnia, Ehsan; Dorotovič, Ivan; Fonseca, Jose M.; Ribeiro, Rita A.

    2016-03-01

    Developing specialized software tools is essential to support studies of solar activity evolution. With new space missions such as Solar Dynamics Observatory (SDO), solar images are being produced in unprecedented volumes. To capitalize on that huge data availability, the scientific community needs a new generation of software tools for automatic and efficient data processing. In this paper a prototype of a modular framework for solar feature detection, characterization, and tracking is presented. To develop an efficient system capable of automatic solar feature tracking and measuring, a hybrid approach combining specialized image processing, evolutionary optimization, and soft computing algorithms is being followed. The specialized hybrid algorithm for tracking solar features allows automatic feature tracking while gathering characterization details about the tracked features. The hybrid algorithm takes advantages of the snake model, a specialized image processing algorithm widely used in applications such as boundary delineation, image segmentation, and object tracking. Further, it exploits the flexibility and efficiency of Particle Swarm Optimization (PSO), a stochastic population based optimization algorithm. PSO has been used successfully in a wide range of applications including combinatorial optimization, control, clustering, robotics, scheduling, and image processing and video analysis applications. The proposed tool, denoted PSO-Snake model, was already successfully tested in other works for tracking sunspots and coronal bright points. In this work, we discuss the application of the PSO-Snake algorithm for calculating the sidereal rotational angular velocity of the solar corona. To validate the results we compare them with published manual results performed by an expert.
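    The PSO component of the PSO-Snake model can be illustrated in its simplest form: each particle moves under pulls toward its personal best and the swarm's global best. A generic one-dimensional sketch (inertia/acceleration constants and the test function are conventional defaults, not the paper's configuration):

```python
import random

def pso(f, bounds, n_particles=20, iters=100, seed=1):
    """Minimal Particle Swarm Optimization minimizing f over an interval."""
    rng = random.Random(seed)
    lo, hi = bounds
    x = [rng.uniform(lo, hi) for _ in range(n_particles)]
    v = [0.0] * n_particles
    pbest = x[:]                       # personal best positions
    pval = [f(p) for p in x]           # personal best values
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g], pval[g]    # global best
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            v[i] = w * v[i] + c1 * r1 * (pbest[i] - x[i]) + c2 * r2 * (gbest - x[i])
            x[i] = min(hi, max(lo, x[i] + v[i]))  # clamp to bounds
            fx = f(x[i])
            if fx < pval[i]:
                pbest[i], pval[i] = x[i], fx
                if fx < gval:
                    gbest, gval = x[i], fx
    return gbest, gval

best_x, best_f = pso(lambda x: (x - 3.0) ** 2, (-10.0, 10.0))
```

    In the PSO-Snake model the "position" being optimized is not a scalar but the set of snake control points, and f is the snake's energy on the solar image.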

  3. Double Vacuum Bag Process for Resin Matrix Composite Manufacturing

    NASA Technical Reports Server (NTRS)

    Hou, Tan-Hung (Inventor); Jensen, Brian J. (Inventor)

    2007-01-01

    A double vacuum bag molding assembly with improved void management and laminate net shape control which provides a double vacuum environment for use in fabricating composites from prepregs containing air and/or volatiles, such as reactive resin matrix composites, or composites from solvent-containing prepregs with non-reactive resin matrices. By using two vacuum environments during the curing process, a vacuum can be drawn during the B-stage of a two-step cycle without placing the composite under significant relative pressure. During the final cure stage, a significant pressure can be applied by releasing the vacuum in one of the two environments. Inner and outer bags are useful for creating the two vacuum environments, with a perforated tool intermediate the two. The composite is placed intermediate a tool plate and a caul plate in the first environment, with the inner bag and tool plate defining the first environment. The second environment is characterized by the outer bag, which is placed over the inner bag and the tool plate.

  4. Friction Stir Spot Welding of Advanced High Strength Steels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovanski, Yuri; Grant, Glenn J.; Santella, M. L.

    Friction stir spot welding techniques were developed to successfully join several advanced high strength steels. Two distinct tool materials were evaluated to determine the effect of tool material on the process parameters and joint properties. Welds were characterized primarily via lap shear, microhardness, and optical microscopy. Friction stir spot welds were compared to resistance spot welds in similar-strength alloys by using the AWS standard for resistance spot welding of high strength steels. As a further comparison, a primitive cost comparison between the two joining processes was developed, which included an evaluation of the future cost prospects of friction stir spot welding in advanced high strength steels.

  5. Description of operation of fast-response solenoid actuator in diesel fuel system model

    NASA Astrophysics Data System (ADS)

    Zhao, J.; Grekhov, L. V.; Fan, L.; Ma, X.; Song, E.

    2018-03-01

    The performance of the fast-response solenoid actuator (FRSA) of engine fuel systems is characterized by a response time of less than 0.1 ms and the necessity of taking into consideration the non-stationary peculiarities of mechanical, hydraulic, electrical and magnetic processes. Simple models for magnetization in static and dynamic hysteresis are used for this purpose. An experimental study of the FRSA performance within the electro-hydraulic injector of a Common Rail system demonstrated agreement between the computational and experimental results. The computation of the processes is not only a tool for analysis, but also a tool for the design and optimization of the solenoid actuator of new engine fuel systems.
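    A constant-inductance RL circuit gives a first, highly simplified picture of the coil-current rise in such an actuator; it deliberately ignores the hysteresis and armature-motion effects the paper's model accounts for. A sketch with explicit-Euler integration (all parameter values hypothetical):

```python
def solenoid_current(V, R, L, t_end, dt=1e-6):
    """Explicit-Euler integration of di/dt = (V - i*R) / L for a
    constant-inductance coil driven by a constant voltage V.
    Returns the current at time t_end (steady state approaches V/R)."""
    i, t = 0.0, 0.0
    while t < t_end:
        i += dt * (V - i * R) / L
        t += dt
    return i

# Hypothetical 12 V drive, 6 ohm coil, 1 mH inductance
i_final = solenoid_current(12.0, 6.0, 1e-3, 0.01)
```

    A real FRSA model would add eddy-current and hysteresis losses and couple this equation to the armature's equation of motion, which is what makes sub-0.1 ms response prediction nontrivial.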

  6. 3D-atom probe characterization of nano-precipitates in a PM processed tool steels

    NASA Astrophysics Data System (ADS)

    Niederkofler, M.; Leisch, M.

    2004-07-01

    The microstructure of a powder-metallurgically processed high speed steel (nominal composition (wt.%): 1.6 C, 4.8 Cr, 2.0 Mo, 5.0 V, 10.5 W, 8.0 Co, balance Fe) has been examined using the 3D-atom probe technique. Through depth profiling with the time-of-flight mass spectrometer and position-sensitive recording, cylindrical volumes of 10-15 nm in diameter and up to 40 nm in depth have been probed and characterized. The depth profiling measurements of the samples generally show a very homogeneous structure, as expected from the powder metallurgical processing of the material. Different morphologies of the precipitates were recorded. Besides needle-shaped precipitates with an extent of up to 20 nm and a thickness of a few atomic layers, platelets and spherical particles are observed as well. The species that can be assigned to the precipitates appear to some extent as MC molecules in the mass histogram, with Mo, V and Cr as the leading constituents of this MC. Besides distinct particles, agglomerations such as one-dimensional atomic chains of the alloy components are also observed in the 3D reconstructions of the tool steel matrix.

  7. Rheology as a tool for evaluation of melt processability of innovative dosage forms.

    PubMed

    Aho, Johanna; Boetker, Johan P; Baldursdottir, Stefania; Rantanen, Jukka

    2015-10-30

    Future manufacturing of pharmaceuticals will involve innovative use of polymeric excipients. Hot melt extrusion (HME) is an already established manufacturing technique and several products based on HME are on the market. Additionally, processing based on, e.g., HME or three dimensional (3D) printing, will have an increasingly important role when designing products for flexible dosing, since dosage forms based on compacting of a given powder mixture do not enable manufacturing of optimal pharmaceutical products for personalized treatments. The melt processability of polymers and API-polymer mixtures is highly dependent on the rheological properties of these systems, and rheological measurements should be considered as a more central part of the material characterization tool box when selecting suitable candidates for melt processing by, e.g., HME or 3D printing. The polymer processing industry offers established platforms, methods, and models for rheological characterization, and they can often be readily applied in the field of pharmaceutical manufacturing. Thoroughly measured and calculated rheological parameters together with thermal and mechanical material data are needed for the process simulations which are also becoming increasingly important. The authors aim to give an overview to the basics of rheology and summarize examples of the studies where rheology has been utilized in setting up or evaluating extrusion processes. Furthermore, examples of different experimental set-ups available for rheological measurements are presented, discussing each of their typical application area, advantages and limitations. Copyright © 2015 Elsevier B.V. All rights reserved.
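    A common starting point for the rheological characterization discussed above is the power-law (Ostwald-de Waele) model for shear-thinning polymer melts. A sketch (the consistency K and flow index n below are illustrative, not measured values):

```python
def power_law_viscosity(K, n, shear_rate):
    """Apparent viscosity of a power-law (Ostwald-de Waele) melt:
    eta = K * gamma_dot**(n - 1). For n < 1 the melt shear-thins,
    i.e. viscosity drops as the shear rate rises."""
    return K * shear_rate ** (n - 1.0)

# Hypothetical melt: K = 1000 Pa*s^n, n = 0.4 (strongly shear thinning)
eta_low = power_law_viscosity(1000.0, 0.4, 1.0)     # at 1 1/s
eta_high = power_law_viscosity(1000.0, 0.4, 100.0)  # at 100 1/s
```

    Fitting K and n from rotational or capillary rheometry over the extruder's shear-rate range is exactly the kind of parameter set that process simulations of HME or 3D printing consume.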

  8. The Role of Gesture in Meaning Construction

    ERIC Educational Resources Information Center

    Singer, Melissa; Radinsky, Joshua; Goldman, Susan R.

    2008-01-01

    This article examines the role of gesture in the shared meaning-making processes of 6th-grade students studying plate tectonics using a data visualization tool; specifically, a geographic information system. Students' verbal and gestural characterizations of key concepts of plate motions (i.e., "subduction", "rift", and "buckling") were…

  9. Utilization of FEM model for steel microstructure determination

    NASA Astrophysics Data System (ADS)

    Kešner, A.; Chotěborský, R.; Linda, M.; Hromasová, M.

    2018-02-01

    Agricultural tools used in soil processing are worn by an abrasive wear mechanism caused by hard mineral particles in the soil. The wear rate is influenced by the mechanical characteristics of the tool material and also by the soil's mineral particle content. The mechanical properties of steel can be tailored by heat treatment technology, which leads to different microstructures. Experimental work of this kind is very expensive; with numerical methods such as FEM, the microstructure can be estimated at low cost, but each numerical model must be verified. The aim of this work is to show a procedure for predicting the microstructure of steel for agricultural tools. Material characterizations of 51CrV4 grade steel, such as the TTT diagram, heat capacity, heat conduction and other physical properties, were used for the numerical simulation. The relationship between the microstructure predicted by FEM and the real microstructure after heat treatment shows a good correlation.
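    The essence of such a prediction is coupling a computed cooling history to TTT-diagram data. A deliberately toy sketch of that idea, using lumped-capacitance cooling and a single critical cooling rate through the 800-500 °C window (all constants hypothetical and far simpler than an FEM model with a full TTT diagram):

```python
def cooling_curve(T0, T_env, k, t_end, dt=0.01):
    """Lumped-capacitance (Newtonian) cooling: dT/dt = -k * (T - T_env).
    Returns the temperature history sampled every dt seconds."""
    T, t, hist = T0, 0.0, [T0]
    while t < t_end:
        T += dt * (-k * (T - T_env))
        t += dt
        hist.append(T)
    return hist

def predicted_phase(hist, dt, critical_rate=100.0, window=(800.0, 500.0)):
    """Toy TTT-style rule: if the mean cooling rate (deg C/s) through the
    800-500 C window exceeds a critical rate, predict martensite,
    otherwise a diffusional product such as pearlite."""
    times = [i * dt for i, T in enumerate(hist) if window[1] <= T <= window[0]]
    if len(times) < 2:
        return "undefined"
    rate = (window[0] - window[1]) / (times[-1] - times[0])
    return "martensite" if rate > critical_rate else "pearlite"

fast = predicted_phase(cooling_curve(900.0, 20.0, 1.0, 5.0, 0.001), 0.001)
slow = predicted_phase(cooling_curve(900.0, 20.0, 0.05, 20.0, 0.01), 0.01)
```

    An FEM model replaces the lumped cooling with spatially resolved heat conduction and the single critical rate with the full TTT curves of 51CrV4, but the query structure is the same.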

  10. STRAP PTM: Software Tool for Rapid Annotation and Differential Comparison of Protein Post-Translational Modifications.

    PubMed

    Spencer, Jean L; Bhatia, Vivek N; Whelan, Stephen A; Costello, Catherine E; McComb, Mark E

    2013-12-01

    The identification of protein post-translational modifications (PTMs) is an increasingly important component of proteomics and biomarker discovery, but very few tools exist for performing fast and easy characterization of global PTM changes and differential comparison of PTMs across groups of data obtained from liquid chromatography-tandem mass spectrometry experiments. STRAP PTM (Software Tool for Rapid Annotation of Proteins: Post-Translational Modification edition) is a program that was developed to facilitate the characterization of PTMs using spectral counting and a novel scoring algorithm to accelerate the identification of differential PTMs from complex data sets. The software facilitates multi-sample comparison by collating, scoring, and ranking PTMs and by summarizing data visually. The freely available software (beta release) installs on a PC and processes data in protXML format obtained from files parsed through the Trans-Proteomic Pipeline. The easy-to-use interface allows examination of results at protein, peptide, and PTM levels, and the overall design offers tremendous flexibility that provides proteomics insight beyond simple assignment and counting.
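    Spectral counting, the basis of the comparison above, amounts to counting PTM-bearing spectra per site and ranking fold changes across samples. A minimal sketch (the record format, pseudocount, and scoring below are illustrative, not STRAP PTM's actual algorithm):

```python
from collections import Counter

def ptm_spectral_counts(psms):
    """Spectral counts per (protein, PTM) pair from a list of
    peptide-spectrum-match records."""
    return Counter((p["protein"], p["ptm"]) for p in psms if p["ptm"])

def differential_ptms(counts_a, counts_b, min_fold=2.0):
    """Rank PTM sites by spectral-count fold change between two samples;
    a pseudocount of 1 avoids division by zero for absent sites."""
    keys = set(counts_a) | set(counts_b)
    scored = []
    for k in keys:
        a, b = counts_a.get(k, 0), counts_b.get(k, 0)
        fold = (b + 1) / (a + 1)
        if fold >= min_fold or fold <= 1.0 / min_fold:
            scored.append((k, a, b, fold))
    return sorted(scored, key=lambda r: abs(r[3] - 1.0), reverse=True)

# Hypothetical PSM lists for two conditions
sample_a = [{"protein": "P1", "ptm": "phospho"}] * 2 + [{"protein": "P2", "ptm": "acetyl"}] * 3
sample_b = [{"protein": "P1", "ptm": "phospho"}] * 9 + [{"protein": "P2", "ptm": "acetyl"}] * 3
ranked = differential_ptms(ptm_spectral_counts(sample_a), ptm_spectral_counts(sample_b))
```

    Here only the phosphorylation site on P1 clears the fold-change threshold; the acetylation counts are unchanged and drop out, which is the filtering behavior a differential-PTM tool aims for.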

  11. Detection and Characterization of Boundary-Layer Transition in Flight at Supersonic Conditions Using Infrared Thermography

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2008-01-01

    Infrared thermography is a powerful tool for investigating fluid mechanics on flight vehicles. (Can be used to visualize and characterize transition, shock impingement, separation etc.). Updated onboard F-15 based system was used to visualize supersonic boundary layer transition test article. (Tollmien-Schlichting and cross-flow dominant flow fields). Digital Recording improves image quality and analysis capability. (Allows accurate quantitative (temperature) measurements, Greater enhancement through image processing allows analysis of smaller scale phenomena).

  12. Graph theory for feature extraction and classification: a migraine pathology case study.

    PubMed

    Jorge-Hernandez, Fernando; Garcia Chimeno, Yolanda; Garcia-Zapirain, Begonya; Cabrera Zubizarreta, Alberto; Gomez Beldarrain, Maria Angeles; Fernandez-Ruanova, Begonya

    2014-01-01

    Graph theory is widely used as a representational form for characterizing brain connectivity networks, as is machine learning for classifying groups based on features extracted from images. Many of these studies use different techniques, such as preprocessing, correlations, features or algorithms. This paper proposes an automatic tool to perform a standard process using images from a Magnetic Resonance Imaging (MRI) machine. The process includes preprocessing, building a graph per subject with different correlations and atlases, extracting relevant features according to the literature, and finally applying a set of machine learning algorithms to produce analyzable results for physicians or specialists. To verify the process, a set of images from prescription drug abusers and patients with migraine was used. In this way, the proper functioning of the tool was demonstrated, providing success rates of 87% and 92% depending on the classifier used.
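    Typical graph features fed to such classifiers include node degree and the clustering coefficient. A small sketch on a hypothetical connectivity graph (not the paper's pipeline):

```python
def degree(adj, node):
    """Number of edges incident to a node in an adjacency-set graph."""
    return len(adj[node])

def clustering_coefficient(adj, node):
    """Fraction of a node's neighbor pairs that are themselves connected:
    a per-node feature commonly extracted from connectivity graphs."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

# Hypothetical connectivity graph: a triangle A-B-C plus a pendant node D
graph = {"A": {"B", "C", "D"}, "B": {"A", "C"}, "C": {"A", "B"}, "D": {"A"}}
```

    In a real pipeline the adjacency sets would come from thresholded correlation matrices between atlas regions, and the per-node values would be stacked into the subject's feature vector.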

  13. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M

    2005-01-01

Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.
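The Sequoia trace format is not described in the abstract, so the following is only a generic sketch of the kind of per-category time breakdown a trace-analysis tool built on an event library like SEAL might produce; the `(category, duration)` event tuples are an assumption made for the example.

```python
from collections import defaultdict

def characterize(events):
    """Aggregate wall time per event category from a parsed trace and
    return each category's share of total time.
    Each event is a (category, duration_seconds) tuple."""
    totals = defaultdict(float)
    for category, duration in events:
        totals[category] += duration
    grand = sum(totals.values())
    return {c: t / grand for c, t in totals.items()}

trace = [("compute", 4.0), ("mpi_send", 0.5),
         ("mpi_recv", 0.5), ("io_write", 1.0)]
print(characterize(trace))
```

A breakdown like this is what lets one say, e.g., that an application run is compute-bound versus communication-bound.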

  14. MoCha: Molecular Characterization of Unknown Pathways.

    PubMed

    Lobo, Daniel; Hammelman, Jennifer; Levin, Michael

    2016-04-01

    Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.
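MoCha's actual scoring over the STRING interaction data is richer than this, but the core search problem — find a protein that links a given set of known interactors — can be illustrated with a toy ranking. The protein names, the edge list, and the `min_links` cutoff below are made up for the example.

```python
def candidate_unknowns(interactions, known, min_links=2):
    """Rank proteins outside the known set by how many known proteins
    they interact with: a crude stand-in for searching a PPI network
    for the molecular identity of an unknown model component."""
    counts = {}
    for a, b in interactions:
        if a in known and b not in known:
            counts[b] = counts.get(b, 0) + 1
        if b in known and a not in known:
            counts[a] = counts.get(a, 0) + 1
    ranked = sorted(counts.items(), key=lambda kv: -kv[1])
    return [(p, n) for p, n in ranked if n >= min_links]

edges = [("NOTCH1", "X"), ("WNT3A", "X"), ("BMP4", "X"), ("NOTCH1", "Y")]
print(candidate_unknowns(edges, {"NOTCH1", "WNT3A", "BMP4"}))
```

Here the hypothetical protein "X" interacts with all three known components, so it is returned as the top candidate for the unknown node.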

  15. Friction Stir Welding (FSW) of Aged CuCrZr Alloy Plates

    NASA Astrophysics Data System (ADS)

    Jha, Kaushal; Kumar, Santosh; Nachiket, K.; Bhanumurthy, K.; Dey, G. K.

    2018-01-01

Friction Stir Welding (FSW) of Cu-0.80Cr-0.10Zr (in wt pct) alloy in the aged condition was performed to study the effects of process parameters on the microstructure and properties of the joint. FSW was performed over a wide range of process parameters, such as tool-rotation speed (from 800 to 1200 rpm) and tool-travel speed (from 40 to 100 mm/min), and the resulting thermal cycles were recorded on both sides (advancing and retreating) of the joint. The joints were characterized for their microstructure and tensile properties. The welding process resulted in sound and defect-free weld joints over the entire range of process parameters used in this study. Microstructure of the stir zone showed fine and equiaxed grains, the scale of which varied with FSW process parameters. Grain size in the stir zone correlated directly with tool-rotation speed and inversely with tool-travel speed. Tensile strength of the weld joints ranged from 225 to 260 MPa, which is substantially lower than that of the parent metal in the aged condition (~400 MPa) but superior to that of the parent material in the annealed condition (~220 MPa). The lower strength of the FSW joint relative to the aged parent material can be attributed to dissolution of the precipitates in the stir zone and TMAZ. These results are presented and discussed in this paper.

  16. Preparation and Characterization of PETI-330/Multiwalled Carbon Nanotube Composites

    NASA Technical Reports Server (NTRS)

    Ghose, Sayata; Watson, Kent A.; Working, Dennis C.; Delozier, Donavon M.; Criss, Jim M.; Siochi, Emilie J.; Connell, John W.

    2005-01-01

    As part of an ongoing effort to incorporate multi-functionality into advanced composites, blends of PETI-330 and multi-walled carbon nanotubes (MWCNTs) were prepared, characterized and fabricated into moldings. The PETI-330/MWCNT mixtures were prepared at concentrations ranging from 3 to 25 weight percent by dry mixing the components in a ball mill. The resulting powders were characterized for degree of mixing, thermal and rheological properties. Based on the characterization results, PETI-330/MWCNT samples were scaled up to approx. 300 g and used to fabricate moldings 10.2 cm x 15.2 cm x 0.32 cm thick. The moldings were fabricated by injecting the mixtures at 260-280 C into a stainless steel tool followed by curing for 1 h at 371 C. The tool was designed to impart high shear during the injection process in an attempt to achieve some alignment of the MWCNTs in the flow direction. Good quality moldings were obtained that were subsequently characterized for thermal, mechanical and electrical properties. The degree of dispersion and alignment of the MWCNTs were investigated using high-resolution scanning electron microscopy and Raman spectroscopy. The preparation and preliminary characterization of PETI-330/MWCNT composites will be discussed. Keywords: phenylethynyl terminated imides, high temperature polymers, nanocomposites,

  17. CAE for Injection Molding — Past, Present and the Future

    NASA Astrophysics Data System (ADS)

    Wang, Kuo K.

    2004-06-01

It is well known that injection molding is the most effective process for mass-producing discrete plastic parts of complex shape to the highest precision at the lowest cost. However, due to the complex behavior of polymeric materials undergoing a transient non-isothermal process, it is equally well recognized that the quality of final products is often difficult to assure. This is particularly true when a new mold or material is encountered. As a result, injection molding has often been viewed as more of an art than a science. During the past few decades, numerical simulation of the injection molding process based on analytic models has become feasible for practical use as computers have grown steadily faster and cheaper. A research effort was initiated at the Cornell Injection Molding Program (CIMP) in 1974 under a grant from the National Science Foundation. Over a quarter of a century, CIMP established scientific bases ranging from materials characterization and flow analysis to prediction of part quality. Use of such CAE tools has become commonplace in industry today. Present effort has been aimed primarily at refinements of many aspects of the process. Computational efficiency and user interfaces have been the main thrusts of commercial software developers. Extension to 3-dimensional flow analysis for certain parts has drawn some attention. Research activities are continuing on molding of fiber-filled materials and reactive polymers. Expanded molding processes such as gas-assisted, co-injection, micro-molding and many others are continually being investigated. In the future, improvements in simulation accuracy and efficiency will continue. This will include in-depth studies of materials characterization. Intelligent on-line process control may draw more attention in order to achieve a higher degree of automation. As Internet technology continues to evolve, Web-based CAE tools for design, production, and remote process monitoring and control can come to pass. The CAE tools will eventually be integrated into Enterprise Resource Planning (ERP) systems as the trend of enterprise globalization continues.

  18. Synthesis and characterization of attosecond light vortices in the extreme ultraviolet

    PubMed Central

    Géneaux, R.; Camper, A.; Auguste, T.; Gobert, O.; Caillat, J.; Taïeb, R.; Ruchon, T.

    2016-01-01

Infrared and visible light beams carrying orbital angular momentum (OAM) are currently thoroughly studied for their extremely broad applicative prospects, among which are quantum information, micromachining and diagnostic tools. Here we extend these prospects, presenting a comprehensive study of the synthesis and full characterization of optical vortices carrying OAM in the extreme ultraviolet (XUV) domain. We confirm the upconversion rules of a femtosecond infrared helically phased beam into its high-order harmonics, showing that each harmonic order carries the total number of OAM units absorbed in the process, up to very high orders (57). This allows us to synthesize and characterize helically shaped XUV trains of attosecond pulses. To demonstrate a typical use of these new XUV light beams, we show our ability to generate and control, through photoionization, attosecond electron beams carrying OAM. These breakthroughs pave the way for the study of a series of fundamental phenomena and the development of new ultrafast diagnostic tools using either photonic or electronic vortices. PMID:27573787
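The upconversion rule confirmed in this study has a compact form: if each driving infrared photon carries ℓ₁ units of OAM, the q-th harmonic, built from q driving photons, carries

```latex
\ell_q = q\,\ell_1
```

so a driver with ℓ₁ = 1 yields, for example, a 57th harmonic carrying 57 OAM units, consistent with the highest order reported above.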

  19. PET-Tool: a software suite for comprehensive processing and managing of Paired-End diTag (PET) sequence data.

    PubMed

    Chiu, Kuo Ping; Wong, Chee-Hong; Chen, Qiongyu; Ariyaratne, Pramila; Ooi, Hong Sain; Wei, Chia-Lin; Sung, Wing-Kin Ken; Ruan, Yijun

    2006-08-25

We recently developed the Paired End diTag (PET) strategy for efficient characterization of mammalian transcriptomes and genomes. The paired-end nature of short PET sequences derived from long DNA fragments raised a new set of bioinformatics challenges, including how to extract PETs from raw sequence reads and how to map PETs correctly yet efficiently to reference genome sequences. To accommodate and streamline data analysis of the large volume of PET sequences generated from each PET experiment, an automated PET data processing pipeline is desirable. We designed an integrated computational program package, PET-Tool, to automatically process PET sequences and map them to the genome sequences. The Tool was implemented as a web-based application composed of four modules: the Extractor module for PET extraction; the Examiner module for analytic evaluation of PET sequence quality; the Mapper module for locating PET sequences in the genome sequences; and the Project Manager module for data organization. The performance of PET-Tool was evaluated through the analysis of 2.7 million PET sequences. It was demonstrated that PET-Tool is accurate and efficient in extracting PET sequences and removing artifacts from large-volume datasets. Using optimized mapping criteria, over 70% of quality PET sequences were mapped specifically to the genome sequences. On a 2.4 GHz LINUX machine, it takes approximately six hours to process one million PETs from extraction to mapping. The speed, accuracy, and comprehensiveness have proved that PET-Tool is an important and useful component in PET experiments, and it can be extended to accommodate other related analyses of paired-end sequences. The Tool also provides user-friendly functions for data quality checks and a system for multi-layer data management.
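The Extractor step can be illustrated with a toy sketch. The read layout below — fixed-length ditags joined by a known linker sequence — is a hypothetical simplification, not the actual PET library structure, and the linker sequence and tag length are made up.

```python
def extract_pets(read, linker, tag_len=18):
    """Split a raw concatemer read into paired-end ditags.
    Hypothetical layout: each ditag is 2*tag_len bases (5' tag followed
    by 3' tag), and consecutive ditags are separated by a linker."""
    pets = []
    for chunk in read.split(linker):
        if len(chunk) == 2 * tag_len:          # keep full-length ditags only
            pets.append((chunk[:tag_len], chunk[tag_len:]))
    return pets

read = "A" * 18 + "T" * 18 + "CTGCTGTACG" + "G" * 18 + "C" * 18
print(extract_pets(read, "CTGCTGTACG"))
```

Chunks that are not full length (sequencing artifacts, partial ditags) are discarded, mirroring the artifact-removal role the abstract describes for the Extractor and Examiner modules.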

  20. PIPI: PTM-Invariant Peptide Identification Using Coding Method.

    PubMed

    Yu, Fengchao; Li, Ning; Yu, Weichuan

    2016-12-02

In computational proteomics, the identification of peptides with an unlimited number of post-translational modification (PTM) types is a challenging task. The computational cost associated with database search increases exponentially with respect to the number of modified amino acids and linearly with respect to the number of potential PTM types at each amino acid. The problem becomes intractable very quickly if we want to enumerate all possible PTM patterns. To address this issue, one group of methods, named restricted tools (including Mascot, Comet, and MS-GF+), allows only a small number of PTM types in the database search process. Alternatively, the other group of methods, named unrestricted tools (including MS-Alignment, ProteinProspector, and MODa), avoids enumerating PTM patterns with an alignment-based approach to localizing and characterizing modified amino acids. However, because of the large search space and the PTM localization issue, the sensitivity of these unrestricted tools is low. This paper proposes a novel method named PIPI to achieve PTM-invariant peptide identification. PIPI belongs to the category of unrestricted tools. It first codes peptide sequences into Boolean vectors and codes experimental spectra into real-valued vectors. For each coded spectrum, it then searches the coded sequence database to find the top-scored peptide sequences as candidates. After that, PIPI uses dynamic programming to localize and characterize modified amino acids in each candidate. We used simulation experiments and real data experiments to evaluate the performance in comparison with restricted tools (i.e., Mascot, Comet, and MS-GF+) and unrestricted tools (i.e., Mascot with error-tolerant search, MS-Alignment, ProteinProspector, and MODa). Comparison with restricted tools shows that PIPI achieves comparable sensitivity and running speed. Comparison with unrestricted tools shows that PIPI has the highest sensitivity except for Mascot with error-tolerant search and ProteinProspector. These two tools simplify the task by considering at most one modified amino acid per peptide, which yields higher sensitivity but has difficulty dealing with multiple modified amino acids. The simulation experiments also show that PIPI has the lowest false discovery proportion, the highest PTM characterization accuracy, and the shortest running time among the unrestricted tools.
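PIPI's exact coding scheme is not spelled out in the abstract; the sketch below only illustrates the general idea of coding a peptide as a Boolean vector and scoring it against a coded spectrum by dot product. The mass table, the 1 Da bin width, and the prefix-mass coding are illustrative assumptions.

```python
import numpy as np

# Monoisotopic residue masses for a few amino acids (Da), rounded
MASS = {"G": 57.02, "A": 71.04, "S": 87.03, "P": 97.05, "V": 99.07,
        "L": 113.08, "K": 128.09, "F": 147.07}

def code_peptide(seq, bin_width=1.0, n_bins=1200):
    """Boolean vector marking the bins of cumulative prefix masses
    (a crude stand-in for theoretical fragment positions)."""
    vec = np.zeros(n_bins, dtype=bool)
    total = 0.0
    for aa in seq[:-1]:                 # prefix fragments
        total += MASS[aa]
        vec[int(total / bin_width)] = True
    return vec

def score(spectrum_vec, peptide_vec):
    """Dot product between a real-valued coded spectrum and a Boolean
    coded peptide: counts matched peak bins, weighted by intensity."""
    return float(spectrum_vec @ peptide_vec)

pep = code_peptide("GAVLK")
spec = np.zeros(1200)
spec[[57, 128, 227]] = 1.0              # made-up peak bins
print(score(spec, pep))
```

A shift of the whole suffix of the vector (caused by a modification's mass offset) leaves most bins matchable, which is the intuition behind PTM-invariant candidate retrieval before the dynamic-programming localization step.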

  1. Metal-Insulator-Metal Diode Process Development for Energy Harvesting Applications

    DTIC Science & Technology

    2010-04-01

    Sputter Tool Dep Method: Sputtering (DC Magnetron ) Recipe: MC_Pt 1640A_TiO2 1000A_Ti 2000A_500C_1a MC_Pt 1640A_TiO2 1000A_Ti 2000A_300C_1a MC_Pt...thin films were sputtered onto silicon substrates with silicon dioxide overlayers. I-V measurements were taken using an electrical characterization...deposition of the entire MIM material stack to be done without breaking the vacuum within a multi-material system DC sputtering tool. A CAD layout of a MIM

  2. NASA Electronic Parts and Packaging Program

    NASA Technical Reports Server (NTRS)

    Kayali, Sammy

    2000-01-01

NEPP program objectives are to: (1) Assess the reliability of newly available electronic parts and packaging technologies for usage on NASA projects through validations, assessments, and characterizations, and the development of test methods/tools; (2) Expedite infusion paths for advanced (emerging) electronic parts and packaging technologies by evaluating readiness for manufacturability and project usage consideration; (3) Provide NASA projects with technology selection, application, and validation guidelines for electronic parts and packaging hardware and processes; and (4) Retain and disseminate electronic parts and packaging quality assurance, reliability validations, tools, and availability information to the NASA community.

  3. Integrated Measurements and Characterization | Photovoltaic Research | NREL

    Science.gov Websites

The Integrated Measurements and Characterization cluster tool offers powerful capabilities through its integrated tools. Basic cluster tool capabilities include sample handling: with ultra-high-vacuum connections, a sample can be interchanged between tools, such as the Copper Indium Gallium Diselenide cluster tool.

  4. Combining Simulation and Optimization Models for Hardwood Lumber Production

    Treesearch

    G.A. Mendoza; R.J. Meimban; W.G. Luppold; Philip A. Araman

    1991-01-01

    Published literature contains a number of optimization and simulation models dealing with the primary processing of hardwood and softwood logs. Simulation models have been developed primarily as descriptive models for characterizing the general operations and performance of a sawmill. Optimization models, on the other hand, were developed mainly as analytical tools for...

  5. A Simplified, General Approach to Simulating from Multivariate Copula Functions

    Treesearch

    Barry Goodwin

    2012-01-01

Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses probability...
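For contrast with the paper's simplified alternative (whose description is truncated above), here is the standard copula-specific recipe in the Gaussian case: draw correlated normals via a Cholesky factor, then probability-transform each margin to uniform. The correlation value and sample size are arbitrary choices for the demonstration.

```python
import numpy as np
from math import erf, sqrt

def gaussian_copula_sample(corr, n, rng):
    """Draw n samples with uniform margins coupled by a Gaussian copula
    with correlation matrix `corr`."""
    L = np.linalg.cholesky(corr)
    z = rng.standard_normal((n, corr.shape[0])) @ L.T   # correlated normals
    std_normal_cdf = np.vectorize(lambda x: 0.5 * (1 + erf(x / sqrt(2))))
    return std_normal_cdf(z)                            # transform to U(0,1)

rng = np.random.default_rng(1)
corr = np.array([[1.0, 0.7], [0.7, 1.0]])
u = gaussian_copula_sample(corr, 5000, rng)
print(np.corrcoef(u.T)[0, 1])
```

The printed Pearson correlation of the uniform margins is somewhat below the latent 0.7, as expected for the Gaussian copula; arbitrary marginal distributions are then obtained by applying their inverse CDFs to `u`.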

  6. A modular Human Exposure Model (HEM) framework to characterize near-field chemical exposure in LCIA and CAA

    EPA Science Inventory

    Life Cycle Impact Analysis (LCIA) has proven to be a valuable tool for systematically comparing processes and products, and has been proposed for use in Chemical Alternatives Analysis (CAA). The exposure assessment portion of the human health impact scores of LCIA has historicall...

  7. Monitoring non-thermal plasma processes for nanoparticle synthesis

    NASA Astrophysics Data System (ADS)

    Mangolini, Lorenzo

    2017-09-01

Process characterization tools have played a crucial role in the investigation of dusty plasmas. The presence of dust in certain non-thermal plasma processes was first detected by laser light scattering measurements. Techniques like laser-induced particle explosive evaporation and ion mass spectrometry have provided the experimental evidence necessary for the development of the theory of particle nucleation in silane-containing non-thermal plasmas. This review first provides a summary of these early efforts, and then discusses recent investigations using in situ characterization techniques to understand the interaction between nanoparticles and plasmas. The advancement of such monitoring techniques is necessary to fully develop the potential of non-thermal plasmas as unique materials synthesis and processing platforms. At the same time, the strong coupling between materials and plasma properties suggests that it is also necessary to advance techniques for the measurement of plasma properties in the presence of dust. Recent progress in this area is also discussed.

  8. Thermography Inspection for Early Detection of Composite Damage in Structures During Fatigue Loading

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph N.; Burke, Eric R.; Parker, F. Raymond; Seebo, Jeffrey P.; Wright, Christopher W.; Bly, James B.

    2012-01-01

Advanced composite structures are commonly tested under controlled loading. Understanding the initiation and progression of composite damage under load is critical for validating design concepts and structural analysis tools. Thermal nondestructive evaluation (NDE) is used to detect and characterize damage in composite structures during fatigue loading. A difference image processing algorithm is demonstrated to enhance damage detection and characterization by removing thermal variations not associated with defects. In addition, a one-dimensional multilayered thermal model is used to characterize damage. Lastly, the thermography results are compared to other inspections such as non-immersion ultrasonic inspection and X-ray computed tomography.
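The abstract does not give the difference algorithm's details; a minimal baseline-subtraction sketch, assuming the first few frames are acquired before loading, conveys the idea of removing thermal variations not associated with defects so that load-induced heating stands out. The frame counts, temperatures, and hot-spot geometry below are synthetic.

```python
import numpy as np

def difference_image(frames, baseline_count=5):
    """Subtract a pre-load baseline (mean of the first few frames) from
    every thermal frame, suppressing static temperature variation."""
    baseline = frames[:baseline_count].mean(axis=0)
    return frames - baseline

rng = np.random.default_rng(2)
frames = np.full((20, 64, 64), 300.0) + rng.normal(0, 0.05, (20, 64, 64))
frames[10:, 30:34, 30:34] += 2.0        # simulated damage-induced hot spot
diff = difference_image(frames)
print(diff[-1, 32, 32], diff[-1, 5, 5])
```

In the difference image the hot spot stands roughly 2 K above a near-zero background, whereas in the raw frames it would sit on a 300 K pedestal.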

  9. Afraid to Start Because the Outcome is Uncertain?: Social Site Characterization as a Tool for Informing Public Engagement Efforts

    USGS Publications Warehouse

    Wade, S.; Greenberg, S.

    2009-01-01

This paper introduces the concept of social site characterization as a parallel effort to technical site characterization, to be used in evaluating and planning carbon dioxide capture and storage (CCS) projects. Social site characterization, much like technical site characterization, relies on a series of iterative investigations into public attitudes towards a CCS project and the factors that will shape those views. This paper also suggests ways it can be used to design approaches for actively engaging stakeholders and communities in the deployment of CCS projects. This work is informed by observing the site selection process for FutureGen and the implementation of research projects under the Regional Carbon Sequestration Partnership Program. © 2009 Elsevier Ltd. All rights reserved.

  10. Bioprocess integration for human mesenchymal stem cells: From up to downstream processing scale-up to cell proteome characterization.

    PubMed

    Cunha, Bárbara; Aguiar, Tiago; Carvalho, Sofia B; Silva, Marta M; Gomes, Ricardo A; Carrondo, Manuel J T; Gomes-Alves, Patrícia; Peixoto, Cristina; Serra, Margarida; Alves, Paula M

    2017-04-20

To deliver the required cell numbers and doses to therapy, scaling up production and purification processes (at least to the liter scale) while maintaining cells' characteristics is compulsory. Therefore, the aim of this work was to prove the scalability of an integrated, streamlined bioprocess compatible with current good manufacturing practices (cGMP), comprising cell expansion, harvesting, and volume-reduction unit operations, using human mesenchymal stem cells (hMSC) isolated from bone marrow (BM-MSC) and adipose tissue (AT-MSC). BM-MSC and AT-MSC expansion and harvesting steps were scaled up from spinner flasks to a 2 L stirred-tank single-use bioreactor using synthetic microcarriers and xeno-free medium, ensuring high cellular volumetric productivities (50×10⁶ cells L⁻¹ day⁻¹), expansion factors (14-16 fold) and cell recovery yields (80%). For the concentration step, flat sheet cassettes (FSC) and hollow fiber cartridges (HF) were compared, showing a fairly linear scale-up, with a need to slightly decrease the permeate flux (30-50 LMH, respectively) to maximize cell recovery yield. Nonetheless, FSC allowed recovery of 18% more cells after a volume reduction factor of 50. Overall, at the end of the entire bioprocess more than 65% of viable (>95%) hMSC could be recovered without compromising the cells' critical quality attributes (CQA) of viability, identity and differentiation potential. Alongside the standard quality assays, a proteomics workflow based on mass spectrometry tools was established to characterize the impact of processing on hMSC CQA; these analytical tools constitute a powerful resource for process design and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Visual Illusions: An Interesting Tool to Investigate Developmental Dyslexia and Autism Spectrum Disorder

    PubMed Central

    Gori, Simone; Molteni, Massimo; Facoetti, Andrea

    2016-01-01

    A visual illusion refers to a percept that is different in some aspect from the physical stimulus. Illusions are a powerful non-invasive tool for understanding the neurobiology of vision, telling us, indirectly, how the brain processes visual stimuli. There are some neurodevelopmental disorders characterized by visual deficits. Surprisingly, just a few studies investigated illusory perception in clinical populations. Our aim is to review the literature supporting a possible role for visual illusions in helping us understand the visual deficits in developmental dyslexia and autism spectrum disorder. Future studies could develop new tools – based on visual illusions – to identify an early risk for neurodevelopmental disorders. PMID:27199702

  12. Non-Markovian quantum processes: Complete framework and efficient characterization

    NASA Astrophysics Data System (ADS)

    Pollock, Felix A.; Rodríguez-Rosario, César; Frauenheim, Thomas; Paternostro, Mauro; Modi, Kavan

    2018-01-01

    Currently, there is no systematic way to describe a quantum process with memory solely in terms of experimentally accessible quantities. However, recent technological advances mean we have control over systems at scales where memory effects are non-negligible. The lack of such an operational description has hindered advances in understanding physical, chemical, and biological processes, where often unjustified theoretical assumptions are made to render a dynamical description tractable. This has led to theories plagued with unphysical results and no consensus on what a quantum Markov (memoryless) process is. Here, we develop a universal framework to characterize arbitrary non-Markovian quantum processes. We show how a multitime non-Markovian process can be reconstructed experimentally, and that it has a natural representation as a many-body quantum state, where temporal correlations are mapped to spatial ones. Moreover, this state is expected to have an efficient matrix-product-operator form in many cases. Our framework constitutes a systematic tool for the effective description of memory-bearing open-system evolutions.

  13. An Analysis of the Use of Social Software and Its Impact on Organizational Processes

    NASA Astrophysics Data System (ADS)

    Pascual-Miguel, Félix; Chaparro-Peláez, Julián; Hernández-García, Ángel

This article proposes a study on the implementation rate of the most relevant Web 2.0 tools and technologies in Spanish enterprises, and their impact on 12 important aspects of business processes. In order to characterize the degree of implementation and the perceived improvements to processes, two indexes, an Implementation Index and an Impact Rate, have been created and displayed in a matrix called the "2.0 Success Matrix". Data have been analyzed from a survey administered to directors and executives of large companies and small and medium businesses.

  14. Assessing Group Interaction with Social Language Network Analysis

    NASA Astrophysics Data System (ADS)

    Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.

    In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.

  15. Effect of process parameters on microstructure and mechanical properties of friction stir welded joints: A review

    NASA Astrophysics Data System (ADS)

    Wanare, S. P.; Kalyankar, V. D.

    2018-04-01

Friction stir welding is emerging as a promising technique for joining lighter metal alloys due to its several advantages over conventional fusion welding processes, such as low thermal distortion, good mechanical properties, and fine weld-joint microstructure. This review article focuses on analysis of the microstructure and mechanical properties of friction stir welded joints. Various microstructure characterization techniques used by previous researchers, such as optical microscopy, x-ray diffraction, electron probe microanalysis, transmission electron microscopy, scanning electron microscopy with electron backscattered diffraction, and energy dispersive spectroscopy, are thoroughly reviewed and their results discussed. The effects of friction stir welding process parameters, such as tool rotational speed, welding speed, tool plunge depth, axial force, tool shoulder diameter to tool pin diameter ratio, and tool geometry, on the microstructure and mechanical properties of welded joints are studied and critical observations recorded. The microstructural examinations carried out by previous researchers on various zones of welded joints, such as the weld zone, heat-affected zone, and base metal, are studied and critical remarks presented. Mechanical performance of friction stir welded joints based on tensile tests, micro-hardness tests, etc., is discussed. This article includes an exhaustive review of standard research articles and may serve as ready information for subsequent researchers to establish their line of action.

  16. Methodologies and Tools for Tuning Parallel Programs: 80% Art, 20% Science, and 10% Luck

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Bailey, David (Technical Monitor)

    1996-01-01

    The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessors. However, without effective means to monitor (and analyze) program execution, tuning the performance of parallel programs becomes exponentially difficult as program complexity and machine size increase. In the past few years, the ubiquitous introduction of performance tuning tools from various supercomputer vendors (Intel's ParAide, TMC's PRISM, CRI's Apprentice, and Convex's CXtrace) seems to indicate the maturity of performance instrumentation/monitor/tuning technologies and vendors'/customers' recognition of their importance. However, a few important questions remain: What kind of performance bottlenecks can these tools detect (or correct)? How time consuming is the performance tuning process? What are some important technical issues that remain to be tackled in this area? This workshop reviews the fundamental concepts involved in analyzing and improving the performance of parallel and heterogeneous message-passing programs. Several alternative strategies will be contrasted, and for each we will describe how currently available tuning tools (e.g. AIMS, ParAide, PRISM, Apprentice, CXtrace, ATExpert, Pablo, IPS-2) can be used to facilitate the process. We will characterize the effectiveness of the tools and methodologies based on actual user experiences at NASA Ames Research Center. Finally, we will discuss their limitations and outline recent approaches taken by vendors and the research community to address them.

  17. TRIP-ID: A tool for a smart and interactive identification of Magic Formula tyre model parameters from experimental data acquired on track or test rig

    NASA Astrophysics Data System (ADS)

    Farroni, Flavio; Lamberti, Raffaele; Mancinelli, Nicolò; Timpone, Francesco

    2018-03-01

Tyres play a key role in ground vehicle dynamics because they are responsible for traction, braking and cornering. A proper tyre-road interaction model is essential for a useful and reliable vehicle dynamics model. In the last two decades Pacejka's Magic Formula (MF) has become a standard in the simulation field. This paper presents a tool, called TRIP-ID (Tyre Road Interaction Parameters IDentification), developed to characterize and identify, with a high degree of accuracy and reliability, MF micro-parameters from experimental data deriving from telemetry or from a test rig. The tool interactively guides the user through the identification process on the basis of strong diagnostic considerations about the experimental data, made evident by the tool itself. A motorsport application of the tool is shown as a case study.
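The MF macro-formula itself is standard (Pacejka's sine-of-arctangent form), so the curve being fitted can be shown directly; the coefficient values below are illustrative placeholders, not parameters identified by TRIP-ID.

```python
from math import atan, sin

def magic_formula(slip, B, C, D, E):
    """Pacejka Magic Formula: normalized force (or moment) vs. slip.
    B: stiffness, C: shape, D: peak, E: curvature factors."""
    return D * sin(C * atan(B * slip - E * (B * slip - atan(B * slip))))

# Illustrative lateral-force coefficients (not from the paper's dataset)
B, C, D, E = 10.0, 1.9, 1.0, 0.97
forces = [magic_formula(a, B, C, D, E) for a in (0.0, 0.05, 0.1)]
print(forces)
```

Identification tools like the one described search for the B, C, D, E values that make this curve best match measured force-versus-slip data.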

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Hua

Combustion represents a key chemical process in energy consumption in modern societies, and a clear and comprehensive understanding of the elementary reactions in combustion is of great importance to a number of challenging areas such as engine efficiency and environmental protection. In this award, we proposed to develop new theoretical tools to understand elementary chemical processes in combustion environments. With the support of this DOE grant, we have made significant advances in developing new, more efficient, and more accurate algorithms to characterize reaction dynamics.

  19. Developing seventh grade students' systems thinking skills in the context of the human circulatory system.

    PubMed

    Raved, Lena; Yarden, Anat

    2014-01-01

    Developing systems thinking skills in school can provide useful tools for dealing with the vast amount of medical and health information that may help learners make decisions in their future lives as citizens. Thus, there is a need to develop effective tools that will allow learners to analyze biological systems and organize their knowledge. Here, we examine junior high school students' systems thinking skills in the context of the human circulatory system. A model was formulated for developing teaching and learning materials and for characterizing students' systems thinking skills. Specifically, we asked whether seventh grade students who studied the human circulatory system acquired systems thinking skills, and what the characteristics of those skills are. Concept maps were used to characterize students' systems thinking components and examine possible changes in the students' knowledge structure. These maps were composed by the students before and after the learning process. The study findings indicate a significant improvement in the students' ability to recognize the system components and the processes that occur within the system, as well as the relationships between different levels of organization of the system, following the learning process. Thus, following learning, students were able to organize the system's components and processes within a framework of relationships; that is, the students' systems thinking skills improved in the course of learning using the teaching and learning materials.

  20. Developing Seventh Grade Students’ Systems Thinking Skills in the Context of the Human Circulatory System

    PubMed Central

    Raved, Lena; Yarden, Anat

    2014-01-01

    Developing systems thinking skills in school can provide useful tools for dealing with the vast amount of medical and health information that may help learners make decisions in their future lives as citizens. Thus, there is a need to develop effective tools that will allow learners to analyze biological systems and organize their knowledge. Here, we examine junior high school students’ systems thinking skills in the context of the human circulatory system. A model was formulated for developing teaching and learning materials and for characterizing students’ systems thinking skills. Specifically, we asked whether seventh grade students who studied the human circulatory system acquired systems thinking skills, and what the characteristics of those skills are. Concept maps were used to characterize students’ systems thinking components and examine possible changes in the students’ knowledge structure. These maps were composed by the students before and after the learning process. The study findings indicate a significant improvement in the students’ ability to recognize the system components and the processes that occur within the system, as well as the relationships between different levels of organization of the system, following the learning process. Thus, following learning, students were able to organize the system’s components and processes within a framework of relationships; that is, the students’ systems thinking skills improved in the course of learning using the teaching and learning materials. PMID:25520948

  1. Categorization for Faces and Tools-Two Classes of Objects Shaped by Different Experience-Differs in Processing Timing, Brain Areas Involved, and Repetition Effects.

    PubMed

    Kozunov, Vladimir; Nikolaeva, Anastasia; Stroganova, Tatiana A

    2017-01-01

    The brain mechanisms that integrate the separate features of sensory input into a meaningful percept depend upon prior experience of interaction with the object and differ between categories of objects. Recent studies using representational similarity analysis (RSA) have characterized either the spatial patterns of brain activity for different categories of objects or described how category structure in neuronal representations emerges in time, but never both simultaneously. Here we applied a novel, region-based, multivariate pattern classification approach in combination with RSA to magnetoencephalography data to extract activity associated with qualitatively distinct processing stages of visual perception. We asked participants to name what they saw whilst viewing bitonal visual stimuli of two categories predominantly shaped by either value-dependent or sensorimotor experience, namely faces and tools, and meaningless images. We aimed to disambiguate the spatiotemporal patterns of brain activity between the meaningful categories and determine which differences in their processing were attributable to either perceptual categorization per se, or later-stage mentalizing-related processes. We extracted three stages of cortical activity corresponding to low-level processing, category-specific feature binding, and supra-categorical processing. All face-specific spatiotemporal patterns were associated with bilateral activation of ventral occipito-temporal areas during the feature binding stage at 140-170 ms. Tool-specific activity was found both within the categorization stage and in a later period not thought to be associated with binding processes. The tool-specific binding-related activity was detected within a 210-220 ms window and localized to the intraparietal sulcus of the left hemisphere. Brain activity common to both meaningful categories started at 250 ms and included widely distributed assemblies within parietal, temporal, and prefrontal regions. Furthermore, we hypothesized and tested whether activity within face- and tool-specific binding-related patterns would demonstrate oppositely acting effects following procedural perceptual learning. We found that activity in the ventral, face-specific network increased following stimulus repetition. In contrast, tool processing in the dorsal network adapted by reducing its activity over the repetition period. Altogether, we have demonstrated that activity associated with visual processing of faces and tools during the categorization stage differs in processing timing, brain areas involved, and in the dynamics underlying stimulus learning.

  2. Ultimate intra-wafer critical dimension uniformity control by using lithography and etch tool corrections

    NASA Astrophysics Data System (ADS)

    Kubis, Michael; Wise, Rich; Reijnen, Liesbeth; Viatkina, Katja; Jaenen, Patrick; Luca, Melisa; Mernier, Guillaume; Chahine, Charlotte; Hellin, David; Kam, Benjamin; Sobieski, Daniel; Vertommen, Johan; Mulkens, Jan; Dusa, Mircea; Dixit, Girish; Shamma, Nader; Leray, Philippe

    2016-03-01

    With shrinking design rules, the overall patterning requirements are getting aggressively tighter. For the 7-nm node and below, allowable CD uniformity variations are entering the Angstrom region (ref [1]). Optimizing inter- and intra-field CD uniformity of the final pattern requires holistic tuning of all process steps. In previous work, CD control with either litho cluster or etch tool corrections has been discussed. Here, we present a holistic CD control approach, combining the correction capability of the etch tool with the correction capability of the exposure tool. The study is done on 10-nm logic node wafers, processed with a test vehicle stack patterning sequence. We include wafer-to-wafer and lot-to-lot variation and apply optical scatterometry to characterize the fingerprints. Making use of all available correction capabilities (lithography and etch), we investigated exposure tool corrections and etch tool corrections applied individually, as well as in combination, to reach the lowest CD uniformity. Results of the final pattern uniformity based on single and combined corrections are shown. We conclude on the application of this holistic lithography and etch optimization to 7-nm high-volume manufacturing, paving the way to ultimate within-wafer CD uniformity control.
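    As a rough illustration of holistic CD tuning, the sketch below decomposes a synthetic intra-wafer CD map into terms assumed correctable by the exposure tool (a linear tilt) and by the etch tool (a radial fingerprint), then compares 3-sigma uniformity before and after removing the correctable content. The grid, model terms and magnitudes are hypothetical, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical intra-wafer CD map on a coarse grid (nm): radial etch
# fingerprint + linear scanner tilt + random residual noise.
xv, yv = np.meshgrid(np.linspace(-1, 1, 21), np.linspace(-1, 1, 21))
mask = xv**2 + yv**2 <= 1.0                      # keep points on the wafer
x, y = xv[mask], yv[mask]
r2 = x**2 + y**2
cd = 40.0 + 0.8 * r2 + 0.3 * x + rng.normal(0, 0.05, x.size)

# Correctable model: offset + linear (exposure-tool) + radial (etch-tool).
A = np.column_stack([np.ones_like(x), x, y, r2])
coef, *_ = np.linalg.lstsq(A, cd, rcond=None)
resid = cd - A @ coef                            # uncorrectable remainder

u3s_before = 3 * cd.std()    # 3-sigma CD uniformity, raw
u3s_after = 3 * resid.std()  # 3-sigma after combined corrections
```

    In practice the correctable terms map onto actuator knobs (dose/focus per field, etch chamber tuning) rather than a single least-squares fit, but the before/after residual comparison is the same idea.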

  3. Automatic analysis of microscopic images of red blood cell aggregates

    NASA Astrophysics Data System (ADS)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell (RBC) aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Image processing and analysis for the characterization of RBC aggregation have frequently been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be attractive as a routine in hemorheological and clinical biochemistry laboratories because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).
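    Automatic quantification of aggregates of the kind described above typically starts with connected-component labelling of a segmented image. A minimal sketch using `scipy.ndimage` on a hypothetical binary image (the shapes and sizes are invented for illustration, not taken from the paper):

```python
import numpy as np
from scipy import ndimage

# Hypothetical segmented (binary) microscopy image: 1 = cell pixels.
img = np.zeros((40, 40), dtype=int)
img[5:9, 5:25] = 1      # an elongated clump: rouleau-like linear aggregate
img[20:24, 10:14] = 1   # a small isolated aggregate
img[30:34, 30:34] = 1   # another isolated aggregate

labels, n = ndimage.label(img)                     # connected components
sizes = ndimage.sum(img, labels, range(1, n + 1))  # area (pixels) per aggregate
```

    From the per-component areas and shapes (e.g., elongation), rouleaux can then be distinguished from clumped or isolated cells.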

  4. Dynamics and morphometric characterization of hippocampus neurons using digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Elkatlawy, Saeid; Gomariz, María.; Soto-Sánchez, Cristina; Martínez Navarrete, Gema; Fernández, Eduardo; Fimia, Antonio

    2014-05-01

    In this paper we report on the use of digital holographic microscopy for 3D real-time imaging of cultured neurons and neural networks in vitro. Digital holographic microscopy is employed as an assessment tool to study the biophysical origin of neurodegenerative diseases. Our study consists of the morphological characterization of the axon, dendrites and cell bodies. The average size and thickness of the soma were 21 and 13 μm, respectively. Furthermore, the average size and diameter of some randomly selected neurites were 4.8 and 0.89 μm, respectively. In addition, the spatiotemporal growth process of cell bodies and extensions was fitted by a non-linear model, consistent with the non-linear behavior of the nervous system. Remarkably, this non-linear process captures the relationship between the growth of the cell body and that of the axon and dendrites of the neurons.

  5. Atomization and vaporization characteristics of airblast fuel injection inside a venturi tube

    NASA Technical Reports Server (NTRS)

    Sun, H.; Chue, T.-H.; Lai, M.-C.; Tacina, R. R.

    1993-01-01

    This paper describes the experimental and numerical characterization of capillary fuel injection, atomization, dispersion, and vaporization of liquid fuel in a coflowing air stream inside a single venturi tube. The experimental techniques used are all laser-based. A phase Doppler analyzer was used to characterize the atomization and vaporization process. Planar laser-induced fluorescence visualizations give a good qualitative picture of the fuel droplet and vapor distribution. Limited quantitative capabilities of the technique are also demonstrated. A modified version of the KIVA-II code was used to simulate the entire spray process, including breakup and vaporization. The advantage of the venturi nozzle is demonstrated in terms of better atomization, more uniform fuel/air (F/A) distribution, and lower pressure drop. Multidimensional spray calculations can be used as a design tool only if care is taken in modeling the breakup and wall-impingement processes.

  6. Next generation calmodulin affinity purification: Clickable calmodulin facilitates improved protein purification

    PubMed Central

    Kinzer-Ursem, Tamara L.

    2018-01-01

    As the proteomics field continues to expand, scientists are looking to integrate cross-disciplinary tools for studying protein structure, function, and interactions. Protein purification remains a key tool for many characterization studies. Calmodulin (CaM) is a calcium-binding messenger protein with over a hundred downstream binding partners, and is involved in a host of physiological processes, from learning and memory to immune and cardiac function. To facilitate biophysical studies of calmodulin, researchers have designed a site-specific labeling process for use in bioconjugation applications while maintaining high levels of protein activity. Here, we present a platform for selective conjugation of calmodulin directly from clarified cell lysates under bioorthogonal reaction conditions. Using a chemoenzymatically modified calmodulin, we employ popular click chemistry reactions for the conjugation of calmodulin to Sepharose resin, thereby streamlining a previously multi-step purification and conjugation process. We show that this “next-generation” calmodulin-Sepharose resin is not only easy to produce, but is also able to purify more calmodulin-binding proteins per volume of resin than traditional calmodulin-Sepharose resins. We expect these methods to be translatable to other proteins of interest and to other conjugation applications such as surface-based assays for the characterization of protein-protein interaction dynamics. PMID:29864125

  7. Current–Voltage Characterization of Individual As-Grown Nanowires Using a Scanning Tunneling Microscope

    PubMed Central

    2013-01-01

    Utilizing semiconductor nanowires for (opto)electronics requires exact knowledge of their current–voltage properties. We report accurate on-top imaging and I–V characterization of individual as-grown nanowires, using a subnanometer resolution scanning tunneling microscope with no need for additional microscopy tools, thus allowing versatile application. We form Ohmic contacts to InP and InAs nanowires without any sample processing, followed by quantitative measurements of diameter dependent I–V properties with a very small spread in measured values compared to standard techniques. PMID:24059470

  8. Current-voltage characterization of individual as-grown nanowires using a scanning tunneling microscope.

    PubMed

    Timm, Rainer; Persson, Olof; Engberg, David L J; Fian, Alexander; Webb, James L; Wallentin, Jesper; Jönsson, Andreas; Borgström, Magnus T; Samuelson, Lars; Mikkelsen, Anders

    2013-11-13

    Utilizing semiconductor nanowires for (opto)electronics requires exact knowledge of their current-voltage properties. We report accurate on-top imaging and I-V characterization of individual as-grown nanowires, using a subnanometer resolution scanning tunneling microscope with no need for additional microscopy tools, thus allowing versatile application. We form Ohmic contacts to InP and InAs nanowires without any sample processing, followed by quantitative measurements of diameter dependent I-V properties with a very small spread in measured values compared to standard techniques.

  9. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    PubMed

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step in implementing the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and a multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study, with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which are applied in the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain in a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. © 2016 American Institute of Chemical Engineers.
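    The Bayesian predictive idea can be sketched for a single CQA and a single CPP: draw model parameters from their (assumed normal) posteriors, simulate predictive CQA values at each candidate CPP setting, and retain the settings where the probability of meeting specification exceeds a threshold. All numbers below are hypothetical; the paper's actual models cover multiple CQAs and CPPs with interactions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed fitted DoE model for one CQA vs. a centered CPP x:
# cqa = b0 + b1*x + noise, with posterior uncertainty on b0, b1.
b0_draws = rng.normal(100.0, 0.5, 20_000)   # intercept posterior (hypothetical)
b1_draws = rng.normal(2.0, 0.2, 20_000)     # slope posterior (hypothetical)
sigma = 1.0                                  # residual SD (hypothetical)
spec_lo, spec_hi = 95.0, 105.0               # CQA specification limits

def prob_in_spec(x):
    """Bayesian predictive probability the CQA meets spec at CPP setting x."""
    pred = rng.normal(b0_draws + b1_draws * x, sigma)
    return np.mean((pred >= spec_lo) & (pred <= spec_hi))

# PAR: CPP settings whose predictive in-spec probability is at least 95%.
grid = np.linspace(-3, 3, 25)
par = [x for x in grid if prob_in_spec(x) >= 0.95]
```

    Replacing the normal posterior draws with MCMC samples from a fitted multivariate model, and intersecting the acceptable regions over all CQAs, gives the multivariate version described in the abstract.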

  10. Fabrication and Characterization of High Temperature Resin/Carbon Nanofiber Composites

    NASA Technical Reports Server (NTRS)

    Ghose, Sayata; Watson, Kent A.; Working, Dennis C.; Criss, Jim M.; Siochi, Emilie J.; Connell, John W.

    2005-01-01

    Multifunctional composites present a route to structural weight reduction. Nanoparticles such as carbon nanofibers (CNF) provide a compromise as a lower cost nanosize reinforcement that yields a desirable combination of properties. Blends of PETI-330 and CNFs were prepared and characterized to investigate the potential of CNF composites as a high performance structural medium. Dry mixing techniques were employed and the effect of CNF loading level on melt viscosity was determined. The resulting powders were characterized for degree of mixing, thermal and rheological properties. Based on the characterization results, samples containing 30 and 40 wt% CNF were scaled up to approximately 300 g and used to fabricate moldings 10.2 cm x 15.2 cm x 0.32 cm thick. The moldings were fabricated by injecting the mixtures at 260-280 C into a stainless steel tool followed by curing for 1 h at 371 C. The tool was designed to impart high shear during the process in an attempt to achieve some alignment of CNFs in the flow direction. Moldings were obtained that were subsequently characterized for thermal, mechanical and electrical properties. The degree of dispersion and alignment of CNFs were investigated using high-resolution scanning electron microscopy. The preparation and preliminary characterization of PETI-330/CNF composites are discussed. Keywords: resins, carbon nanofibers, scanning electron microscopy, electrical properties, thermal conductivity, injection

  11. Initial Assessment of X-Ray Computer Tomography image analysis for material defect microstructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, Joshua James; Windes, William Enoch

    2016-06-01

    The original development work leading to this report was focused on the non-destructive three-dimensional (3-D) characterization of nuclear graphite as a means to better understand the nature of its inherent pore structure. The pore structure of graphite and its evolution under various environmental factors such as irradiation, mechanical stress, and oxidation plays an important role in its observed properties and characteristics. If we are to transition from an empirical understanding of graphite behavior to a truly predictive, mechanistic understanding, the pore structure must be well characterized and understood. As the pore structure within nuclear graphite is highly interconnected and truly 3-D in nature, 3-D characterization techniques are critical. While 3-D characterization has been an excellent tool for graphite pore characterization, it is applicable to a broad range of materials systems over many length scales. Given the wide range of applications and the highly quantitative nature of the tool, it is surprising how few materials researchers understand how valuable a tool 3-D image processing and analysis can be. Ultimately, this report is intended to encourage broader use of 3-D image processing and analysis in materials science and engineering applications, more specifically nuclear-related materials applications, by providing interested readers with enough familiarity to explore its vast potential in identifying microstructure changes. To encourage this broader use, the report is divided into two main sections. Section 2 provides an overview of some of the key principles and concepts needed to extract a wide variety of quantitative metrics from a 3-D representation of a material microstructure. The discussion includes a brief overview of segmentation methods, connected components, morphological operations, distance transforms, and skeletonization. Section 3 focuses on the application of concepts from Section 2 to relevant materials at Idaho National Laboratory. In this section, image analysis examples featuring nuclear graphite are discussed in detail. Additionally, example analyses from Transient Reactor Test Facility low-enriched uranium conversion, Advanced Gas Reactor-like compacts, and tristructural isotropic particles are shown to give a broader perspective of the applicability to relevant materials of interest.
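    Several of the concepts listed above (segmentation, connected components, distance transforms) can be sketched in a few lines with `scipy.ndimage` on a toy 3-D "microstructure" containing two spherical pores; the geometry is invented purely for illustration:

```python
import numpy as np
from scipy import ndimage

# Toy 3-D volume (20^3 voxels) with two spherical pores.
z, y, x = np.ogrid[:20, :20, :20]
pores = (((z - 6)**2 + (y - 6)**2 + (x - 6)**2) <= 9) | \
        (((z - 14)**2 + (y - 14)**2 + (x - 14)**2) <= 4)

labels, n_pores = ndimage.label(pores)          # connected-component labelling
porosity = pores.mean()                         # pore volume fraction
dist = ndimage.distance_transform_edt(~pores)   # distance from solid to pore
```

    From these primitives one can already compute per-pore volumes, nearest-pore spacing (from the distance map), and, with morphological operations and skeletonization, pore connectivity metrics of the kind the report describes.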

  12. 2015 Army Science Planning and Strategy Meeting Series: Outcomes and Conclusions

    DTIC Science & Technology

    2017-12-21

    modeling and nanoscale characterization tools to enable efficient design of hybridized manufacturing; real-time, multiscale computational capability...to enable predictive analytics for expeditionary on-demand manufacturing • Discovery of design principles to enable programming advanced genetic...goals, significant research is needed to mature the fundamental materials science, processing and manufacturing sciences, design methodologies, data

  13. Improving the Validity and Reliability of Large Scale Writing Assessment.

    ERIC Educational Resources Information Center

    Fenton, Ray; Straugh, Tom; Stofflet, Fred; Garrison, Steve

    This paper examines the efforts of the Anchorage School District, Alaska, to improve the validity of its writing assessment as a useful tool for the training of teachers and the characterization of the quality of student writing. The paper examines how a number of changes in the process and scoring of the Anchorage Writing Assessment affected the…

  14. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics*

    PubMed Central

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying

    2016-01-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kitanidis, Peter

    As large-scale, commercial storage projects become operational, the problem of utilizing information from diverse sources becomes more critically important. In this project, we developed, tested, and applied an advanced joint data inversion system for CO2 storage modeling with large data sets for use in site characterization and real-time monitoring. Emphasis was on the development of advanced and efficient computational algorithms for joint inversion of hydro-geophysical data, coupled with state-of-the-art forward process simulations. The developed system consists of (1) inversion tools using characterization data, such as 3D seismic surveys (amplitude images), borehole log and core data, as well as hydraulic, tracer and thermal tests before CO2 injection, (2) joint inversion tools for updating the geologic model with the distribution of rock properties, thus reducing uncertainty, using hydro-geophysical monitoring data, and (3) highly efficient algorithms for directly solving the dense or sparse linear algebra systems derived from the joint inversion. The system combines methods from stochastic analysis, fast linear algebra, and high performance computing. The developed joint inversion tools have been tested through synthetic CO2 storage examples.

  16. SedInConnect: a stand-alone, free and open source tool for the assessment of sediment connectivity

    NASA Astrophysics Data System (ADS)

    Crema, Stefano; Cavalli, Marco

    2018-02-01

    There is a growing call within the scientific community for solid theoretical frameworks and usable indices/models to assess sediment connectivity. Connectivity plays a significant role in characterizing structural properties of the landscape and, when considered in combination with forcing processes (e.g., rainfall-runoff modelling), can represent a valuable analysis for improved landscape management. In this work, the authors present the development and application of SedInConnect: a free, open source and stand-alone application for the computation of the Index of Connectivity (IC), as expressed in Cavalli et al. (2013), with the addition of specific innovative features. The tool is intended for a wide variety of users, both from the scientific community and from the authorities involved in environmental planning. Thanks to its open source nature, the tool can be adapted and/or integrated according to users' requirements. Furthermore, presenting an easy-to-use interface and being a stand-alone application, the tool can help management experts in the quantitative assessment of sediment connectivity in the context of hazard and risk assessment. An application to a sample dataset and an overview of up-to-date applications of the approach and of the tool show the development potential of such analyses. The modelled connectivity, in fact, appears suitable not only for characterizing sediment dynamics at the catchment scale but also for integrating prediction models and as an aid to geomorphological interpretation.
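    For reference, the Index of Connectivity computed by SedInConnect follows Cavalli et al. (2013): IC = log10(Dup/Ddn), with an upslope component Dup = W̄·S̄·√A (mean weighting factor, mean slope, upslope area) and a downslope component Ddn = Σ dᵢ/(Wᵢ·Sᵢ) summed along the flow path to the target. A minimal numeric sketch, with hypothetical weights, slopes and path lengths:

```python
import math

def index_of_connectivity(w_up, s_up, area, path):
    """IC = log10(D_up / D_dn), after Cavalli et al. (2013).

    w_up, s_up: mean weighting factor and mean slope of the upslope area
    area:       upslope contributing area [m^2]
    path:       downslope flow path as (length d_i, weight W_i, slope S_i)
    """
    d_up = w_up * s_up * math.sqrt(area)
    d_dn = sum(d / (w * s) for d, w, s in path)
    return math.log10(d_up / d_dn)

# Hypothetical cell: upslope area 10,000 m^2, three 10 m downslope cells.
ic = index_of_connectivity(
    0.6, 0.3, 10_000.0,
    [(10.0, 0.7, 0.25), (10.0, 0.5, 0.2), (10.0, 0.8, 0.3)],
)
```

    In the raster tool, the same computation is performed per cell over flow-routing grids derived from a DEM; higher (less negative) IC indicates better-connected cells.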

  17. An SPM12 extension for multiple sclerosis lesion segmentation

    NASA Astrophysics Data System (ADS)

    Roura, Eloy; Oliver, Arnau; Cabezas, Mariano; Valverde, Sergi; Pareto, Deborah; Vilanova, Joan C.; Ramió-Torrentà, Lluís.; Rovira, Àlex; Lladó, Xavier

    2016-03-01

    Purpose: Magnetic resonance imaging is nowadays the hallmark for diagnosing multiple sclerosis (MS), which is characterized by white matter lesions. Several approaches have recently been presented to tackle the lesion segmentation problem, but none of them has been accepted as a standard tool in daily clinical practice. In this work we present yet another tool that automatically segments white matter lesions, outperforming current state-of-the-art approaches. Methods: This work is an extension of Roura et al. [1], where external and platform-dependent pre-processing libraries (brain extraction, noise reduction and intensity normalization) were required to achieve optimal performance. Here we have updated and included all these required pre-processing steps in a single framework (the SPM software), so there is no need for external tools to achieve the desired segmentation results. Besides, we have changed the working space from T1w to FLAIR, reducing interpolation errors produced in the registration process from FLAIR to T1w space. Finally, a post-processing constraint based on shape and location has been added to reduce false positive detections. Results: The evaluation of the tool has been done on 24 MS patients. Qualitative and quantitative results are shown for both approaches in terms of lesion detection and segmentation. Conclusion: We have simplified both installation and implementation of the approach, providing a multiplatform tool integrated into the SPM software that relies only on T1w and FLAIR images. With this new version we have reduced the computation time of the previous approach while maintaining its performance.

  18. Sensible use of antisense: how to use oligonucleotides as research tools.

    PubMed

    Myers, K J; Dean, N M

    2000-01-01

    In the past decade, there has been a vast increase in the amount of gene sequence information that has the potential to revolutionize the way diseases are both categorized and treated. Old diagnoses, largely anatomical or descriptive in nature, are likely to be superseded by the molecular characterization of disease. The recognition that certain genes drive key disease processes will also enable the rational design of gene-specific therapeutics. Antisense oligonucleotides represent a technology that should play multiple roles in this process.

  19. Models, Measurements, and Local Decisions: Assessing and ...

    EPA Pesticide Factsheets

    This presentation combines modeling and measurement results to characterize near-source air quality in Newark, New Jersey, with consideration of how this information could be used to inform decision making to reduce the risk of health impacts. Decisions could include exposure or emissions reductions and could involve a host of stakeholders, including residents, academics, NGOs, and local and federal agencies. The presentation includes results from the C-PORT modeling system and from a citizen science project in the local area. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  20. Determining Spacecraft Reaction Wheel Friction Parameters

    NASA Technical Reports Server (NTRS)

    Sarani, Siamak

    2009-01-01

    Software was developed to characterize the drag in each of the Cassini spacecraft's Reaction Wheel Assemblies (RWAs) to determine the RWA friction parameters. This tool measures the drag torque of RWAs not only at high spin rates (greater than 250 RPM), but also at low spin rates (less than 250 RPM), where there is no elastohydrodynamic boundary layer in the bearings. RWA rate and drag torque profiles as functions of time are collected via telemetry once every 4 seconds and once every 8 seconds, respectively. Intermediate processing steps single out the coast-down regions. A nonlinear model for the drag torque as a function of RWA spin rate is incorporated in order to characterize the low spin rate regime. The tool then uses a nonlinear parameter optimization algorithm based on the Nelder-Mead simplex method to determine the viscous coefficient, the Dahl friction, and the two parameters that account for the low spin-rate behavior.
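
    The abstract does not give the exact functional form of the drag model, but the fitting step it describes can be sketched. The following is a minimal illustration, assuming a hypothetical model with a constant Dahl term, a viscous term, and an exponentially decaying low-spin-rate correction (all parameter names invented here), minimized with SciPy's Nelder-Mead implementation:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def drag_torque(params, omega):
        # Hypothetical drag model: Dahl (constant) friction + viscous drag,
        # plus a low-spin-rate term that decays exponentially with |omega|.
        t_dahl, c_visc, a_low, b_low = params
        return (t_dahl * np.sign(omega)
                + c_visc * omega
                + a_low * np.sign(omega) * np.exp(-np.abs(omega) / b_low))

    def fit_friction_params(omega, torque, guess):
        # Least-squares cost minimized with the gradient-free Nelder-Mead
        # simplex, the algorithm named in the abstract.
        cost = lambda p: np.sum((drag_torque(p, omega) - torque) ** 2)
        res = minimize(cost, guess, method="Nelder-Mead",
                       options={"maxiter": 10000, "xatol": 1e-12, "fatol": 1e-15})
        return res.x
    ```

    In practice `omega` and `torque` would come from the telemetered coast-down profiles; the model form above is an assumption for illustration, not the flight tool's actual parameterization.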

  1. A simulation study to quantify the impacts of exposure ...

    EPA Pesticide Factsheets

    A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  2. Extending i-line capabilities through variance characterization and tool enhancement

    NASA Astrophysics Data System (ADS)

    Miller, Dan; Salinas, Adrian; Peterson, Joel; Vickers, David; Williams, Dan

    2006-03-01

    Continuous economic pressures have moved a large percentage of integrated device manufacturing (IDM) operations either overseas or to foundry operations over the last 10 years. These pressures have left the IDM fabs in the U.S. with required cost-of-ownership (COO) improvements in order to maintain operations domestically. While the assets of many of these factories are at a very favorable point in the depreciation life cycle, the equipment and processes are constrained by the quality of the equipment in its original state and its degradation over its installed life. With the objective of enhancing output and improving process performance, this factory and its primary lithography process tool supplier have been able to extend the usable life of the existing process tools, increase the output of the tool base, and improve the distribution of the CDs on the product produced. Texas Instruments Incorporated led an investigation with the POLARIS® Systems & Services business of FSI International to determine the sources of variance in the i-line processing of a wide array of IC device types. Sources of variance such as PEB temperature, PEB delay time, develop recipe, develop time, and develop programming were investigated. While PEB processes are a primary driver for acid-catalyzed resists, the develop mode is shown in this work to have an overwhelming impact on the wafer-to-wafer and across-wafer CD performance of these i-line processes. These changes have improved the wafer-to-wafer CD distribution by more than 80% and the within-wafer CD distribution by more than 50%, while enabling a greater than 50% increase in lithography cluster throughput. The paper will discuss the contribution of each source of variance and its importance in overall system performance.

  3. Characterization of Light Lesion Paradigms and Optical Coherence Tomography as Tools to Study Adult Retina Regeneration in Zebrafish

    PubMed Central

    Weber, Anke; Hochmann, Sarah; Cimalla, Peter; Gärtner, Maria; Kuscha, Veronika; Hans, Stefan; Geffarth, Michaela; Kaslin, Jan; Koch, Edmund; Brand, Michael

    2013-01-01

    Light-induced lesions are a powerful tool to study the amazing ability of photoreceptors to regenerate in the adult zebrafish retina. However, the specificity of the lesion towards photoreceptors and regional differences within the retina are still incompletely understood. We therefore characterized the process of degeneration and regeneration in an established paradigm, using intense white light from a fluorescence lamp on swimming fish (diffuse light lesion). We also designed a new light lesion paradigm in which light is focused through a microscope onto the retina of an immobilized fish (focused light lesion). The focused light lesion has the advantage of creating a locally restricted area of damage, with the additional benefit of an untreated control eye in the same animal. In both paradigms, cell death is observed as an immediate early response, and proliferation is initiated around 2 days post lesion (dpl), peaking at 3 dpl. We furthermore find that two photoreceptor subtypes (UV- and blue-sensitive cones) are more susceptible to intense white light than red/green double cones and rods. We also observed specific differences within light-lesioned areas with respect to the process of photoreceptor degeneration: UV cone debris is removed later than that of any other type of photoreceptor. Nonspecific damage to retinal neurons occurs at the center of a focused light lesion territory, but not in the diffuse light lesion areas. We simulated the optical properties of the fish eye in software and show that they may explain the light lesion patterns we observe. Furthermore, as a new tool to study retinal degeneration and regeneration in individual fish in vivo, we use spectral domain optical coherence tomography. Collectively, the light lesion and imaging assays described here represent powerful tools for studying degeneration and regeneration processes in the adult zebrafish retina. PMID:24303018

  4. Quantitative image analysis for evaluating the coating thickness and pore distribution in coated small particles.

    PubMed

    Laksmana, F L; Van Vliet, L J; Hartman Kok, P J A; Vromans, H; Frijlink, H W; Van der Voort Maarschalk, K

    2009-04-01

    This study aims to develop a characterization method for coating structure based on image analysis, which is particularly promising for the rational design of coated particles in the pharmaceutical industry. The method applies the MATLAB image processing toolbox to images of coated particles taken with confocal scanning laser microscopy (CSLM). The coating thicknesses have been determined along the particle perimeter, from which a statistical analysis could be performed to obtain relevant thickness properties, e.g. the minimum coating thickness and the span of the thickness distribution. The characterization of the pore structure involved a proper segmentation of pores from the coating and a granulometry operation. The presented method facilitates the quantification of porosity, thickness and pore size distribution of a coating. These parameters are considered the important coating properties, which are critical to coating functionality. Additionally, the effect of coating process variations on coating quality can straightforwardly be assessed. Enabling a good characterization of coating quality, the presented method can be used as a fast and effective tool to predict coating functionality. This approach also enables the influence of different process conditions on coating properties to be effectively monitored, which ultimately enables process tailoring.
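
    As an illustration of the perimeter-thickness measurement, the following sketch (in Python/NumPy rather than the paper's MATLAB toolbox, with a synthetic concentric-disk particle standing in for a segmented CSLM image) samples coating thickness along rays from the particle centroid and reports the minimum, span, and mean discussed above:

    ```python
    import numpy as np

    def thickness_stats(core, coated, n_angles=360):
        """Radial coating thickness (in pixels) around the particle perimeter.

        `core` and `coated` are boolean masks; `coated` includes the core.
        Returns (minimum thickness, span, mean thickness).
        """
        cy, cx = np.argwhere(core).mean(axis=0)   # particle centroid
        rs = np.arange(max(core.shape))
        thickness = []
        for ang in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
            # Sample pixels along a ray from the centroid outward.
            ys = np.round(cy + rs * np.sin(ang)).astype(int)
            xs = np.round(cx + rs * np.cos(ang)).astype(int)
            ok = (ys >= 0) & (ys < core.shape[0]) & (xs >= 0) & (xs < core.shape[1])
            r_core = rs[ok][core[ys[ok], xs[ok]]].max()    # last core pixel on the ray
            r_coat = rs[ok][coated[ys[ok], xs[ok]]].max()  # last coated pixel
            thickness.append(r_coat - r_core)
        t = np.asarray(thickness, dtype=float)
        return t.min(), t.max() - t.min(), t.mean()
    ```

    For real data the two masks would come from segmenting the CSLM image; the function and mask names here are invented for the sketch.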

  5. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

    The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  6. SMARTe Site Characterization Tool. In: SMARTe20ll, EPA/600/C-10/007

    EPA Science Inventory

    The purpose of the Site Characterization Tool is to: (1) develop a sample design for collecting site characterization data and (2) perform data analysis on uploaded data. The sample design part helps to determine how many samples should be collected to characterize a site with ...

  7. Phase transformations in steels: Processing, microstructure, and performance

    DOE PAGES

    Gibbs, Paul J.

    2014-04-03

    In this study, contemporary steel research is revealing new processing avenues to tailor microstructure and properties that, until recently, were only imaginable. Much of the technological versatility facilitating this development is provided by the understanding and utilization of the complex phase transformation sequences available in ferrous alloys. Today we have the opportunity to explore the diverse phenomena displayed by steels with specialized analytical and experimental tools. Advances in multi-scale characterization techniques provide a fresh perspective into microstructural relationships at the macro- and micro-scale, enabling a fundamental understanding of the role of phase transformations during processing and subsequent deformation.

  8. Statistical Techniques for Signal Processing

    DTIC Science & Technology

    1993-01-12

    functions and extended influence functions of the associated underlying estimators. An interesting application of the influence function and its...and related filter structures. While the influence function is best known for its role in characterizing the robustness of estimators, the mathematical...statistics can be designed and analyzed for performance using the influence function as a tool. In particular, we have examined the mean-median

  9. Two-Dimensional NMR Evidence for Cleavage of Lignin and Xylan Substituents in Wheat Straw Through Hydrothermal Pretreatment and Enzymatic Hydrolysis

    Treesearch

    Daniel J. Yelle; Prasad Kaparaju; Christopher G. Hunt; Kolby Hirth; Hoon Kim; John Ralph; Claus Felby

    2012-01-01

    Solution-state two-dimensional (2D) nuclear magnetic resonance (NMR) spectroscopy of plant cell walls is a powerful tool for characterizing changes in cell wall chemistry during the hydrothermal pretreatment process of wheat straw for second-generation bioethanol production. One-bond 13C-1H NMR correlation spectroscopy, via...

  10. A note on a simplified and general approach to simulating from multivariate copula functions

    Treesearch

    Barry K. Goodwin

    2013-01-01

    Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses ‘Probability-...
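
    The note's exact recipe is truncated above, but the general idea of simulating dependent data through a copula can be sketched for the Gaussian case (an illustrative choice, not necessarily the approach the paper describes): draw correlated normals, then apply the normal CDF as a probability integral transform to obtain dependent uniform marginals.

    ```python
    import numpy as np
    from math import erf, sqrt

    def gaussian_copula_sample(corr, n, seed=None):
        # Draw n rows of Uniform(0, 1) marginals whose dependence follows
        # a Gaussian copula with the given correlation matrix.
        rng = np.random.default_rng(seed)
        chol = np.linalg.cholesky(np.asarray(corr, dtype=float))
        z = rng.standard_normal((n, chol.shape[0])) @ chol.T
        # Probability integral transform: the standard normal CDF maps each
        # correlated normal variate to a dependent uniform variate.
        norm_cdf = np.vectorize(lambda v: 0.5 * (1.0 + erf(v / sqrt(2.0))))
        return norm_cdf(z)
    ```

    Mapping each uniform column through the inverse CDF of a target marginal (e.g. `-np.log(1 - u)` for a unit exponential) then yields dependent draws with arbitrary marginals.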

  11. Ultrasonic grinding of optical materials

    NASA Astrophysics Data System (ADS)

    Cahill, Michael; Bechtold, Michael; Fess, Edward; Stephan, Thomas; Bechtold, Rob

    2017-10-01

    Hard ceramic optical materials such as sapphire, ALON, spinel, PCA, or silicon carbide can present a significant challenge in manufacturing precision optical components due to their tough mechanical properties. These are the same mechanical properties that make them desirable materials for use in harsh environments. Slow processing speeds, premature tool wear, and poor surface quality are common consequences of the tough mechanical properties of these materials. As a preparatory stage for polishing, the finish of the ground surface greatly influences the polishing process and the resulting finished product. To overcome these challenges, OptiPro Systems has developed an ultrasonic-assisted grinding technology, OptiSonic, designed for the precision optics and ceramics industry. OptiSonic utilizes a custom tool holder designed to produce oscillations, microns in amplitude, in line with the rotating spindle. A software package, IntelliSonic, is integral to the function of this platform. IntelliSonic automatically characterizes tooling during setup to identify and select the ideal resonant peak at which to operate. Then, while grinding, IntelliSonic continuously adjusts the output frequency for optimal grinding efficiency while in contact with the part. This helps maintain a highly consistent process under changing load conditions for a more precise surface. Tests using a variety of instruments have shown a reduction in force between tool and part of up to 50%, along with increased surface quality and reduced tool wear. This paper will present the challenges associated with these materials and the solutions created to overcome them.

  12. Hydrochemistry and stable isotopes (δ18O and δ2H) tools applied to the study of karst aquifers in southern mediterranean basin (Teboursouk area, NW Tunisia)

    NASA Astrophysics Data System (ADS)

    Ayadi, Yosra; Mokadem, Naziha; Besser, Houda; Khelifi, Faten; Harabi, Samia; Hamad, Amor; Boyce, Adrian; Laouar, Rabah; Hamed, Younes

    2018-01-01

    Karst aquifers receive increasing attention in Mediterranean countries as they provide large supplies of water used for drinking and irrigation purposes as well as for electricity production. In the Teboursouk basin, northwestern Tunisia, characterized by a typical karst landscape, the water hosted in the carbonate aquifers provides a large part of the water supply for drinking and agricultural purposes. Groundwater circulation in karst aquifers is characterized by short residence times and low water-rock interaction caused by the high degree of karstification in the study area. Ion exchange processes, rock dissolution and rainfall infiltration are the principal factors in water mineralization and the spatial distribution of groundwater chemistry. The present work studies karstic groundwater in the Teboursouk region using hydrochemistry and stable isotope (δ18O and δ2H) tools. The karst aquifers have good water quality, with low salinity levels expressed by TDS values well below 1.5 g/l, and a Ca-SO4-Cl water type prevailing in the study area. The aquifers have been recharged by rainfall originating from a mixture of Atlantic and Mediterranean vapor masses.

  13. In situ ultrahigh-resolution optical coherence tomography characterization of eye bank corneal tissue processed for lamellar keratoplasty.

    PubMed

    Brown, Jamin S; Wang, Danling; Li, Xiaoli; Baluyot, Florence; Iliakis, Bernie; Lindquist, Thomas D; Shirakawa, Rika; Shen, Tueng T; Li, Xingde

    2008-08-01

    To use optical coherence tomography (OCT) as a noninvasive tool to perform in situ characterization of eye bank corneal tissue processed for lamellar keratoplasty. A custom-built ultrahigh-resolution OCT (UHR-OCT) was used to characterize donor corneal tissue that had been processed for lamellar keratoplasty. Twenty-seven donor corneas were analyzed. Four donor corneas were used as controls, whereas the rest were processed into donor corneal buttons for lamellar transplantation by using hand dissection, a microkeratome, or a femtosecond laser. UHR-OCT was also used to noninvasively characterize and monitor the viable corneal tissue immersed in storage medium over 3 weeks. The UHR-OCT captured high-resolution images of the donor corneal tissue in situ. This noninvasive technique showed the changes in donor corneal tissue morphology with time while in storage medium. The characteristics of the lamellar corneal tissue with each processing modality were clearly visible by UHR-OCT. The in situ characterization of the femtosecond laser-cut corneal tissue was noted to have more interface debris than shown by routine histology. The effects of the femtosecond laser microcavitation bubbles on the corneal tissue were well visualized at the edges of the lamellar flap while in storage medium. The results of our feasibility study show that UHR-OCT can provide superb, in situ microstructural characterization of eye bank corneal tissue noninvasively. The UHR-OCT interface findings and corneal endothelial disc thickness uniformity analysis are valuable information that may be used to optimize the modalities and parameters for lamellar tissue processing. The UHR-OCT is a powerful approach that will allow us to further evaluate the tissue response to different processing techniques for posterior lamellar keratoplasty. It may also provide information that can be used to correlate with postoperative clinical outcomes. 
UHR-OCT has the potential to become a routine part of tissue analysis for any eye bank or centers creating customized lamellar corneal tissue for transplantation.

  14. Categorization for Faces and Tools—Two Classes of Objects Shaped by Different Experience—Differs in Processing Timing, Brain Areas Involved, and Repetition Effects

    PubMed Central

    Kozunov, Vladimir; Nikolaeva, Anastasia; Stroganova, Tatiana A.

    2018-01-01

    The brain mechanisms that integrate the separate features of sensory input into a meaningful percept depend upon the prior experience of interaction with the object and differ between categories of objects. Recent studies using representational similarity analysis (RSA) have characterized either the spatial patterns of brain activity for different categories of objects or described how category structure in neuronal representations emerges in time, but never simultaneously. Here we applied a novel, region-based, multivariate pattern classification approach in combination with RSA to magnetoencephalography data to extract activity associated with qualitatively distinct processing stages of visual perception. We asked participants to name what they see whilst viewing bitonal visual stimuli of two categories predominantly shaped by either value-dependent or sensorimotor experience, namely faces and tools, and meaningless images. We aimed to disambiguate the spatiotemporal patterns of brain activity between the meaningful categories and determine which differences in their processing were attributable to either perceptual categorization per se, or later-stage mentalizing-related processes. We have extracted three stages of cortical activity corresponding to low-level processing, category-specific feature binding, and supra-categorical processing. All face-specific spatiotemporal patterns were associated with bilateral activation of ventral occipito-temporal areas during the feature binding stage at 140–170 ms. The tool-specific activity was found both within the categorization stage and in a later period not thought to be associated with binding processes. The tool-specific binding-related activity was detected within a 210–220 ms window and was located to the intraparietal sulcus of the left hemisphere. 
Brain activity common for both meaningful categories started at 250 ms and included widely distributed assemblies within parietal, temporal, and prefrontal regions. Furthermore, we hypothesized and tested whether activity within face and tool-specific binding-related patterns would demonstrate oppositely acting effects following procedural perceptual learning. We found that activity in the ventral, face-specific network increased following the stimuli repetition. In contrast, tool processing in the dorsal network adapted by reducing its activity over the repetition period. Altogether, we have demonstrated that activity associated with visual processing of faces and tools during the categorization stage differ in processing timing, brain areas involved, and in their dynamics underlying stimuli learning. PMID:29379426

  15. Software project management tools in global software development: a systematic mapping study.

    PubMed

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the extracted data and present the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  16. Perturbation Experiments: Approaches for Metabolic Pathway Analysis in Bioreactors.

    PubMed

    Weiner, Michael; Tröndle, Julia; Albermann, Christoph; Sprenger, Georg A; Weuster-Botz, Dirk

    2016-01-01

    In recent decades, targeted metabolic engineering of microbial cells has become one of the major tools in bioprocess design and optimization. For successful application, detailed knowledge is necessary about the relevant metabolic pathways and their regulation inside the cells. Since in vitro experiments cannot properly reproduce process conditions and behavior, process data about the cells' metabolic state have to be collected in vivo. For this purpose, special techniques and methods are necessary. Most techniques enabling in vivo characterization of metabolic pathways rely on perturbation experiments, which can be divided into dynamic and steady-state approaches. To avoid any process disturbance, approaches that enable perturbation of cell metabolism in parallel to the continuing production process are preferable. Furthermore, the fast dynamics of microbial production processes amplifies the need for parallelized data generation. These points motivate the development of a parallelized approach for multiple metabolic perturbation experiments outside the operating production reactor. An appropriate approach for in vivo characterization of metabolic pathways is presented and applied exemplarily to a microbial L-phenylalanine production process at 15 L scale.

  17. MATLAB tools for improved characterization and quantification of volcanic incandescence in Webcam imagery; applications at Kilauea Volcano, Hawai'i

    USGS Publications Warehouse

    Patrick, Matthew R.; Kauahikaua, James P.; Antolik, Loren

    2010-01-01

    Webcams are now standard tools for volcano monitoring and are used at observatories in Alaska, the Cascades, Kamchatka, Hawai'i, Italy, and Japan, among other locations. Webcam images allow invaluable documentation of activity and provide a powerful comparative tool for interpreting other monitoring data streams, such as seismicity and deformation. Automated image processing can improve the time efficiency and rigor of Webcam image interpretation, and potentially extract more information on eruptive activity. For instance, Lovick and others (2008) provided a suite of processing tools that performed such tasks as noise reduction, eliminating uninteresting images from an image collection, and detecting incandescence, with an application to dome activity at Mount St. Helens during 2007. In this paper, we present two very simple automated approaches for improved characterization and quantification of volcanic incandescence in Webcam images at Kilauea Volcano, Hawai`i. The techniques are implemented in MATLAB (version 2009b, Copyright: The Mathworks, Inc.) to take advantage of the ease of matrix operations. Incandescence is a useful indicator of the location and extent of active lava flows and also a potentially powerful proxy for activity levels at open vents. We apply our techniques to a period covering both summit and east rift zone activity at Kilauea during 2008-2009 and compare the results to complementary datasets (seismicity, tilt) to demonstrate their integrative potential. A great strength of this study is the demonstrated success of these tools in an operational setting at the Hawaiian Volcano Observatory (HVO) over the course of more than a year. Although applied only to Webcam images here, the techniques could be applied to any type of sequential images, such as time-lapse photography.
We expect that these tools are applicable to many other volcano monitoring scenarios, and the two MATLAB scripts, as they are implemented at HVO, are included in the appendixes. These scripts would require minor to moderate modifications for use elsewhere, primarily to customize directory navigation. If the user has some familiarity with MATLAB, or programming in general, these modifications should be easy. Although we originally anticipated needing the Image Processing Toolbox, the scripts in the appendixes do not require it; thus, only the base installation of MATLAB is needed. Because fairly basic MATLAB functions are used, we expect that the scripts can be run successfully with versions earlier than 2009b.
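
    The two HVO scripts themselves live in the report's appendixes; as a language-neutral illustration of the core idea (here in Python/NumPy rather than MATLAB, with an assumed 8-bit brightness cutoff), quantifying incandescence can be as simple as thresholding bright pixels in a grayscale frame and tracking their count and centroid over time:

    ```python
    import numpy as np

    def incandescence_metric(frame, threshold=200):
        # Pixels at or above `threshold` are counted as incandescent; the
        # count is a proxy for the extent of glowing lava, and the centroid
        # locates the activity within the frame.
        bright = np.asarray(frame) >= threshold
        n = int(bright.sum())
        if n == 0:
            return 0, None
        cy, cx = np.argwhere(bright).mean(axis=0)
        return n, (float(cy), float(cx))
    ```

    Applied frame by frame to a time-lapse sequence, the pixel count yields a time series that can be compared against seismicity or tilt, as in the study above.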

  18. Deposition and micro electrical discharge machining of CVD-diamond layers incorporated with silicon

    NASA Astrophysics Data System (ADS)

    Kühn, R.; Berger, T.; Prieske, M.; Börner, R.; Hackert-Oschätzchen, M.; Zeidler, H.; Schubert, A.

    2017-10-01

    In metal forming, lubricants have to be used to prevent corrosion or to reduce friction and tool wear. From an economic and ecological point of view, the aim is to avoid the use of lubricants. For dry deep drawing of aluminum sheets, it is intended to apply locally micro-structured, wear-resistant carbon-based coatings onto steel tools. One type of these coatings is diamond layers prepared by chemical vapor deposition (CVD). Due to the high strength of diamond, milling processes are unsuitable for micro-structuring these layers. In contrast, micro electrical discharge machining (micro EDM) is a suitable process for micro-structuring CVD-diamond layers. Due to its non-contact nature and its process principle of ablating material by melting and evaporation, it is independent of the hardness, brittleness or toughness of the workpiece material. In this study, the deposition and micro electrical discharge machining of silicon-incorporated CVD-diamond (Si-CVD-diamond) layers are presented. For this, 10 µm thick layers were deposited on molybdenum plates by a laser-induced plasma CVD process (LaPlas-CVD). For the characterization of the coatings, Raman and EDX analyses were conducted. EDM experiments were carried out with a tungsten carbide tool electrode with a diameter of 90 µm to investigate the micro-structuring of Si-CVD-diamond. The impact of voltage, discharge energy and tool polarity on process speed and resulting erosion geometry was analyzed. The results show that micro EDM is a suitable technology for micro-structuring silicon-incorporated CVD-diamond layers.

  19. Single droplet drying step characterization in microsphere preparation.

    PubMed

    Al Zaitone, Belal; Lamprecht, Alf

    2013-05-01

    Spray drying processes are difficult to characterize since process parameters are not directly accessible. Acoustic levitation was used to investigate microencapsulation by spray drying on a single droplet, facilitating the analysis of droplet behavior upon drying. Process parameters were simulated on a poly(lactide-co-glycolide)/ethyl acetate combination for microencapsulation. The results allowed quantifying the influence of process parameters such as temperature (0-40°C), polymer concentration (5-400 mg/ml), and droplet size (0.5-1.37 μl) on the drying time and drying kinetics as well as the particle morphology. The drying of polymer solutions at a temperature of 21°C and a concentration of 5 mg/ml shows that the dimensionless particle diameter (Dp/D0) approaches 0.25 and the particle needs 350 s to dry. At 400 mg/ml, Dp/D0=0.8, the drying time increases by an order of magnitude, and a hollow particle is formed. The study demonstrates the benefit of using the acoustic levitator as a lab-scale method to characterize and study microparticle formation. This method can be considered a helpful tool to mimic the full-scale spray drying process by providing identical operational parameters such as air velocity, temperature, and variable droplet sizes.
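
    A back-of-envelope mass balance helps interpret those Dp/D0 values: if all dissolved polymer ended up in a fully dense sphere, conservation of solute mass gives (Dp/D0)³ = c0/ρp, so measured ratios above this bound indicate porous or hollow particles. A sketch, assuming a polymer density of about 1250 mg/ml (an assumed literature-range value for PLGA, not from the paper):

    ```python
    def dense_particle_ratio(c0_mg_per_ml, rho_p_mg_per_ml=1250.0):
        # Mass balance: solute mass in the initial droplet equals the mass
        # of a dense final particle, c0 * D0**3 = rho_p * Dp**3, giving the
        # cube-root law for the dimensionless final diameter Dp/D0.
        return (c0_mg_per_ml / rho_p_mg_per_ml) ** (1.0 / 3.0)
    ```

    For 5 mg/ml this lower bound is about 0.16 versus the observed 0.25, and for 400 mg/ml about 0.68 versus 0.8, consistent with the porous and hollow particles reported above.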

  20. Automatic Fault Characterization via Abnormality-Enhanced Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G; Laguna, I; de Supinski, B R

    Enterprise and high-performance computing systems are growing extremely large and complex, employing hundreds to hundreds of thousands of processors and software/hardware stacks built by many people across many organizations. As the growing scale of these machines increases the frequency of faults, system complexity makes these faults difficult to detect and to diagnose. Current system management techniques, which focus primarily on efficient data access and query mechanisms, require system administrators to examine the behavior of various system services manually. Growing system complexity is making this manual process unmanageable: administrators require more effective management tools that can detect faults and help to identify their root causes. System administrators need timely notification when a fault is manifested that includes the type of fault, the time period in which it occurred and the processor on which it originated. Statistical modeling approaches can accurately characterize system behavior. However, the complex effects of system faults make these tools difficult to apply effectively. This paper investigates the application of classification and clustering algorithms to fault detection and characterization. We show experimentally that naively applying these methods achieves poor accuracy. Further, we design novel techniques that combine classification algorithms with information on the abnormality of application behavior to improve detection and characterization accuracy. Our experiments demonstrate that these techniques can detect and characterize faults with 65% accuracy, compared to just 5% accuracy for naive approaches.
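
    The abstract does not spell out the paper's exact technique, but the general idea of combining classification with abnormality information can be sketched as a toy: score each monitored metric's abnormality as a z-score against a fault-free baseline, gate detection on it, and weight a nearest-centroid classifier by it. All names, thresholds, and the centroid scheme below are invented for illustration.

    ```python
    import numpy as np

    def classify_fault(sample, base_mean, base_std, centroids, z_gate=3.0):
        # Abnormality of each monitored metric as a z-score against the
        # fault-free baseline; below the gate, report normal operation.
        z = np.abs((sample - base_mean) / base_std)
        if z.max() < z_gate:
            return "normal"
        weights = z / z.sum()  # emphasize the most abnormal metrics
        # Nearest abnormality-weighted centroid among known fault signatures.
        dist = {name: float(np.sum(weights * (sample - c) ** 2))
                for name, c in centroids.items()}
        return min(dist, key=dist.get)
    ```

    Weighting the distance by abnormality mirrors the paper's premise: the metrics that deviate most from normal behavior carry the most diagnostic signal.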

  1. Prioritizing Seafloor Mapping for Washington’s Pacific Coast

    PubMed Central

    Battista, Timothy; Buja, Ken; Christensen, John; Hennessey, Jennifer; Lassiter, Katrina

    2017-01-01

    Remote sensing systems are critical tools used for characterizing the geological and ecological composition of the seafloor. However, creating comprehensive and detailed maps of ocean and coastal environments has been hindered by the high cost of operating ship- and aircraft-based sensors. While a number of groups (e.g., academic research, government resource management, and private sector) are engaged in or would benefit from the collection of additional seafloor mapping data, disparate priorities, dauntingly large data gaps, and insufficient funding have confounded strategic planning efforts. In this study, we addressed these challenges by implementing a quantitative, spatial process to facilitate prioritizing seafloor mapping needs in Washington State. The Washington State Prioritization Tool (WASP), a custom web-based mapping tool, was developed to solicit and analyze mapping priorities from each participating group. The process resulted in the identification of several discrete, high priority mapping hotspots. As a result, several of the areas have been or will be subsequently mapped. Furthermore, information captured during the process about the intended application of the mapping data was paramount for identifying the optimum remote sensing sensors and acquisition parameters to use during subsequent mapping surveys. PMID:28350338

  2. The Exoplanet Characterization ToolKit (ExoCTK)

    NASA Astrophysics Data System (ADS)

    Stevenson, Kevin; Fowler, Julia; Lewis, Nikole K.; Fraine, Jonathan; Pueyo, Laurent; Valenti, Jeff; Bruno, Giovanni; Filippazzo, Joseph; Hill, Matthew; Batalha, Natasha E.; Bushra, Rafia

    2018-01-01

    The success of exoplanet characterization depends critically on a patchwork of analysis tools and spectroscopic libraries that currently require extensive development and lack a centralized support system. Due to the complexity of spectroscopic analyses and initial time commitment required to become productive, there are currently a limited number of teams that are actively advancing the field. New teams with significant expertise, but without the proper tools, face prohibitively steep hills to climb before they can contribute. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface focused primarily on atmospheric characterization of exoplanets and exoplanet transit observation planning with JWST. The foundation of these software tools and libraries exist within pockets of the exoplanet community. Our project will gather these seedling tools and grow a robust, uniform, and well maintained exoplanet characterization toolkit.

  3. Optimization and Surface Modification of Al-6351 Alloy Using SiC-Cu Green Compact Electrode by Electro Discharge Coating Process

    NASA Astrophysics Data System (ADS)

    Chakraborty, Sujoy; Kar, Siddhartha; Dey, Vidyut; Ghosh, Subrata Kumar

    2017-06-01

    This paper introduces the surface modification of Al-6351 alloy by a green compact SiC-Cu electrode using the electro-discharge coating (EDC) process. A Taguchi L-16 orthogonal array is employed to investigate the process by varying tool parameters like composition and compaction load and electro-discharge machining (EDM) parameters like pulse-on time and peak current. Material deposition rate (MDR), tool wear rate (TWR) and surface roughness (SR) are measured on the coated specimens. An optimum condition is achieved by formulating overall evaluation criteria (OEC), which combine the multi-objective task into a single index. The signal-to-noise (S/N) ratio and the analysis of variance (ANOVA) are employed to investigate the effect of the relevant process parameters. A confirmation test is conducted based on the optimal process parameters, and experimental results are provided to illustrate the effectiveness of this approach. The modified surface is characterized by optical microscopy and X-ray diffraction (XRD) analysis. XRD analysis of the deposited layer confirmed the transfer of tool materials to the work surface and the formation of inter-metallic phases. The micro-hardness of the resulting composite layer is also measured and is 1.5-3 times that of the work material; a maximum layer thickness (LT) of 83.644 μm was successfully achieved.
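    The Taguchi signal-to-noise ratios used in studies of this kind have standard textbook forms; a generic sketch (these are the standard formulas, not the paper's specific data or results):

```python
import math

# Standard Taguchi S/N ratios (textbook definitions, illustrative only).
def sn_larger_is_better(values):
    """For responses to maximize, e.g. material deposition rate (MDR)."""
    return -10.0 * math.log10(sum(1.0 / v**2 for v in values) / len(values))

def sn_smaller_is_better(values):
    """For responses to minimize, e.g. tool wear rate (TWR) or roughness (SR)."""
    return -10.0 * math.log10(sum(v**2 for v in values) / len(values))
```

    In both cases a higher S/N ratio indicates a more favorable, less noise-sensitive parameter setting, which is what the orthogonal-array analysis optimizes over.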

  4. A population MRI brain template and analysis tools for the macaque.

    PubMed

    Seidlitz, Jakob; Sponheim, Caleb; Glen, Daniel; Ye, Frank Q; Saleem, Kadharbatcha S; Leopold, David A; Ungerleider, Leslie; Messinger, Adam

    2018-04-15

    The use of standard anatomical templates is common in human neuroimaging, as it facilitates data analysis and comparison across subjects and studies. For non-human primates, previous in vivo templates have lacked sufficient contrast to reliably validate known anatomical brain regions and have not provided tools for automated single-subject processing. Here we present the "National Institute of Mental Health Macaque Template", or NMT for short. The NMT is a high-resolution in vivo MRI template of the average macaque brain generated from 31 subjects, as well as a neuroimaging tool for improved data analysis and visualization. From the NMT volume, we generated maps of tissue segmentation and cortical thickness. Surface reconstructions and transformations to previously published digital brain atlases are also provided. We further provide an analysis pipeline using the NMT that automates and standardizes the time-consuming processes of brain extraction, tissue segmentation, and morphometric feature estimation for anatomical scans of individual subjects. The NMT and associated tools thus provide a common platform for precise single-subject data analysis and for characterizations of neuroimaging results across subjects and studies. Copyright © 2017 Elsevier. All rights reserved.

  5. Addressing and Presenting Quality of Satellite Data via Web-Based Services

    NASA Technical Reports Server (NTRS)

    Leptoukh, Gregory; Lynnes, C.; Ahmad, S.; Fox, P.; Zednik, S.; West, P.

    2011-01-01

    With the recent attention to climate change and the proliferation of remote-sensing data utilization, climate modeling and various environmental monitoring and protection applications have begun to rely increasingly on satellite measurements. Research application users seek good quality satellite data, with uncertainties and biases provided for each data point. However, different communities address remote sensing quality issues rather inconsistently and differently. We describe our attempt to systematically characterize, capture, and provision quality and uncertainty information as it applies to the NASA MODIS Aerosol Optical Depth data product. In particular, we note the semantic differences in quality/bias/uncertainty at the pixel, granule, product, and record levels, and outline the various factors contributing to the uncertainty or error budget. Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while relieving them of directly managing complex data processing operations. These tools provide value by streamlining the data analysis process, but usually shield users from details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how data has been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. We describe our experiences developing a semantic, provenance-aware, expert-knowledge advisory system applied to the NASA Giovanni web-based Earth science data analysis tool as part of the ESTO AIST-funded Multi-sensor Data Synergy Advisor project.

  6. Distinguishing dose, focus, and blur for lithography characterization and control

    NASA Astrophysics Data System (ADS)

    Ausschnitt, Christopher P.; Brunner, Timothy A.

    2007-03-01

    We derive a physical model to describe the dependence of pattern dimensions on dose, defocus and blur. The coefficients of our model are constants of a given lithographic process. Model inversion applied to dimensional measurements then determines effective dose, defocus and blur for wafers patterned with the same process. In practice, our approach entails the measurement of proximate grating targets of differing dose and focus sensitivity. In our embodiment, the measured attribute of one target is exclusively sensitive to dose, whereas the measured attributes of a second target are distinctly sensitive to defocus and blur. On step-and-scan exposure tools, z-blur is varied in a controlled manner by adjusting the across slit tilt of the image plane. The effects of z-blur and x,y-blur are shown to be equivalent. Furthermore, the exposure slit width is shown to determine the tilt response of the grating attributes. Thus, the response of the measured attributes can be characterized by a conventional focus-exposure matrix (FEM), over which the exposure tool settings are intentionally changed. The model coefficients are determined by a fit to the measured FEM response. The model then fully defines the response for wafers processed under "fixed" dose, focus and blur conditions. Model inversion applied to measurements from the same targets on all such wafers enables the simultaneous determination of effective dose and focus/tilt (DaFT) at each measurement site.
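    The fit-then-invert procedure described above can be illustrated with a hypothetical quadratic, Bossung-style response CD = c0 + c1·E + c2·F², fit from a synthetic focus-exposure matrix and then inverted for effective dose. The model form and all coefficient values below are assumptions for illustration, not the paper's calibrated model.

```python
import numpy as np

# Illustrative sketch: assume CD = c0 + c1*E + c2*F**2 (dose E, defocus F),
# fit the coefficients from a synthetic focus-exposure matrix (FEM),
# then invert the model for effective dose. Coefficients are made up.
doses = np.linspace(0.9, 1.1, 5)
foci = np.linspace(-0.15, 0.15, 7)
E, F = np.meshgrid(doses, foci)
c_true = (10.0, -4.0, -50.0)
CD = c_true[0] + c_true[1] * E + c_true[2] * F**2  # simulated FEM response

# Least-squares fit of the model coefficients to the FEM data.
A = np.column_stack([np.ones(E.size), E.ravel(), (F**2).ravel()])
coef, *_ = np.linalg.lstsq(A, CD.ravel(), rcond=None)

def effective_dose(cd, f_squared):
    """Model inversion: recover dose from a CD value and a defocus-sensitive term."""
    c0, c1, c2 = coef
    return (cd - c0 - c2 * f_squared) / c1
```

    The paper's approach additionally separates defocus from blur using a second grating target; the single-target inversion here only shows the basic mechanics.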

  7. Optical fiber loops and helices: tools for integrated photonic device characterization and microfluidic trapping

    NASA Astrophysics Data System (ADS)

    Ren, Yundong; Zhang, Rui; Ti, Chaoyang; Liu, Yuxiang

    2016-09-01

    Tapered optical fibers can deliver guided light into and carry light out of micro/nanoscale systems with low loss and high spatial resolution, which makes them ideal tools in integrated photonics and microfluidics. Special geometries of tapered fibers are desired for probing monolithic devices in plane as well as optical manipulation of micro particles in fluids. However, for many specially shaped tapered fibers, it remains a challenge to fabricate them in a straightforward, controllable, and repeatable way. In this work, we fabricated and characterized two special geometries of tapered optical fibers, namely fiber loops and helices, that could be switched between one and the other. The fiber loops in this work are distinct from previous ones in terms of their superior mechanical stability and high optical quality factors in air, thanks to a post-annealing process. We experimentally measured an intrinsic optical quality factor of 32,500 and a finesse of 137 from a fiber loop. A fiber helix was used to characterize a monolithic cavity optomechanical device. Moreover, a microfluidic "roller coaster" was demonstrated, where microscale particles in water were optically trapped and transported by a fiber helix. Tapered fiber loops and helices can find various applications ranging from on-the-fly characterization of integrated photonic devices to particle manipulation and sorting in microfluidics.
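    The reported quality factor and finesse follow from standard resonator relations (Q = λ/Δλ, finesse = free spectral range / linewidth); the sketch below uses illustrative numbers, not the measured values from the paper.

```python
# Standard resonator relations (illustrative values, not the measured data).
def quality_factor(wavelength_nm, linewidth_nm):
    """Q = lambda / delta-lambda for a resonance of given linewidth."""
    return wavelength_nm / linewidth_nm

def finesse(free_spectral_range_nm, linewidth_nm):
    """Finesse = free spectral range / linewidth."""
    return free_spectral_range_nm / linewidth_nm

q = quality_factor(1550.0, 0.05)   # a 0.05 nm-wide resonance at 1550 nm
f = finesse(5.0, 0.05)             # with a 5 nm free spectral range
```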

  8. Micro-balance sensor integrated with atomic layer deposition chamber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinson, Alex B. F.; Libera, Joseph A.; Elam, Jeffrey W.

    The invention is directed to QCM measurements for monitoring ALD processes. Previously, significant barriers remained to the accurate, routine execution of in situ QCM measurements during ALD. To turn this exclusively dedicated in situ technique into a routine characterization method, an integral QCM fixture was developed. This new design is easily implemented on a variety of ALD tools, allows rapid sample exchange, prevents backside deposition, and minimizes both the footprint and flow disturbance. Unlike previous QCM designs, the fast thermal equilibration enables tasks such as temperature-dependent studies and ex situ sample exchange, further highlighting the feasibility of this QCM design for day-to-day use. Finally, the in situ mapping of thin film growth rates across the ALD reactor was demonstrated in a popular commercial tool operating in both continuous and quasi-static ALD modes.

  9. Optimization of droplets for UV-NIL using coarse-grain simulation of resist flow

    NASA Astrophysics Data System (ADS)

    Sirotkin, Vadim; Svintsov, Alexander; Zaitsev, Sergey

    2009-03-01

    A mathematical model and numerical method are described which make it possible to simulate the ultraviolet ("step and flash") nanoimprint lithography (UV-NIL) process adequately, even on standard personal computers. The model is derived from the 3D Navier-Stokes equations, exploiting the fact that the resist motion is largely directed along the substrate surface and characterized by ultra-low values of the Reynolds number. For the numerical approximation of the model, a special finite-difference (coarse-grain) method is applied. A coarse-grain modeling tool for detailed analysis of resist spreading in UV-NIL at the structure-scale level is tested. The obtained results demonstrate the ability of the tool to calculate the optimal dispensing for a given stamp design and process parameters. This dispensing provides uniformly filled areas and a homogeneous residual layer thickness in UV-NIL.
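    The ultra-low Reynolds number regime invoked above is easy to verify with order-of-magnitude values; the figures below are typical UV-NIL scales chosen for illustration, not the paper's parameters.

```python
# Order-of-magnitude check of the Stokes-flow assumption (illustrative).
def reynolds_number(density, velocity, length, viscosity):
    """Re = rho * v * L / mu (SI units)."""
    return density * velocity * length / viscosity

# Resist squeezed under a stamp: ~100 nm film, mm/s-scale flow, ~10 mPa*s.
re = reynolds_number(density=1.0e3, velocity=1.0e-3,
                     length=100e-9, viscosity=1.0e-2)
```

    A value this far below unity justifies dropping the inertial terms of the Navier-Stokes equations, which is what makes a coarse-grain, lubrication-type treatment of resist spreading tractable.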

  10. Overlay Tolerances For VLSI Using Wafer Steppers

    NASA Astrophysics Data System (ADS)

    Levinson, Harry J.; Rice, Rory

    1988-01-01

    In order for VLSI circuits to function properly, the masking layers used in the fabrication of those devices must overlay each other to within the manufacturing tolerance incorporated in the circuit design. The capabilities of the alignment tools used in the masking process determine the overlay tolerances to which circuits can be designed. It is therefore of considerable importance that these capabilities be well characterized. Underestimation of the overlay accuracy results in unnecessarily large devices, resulting in poor utilization of wafer area and possible degradation of device performance. Overestimation will result in significant yield loss because of the failure to conform to the tolerances of the design rules. The proper methodology for determining the overlay capabilities of wafer steppers, the most commonly used alignment tool for the production of VLSI circuits, is the subject of this paper. Because cost-effective manufacturing process technology has been the driving force of VLSI, the impact on productivity is a primary consideration in all discussions. Manufacturers of alignment tools advertise the capabilities of their equipment. It is notable that no manufacturer currently characterizes his aligners in a manner consistent with the requirements of producing very large integrated circuits, as will be discussed. This has resulted in the situation in which the evaluation and comparison of the capabilities of alignment tools require the attention of a lithography specialist. Unfortunately, lithographic capabilities must be known by many other people, particularly the circuit designers and the managers responsible for the financial consequences of the high prices of modern alignment tools. All too frequently, the designer or manager is confronted with contradictory data, one set coming from his lithography specialist, and the other coming from a sales representative of an equipment manufacturer. Since the latter generally attempts to make his merchandise appear as attractive as possible, the lithographer is frequently placed in the position of having to explain subtle issues in order to justify his decisions. It is the purpose of this paper to provide that explanation.

  11. Physician perceptions of primary prevention: qualitative base for the conceptual shaping of a practice intervention tool

    PubMed Central

    Mirand, Amy L; Beehler, Gregory P; Kuo, Christina L; Mahoney, Martin C

    2002-01-01

    Background A practice intervention must have its basis in an understanding of the physician and practice to secure its benefit and relevancy. We used a formative process to characterize primary care physician attitudes, needs, and practice obstacles regarding primary prevention. The characterization will provide the conceptual framework for the development of a practice tool to facilitate routine delivery of primary preventive care. Methods A focus group of primary care physician Opinion Leaders was audio-taped, transcribed, and qualitatively analyzed to identify emergent themes that described physicians' perceptions of prevention in daily practice. Results The conceptual worth of primary prevention, including behavioral counseling, was high, but its practice was significantly countered by the predominant clinical emphasis on and rewards for secondary care. In addition, lack of health behavior training, perceived low self-efficacy, and patient resistance to change were key deterrents to primary prevention delivery. Also, the preventive focus in primary care is not on cancer, but on predominant chronic nonmalignant conditions. Conclusions The success of the future practice tool will be largely dependent on its ability to "fit" primary prevention into the clinical culture of diagnoses and treatment sustained by physicians, patients, and payers. The tool's message output must be formatted to facilitate physician delivery of patient-tailored behavioral counseling in an accurate, confident, and efficacious manner. Also, the tool's health behavior messages should be behavior-specific, not disease-specific, to draw on shared risk behaviors of numerous diseases and increase the likelihood of perceived salience and utility of the tool in primary care. PMID:12204096

  12. Extrinsic Fluorescent Dyes as Tools for Protein Characterization

    PubMed Central

    Hawe, Andrea; Sutter, Marc

    2008-01-01

    Noncovalent, extrinsic fluorescent dyes are applied in various fields of protein analysis, e.g. to characterize folding intermediates, measure surface hydrophobicity, and detect aggregation or fibrillation. The main underlying mechanisms, which explain the fluorescence properties of many extrinsic dyes, are solvent relaxation processes and (twisted) intramolecular charge transfer reactions, which are affected by the environment and by interactions of the dyes with proteins. In recent time, the use of extrinsic fluorescent dyes such as ANS, Bis-ANS, Nile Red, Thioflavin T and others has increased, because of their versatility, sensitivity and suitability for high-throughput screening. The intention of this review is to give an overview of available extrinsic dyes, explain their spectral properties, and show illustrative examples of their various applications in protein characterization. PMID:18172579

  13. Frequent Pitfalls in the Characterization of Electrodes Designed for Electrochemical Energy Conversion and Storage.

    PubMed

    Zeradjanin, Aleksandar R

    2018-04-25

    Focus on the importance of energy conversion and storage boosted research interest in various electrocatalytic materials. Characterization of solid-liquid interfaces during faradaic and non-faradaic processes is routinely conducted in many laboratories worldwide on a daily basis. This can be deemed as a very positive tendency. However, careful insight into modern literature suggests frequent misuse of electroanalytical tools. This can have very negative implications and postpone overall development of electrocatalytic materials with the desired properties. This work points out some of the frequent pitfalls in electrochemical characterization, suggests potential solutions, and above all encourages comprehensive analysis and in-depth thinking about electrochemical phenomena. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. The coming of age of the first hybrid metrology software platform dedicated to nanotechnologies (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian

    2017-03-01

    The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control mostly physico-chemical properties as a function of applications. Among all properties, we can list physical properties such as: size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure), solubility and chemical properties such as: structural formula/molecular structure, composition (including degree of purity, known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential), hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration…) and the industrial application (semiconductor, cosmetics, chemistry, automotive…), a fleet of complementary characterization equipment must be used in synergy for accurate process tuning and high production yield. This synergy between equipment, so-called hybrid metrology, consists in using the strength of each technique in order to reduce the global uncertainty for better and faster process control. The only way to succeed in this exercise is to use data fusion methodology. In this paper, we will introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnologies process control. The first part will be dedicated to process flow modeling related to a fleet of metrology tools. The second part will introduce the concept of an entity model, which describes the various parameters that have to be extracted. The entity model is fed with data analysis as a function of the application (automatic analysis or semi-automated analysis). The final part will introduce two ways of doing data fusion on real data coming from imaging (SEM, TEM, AFM) and non-imaging techniques (SAXS). The first approach is dedicated to high level fusion, which is the art of combining various populations of results from homogeneous or heterogeneous tools, taking into account the precision and repeatability of each of them to obtain a new, more accurate result. The second approach is dedicated to deep level fusion, which is the art of combining raw data from various tools in order to create a new raw dataset. We will introduce a new concept of a virtual tool creator based on deep level fusion. As a conclusion, we will discuss the implementation of hybrid metrology in a semiconductor environment for advanced process control.
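    One common way to realize "high level fusion" as characterized above is inverse-variance weighting of per-tool estimates. This is a standard statistical technique shown as an illustrative sketch; the abstract does not specify the platform's actual fusion algorithm, and the example values are hypothetical.

```python
# Inverse-variance weighted fusion of per-tool measurements (standard
# statistics, used here to illustrate the high-level-fusion idea).
def fuse(estimates):
    """estimates: list of (value, variance) pairs from different tools."""
    weights = [1.0 / var for _, var in estimates]
    value = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    variance = 1.0 / sum(weights)
    return value, variance

# e.g., a dimension measured by SEM, AFM and SAXS with different precisions:
fused_value, fused_var = fuse([(32.0, 0.4), (31.5, 0.9), (32.4, 0.2)])
```

    The fused variance is always smaller than the best individual tool's variance, which is the sense in which combining tools yields a "more accurate result".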

  15. Characterization of cytochrome c as marker for retinal cell degeneration by uv/vis spectroscopic imaging

    NASA Astrophysics Data System (ADS)

    Hollmach, Julia; Schweizer, Julia; Steiner, Gerald; Knels, Lilla; Funk, Richard H. W.; Thalheim, Silko; Koch, Edmund

    2011-07-01

    Retinal diseases like age-related macular degeneration have become an important cause of visual loss, owing to increasing life expectancy and lifestyle habits. Because no satisfactory treatment exists, early diagnosis and prevention are the only possibilities to stop the degeneration. The protein cytochrome c (cyt c) is a suitable marker for degeneration processes and apoptosis because it is part of the respiratory chain and involved in the apoptotic pathway. The determination of the local distribution and oxidative state of cyt c in living cells allows the characterization of cell degeneration processes. Since cyt c exhibits characteristic absorption bands between 400 and 650 nm wavelength, uv/vis in situ spectroscopic imaging was used for its characterization in retinal ganglion cells. The large amount of data, consisting of spatial and spectral information, was processed by multivariate data analysis. The challenge consists in the identification of the molecular information of cyt c. Baseline correction, principal component analysis (PCA) and cluster analysis (CA) were performed in order to identify cyt c within the spectral dataset. The combination of PCA and CA reveals cyt c and its oxidative state. The results demonstrate that uv/vis spectroscopic imaging in conjunction with sophisticated multivariate methods is a suitable tool to characterize cyt c under in situ conditions.
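    The PCA-plus-clustering pipeline can be sketched with a numpy-only toy example: synthetic "spectra" with and without an absorption band are separated by their first principal-component scores. The spectra, band position and crude two-group split below are fabrications for illustration, not the study's data or its cluster-analysis algorithm.

```python
import numpy as np

# Toy sketch of PCA + clustering on spectral data (synthetic, illustrative).
rng = np.random.default_rng(2)
wavelengths = np.linspace(400, 650, 50)

# Two synthetic spectral classes differing by a Gaussian absorption band.
band = np.exp(-((wavelengths - 550.0) / 15.0) ** 2)
group_a = rng.normal(0, 0.02, (20, 50)) + band   # band present
group_b = rng.normal(0, 0.02, (20, 50))          # featureless
spectra = np.vstack([group_a, group_b])

# PCA via SVD on mean-centered spectra.
centered = spectra - spectra.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[0]          # first principal-component scores

# Crude two-group "cluster" split on the PC1 score.
labels = scores > scores.mean()
```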

  16. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    PubMed

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority of the waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
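    The random-vector idea above amounts to Monte Carlo propagation of input uncertainties through the activity calculation. A heavily simplified sketch, in which activity is modeled as the product of three uncertain inputs; the model form, distributions and values are all hypothetical, not CERN's characterization method.

```python
import numpy as np

# Hypothetical Monte Carlo uncertainty propagation: activity A = m * c * f,
# where mass m, trace-element concentration c and an activation factor f
# each carry uncertainty. All distributions/values are illustrative.
rng = np.random.default_rng(3)
n = 100_000
m = rng.normal(100.0, 2.0, n)              # mass [kg]
c = rng.lognormal(np.log(1e-6), 0.3, n)    # trace concentration (log-normal)
f = rng.normal(5.0e4, 1.0e3, n)            # activation factor [Bq/kg per unit c]

activity = m * c * f                       # sampled activity [Bq]
mean, std = activity.mean(), activity.std()
lo, hi = np.percentile(activity, [2.5, 97.5])  # empirical 95% interval
```

    A log-normal for the trace concentration mirrors the common situation where trace-element data span orders of magnitude or cannot be measured directly.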

  17. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    PubMed

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the input of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence that a particular sequence of images contained a fence crossing event. This resulted in a 54.8% reduction in the images requiring further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
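    The two ingredients named above, background subtraction and a histogram-style rule on the changed pixels, can be sketched on a toy frame. The frame contents, threshold values and the simple fraction-of-pixels rule are invented for illustration and stand in for the program's actual rules.

```python
import numpy as np

# Toy sketch: background subtraction plus a histogram-style rule
# (illustrative thresholds; not the published program's parameters).
background = np.zeros((40, 40))
frame = background.copy()
frame[10:25, 10:25] = 0.8              # a bright "animal-sized" blob

diff = np.abs(frame - background)
changed = diff > 0.2                   # per-pixel change mask

def likely_event(mask, min_frac=0.05):
    """Flag the frame if enough of the image changed vs. background."""
    return bool(mask.mean() > min_frac)

flag = likely_event(changed)
```

    In practice such flags would be aggregated over an image sequence to assign the confidence categories described in the abstract.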

  18. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted high interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have proven to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that underlie their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  19. Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)

    NASA Astrophysics Data System (ADS)

    Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia

    2018-06-01

    Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and lack a single source or a unified style and methodology. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive to new researchers entering the field, and remain an obstacle for established groups hoping to contribute in a manner comparable to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planet, and opacity models. The foundation of these software tools and libraries exists within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.

  20. An Excel Macro to Plot the HFE-Diagram to Identify Sea Water Intrusion Phases.

    PubMed

    Giménez-Forcada, Elena; Sánchez San Román, F Javier

    2015-01-01

    A hydrochemical facies evolution diagram (HFE-D) is a multirectangular diagram, which is a useful tool in the interpretation of sea water intrusion processes. This method note describes a simple method for generating an HFE-D plot using the spreadsheet software package, Microsoft Excel. The code was applied to groundwater from the alluvial coastal plain of Grosseto (Tuscany, Italy), which is characterized by a complex salinization process in which sea water mixes with sulfate or bicarbonate recharge water. © 2014, National GroundWater Association.
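    The facies-classification idea behind an HFE-diagram rests on converting concentrations to milliequivalents and comparing cation and anion percentages. The sketch below uses standard equivalent weights, but the crude two-way classification is an illustration of the principle, not the published diagram or macro.

```python
# Toy facies classification from mg/L concentrations (illustrative rule,
# not the HFE-D method; equivalent weights are standard chemistry).
EQ_WEIGHT = {"Na": 23.0, "Ca": 20.0, "Cl": 35.45, "HCO3": 61.0}  # mg per meq

def to_meq(mg_per_l, ion):
    return mg_per_l / EQ_WEIGHT[ion]

def facies(na, ca, cl, hco3):
    """Return a crude facies label from mg/L concentrations."""
    na_pct = to_meq(na, "Na") / (to_meq(na, "Na") + to_meq(ca, "Ca")) * 100
    cl_pct = to_meq(cl, "Cl") / (to_meq(cl, "Cl") + to_meq(hco3, "HCO3")) * 100
    if na_pct > 50 and cl_pct > 50:
        return "Na-Cl (sea water influenced)"
    if na_pct <= 50 and cl_pct <= 50:
        return "Ca-HCO3 (recharge water)"
    return "mixed/intermediate"
```

    The full HFE-D tracks the evolution between these end-member facies across a multirectangular grid, which is what reveals intrusion and freshening phases.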

  1. Operation of a sampling train for the analysis of environmental species in coal gasification gas-phase process streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pochan, M.J.; Massey, M.J.

    1979-02-01

    This report discusses the results of actual raw product gas sampling efforts and includes: Rationale for raw product gas sampling efforts; design and operation of the CMU gas sampling train; development and analysis of a sampling train data base; and conclusions and future application of results. The results of sampling activities at the CO/sub 2/-Acceptor and Hygas pilot plants proved that: The CMU gas sampling train is a valid instrument for characterization of environmental parameters in coal gasification gas-phase process streams; depending on the particular process configuration, the CMU gas sampling train can reduce gasifier effluent characterization activity to a single location in the raw product gas line; and in contrast to the slower operation of the EPA SASS Train, CMU's gas sampling train can collect representative effluent data at a rapid rate (approx. 2 points per hour) consistent with the rate of change of process variables, and thus function as a tool for process engineering-oriented analysis of environmental characteristics.

  2. Lifting the veil: a typological survey of the methodological features of Islamic ethical reasoning on biomedical issues.

    PubMed

    Abdur-Rashid, Khalil; Furber, Steven Woodward; Abdul-Basser, Taha

    2013-04-01

    We survey the meta-ethical tools and institutional processes that traditional Islamic ethicists apply when deliberating on bioethical issues. We present a typology of these methodological elements, giving particular attention to the meta-ethical techniques and devices that traditional Islamic ethicists employ in the absence of decisive or univocal authoritative texts or in the absence of established transmitted cases. In describing how traditional Islamic ethicists work, we demonstrate that these experts possess a variety of discursive tools. We find that the ethical responsa-i.e., the products of the application of the tools that we describe-are generally characterized by internal consistency. We also conclude that Islamic ethical reasoning on bioethical issues, while clearly scripture-based, is also characterized by strong consequentialist elements and possesses clear principles-based characteristics. The paper contributes to the study of bioethics by familiarizing non-specialists in Islamic ethics with the role, scope, and applicability of key Islamic ethical concepts, such as "aims" (maqāṣid), "universals" (kulliyyāt), "interest" (maṣlaḥa), "maxims" (qawā`id), "controls" (ḍawābit), "differentiators" (furūq), "preponderization" (tarjīḥ), and "extension" (tafrī`).

  3. Internet MEMS design tools based on component technology

    NASA Astrophysics Data System (ADS)

    Brueck, Rainer; Schumer, Christian

    1999-03-01

    The micro electromechanical systems (MEMS) industry in Europe is characterized by small and medium-sized enterprises specialized in products that solve problems in specific domains like medicine, automotive sensor technology, etc. In this field of business the technology-driven design approach known from microelectronics is not appropriate. Instead, each design problem calls for its own specific technology for its solution. The variety of technologies at hand, like Si-surface, Si-bulk, LIGA, laser, and precision engineering, requires a huge set of different design tools to be available. No single SME can afford to hold licenses for all these tools. This calls for a new and flexible way of designing, implementing, and distributing design software. The Internet provides a flexible manner of offering software access along with methodologies of flexible licensing, e.g. on a pay-per-use basis. New communication technologies like ADSL, TV cable, or satellites as carriers promise to offer a bandwidth sufficient even for interactive tools with graphical interfaces in the near future. INTERLIDO is an experimental tool suite for process specification and layout verification for lithography-based MEMS technologies to be accessed via the Internet. The first version provides a Java implementation, including a graphical editor for process specification. Currently, a new version is being brought into operation that is based on JavaBeans component technology. JavaBeans offers the possibility to realize independent interactive design assistants, like a design rule checking assistant, a process consistency checking assistant, a technology definition assistant, a graphical editor assistant, etc., that may reside distributed over the Internet, communicating via Internet protocols. Each potential user is thus able to configure a dedicated design tool set tailored to the requirements of the current problem to be solved.

  4. Microstructural Characterization of Friction Stir Welded Aluminum-Steel Joints

    NASA Astrophysics Data System (ADS)

    Patterson, Erin E.; Hovanski, Yuri; Field, David P.

    2016-06-01

    This work focuses on the microstructural characterization of aluminum-to-steel friction stir welded joints. A lap weld configuration coupled with scribe technology for the weld tool has produced joints of adequate quality, despite the significant differences in hardness and melting temperature between the alloys. Common to friction stir processes, especially those joining dissimilar alloys, are microstructural gradients in grain size, crystallographic texture, and precipitation of intermetallic compounds. Because intermetallic compound formation has a significant influence on mechanical and ballistic behavior, characterizing the specific intermetallic phases and the degree to which they form in the weld microstructure is critical to predicting weld performance. This study used electron backscatter diffraction, energy dispersive spectroscopy, scanning electron microscopy, and Vickers micro-hardness indentation to explore and characterize the microstructures of lap friction stir welds between an appliqué 6061-T6 aluminum armor plate alloy and a rolled homogeneous armor (RHA) steel plate alloy. Macroscopic defects such as micro-cracks were observed in the cross-sectional samples, and binary intermetallic compound layers were found at the aluminum-steel interfaces of the steel particles stirred into the aluminum weld matrix and across the interfaces of the weld joints. Energy dispersive spectroscopy chemical analysis identified the intermetallic layer as monoclinic Al3Fe. Dramatic decreases in grain size in the thermo-mechanically affected zones and weld zones evidenced grain refinement through plastic deformation and recrystallization. Crystallographic grain orientation and texture were examined using electron backscatter diffraction. Striated regions in the orientations of the aluminum alloy were determined to result from the severe deformation induced by the complex weld tool geometry. Many of the textures observed in the weld zone and thermo-mechanically affected zones exhibited shear texture components; however, many textures deviated from ideal simple shear. Factors characteristic of the friction stir welding process, such as post-recrystallization deformation and complex deformation induced by tool geometry, are discussed as causes of deviation from simple shear textures.

  5. [Coupling AFM fluid imaging with micro-flocculation filtration process for the technological optimization].

    PubMed

    Zheng, Bei; Ge, Xiao-peng; Yu, Zhi-yong; Yuan, Sheng-guang; Zhang, Wen-jing; Sun, Jing-fang

    2012-08-01

    Atomic force microscope (AFM) fluid imaging was applied to the study of the micro-flocculation filtration process and the optimization of micro-flocculation time and agitation intensity (G value). AFM fluid imaging proves to be a promising tool for observing and characterizing floc morphology and dynamic coagulation processes under aqueous environmental conditions. Using the AFM fluid imaging technique, an optimized micro-flocculation time of 2 min and agitation intensity (G value) of 100 s(-1) were obtained for the treatment of dye-printing industrial tailing wastewater by the micro-flocculation filtration process, with good performance.
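    The agitation intensity G used above is the standard Camp velocity gradient, G = sqrt(P/(μV)). As a minimal sketch (not from the paper; the tank volume, viscosity, and power figures below are invented for illustration), the relation can be inverted to estimate the mixing power needed to hold the optimized G of 100 s(-1):

```python
import math

def velocity_gradient(power_w, viscosity_pa_s, volume_m3):
    """Camp velocity gradient G = sqrt(P / (mu * V)), in s^-1."""
    return math.sqrt(power_w / (viscosity_pa_s * volume_m3))

def required_power(g_value, viscosity_pa_s, volume_m3):
    """Invert the relation: P = G^2 * mu * V, in watts."""
    return g_value ** 2 * viscosity_pa_s * volume_m3

# Power needed to hold G = 100 s^-1 in a hypothetical 1 m^3 tank of water
# at 20 C (mu ~ 1.0e-3 Pa*s) -- illustrative values only.
p = required_power(100.0, 1.0e-3, 1.0)
print(f"required mixing power: {p:.1f} W")  # 10.0 W
```

    The quadratic dependence on G is why modest increases in agitation intensity carry a disproportionate energy cost during rapid mixing.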

  6. Developing Coastal Adaptation to Climate Change in the New York City Infrastructure-Shed: Process, Approach, Tools, and Strategies

    NASA Technical Reports Server (NTRS)

    Rosenzweig, Cynthia; Solecki, William D.; Blake, Reginald; Bowman, Malcolm; Faris, Craig; Gornitz, Vivien; Horton, Radley; Jacob, Klaus; LeBlanc, Alice; Leichenko, Robin; hide

    2010-01-01

    While current rates of sea level rise and associated coastal flooding in the New York City region appear to be manageable by stakeholders responsible for communications, energy, transportation, and water infrastructure, projections for sea level rise and associated flooding in the future, especially those associated with rapid ice melt of the Greenland and West Antarctic ice sheets, may be beyond the range of current capacity, because an extreme event might cause flooding and inundation beyond the planning and preparedness regimes. This paper describes the comprehensive process, approach, and tools developed by the New York City Panel on Climate Change (NPCC) in conjunction with the region's stakeholders who manage its critical infrastructure, much of which lies near the coast. It presents the adaptation approach and the sea-level rise and storm projections related to coastal risks developed through the stakeholder process. Climate change adaptation planning in New York City is characterized by a multi-jurisdictional stakeholder-scientist process, state-of-the-art scientific projections and mapping, and development of adaptation strategies based on a risk-management approach.

  7. T.R.I.C.K.-Tire/Road Interaction Characterization & Knowledge - A tool for the evaluation of tire and vehicle performances in outdoor test sessions

    NASA Astrophysics Data System (ADS)

    Farroni, Flavio

    2016-05-01

    The most powerful engine, the most sophisticated aerodynamic devices or the most complex control systems will not improve vehicle performances if the forces exchanged with the road are not optimized by proper employment and knowledge of tires. The vehicle interface with the ground is constituted by the sum of small surfaces, wide about as one of our palms, in which tire/road interaction forces are exchanged. From this it is clear to see how the optimization of tire behavior represents a key-factor in the definition of the best setup of the whole vehicle. Nowadays, people and companies playing a role in automotive sector are looking for the optimal solution to model and understand tire's behavior both in experimental and simulation environments. The studies carried out and the tool developed herein demonstrate a new approach in tire characterization and in vehicle simulation procedures. This enables the reproduction of the dynamic response of a tire through the use of specific track sessions, carried out with the aim to employ the vehicle as a moving lab. The final product, named TRICK tool (Tire/Road Interaction Characterization and Knowledge), comprises of a vehicle model which processes experimental signals acquired from vehicle CAN bus and from sideslip angle estimation additional instrumentation. The output of the tool is several extra "virtual telemetry" channels, based on the time history of the acquired signals and containing force and slip estimations, useful to provide tire interaction characteristics. TRICK results can be integrated with the physical models developed by the Vehicle Dynamics UniNa research group, providing a multitude of working solutions and constituting an ideal instrument for the prediction and the simulation of the real tire dynamics.
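    The slip channels that a tool of this kind estimates are, in standard vehicle dynamics terms, the longitudinal slip ratio and the lateral slip angle. A minimal sketch of those two textbook quantities (the function names and numeric values are illustrative and are not taken from the TRICK implementation):

```python
import math

def slip_ratio(wheel_omega, wheel_radius, vx):
    """Longitudinal slip ratio kappa = (omega*R - vx) / vx (driving convention)."""
    return (wheel_omega * wheel_radius - vx) / vx

def slip_angle(vy, vx):
    """Lateral slip angle alpha = atan(vy / vx), in radians."""
    return math.atan2(vy, vx)

# Example: wheel spinning slightly faster than free rolling (light traction),
# with a small lateral velocity component at the contact patch.
kappa = slip_ratio(wheel_omega=62.0, wheel_radius=0.33, vx=20.0)
alpha = slip_angle(vy=0.7, vx=20.0)
print(f"kappa = {kappa:.3f}, alpha = {math.degrees(alpha):.2f} deg")
```

    In practice vx and vy at each wheel must themselves be estimated from CAN bus and sideslip instrumentation, which is the harder part of building such "virtual telemetry" channels.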

  8. Psychometric characterization of the obstetric communication assessment tool for medical education: a pilot study.

    PubMed

    Rodriguez, A Noel; DeWitt, Peter; Fisher, Jennifer; Broadfoot, Kirsten; Hurt, K Joseph

    2016-06-11

    To characterize the psychometric properties of a novel Obstetric Communication Assessment Tool (OCAT) in a pilot study of standardized difficult OB communication scenarios appropriate for undergraduate medical evaluation. We developed and piloted four challenging OB Standardized Patient (SP) scenarios in a sample of twenty-one third year OB/GYN clerkship students: Religious Beliefs (RB), Angry Father (AF), Maternal Smoking (MS), and Intimate Partner Violence (IPV). Five trained Standardized Patient Reviewers (SPRs) independently scored twenty-four randomized video-recorded encounters using the OCAT. Cronbach's alpha and Intraclass Correlation Coefficient-2 (ICC-2) were used to estimate internal consistency (IC) and inter-rater reliability (IRR), respectively. Systematic variation in reviewer scoring was assessed using the Stuart-Maxwell test. IC was acceptable to excellent with Cronbach's alpha values (and 95% Confidence Intervals [CI]): RB 0.91 (0.86, 0.95), AF 0.76 (0.62, 0.87), MS 0.91 (0.86, 0.95), and IPV 0.94 (0.91, 0.97). IRR was unacceptable to poor with ICC-2 values: RB 0.46 (0.40, 0.53), AF 0.48 (0.41, 0.54), MS 0.52 (0.45, 0.58), and IPV 0.67 (0.61, 0.72). Stuart-Maxwell analysis indicated systematic differences in reviewer stringency. Our initial characterization of the OCAT demonstrates important issues in communications assessment. We identify scoring inconsistencies due to differences in SPR rigor that require enhanced training to improve assessment reliability. We outline a rational process for initial communication tool validation that may be useful in undergraduate curriculum development, and acknowledge that rigorous validation of OCAT training and implementation is needed to create a valuable OB communication assessment tool.
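    Cronbach's alpha, used above to estimate internal consistency, can be computed directly from an encounters-by-items score matrix. A minimal sketch (the demo scores are invented for illustration and are not OCAT data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_subjects, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical data: rows are scored encounters, columns are rating items
# (values invented for illustration only).
demo = [[4, 5, 4, 4],
        [2, 3, 2, 3],
        [5, 5, 4, 5],
        [3, 3, 3, 2],
        [4, 4, 5, 4]]
print(f"alpha = {cronbach_alpha(demo):.2f}")
```

    Note that high alpha (items agree with each other) says nothing about inter-rater reliability, which is why the study reports ICC-2 separately.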

  9. State of the Art Assessment of Simulation in Advanced Materials Development

    NASA Technical Reports Server (NTRS)

    Wise, Kristopher E.

    2008-01-01

    Advances in both the underlying theory and in the practical implementation of molecular modeling techniques have increased their value in the advanced materials development process. The objective is to accelerate the maturation of emerging materials by tightly integrating modeling with the other critical processes: synthesis, processing, and characterization. The aims of this report are to summarize the state of the art of existing modeling tools and to highlight a number of areas in which additional development is required. In an effort to maintain focus and limit length, this survey is restricted to classical simulation techniques including molecular dynamics and Monte Carlo simulations.

  10. Lunar Processing Cabinet 2.0: Retrofitting Gloveboxes into the 21st Century

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.

    2015-01-01

    In 2014, the Apollo 16 Lunar Processing Glovebox (cabinet 38) in the Lunar Curation Laboratory at NASA JSC received an upgrade that included new technology interfaces. A Jacobs Technology Innovation Project provided the primary resources to retrofit this glovebox into the 21st century. The NASA Astromaterials Acquisition & Curation Office continues its more than 40-year heritage of preserving lunar materials for future scientific studies in state-of-the-art facilities. This enhancement has not only modernized the contamination controls but also provides new innovative tools for processing and characterizing lunar samples, and supports real-time exchange of sample images and information with the scientific community throughout the world.

  11. Friction Stir Welding of Metal Matrix Composites for use in aerospace structures

    NASA Astrophysics Data System (ADS)

    Prater, Tracie

    2014-01-01

    Friction Stir Welding (FSW) is a relatively nascent solid-state joining technique developed at The Welding Institute (TWI) in 1991. The process was first used at NASA to weld the super lightweight external tank for the Space Shuttle. Today FSW is used to join structural components of the Delta IV, Atlas V, and Falcon 9 rockets as well as the Orion Crew Exploration Vehicle. A current focus of FSW research is to extend the process to new materials that are difficult to weld using conventional fusion techniques. Metal Matrix Composites (MMCs) consist of a metal alloy reinforced with ceramics and have a very high strength-to-weight ratio, a property which makes them attractive for aerospace and defense applications. MMCs have found use in the Space Shuttle orbiter's structural tubing, the Hubble Space Telescope's antenna mast, control surfaces and propulsion systems for aircraft, and tank armors. The size of MMC components is severely limited by difficulties encountered in joining these materials using fusion welding. Melting of the material results in the formation of an undesirable phase (formed when molten aluminum reacts with the reinforcement) which leaves a strength-depleted region along the joint line. Since FSW occurs below the melting point of the workpiece material, this deleterious phase is absent in friction stir welded MMC joints. FSW of MMCs is, however, plagued by rapid wear of the welding tool, a consequence of the large discrepancy in hardness between the steel tool and the reinforcement material. This work characterizes the effect of process parameters (spindle speed, traverse rate, and length of joint) on the wear process. Based on the results of these experiments, a phenomenological model of the wear process was constructed based on the rotating plug model for FSW. The effectiveness of harder tool materials (such as tungsten carbide, high-speed steel, and tools with diamond coatings) in combating abrasive wear is explored.
In-process force, torque, and vibration signals are analyzed to assess the feasibility of on-line monitoring of tool shape changes as a result of wear (an advancement which would eliminate the need for off-line evaluation of tool condition during joining). Monitoring, controlling, and reducing tool wear in FSW of MMCs is essential to the implementation of these materials in structures (such as launch vehicles) where they would be of maximum benefit.

  12. Interpolity exchange of basalt tools facilitated via elite control in Hawaiian archaic states

    PubMed Central

    Kirch, Patrick V.; Mills, Peter R.; Lundblad, Steven P.; Sinton, John; Kahn, Jennifer G.

    2012-01-01

    Ethnohistoric accounts of late precontact Hawaiian archaic states emphasize the independence of chiefly controlled territories (ahupua‘a) based on an agricultural, staple economy. However, elite control of unevenly distributed resources, such as high-quality volcanic rock for adze production, may have provided an alternative source of economic power. To test this hypothesis we used nondestructive energy-dispersive X-ray fluorescence (ED-XRF) analysis of 328 lithic artifacts from 36 archaeological features in the Kahikinui district, Maui Island, to geochemically characterize the source groups. This process was followed by a limited sampling using destructive wavelength-dispersive X-ray fluorescence (WD-XRF) analysis to more precisely characterize certain nonlocal source groups. Seventeen geochemical groups were defined, eight of which represent extra-Maui Island sources. Although the majority of stone tools were derived from Maui Island sources (71%), a significant quantity (27%) of tools derived from extraisland sources, including the large Mauna Kea quarry on Hawai‘i Island as well as quarries on O‘ahu, Moloka‘i, and Lāna‘i islands. Importantly, tools quarried from extralocal sources are found in the highest frequency in elite residential features and in ritual contexts. These results suggest a significant role for a wealth economy based on the control and distribution of nonagricultural goods and resources during the rise of the Hawaiian archaic states. PMID:22203984
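    Geochemical source attribution of the kind described above amounts to assigning each artifact to the nearest source group in element-concentration space. A minimal nearest-centroid sketch (the element choices and centroid values are invented for illustration; they are not the study's actual ED-XRF source groups):

```python
import numpy as np

# Hypothetical source centroids in (Sr, Zr, Rb) ppm space -- values invented;
# real source groups come from ED-XRF measurements of quarry reference samples.
centroids = {
    "Mauna Kea": np.array([310.0, 160.0, 8.0]),
    "Maui local": np.array([520.0, 240.0, 14.0]),
    "Moloka'i": np.array([430.0, 300.0, 22.0]),
}

def assign_source(artifact_ppm):
    """Assign an artifact to the nearest source centroid (Euclidean distance)."""
    arr = np.asarray(artifact_ppm, dtype=float)
    return min(centroids, key=lambda s: np.linalg.norm(arr - centroids[s]))

print(assign_source([305.0, 158.0, 9.0]))  # nearest to the Mauna Kea centroid
```

    Real studies typically cluster on many trace elements at once and confirm ambiguous, nonlocal groups with higher-precision (here WD-XRF) measurements, as the abstract describes.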

  13. Study of PVD AlCrN Coating for Reducing Carbide Cutting Tool Deterioration in the Machining of Titanium Alloys.

    PubMed

    Cadena, Natalia L; Cue-Sampedro, Rodrigo; Siller, Héctor R; Arizmendi-Morquecho, Ana M; Rivera-Solorio, Carlos I; Di-Nardo, Santiago

    2013-05-24

    The manufacture of medical and aerospace components made of titanium alloys and other difficult-to-cut materials requires the parallel development of high performance cutting tools coated with materials capable of enhanced tribological and resistance properties. In this matter, a thin nanocomposite film made out of AlCrN (aluminum-chromium-nitride) was studied in this research, showing experimental work in the deposition process and its characterization. A heat-treated monolayer coating, competitive with other coatings in the machining of titanium alloys, was analyzed. Different analysis and characterizations were performed on the manufactured coating by scanning electron microscopy and energy-dispersive X-ray spectroscopy (SEM-EDXS), and X-ray diffraction (XRD). Furthermore, the mechanical behavior of the coating was evaluated through hardness test and tribology with pin-on-disk to quantify friction coefficient and wear rate. Finally, machinability tests using coated tungsten carbide cutting tools were executed in order to determine its performance through wear resistance, which is a key issue of cutting tools in high-end cutting at elevated temperatures. It was demonstrated that the specimen (with lower friction coefficient than previous research) is more efficient in machinability tests in Ti6Al4V alloys. Furthermore, the heat-treated monolayer coating presented better performance in comparison with a conventional monolayer of AlCrN coating.

  14. Study of PVD AlCrN Coating for Reducing Carbide Cutting Tool Deterioration in the Machining of Titanium Alloys

    PubMed Central

    Cadena, Natalia L.; Cue-Sampedro, Rodrigo; Siller, Héctor R.; Arizmendi-Morquecho, Ana M.; Rivera-Solorio, Carlos I.; Di-Nardo, Santiago

    2013-01-01

    The manufacture of medical and aerospace components made of titanium alloys and other difficult-to-cut materials requires the parallel development of high performance cutting tools coated with materials capable of enhanced tribological and resistance properties. In this matter, a thin nanocomposite film made out of AlCrN (aluminum–chromium–nitride) was studied in this research, showing experimental work in the deposition process and its characterization. A heat-treated monolayer coating, competitive with other coatings in the machining of titanium alloys, was analyzed. Different analysis and characterizations were performed on the manufactured coating by scanning electron microscopy and energy-dispersive X-ray spectroscopy (SEM-EDXS), and X-ray diffraction (XRD). Furthermore, the mechanical behavior of the coating was evaluated through hardness test and tribology with pin-on-disk to quantify friction coefficient and wear rate. Finally, machinability tests using coated tungsten carbide cutting tools were executed in order to determine its performance through wear resistance, which is a key issue of cutting tools in high-end cutting at elevated temperatures. It was demonstrated that the specimen (with lower friction coefficient than previous research) is more efficient in machinability tests in Ti6Al4V alloys. Furthermore, the heat-treated monolayer coating presented better performance in comparison with a conventional monolayer of AlCrN coating. PMID:28809266

  15. Role of Polymorphism and Thin-Film Morphology in Organic Semiconductors Processed by Solution Shearing

    PubMed Central

    2018-01-01

    Organic semiconductors (OSCs) are promising materials for cost-effective production of electronic devices because they can be processed from solution employing high-throughput techniques. However, small-molecule OSCs are prone to structural modifications because of the presence of weak van der Waals intermolecular interactions. Hence, controlling the crystallization in these materials is pivotal to achieve high device reproducibility. In this perspective article, we focus on controlling polymorphism and morphology in small-molecule organic semiconducting thin films deposited by solution-shearing techniques compatible with roll-to-roll systems. Special attention is paid to the influence that the different experimental deposition parameters can have on thin films. Further, the main characterization techniques for thin-film structures are reviewed, highlighting the in situ characterization tools that can provide crucial insights into the crystallization mechanisms. PMID:29503976

  16. Advances in the diagnosis of premenstrual syndrome and premenstrual dysphoric disorder.

    PubMed

    Futterman, Lori A

    2010-01-01

    Premenstrual disorders negatively impact the quality of life and functional ability of millions of women. The two generally recognized premenstrual disorders are premenstrual syndrome (PMS) and premenstrual dysphoric disorder (PMDD). These disorders are characterized by a wide variety of nonspecific mood, somatic and behavioral symptoms that occur only during the late luteal phase of a woman's cycle and disappear soon after the onset of menstruation. This paper reviews the diagnostic criteria for PMS and PMDD, describes some of the more common symptom diaries and other tools used to diagnose premenstrual disorders, and discusses the challenges inherent in diagnosing PMS and PMDD. A survey of peer-reviewed articles and relevant texts provided diagnostic criteria, descriptions of diagnostic tools and information about diagnostic challenges. The many nonspecific symptoms associated with premenstrual disorders complicate the diagnostic process. The use of proven symptom diaries and other diagnostic tools should aid in the differential diagnosis of premenstrual disorders. Patients need to report bothersome premenstrual symptoms, and clinicians should become more proficient in the diagnostic process in order to prevent underdiagnosis of these disorders.

  17. Pollution characterization of liquid waste of the factory complex Fertial (Arzew, Algeria).

    PubMed

    Redouane, Fares; Mourad, Lounis

    2016-03-01

    The industrial development in Algeria has made a worrying situation for all socioeconomic stakeholders. Indeed, this economic growth has been marked in recent years by the establishment of factories and industrial plants that discharge liquid waste onto marine shorelines. These releases could destabilize the environmental balance in the coming years, hence the need to support the treatment of all sources of pollution. Remediation of such discharges requires several steps, from identifying the various pollutants to their treatment. Therefore, the authors conducted this first work of characterization of industrial effluents generated by the mineral fertilizer factory complex Fertial (Arzew), and discussed the pollution load generated by this type of industry. This monitoring would establish a tool for reflection and decision support developed by a management system capable of ensuring effective and sustainable management of effluents from the industrial activities of Fertial.

  18. High flow rate development: process optimization using megasonic immersion development (MID)

    NASA Astrophysics Data System (ADS)

    Courboin, Daniel; Choi, Jong Woo; Jung, Sang Hyun; Baek, Seung Hee; Kim, Lee Ju

    2004-12-01

    In a previous study, the high impact of development by-products on Critical Dimension (CD) through the microloading effect was demonstrated for a Novolak resist. In this paper, through further tests involving Chemically Amplified Resist (CAR) and Novolak resist, the microloading effect of development is characterized and a tentative mechanism is presented. Megasonic Immersion Development (MID), a high flow rate development technique similar to Proximity Gap Suction Development (PGSD), was used and compared with spin spray development and puddle development. On TOK IP3600, a Novolak resist, we explored a wide range of process conditions with MID. Developer temperature was varied from 5°C to 40°C with TMAH developer concentrations of 1.9% and 2.38%, resulting in an isofocal dose range of 90 mJ to 190 mJ. Exposure Focus Matrix (EFM) tests with a specific microloading pattern and resist cross sections were performed. The best conditions are quite far from the standard process advised by the resist supplier. A well-defined standing-wave profile was obtained with high-temperature development. For CAR, a JEOL 9000MVII, a 50 kV e-beam vector scan tool, and an ETEC ALTA 4300, a DUV raster scan tool, were used with different develop process techniques including MID. FujiFilm Arch FEP-171 positive CAR and Sumitomo NEB-22 negative CAR were used on the 50 kV writing tool. Sumitomo PEK-130 was used on the DUV writing tool. FEP-171 and PEK-1300 show the microloading effect on high-density patterns, but NEB-22 does not. MID also shows improved reproduction of developed features in the chrome and a 20% improvement in CD uniformity. The results of this study indicate that a closer look at the development process is needed for the 90 nm and 65 nm technologies.

  19. Modeling and evaluating user behavior in exploratory visual analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, Khairi; Johnson, Andrew E.; Papka, Michael E.

    Empirical evaluation methods for visualizations have traditionally focused on assessing the outcome of the visual analytic process as opposed to characterizing how that process unfolds. There are only a handful of methods that can be used to systematically study how people use visualizations, making it difficult for researchers to capture and characterize the subtlety of cognitive and interaction behaviors users exhibit during visual analysis. To validate and improve visualization design, however, it is important for researchers to be able to assess and understand how users interact with visualization systems under realistic scenarios. This paper presents a methodology for modeling and evaluating the behavior of users in exploratory visual analysis. We model visual exploration using a Markov chain process comprising transitions between mental, interaction, and computational states. These states and the transitions between them can be deduced from a variety of sources, including verbal transcripts, videos and audio recordings, and log files. This model enables the evaluator to characterize the cognitive and computational processes that are essential to insight acquisition in exploratory visual analysis, and reconstruct the dynamics of interaction between the user and the visualization system. We illustrate this model with two exemplar user studies, and demonstrate the qualitative and quantitative analytical tools it affords.
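    A Markov chain model of this kind can be estimated from coded observation logs by counting state-to-state transitions. A minimal sketch (the state coding and session sequence below are invented for illustration, not data from the paper):

```python
from collections import defaultdict

def transition_probs(sequence):
    """Estimate first-order Markov transition probabilities from a state sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1
    probs = {}
    for a, row in counts.items():
        total = sum(row.values())
        probs[a] = {b: n / total for b, n in row.items()}
    return probs

# Hypothetical coded session: mental (M), interaction (I), computational (C) states
session = ["M", "I", "C", "M", "I", "I", "C", "M", "M", "I"]
p = transition_probs(session)
print(p["M"])  # e.g. {'I': 0.75, 'M': 0.25}
```

    The resulting transition matrix is what lets an evaluator compare, quantitatively, how exploration dynamics differ between users or between visualization designs.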

  20. Characterization of N-doped polycrystalline diamond films deposited on microgrinding tools

    NASA Astrophysics Data System (ADS)

    Jackson, M. J.; Ahmed, W.

    2005-10-01

    Chemical vapor deposited diamond films have many industrial applications but are assuming increasing importance in the area of microengineering, most notably in the development of diamond coated microgrinding tools. For these applications the control of structure and morphology is of critical importance. The crystallite size, orientation, surface roughness, and the degree of sp3 character have a profound effect on the tribological properties of the films deposited. In this article, we present experimental results on the effects of nitrogen doping on the surface morphology, crystallite size, and wear of microgrinding tools. The sp3 character optimizes at 200 ppm nitrogen, and above this value the surface becomes much smoother and crystal sizes decrease considerably. Fracture-induced wear of the diamond grain is the most important mechanism of material removal from a microgrinding tool during the grinding process. Fracture occurs as a consequence of tensile stresses induced into diamond grains by grinding forces to which they are subjected. The relationship between the wear of diamond coated grinding tools, component grinding forces, and induced stresses in the model diamond grains is described in detail. A significant correlation was found between the maximum value of tensile stress induced in the diamond grain and the appropriate wheel-wear parameter (grinding ratio). It was concluded that the magnitude of tensile stresses induced in the diamond grain by grinding forces at the rake face is the best indicator of tool wear during the grinding process.
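    The wheel-wear parameter mentioned above, the grinding ratio, is conventionally defined as the volume of workpiece material removed per unit volume of tool (wheel) wear. A minimal sketch with illustrative numbers (not measurements from this study):

```python
def grinding_ratio(volume_removed_mm3, volume_wheel_wear_mm3):
    """Grinding ratio G = volume of workpiece removed / volume of tool worn away.
    Higher G means the tool removes more material before it is consumed."""
    return volume_removed_mm3 / volume_wheel_wear_mm3

# Illustrative numbers only: 450 mm^3 of workpiece removed at a cost of
# 3 mm^3 of diamond coating lost gives G = 150.
print(grinding_ratio(450.0, 3.0))  # 150.0
```

    Correlating this ratio against the computed tensile stress in the diamond grains is the link the article draws between mechanics and observed tool life.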

  1. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. 
    Then, discontinuity set orientation is calculated using Kernel Density Estimation and principal vector similarity criteria. Poles to points are assigned to individual discontinuity objects using custom vector clustering and Jaccard distance approaches, and each object is segmented into planar clusters using an improved version of the DBSCAN algorithm. Modal set orientations are then recomputed by cluster-based orientation statistics to avoid the effects of biases related to cluster size and density heterogeneity of the point cloud. Finally, spacing values are measured between individual discontinuity clusters along scanlines parallel to modal pole vectors, whereas individual feature size (persistence) is measured using 3D convex hull bounding boxes. Spacing and size are provided both as raw population data and as summary statistics. The tool is optimized for parallel computing on 64-bit systems, and a Graphical User Interface (GUI) has been developed to manage data processing and provide several outputs, including reclassified point clouds, tables, plots, derived fracture intensity parameters, and exports to modelling software tools. We present test applications performed both on synthetic 3D data (simple 3D solids) and real case studies, validating the results with existing geomechanical datasets.
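
    The local coplanarity step described above (a K-Nearest Neighbor search followed by a planarity fit on each neighborhood) can be sketched as follows. This is a minimal stand-in, not the authors' Matlab tool: it uses a brute-force neighbor search and a least-squares plane fit in place of the full PCA eigen-decomposition, and all point values are hypothetical.

```python
def det3(m):
    """Determinant of a 3x3 matrix."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def knn(points, q, k):
    """Brute-force k-nearest neighbors of query point q."""
    return sorted(points, key=lambda p: sum((a - b) ** 2 for a, b in zip(p, q)))[:k]

def fit_plane(patch):
    """Least-squares plane z = a*x + b*y + c through a local patch of 3D points."""
    n = len(patch)
    sx = sum(p[0] for p in patch); sy = sum(p[1] for p in patch)
    sz = sum(p[2] for p in patch)
    sxx = sum(p[0] ** 2 for p in patch); syy = sum(p[1] ** 2 for p in patch)
    sxy = sum(p[0] * p[1] for p in patch)
    sxz = sum(p[0] * p[2] for p in patch); syz = sum(p[1] * p[2] for p in patch)
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]
    d = det3(A)
    coeffs = []
    for j in range(3):  # Cramer's rule: replace column j with the right-hand side
        M = [row[:] for row in A]
        for i in range(3):
            M[i][j] = rhs[i]
        coeffs.append(det3(M) / d)
    return tuple(coeffs)  # (a, b, c)

# Hypothetical cloud: a planar facet z = 0.5x - 0.2y + 1 sampled on a grid
cloud = [(x / 10, y / 10, 0.5 * x / 10 - 0.2 * y / 10 + 1.0)
         for x in range(10) for y in range(10)]
patch = knn(cloud, (0.5, 0.5, 1.15), 12)
a, b, c = fit_plane(patch)
```

    On a patch lying exactly in a plane, the fit recovers the plane coefficients; in a real cloud the residual of this fit would serve as the planarity score for accepting or rejecting a facet.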

  2. Distinct contribution of the parietal and temporal cortex to hand configuration and contextual judgements about tools.

    PubMed

    Andres, Michael; Pelgrims, Barbara; Olivier, Etienne

    2013-09-01

    Neuropsychological studies showed that manipulatory and semantic knowledge can be independently impaired in patients with upper-limb apraxia, leading to different tool use disorders. The present study aimed to dissociate the brain regions involved in judging the hand configuration or the context associated with tool use. We focussed on the left supramarginal gyrus (SMG) and the left middle temporal gyrus (MTG), whose activation, as evidenced by functional magnetic resonance imaging (fMRI) studies, suggests that they may play a critical role in tool use. The distinctive location of SMG in the dorsal visual stream led us to postulate that this parietal region could play a role in processing incoming information about tools to shape hand posture. In contrast, we hypothesized that MTG, because of its interconnections with several cortical areas involved in semantic memory, could contribute to retrieving semantic information necessary to create a contextual representation of tool use. To test these hypotheses, we used neuronavigated transcranial magnetic stimulation (TMS) to interfere transiently with the function of either left SMG or left MTG in healthy participants performing judgement tasks about either hand configuration or context of tool use. We found that SMG virtual lesions impaired hand configuration but not contextual judgements, whereas MTG lesions selectively interfered with judgements about the context of tool use while leaving hand configuration judgements unaffected. This double dissociation demonstrates that the ability to infer a context of use or a hand posture from tool perception relies on distinct processes, performed in the temporal and parietal regions. The present findings suggest that tool use disorders caused by SMG lesions will be characterized by difficulties in selecting the appropriate hand posture for tool use, whereas MTG lesions will yield difficulties in using tools in the appropriate context. Copyright © 2012. Published by Elsevier Ltd.

  3. Molecular Imaging of Vulnerable Atherosclerotic Plaques in Animal Models

    PubMed Central

    Gargiulo, Sara; Gramanzini, Matteo; Mancini, Marcello

    2016-01-01

    Atherosclerosis is characterized by intimal plaques of the arterial vessels that develop slowly and, in some cases, may undergo spontaneous rupture with subsequent heart attack or stroke. Currently, noninvasive diagnostic tools are inadequate to screen atherosclerotic lesions at high risk of acute complications. Therefore, the attention of the scientific community has been focused on the use of molecular imaging for identifying vulnerable plaques. Genetically engineered murine models such as ApoE−/− and ApoE−/−Fbn1C1039G+/− mice have been shown to be useful for testing new probes targeting biomarkers of relevant molecular processes for the characterization of vulnerable plaques, such as vascular endothelial growth factor receptor (VEGFR)-1, VEGFR-2, intercellular adhesion molecule (ICAM)-1, P-selectin, and integrins, and for the potential development of translational tools to identify high-risk patients who could benefit from early therapeutic interventions. This review summarizes the main animal models of vulnerable plaques, with an emphasis on genetically altered mice, and the state-of-the-art preclinical molecular imaging strategies. PMID:27618031

  4. Synthesis and characterization of attosecond light vortices in the extreme ultraviolet

    DOE PAGES

    Géneaux, R.; Camper, A.; Auguste, T.; ...

    2016-08-30

    Infrared and visible light beams carrying orbital angular momentum (OAM) are currently being thoroughly studied for their extremely broad application prospects, among which are quantum information, micromachining and diagnostic tools. Here we extend these prospects, presenting a comprehensive study for the synthesis and full characterization of optical vortices carrying OAM in the extreme ultraviolet (XUV) domain. We confirm the upconversion rules of a femtosecond infrared helically phased beam into its high-order harmonics, showing that each harmonic order carries the total number of OAM units absorbed in the process, up to very high orders (57). This allows us to synthesize and characterize helically shaped XUV trains of attosecond pulses. To demonstrate a typical use of these new XUV light beams, we show our ability to generate and control, through photoionization, attosecond electron beams carrying OAM. Furthermore, these breakthroughs pave the way for the study of a series of fundamental phenomena and the development of new ultrafast diagnostic tools using either photonic or electronic vortices.
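
    The upconversion rule reported above is simple enough to state as a worked example: in high-harmonic generation, the q-th harmonic photon is built from q driving photons, so it carries q times the OAM of the driving beam. A minimal sketch (the specific numbers below are illustrative):

```python
def harmonic_oam(l_drive, q):
    """OAM (in units of hbar) of the q-th harmonic, per the upconversion rule
    in the abstract: each harmonic photon absorbs q driving photons, each
    carrying l_drive units of orbital angular momentum."""
    return q * l_drive

# A helically phased infrared driving beam with one unit of OAM,
# upconverted to harmonic order 57 (the highest order cited in the abstract)
print(harmonic_oam(1, 57))  # -> 57
```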

  5. A Non-Competitive Inhibitor of VCP/p97 and VPS4 Reveals Conserved Allosteric Circuits in Type I and II AAA ATPases.

    PubMed

    Pöhler, Robert; Krahn, Jan H; van den Boom, Johannes; Dobrynin, Grzegorz; Kaschani, Farnusch; Eggenweiler, Hans-Michael; Zenke, Frank T; Kaiser, Markus; Meyer, Hemmo

    2018-02-05

    AAA ATPases have pivotal functions in diverse cellular processes essential for survival and proliferation. Revealing strategies for chemical inhibition of this class of enzymes is therefore of great interest for the development of novel chemotherapies or chemical tools. Here, we characterize the compound MSC1094308 as a reversible, allosteric inhibitor of the type II AAA ATPase human ubiquitin-directed unfoldase (VCP)/p97 and the type I AAA ATPase VPS4B. Subsequent proteomic, genetic and biochemical studies indicate that MSC1094308 binds to a previously characterized druggable hotspot of p97, thereby inhibiting the D2 ATPase activity. Our results furthermore indicate that a similar allosteric site exists in VPS4B, suggesting conserved allosteric circuits and druggable sites in both type I and II AAA ATPases. Our results may thus guide future chemical tool and drug discovery efforts for the biomedically relevant AAA ATPases. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Primary culture of glial cells from mouse sympathetic cervical ganglion: a valuable tool for studying glial cell biology.

    PubMed

    de Almeida-Leite, Camila Megale; Arantes, Rosa Maria Esteves

    2010-12-15

    Central nervous system glial cells such as astrocytes and microglia have been investigated in vitro, and many intracellular pathways have been clarified upon various stimuli. Peripheral glial cells, however, have not been as deeply investigated in vitro, despite their important role in inflammatory and neurodegenerative diseases. Based on our previous experience of culturing neuronal cells, our objective was to standardize and morphologically characterize a primary culture of mouse superior cervical ganglion glial cells in order to obtain a useful tool to study peripheral glial cell biology. Superior cervical ganglia from neonatal C57BL6 mice were enzymatically and mechanically dissociated and cells were plated on diluted Matrigel-coated wells at a final concentration of 10,000 cells/well. Five to eight days after plating, glial cell cultures were fixed for morphological and immunocytochemical characterization. Glial cells showed a flat and irregular shape, two or three long cytoplasmic processes, and round, oval or elongated nuclei with regular outlines. Cell proliferation and mitosis were detected both qualitatively and quantitatively. Glial cells were able to maintain their phenotype in our culture model, including immunoreactivity against the glial cell marker GFAP. This is the first description of immunocytochemical characterization of mouse sympathetic cervical ganglion glial cells in primary culture. This work discusses the uses and limitations of our model as a tool to study many aspects of peripheral glial cell biology. Copyright © 2010 Elsevier B.V. All rights reserved.

  7. Process development of a New Haemophilus influenzae type b conjugate vaccine and the use of mathematical modeling to identify process optimization possibilities.

    PubMed

    Hamidi, Ahd; Kreeftenberg, Hans; V D Pol, Leo; Ghimire, Saroj; V D Wielen, Luuk A M; Ottens, Marcel

    2016-05-01

    Vaccination is one of the most successful public health interventions, being a cost-effective tool for preventing deaths among young children. The earliest vaccines were developed following empirical methods, creating vaccines by trial and error. New process development tools, for example mathematical modeling, as well as new regulatory initiatives requiring better understanding of both the product and the process, are being applied to well-characterized biopharmaceuticals (for example recombinant proteins). The vaccine industry still lags behind these industries. A production process for a new Haemophilus influenzae type b (Hib) conjugate vaccine, including related quality control (QC) tests, was developed and transferred to a number of emerging vaccine manufacturers. This contributed to a sustainable global supply of affordable Hib conjugate vaccines, as illustrated by the market launch of the first Hib vaccine based on this technology in 2007 and the concomitant price reduction of Hib vaccines. This paper describes the development approach followed for this Hib conjugate vaccine as well as the mathematical modeling tool applied recently to indicate options for further improvements of the initial Hib process. The strategy followed during the process development of this Hib conjugate vaccine was a targeted and integrated approach based on prior knowledge and experience with similar products, using multi-disciplinary expertise. Mathematical modeling was used to develop a predictive model for the initial Hib process (the 'baseline' model) as well as an 'optimized' model, by proposing a number of process changes which could lead to further reduction in price. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:568-580, 2016. © 2016 American Institute of Chemical Engineers.

  8. Positronics of subnanometer atomistic imperfections in solids as a high-informative structure characterization tool.

    PubMed

    Shpotyuk, Oleh; Filipecki, Jacek; Ingram, Adam; Golovchak, Roman; Vakiv, Mykola; Klym, Halyna; Balitska, Valentyna; Shpotyuk, Mykhaylo; Kozdras, Andrzej

    2015-01-01

    Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to characterize different types of nanomaterials treated within a three-term fitting procedure are critically reconsidered. In contrast to conventional three-term analysis based on admixed positron- and positronium-trapping modes, the process of nanostructurization is considered as substitutional positron-positronium trapping within the same host matrix. The developed formalism allows estimation of the interfacial void volumes responsible for positron trapping and of the characteristic bulk positron lifetimes in nanoparticle-affected inhomogeneous media. This algorithm was validated using the example of thermally induced nanostructurization occurring in 80GeSe2-20Ga2Se3 glass.
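
    A three-term PAL analysis of the kind reconsidered above models the annihilation spectrum as a sum of exponential decay components (intensity I_i, lifetime tau_i); a common derived quantity is the intensity-weighted mean lifetime. The sketch below uses purely illustrative component values, not numbers from the paper:

```python
import math

def spectrum(t, components):
    """Idealized PAL spectrum N(t) = sum_i (I_i / tau_i) * exp(-t / tau_i)."""
    return sum(I / tau * math.exp(-t / tau) for I, tau in components)

def mean_lifetime(components):
    """Intensity-weighted mean positron lifetime, tau_bar = sum_i I_i * tau_i."""
    assert abs(sum(I for I, _ in components) - 1.0) < 1e-9  # intensities sum to 1
    return sum(I * tau for I, tau in components)

# Illustrative three-term fit: (intensity, lifetime in ns)
comps = [(0.70, 0.20), (0.25, 0.40), (0.05, 1.80)]
print(round(mean_lifetime(comps), 2))  # -> 0.33
```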

  9. Automated batch characterization of inkjet-printed elastomer lenses using a LEGO platform.

    PubMed

    Sung, Yu-Lung; Garan, Jacob; Nguyen, Hoang; Hu, Zhenyu; Shih, Wei-Chuan

    2017-09-10

    Small, self-adhesive, inkjet-printed elastomer lenses have enabled smartphone cameras to image and resolve microscopic objects. However, the performance of different lenses within a batch is affected by hard-to-control environmental variables. We present a cost-effective platform to perform automated batch characterization of 300 lens units simultaneously for quality inspection. The system was designed and configured with LEGO bricks, 3D printed parts, and a digital camera. The scheme presented here may become the basis of a high-throughput, in-line inspection tool for quality control purposes and can also be employed for optimization of the manufacturing process.

  10. Environmental exposures and health impacts of PFAS ...

    EPA Pesticide Factsheets

    Environmental exposures and health impacts of PFAS The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for the EPA.

  11. Parkia pendula lectin as histochemistry marker for meningothelial tumour.

    PubMed

    Beltrão, E I C; Medeiros, P L; Rodrigues, O G; Figueredo-Silva, J; Valença, M M; Coelho, L C B B; Carvalho, L B

    2003-01-01

    Lectins have been intensively used in histochemical techniques for cell surface characterization. These proteins are involved in several biological processes, and their use as histochemical markers has been evaluated since they can indicate differences in cell surfaces. Parkia pendula lectin (PpeL) was evaluated as a histochemical marker for meningothelial meningioma biopsies. Tissue slices were incubated with PpeL conjugated to horseradish peroxidase (PpeL-HRP) and Concanavalin A-HRP (ConA-HRP), and binding was visualized with diaminobenzidine and hydrogen peroxide. The lectin-tissue binding was inhibited with D-glucose. PpeL proved to be a useful tool for the characterization of meningothelial tumour and for clinico-pathological diagnosis.

  12. Positronics of subnanometer atomistic imperfections in solids as a high-informative structure characterization tool

    NASA Astrophysics Data System (ADS)

    Shpotyuk, Oleh; Filipecki, Jacek; Ingram, Adam; Golovchak, Roman; Vakiv, Mykola; Klym, Halyna; Balitska, Valentyna; Shpotyuk, Mykhaylo; Kozdras, Andrzej

    2015-02-01

    Methodological possibilities of positron annihilation lifetime (PAL) spectroscopy applied to characterize different types of nanomaterials treated within a three-term fitting procedure are critically reconsidered. In contrast to conventional three-term analysis based on admixed positron- and positronium-trapping modes, the process of nanostructurization is considered as substitutional positron-positronium trapping within the same host matrix. The developed formalism allows estimation of the interfacial void volumes responsible for positron trapping and of the characteristic bulk positron lifetimes in nanoparticle-affected inhomogeneous media. This algorithm was validated using the example of thermally induced nanostructurization occurring in 80GeSe2-20Ga2Se3 glass.

  13. Constraining anomalous Higgs boson couplings to the heavy-flavor fermions using matrix element techniques

    NASA Astrophysics Data System (ADS)

    Gritsan, Andrei V.; Röntsch, Raoul; Schulze, Markus; Xiao, Meng

    2016-09-01

    In this paper, we investigate anomalous interactions of the Higgs boson with heavy fermions, employing shapes of kinematic distributions. We study the processes pp → tt̄ + H, bb̄ + H, tq + H, and pp → H → τ⁺τ⁻ and present applications of event generation, reweighting techniques for fast simulation of anomalous couplings, as well as matrix element techniques for optimal sensitivity. We extend the matrix element likelihood approach (MELA) technique, which proved to be a powerful matrix element tool for Higgs boson discovery and characterization during Run I of the LHC, and implement all analysis tools in the JHU generator framework. A next-to-leading-order QCD description of the pp → tt̄ + H process allows us to investigate the performance of the MELA in the presence of extra radiation. Finally, projections for LHC measurements through the end of Run III are presented.
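
    The MELA discriminant mentioned above is, at heart, a likelihood ratio built from matrix-element probabilities for two competing hypotheses. The sketch below uses toy one-dimensional Gaussian densities as stand-ins for those probabilities (all numbers hypothetical; the real technique evaluates full multidimensional matrix elements within the JHU generator framework):

```python
import math

def gauss(x, mu, sigma):
    """Normal density, used here as a toy stand-in for a matrix-element probability."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def mela_discriminant(p_sig, p_bkg):
    """D = P_sig / (P_sig + P_bkg); values near 1 favor the signal hypothesis."""
    return p_sig / (p_sig + p_bkg)

# Toy event with a single kinematic observable x and hypothetical densities
x = 0.8
d = mela_discriminant(gauss(x, 1.0, 0.3), gauss(x, 0.0, 0.5))
print(round(d, 2))
```

    Because the observable here lies closer to the signal hypothesis, the discriminant comes out above 0.5; in practice D is computed per event and its distribution is fitted.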

  14. Additive manufacturing of materials: Opportunities and challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babu, Sudarsanam Suresh; Love, Lonnie J.; Dehoff, Ryan R.

    Additive manufacturing (also known as 3D printing) is considered a disruptive technology for producing components with topologically optimized complex geometries as well as functionalities that are not achievable by traditional methods. The realization of the full potential of 3D printing is stifled by a lack of computational design tools, generic material feedstocks, techniques for monitoring thermomechanical processes under in situ conditions, and especially methods for minimizing anisotropic static and dynamic properties brought about by microstructural heterogeneity. In this paper, we discuss the role of interdisciplinary research involving robotics and automation, process control, multiscale characterization of microstructure and properties, and high-performance computational tools to address each of these challenges. In addition, emerging pathways to scale up additive manufacturing of structural materials to large sizes (>1 m) and higher productivities (5–20 kg/h) while maintaining mechanical performance and geometrical flexibility are also discussed.

  15. Additive manufacturing of materials: Opportunities and challenges

    DOE PAGES

    Babu, Sudarsanam Suresh; Love, Lonnie J.; Dehoff, Ryan R.; ...

    2015-11-01

    Additive manufacturing (also known as 3D printing) is considered a disruptive technology for producing components with topologically optimized complex geometries as well as functionalities that are not achievable by traditional methods. The realization of the full potential of 3D printing is stifled by a lack of computational design tools, generic material feedstocks, techniques for monitoring thermomechanical processes under in situ conditions, and especially methods for minimizing anisotropic static and dynamic properties brought about by microstructural heterogeneity. In this paper, we discuss the role of interdisciplinary research involving robotics and automation, process control, multiscale characterization of microstructure and properties, and high-performance computational tools to address each of these challenges. In addition, emerging pathways to scale up additive manufacturing of structural materials to large sizes (>1 m) and higher productivities (5–20 kg/h) while maintaining mechanical performance and geometrical flexibility are also discussed.

  16. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    PubMed

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.

  17. Principles of Metamorphic Petrology

    NASA Astrophysics Data System (ADS)

    Williams, Michael L.

    2009-05-01

    The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.

  18. Implementation of "Quality by Design (QbD)" Approach for the Development of 5-Fluorouracil Loaded Thermosensitive Hydrogel.

    PubMed

    Dalwadi, Chintan; Patel, Gayatri

    2016-01-01

    The purpose of this study was to investigate the Quality by Design (QbD) principle for the preparation of hydrogel products, to prove both the practicability and utility of applying the QbD concept to hydrogel-based controlled release systems. Product and process understanding helps decrease the variability of critical material and process parameters, which yields quality product output and reduces risk. This study includes the identification of the Quality Target Product Profiles (QTPPs) and Critical Quality Attributes (CQAs) from the literature or preliminary studies. To identify and control the variability in process and material attributes, two QbD tools were utilized: Quality Risk Management (QRM) and experimental design. Further, this helps to identify the effect of these attributes on CQAs. Potential risk factors were identified from a fishbone diagram, screened by risk assessment, and optimized by a 3-level, 2-factor experimental design with center points in triplicate, to analyze the precision of the target process. The optimized formulation was further characterized by gelling time, gelling temperature, rheological parameters, in-vitro biodegradation and in-vitro drug release. A design space was created using the experimental design tool; it defines the control space, and working within this controlled space keeps all failure modes below the acceptable risk level. In conclusion, the QbD approach with the QRM tool provides a potent and effective paradigm for building quality into the hydrogel.
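
    The screening design mentioned above, a 3-level, 2-factor experimental design with center points in triplicate, can be enumerated directly in coded units. A minimal sketch of the run list only, not the statistical analysis:

```python
from itertools import product

def full_factorial_with_centers(levels=(-1, 0, 1), center_reps=3):
    """Full 3-level, 2-factor factorial in coded units, with the center point
    (0, 0) repeated until it appears center_reps times in total (center
    points in triplicate, as in the abstract)."""
    runs = list(product(levels, levels))   # 9 factorial runs
    runs += [(0, 0)] * (center_reps - 1)   # (0, 0) already occurs once in the grid
    return runs

runs = full_factorial_with_centers()
print(len(runs), runs.count((0, 0)))  # -> 11 3
```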

  19. A Practical Framework Toward Prediction of Breaking Force and Disintegration of Tablet Formulations Using Machine Learning Tools.

    PubMed

    Akseli, Ilgaz; Xie, Jingjin; Schultz, Leon; Ladyzhynsky, Nadia; Bramante, Tommasina; He, Xiaorong; Deanne, Rich; Horspool, Keith R; Schwabe, Robert

    2017-01-01

    Enabling the paradigm of quality by design requires the ability to quantitatively correlate material properties and process variables to measurable product performance attributes. Conventional, quality-by-test methods for determining tablet breaking force and disintegration time usually involve destructive tests, which consume a significant amount of time and labor and provide limited information. Recent advances in material characterization, statistical analysis, and machine learning have provided multiple tools that have the potential to support nondestructive, fast, and accurate approaches in drug product development. In this work, a methodology to predict the breaking force and disintegration time of tablet formulations using nondestructive ultrasonics and machine learning tools was developed. The input variables to the model include intrinsic properties of the formulation and extrinsic process variables influencing the tablet during manufacturing. The model has been applied to predict breaking force and disintegration time using small quantities of active pharmaceutical ingredient and prototype formulation designs. The novel approach presented is a step toward rational design of a robust drug product based on insight into the performance of common materials during formulation and process development. It may also help expedite the drug product development timeline and reduce active pharmaceutical ingredient usage while improving efficiency of the overall process. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
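
    The core idea above, mapping a nondestructive measurement to a destructive quality attribute through a trained model, can be illustrated with the simplest possible stand-in: an ordinary least-squares line from a hypothetical ultrasonic reading to breaking force. The paper itself uses richer material descriptors and machine learning models; all numbers here are invented for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = m*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

# Hypothetical calibration: ultrasonic velocity (m/s) vs. breaking force (N)
v = [1500, 1600, 1700, 1800]
f = [110, 130, 150, 170]
m, b = fit_line(v, f)
predict = lambda x: m * x + b
print(round(predict(1650), 1))  # -> 140.0
```

    Once calibrated, such a model replaces the destructive test for routine readings; the real methodology additionally validates predictions against held-out destructive measurements.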

  20. Driving factors for torrential mass-movements occurrence in the Western Alps

    NASA Astrophysics Data System (ADS)

    Tiranti, Davide; Cremonini, Roberto; Asprea, Irene; Marco, Federica

    2016-02-01

    To understand the behaviour of torrential processes in the alpine environment, the conditions mainly responsible for the occurrence of these phenomena have to be identified and classified as predisposing and triggering factors. In this regard, this study aims to understand which factors lead to the occurrence of a given torrential process in alpine catchments in the Western Alps, where information on past events is exhaustive and characterized by a long historical series. More than 769 documented torrential events occurred from 1728 to 2015 within 78 catchments. Datasets concerning climate, geology and morphology, land use and the presence of historical landslide activity were elaborated as input for multivariate statistical analysis to characterize the behaviour of the catchments. The results pinpoint the factors that mainly drive the type of dominant torrential process occurring in a given catchment, its occurrence probability, and its frequency. This study has demonstrated that catchments characterized by a significant percentage of outcropping rocks show a greater occurrence of torrential processes, especially hyperconcentrated flows and debris flows; on the contrary, highly vegetated catchments are typically subject to water flows. This result can be a useful tool for the evaluation of hazards related to these specific phenomena, making it possible to predict the most likely torrential processes that can be generated in a specific basin, given the characteristics of outcropping rock and vegetation cover.

  1. Characteristic time scales for diffusion processes through layers and across interfaces

    NASA Astrophysics Data System (ADS)

    Carr, Elliot J.

    2018-04-01

    This paper presents a simple tool for characterizing the time scale for continuum diffusion processes through layered heterogeneous media. This mathematical problem is motivated by several practical applications such as heat transport in composite materials, flow in layered aquifers, and drug diffusion through the layers of the skin. In such processes, the physical properties of the medium vary across layers and internal boundary conditions apply at the interfaces between adjacent layers. To characterize the time scale, we use the concept of mean action time, which provides the mean time scale at each position in the medium by utilizing the fact that the transition of the transient solution of the underlying partial differential equation model, from initial state to steady state, can be represented as a cumulative distribution function of time. Using this concept, we define the characteristic time scale for a multilayer diffusion process as the maximum value of the mean action time across the layered medium. For given initial conditions and internal and external boundary conditions, this approach leads to simple algebraic expressions for characterizing the time scale that depend on the physical and geometrical properties of the medium, such as the diffusivities and lengths of the layers. Numerical examples demonstrate that these expressions provide useful insight into explaining how the parameters in the model affect the time it takes for a multilayer diffusion process to reach steady state.
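
    The mean-action-time construction can be checked numerically in the simplest setting: a single homogeneous layer with a fixed boundary value at one end and no flux at the other, a reduced stand-in for the paper's multilayer analysis (the setup, grid sizes and comparison value below are this sketch's assumptions). For u_t = D u_xx on [0, L] with u(x,0) = 0, u(0,t) = 1 and u_x(L,t) = 0, the mean action time is t(x) = x(2L − x)/(2D), so the characteristic (maximum) time scale is L²/(2D).

```python
# Explicit finite-difference check of the mean action time (MAT) for a
# single layer. MAT at each node is the time integral of (u_inf - u),
# since here u_inf - u_0 = 1 everywhere.
D, L, nx = 1.0, 1.0, 21
dx = L / (nx - 1)
dt = 0.001                    # stable: dt <= dx^2 / (2 D)
r = D * dt / dx ** 2
u = [0.0] * nx
u[0] = 1.0                    # Dirichlet boundary at x = 0
mat = [0.0] * nx              # accumulates integral of (u_inf - u) dt
for _ in range(4000):         # integrate to t = 4; the transient has died out
    for i in range(nx):
        mat[i] += (1.0 - u[i]) * dt
    new = u[:]
    for i in range(1, nx - 1):
        new[i] = u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
    new[-1] = u[-1] + 2 * r * (u[-2] - u[-1])   # mirror ghost: no-flux end
    u = new
print(round(mat[-1], 2))      # maximum MAT; analytic value L^2/(2D) = 0.5
```

    The numerical maximum agrees with the algebraic expression to within discretization error, which is the kind of check the paper's closed-form multilayer expressions make unnecessary.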

  2. Characteristic time scales for diffusion processes through layers and across interfaces.

    PubMed

    Carr, Elliot J

    2018-04-01

    This paper presents a simple tool for characterizing the time scale for continuum diffusion processes through layered heterogeneous media. This mathematical problem is motivated by several practical applications such as heat transport in composite materials, flow in layered aquifers, and drug diffusion through the layers of the skin. In such processes, the physical properties of the medium vary across layers and internal boundary conditions apply at the interfaces between adjacent layers. To characterize the time scale, we use the concept of mean action time, which provides the mean time scale at each position in the medium by utilizing the fact that the transition of the transient solution of the underlying partial differential equation model, from initial state to steady state, can be represented as a cumulative distribution function of time. Using this concept, we define the characteristic time scale for a multilayer diffusion process as the maximum value of the mean action time across the layered medium. For given initial conditions and internal and external boundary conditions, this approach leads to simple algebraic expressions for characterizing the time scale that depend on the physical and geometrical properties of the medium, such as the diffusivities and lengths of the layers. Numerical examples demonstrate that these expressions provide useful insight into explaining how the parameters in the model affect the time it takes for a multilayer diffusion process to reach steady state.

  3. Independent Assessment of Technology Characterizations to Support the Biomass Program Annual State-of-Technology Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yeh, B.

    2011-03-01

    This report discusses an investigation that addressed two thermochemical conversion pathways for the production of liquid fuels, covering the steps of the process, the technology providers, a method for determining the state of technology, and a tool to continuously assess the state of technology. This report summarizes the findings of the investigation as well as recommendations for improvements for future studies.

  4. Analysis of Volatile Organic Compounds in a Controlled Environment: Ethylene Gas Measurement Studies on Radish

    NASA Technical Reports Server (NTRS)

    Kong, Suk Bin

    2001-01-01

    The volatile organic compound (VOC) ethylene was characterized and quantified by GC/FID; levels of 20-50 ppb were detected during the growth stages of radish. Solid-phase microextraction (SPME) could be a good analytical tool for this purpose. A low-temperature trapping method using a dry ice/diethyl ether bath and a liquid nitrogen bath was recommended for the sampling process in GC/PID and GC/MS analysis.

  5. A RSM-based predictive model to characterize heat treating parameters of D2 steel using combined Barkhausen noise and hysteresis loop methods

    NASA Astrophysics Data System (ADS)

    Kahrobaee, Saeed; Hejazi, Taha-Hossein

    2017-07-01

    Austenitizing and tempering temperatures are the effective characteristics in the heat treating process of AISI D2 tool steel. Controlling them therefore enables the heat treatment process to be designed more accurately, which results in more balanced mechanical properties. The aim of this work is to develop a multiresponse predictive model that enables finding these characteristics based on nondestructive tests, using a set of parameters of the magnetic Barkhausen noise technique and the hysteresis loop method. To produce various microstructural changes, identical specimens from the AISI D2 steel sheet were austenitized in the range 1025-1130 °C for 30 min, oil-quenched and finally tempered at various temperatures between 200 °C and 650 °C. A set of nondestructive data was gathered based on a general factorial design of experiments and used for training and testing the multiple response surface model. Finally, an optimization model has been proposed to achieve minimal prediction error. Results revealed that applying the Barkhausen and hysteresis loop methods simultaneously, coupled with the multiresponse model, has the potential to serve as a reliable and accurate nondestructive tool for predicting the austenitizing and tempering temperatures (which, in turn, characterize the microstructural changes) of parts with unknown heat treating conditions.
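As a hedged illustration of the multiresponse response-surface idea, the sketch below fits a quadratic surface that maps two synthetic magnetic features (stand-ins for Barkhausen-noise and hysteresis-loop parameters; the feature definitions and coefficients are invented for the example, not taken from the paper) to the two responses, austenitizing and tempering temperature, by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "measurements": two magnetic features per specimen
# (stand-ins for Barkhausen-noise RMS and coercivity; the functional
# forms and coefficients below are invented for illustration).
n = 60
T_aust = rng.uniform(1025.0, 1130.0, n)   # austenitizing temperature, deg C
T_temp = rng.uniform(200.0, 650.0, n)     # tempering temperature, deg C
f1 = 0.8 * T_temp - 0.1 * T_aust + rng.normal(0.0, 5.0, n)
f2 = 0.002 * T_aust**2 - 0.5 * T_temp + rng.normal(0.0, 5.0, n)

# Quadratic response-surface design matrix in the two features,
# fitting both temperature responses jointly (multiresponse model).
X = np.column_stack([np.ones(n), f1, f2, f1 * f2, f1**2, f2**2])
Y = np.column_stack([T_aust, T_temp])
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)

pred = X @ beta
rmse = np.sqrt(np.mean((pred - Y) ** 2, axis=0))
print(rmse)   # per-response in-sample RMSE, deg C
```

The fit should beat a mean-only predictor for both responses; a real application would also cross-validate and, as in the paper, optimize the model to minimize prediction error.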

  6. Risk Informed Margins Management as part of Risk Informed Safety Margin Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith

    2014-06-01

    The ability to better characterize and quantify safety margin is important to improved decision making about Light Water Reactor (LWR) design, operation, and plant life extension. A systematic approach to the characterization of safety margins and the subsequent margin management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. In addition, as research and development in the LWR Sustainability (LWRS) Program and other collaborative efforts yield new data, sensors, and improved scientific understanding of the physical processes that govern the aging and degradation of plant systems, structures, and components (SSCs), needs and opportunities to better optimize plant safety and performance will become known. To support decision making related to economics, reliability, and safety, the Risk Informed Safety Margin Characterization (RISMC) Pathway provides methods and tools that enable mitigation options known as risk informed margins management (RIMM) strategies.
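A safety margin of the kind RISMC quantifies can be expressed probabilistically as the distribution of capacity minus load. The Monte Carlo sketch below uses illustrative normal distributions (the variables, means and spreads are assumptions for the example, not plant data) to estimate the mean margin and the probability that load exceeds capacity:

```python
import numpy as np

rng = np.random.default_rng(42)

# Probabilistic margin: margin = capacity - load, both uncertain.
# The distributions below are illustrative assumptions only.
n = 200_000
load = rng.normal(1200.0, 80.0, n)       # e.g. a peak temperature demand, K
capacity = rng.normal(1478.0, 30.0, n)   # e.g. the corresponding limit, K

margin = capacity - load
p_exceed = np.mean(margin < 0.0)         # probability load exceeds capacity
print(margin.mean(), p_exceed)
```

Margin management strategies can then be compared by how they shift this distribution (raising capacity, reducing load uncertainty) rather than by a single deterministic margin number.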

  7. Manufacturing Challenges Associated with the Use of Metal Matrix Composites in Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Prater, Tracie

    2014-01-01

    Metal Matrix Composites (MMCs) consist of a metal alloy reinforced with ceramic particles or fibers. These materials possess a very high strength-to-weight ratio, good resistance to impact and wear, and a number of other properties which make them attractive for use in aerospace and defense applications. MMCs have found use in the Space Shuttle orbiter's structural tubing, the Hubble Space Telescope's antenna mast, control surfaces and propulsion systems for aircraft, and tank armors. The size of MMC components is severely limited by difficulties encountered in joining these materials using fusion welding. Melting of the material results in the formation of an undesirable phase (formed when molten aluminum reacts with the reinforcement) which leaves a strength-depleted region along the joint line. Friction Stir Welding (FSW) is a relatively nascent solid-state joining technique developed at The Welding Institute (TWI) in 1991. The process was first used at NASA to weld the super lightweight external tank for the Space Shuttle. Today FSW is used to join structural components of the Delta IV, Atlas V, and Falcon 9 rockets as well as NASA's Orion Crew Exploration Vehicle and Space Launch System. A current focus of FSW research is to extend the process to new materials, such as MMCs, which are difficult to weld using conventional fusion techniques. Since Friction Stir Welding occurs below the melting point of the workpiece material, this deleterious phase is absent in FSW joints in MMCs. FSW of MMCs is, however, plagued by rapid wear of the welding tool, a consequence of the large discrepancy in hardness between the steel tool and the reinforcement material. This chapter summarizes the challenges encountered when joining MMCs to themselves or to other materials in structures.
Specific attention is paid to the influence of process variables in Friction Stir Welding on tool wear; the chapter characterizes the effect of process parameters (spindle speed, traverse rate, and length of joint) on the wear process. A phenomenological model of the wear process was constructed based on the rotating plug model of Friction Stir Welding. The effectiveness of harder tool materials (such as tungsten carbide, high speed steel, and tools with diamond coatings) in combating abrasive wear is also explored. In-process force, torque, and vibration signals are analyzed to assess the feasibility of in situ monitoring of tool shape changes as a result of wear (an advancement that would eliminate the need for off-line evaluation of tool condition during joining). Monitoring, controlling, and reducing tool wear in FSW of MMCs is essential to the implementation of these materials in structures (such as launch vehicles) where they would be of maximum benefit. The work presented here is extendable to machining of MMCs, where wear of the tool is also a limiting factor.

  8. Micro-Raman spectroscopy as a tool for the characterization of silicon carbide in power semiconductor material processing

    NASA Astrophysics Data System (ADS)

    De Biasio, M.; Kraft, M.; Schultz, M.; Goller, B.; Sternig, D.; Esteve, R.; Roesner, M.

    2017-05-01

    Silicon carbide (SiC) is a wide band-gap semiconductor material that is used increasingly for high voltage power devices, since it has a higher breakdown field strength and better thermal conductivity than silicon. However, its hardness in particular makes wafer processing difficult, and many standard semiconductor processes have to be specially adapted. We measure the effects of (i) mechanical processing (i.e. grinding of the backside) and (ii) chemical and thermal processing (i.e. doping and annealing), using confocal microscopy to measure the surface roughness of ground wafers and micro-Raman spectroscopy to measure the stresses induced in the wafers by grinding. 4H-SiC wafers with different dopings were studied before and after annealing, using depth-resolved micro-Raman spectroscopy to observe how doping and annealing affect (i) the damage and stresses induced in the crystalline structure of the samples and (ii) the concentration of free electrical carriers. Our results show that mechanical, chemical and thermal processing techniques have effects on this semiconductor material that can be observed and characterized using confocal microscopy and high-resolution micro-Raman spectroscopy.

  9. Förster resonance energy transfer as a tool to study photoreceptor biology

    PubMed Central

    Hovan, Stephanie C.; Howell, Scott; Park, Paul S.-H.

    2010-01-01

    Vision is initiated in photoreceptor cells of the retina by a set of biochemical events called phototransduction. These events occur via coordinated dynamic processes that include changes in secondary messenger concentrations, conformational changes and post-translational modifications of signaling proteins, and protein-protein interactions between signaling partners. A complete description of the orchestration of these dynamic processes is still unavailable. Described in this work is the first step in the development of tools combining fluorescent protein technology, Förster resonance energy transfer (FRET), and transgenic animals that have the potential to reveal important molecular insights about the dynamic processes occurring in photoreceptor cells. We characterize the fluorescent proteins SCFP3A and SYFP2 for use as a donor-acceptor pair in FRET assays, which will facilitate the visualization of dynamic processes in living cells. We also demonstrate the targeted expression of these fluorescent proteins to the rod photoreceptor cells of Xenopus laevis, and describe a general method for detecting FRET in these cells. The general approaches described here can address numerous types of questions related to phototransduction and photoreceptor biology by providing a platform to visualize dynamic processes in molecular detail within a native context. PMID:21198205
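FRET efficiency depends on donor-acceptor separation r through the Förster radius R0 as E = 1/(1 + (r/R0)^6), which is what makes the method a molecular-scale ruler for protein-protein interactions. A minimal sketch (the R0 value is an assumed, pair-dependent illustration, not a measured value for the SCFP3A/SYFP2 pair):

```python
import numpy as np

def fret_efficiency(r_nm, R0_nm=4.9):
    """Foerster transfer efficiency E = 1 / (1 + (r/R0)^6).

    R0 ~ 4.9 nm is an illustrative Foerster radius for a CFP/YFP-type
    donor-acceptor pair (assumed value; R0 is pair- and
    environment-dependent in practice).
    """
    return 1.0 / (1.0 + (np.asarray(r_nm, dtype=float) / R0_nm) ** 6)

r = np.array([2.0, 4.9, 8.0])        # separations in nm
print(fret_efficiency(r))            # E = 0.5 exactly at r = R0
```

The sixth-power dependence means efficiency falls from near 1 to near 0 over roughly 0.5 R0 to 1.5 R0, so detectable FRET between labeled signaling partners implies separations of only a few nanometres.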

  10. Effects of Process Parameters and Cryotreated Electrode on the Radial Overcut of Aisi 304 IN SiC Powder Mixed Edm

    NASA Astrophysics Data System (ADS)

    Bhaumik, Munmun; Maity, Kalipada

    Powder mixed electro discharge machining (PMEDM) is a further advancement of conventional electro discharge machining (EDM) in which powder particles are suspended in the dielectric medium to enhance the machining rate as well as the surface finish. Cryogenic treatment is introduced in this process to improve tool life and cutting tool properties. In the present investigation, the characterization of the cryotreated tempered electrode was performed. An attempt has been made to study the effect of a cryotreated double tempered electrode on the radial overcut (ROC) when SiC powder is mixed into the kerosene dielectric during electro discharge machining of AISI 304. The process performance has been evaluated by means of ROC, with peak current, pulse on time, gap voltage, duty cycle and powder concentration considered as process parameters; machining is performed using tungsten carbide electrodes (untreated and double tempered). A regression analysis was performed to correlate the response with the process parameters. Microstructural analysis was carried out on the machined surfaces. The least radial overcut was observed for conventional EDM as compared to powder mixed EDM. The cryotreated double tempered electrode reduced the radial overcut significantly compared with the untreated electrode.

  11. Friction-Stir Welding of Aluminum For the Space Program

    NASA Technical Reports Server (NTRS)

    Jones, Clyde S.; Smelser, Jerry W. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center is developing and characterizing the friction stir welding process for the Space Shuttle and other space programs. This revolutionary process, invented and patented by The Welding Institute (TWI) in England, offers tremendous advantages for joining aluminum for high performance applications. It is particularly suited for advanced aluminum-lithium alloys, such as 2195, the primary structural alloy used in the External Tank. The friction stir welding process joins metals with minimal heat input, resulting in high-strength joints with high ductility. It is a simple process to demonstrate using a common milling machine for sample parts, but relatively expensive to implement on large-scale hardware, due to the high cost of tooling needed to handle the high forging pressures characteristic of the process. Recent developments at the Marshall Space Flight Center have demonstrated friction stir welding on linear joints up to 5 meters (15 ft.), with material thickness ranging between 2.5 mm and 16.5 mm (0.100" to 0.650"). High efficiency weld joints have been produced in aluminum from the 2000, 5000, and 6000 series alloy systems. A "retractable pin tool" system was patented by MSFC that allows the use of friction stir welding for joints with changing material thickness, and with less rigid tooling than previously considered. This presentation describes the alloys welded to date and technical advances under development at MSFC. These developments could have substantial benefit for industrial applications of aluminum welding.

  12. Differential dynamic microscopy to characterize Brownian motion and bacteria motility

    NASA Astrophysics Data System (ADS)

    Germain, David; Leocmach, Mathieu; Gibaud, Thomas

    2016-03-01

    We have developed a lab module for undergraduate students, which involves the process of quantifying the dynamics of a suspension of microscopic particles using Differential Dynamic Microscopy (DDM). DDM is a relatively new technique that constitutes an alternative method to more classical techniques such as dynamic light scattering (DLS) or video particle tracking (VPT). The technique consists of imaging a particle dispersion with a standard light microscope and a camera and analyzing the images using a digital Fourier transform to obtain the intermediate scattering function, an autocorrelation function that characterizes the dynamics of the dispersion. We first illustrate DDM in the textbook case of colloids under Brownian motion, where we measure the diffusion coefficient. Then we show that DDM is a pertinent tool to characterize biological systems such as motile bacteria.
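The heart of DDM is the image structure function D(q, τ) = ⟨|Î(q, t+τ) - Î(q, t)|²⟩, computed from Fourier transforms of frame differences rather than from tracked particles. The sketch below builds a synthetic movie of Gaussian "particles" undergoing a 2D random walk and computes the structure function (here summed over all wave vectors q for brevity; a real DDM analysis keeps the q dependence and fits the intermediate scattering function to extract the diffusion coefficient):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic movie: N Gaussian "particles" doing a 2D random walk.
N, size, steps, sigma, step_std = 40, 64, 24, 2.0, 1.0
pos = rng.uniform(0.0, size, (N, 2))
yy, xx = np.mgrid[0:size, 0:size]

frames = []
for _ in range(steps):
    img = np.zeros((size, size))
    for x, y in pos:
        img += np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2.0 * sigma**2))
    frames.append(img)
    pos += rng.normal(0.0, step_std, pos.shape)   # Brownian displacement
frames = np.array(frames)

def structure_function(frames, lag):
    """DDM image structure function, summed over wave vectors:
    mean |FFT2(I(t+lag) - I(t))|^2, averaged over available frame pairs."""
    diffs = frames[lag:] - frames[:-lag]
    return np.mean(np.abs(np.fft.fft2(diffs, axes=(1, 2))) ** 2)

d1, d8 = structure_function(frames, 1), structure_function(frames, 8)
print(d1, d8)   # decorrelation grows with lag for diffusing particles
```

Because the particles keep diffusing, frames separated by a larger lag differ more, so the structure function grows with lag and eventually plateaus once the images are fully decorrelated; the growth rate at each q encodes the diffusion coefficient.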

  13. CORSEN, a new software dedicated to microscope-based 3D distance measurements: mRNA-mitochondria distance, from single-cell to population analyses.

    PubMed

    Jourdren, Laurent; Delaveau, Thierry; Marquenet, Emelie; Jacq, Claude; Garcia, Mathilde

    2010-07-01

    Recent improvements in microscopy technology allow detection of single molecules of RNA, but tools for large-scale automatic analyses of particle distributions are lacking. An increasing number of imaging studies emphasize the importance of mRNA localization in the definition of cell territory or the biogenesis of cell compartments. CORSEN is a new tool dedicated to three-dimensional (3D) distance measurements from imaging experiments, especially developed to access the minimal distance between RNA molecules and cellular compartment markers. CORSEN includes a 3D segmentation algorithm allowing the extraction and characterization of the cellular objects to be processed (surface determination, aggregate decomposition) for minimal distance calculations. CORSEN's main contribution lies in exploratory statistical analysis, cell population characterization, and high-throughput assays that are made possible by the implementation of batch process analysis. We highlighted CORSEN's utility for the study of the relative positions of mRNA molecules and mitochondria: CORSEN clearly discriminates mRNAs localized to the vicinity of mitochondria from those that are translated on free cytoplasmic polysomes. Moreover, it quantifies the cell-to-cell variations of mRNA localization and emphasizes the necessity for statistical approaches. This method can be extended to assess the evolution of the distance between specific mRNAs and other cellular structures in different cellular contexts. CORSEN was designed for the biologist community with the concern to provide an easy-to-use and highly flexible tool that can be applied to diverse distance quantification issues.
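The core measurement CORSEN automates, the minimal 3D distance from each RNA spot to a compartment marker, reduces to a nearest-neighbour computation once the objects are segmented. A brute-force numpy sketch with hypothetical coordinates (CORSEN itself adds the segmentation, surface extraction, aggregate decomposition and population statistics around this step):

```python
import numpy as np

def min_distances(points_a, points_b):
    """For each 3-D point in points_a (e.g. mRNA spots), return the
    minimal Euclidean distance to any point in points_b (e.g. surface
    voxels of a segmented mitochondrial network). Brute force, O(Na*Nb)."""
    a = np.asarray(points_a, dtype=float)[:, None, :]   # (Na, 1, 3)
    b = np.asarray(points_b, dtype=float)[None, :, :]   # (1, Nb, 3)
    return np.sqrt(((a - b) ** 2).sum(axis=2)).min(axis=1)

# Hypothetical coordinates in micrometres.
mrna = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
mito = np.array([[0.0, 0.0, 0.5], [2.0, 2.0, 2.0]])
print(min_distances(mrna, mito))   # -> [0.5 1.5]
```

Population-level analyses then reduce to collecting these per-spot distances over many cells and comparing their distributions, which is where the statistical machinery the abstract emphasizes comes in.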

  14. Free-energy landscape of protein oligomerization from atomistic simulations

    PubMed Central

    Barducci, Alessandro; Bonomi, Massimiliano; Prakash, Meher K.; Parrinello, Michele

    2013-01-01

    In the realm of protein–protein interactions, the assembly process of homooligomers plays a fundamental role because the majority of proteins fall into this category. A comprehensive understanding of this multistep process requires the characterization of the driving molecular interactions and the transient intermediate species. The latter are often short-lived and thus remain elusive to most experimental investigations. Molecular simulations provide a unique tool to shed light onto these complex processes complementing experimental data. Here we combine advanced sampling techniques, such as metadynamics and parallel tempering, to characterize the oligomerization landscape of fibritin foldon domain. This system is an evolutionarily optimized trimerization motif that represents an ideal model for experimental and computational mechanistic studies. Our results are fully consistent with previous experimental nuclear magnetic resonance and kinetic data, but they provide a unique insight into fibritin foldon assembly. In particular, our simulations unveil the role of nonspecific interactions and suggest that an interplay between thermodynamic bias toward native structure and residual conformational disorder may provide a kinetic advantage. PMID:24248370

  15. Free-energy landscape of protein oligomerization from atomistic simulations.

    PubMed

    Barducci, Alessandro; Bonomi, Massimiliano; Prakash, Meher K; Parrinello, Michele

    2013-12-03

    In the realm of protein-protein interactions, the assembly process of homooligomers plays a fundamental role because the majority of proteins fall into this category. A comprehensive understanding of this multistep process requires the characterization of the driving molecular interactions and the transient intermediate species. The latter are often short-lived and thus remain elusive to most experimental investigations. Molecular simulations provide a unique tool to shed light onto these complex processes complementing experimental data. Here we combine advanced sampling techniques, such as metadynamics and parallel tempering, to characterize the oligomerization landscape of fibritin foldon domain. This system is an evolutionarily optimized trimerization motif that represents an ideal model for experimental and computational mechanistic studies. Our results are fully consistent with previous experimental nuclear magnetic resonance and kinetic data, but they provide a unique insight into fibritin foldon assembly. In particular, our simulations unveil the role of nonspecific interactions and suggest that an interplay between thermodynamic bias toward native structure and residual conformational disorder may provide a kinetic advantage.

  16. Image processing developments and applications for water quality monitoring and trophic state determination

    NASA Technical Reports Server (NTRS)

    Blackwell, R. J.

    1982-01-01

    The application of remote sensing data analysis to water quality monitoring is evaluated. Data analysis and image processing techniques are applied to LANDSAT remote sensing data to produce an effective operational tool for lake water quality surveying and monitoring. Digital image processing and analysis techniques were designed, developed, tested, and applied to LANDSAT multispectral scanner (MSS) data and conventional surface-acquired data. Utilization of these techniques facilitates the surveying and monitoring of large numbers of lakes in an operational manner. Supervised multispectral classification, when used in conjunction with surface-acquired water quality indicators, is used to characterize water body trophic status. Unsupervised multispectral classification, when interpreted by lake scientists familiar with a specific water body, yields classifications of equal validity with supervised methods and in a more cost effective manner. Image data base technology is used to great advantage in characterizing other effects contributing to water quality. These effects include drainage basin configuration, terrain slope, soil, precipitation and land cover characteristics.

  17. Benefits of object-oriented models and ModeliChart: modern tools and methods for the interdisciplinary research on smart biomedical technology.

    PubMed

    Gesenhues, Jonas; Hein, Marc; Ketelhut, Maike; Habigt, Moriz; Rüschen, Daniel; Mechelinck, Mare; Albin, Thivaharan; Leonhardt, Steffen; Schmitz-Rode, Thomas; Rossaint, Rolf; Autschbach, Rüdiger; Abel, Dirk

    2017-04-01

    Computational models of biophysical systems generally constitute an essential component in the realization of smart biomedical technological applications. Typically, the development process of such models is characterized by a great extent of collaboration between different interdisciplinary parties. Furthermore, due to the fact that many underlying mechanisms and the necessary degree of abstraction of biophysical system models are unknown beforehand, the steps of the development process of the application are iteratively repeated when the model is refined. This paper presents some methods and tools to facilitate the development process. First, the principle of object-oriented (OO) modeling is presented and the advantages over classical signal-oriented modeling are emphasized. Second, our self-developed simulation tool ModeliChart is presented. ModeliChart was designed specifically for clinical users and allows independently performing in silico studies in real time including intuitive interaction with the model. Furthermore, ModeliChart is capable of interacting with hardware such as sensors and actuators. Finally, it is presented how optimal control methods in combination with OO models can be used to realize clinically motivated control applications. All methods presented are illustrated on an exemplary clinically oriented use case of the artificial perfusion of the systemic circulation.

  18. Microstructure Modeling of 3rd Generation Disk Alloys

    NASA Technical Reports Server (NTRS)

    Jou, Herng-Jeng

    2010-01-01

    The objective of this program is to model, validate, and predict the precipitation microstructure evolution, using PrecipiCalc (QuesTek Innovations LLC) software, for 3rd generation Ni-based gas turbine disc superalloys during processing and service, with a set of logical and consistent experiments and characterizations. Furthermore, within this program, the originally research-oriented microstructure simulation tool will be further improved and implemented to be a useful and user-friendly engineering tool. In this report, the key accomplishment achieved during the second year (2008) of the program is summarized. The activities of this year include final selection of multicomponent thermodynamics and mobility databases, precipitate surface energy determination from nucleation experiment, multiscale comparison of predicted versus measured intragrain precipitation microstructure in quench samples showing good agreement, isothermal coarsening experiment and interaction of grain boundary and intergrain precipitates, primary microstructure of subsolvus treatment, and finally the software implementation plan for the third year of the project. In the following year, the calibrated models and simulation tools will be validated against an independently developed experimental data set, with actual disc heat treatment process conditions. Furthermore, software integration and implementation will be developed to provide material engineers valuable information in order to optimize the processing of the 3rd generation gas turbine disc alloys.

  19. Assessment of multiple geophysical techniques for the characterization of municipal waste deposit sites

    NASA Astrophysics Data System (ADS)

    Gaël, Dumont; Tanguy, Robert; Nicolas, Marck; Frédéric, Nguyen

    2017-10-01

    In this study, we tested the ability of geophysical methods to characterize a large technical landfill installed in a former sand quarry. The geophysical surveys specifically aimed at delimiting the horizontal extension of the deposit site, at estimating its thickness and at characterizing the waste material composition (the moisture content in the present case). The site delimitation was conducted with electromagnetic (in-phase and out-of-phase) and magnetic (vertical gradient and total field) methods that clearly showed the transition between the waste deposit and the host formation. Regarding waste deposit thickness evaluation, electrical resistivity tomography (ERT) appeared inefficient on this particularly thick deposit site. Thus, we propose a combination of horizontal to vertical noise spectral ratio (HVNSR) and multichannel analysis of surface waves (MASW), which successfully determined the approximate waste deposit thickness in our test landfill. However, ERT appeared to be an appropriate tool to characterize the moisture content of the waste, which provides prior information for the organic waste biodegradation process. The global multi-scale and multi-method geophysical survey offers valuable information for site rehabilitation studies, water content mitigation processes for enhanced biodegradation, or landfill mining operation planning.

  20. Respirometric screening of several types of manure and mixtures intended for composting.

    PubMed

    Barrena, Raquel; Turet, Josep; Busquets, Anna; Farrés, Moisès; Font, Xavier; Sánchez, Antoni

    2011-01-01

    The viability of mixtures of manure and agricultural wastes as composting sources was systematically studied using physicochemical and biological characterization. The combination of different parameters such as C:N ratio, free air space (FAS) and moisture content can help in the formulation of the mixtures. Nevertheless, the composting process may be challenging, particularly at industrial scales. The results of this study suggest that if the respirometric potential is known, it is possible to predict the behaviour of a full scale composting process. Respiration indices can be used as a tool for determining the suitability of composting as applied to manure and complementary wastes. Accordingly, manure and agricultural wastes with a high potential for composting and some proposed mixtures have been characterized in terms of respiration activity. Specifically, the potential of samples to be composted has been determined by means of the oxygen uptake rate (OUR) and the dynamic respirometric index (DRI). During this study, four of these mixtures were composted at full scale in a system consisting of a confined pile with forced aeration. The biological activity was monitored by means of the oxygen uptake rate inside the material (OURinsitu). This new parameter represents the real activity of the process. The comparison between the potential respirometric activities at laboratory scale and the in situ respirometric activity observed at full scale may be a useful tool in the design and optimization of composting systems for manure and other organic agricultural wastes.
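An oxygen uptake rate of the kind used here is typically obtained as the (negative) slope of an oxygen concentration trace recorded by the respirometer. A minimal sketch on synthetic, noise-free data (the units and values are illustrative, not taken from the study):

```python
import numpy as np

# Oxygen uptake rate (OUR) from a respirometric trace: the negative of
# the slope of O2 concentration versus time. Data below are synthetic.
t_h = np.linspace(0.0, 2.0, 25)        # time, h
o2 = 8.0 - 1.6 * t_h                   # dissolved O2, mg O2 per litre
slope, intercept = np.polyfit(t_h, o2, 1)
our = -slope                           # mg O2 / (L h)
print(our)                             # recovers the 1.6 mg/(L h) depletion rate
```

In practice the slope is fitted over a window where depletion is linear, and the rate is normalized by sample volume and dry or volatile solids mass to give a specific index comparable across wastes, as with the OUR and DRI indices above.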

  1. Sperm Cell Population Dynamics in Ram Semen during the Cryopreservation Process

    PubMed Central

    Ramón, Manuel; Pérez-Guzmán, M. Dolores; Jiménez-Rabadán, Pilar; Esteso, Milagros C.; García-Álvarez, Olga; Maroto-Morales, Alejandro; Anel-López, Luis; Soler, Ana J.; Fernández-Santos, M. Rocío; Garde, J. Julián

    2013-01-01

    Background Sperm cryopreservation has become an indispensable tool in biology. Initially, studies were aimed towards the development of efficient freezing protocols in different species that would allow for efficient storage of semen samples for long periods of time, ensuring their viability. Nowadays, it is widely known that an important individual component exists in the cryoresistance of semen, and efforts are aimed at identifying those sperm characteristics that may allow us to predict this cryoresistance. This knowledge would lead, ultimately, to the design of freezing protocols optimized for the sperm characteristics of each male. Methodology/Principal Findings We have evaluated the changes that occur in the sperm head dimensions throughout the cryopreservation process. We have found three different patterns of response, each related to a different sperm quality at thawing. We have been able to characterize males based on these patterns. For each male, the pattern remained constant among different ejaculates. The latter implies that males always respond in the same way to freezing, giving even more importance to this sperm feature. Conclusions/Significance Changes in the sperm head during the cryopreservation process have proven useful for identifying the suitability of males' semen for freezing. We suggest that analyses of these response patterns would represent an important tool to characterize the cryoresistance of males when implemented within breeding programs. We also propose follow-up experiments to examine the outcomes of the use of different freezing protocols depending on the pattern of response of males. PMID:23544054

  2. Characterization and control of fungal morphology for improved production performance in biotechnology.

    PubMed

    Krull, Rainer; Wucherpfennig, Thomas; Esfandabadi, Manely Eslahpazir; Walisko, Robert; Melzer, Guido; Hempel, Dietmar C; Kampen, Ingo; Kwade, Arno; Wittmann, Christoph

    2013-01-20

    Filamentous fungi have been widely applied in industrial biotechnology for many decades. In submerged culture processes, they typically exhibit a complex morphological life cycle that is related to production performance, a link that is of high interest for process optimization. The fungal forms can vary from dense spherical pellets to viscous mycelia. The resulting morphology has been shown to be influenced strongly by process parameters, including power input through stirring and aeration, mass transfer characteristics, pH value, osmolality and the presence of solid micro-particles. The surface properties of fungal spores and hyphae also play a role. Due to their high industrial relevance, the past years have seen a substantial development of tools and techniques to characterize the growth of fungi and obtain quantitative estimates of their morphological properties. Based on the novel insights available from such studies, more recent work has aimed at the precise control of morphology, i.e., morphology engineering, to produce superior bioprocesses with filamentous fungi.

  3. 64nm pitch metal1 double patterning metrology: CD and OVL control by SEMCD, image based overlay and diffraction based overlay

    NASA Astrophysics Data System (ADS)

    Ducoté, Julien; Dettoni, Florent; Bouyssou, Régis; Le-Gratiet, Bertrand; Carau, Damien; Dezauzier, Christophe

    2015-03-01

    Patterning process control of advanced nodes has required major changes over the last few years. Process control needs for critical patterning levels since the 28nm technology node are extremely aggressive, showing that metrology accuracy/sensitivity must be finely tuned. The introduction of pitch splitting (Litho-Etch-Litho-Etch) at the 14nm FDSOI node requires the development of specific metrologies to adopt advanced process control (for CD, overlay and focus corrections). The pitch splitting process leads to final line CD uniformities that are a combination of the CD uniformities of the two exposures, while the space CD uniformities depend on both CD and overlay variability. In this paper, investigations of CD and overlay process control of the 64nm minimum pitch at the Metal1 level of the 14nm FDSOI technology, within the double patterning process flow (litho, hard mask etch, line etch), are presented. Various measurements with SEMCD tools (Hitachi) and overlay tools (KT for Image Based Overlay - IBO, and ASML for Diffraction Based Overlay - DBO) are compared. Metrology targets are embedded within a block instanced several times within the field to characterize intra-field process variations. Specific SEMCD targets were designed for independent measurement of both line CD (A and B) and space CD (A to B and B to A) for each exposure within a single measurement during the DP flow. Based on those measurements, the correlation between overlay determined with SEMCD and with standard overlay tools can be evaluated. Such correlation at different steps through the DP flow is investigated with respect to the metrology type. Process correction models are evaluated with respect to the measurement type and the intra-field sampling.
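The statement that space CD uniformity depends on both CD and overlay variability follows from simple line/space geometry in a pitch split. In the hypothetical 1-D model below (the function and numbers are illustrative, not the paper's targets), A-lines sit at multiples of the single-exposure pitch and B-lines are nominally offset by half a pitch plus the overlay error, so an overlay shift widens one space and narrows the other while line CDs shrink both:

```python
def space_cds(pitch, cd_a, cd_b, overlay):
    """Line/space geometry for a litho-etch-litho-etch (LELE) split.

    A-lines are centred at multiples of `pitch`; B-lines are nominally
    offset by pitch/2 plus an `overlay` error. Returns the two spaces
    (A-to-B and B-to-A) in the same units. Simplified 1-D model.
    """
    space_ab = pitch / 2 + overlay - (cd_a + cd_b) / 2
    space_ba = pitch / 2 - overlay - (cd_a + cd_b) / 2
    return space_ab, space_ba

# 128 nm single-exposure pitch -> 64 nm final pitch; illustrative numbers.
print(space_cds(128.0, 32.0, 32.0, 0.0))   # -> (32.0, 32.0)
print(space_cds(128.0, 32.0, 32.0, 2.0))   # -> (34.0, 30.0)
```

Note that overlay cancels in the sum of the two spaces, which is why measuring both A-to-B and B-to-A spaces in one SEMCD target lets CD and overlay contributions be separated.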

  4. Microseismic monitoring: a tool for reservoir characterization.

    NASA Astrophysics Data System (ADS)

    Shapiro, S. A.

    2011-12-01

    Characterization of the fluid-transport properties of rocks is one of the most important, yet one of the most challenging, goals of reservoir geophysics. There are fundamental difficulties in using active seismic methods to estimate fluid mobility. However, the ability to explore the hydraulic properties of rocks using seismic methods would be very attractive because of their large penetration range and high resolution. Microseismic monitoring of borehole fluid injections is exactly the tool that provides this possibility. Stimulation of rocks by fluid injection belongs to the standard development practice for hydrocarbon and geothermal reservoirs. Production of shale gas and heavy oil, CO2 sequestration, and enhanced recovery of oil and geothermal energy all require broad application of this technology. The fact that fluid injection causes seismicity has been well established for several decades. Observations and data analyses show that seismicity is triggered by different processes, ranging from linear pore pressure diffusion to a non-linear fluid impact on rocks that leads to hydraulic fracturing and strong changes in their structure and permeability. Understanding and monitoring fluid-induced seismicity is necessary for the hydraulic characterization of reservoirs, for assessment of reservoir stimulation, and for controlling the related seismic hazard. This presentation provides an overview of several theoretical, numerical, laboratory and field studies of fluid-induced microseismicity, and it gives an introduction to the principles of seismicity-based reservoir characterization.

  5. Accounting for host cell protein behavior in anion-exchange chromatography.

    PubMed

    Swanson, Ryan K; Xu, Ruo; Nettleton, Daniel S; Glatz, Charles E

    2016-11-01

    Host cell proteins (HCP) are a problematic set of impurities in downstream processing (DSP) as they behave most similarly to the target protein during separation. Approaching DSP with the knowledge of HCP separation behavior would be beneficial for the production of high purity recombinant biologics. Therefore, this work was aimed at characterizing the separation behavior of complex mixtures of HCP during a commonly used method: anion-exchange chromatography (AEX). An additional goal was to evaluate the performance of a statistical methodology, based on the characterization data, as a tool for predicting protein separation behavior. Aqueous two-phase partitioning followed by two-dimensional electrophoresis provided data on the three physicochemical properties most commonly exploited during DSP for each HCP: pI (isoelectric point), molecular weight, and surface hydrophobicity. The protein separation behaviors of two alternative expression host extracts (corn germ and E. coli) were characterized. A multivariate random forest (MVRF) statistical methodology was then applied to the database of characterized proteins creating a tool for predicting the AEX behavior of a mixture of proteins. The accuracy of the MVRF method was determined by calculating a root mean squared error value for each database. This measure never exceeded a value of 0.045 (fraction of protein populating each of the multiple separation fractions) for AEX. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1453-1463, 2016. © 2016 American Institute of Chemical Engineers.
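
    The multivariate random forest step can be pictured with a stand-in: scikit-learn's multi-output RandomForestRegressor trained on the three characterized properties. Everything below (library choice, synthetic data, four elution fractions) is an illustrative assumption, not the authors' MVRF code:

```python
# Sketch of the prediction idea: for each protein, predict the fraction
# eluting in each AEX separation fraction from pI, molecular weight,
# and surface hydrophobicity. Data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(4, 10, 200),    # pI
    rng.uniform(10, 150, 200),  # molecular weight (kDa)
    rng.uniform(0, 1, 200),     # relative surface hydrophobicity
])
# Synthetic multi-output target: distribution over 4 elution fractions
raw = rng.random((200, 4))
Y = raw / raw.sum(axis=1, keepdims=True)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:150], Y[:150])                     # multi-output fit
pred = model.predict(X[150:])                   # (50, 4) fraction profile
rmse = np.sqrt(np.mean((pred - Y[150:]) ** 2))  # paper reports <= 0.045
```

    On real characterization data the paper's reported root mean squared error never exceeded 0.045; the random targets here serve only to show the data shapes involved.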

  6. HTP-OligoDesigner: An Online Primer Design Tool for High-Throughput Gene Cloning and Site-Directed Mutagenesis.

    PubMed

    Camilo, Cesar M; Lima, Gustavo M A; Maluf, Fernando V; Guido, Rafael V C; Polikarpov, Igor

    2016-01-01

    Following burgeoning genomic and transcriptomic sequencing data, biochemical and molecular biology groups worldwide are implementing high-throughput cloning and mutagenesis facilities in order to obtain a large number of soluble proteins for structural and functional characterization. Since manual primer design can be a time-consuming and error-generating step, particularly when working with hundreds of targets, the automation of the primer design process becomes highly desirable. HTP-OligoDesigner was created to provide the scientific community with a simple and intuitive online primer design tool for both laboratory-scale and high-throughput projects of sequence-independent gene cloning and site-directed mutagenesis, as well as a Tm calculator for quick queries.
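
    The abstract does not spell out which Tm formula the calculator uses, so here is a minimal sketch in the spirit of such a quick query, using the classical Wallace rule (an approximation valid for short oligos of roughly 14-20 nt; this is an assumed stand-in, not HTP-OligoDesigner's method):

```python
# Wallace rule for short primers: Tm = 2*(A+T) + 4*(G+C), in deg C.
def wallace_tm(primer: str) -> int:
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

print(wallace_tm("ATGCATGCATGCATGC"))  # 8 AT + 8 GC -> 48
```

    Production tools typically use nearest-neighbor thermodynamics with salt corrections for longer primers, but the rule above shows why GC content dominates a quick estimate.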

  7. A software technology evaluation program

    NASA Technical Reports Server (NTRS)

    Novaes-Card, David N.

    1985-01-01

    A set of quantitative approaches is presented for evaluating software development methods and tools. The basic idea is to generate a set of goals which are refined into quantifiable questions which specify metrics to be collected on the software development and maintenance process and product. These metrics can be used to characterize, evaluate, predict, and motivate. They can be used in an active as well as a passive way, by learning from analyzing the data and improving the methods and tools based upon what is learned from that analysis. Several examples were given representing each of the different approaches to evaluation. The cost of the approaches varied inversely with the level of confidence in the interpretation of the results.

  8. Chromatibody, a novel non-invasive molecular tool to explore and manipulate chromatin in living cells

    PubMed Central

    Jullien, Denis; Vignard, Julien; Fedor, Yoann; Béry, Nicolas; Olichon, Aurélien; Crozatier, Michèle; Erard, Monique; Cassard, Hervé; Ducommun, Bernard; Salles, Bernard

    2016-01-01

    Chromatin function is involved in many cellular processes, and its visualization or modification is essential in many developmental or cellular studies. Here, we present the characterization of chromatibody, a chromatin-binding single-domain antibody, and explore its use in living cells. This non-intercalating tool specifically binds the heterodimer of H2A–H2B histones and displays a versatile reactivity, specifically labeling chromatin from yeast to mammals. We show that this genetically encoded probe, when fused to fluorescent proteins, allows non-invasive real-time chromatin imaging. Chromatibody is a dynamic chromatin probe that can be modulated. Finally, chromatibody is an efficient tool for targeting an enzymatic activity to the nucleosome, such as DNA damage-dependent H2A ubiquitylation, which can modify this epigenetic mark at the scale of the genome and result in DNA damage signaling and repair defects. Taken together, these results identify chromatibody as a universal non-invasive tool for in vivo chromatin imaging or for manipulating the chromatin landscape. PMID:27206857

  9. A Data-Driven Framework for Incorporating New Tools for ...

    EPA Pesticide Factsheets

    This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It outlined knowledge gained from the data provided using these high-throughput tools to assess chemical bioactivity and to predict chemical exposures and also identified future needs. It provided an opportunity to showcase ongoing research efforts within the National Exposure Research Laboratory and the National Center for Computational Toxicology within the Office of Research and Development to an international audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  10. Design and quasi-static characterization of SMASH (SMA stabilizing handgrip)

    NASA Astrophysics Data System (ADS)

    Pathak, Anupam; Brei, Diann; Luntz, Jonathan; LaVigna, Chris; Kwatny, Harry

    2007-04-01

    Due to physiologically induced body tremors, there is a need for active stabilization in many hand-held devices such as surgical tools, optical equipment (cameras), manufacturing tools, and small arms. While active stabilization has been achieved with electromagnetic and piezoceramic actuators for cameras and surgical equipment, the hostile environment and larger loads of manufacturing and battlefield settings make these approaches unsuitable. Shape Memory Alloy (SMA) actuators can alleviate these limitations with their large force/stroke generation, smaller size, lower weight, and increased ruggedness. This paper presents the actuator design and quasi-static characterization of an SMA Stabilizing Handgrip (SMASH). SMASH is an antagonistically SMA-actuated, two degree-of-freedom stabilizer for disturbances in the elevation and azimuth directions. The design of a SMASH for a given application is challenging because of the difficulty of accurately modeling system loads such as friction and the unknown shakedown behavior of the SMA material (which depends on the system loads). Thus, an iterative empirical design process is introduced that provides a method to estimate system loads, an SMA shakedown procedure using the system loads to reduce material creep, and a final selection and prediction of the full SMASH system performance. As a means to demonstrate this process, a SMASH was designed, built, and experimentally characterized for the extreme case study of small arms stabilization for a US Army M16 rifle. This study successfully demonstrated the new SMASH technology along with the unique design procedure, which can be applied to small arms as well as a variety of other hand-held devices.

  11. Variable mass pendulum behaviour processed by wavelet analysis

    NASA Astrophysics Data System (ADS)

    Caccamo, M. T.; Magazù, S.

    2017-01-01

    The present work highlights how wavelet analysis can be an effective tool for characterizing the motion of a variable mass pendulum, furnishing information on the time evolution of the oscillation's spectral content. In particular, the wavelet transform is applied to the motion of a hung funnel that loses fine sand at an exponential rate; it is shown how, in contrast to the Fourier transform, which furnishes only an average frequency value for the motion, the wavelet approach makes it possible to perform a joint time-frequency analysis. The work is addressed to undergraduate and graduate students.

  12. Development of a Fiber Laser Welding Capability for the W76, MC4702 Firing Set

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samayoa, Jose

    2010-05-12

    Development work to implement a new welding system for a Firing Set is presented. The new system is significant because it represents the first use of fiber laser welding technology at the KCP. The work used Six Sigma tools for weld characterization and to define process performance. Workable weld parameters were determined and compared against those of existing equipment. Existing waveforms were replicated using an Arbitrary Pulse Generator (APG) to modulate the fiber laser's exclusively continuous-wave (CW) output. Fiber laser weld process capability for a Firing Set is demonstrated.

  13. Research into software executives for space operations support

    NASA Technical Reports Server (NTRS)

    Collier, Mark D.

    1990-01-01

    Research concepts pertaining to a software (workstation) executive which will support a distributed processing command and control system characterized by high-performance graphics workstations used as computing nodes are presented. Although a workstation-based distributed processing environment offers many advantages, it also introduces a number of new concerns. In order to solve these problems, allow the environment to function as an integrated system, and present a functional development environment to application programmers, it is necessary to develop an additional layer of software. This 'executive' software integrates the system, provides real-time capabilities, and provides the tools necessary to support the application requirements.

  14. Characterization and production of multifunctional cationic peptides derived from rice proteins.

    PubMed

    Taniguchi, Masayuki; Ochiai, Akihito

    2017-04-01

    Food proteins have been identified as a source of bioactive peptides. These peptides are inactive within the sequence of the parent protein and must be released during gastrointestinal digestion, fermentation, or food processing. Among bioactive peptides, multifunctional cationic peptides are more useful for promoting health and/or treating disease than peptides with a single specific activity. We have identified and characterized cationic peptides from rice enzymes and proteins that possess multiple functions, including antimicrobial, endotoxin-neutralizing, arginine gingipain-inhibitory, and/or angiogenic activities. In particular, we have elucidated the contribution of the cationic amino acids (arginine and lysine) in these peptides to their bioactivities. Further, we have discussed the critical parameters, particularly proteinase preparations and fractionation or purification, in the enzymatic hydrolysis process for producing bioactive peptides from food proteins. Using an ampholyte-free isoelectric focusing (autofocusing) technique as a tool for fractionation, we successfully prepared fractions containing cationic peptides with multiple functions.

  15. National Water-Quality Assessment (NAWQA) area-characterization toolbox

    USGS Publications Warehouse

    Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  16. Rational development of solid dispersions via hot-melt extrusion using screening, material characterization, and numeric simulation tools.

    PubMed

    Zecevic, Damir E; Wagner, Karl G

    2013-07-01

    Effective and predictive small-scale selection tools are indispensable during the development of a solubility-enhanced drug product. For hot-melt extrusion, this selection process can start with a microscale performance evaluation on a hot-stage microscope (HSM). A batch size of 400 mg can provide sufficient material to assess drug product attributes such as solid-state properties, solubility enhancement, and physical stability, as well as process-related attributes such as the processing temperature in a twin-screw extruder (TSE). Prototype formulations are then fed into a 5 mm TSE (~1-2 g) to confirm the HSM performance under additional shear stress. Small stress stability testing can be performed with these samples or with a larger batch (20-40 g) made on a 9 or 12 mm TSE. Simultaneously, numeric process simulations are performed using process data as well as the rheological and thermal properties of the formulations. Further scale-up work to 16 and 18 mm TSEs confirmed and refined the simulation model. Thus, at the end of laboratory-scale development, not only can the clinical trial supply be manufactured, but a sound risk assessment can also be formed to support further scale-up, even without decades of process experience. Copyright © 2013 Wiley Periodicals, Inc.

  17. Optimization of preservation and processing of sea anemones for microbial community analysis using molecular tools.

    PubMed

    Rocha, Joana; Coelho, Francisco J R C; Peixe, Luísa; Gomes, Newton C M; Calado, Ricardo

    2014-11-11

    For several years, knowledge of the microbiome associated with marine invertebrates was impaired by the challenges associated with the characterization of bacterial communities. With the advent of culture-independent molecular tools, it is possible to gain new insights into the diversity and richness of microorganisms associated with marine invertebrates. In the present study, we evaluated whether different preservation and processing methodologies (prior to DNA extraction) affect the bacterial diversity retrieved from the snakelocks anemone Anemonia viridis. Denaturing gradient gel electrophoresis (DGGE) community fingerprints were used as a proxy to determine the bacterial diversity retrieved (H'). Statistical analyses indicated that preservation significantly affects H'. The best approach to preserve and process A. viridis biomass for bacterial community fingerprint analysis was flash freezing in liquid nitrogen (preservation) followed by the use of a mechanical homogenizer (processing), as it consistently yielded higher H'. Alternatively, biomass samples can be processed fresh, followed by cell lysis using a mechanical homogenizer or mortar & pestle. The suitability of these two alternative procedures was further reinforced by quantification of the 16S rRNA gene; no significant differences were recorded when comparing them with the use of liquid nitrogen followed by processing with a mechanical homogenizer.
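
    The H' used here as a proxy is the Shannon diversity index computed over relative DGGE band intensities; a minimal sketch with made-up band intensities (not the study's data):

```python
# Shannon diversity H' = -sum(p_i * ln p_i) over relative band
# intensities p_i from a DGGE fingerprint.
import math

def shannon_diversity(intensities):
    total = sum(intensities)
    props = [i / total for i in intensities if i > 0]
    return -sum(p * math.log(p) for p in props)

flash_frozen = [12, 9, 7, 6, 5, 4, 3, 3]   # more, more evenly sized bands
other_prep   = [30, 5, 2, 1]               # fewer bands, one dominant
assert shannon_diversity(flash_frozen) > shannon_diversity(other_prep)
```

    More bands and a more even intensity distribution both push H' up, which is why the index is a convenient single-number proxy for the diversity each preservation method retains.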

  18. Optimization of preservation and processing of sea anemones for microbial community analysis using molecular tools

    PubMed Central

    Rocha, Joana; Coelho, Francisco J. R. C.; Peixe, Luísa; Gomes, Newton C. M.; Calado, Ricardo

    2014-01-01

    For several years, knowledge of the microbiome associated with marine invertebrates was impaired by the challenges associated with the characterization of bacterial communities. With the advent of culture-independent molecular tools, it is possible to gain new insights into the diversity and richness of microorganisms associated with marine invertebrates. In the present study, we evaluated whether different preservation and processing methodologies (prior to DNA extraction) affect the bacterial diversity retrieved from the snakelocks anemone Anemonia viridis. Denaturing gradient gel electrophoresis (DGGE) community fingerprints were used as a proxy to determine the bacterial diversity retrieved (H′). Statistical analyses indicated that preservation significantly affects H′. The best approach to preserve and process A. viridis biomass for bacterial community fingerprint analysis was flash freezing in liquid nitrogen (preservation) followed by the use of a mechanical homogenizer (processing), as it consistently yielded higher H′. Alternatively, biomass samples can be processed fresh, followed by cell lysis using a mechanical homogenizer or mortar & pestle. The suitability of these two alternative procedures was further reinforced by quantification of the 16S rRNA gene; no significant differences were recorded when comparing them with the use of liquid nitrogen followed by processing with a mechanical homogenizer. PMID:25384534

  19. Investigation of the influence of process parameters on adhesive wear under hot stamping conditions

    NASA Astrophysics Data System (ADS)

    Schwingenschlögl, P.; Weldi, M.; Merklein, M.

    2017-09-01

    Current challenges such as increasing safety standards and reducing fuel consumption motivate lightweight construction in modern car bodies. Besides using lightweight workpiece materials such as aluminum, hot stamping has been established as a key technology for producing safety-relevant components. Producing hot stamped parts from ultra-high-strength steels offers the possibility to improve crash performance; at the same time, the weight of the car structure is reduced by using thinner sheet thicknesses. In order to avoid oxide scale formation and ensure corrosion protection, AlSi coatings are commonly deposited on the sheet surfaces used for direct hot stamping. This workpiece coating has a critical impact on the tribological conditions within the forming process and, as a consequence, influences the quality of hot stamped parts as well as tool wear. AlSi coatings have been identified as a major cause of adhesive wear, the main wear mechanism in hot stamping. Within this study, the influence of the process parameters on adhesive wear is investigated as a function of workpiece and tool temperatures, drawing velocities, and contact pressures. The tribological behavior is analyzed based on strip drawing experiments under direct hot stamping conditions. The experiments are performed with AlSi-coated 22MnB5 in contact with the hot-working tool steel 1.2367. To analyze the amount of adhesion on the friction jaws, the surfaces are characterized by optical measurements. The experiments indicate that higher workpiece temperatures cause severe adhesive wear on the tool surface, while an increase in drawing velocity or contact pressure leads to reduced adhesion. The measured friction coefficients decreased with a rising amount of adhesion and remained at a constant level after a certain adhesive layer had built up on the tool surface.

  20. Intelligent multi-sensor integrations

    NASA Technical Reports Server (NTRS)

    Volz, Richard A.; Jain, Ramesh; Weymouth, Terry

    1989-01-01

    Growth in the intelligence of space systems requires the use and integration of data from multiple sensors. Generic tools are being developed for extracting and integrating information obtained from multiple sources. The full spectrum of issues is addressed, ranging from data acquisition, to characterization of sensor data, to adaptive systems for utilizing the data. In particular, there are three major aspects to the project: multisensor processing, an adaptive approach to object recognition, and distributed sensor system integration.

  1. Nanoparticle exposure biomonitoring: exposure/effect indicator development approaches

    NASA Astrophysics Data System (ADS)

    Marie-Desvergne, C.; Dubosson, M.; Lacombe, M.; Brun, V.; Mossuz, V.

    2015-05-01

    The use of engineered nanoparticles (NP) is increasingly widespread in various industrial sectors. The inhalation route of exposure is a particular concern (cf. the adverse effects of ultrafine-particle air pollution and of asbestos). No NP biomonitoring recommendations or standards are available so far. The LBM laboratory is currently studying several approaches to developing bioindicators for occupational health applications. As regards exposure indicators, new tools are being implemented to assess potentially inhaled NP in non-invasive respiratory samples (nasal sampling and exhaled breath condensates (EBC)). Diverse NP analytical characterization methods are used (ICP-MS, dynamic light scattering, and electron microscopy coupled with energy-dispersive X-ray analysis). As regards effect indicators, a methodology has been developed to assess a panel of 29 cytokines in EBCs (potential respiratory inflammation due to NP exposure). Secondly, a collaboration between the LBM laboratory and the EDyp team has allowed the EBC proteome to be characterized by LC-MS/MS. These projects are expected to facilitate the development of individual NP exposure biomonitoring tools and the analysis of early potential impacts on health. Innovative techniques such as field-flow fractionation combined with ICP-MS and single-particle ICP-MS are currently being explored. These tools are directly intended to assist occupational physicians in identifying exposure situations.

  2. Acoustic/seismic signal propagation and sensor performance modeling

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Marlin, David H.; Mackay, Sean

    2007-04-01

    Performance, optimal employment, and interpretation of data from acoustic and seismic sensors depend strongly and in complex ways on the environment in which they operate. Software tools for guiding non-expert users of acoustic and seismic sensors are therefore much needed. However, such tools require that many individual components be constructed and correctly connected together. These components include the source signature and directionality, representation of the atmospheric and terrain environment, calculation of the signal propagation, characterization of the sensor response, and mimicking of the data processing at the sensor. Selection of an appropriate signal propagation model is particularly important, as there are significant trade-offs between output fidelity and computation speed. Attenuation of signal energy, random fading, and (for array systems) variations in wavefront angle-of-arrival should all be considered. Characterization of the complex operational environment is often the weak link in sensor modeling: important issues for acoustic and seismic modeling activities include the temporal/spatial resolution of the atmospheric data, knowledge of the surface and subsurface terrain properties, and representation of ambient background noise and vibrations. Design of software tools that address these challenges is illustrated with two examples: a detailed target-to-sensor calculation application called the Sensor Performance Evaluator for Battlefield Environments (SPEBE) and a GIS-embedded approach called Battlefield Terrain Reasoning and Awareness (BTRA).
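
    To illustrate the fidelity/speed trade-off in the propagation component, the cheapest usable model is spherical spreading plus linear atmospheric absorption; the sketch below is a generic textbook model with assumed numbers, not SPEBE or BTRA code:

```python
# Simplest acoustic propagation term: transmission loss (dB) from
# spherical spreading plus linear atmospheric absorption.
import math

def transmission_loss_db(r_m, alpha_db_per_km=2.0, r_ref=1.0):
    """TL = 20 log10(r / r_ref) + alpha * r (alpha in dB/km)."""
    return 20 * math.log10(r_m / r_ref) + alpha_db_per_km * r_m / 1000.0

# A 100 dB (re 1 m) source heard at 2 km arrives near 30 dB:
received = 100.0 - transmission_loss_db(2000.0)
```

    Higher-fidelity models add refraction by wind and temperature gradients, ground impedance, and turbulence-induced fading, at a steep cost in computation time, which is exactly the trade-off the abstract describes.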

  3. Final Report: Hot Carrier Collection in Thin Film Silicon with Tailored Nanocrystalline/Amorphous Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Reuben T.

    This project developed, characterized, and perfected a new type of highly tunable nanocrystalline silicon (nc-Si:H) incorporating quantum confined silicon nanoparticles (SiNPs). A dual-zone deposition process and system were developed and demonstrated. The deposition of the SiNPs, the amorphous phase, and the co-deposited material was characterized and optimized. Material design and interpretation of results were guided by new theoretical tools that examined both the electronic structure and the carrier dynamics of this hybrid material. Heterojunction and p-i-n solar cells were demonstrated and characterized. Photo-thin-film transistors allowed mobility to be studied as a function of SiNP density in the films. Rapid (hot) transfer of carriers from the amorphous matrix to the quantum confined SiNPs was observed and connected to reduced photo-degradation. The results carry quantum confined Si dots from a novelty to materials that can be harnessed for PV and optoelectronic applications. The growth process is broadly extendable, with alternative amorphous matrices, novel layered structures, and alternative NPs easily accessible. The hot carrier effects hold potential for third-generation photovoltaics.

  4. Characterization of Chinese liquor aroma components during aging process and liquor age discrimination using gas chromatography combined with multivariable statistics

    NASA Astrophysics Data System (ADS)

    Xu, M. L.; Yu, Y.; Ramaswamy, H. S.; Zhu, S. M.

    2017-01-01

    Chinese liquor aroma components were characterized during the aging process using gas chromatography (GC). Principal component and cluster analysis (PCA, CA) were used to discriminate the age of Chinese liquor, which has great economic value. Of a total of 21 major aroma components identified and quantified, 13 components, including several acids, alcohols, esters, aldehydes and furans, decreased significantly in the first year of aging, maintained the same levels (p > 0.05) for the next three years, and decreased again (p < 0.05) in the fifth year. On the contrary, a significant increase was observed in propionic acid, furfural and phenylethanol. Ethyl lactate was found to be the most stable aroma component during the aging process. Results of PCA and CA demonstrated that young (fresh) and aged liquors were well separated from each other, which is consistent with the evolution of the aroma components along the aging process. These findings provide a quantitative basis for discriminating the age of Chinese liquor, a scientific basis for further research on the liquor aging process, and a possible tool to guard against counterfeit and defective products.
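
    The PCA/CA discrimination step can be sketched as follows; the data are synthetic stand-ins shaped like the abstract's findings (13 components falling with age, 3 rising), not the paper's measurements, and scikit-learn is an assumed tool choice:

```python
# PCA on aroma-component concentrations separates age groups, which a
# clustering step (here KMeans standing in for cluster analysis) recovers.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# 21 aroma components for 10 young and 10 aged samples; aging shifts
# a subset of components, mimicking the abstract's description
young = rng.normal(1.0, 0.05, (10, 21))
aged = young.copy()
aged[:, :13] *= 0.7        # 13 components decrease with age
aged[:, 13:16] *= 1.5      # e.g. propionic acid, furfural, phenylethanol
X = np.vstack([young, aged])

scores = PCA(n_components=2).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
# Young and aged samples fall into two distinct clusters in PC space
```

    With real GC data one would standardize the components first; the point is that the age-dependent shifts dominate the leading principal components, which is what makes the separation in the paper visible.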

  5. Surface modification of hydroturbine steel using friction stir processing

    NASA Astrophysics Data System (ADS)

    Grewal, H. S.; Arora, H. S.; Singh, H.; Agrawal, A.

    2013-03-01

    Friction stir processing (FSP) has proved to be a viable tool for enhancing the mechanical properties of materials; however, the major focus has been on improving the bulk properties of light metals and their alloys. Hydroturbines are susceptible to damage owing to slurry and cavitation erosion. In this study, FSP of a commonly employed hydroturbine steel, 13Cr4Ni, was undertaken. Microstructural characterization of the processed steel was conducted using optical microscopy (OM), scanning electron microscopy (SEM) equipped with energy-dispersive spectroscopy (EDS), X-ray diffraction (XRD), and electron backscatter diffraction (EBSD) techniques. Mechanical characterization of the steel was undertaken in terms of microhardness and resistance to cavitation erosion (CE). FSP resulted in refinement of the microstructure, with a reduction in grain size by a factor of 10. EBSD results confirmed the existence of a submicron and ultrafine-grained microstructure. The microhardness of the steel was enhanced by 2.6 times after processing. The processed steel also showed 2.4 times higher resistance to cavitation erosion in comparison with the unprocessed steel. The primary erosion mechanism for both steels was identical in nature, with plastic deformation responsible for the loss of material.

  6. Formulation and Characterization of Solid Dispersion Prepared by Hot Melt Mixing: A Fast Screening Approach for Polymer Selection

    PubMed Central

    Enose, Arno A.; Dasan, Priya K.; Sivaramakrishnan, H.; Shah, Sanket M.

    2014-01-01

    Solid dispersion is a molecular dispersion of a drug in a polymer matrix, which leads to improved solubility and hence better bioavailability. A solvent evaporation technique was employed to prepare films of different combinations of polymers, plasticizer, and a model drug, sulindac, to narrow down to a few polymer-plasticizer-sulindac combinations. The sulindac-polymer-plasticizer combination that was stable with good film-forming properties was processed by hot melt mixing, a technique close to hot melt extrusion, to predict its behavior in a hot melt extrusion process. Hot melt mixing is not a substitute for hot melt extrusion but an aid in predicting the formation of a molecularly dispersed form of a given drug-polymer-plasticizer combination in a hot melt extrusion process. The formulations were characterized by advanced techniques such as optical microscopy, differential scanning calorimetry, hot stage microscopy, dynamic vapor sorption, and X-ray diffraction. Subsequently, the best drug-polymer-plasticizer combination obtained by hot melt mixing was subjected to the hot melt extrusion process to validate the usefulness of hot melt mixing as a predictive tool for hot melt extrusion. PMID:26556187

  7. Scattering effects of machined optical surfaces

    NASA Astrophysics Data System (ADS)

    Thompson, Anita Kotha

    1998-09-01

    Optical fabrication is one of the most labor-intensive industries in existence. Lensmakers use pitch to affix glass blanks to metal chucks that hold the glass as they grind it with tools that have not changed much in fifty years. Recent demands placed on traditional optical fabrication processes in terms of surface accuracy, smoothness, and cost effectiveness have resulted in the exploitation of precision machining technology to develop a new generation of computer numerically controlled (CNC) optical fabrication equipment. This new kind of precision machining process is called deterministic microgrinding. The most conspicuous feature of optical surfaces manufactured by precision machining processes (such as single-point diamond turning or deterministic microgrinding) is the presence of residual cutting tool marks. These residual tool marks exhibit a highly structured topography of periodic azimuthal or radial deterministic marks in addition to random microroughness. These distinct topographic features give rise to surface scattering effects that can significantly degrade optical performance. In this dissertation project we investigate the scattering behavior of machined optical surfaces and their imaging characteristics. In particular, we characterize the residual optical fabrication errors and relate the resulting scattering behavior to the tool and machine parameters in order to evaluate and improve the deterministic microgrinding process. The investigation of scattering behavior also yields the optical fabrication tolerances necessary to satisfy specific image quality requirements. Optical fabrication tolerances are a major cost driver for any precision optical manufacturing technology.
The derivation and control of the optical fabrication tolerances necessary for different applications and operating wavelength regimes will play a unique and central role in establishing deterministic microgrinding as a preferred and a cost-effective optical fabrication process. Other well understood optical fabrication processes will also be reviewed and a performance comparison with the conventional grinding and polishing technique will be made to determine any inherent advantages in the optical quality of surfaces produced by other techniques.

  8. Etna_NETVIS: A dedicated tool for automatically pre-processing high frequency data useful to extract geometrical parameters and track the evolution of the lava field

    NASA Astrophysics Data System (ADS)

    Marsella, Maria; Junior Valentino D'Aranno, Peppe; De Bonis, Roberto; Nardinocchi, Carla; Scifoni, Silvia; Scutti, Marianna; Sonnessa, Alberico; Wahbeh, Wissam; Biale, Emilio; Coltelli, Mauro; Pecora, Emilio; Prestifilippo, Michele; Proietti, Cristina

    2016-04-01

    In volcanic areas, where access to the most critical zones for direct surveys can be difficult, digital photogrammetry techniques are rarely applied, although in many cases they have proved to have remarkable potential, such as the possibility of following the evolution of volcanic processes (fracturing, vent positions, lava fields, lava front positions) and deformation processes (inflation/deflation and instability phenomena induced by volcanic activity). These results can be obtained, in the framework of standard surveillance activities, by acquiring multi-temporal datasets including Digital Orthophotos (DO) and Digital Elevation Models (DEM) to be used for a quantitative and comparative analysis. The frequency of the surveys can be intensified during emergency phases to implement quasi-real-time monitoring in support of civil protection actions. The high level of accuracy and the short time required for image processing make digital photogrammetry a suitable tool for monitoring the evolution of volcanic processes, which are usually characterized by large and rapid mass displacements. In order to optimize and extend the existing permanent ground NEtwork of Thermal and VIsible Sensors located on Mt. Etna (Etna_NETVIS) and to improve the observation of the most active areas, an approach for monitoring syn-eruptive surface processes was implemented. A dedicated tool for automatically pre-processing high-frequency data, useful for extracting geometrical parameters and tracking the evolution of the lava field, was developed and tested in both simulated and real scenarios. The tool extracts a coherent multi-temporal dataset of orthophotos useful for evaluating the active flow area and estimating effusion rates. Furthermore, Etna_NETVIS data were used to downscale the information derived from satellite data and/or to integrate the satellite datasets in case of incomplete coverage or missing acquisitions.
This work was developed in the framework of the EU-FP7 project "MED-SUV" (MEDiterranean SUpersite Volcanoes).

  9. Systems engineering medicine: engineering the inflammation response to infectious and traumatic challenges

    PubMed Central

    Parker, Robert S.; Clermont, Gilles

    2010-01-01

    The complexity of the systemic inflammatory response and the lack of a therapeutic breakthrough in the treatment of pathogenic infection demand that advanced tools be brought to bear on severe sepsis and trauma. Systems medicine, the translational science counterpart to basic science's systems biology, is the interface at which these tools may be constructed. Rapid initial strides in improving sepsis treatment are possible through the use of phenomenological modelling and optimization tools for process understanding and device design. Higher-impact, and more generalizable, treatment designs are based on mechanistic understanding developed through the use of physiologically based models, characterization of population variability, and the use of control-theoretic systems engineering concepts. In this review we introduce acute inflammation and sepsis as an example of just one area that is currently underserved by the systems medicine community, and, therefore, an area in which contributions of all types can be made. PMID:20147315

  10. Microstructure and Mechanical Characterization of Friction-Stir-Welded Dual-Phase Brass

    NASA Astrophysics Data System (ADS)

    Ramesh, R.; Dinaharan, I.; Akinlabi, E. T.; Murugan, N.

    2018-03-01

    Friction stir welding (FSW) is an ideal process for joining brass because it avoids the evaporation of zinc. In the present investigation, 6-mm-thick dual-phase brass plates were joined efficiently using FSW at various tool rotational speeds. The microstructures were studied using optical microscopy, electron backscatter diffraction and transmission electron microscopy. The optical micrographs revealed the evolution of various zones across the joint line. The microstructure of the heat-affected zone was similar to that of the base metal. The weld zone exhibited finer grains due to dynamic recrystallization. The recrystallization was inhomogeneous, and the inhomogeneity decreased with increased tool rotational speed. The dual phase was preserved in the weld zone due to the retention of zinc. The severe plastic deformation introduced a high density of dislocations in the weld zone, which was strengthened after welding. The role of tool rotational speed in the joint strength is further reported.

  11. Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging

    PubMed Central

    Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.

    2015-01-01

    Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for the detection of coronary artery disease (CAD), and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual energy CT (DECT), spectral CT and CT-based molecular imaging. By harnessing these advances, cardiac CT has moved beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion and even probing of molecular processes involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288

  12. THE DURABILITY OF LARGE-SCALE ADDITIVE MANUFACTURING COMPOSITE MOLDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Post, Brian K; Love, Lonnie J; Duty, Chad

    2016-01-01

    Oak Ridge National Laboratory's Big Area Additive Manufacturing (BAAM) technology permits the rapid production of thermoplastic composite molds using a carbon-fiber-filled acrylonitrile butadiene styrene (ABS) thermoplastic. Demonstration tools (0.965 m × 0.559 m × 0.152 m) for composite part fabrication have been printed, coated, and finished with a traditional tooling gel. We present validation results demonstrating the stability of thermoplastic printed molds for room-temperature Vacuum Assisted Resin Transfer Molding (VARTM) processes. Arkema's Elium thermoplastic resin was investigated with a variety of reinforcement materials. Experimental results include dimensional characterization of the tool surface using laser scanning techniques following demolding of 10 parts. Thermoplastic composite molds offer rapid production compared to traditionally built thermoset molds in that near-net deposition allows direct digital production of the net geometry at a production rate of 45 kg/h.

  14. Overview and development of EDA tools for integration of DSA into patterning solutions

    NASA Astrophysics Data System (ADS)

    Torres, J. Andres; Fenger, Germain; Khaira, Daman; Ma, Yuansheng; Granik, Yuri; Kapral, Chris; Mitra, Joydeep; Krasnova, Polina; Ait-Ferhat, Dehia

    2017-03-01

    Directed self-assembly (DSA) is the method by which a self-assembling polymer is forced to follow a desired geometry defined or influenced by a guiding pattern. Such a guiding pattern uses surface potentials, confinement, or both to achieve polymer configurations that result in circuit-relevant topologies, which can be patterned onto a substrate. Chemo- and grapho-epitaxy of line-and-space structures are now routinely inspected at full-wafer level to understand the defectivity limits of the materials and their maximum resolution. In the same manner, there is a deeper understanding of the formation of cylinders using grapho-epitaxy processes. Academia has also contributed by developing methods that help reduce the number of masks in advanced nodes by "combining" DSA-compatible groups, thus reducing the total cost of the process. From the point of view of EDA, new tools are required when a technology is adopted, and most technologies are adopted when they show a clear cost-benefit over alternative techniques. In addition, years of EDA development have led to the creation of very flexible toolkits that permit rapid prototyping and evaluation of new process alternatives. With the development of high-chi materials, the move away from the well-characterized PS-PMMA systems, and novel integrations in the substrates that work in tandem with diblock copolymer systems, it is necessary to assess any new requirements that may or may not need custom tools. Hybrid DSA processes (which contain both chemo and grapho elements) are currently being investigated as possible contenders for sub-5 nm process techniques. Because such processes permit the redistribution of discontinuities in the regular arrays between the substrate and a cut operation, they have the potential to extend the number of applications for DSA.
This paper explains why some DSA processes can be supported by existing rules and technology, while others require the development of highly customized correction tools and models. It also shows that developing DSA cannot be done in isolation: it requires the full collaboration of EDA, materials suppliers, manufacturing equipment, metrology, and electronics manufacturers.

  15. High-throughput electrical measurement and microfluidic sorting of semiconductor nanowires.

    PubMed

    Akin, Cevat; Feldman, Leonard C; Durand, Corentin; Hus, Saban M; Li, An-Ping; Hui, Ho Yee; Filler, Michael A; Yi, Jingang; Shan, Jerry W

    2016-05-24

    Existing nanowire electrical characterization tools are not only expensive and dependent on sophisticated facilities, but far too slow to enable statistical characterization of highly variable samples. They are also generally not compatible with further sorting and processing of nanowires. Here, we demonstrate a high-throughput, solution-based electro-orientation-spectroscopy (EOS) method, which is capable of automated electrical characterization of individual nanowires by direct optical visualization of their alignment behavior under spatially uniform electric fields of different frequencies. We demonstrate that EOS can quantitatively characterize the electrical conductivities of nanowires over a six-order-of-magnitude range (10^-5 to 10 S m^-1, corresponding to typical carrier densities of 10^10 to 10^16 cm^-3), with different fluids used to suspend the nanowires. By implementing EOS in a simple microfluidic device, continuous electrical characterization is achieved, and the sorting of nanowires is demonstrated as a proof of concept. With measurement speeds two orders of magnitude faster than direct-contact methods, the automated EOS instrument enables for the first time the statistical characterization of highly variable 1D nanomaterials.

  16. On the performances and wear of WC-diamond like carbon coated tools in drilling of CFRP/Titanium stacks

    NASA Astrophysics Data System (ADS)

    Boccarusso, L.; Durante, M.; Impero, F.; Minutolo, F. Memola Capece; Scherillo, F.; Squillace, A.

    2016-10-01

    The use of hybrid structures made of CFRP and titanium alloys has grown considerably in recent years in the aerospace industry due to their high strength-to-weight ratio. Because of the very different characteristics of these materials, mechanical fastening represents the most effective joining technique, and as a consequence the drilling process plays a key role in assembly. One-shot drilling, i.e., drilling the stack of the two materials in a single operation, seems to be the best option in terms of both time saving and assembly accuracy. Nevertheless, due to the considerably different machinability of fiber-reinforced plastics and metallic materials, one-shot drilling is a critical process for both hole quality and tool wear. This research studies the effectiveness of new-generation tools in the drilling of CFRP/titanium stacks. The tools are made of sintered tungsten carbide (WC) grains in a cobalt binder, coated with diamond-like carbon (DLC), and characterized by a patented geometry; they differ mainly in parent WC grain size and binder percentage. Both the cutting forces and the wear phenomena were investigated, and the results were analyzed as a function of the number of holes and their quality. The results show a clear increase of the cutting forces with the number of holes for all the drilling tools used. Moreover, abrasive wear phenomena initially affecting the tools' coating layer were observed.

  17. Quantifiable and objective approach to organizational performance enhancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholand, Andrew Joseph; Tausczik, Yla R.

    This report describes a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to identify socially situated relationships between individuals which, though subtle, are highly influential. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized. This report outlines the philosophical antecedents of SLNA, the mechanics of its preprocessing, processing, and post-processing stages, and some example results obtained by applying this approach to a 15-month corporate discussion archive.

  18. Famine Early Warning Systems Network (FEWS NET) Agro-climatology Analysis Tools and Knowledge Base Products for Food Security Applications

    NASA Astrophysics Data System (ADS)

    Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.

    2017-12-01

    The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions about agro-climatological outcomes. There are four primary steps in developing agro-climatology assumptions: 1) understanding the climatology, 2) evaluating current climate modes, 3) interpreting forecast information, and 4) incorporating monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data based on remote sensing and field information can characterize the start of season and remain integral throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption-building process. Satellite-based rainfall and normalized difference vegetation index (NDVI) products support both the climatology and monitoring steps; sea-surface temperature data and knowledge of the global climate system inform the climate modes; and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate the agro-climatology assumptions that feed into food security assessments.
We have also developed a knowledge base for over 80 countries that provides rainfall and NDVI-based products, including annual and seasonal summaries, historical anomalies, coefficients of variation, and the number of years below 70% of annual or seasonal averages. These products provide a quick look for analysts to assess the agro-climatology of a country.
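
The summary statistics named above (seasonal average, coefficient of variation, percent-of-average anomaly, years below 70% of average) are straightforward to compute. The sketch below uses hypothetical rainfall figures, not FEWS NET data, and is an illustration of the calculations rather than the EROS tool itself:

```python
from statistics import mean, pstdev

def seasonal_summary(seasonal_totals):
    """Summarize a multi-year series of seasonal rainfall totals (mm).

    Returns the long-term average, the coefficient of variation
    (inter-annual variability), the most recent year's total as a
    percent of average, and the number of years below 70% of average.
    """
    avg = mean(seasonal_totals)
    cv = pstdev(seasonal_totals) / avg
    pct_of_avg = 100.0 * seasonal_totals[-1] / avg
    below_70 = sum(1 for t in seasonal_totals if t < 0.7 * avg)
    return avg, cv, pct_of_avg, below_70

# Hypothetical 10-year series of seasonal rainfall totals (mm)
totals = [420, 380, 510, 290, 450, 200, 470, 390, 310, 260]
avg, cv, pct, n_dry = seasonal_summary(totals)
```

For this series the long-term average is 368 mm, so the most recent season (260 mm) sits at roughly 71% of average and only the 200 mm year falls below the 70% threshold.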

  19. Atomic Force Microscopy Techniques for Nanomechanical Characterization: A Polymeric Case Study

    NASA Astrophysics Data System (ADS)

    Reggente, Melania; Rossi, Marco; Angeloni, Livia; Tamburri, Emanuela; Lucci, Massimiliano; Davoli, Ivan; Terranova, Maria Letizia; Passeri, Daniele

    2015-04-01

    Atomic force microscopy (AFM) is a versatile tool for performing mechanical characterization of sample surfaces at the nanoscale. In this work, we review two such methods, namely contact resonance AFM (CR-AFM) and torsional harmonics AFM (TH-AFM). First, the techniques are illustrated and their applicability to materials with elastic moduli in different ranges is discussed, together with their main advantages and limitations. Then, a case study is presented in which we report the mechanical characterization, using both CR-AFM and TH-AFM, of tablets of polyaniline and polyaniline doped with nanodiamond particles, prepared by a pressing process. We determined the indentation modulus values of their surfaces, which were found to be in fairly good agreement, thus demonstrating the accuracy of the techniques. Finally, the determined surface elastic moduli were compared with the bulk values measured through standard indentation testing.

  20. Optical Spectroscopy of New Materials

    NASA Technical Reports Server (NTRS)

    White, Susan M.; Arnold, James O. (Technical Monitor)

    1993-01-01

    Composites are currently used for a rapidly expanding number of applications, including aircraft structures, rocket nozzles, thermal protection of spacecraft, high-performance ablative surfaces, sports equipment (skis, tennis rackets, and bicycles), lightweight automobile components, cutting tools, and optical-grade mirrors. Composites are formed from two or more insoluble materials to produce a material with properties superior to those of either component, and range from dispersion-hardened alloys to advanced fiber-reinforced composites. UV/VIS and FTIR spectroscopy are currently used to evaluate the bonding between the matrix and the fibers, monitor the curing process of a polymer, measure surface contamination, characterize the interphase material, monitor anion transport in polymer phases, characterize void formation (voids must be minimized because, like cracks in a bulk material, they lead to failure), characterize the surface of the fiber component, and measure the overall optical properties for energy balances.

  1. Flight-vehicle materials, structures, and dynamics - Assessment and future directions. Vol. 4 - Tribological materials and NDE

    NASA Technical Reports Server (NTRS)

    Fusaro, Robert L. (Editor); Achenbach, J. D. (Editor)

    1993-01-01

    The present volume on tribological materials and NDE discusses liquid lubricants for advanced aircraft engines, a liquid lubricant for space applications, solid lubricants for aeronautics, and thin solid-lubricant films in space. Attention is given to the science and technology of NDE, tools for an NDE engineering base, experimental techniques in ultrasonics for NDE and material characterization, and laser ultrasonics. Topics addressed include thermal methods of NDE and quality control, digital radiography in the aerospace industry, materials characterization by ultrasonic methods, and NDE of ceramics and ceramic composites. Also discussed are smart materials and structures, intelligent processing of materials, implementation of NDE technology on flight structures, and solid-state weld evaluation.

  2. How Analysis Informs Regulation: Success and Failure of ...

    EPA Pesticide Factsheets

    How Analysis Informs Regulation: Success and Failure of Evolving Approaches to Polyfluoroalkyl Acid Contamination. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  3. Material characterization and defect inspection in ultrasound images

    NASA Astrophysics Data System (ADS)

    Zmola, Carl; Segal, Andrew C.; Lovewell, Brian; Mahdavieh, Jacob; Ross, Joseph; Nash, Charles

    1992-08-01

    The use of ultrasonic imaging to analyze defects and characterize materials is critical in the development of non-destructive testing and evaluation (NDT/NDE) tools for manufacturing. To develop better quality control and reliability in the manufacturing environment, advanced image processing techniques are useful. For example, through the use of texture filtering on ultrasound images, we have been able to remove characteristic textures from highly textured C-scan images of materials. The materials have highly regular characteristic textures that are of the same resolution and dynamic range as other important features within the image. By applying texture filters and adaptively modifying their filter response, we have examined a family of filters for removing these textures.
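
As a minimal illustration of the idea, the snippet below suppresses a uniform background by subtracting a local mean, a crude stand-in for the adaptive texture filters described above; the scan-line values are hypothetical:

```python
def remove_texture(signal, window=5):
    """Subtract a centered moving average from each sample so that a
    uniform background texture is suppressed while localized features
    (e.g., defect indications) stand out. Windows are clipped at the
    ends of the scan line."""
    half = window // 2
    filtered = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        local_mean = sum(signal[lo:hi]) / (hi - lo)
        filtered.append(signal[i] - local_mean)
    return filtered

# Hypothetical C-scan line: uniform texture level 10 with a defect
# indication at index 4
line = [10, 10, 10, 10, 30, 10, 10, 10, 10]
flat = remove_texture(line)
```

After filtering, the uniform regions sit near zero and the defect at index 4 remains the strongest response.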

  4. A Method for Improved Interpretation of "Spot" Biomarker Data ...

    EPA Pesticide Factsheets

    A Method for Improved Interpretation of "Spot" Biomarker Data. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  5. Characterizing L1-norm best-fit subspaces

    NASA Astrophysics Data System (ADS)

    Brooks, J. Paul; Dulá, José H.

    2017-05-01

    Fitting affine objects to data is the basis of many tools and methodologies in statistics, machine learning, and signal processing. The L1 norm is often employed to produce subspaces exhibiting a robustness to outliers and faulty observations. The L1-norm best-fit subspace problem is directly formulated as a nonlinear, nonconvex, and nondifferentiable optimization problem. The case when the subspace is a hyperplane can be solved to global optimality efficiently by solving a series of linear programs. The problem of finding the best-fit line has recently been shown to be NP-hard. We present necessary conditions for optimality for the best-fit subspace problem, and use them to characterize properties of optimal solutions.
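
To make the L1 criterion concrete: for the regression variant of L1 line fitting (least absolute deviations, with vertical residuals), some optimal line always passes through at least two of the data points, so a small instance can be solved by brute force over point pairs. This toy sketch is a stand-in for, not an implementation of, the linear-programming and subspace methods discussed in the abstract:

```python
from itertools import combinations

def l1_line_fit(points):
    """Brute-force least-absolute-deviations fit of y = a*x + b.

    Exploits the fact that some optimal LAD line interpolates at least
    two data points, so testing all point pairs suffices on small
    instances. Returns (total L1 error, slope, intercept).
    """
    best = None
    for (x1, y1), (x2, y2) in combinations(points, 2):
        if x1 == x2:
            continue  # vertical candidate; skip in this toy version
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        err = sum(abs(y - (a * x + b)) for x, y in points)
        if best is None or err < best[0]:
            best = (err, a, b)
    return best

# Points on y = 2x plus one gross outlier; the L1 fit ignores the outlier
pts = [(0, 0), (1, 2), (2, 4), (3, 6), (4, 100)]
err, a, b = l1_line_fit(pts)
```

The fit recovers the line y = 2x despite the outlier at (4, 100), illustrating the robustness to faulty observations that the abstract attributes to the L1 norm; a least-squares fit on the same data would be pulled strongly toward the outlier.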

  6. Supervised Classification Processes for the Characterization of Heritage Elements, Case Study: Cuenca-Ecuador

    NASA Astrophysics Data System (ADS)

    Briones, J. C.; Heras, V.; Abril, C.; Sinchi, E.

    2017-08-01

    The proper control of built heritage entails many challenges related to the complexity of heritage elements and the extent of the area to be managed, for which the available resources must be used efficiently. In this scenario, the preventive conservation approach, based on the principle that prevention is better than cure, emerges as a strategy to avoid the progressive and imminent loss of monuments and heritage sites. Regular monitoring is a key tool for the timely identification of changes in heritage assets. This research demonstrates that a supervised learning model (Support Vector Machines, SVM) is an ideal tool to support the monitoring process by detecting visible elements in aerial images such as roof structures, vegetation, and pavements. Linear, Gaussian, and polynomial kernel functions were tested; the linear kernel provided better results than the others. Due to the high level of segmentation generated by the classification procedure, it was necessary to apply a generalization step through opening, a mathematical morphological operation, which simplified the over-classification of the monitored elements.
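
The generalization step, morphological opening, can be sketched in miniature. This illustrative snippet (not the authors' implementation) applies a binary opening with a 3×3 structuring element, clipped at the borders, to a hypothetical classification mask; the isolated misclassified pixel disappears while the larger region survives:

```python
def _window_reduce(mask, combine):
    """Apply `combine` to the 3x3 neighborhood (clipped at the borders)
    of every cell of a binary mask, returning a new mask."""
    rows, cols = len(mask), len(mask[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [mask[r + dr][c + dc]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if 0 <= r + dr < rows and 0 <= c + dc < cols]
            out[r][c] = combine(vals)
    return out

def opening(mask):
    """Morphological opening: erosion (keep a cell only if its whole
    window is 1) followed by dilation (set a cell if any of its window
    is 1). Speckle smaller than the structuring element is removed
    while larger regions are roughly preserved."""
    eroded = _window_reduce(mask, lambda v: int(all(v)))
    return _window_reduce(eroded, lambda v: int(any(v)))

# Hypothetical classification mask: a 3x3 block plus an isolated pixel
m = [
    [1, 1, 1, 0, 0],
    [1, 1, 1, 0, 0],
    [1, 1, 1, 0, 1],
    [0, 0, 0, 0, 0],
]
opened = opening(m)
```

After opening, the isolated pixel at row 2, column 4 is removed and only the 3×3 block remains, which is the simplification of over-classification the abstract describes.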

  7. Borehole Tool for the Comprehensive Characterization of Hydrate-bearing Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Sheng; Santamarina, J. Carlos

    Reservoir characterization and simulation require reliable parameters to anticipate hydrate deposit responses and production rates. The acquisition of the required fundamental properties currently relies on wireline logging, pressure core testing, and/or laboratory observations of synthesized specimens, which are limited by testing capabilities and innate sampling disturbances. The project reviews hydrate-bearing sediments, their properties, and inherent sampling effects, albeit lessened by developments in pressure core technology, in order to develop robust correlations with index parameters. The resulting information is incorporated into a tool for optimal field characterization and parameter selection with uncertainty analyses. Ultimately, the project develops a borehole tool for the comprehensive characterization of hydrate-bearing sediments in situ, with a design that draws on past developments and characterization experience and is inspired by nature and sensor miniaturization.

  8. Biophysics: for HTS hit validation, chemical lead optimization, and beyond.

    PubMed

    Genick, Christine C; Wright, S Kirk

    2017-09-01

    There are many challenges in the drug discovery process, including the complexity of the target, its interactions, and how these factors play a role in causing the disease. Traditionally, biophysics has been used for hit validation and chemical lead optimization. With its increased throughput and sensitivity, biophysics is now being applied earlier in the process to empower target characterization and hit finding. Areas covered: In this article, the authors provide an overview of how biophysics can be utilized to assess the quality of the reagents used in screening assays, to validate potential tool compounds, to test the integrity of screening assays, and to create follow-up strategies for compound characterization. They also briefly discuss the use of different biophysical methods in hit validation to help avoid the resource-consuming pitfalls caused by the lack of hit overlap between biophysical methods. Expert opinion: The use of biophysics early in the drug discovery process has proven crucial for identifying and characterizing targets of a complex nature. It has also enabled the identification and classification of small molecules that interact with the target in an allosteric or covalent manner. Applying biophysics in this way at the early stages increases the chances of finding chemical leads with novel mechanisms of action. In the future, focused screens with biophysics as a primary readout will become increasingly common.

  9. U.S. Geological Survey: A synopsis of Three-dimensional Modeling

    USGS Publications Warehouse

    Jacobsen, Linda J.; Glynn, Pierre D.; Phelps, Geoff A.; Orndorff, Randall C.; Bawden, Gerald W.; Grauch, V.J.S.

    2011-01-01

    The U.S. Geological Survey (USGS) is a multidisciplinary agency that provides assessments of natural resources (geological, hydrological, biological), the disturbances that affect those resources, and the disturbances that affect the built environment, natural landscapes, and human society. Until now, USGS map products have been generated and distributed primarily as 2-D maps, occasionally providing cross sections or overlays, but rarely allowing the ability to characterize and understand 3-D systems, how they change over time (4-D), and how they interact. And yet, technological advances in monitoring natural resources and the environment, the ever-increasing diversity of information needed for holistic assessments, and the intrinsic 3-D/4-D nature of the information obtained increase the need for the USGS to generate, verify, analyze, interpret, confirm, store, and distribute its scientific information and products using 3-D/4-D visualization, analysis, and modeling tools and information frameworks. Today, USGS scientists use 3-D/4-D tools to (1) visualize and interpret geological information, (2) verify the data, and (3) verify their interpretations and models. 3-D/4-D visualization can be a powerful quality control tool in the analysis of large, multidimensional data sets. USGS scientists use 3-D/4-D technology for 3-D surface (i.e., 2.5-D) visualization as well as for 3-D volumetric analyses. Examples of geological mapping in 3-D include characterization of the subsurface for resource assessments, such as aquifer characterization in the central United States, and for input into process models, such as seismic hazards in the western United States.

  10. Evaluation of Pentafluoroethane and 1,1-Difluoroethane for a Dielectric Etch Application in an Inductively Coupled Plasma Etch Tool

    NASA Astrophysics Data System (ADS)

    Karecki, Simon; Chatterjee, Ritwik; Pruette, Laura; Reif, Rafael; Sparks, Terry; Beu, Laurie; Vartanian, Victor

    2000-07-01

    In this work, a combination of two hydrofluorocarbon compounds, pentafluoroethane (FC-125, C2HF5) and 1,1-difluoroethane (FC-152a, CF2H-CH3), was evaluated as a potential replacement for perfluorocompounds in dielectric etch applications. A high aspect ratio oxide via etch was used as the test vehicle for this study, which was conducted in a commercial inductively coupled high density plasma etch tool. Both process and emissions data were collected and compared to those provided by a process utilizing a standard perfluorinated etch chemistry (C2F6). Global warming (CF4, C2F6, CHF3) and hygroscopic gas (HF, SiF4) emissions were characterized using Fourier transform infrared (FTIR) spectroscopy. FC-125/FC-152a was found to produce significant reductions in global warming emissions, on the order of 68 to 76% relative to the reference process. Although etch stopping, caused by a high degree of polymer deposition inside the etched features, was observed, process data otherwise appeared promising for an initial study, with good resist selectivity and etch rates being achieved.

  11. VOSED: a tool for the characterization of developing planetary systems

    NASA Astrophysics Data System (ADS)

    Solano, E.; Gutiérrez, R.; Delgado, A.; Sarro, L. M.; Merín, B.

    2007-08-01

    The transition phase from optically thick disks around young pre-main sequence stars to optically thin debris disks around Vega-type stars is not well understood and plays an important role in the theory of planet formation. One of the most promising methods to characterize this process is the fitting of the observed SED with disk models. Despite its potential, however, this technique is affected by two major problems if a non-VO methodology is used: on the one hand, building SEDs requires access to a variety of astronomical services which provide, in most cases, heterogeneous information; on the other hand, model fitting demands a tremendous amount of work and time, which makes it very inefficient even for a modest dataset. This is an important issue considering the large volume of data that missions like Spitzer are producing. In the framework of the Spanish Virtual Observatory (SVO) we have developed VOSED, an application that makes it possible to characterize the protoplanetary disks around young stars by taking advantage of existing VO standards and tools. The application allows the user to gather photometric and spectroscopic information from a number of VO services, trace the SED, and fit the photospheric contribution with a stellar model and the IR excess with a disk model. The Kurucz models described in Castelli et al. (1997, A&A, 318, 841) are used to reproduce the photospheric contribution, whereas the grid of models of accretion disks irradiated by their central stars developed by D'Alessio et al. (2005, ) is used for the disk contribution. In both cases, the models are retrieved from the SVO Theoretical Model Web Server using the TSAP protocol. As pointed out before, model fitting constitutes a fundamental step in the analysis process. VOSED includes a tool to estimate the model parameters (both stellar and disk) based on Bayesian inference. The main aim of the tool is to quantitatively analyse the data in terms of the evidence of models of different complexity, to evaluate which alternative models can compete with the a posteriori most probable one, and to identify the most discriminant observations for discarding alternatives.
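    The evidence-based model comparison VOSED performs can be sketched with a toy marginal-likelihood calculation. This is not VOSED's implementation: the "SED" below is a synthetic one-parameter power law, the prior is uniform on a grid, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "photometry": flux following a 1-parameter power law plus noise
x = np.linspace(1, 10, 20)          # stand-in wavelength grid
sigma = 0.05                        # assumed measurement uncertainty
obs = 3.0 / x + rng.normal(0, sigma, x.size)

def log_evidence(model, prior_grid):
    """Marginal likelihood (evidence) by averaging the likelihood over a
    1-D uniform prior grid -- adequate for this toy one-parameter case."""
    logL = np.array([-0.5 * np.sum((obs - model(a)) ** 2) / sigma ** 2
                     for a in prior_grid])
    m = logL.max()                  # shift to avoid numerical underflow
    return np.log(np.mean(np.exp(logL - m))) + m

grid = np.linspace(0.1, 10.0, 2000)
lnZ_powerlaw = log_evidence(lambda a: a / x, grid)             # "disk" model
lnZ_flat = log_evidence(lambda a: a * np.ones_like(x), grid)   # alternative

# A large positive log Bayes factor means the data strongly favor a/x
print("ln Bayes factor:", lnZ_powerlaw - lnZ_flat)
```

    The evidence automatically penalizes models that waste prior volume, which is what lets competing models of different complexity be ranked on equal footing.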

  12. In-situ diagnostic tools for hydrogen transfer leak characterization in PEM fuel cell stacks part II: Operational applications

    NASA Astrophysics Data System (ADS)

    Niroumand, Amir M.; Homayouni, Hooman; DeVaal, Jake; Golnaraghi, Farid; Kjeang, Erik

    2016-08-01

    This paper describes a diagnostic tool for in-situ characterization of the rate and distribution of hydrogen transfer leaks in Polymer Electrolyte Membrane (PEM) fuel cell stacks. The method is based on reducing the air flow rate from a high to low value at a fixed current, while maintaining an anode overpressure. At high air flow rates, the reduction in air flow results in lower oxygen concentration in the cathode and therefore reduction in cell voltages. Once the air flow rate in each cell reaches a low value at which the cell oxygen-starves, the voltage of the corresponding cell drops to zero. However, oxygen starvation results from two processes: 1) the electrochemical oxygen reduction reaction which produces current; and 2) the chemical reaction between oxygen and the crossed over hydrogen. In this work, a diagnostic technique has been developed that accounts for the effect of the electrochemical reaction on cell voltage to identify the hydrogen leak rate and number of leaky cells in a fuel cell stack. This technique is suitable for leak characterization during fuel cell operation, as it only requires stack air flow and voltage measurements, which are readily available in an operational fuel cell system.
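    The oxygen balance underlying this diagnostic can be sketched as follows. This is a simplified molar balance inferred from the description above, not the authors' exact formulation: at the starvation air flow, the oxygen supplied just matches electrochemical consumption (I/4F per cell) plus chemical consumption by the crossover hydrogen (1/2 mol O2 per mol H2). All numeric inputs are hypothetical.

```python
# Simplified molar balance at the cell's starvation point (illustrative only):
#   O2 supplied = O2 consumed electrochemically (I / 4F)
#               + O2 consumed chemically by the H2 leak (0.5 mol O2 per mol H2)
F = 96485.0          # Faraday constant, C/mol
X_O2 = 0.21          # mole fraction of O2 in air

def h2_leak_rate(air_flow_mol_s, current_A):
    """Estimate the H2 crossover rate (mol/s) of one cell from its
    starvation air flow and the stack current."""
    o2_supplied = X_O2 * air_flow_mol_s
    o2_electrochemical = current_A / (4.0 * F)
    o2_chemical = o2_supplied - o2_electrochemical   # consumed by the leak
    return max(0.0, 2.0 * o2_chemical)               # 2 mol H2 per mol O2

# Hypothetical numbers for illustration:
leak = h2_leak_rate(air_flow_mol_s=1.5e-4, current_A=10.0)
print(f"estimated H2 leak: {leak:.2e} mol/s")
```

    In the paper's scheme, repeating this balance cell by cell as each voltage collapses is what separates the leak rate from the ordinary electrochemical oxygen demand.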

  13. Use of comparative genomics approaches to characterize interspecies differences in response to environmental chemicals: Challenges, opportunities, and research needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgess-Herbert, Sarah L., E-mail: sarah.burgess@alum.mit.edu; Euling, Susan Y.

    A critical challenge for environmental chemical risk assessment is the characterization and reduction of uncertainties introduced when extrapolating inferences from one species to another. The purpose of this article is to explore the challenges, opportunities, and research needs surrounding the issue of how genomics data and computational and systems level approaches can be applied to inform differences in response to environmental chemical exposure across species. We propose that the data, tools, and evolutionary framework of comparative genomics be adapted to inform interspecies differences in chemical mechanisms of action. We compare and contrast existing approaches, from disciplines as varied as evolutionarymore » biology, systems biology, mathematics, and computer science, that can be used, modified, and combined in new ways to discover and characterize interspecies differences in chemical mechanism of action which, in turn, can be explored for application to risk assessment. We consider how genetic, protein, pathway, and network information can be interrogated from an evolutionary biology perspective to effectively characterize variations in biological processes of toxicological relevance among organisms. We conclude that comparative genomics approaches show promise for characterizing interspecies differences in mechanisms of action, and further, for improving our understanding of the uncertainties inherent in extrapolating inferences across species in both ecological and human health risk assessment. To achieve long-term relevance and consistent use in environmental chemical risk assessment, improved bioinformatics tools, computational methods robust to data gaps, and quantitative approaches for conducting extrapolations across species are critically needed. Specific areas ripe for research to address these needs are recommended.« less

  14. A software tool to assess uncertainty in transient-storage model parameters using Monte Carlo simulations

    USGS Publications Warehouse

    Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.

    2017-01-01

    Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of the experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter certainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte-Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to 2 case studies and compare our results to output obtained from more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter certainty to support comparisons and more reliable interpretations of transport processes.
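    The Monte-Carlo parameter-sampling idea can be sketched with a toy forward model. This is not the OTIS-linked tool: the "transient-storage model" below is a stand-in exponential whose two parameters are deliberately non-identifiable, and the behavioral threshold (a GLUE-style best-5% cut) is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 50)

def forward(alpha, A_s):
    """Toy stand-in for a transient-storage forward model: a tracer tail
    whose decay depends on an exchange rate (alpha) and a relative storage
    area (A_s). NOT the OTIS equations."""
    return np.exp(-alpha * t / (1.0 + A_s))

# Synthetic "observed" breakthrough data from known parameters plus noise
obs = forward(0.8, 0.5) + rng.normal(0, 0.01, t.size)

# Monte Carlo sampling from uniform priors, scored by RMSE
n = 5000
alpha_s = rng.uniform(0.1, 2.0, n)
As_s = rng.uniform(0.0, 2.0, n)
rmse = np.array([np.sqrt(np.mean((forward(a, s) - obs) ** 2))
                 for a, s in zip(alpha_s, As_s)])

# "Behavioral" parameter sets: the best 5% of samples
keep = rmse < np.quantile(rmse, 0.05)
print("alpha range:", alpha_s[keep].min(), alpha_s[keep].max())
print("A_s   range:", As_s[keep].min(), As_s[keep].max())
```

    Wide behavioral ranges are exactly the symptom the authors warn about: here only the ratio alpha/(1+A_s) is constrained by the data, so individual parameter values cannot be interpreted reliably.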

  15. CHOPPI: A Web Tool for the Analysis of Immunogenicity Risk from Host Cell Proteins in CHO-Based Protein Production

    PubMed Central

    Bailey-Kellogg, Chris; Gutiérrez, Andres H; Moise, Leonard; Terry, Frances; Martin, William D; De Groot, Anne S

    2014-01-01

    Despite high quality standards and continual process improvements in manufacturing, host cell protein (HCP) process impurities remain a substantial risk for biological products. Even at low levels, residual HCPs can induce a detrimental immune response compromising the safety and efficacy of a biologic. Consequently, advanced-stage clinical trials have been cancelled due to the identification of antibodies against HCPs. To enable earlier and rapid assessment of the risks in Chinese Hamster Ovary (CHO)-based protein production of residual CHO protein impurities (CHOPs), we have developed a web tool called CHOPPI, for CHO Protein Predicted Immunogenicity. CHOPPI integrates information regarding the possible presence of CHOPs (expression and secretion) with characterizations of their immunogenicity (T cell epitope count and density, and relative conservation with human counterparts). CHOPPI can generate a report for a specified CHO protein (e.g., identified from proteomics or immunoassays) or characterize an entire specified subset of the CHO genome (e.g., filtered based on confidence in transcription and similarity to human proteins). The ability to analyze potential CHOPs at a genomic scale provides a baseline to evaluate relative risk. We show here that CHOPPI can identify clear differences in immunogenicity risk among previously validated CHOPs, as well as identify additional “risky” CHO proteins that may be expressed during production and induce a detrimental immune response upon delivery. We conclude that CHOPPI is a powerful tool that provides a valuable computational complement to existing experimental approaches for CHOP risk assessment and can focus experimental efforts in the most important directions. Biotechnol. Bioeng. 2014;111: 2170–2182. PMID:24888712

  16. National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox

    USGS Publications Warehouse

    Price, Curtis

    2010-01-01

    This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells.

  17. Characterization of integrated optical CD for process control

    NASA Astrophysics Data System (ADS)

    Yu, Jackie; Uchida, Junichi; van Dommelen, Youri; Carpaij, Rene; Cheng, Shaunee; Pollentier, Ivan; Viswanathan, Anita; Lane, Lawrence; Barry, Kelly A.; Jakatdar, Nickhil

    2004-05-01

    The accurate measurement of CD (critical dimension) and its application to inline process control are key challenges for high yield and OEE (overall equipment efficiency) in semiconductor production. CD-SEM metrology, although providing the resolution necessary for CD evaluation, suffers from the well-known effect of resist shrinkage, making accuracy and stability of the measurements an issue. For sub-100 nm in-line process control, where accuracy and stability as well as speed are required, CD-SEM metrology faces serious limitations. In contrast, scatterometry, using broadband optical spectra taken from grating structures, does not suffer from such limitations. This technology is non-destructive and, in addition to CD, provides profile information and film thickness in a single measurement. Using Timbre's Optical Digital Profilometry (ODP) technology, we characterized the process window with an iODP101 integrated optical CD metrology module installed in a TEL Clean Track at IMEC. We demonstrate optical CD's high sensitivity to process change and its insensitivity to measurement noise. We demonstrate the validity of ODP modeling by showing its accurate response to known process changes built into the evaluation and its excellent correlation to CD-SEM. We further discuss the intrinsic optical CD metrology factors that affect the tool's precision, accuracy, and correlation to CD-SEM.

  18. Pieces of the Puzzle: Tracking the Chemical Component of the ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the risk assessment conducted at the U.S. EPA, as well as some research examples related to the exposome concept. This presentation also provides the recommendation of using two organizational and predictive frameworks for tracking chemical components in the exposome. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  19. The Air Quality Model Evaluation International Initiative ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition.

  20. Neutron Tomography at the Los Alamos Neutron Science Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, William Riley

    Neutron imaging is an incredibly powerful tool for non-destructive sample characterization and materials science. Neutron tomography is one technique that results in a three-dimensional model of the sample, representing the interaction of the neutrons with the sample. This relies both on reliable data acquisition and on image processing after acquisition. Over the course of the project, the focus has changed from the former to the latter, culminating in a large-scale reconstruction of a meter-long fossilized skull. The full reconstruction is not yet complete, though tools have been developed to improve the speed and accuracy of the reconstruction. This project helps to improve the capabilities of LANSCE and LANL with regards to imaging large or unwieldy objects.

  1. Wear Detection of Drill Bit by Image-based Technique

    NASA Astrophysics Data System (ADS)

    Sukeri, Maziyah; Zulhilmi Paiz Ismadi, Mohd; Rahim Othman, Abdul; Kamaruddin, Shahrul

    2018-03-01

    Image processing for computer vision plays an essential role in the manufacturing industries for tool condition monitoring. This study proposes a dependable direct measurement method to measure tool wear using image-based analysis. Segmentation and thresholding techniques were used to filter and convert the colour image to binary datasets. Then, an edge detection method was applied to characterize the edge of the drill bit. Using the cross-correlation method, the edges of the original and worn drill bits were correlated with each other. The cross-correlation graphs were able to detect the worn edge despite the small difference between the graphs. Future development will focus on quantifying the worn profile as well as enhancing the sensitivity of the technique.
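    The edge-extraction and cross-correlation steps can be sketched on synthetic binary images. The authors' actual images and filter settings are not available; the V-shaped "drill lip" below is an illustrative stand-in.

```python
import numpy as np

def edge_profile(binary_img):
    """Extract the top edge (first foreground row) in each column of a
    thresholded (binary) image of the drill lip."""
    rows = np.argmax(binary_img, axis=0)              # first True per column
    rows[~binary_img.any(axis=0)] = binary_img.shape[0]  # empty columns
    return rows.astype(float)

def normalized_xcorr(a, b):
    """Normalized cross-correlation of two edge profiles (peak of 1.0
    means identical profiles at some lag)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full") / a.size

# Synthetic images: a sharp lip vs. a worn (flattened) lip
h, w = 50, 100
x = np.arange(w)
sharp = np.abs(x - w // 2) * 0.8 + 5                     # V-shaped edge
worn = np.clip(np.abs(x - w // 2) * 0.8, 8, None) + 5    # flattened tip
img_sharp = np.arange(h)[:, None] >= sharp[None, :]
img_worn = np.arange(h)[:, None] >= worn[None, :]

peak = normalized_xcorr(edge_profile(img_sharp), edge_profile(img_worn)).max()
print(f"peak correlation: {peak:.3f}")   # a drop below 1.0 signals wear
```

    As in the paper, the correlation difference is small even for visible wear, which is why the authors flag sensitivity enhancement as future work.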

  2. Hermes III endpoint energy calculation from photonuclear activation of 197Au and 58Ni foils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parzyck, Christopher Thomas

    2014-09-01

    A new process has been developed to characterize the endpoint energy of HERMES III on a shot-to-shot basis using standard dosimetry tools from the Sandia Radiation Measurements Laboratory. Photonuclear activation readings from nickel and gold foils are used in conjunction with calcium fluoride thermoluminescent dosimeters to derive estimated electron endpoint energies for a series of HERMES shots. The results are reasonably consistent with the expected endpoint voltages on those shots.

  3. Characterizing Treatable Causes of Small-Fiber Polyneuropathy in Gulf War Veterans

    DTIC Science & Technology

    2017-10-01

    Global experts participated in additional rounds of a Delphi process to determine the most reliable markers for SFPN (Case Definition). … We supplemented the pertinent information on SFPN and added a link to this study as a recruitment tool. The public portion of the website may be … propose as part of the Case Definition. The database currently contains data on 3,495 subjects, consisting of 3,087 patients and 408 healthy controls.

  4. Development and characterization of silk fibroin coated quantum dots

    NASA Astrophysics Data System (ADS)

    Nathwani, B. B.; Needham, C.; Mathur, A. B.; Meissner, K. E.

    2008-02-01

    Recent progress in the field of semiconductor nanocrystals or Quantum Dots (QDs) has seen them find wider acceptance as a tool in biomedical research labs. As produced, high-quality QDs, synthesized by high-temperature organometallic synthesis, are coated with a hydrophobic ligand. Therefore, they must be further processed to be soluble in water and to be made biocompatible. To accomplish this, the QDs are generally coated with a synthetic polymer (e.g., block copolymers) or the hydrophobic surface ligands are exchanged with hydrophilic material (e.g., thiols). Advances in this area have enabled the QDs to experience a smooth transition from being simple inorganic fluorophores to being smart sensors, which can identify specific cell marker proteins and help in the diagnosis of diseases such as cancer. In order to improve the biocompatibility and utility of the QDs, we report the development of a procedure to coat QDs with silk fibroin, a fibrous crystalline protein extracted from the Bombyx mori silkworm. Following the coating process, we characterize the size, quantum yield, and two-photon absorption cross section of the silk-coated QDs. Additionally, the results of biocompatibility studies carried out to compare the properties of these QD-silks with conventional QDs are presented. These natural polymer coatings on QDs could enhance intracellular delivery and enable the use of these nanocrystals as an imaging tool for studying subcellular machinery at the molecular level.

  5. Geochemistry and the understanding of ground-water systems

    USGS Publications Warehouse

    Glynn, Pierre D.; Plummer, Niel

    2005-01-01

    Geochemistry has contributed significantly to the understanding of ground-water systems over the last 50 years. Historic advances include development of the hydrochemical facies concept, application of equilibrium theory, investigation of redox processes, and radiocarbon dating. Other hydrochemical concepts, tools, and techniques have helped elucidate mechanisms of flow and transport in ground-water systems, and have helped unlock an archive of paleoenvironmental information. Hydrochemical and isotopic information can be used to interpret the origin and mode of ground-water recharge, refine estimates of time scales of recharge and ground-water flow, decipher reactive processes, provide paleohydrological information, and calibrate ground-water flow models. Progress needs to be made in obtaining representative samples. Improvements are needed in the interpretation of the information obtained, and in the construction and interpretation of numerical models utilizing hydrochemical data. The best approach will ensure an optimized iterative process between field data collection and analysis, interpretation, and the application of forward, inverse, and statistical modeling tools. Advances are anticipated from microbiological investigations, the characterization of natural organics, isotopic fingerprinting, applications of dissolved gas measurements, and the fields of reaction kinetics and coupled processes. A thermodynamic perspective is offered that could facilitate the comparison and understanding of the multiple physical, chemical, and biological processes affecting ground-water systems.

  6. Impact Of The Material Variability On The Stamping Process: Numerical And Analytical Analysis

    NASA Astrophysics Data System (ADS)

    Ledoux, Yann; Sergent, Alain; Arrieux, Robert

    2007-05-01

    The finite element simulation is a very useful tool in the deep drawing industry. It is used in particular for the development and validation of new stamping tools, reducing the cost and time of tooling design and setup. However, one of the main difficulties in obtaining good agreement between the simulation and the real process lies in the definition of the numerical conditions (mesh, punch travel speed, boundary conditions, …) and of the parameters that model the material behavior. Indeed, in the press shop, when the sheet set changes, a variation of the formed-part geometry is often observed, reflecting the variability of the material properties between the different sets. This variability is probably one of the main sources of process deviation once the process is set up, which is why it is important to study the influence of material-data variation on the geometry of a classical stamped part. An omega-shaped part was chosen because of its simplicity, because it is representative of automotive components (car-body reinforcement), and because it shows important springback deviations. An isotropic behaviour law is assumed. The impact of statistical deviations of the three law coefficients characterizing the material, and of the friction coefficient, around their nominal values is tested: a Gaussian distribution is assumed, and the resulting geometry variation is studied by FE simulation. A second approach is also envisaged: the process variability is represented by a mathematical model, and, as a function of the input-parameter variability, an analytical model is defined that yields the part-geometry variability around the nominal shape. These two approaches make it possible to predict the process capability as a function of the material-parameter variability.
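    The analytical (surrogate-model) approach can be sketched as follows: propagate Gaussian scatter of the material and friction parameters through a response surface fitted to FE results. The linear response surface and all coefficients below are hypothetical, chosen only to illustrate the propagation step.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical linear response surface standing in for a model fitted to FE
# runs: springback angle (deg) as a function of hardening coefficient n,
# yield stress sigma_y (MPa), and friction coefficient mu_f.
def springback(n, sigma_y, mu_f):
    return (2.0 + 15.0 * (n - 0.2)
                + 0.004 * (sigma_y - 180.0)
                - 6.0 * (mu_f - 0.12))

# Gaussian scatter of the inputs around (hypothetical) nominal values
N = 100_000
n_s = rng.normal(0.20, 0.01, N)
sy_s = rng.normal(180.0, 8.0, N)
mu_s = rng.normal(0.12, 0.01, N)

# Propagated geometry variability around the nominal shape
angles = springback(n_s, sy_s, mu_s)
print(f"springback: {angles.mean():.2f} +/- {angles.std():.2f} deg")
```

    Comparing the propagated spread against the part's tolerance band is what turns this into a process-capability prediction.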

  7. A new algorithm and system for the characterization of handwriting strokes with delta-lognormal parameters.

    PubMed

    Djioua, Moussa; Plamondon, Réjean

    2009-11-01

    In this paper, we present a new analytical method for estimating the parameters of Delta-Lognormal functions and characterizing handwriting strokes. According to the Kinematic Theory of rapid human movements, these parameters contain information on both the motor commands and the timing properties of a neuromuscular system. The new algorithm, called XZERO, exploits relationships between the zero crossings of the first and second time derivatives of a lognormal function and its four basic parameters. The methodology is described and then evaluated under various testing conditions. The new tool allows a greater variety of stroke patterns to be processed automatically. Furthermore, for the first time, the extraction accuracy is quantified empirically, taking advantage of the exponential relationships that link the dispersion of the extraction errors with its signal-to-noise ratio. A new extraction system which combines this algorithm with two other previously published methods is also described and evaluated. This system provides researchers involved in various domains of pattern analysis and artificial intelligence with new tools for the basic study of single strokes as primitives for understanding rapid human movements.
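    The kind of relationship XZERO exploits can be illustrated numerically: for a lognormal speed profile, the zero crossing of the first time derivative (the velocity peak) falls at t0 + exp(mu - sigma^2), so locating that crossing recovers timing parameters. The sketch below uses finite differences and hypothetical parameter values, not the authors' analytic extractor.

```python
import numpy as np

def lognormal_profile(t, t0, mu, sigma):
    """Lognormal speed profile of the Kinematic Theory's delta-lognormal
    model (single component, unit amplitude)."""
    s = t - t0
    out = np.zeros_like(t)
    m = s > 0
    out[m] = np.exp(-(np.log(s[m]) - mu) ** 2 / (2 * sigma ** 2)) / (
        s[m] * sigma * np.sqrt(2 * np.pi))
    return out

t0, mu, sigma = 0.1, -1.0, 0.3          # hypothetical stroke parameters
t = np.linspace(0.1001, 2.0, 200_000)
v = lognormal_profile(t, t0, mu, sigma)

# Zero crossing of the first time derivative = the velocity peak
dv = np.gradient(v, t)
i = np.where(np.diff(np.sign(dv)) < 0)[0][0]
t_peak = t[i]

# Analytic mode of a lognormal: t0 + exp(mu - sigma^2)
print(t_peak, t0 + np.exp(mu - sigma ** 2))
```

    XZERO combines several such zero-crossing times (of the first and second derivatives) to solve for all four parameters at once; this sketch shows only the simplest of those relationships.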

  8. Application of transit data analysis and artificial neural network in the prediction of discharge of Lor River, NW Spain.

    PubMed

    Astray, G; Soto, B; Lopez, D; Iglesias, M A; Mejuto, J C

    2016-01-01

    Transit data analysis and artificial neural networks (ANNs) have proven to be useful tools for characterizing and modelling non-linear hydrological processes. In this paper, these methods have been used to characterize and to predict the discharge of the Lor River (North Western Spain) 1, 2 and 3 days ahead. Transit data analyses show a correlation coefficient of 0.53 for a lag of 1 day between precipitation and discharge. Temperature and discharge, on the other hand, have a negative correlation coefficient (-0.43) for a delay of 19 days. The ANNs developed provide good results for the validation period, with R(2) between 0.92 and 0.80. Furthermore, these prediction models have been tested with discharge data from a period 16 years later. Results for this testing period also show good correlation, with R(2) between 0.91 and 0.64. Overall, the results indicate that ANNs are a good tool for predicting river discharge with a small number of input variables.
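    The lagged-correlation analysis can be sketched on synthetic daily series. The Lor River data are not available; the 1-day response and noise level below are illustrative.

```python
import numpy as np

def lag_correlation(x, y, lag):
    """Pearson correlation between x shifted forward by `lag` days and y,
    i.e. corr(x[t - lag], y[t])."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

# Synthetic daily series: discharge responds to precipitation 1 day later
rng = np.random.default_rng(1)
precip = rng.gamma(2.0, 2.0, 1000)
discharge = 0.7 * np.roll(precip, 1) + rng.normal(0, 1.0, 1000)
discharge[0] = discharge[1]           # overwrite the wrapped-around sample

best = max(range(0, 5), key=lambda k: lag_correlation(precip, discharge, k))
print("best lag (days):", best)       # expected: 1
```

    In the paper the same scan identifies the 1-day precipitation lag (r = 0.53) and the 19-day temperature lag, which then guide the choice of ANN input variables.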

  9. Effect of the Machining Processes on Low Cycle Fatigue Behavior of a Powder Metallurgy Disk

    NASA Technical Reports Server (NTRS)

    Telesman, J.; Kantzos, P.; Gabb, T. P.; Ghosn, L. J.

    2010-01-01

    A study has been performed to investigate the effect of various machining processes on the fatigue life of configured low cycle fatigue specimens machined from the NASA-developed LSHR P/M nickel-based disk alloy. Two configured specimen geometries were employed in the study. To evaluate the broach machining process, a double-notch geometry was used, with both notches machined using broach tooling; EDM-machined notched specimens of the same configuration were tested for comparison purposes. The honing finishing process was evaluated using a center-hole specimen geometry, with comparison testing again done using EDM-machined specimens of the same geometry. The effect of these machining processes on the resulting surface roughness, residual stress distribution, and microstructural damage was characterized and used in an attempt to explain the low cycle fatigue results.

  10. Roll-to-roll suitable short-pulsed laser scribing of organic photovoltaics and close-to-process characterization

    NASA Astrophysics Data System (ADS)

    Kuntze, Thomas; Wollmann, Philipp; Klotzbach, Udo; Fledderus, Henri

    2017-03-01

    The proper long-term operation of organic electronic devices such as organic photovoltaics (OPV) depends on their resistance to environmental influences such as permeation of water vapor. Major efforts are devoted to encapsulating OPV; the state of the art is sandwich-like encapsulation between two ultra-barrier foils. Sandwich encapsulation has two major disadvantages: high cost (about 1/3 of total costs) and parasitic intrinsic water (sponge effect of the substrate foil). A promising approach to address these drawbacks is to use the OPV substrate itself as the barrier by integrating an ultra-barrier coating, followed by alternating deposition and structuring of the OPV functional layers. In effect, more functionality is integrated into less material, and the number of production steps is reduced. None of the processing steps may affect the underlying barrier functionality, while all electrical functionality must be maintained. Short- and ultrashort-pulsed (USP) lasers are the most suitable structuring tools. Laser machining applies to three layers: the bottom electrode made of transparent conductive materials (P1), the organic photovoltaic stack (P2), and the top electrode (P3). In this paper, the machining of functional 110…250 nm layers of flexible OPV by USP laser systems is presented. The main focus is on structuring without damaging the underlying ultra-barrier layer. Close-to-process machining-quality characterization is performed with the hyperspectral imaging (HSI) analysis tool, which is cross-checked with the "gold standard" Ca-test. It is shown that both the laser machining and the quality control are well suited to R2R production of OPV.

  11. The emergence of hydrogeophysics for improved understanding of subsurface processes over multiple scales

    DOE PAGES

    Binley, Andrew; Hubbard, Susan S.; Huisman, Johan A.; ...

    2015-06-15

    Geophysics provides a multidimensional suite of investigative methods that are transforming our ability to see into the very fabric of the subsurface environment, and monitor the dynamics of its fluids and the biogeochemical reactions that occur within it. Here we document how geophysical methods have emerged as valuable tools for investigating shallow subsurface processes over the past two decades and offer a vision for future developments relevant to hydrology and also ecosystem science. The field of “hydrogeophysics” arose in the late 1990s, prompted, in part, by the wealth of studies on stochastic subsurface hydrology that argued for better field-based investigative techniques. These new hydrogeophysical approaches benefited from the emergence of practical and robust data inversion techniques, in many cases with a view to quantify shallow subsurface heterogeneity and the associated dynamics of subsurface fluids. Furthermore, the need for quantitative characterization stimulated a wealth of new investigations into petrophysical relationships that link hydrologically relevant properties to measurable geophysical parameters. Development of time-lapse approaches provided a new suite of tools for hydrological investigation, enhanced further with the realization that some geophysical properties may be sensitive to biogeochemical transformations in the subsurface environment, thus opening up the new field of “biogeophysics.” Early hydrogeophysical studies often concentrated on relatively small “plot-scale” experiments. More recently, however, the translation to larger-scale characterization has been the focus of a number of studies. In conclusion, geophysical technologies continue to develop, driven, in part, by the increasing need to understand and quantify key processes controlling sustainable water resources and ecosystem services.
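    The petrophysical relationships mentioned above are what allow a measured geophysical parameter to stand in for a hydrologic property. As an illustration, Archie's law, a classical empirical relation (not one derived in this record), links bulk electrical resistivity to porosity and water saturation; the sketch below inverts it for saturation, with purely illustrative values for the empirical exponents:

```python
# Archie's law: a classical petrophysical relation linking bulk electrical
# resistivity (a geophysical observable) to porosity and water saturation
# (hydrologically relevant properties). All constants here are illustrative.

def archie_saturation(rho_bulk, rho_water, porosity, a=1.0, m=2.0, n=2.0):
    """Invert Archie's law for water saturation S_w.

    rho_bulk  : measured bulk resistivity (ohm-m)
    rho_water : pore-water resistivity (ohm-m)
    porosity  : fractional porosity (0..1)
    a, m, n   : empirical exponents (must be calibrated per formation)
    """
    formation_factor = a / porosity ** m                # F = a * phi^-m
    sw_n = formation_factor * rho_water / rho_bulk      # S_w^n = F*rho_w/rho_b
    return sw_n ** (1.0 / n)

# Example: 200 ohm-m bulk resistivity, 10 ohm-m pore water, 30% porosity
sw = archie_saturation(200.0, 10.0, 0.30)
```

    With these illustrative inputs the relation yields a water saturation of roughly 0.75; in practice the exponents a, m, n vary with lithology and must be calibrated against field or laboratory data.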

  13. The emergence of hydrogeophysics for improved understanding of subsurface processes over multiple scales

    PubMed Central

    Hubbard, Susan S.; Huisman, Johan A.; Revil, André; Robinson, David A.; Singha, Kamini; Slater, Lee D.

    2015-01-01

    Geophysics provides a multidimensional suite of investigative methods that are transforming our ability to see into the very fabric of the subsurface environment, and monitor the dynamics of its fluids and the biogeochemical reactions that occur within it. Here we document how geophysical methods have emerged as valuable tools for investigating shallow subsurface processes over the past two decades and offer a vision for future developments relevant to hydrology and also ecosystem science. The field of “hydrogeophysics” arose in the late 1990s, prompted, in part, by the wealth of studies on stochastic subsurface hydrology that argued for better field‐based investigative techniques. These new hydrogeophysical approaches benefited from the emergence of practical and robust data inversion techniques, in many cases with a view to quantify shallow subsurface heterogeneity and the associated dynamics of subsurface fluids. Furthermore, the need for quantitative characterization stimulated a wealth of new investigations into petrophysical relationships that link hydrologically relevant properties to measurable geophysical parameters. Development of time‐lapse approaches provided a new suite of tools for hydrological investigation, enhanced further with the realization that some geophysical properties may be sensitive to biogeochemical transformations in the subsurface environment, thus opening up the new field of “biogeophysics.” Early hydrogeophysical studies often concentrated on relatively small “plot‐scale” experiments. More recently, however, the translation to larger‐scale characterization has been the focus of a number of studies. Geophysical technologies continue to develop, driven, in part, by the increasing need to understand and quantify key processes controlling sustainable water resources and ecosystem services. PMID:26900183

  14. Native top-down mass spectrometry for the structural characterization of human hemoglobin

    DOE PAGES

    Zhang, Jiang; Malmirchegini, G. Reza; Clubb, Robert T.; ...

    2015-06-09

    Native mass spectrometry (MS) has become an invaluable tool for the characterization of proteins and non-covalent protein complexes under near physiological solution conditions. Here we report the structural characterization of human hemoglobin (Hb), a 64 kDa oxygen-transporting protein complex, by high resolution native top-down mass spectrometry using electrospray ionization (ESI) and a 15-Tesla Fourier transform ion cyclotron resonance (FTICR) mass spectrometer. Native MS preserves the non-covalent interactions between the globin subunits, and electron capture dissociation (ECD) produces fragments directly from the intact Hb complex without dissociating the subunits. Using activated ion ECD, we observe the gradual unfolding process of the Hb complex in the gas phase. Without protein ion activation, the native Hb shows very limited ECD fragmentation from the N-termini, suggesting a tightly packed structure of the native complex and therefore low fragmentation efficiency. Precursor ion activation allows a steady increase of N-terminal fragment ions, while the C-terminal fragments remain limited (38 c ions and 4 z ions on the α chain; 36 c ions and 2 z ions on the β chain). This ECD fragmentation pattern suggests that upon activation, the Hb complex starts to unfold from the N-termini of both subunits, whereas the C-terminal regions, and therefore the potential regions involved in the subunit binding interactions, remain intact. ECD-MS of the Hb dimer shows fragmentation patterns similar to those of the Hb tetramer, providing further evidence for the hypothesized unfolding process of the Hb complex in the gas phase. Native top-down ECD-MS allows efficient probing of the Hb complex structure and the subunit binding interactions in the gas phase, and it may provide a fast and effective means to probe the structure of novel protein complexes that are intractable to traditional structural characterization tools.
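    The reported fragment counts can be turned into a rough upper bound on backbone-cleavage coverage. Assuming each reported c or z ion marks a distinct inter-residue bond (a best case, since c and z ions can map to the same bond) and using the standard chain lengths of human hemoglobin (141 residues for α, 146 for β, values not given in the record itself):

```python
# Upper bound on ECD backbone-cleavage coverage from the reported ion counts.
# Assumes every c and z ion marks a distinct inter-residue bond (best case).
def max_coverage(c_ions, z_ions, n_residues):
    bonds = n_residues - 1                     # cleavable backbone bonds
    return min(c_ions + z_ions, bonds) / bonds

alpha_cov = max_coverage(38, 4, 141)           # alpha chain: at most 42/140
beta_cov = max_coverage(36, 2, 146)            # beta chain: at most 38/145
```

    Even as upper bounds these come out around 30% and 26%, consistent with the record's point that the tightly packed native complex fragments inefficiently, and only from the N-terminal regions, without activation.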

  15. Process characterization and Design Space definition.

    PubMed

    Hakemeyer, Christian; McKnight, Nathan; St John, Rick; Meier, Steven; Trexler-Schmidt, Melody; Kelley, Brian; Zettl, Frank; Puskeiler, Robert; Kleinjans, Annika; Lim, Fred; Wurth, Christine

    2016-09-01

    Quality by design (QbD) is a global regulatory initiative with the goal of enhancing pharmaceutical development through the proactive design of pharmaceutical manufacturing processes and controls to consistently deliver the intended performance of the product. The principles of pharmaceutical development relevant to QbD are described in the ICH guidance documents (ICH Q8-11). An integrated set of risk assessments and their related elements developed at Roche/Genentech were designed to provide an overview of product and process knowledge for the production of a recombinant monoclonal antibody (MAb). This chapter describes the tools used for the characterization and validation of the MAb manufacturing process under the QbD paradigm. This comprises risk assessments for the identification of potential Critical Process Parameters (pCPPs), statistically designed experimental studies, as well as studies assessing the linkage of the unit operations. The outcome of these studies is the classification of process parameters according to their criticality and the definition of appropriate acceptable ranges of operation. The process and product knowledge gained in these studies can lead to the approval of a Design Space. Additionally, the information gained in these studies is used to define the 'impact' which the manufacturing process can have on the variability of the critical quality attributes (CQAs), which in turn defines the testing and monitoring strategy. Copyright © 2016 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  16. Process for selecting engineering tools : applied to selecting a SysML tool.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Spain, Mark J.; Post, Debra S.; Taylor, Jeffrey L.

    2011-02-01

    Process for Selecting Engineering Tools outlines the process and tools used to select a SysML (Systems Modeling Language) tool. The process is general in nature, and users can apply it to select most engineering tools and software applications.

  17. Fluorescence Spectroscopy for the Monitoring of Food Processes.

    PubMed

    Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd

    Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored in developing rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed, together with different chemometric tools, in the identification, classification, authentication, quantification, and optimization of different parameters during food handling, processing, and storage. Chemometrics helps to retrieve useful information from spectral data utilized in the characterization of food samples. This contribution discusses in detail the potential of fluorescence spectroscopy of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, and vegetables, for qualitative and quantitative analysis with different chemometric approaches.

  18. Investigation of the shape transferability of nanoscale multi-tip diamond tools in the diamond turning of nanostructures

    NASA Astrophysics Data System (ADS)

    Luo, Xichun; Tong, Zhen; Liang, Yingchun

    2014-12-01

    In this article, the shape transferability of nanoscale multi-tip diamond tools in diamond turning for the scale-up manufacturing of nanostructures is demonstrated. Atomistic multi-tip diamond tool models were built with different tool geometries, in terms of differences in tip cross-sectional shape, tip angle, and tool tip configuration, to determine their effect on the applied forces and the machined nano-groove geometries. The quality of the machined nanostructures was characterized by the thickness of the deformed layers and the dimensional accuracy achieved. Simulation results show that diamond turning using nanoscale multi-tip tools offers tremendous shape transferability in machining nanostructures. Both periodic and non-periodic nano-grooves with different cross-sectional shapes can be successfully fabricated using the multi-tip tools. A hypothesis of a minimum designed ratio of tool tip distance to tip base width (L/Wf) of the nanoscale multi-tip diamond tool for the high precision machining of nanostructures was proposed, based on an analytical study of the quality of the nanostructures fabricated using different types of multi-tip tools. Nanometric cutting trials using nanoscale multi-tip diamond tools (differing in L/Wf) fabricated by focused ion beam (FIB) were then conducted to verify the hypothesis. The investigations in this work demonstrate the potential of the nanoscale multi-tip diamond tool for the deterministic fabrication of periodic and non-periodic nanostructures, which opens up the feasibility of using the process as a versatile manufacturing technique in nanotechnology.

  19. Compact Electron Gun Based on Secondary Emission Through Ionic Bombardment

    PubMed Central

    Diop, Babacar; Bonnet, Jean; Schmid, Thomas; Mohamed, Ajmal

    2011-01-01

    We present a new compact electron gun based on the principle of secondary emission through ionic bombardment. The driving requirement for developing such a gun is to obtain a small electron gun for an in-flight instrument performing Electron Beam Fluorescence (EBF) measurements on board a reentry vehicle in the upper atmosphere. These measurements are useful for characterizing the gas flow around the vehicle in terms of chemical composition, temperatures, and flow velocity, which usually presents thermo-chemical non-equilibrium. Such an instrument can also be employed to characterize the upper atmosphere if placed on another carrier, such as a balloon. In ground facilities, it is a more practical tool for characterizing flows in wind tunnel studies, or an alternative to complex electron guns in industrial processes requiring an electron beam. We describe in this paper the gun that has been developed as well as its different features, which have been characterized in the laboratory. PMID:22163896

  20. RootGraph: a graphic optimization tool for automated image analysis of plant roots

    PubMed Central

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N.; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J.

    2015-01-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process, is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions. PMID:26224880

  1. Monitoring Interfacial Lipid Oxidation in Oil-in-Water Emulsions Using Spatially Resolved Optical Techniques.

    PubMed

    Banerjee, Chiranjib; Westberg, Michael; Breitenbach, Thomas; Bregnhøj, Mikkel; Ogilby, Peter R

    2017-06-06

    The oxidation of lipids is an important phenomenon with ramifications for disciplines that range from food science to cell biology. The development and characterization of tools and techniques to monitor lipid oxidation are thus relevant. Of particular significance in this regard are tools that facilitate the study of oxidations at interfaces in heterogeneous samples (e.g., oil-in-water emulsions, cell membranes). In this article, we establish a proof-of-principle for methods to initiate and then monitor such oxidations with high spatial resolution. The experiments were performed using oil-in-water emulsions of polyunsaturated fatty acids (PUFAs) prepared from cod liver oil. We produced singlet oxygen at a point near the oil-water interface of a given PUFA droplet in a spatially localized two-photon photosensitized process. We then followed the oxidation reactions initiated by this process with the fluorescence-based imaging technique of structured illumination microscopy (SIM). We conclude that the approach reported herein has attributes well-suited to the study of lipid oxidation in heterogeneous samples.

  2. The Assessment of Bipolar Disorder in Children and Adolescents

    PubMed Central

    Youngstrom, Eric A.; Freeman, Andrew J.; Jenkins, Melissa McKeown

    2010-01-01

    The overarching goal of this review is to examine the current best evidence for assessing bipolar disorder in children and adolescents and provide a comprehensive, evidence-based approach to diagnosis. Evidence-based assessment strategies are organized around the “3 Ps” of clinical assessment: Predict important criteria or developmental trajectories, Prescribe a change in treatment choice, and inform Process of treating the youth and his/her family. The review characterizes bipolar disorder in youths - specifically addressing bipolar diagnoses and clinical subtypes; then provides an actuarial approach to assessment - using prevalence of disorder, risk factors, and questionnaires; discusses treatment thresholds; and identifies practical measures of process and outcomes. The clinical tools and risk factors selected for inclusion in this review represent the best empirical evidence in the literature. By the end of the review, clinicians will have a framework and set of clinically useful tools with which to effectively make evidence-based decisions regarding the diagnosis of bipolar disorder in children and adolescents. PMID:19264268

  3. Non-Gated Laser Induced Breakdown Spectroscopy Provides a Powerful Segmentation Tool on Concomitant Treatment of Characteristic and Continuum Emission

    PubMed Central

    Dasari, Ramachandra Rao; Barman, Ishan; Gundawar, Manoj Kumar

    2014-01-01

    We demonstrate the application of non-gated laser induced breakdown spectroscopy (LIBS) for characterization and classification of organic materials with similar chemical composition. While use of such a system introduces substantive continuum background in the spectral dataset, we show that appropriate treatment of the continuum and characteristic emission results in accurate discrimination of pharmaceutical formulations of similar stoichiometry. Specifically, our results suggest that near-perfect classification can be obtained by employing suitable multivariate analysis on the acquired spectra, without prior removal of the continuum background. Indeed, we conjecture that pre-processing in the form of background removal may introduce spurious features in the signal. Our findings in this report significantly advance the prior results in time-integrated LIBS application and suggest the possibility of a portable, non-gated LIBS system as a process analytical tool, given its simple instrumentation needs, real-time capability and lack of sample preparation requirements. PMID:25084522
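    The record's central claim, that multivariate analysis can discriminate similar formulations without first stripping the continuum background, can be illustrated on synthetic data. The sketch below is entirely simulated (not the authors' dataset or pipeline): spectra are built as a varying continuum plus characteristic emission lines, and PCA followed by nearest-centroid classification is applied directly to the raw spectra:

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(0.0, 1.0, 200)            # normalized wavelength axis

def line(center, width=0.005):
    """Narrow Gaussian emission line."""
    return np.exp(-(wl - center) ** 2 / (2 * width ** 2))

def spectrum(cls):
    """Continuum (random amplitude) + class-dependent line heights + noise."""
    continuum = rng.uniform(0.8, 1.2) * 5.0 * np.exp(-(wl - 0.5) ** 2 / 0.05)
    h1, h2 = (2.0, 1.0) if cls == 0 else (1.0, 2.0)
    return continuum + h1 * line(0.3) + h2 * line(0.7) \
        + rng.normal(0.0, 0.05, wl.size)

X_train = np.array([spectrum(c) for c in [0] * 20 + [1] * 20])
y_train = np.array([0] * 20 + [1] * 20)
X_test = np.array([spectrum(c) for c in [0] * 20 + [1] * 20])
y_test = y_train.copy()

# PCA via SVD on mean-centred training spectra -- continuum retained
mu = X_train.mean(axis=0)
U, S, Vt = np.linalg.svd(X_train - mu, full_matrices=False)
P = Vt[:3].T                               # first 3 principal directions
Z_train, Z_test = (X_train - mu) @ P, (X_test - mu) @ P

# nearest-centroid classification in the reduced space
centroids = np.array([Z_train[y_train == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z_test[:, None, :] - centroids) ** 2).sum(-1), axis=1)
accuracy = (pred == y_test).mean()
```

    The random continuum amplitude dominates the leading principal component, yet the nearest-centroid rule still separates the two classes, mirroring the abstract's observation that prior continuum removal is not required for accurate discrimination.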

  4. Friction and lubrication modelling in sheet metal forming: Influence of lubrication amount, tool roughness and sheet coating on product quality

    NASA Astrophysics Data System (ADS)

    Hol, J.; Wiebenga, J. H.; Carleer, B.

    2017-09-01

    In the stamping of automotive parts, friction and lubrication play a key role in achieving high quality products. In the development process of new automotive parts, it is therefore crucial to accurately account for these effects in sheet metal forming simulations. This paper presents a selection of results considering friction and lubrication modelling in sheet metal forming simulations of a front fender product. For varying lubrication conditions, the front fender can show either wrinkling or fractures. The front fender is modelled using different lubrication amounts, tool roughnesses and sheet coatings to show the strong influence of friction on both part quality and overall production stability. For this purpose, the TriboForm software is used in combination with the AutoForm software. The results demonstrate that the TriboForm software enables the simulation of friction behaviour for varying lubrication conditions, resulting in a generally applicable approach for friction characterization under industrial sheet metal forming process conditions.

  6. Partial Discharge Spectral Characterization in HF, VHF and UHF Bands Using Particle Swarm Optimization.

    PubMed

    Robles, Guillermo; Fresno, José Manuel; Martínez-Tarifa, Juan Manuel; Ardila-Rey, Jorge Alfredo; Parrado-Hernández, Emilio

    2018-03-01

    The measurement of partial discharge (PD) signals in the radio frequency (RF) range has gained popularity among utilities and specialized monitoring companies in recent years. Unfortunately, on most occasions the data are hidden by noise and coupled interferences that hinder their interpretation and render them useless, especially in acquisition systems in the ultra high frequency (UHF) band, where the signals of interest are weak. This paper is focused on a method that uses a selective spectral signal characterization to feature each signal, type of partial discharge, or interference/noise, with the power contained in the most representative frequency bands. The technique can be considered as a dimensionality reduction problem, where all the energy information contained in the frequency components is condensed in a reduced number of UHF or high frequency (HF) and very high frequency (VHF) bands. In general, dimensionality reduction methods make the interpretation of results a difficult task because the inherent physical nature of the signal is lost in the process. The proposed selective spectral characterization is a preprocessing tool that facilitates further main processing. The starting point is a clustering of signals that could form the core of a PD monitoring system. Therefore, the dimensionality reduction technique should discover the frequency bands that best enhance the affinity between signals in the same cluster and the differences between signals in different clusters. This is done by maximizing the minimum Mahalanobis distance between clusters using particle swarm optimization (PSO). The tool is tested with three sets of experimental signals to demonstrate its capabilities in separating noise and PDs with low signal-to-noise ratio and in separating different types of partial discharges measured in the UHF and HF/VHF bands.
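    The band-selection step described above can be sketched as a toy optimization: particles encode the edges of two frequency bands, each signal is reduced to its power in those bands, and the swarm maximizes the minimum Mahalanobis distance between signal clusters. This is a simplified illustration of the idea on synthetic clusters, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
N_BINS = 64                                # bins of each power spectrum

def make_cluster(center, n=30):
    """Synthetic power spectra: energy peaked around one bin plus noise."""
    bins = np.arange(N_BINS)
    return np.exp(-(bins - center) ** 2 / 18.0) \
        + rng.normal(0.0, 0.05, (n, N_BINS))

spectra = np.vstack([make_cluster(c) for c in (10, 30, 50)])
labels = np.repeat([0, 1, 2], 30)

def band_powers(x, edges):
    """Reduce each spectrum to its power in two bands [a, b] and [c, d]."""
    a, b, c, d = np.clip(np.sort(edges).astype(int), 0, N_BINS - 1)
    return np.stack([x[:, a:b + 1].sum(1), x[:, c:d + 1].sum(1)], axis=1)

def fitness(edges):
    """Minimum pairwise Mahalanobis distance between cluster means."""
    F = band_powers(spectra, edges)
    icov = np.linalg.inv(np.cov(F.T) + 1e-6 * np.eye(2))  # ridge-stabilized
    m = [F[labels == k].mean(0) for k in range(3)]
    return min(np.sqrt((m[i] - m[j]) @ icov @ (m[i] - m[j]))
               for i in range(3) for j in range(i + 1, 3))

# a minimal particle swarm over the four band edges
n_part = 20
pos = rng.uniform(0, N_BINS - 1, (n_part, 4))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()
for _ in range(40):
    r1, r2 = rng.random((2, n_part, 4))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, N_BINS - 1)
    vals = np.array([fitness(p) for p in pos])
    better = vals > pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmax()].copy()

best_separation = pbest_val.max()
```

    The swarm converges on bands that isolate the clusters' characteristic spectral regions; with real PD data the same fitness drives the choice of the HF/VHF/UHF bands used as features.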

  7. Interface thermal conductance characterization by infrared thermography: A tool for the study of insertions in bronze ancient Statuary

    NASA Astrophysics Data System (ADS)

    Mercuri, F.; Caruso, G.; Orazi, N.; Zammit, U.; Cicero, C.; Colacicchi Alessandri, O.; Ferretti, M.; Paoloni, S.

    2018-05-01

    In this paper, a new method based on infrared thermography is proposed for the characterization of repairs and inserted parts on ancient bronzes. In particular, the quality of the contact between different kinds of insertions and the main body of bronze statues is investigated by analysing the heat conduction process occurring across the interface between them. The thermographic results have been used to establish the nature of these inserted elements and the way they were coupled to the main body of the statue during and after the manufacturing process. A heat conduction model based on the finite element method has been applied to compare the obtained results with theoretical predictions. Measurements were first carried out on test samples and then in the field on the Boxer at Rest (Museo Nazionale Romano, Rome), a masterpiece of Greek statuary that contains a large variety of inserted items and repairs typical of the manufacturing process of bronze artefacts in general.
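    The interface heat-conduction analysis described above can be illustrated with a one-dimensional explicit finite-difference sketch (the paper uses finite elements; this simplified model and all material values are illustrative, not the authors'). A flash-heated front slab conducts into a second slab through a finite contact conductance h_c, and poorer contact leaves the observed front face hotter:

```python
import numpy as np

k, rho, cp = 60.0, 8700.0, 380.0           # illustrative bronze-like values
alpha = k / (rho * cp)                     # thermal diffusivity (m^2/s)
n, L = 40, 0.01                            # nodes per slab, slab thickness (m)
dx = L / n
dt = 0.4 * dx ** 2 / alpha                 # stable explicit time step
r = alpha * dt / dx ** 2                   # Fourier number per step (= 0.4)

def front_face_temp(h_c, steps=3000):
    """Front-face temperature of flash-heated slab A coupled to cold slab B
    through an interface of contact conductance h_c (W/m^2/K)."""
    TA, TB = np.full(n, 100.0), np.zeros(n)
    for _ in range(steps):
        q = h_c * (TA[-1] - TB[0])         # flux across imperfect interface
        lapA, lapB = np.zeros(n), np.zeros(n)
        lapA[1:-1] = TA[2:] - 2 * TA[1:-1] + TA[:-2]
        lapB[1:-1] = TB[2:] - 2 * TB[1:-1] + TB[:-2]
        lapA[0] = 2 * (TA[1] - TA[0])      # insulated (mirrored) outer faces
        lapB[-1] = 2 * (TB[-2] - TB[-1])
        lapA[-1] = 2 * (TA[-2] - TA[-1])   # interface nodes: mirror, then
        lapB[0] = 2 * (TB[1] - TB[0])      # exchange the contact flux q
        newA, newB = TA + r * lapA, TB + r * lapB
        newA[-1] -= q * dt / (rho * cp * dx)
        newB[0] += q * dt / (rho * cp * dx)
        TA, TB = newA, newB
    return TA[0]

poor, good = front_face_temp(1e2), front_face_temp(1e4)
```

    After identical heating and elapsed time, the poor-contact case stays warmer at the front face, which is exactly the contrast an infrared camera registers over a badly coupled insert or repair.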

  8. Watching intracellular lipolysis in mycobacteria using time lapse fluorescence microscopy.

    PubMed

    Dhouib, Rabeb; Ducret, Adrien; Hubert, Pierre; Carrière, Frédéric; Dukan, Sam; Canaan, Stéphane

    2011-04-01

    The fact that Mycobacterium tuberculosis mobilizes lipid bodies (LB) located in the cytosol during the infection process has been proposed for decades. However, the mechanisms and dynamics of mobilization of these lipid droplets within mycobacteria are still not completely characterized. Evidence in favour of this characterization was obtained here using a combined fluorescence microscopy and computational image processing approach. The decrease in lipid storage levels observed under nutrient depletion conditions was correlated with a significant increase in the size of the bacteria. LB fragmentation/condensation cycles were monitored in real time. The exact contribution of lipases to this process was confirmed using the lipase inhibitor tetrahydrolipstatin, which was found to prevent LB degradation and to limit bacterial cell growth. The method presented here provides a powerful tool for monitoring in vivo lipolysis in mycobacteria and for obtaining new insights into the growth of cells and their entry into the dormant or reactivation phase. It should be particularly useful for studying the effects of chemical inhibitors and activators on cells, as well as for investigating other metabolic pathways. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. High-Fidelity Microstructural Characterization and Performance Modeling of Aluminized Composite Propellant

    DOE PAGES

    Kosiba, Graham D.; Wixom, Ryan R.; Oehlschlaeger, Matthew A.

    2017-10-27

    Image processing and stereological techniques were used to characterize the heterogeneity of composite propellant and inform a predictive burn rate model. Composite propellant samples made up of ammonium perchlorate (AP), hydroxyl-terminated polybutadiene (HTPB), and aluminum (Al) were faced with an ion mill and imaged with a scanning electron microscope (SEM) and x-ray tomography (micro-CT). Properties of both the bulk and individual components of the composite propellant were determined from a variety of image processing tools. An algebraic model, based on the improved Beckstead-Derr-Price model developed by Cohen and Strand, was used to predict the steady-state burning of the aluminized composite propellant. The presented model introduces the presence of aluminum particles within the propellant: their thermal effects are accounted for at the solid-gas propellant surface interface, and aluminum combustion is considered in the gas phase using a single global reaction. In conclusion, properties derived from image processing were used directly as model inputs, leading to a sample-specific predictive combustion model.

  11. Coal liquefaction process streams characterization and evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, G.; Davis, A.; Burke, F.P.

    1991-12-01

    This study demonstrated the use of the gold tube carbonization technique and reflectance microscopy analysis for the examination of process-derived materials from direct coal liquefaction. The carbonization technique, which was applied to coal liquefaction distillation resids, yields information on the amounts of gas plus distillate, pyridine-soluble resid, and pyridine-insoluble material formed when a coal liquid sample is heated to 450 °C for one hour at 5000 psi in an inert atmosphere. The pyridine-insolubles are then examined by reflectance microscopy to determine the type, amount, and optical texture of isotropic and anisotropic carbon formed upon carbonization. Further development of these analytical methods as process development tools may be justified on the basis of these results.

  12. Gustatory processing and taste memory in Drosophila

    PubMed Central

    Masek, Pavel; Keene, Alex C.

    2018-01-01

    Taste allows animals to discriminate the value and potential toxicity of food prior to ingestion. Many tastants elicit an innate attractive or avoidance response that is modifiable with nutritional state and prior experience. A powerful genetic toolkit, a well-characterized gustatory system, and standardized behavioral assays make the fruit fly, Drosophila melanogaster, an excellent system for investigating taste processing and memory. Recent studies have used this system to identify the neural basis for acquired taste preference. These studies have revealed a role for dopamine-mediated plasticity of the mushroom bodies that modulates the threshold of response to appetitive tastants. The identification of neural circuitry regulating taste memory provides a system to study the genetic and physiological processes that govern plasticity within a defined memory circuit. PMID:27328844

  13. Laser cleaning of steel for paint removal

    NASA Astrophysics Data System (ADS)

    Chen, G. X.; Kwee, T. J.; Tan, K. P.; Choo, Y. S.; Hong, M. H.

    2010-11-01

    Paint removal is an important part of steel processing for marine and offshore engineering. Blasting techniques have long been widely used for this surface preparation purpose, but conventional blasting has intrinsic problems, such as noise, explosion risk, contaminant particles, vibration, and dust. In addition, processing wastes often cause environmental problems. In recent years, laser cleaning has attracted much research effort for its significant advantages, such as precise treatment and high selectivity and flexibility, in comparison with conventional cleaning techniques. In the present study, we use this environmentally friendly technique to overcome the problems of conventional blasting. Processed samples are examined with optical microscopes and other surface characterization tools. Experimental results show that laser cleaning can be a good alternative to conventional blasting.

  14. Investigation of hydrogenation of toluene to methylcyclohexane in a trickle bed reactor by low-field nuclear magnetic resonance spectroscopy.

    PubMed

    Guthausen, Gisela; von Garnier, Agnes; Reimert, Rainer

    2009-10-01

    Low-field nuclear magnetic resonance (NMR) spectroscopy is applied to study the hydrogenation of toluene in a lab-scale reactor. A conventional benchtop NMR system was modified to achieve chemical shift resolution. After an off-line validity check of the approach, the reaction product is analyzed on-line during the process, applying chemometric data processing. The conversion of toluene to methylcyclohexane is compared with off-line gas chromatographic analysis. Both classic analytical and chemometric data processing were applied. As the results, which are obtained within a few tens of seconds, are equivalent within the experimental accuracy of both methods, low-field NMR spectroscopy was shown to provide an analytical tool for reaction characterization and immediate feedback.
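    The abstract does not specify which chemometric method was used. One common choice for mapping mixture spectra to component concentrations is classical least-squares (CLS) unmixing against pure-component reference spectra. The sketch below is a minimal illustration of that idea, not the authors' procedure; the Gaussian "resonances" for toluene and methylcyclohexane, and their positions and widths, are entirely synthetic assumptions:

    ```python
    import math

    def cls_unmix(spectrum, pure_specs):
        # Classical least squares for two components: solve the normal
        # equations (S^T S) c = S^T y, where the columns of S are the
        # pure-component spectra and y is the measured mixture spectrum.
        s1, s2 = pure_specs
        a = sum(x * x for x in s1)
        b = sum(x * y for x, y in zip(s1, s2))
        d = sum(y * y for y in s2)
        r1 = sum(x * y for x, y in zip(s1, spectrum))
        r2 = sum(x * y for x, y in zip(s2, spectrum))
        det = a * d - b * b
        c1 = (d * r1 - b * r2) / det
        c2 = (a * r2 - b * r1) / det
        return c1, c2

    def peak(center, width, n=100):
        # Synthetic Gaussian line shape standing in for an NMR resonance.
        return [math.exp(-((i - center) / width) ** 2) for i in range(n)]

    # Hypothetical reference spectra (channel positions are illustrative only).
    toluene = peak(30, 5)  # stand-in aromatic resonance
    mch = peak(70, 5)      # stand-in aliphatic resonance

    # Simulated reaction mixture at 75 % conversion.
    mix = [0.25 * t + 0.75 * m for t, m in zip(toluene, mch)]
    c_tol, c_mch = cls_unmix(mix, (toluene, mch))
    conversion = c_mch / (c_tol + c_mch)  # -> 0.75 for this noise-free mixture
    ```

    With overlapping peaks and instrument noise, more robust chemometric tools (e.g., partial least squares calibrated against the gas chromatographic reference data) would typically replace this direct inversion.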

  15. Polymeric Packaging for Fully Implantable Wireless Neural Microsensors

    PubMed Central

    Aceros, Juan; Yin, Ming; Borton, David A.; Patterson, William R.; Bull, Christopher; Nurmikko, Arto V.

    2014-01-01

    We present polymeric packaging methods used for subcutaneous, fully implantable, broadband, and wireless neurosensors. A new tool for accelerated testing and characterization of biocompatible polymeric packaging materials and processes is described along with specialized test units to simulate our fully implantable neurosensor components, materials and fabrication processes. A brief description of the implantable systems is presented along with their current encapsulation methods based on polydimethylsiloxane (PDMS). Results from in-vivo testing of multiple implanted neurosensors in swine and non-human primates are presented. Finally, a novel augmenting polymer thin film material to complement the currently employed PDMS is introduced. This thin layer coating material is based on the Plasma Enhanced Chemical Vapor Deposition (PECVD) process of Hexamethyldisiloxane (HMDSO) and Oxygen (O2). PMID:23365999

  16. Eddy current characterization of magnetic treatment of materials

    NASA Technical Reports Server (NTRS)

    Chern, E. James

    1992-01-01

    Eddy current impedance measuring methods have been applied to study the effect of magnetic treatment on material service life extension. Eddy current impedance measurements have been performed on Nickel 200 specimens that have been subjected to several mechanical and magnetic engineering processes: annealing, applied strain, magnetic field, shot peening, and magnetic field after peening. Experimental results have demonstrated a functional relationship between coil impedance, resistance and reactance, and specimens subjected to various engineering processes. The results show that magnetic treatment does induce changes in a material's electromagnetic properties and does exhibit evidence of stress relief. However, further fundamental studies are necessary for a thorough understanding of the exact mechanism of the magnetic-field processing effect on machine tool service life.

  17. Quantitative phase imaging characterization of tumor-associated blood vessel formation on a chip

    NASA Astrophysics Data System (ADS)

    Guo, Peng; Huang, Jing; Moses, Marsha A.

    2018-02-01

    Angiogenesis, the formation of new blood vessels from existing ones, is a biological process that has an essential role in solid tumor growth, development, and progression. Recent advances in Lab-on-a-Chip technology have created an opportunity for scientists to observe endothelial cell (EC) behaviors during the dynamic process of angiogenesis using a simple and economical in vitro platform that recapitulates in vivo blood vessel formation. Here, we use quantitative phase imaging (QPI) microscopy to continuously and non-invasively characterize the dynamic process of tumor cell-induced angiogenic sprout formation on a microfluidic chip. The live tumor cell-induced angiogenic sprouts are generated by multicellular endothelial sprouting into three-dimensional (3D) Matrigel using human umbilical vein endothelial cells (HUVECs). By using QPI, we quantitatively measure a panel of cellular morphological and behavioral parameters of each individual EC participating in this sprouting. In this proof-of-principle study, we demonstrate that QPI is a powerful tool that can provide real-time quantitative analysis of biological processes in in vitro 3D biomimetic devices, which, in turn, can improve our understanding of the biology underlying functional tissue engineering.

  18. New insights in morphological analysis for managing activated sludge systems.

    PubMed

    Oliveira, Pedro; Alliet, Marion; Coufort-Saudejaud, Carole; Frances, Christine

    2018-06-01

    In the activated sludge (AS) process, the impact of the operational parameters on process efficiency is assumed to be correlated with the sludge properties. This study provides a better insight into these interactions by subjecting a laboratory-scale AS system to a sequence of operating condition modifications enabling typical situations of a wastewater treatment plant to be represented. Process performance was assessed and AS floc morphology (size, circularity, convexity, solidity and aspect ratio) was quantified by measuring 100,000 flocs per sample with an automated image analysis technique. Introducing 3D distributions, which combine morphological properties, allowed the identification of a filamentous bulking characterized by a floc population shift towards larger sizes and lower solidity and circularity values. Moreover, a washout phenomenon was characterized by smaller AS flocs and an increase in their solidity. A recycle ratio increase and a COD:N ratio decrease both promoted a slight reduction of floc sizes and a constant evolution of circularity and convexity values. The analysis of the volume-based 3D distributions proved to be an effective tool for combining size and shape data, allowing a deeper understanding of the dynamics of floc structure under process disturbances.
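    The shape descriptors named in this abstract have standard geometric definitions: circularity is 4πA/P², solidity is the ratio of the outline area to its convex hull area, and convexity is the ratio of the convex hull perimeter to the outline perimeter. As an illustration only (the authors used an automated image analyzer on full images, not this code), a minimal pure-Python sketch computing these descriptors for a floc outline represented as a polygon; the polygon input format is a hypothetical simplification:

    ```python
    import math

    def shoelace_area(pts):
        # Polygon area via the shoelace formula (vertices in boundary order).
        n = len(pts)
        s = 0.0
        for i in range(n):
            x1, y1 = pts[i]
            x2, y2 = pts[(i + 1) % n]
            s += x1 * y2 - x2 * y1
        return abs(s) / 2.0

    def perimeter(pts):
        n = len(pts)
        return sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

    def convex_hull(pts):
        # Andrew's monotone chain algorithm.
        pts = sorted(set(pts))
        if len(pts) <= 2:
            return pts
        def half(points):
            h = []
            for p in points:
                while len(h) >= 2 and (
                    (h[-1][0] - h[-2][0]) * (p[1] - h[-2][1])
                    - (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])
                ) <= 0:
                    h.pop()
                h.append(p)
            return h
        lower = half(pts)
        upper = half(pts[::-1])
        return lower[:-1] + upper[:-1]

    def descriptors(outline):
        a = shoelace_area(outline)
        p = perimeter(outline)
        hull = convex_hull(outline)
        xs = [q[0] for q in outline]
        ys = [q[1] for q in outline]
        w, h = max(xs) - min(xs), max(ys) - min(ys)
        return {
            "circularity": 4 * math.pi * a / p ** 2,       # 1.0 for a circle
            "solidity": a / shoelace_area(hull),           # fill of convex hull
            "convexity": perimeter(hull) / p,              # boundary smoothness
            "aspect_ratio": max(w, h) / min(w, h),         # bounding-box elongation
        }

    square = [(0, 0), (10, 0), (10, 10), (0, 10)]
    d = descriptors(square)  # circularity = pi/4 ~ 0.785; the others are 1.0
    ```

    A filamentous bulking event, as described above, would show up in such descriptors as larger sizes with lower solidity and circularity.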

  19. An overview of the model integration process: From pre ...

    EPA Pesticide Factsheets

    Integration of models requires linking models which can be developed using different tools, methodologies, and assumptions. We performed a literature review with the aim of improving our understanding of the model integration process, and also presenting better strategies for building integrated modeling systems. We identified five different phases to characterize the integration process: pre-integration assessment, preparation of models for integration, orchestration of models during simulation, data interoperability, and testing. Commonly, there is little reuse of existing frameworks beyond the development teams and not much sharing of science components across frameworks. We believe this must change to enable researchers and assessors to form complex workflows that leverage the current environmental science available. In this paper, we characterize the model integration process and compare integration practices of different groups. We highlight key strategies, features, standards, and practices that can be employed by developers to increase reuse and interoperability of science software components and systems. The paper provides a review of the literature regarding techniques and methods employed by various modeling system developers to facilitate science software interoperability. The intent of the paper is to illustrate the wide variation in methods and the limiting effect the variation has on inter-framework reuse and interoperability. A series of recommendations is also provided.

  20. Paths from meso to submesoscale processes in the western Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Capó, Esther; Mason, Evan; Hernández-Carrasco, Ismael; Orfila, Alejandro

    2017-04-01

    In this work we characterize the mesoscale dynamics in the western Mediterranean (WMed) by analyzing the different contributions to the kinetic energy budgets using a 20 year high-resolution numerical model. The length of the numerical solution allows us to consider a statistically stationary state of the ocean, a necessary condition for using the quantification of energy budgets as a tool for analyzing dynamical processes. To identify and characterize the different submesoscale processes, we isolate the terms in the energy balance equations (the Lorenz Energy Cycle, LEC, equations) responsible for the production (conversion and generation) of the eddy kinetic energy (EKE). Firstly, by comparing the predominance of each conversion term over the others, three different submesoscale instabilities can be identified in a given region: baroclinic, barotropic, and Kelvin-Helmholtz type. In addition, given the crucial role of wind forcing in the dynamics of this area, the generation of kinetic energy by surface winds has also been considered. Finally, a regional analysis of the EKE production terms permits the identification of the areas dominated by submesoscale activity. As shown in this work, those areas are located near the main currents, and submesoscale processes are strongly influenced by sharp bathymetry-flow interaction.
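    For reference, the EKE production terms of the Lorenz Energy Cycle that distinguish these instability types are commonly written as follows in the oceanographic literature. These are generic textbook forms; the exact definitions, signs, and normalizations used by the authors may differ:

    ```latex
    % Barotropic conversion (mean KE -> EKE via horizontal shear production)
    \mathrm{BT} = -\rho_0 \left(
        \overline{u'u'}\,\frac{\partial \bar{u}}{\partial x}
      + \overline{u'v'}\,\frac{\partial \bar{u}}{\partial y}
      + \overline{u'v'}\,\frac{\partial \bar{v}}{\partial x}
      + \overline{v'v'}\,\frac{\partial \bar{v}}{\partial y} \right)

    % Vertical-shear (Kelvin-Helmholtz type) production
    \mathrm{KH} = -\rho_0 \left(
        \overline{u'w'}\,\frac{\partial \bar{u}}{\partial z}
      + \overline{v'w'}\,\frac{\partial \bar{v}}{\partial z} \right)

    % Baroclinic conversion (eddy PE -> EKE via the eddy buoyancy flux)
    \mathrm{BC} = -g\,\overline{\rho' w'}

    % Generation of EKE by the fluctuating surface wind stress (eddy wind work)
    \mathrm{F_eK_e} = \overline{\boldsymbol{\tau}' \cdot \mathbf{u}_s'}
    ```

    Here overbars denote the time mean, primes the deviations from it, and the dominant term in a given region indicates which instability type prevails there.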

  1. Clinical microbiology informatics.

    PubMed

    Rhoads, Daniel D; Sintchenko, Vitali; Rauch, Carol A; Pantanowitz, Liron

    2014-10-01

    The clinical microbiology laboratory has responsibilities ranging from characterizing the causative agent in a patient's infection to helping detect global disease outbreaks. All of these processes are increasingly becoming partnered more intimately with informatics. Effective application of informatics tools can increase the accuracy, timeliness, and completeness of microbiology testing while decreasing the laboratory workload, which can lead to optimized laboratory workflow and decreased costs. Informatics is poised to be increasingly relevant in clinical microbiology, with the advent of total laboratory automation, complex instrument interfaces, electronic health records, clinical decision support tools, and the clinical implementation of microbial genome sequencing. This review discusses the diverse informatics aspects that are relevant to the clinical microbiology laboratory, including the following: the microbiology laboratory information system, decision support tools, expert systems, instrument interfaces, total laboratory automation, telemicrobiology, automated image analysis, nucleic acid sequence databases, electronic reporting of infectious agents to public health agencies, and disease outbreak surveillance. The breadth and utility of informatics tools used in clinical microbiology have made them indispensable to contemporary clinical and laboratory practice. Continued advances in technology and development of these informatics tools will further improve patient and public health care in the future. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  2. Clinical Microbiology Informatics

    PubMed Central

    Sintchenko, Vitali; Rauch, Carol A.; Pantanowitz, Liron

    2014-01-01

    SUMMARY The clinical microbiology laboratory has responsibilities ranging from characterizing the causative agent in a patient's infection to helping detect global disease outbreaks. All of these processes are increasingly becoming partnered more intimately with informatics. Effective application of informatics tools can increase the accuracy, timeliness, and completeness of microbiology testing while decreasing the laboratory workload, which can lead to optimized laboratory workflow and decreased costs. Informatics is poised to be increasingly relevant in clinical microbiology, with the advent of total laboratory automation, complex instrument interfaces, electronic health records, clinical decision support tools, and the clinical implementation of microbial genome sequencing. This review discusses the diverse informatics aspects that are relevant to the clinical microbiology laboratory, including the following: the microbiology laboratory information system, decision support tools, expert systems, instrument interfaces, total laboratory automation, telemicrobiology, automated image analysis, nucleic acid sequence databases, electronic reporting of infectious agents to public health agencies, and disease outbreak surveillance. The breadth and utility of informatics tools used in clinical microbiology have made them indispensable to contemporary clinical and laboratory practice. Continued advances in technology and development of these informatics tools will further improve patient and public health care in the future. PMID:25278581

  3. A backing device based on an embedded stiffener and retractable insertion tool for thin-film cochlear arrays

    NASA Astrophysics Data System (ADS)

    Tewari, Radheshyam

    Intracochlear trauma from surgical insertion of bulky electrode arrays and inadequate pitch perception are areas of concern with current hand-assembled commercial cochlear implants. Parylene thin-film arrays with higher electrode densities and lower profiles are a potential solution, but lack rigidity and hence depend on manually fabricated permanently attached polyethylene terephthalate (PET) tubing based bulky backing devices. As a solution, we investigated a new backing device with two sub-systems. The first sub-system is a thin poly(lactic acid) (PLA) stiffener that will be embedded in the parylene array. The second sub-system is an attaching and detaching mechanism, utilizing a poly(N-vinylpyrrolidone)-block-poly(d,l-lactide) (PVP-b-PDLLA) copolymer-based biodegradable and water soluble adhesive, that will help to retract the PET insertion tool after implantation. As a proof-of-concept of sub-system one, a microfabrication process for patterning PLA stiffeners embedded in parylene has been developed. Conventional hot-embossing, mechanical micromachining, and standard cleanroom processes were integrated for patterning fully released and discrete stiffeners coated with parylene. The released embedded stiffeners were thermoformed to demonstrate that imparting perimodiolar shapes to stiffener-embedded arrays will be possible. The developed process when integrated with the array fabrication process will allow fabrication of stiffener-embedded arrays in a single process. As a proof-of-concept of sub-system two, the feasibility of the attaching and detaching mechanism was demonstrated by adhering 1x and 1.5x scale PET tube-based insertion tools and PLA stiffeners embedded in parylene using the copolymer adhesive. The attached devices survived qualitative adhesion tests, thermoforming, and flexing. The viability of the detaching mechanism was tested by aging the assemblies in-vitro in phosphate buffer solution. 
The average detachment times, 2.6 minutes and 10 minutes for 1x and 1.5x scale devices respectively, were found to be clinically relevant with respect to the reported array insertion times during surgical implantation. Eventually, the stiffener-embedded arrays would not need to be permanently attached to current insertion tools which are left behind after implantation and congest the cochlear scala tympani chamber. Finally, a simulation-based approach for accelerated failure analysis of PLA stiffeners and characterization of PVP-b-PDLLA copolymer adhesive has been explored. The residual functional life of embedded PLA stiffeners exposed to body-fluid and thereby subjected to degradation and erosion has been estimated by simulating PLA stiffeners with different parylene coating failure types and different PLA types for a given parylene coating failure type. For characterizing the PVP-b-PDLLA copolymer adhesive, several formulations of the copolymer adhesive were simulated and compared based on the insertion tool detachment times that were predicted from the dissolution, degradation, and erosion behavior of the simulated adhesive formulations. Results indicate that the simulation-based approaches could be used to reduce the total number of time consuming and expensive in-vitro tests that must be conducted.

  4. Radioactive Waste Characterization Strategies; Comparisons Between AK/PK, Dose to Curie Modeling, Gamma Spectroscopy, and Laboratory Analysis Methods- 12194

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singledecker, Steven J.; Jones, Scotty W.; Dorries, Alison M.

    2012-07-01

    In the coming fiscal years of potentially declining budgets, Department of Energy facilities such as the Los Alamos National Laboratory (LANL) will be looking to reduce the cost of radioactive waste characterization, management, and disposal processes. At the core of this cost reduction process will be choosing the most cost-effective, efficient, and accurate methods of radioactive waste characterization. Central to every radioactive waste management program is an effective and accurate waste characterization program. Choosing between methods can determine what is classified as low level radioactive waste (LLRW), transuranic waste (TRU), waste that can be disposed of under an Authorized Release Limit (ARL), industrial waste, and waste that can be disposed of in municipal landfills. The cost benefits of an accurate radioactive waste characterization program cannot be overstated. In addition, inaccurate characterization can result in the incorrect classification of radioactive waste, leading to higher disposal costs, Department of Transportation (DOT) violations, Notices of Violation (NOVs) from Federal and State regulatory agencies, waste rejection from disposal facilities, loss of operational capabilities, and loss of disposal options. Any one of these events could result in the program that mischaracterized the waste losing its ability to perform its primary operational mission. Generators that produce radioactive waste have four characterization strategies at their disposal: - Acceptable Knowledge/Process Knowledge (AK/PK); - Indirect characterization using a software application or other dose-to-curie methodologies; - Non-Destructive Analysis (NDA) tools such as gamma spectroscopy; - Direct sampling (e.g., grab samples or Surface Contaminated Object smears) and laboratory analysis. Each method has specific advantages and disadvantages. 
This paper evaluates each method, detailing those advantages and disadvantages, including: - Cost benefit analysis (basic materials costs, overall program operations costs, man-hours per sample analyzed, etc.); - Radiation exposure As Low As Reasonably Achievable (ALARA) program considerations; - Industrial health and safety risks; - Overall analytical confidence level. The concepts in this paper apply to any organization with significant radioactive waste characterization and management activities working within budget constraints and seeking to optimize its waste characterization strategies while reducing analytical costs. (authors)

  5. Mechanistic characterization of chloride interferences in electrothermal atomization systems

    USGS Publications Warehouse

    Shekiro, J.M.; Skogerboe, R.K.; Taylor, Howard E.

    1988-01-01

    A computer-controlled spectrometer with a photodiode array detector has been used for wavelength and temperature resolved characterization of the vapor produced by an electrothermal atomizer. The system has been used to study the chloride matrix interference on the atomic absorption spectrometric determination of manganese and copper. The suppression of manganese and copper atom populations by matrix chlorides such as those of calcium and magnesium is due to the gas-phase formation of an analyte chloride species followed by the diffusion of significant fractions of these species from the atom cell prior to completion of the atomization process. The analyte chloride species cannot be formed when matrix chlorides with metal-chloride bond dissociation energies above those of the analyte chlorides are the principal entities present. The results indicate that multiple wavelength spectrometry used to obtain temperature-resolved spectra is a viable tool in the mechanistic characterization of interference effects observed with electrothermal atomization systems. © 1988 American Chemical Society.

  6. Qualitative carbonyl profile in coffee beans through GDME-HPLC-DAD-MS/MS for coffee preliminary characterization.

    PubMed

    Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A

    2018-05-01

    In this work, an analytical methodology for the characterization of volatile carbonyl compounds in green and roasted coffee beans was developed. The methodology relied on a recent and simple sample preparation technique, gas diffusion microextraction, for extraction of the samples' volatiles, followed by HPLC-DAD-MS/MS analysis. The experimental conditions in terms of extraction temperature and extraction time were studied. A profile for carbonyl compounds was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of the reported literature, in relation to different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human health implications, post-harvest coffee processing, and roasting. The applied methodology proved to be a powerful analytical tool for coffee characterization, as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Space-Time Characterization of Laser Plasma Interactions in the Warm Dense Matter Regime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cao, L F; Uschmann, I; Forster, E

    2008-04-30

    Laser plasma interaction experiments have been performed using a fs Titanium Sapphire laser. Plasmas have been generated from planar PMMA targets using single laser pulses with 3.3 mJ pulse energy, 50 fs pulse duration at 800 nm wavelength. The electron density distributions of the plasmas at different delay times have been characterized by means of Nomarski interferometry. Experimental data were compared with hydrodynamic simulation. First results characterizing the plasma density and temperature as a function of space and time are obtained. This work aims to generate plasmas in the warm dense matter (WDM) regime at near solid-density in an ultra-fast laser-target interaction process. Plasmas under these conditions can serve as targets to develop x-ray Thomson scattering as a plasma diagnostic tool, e.g., using the VUV free-electron laser (FLASH) at DESY Hamburg.

  8. Structural Characterization of the Low-Molecular-Weight Heparin Dalteparin by Combining Different Analytical Strategies.

    PubMed

    Bisio, Antonella; Urso, Elena; Guerrini, Marco; de Wit, Pauline; Torri, Giangiacomo; Naggi, Annamaria

    2017-06-24

    A number of low molecular weight heparin (LMWH) products are available for clinical use, and although all share a similar mechanism of action, they are classified as distinct drugs because the different depolymerisation processes of the native heparin result in substantial pharmacokinetic and pharmacodynamic differences. While enoxaparin has been extensively investigated, little information is available regarding the LMWH dalteparin. The present study is focused on the detailed structural characterization of Fragmin® by LC-MS and NMR applied both to the whole drug and to its enzymatic products. For a more in-depth approach, size-homogeneous octasaccharide and decasaccharide components, together with their fractions endowed with high or no affinity toward antithrombin, were also isolated and their structural profiles characterized. The combination of different analytical strategies described here represents a useful tool for the assessment of batch-to-batch structural variability and for comparative evaluation of structural features of biosimilar products.

  9. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  10. Tribology in secondary wood machining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, P.L.; Hawthorne, H.M.; Andiappan, J.

    Secondary wood manufacturing covers a wide range of products from furniture, cabinets, doors and windows, to musical instruments. Many of these are now mass produced in sophisticated, high speed numerical controlled machines. The performance and the reliability of the tools are key to an efficient and economical manufacturing process as well as to the quality of the finished products. A program concerned with three aspects of the tribology of wood machining, namely, tool wear, tool-wood friction characteristics and wood surface quality characterization, was set up in the Integrated Manufacturing Technologies Institute (IMTI) of the National Research Council of Canada. The studies include friction and wear mechanism identification and modeling, wear performance of surface-engineered tool materials, friction-induced vibration and cutting efficiency, and the influence of wear and friction on finished products. This research program underlines the importance of tribology in secondary wood manufacturing and at the same time adds new challenges to tribology research, since wood is a complex, heterogeneous material and its behavior during machining is highly sensitive to the surrounding environment and to the moisture content in the workpiece.

  11. Analysis and characterization of high-resolution and high-aspect-ratio imaging fiber bundles.

    PubMed

    Motamedi, Nojan; Karbasi, Salman; Ford, Joseph E; Lomakin, Vitaliy

    2015-11-10

    High-contrast imaging fiber bundles (FBs) are characterized and modeled for wide-angle and high-resolution imaging applications. Scanning electron microscope images of FB cross sections are taken to measure physical parameters and verify the variations of irregular fibers due to the fabrication process. Modal analysis tools are developed that include irregularities in the fiber core shapes and provide results in agreement with experimental measurements. The modeling demonstrates that the irregular fibers significantly outperform a perfectly regular "ideal" array. Using this method, FBs are designed that can provide high contrast with core pitches of only a few wavelengths of the guided light. Structural modifications of the commercially available FB can reduce the core pitch by 60% for higher resolution image relay.

  12. Synthesis and characterization of non-hydrolysable diphosphoinositol polyphosphate second messengers.

    PubMed

    Wu, Mingxuan; Dul, Barbara E; Trevisan, Alexandra J; Fiedler, Dorothea

    2013-01-01

    The diphosphoinositol polyphosphates (PP-IPs) are a central group of eukaryotic second messengers. They regulate numerous processes, including cellular energy homeostasis and adaptation to environmental stresses. To date, most of the molecular details in PP-IP signalling have remained elusive, due to a lack of appropriate methods and reagents. Here we describe the expedient synthesis of methylene-bisphosphonate PP-IP analogues. Their characterization revealed that the analogues exhibit significant stability and mimic their natural counterparts very well. This was further confirmed in two independent biochemical assays, in which our analogues potently inhibited phosphorylation of the protein kinase Akt and hydrolytic activity of the Ddp1 phosphohydrolase. The non-hydrolysable PP-IPs thus emerge as important tools and hold great promise for a variety of applications.

  13. Characterizing the Nano and Micro Structure of Concrete to Improve its Durability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monteiro, P.J.M.; Kirchheim, A.P.; Chae, S.

    2009-01-13

    New and advanced methodologies have been developed to characterize the nano and microstructure of cement paste and concrete exposed to aggressive environments. High resolution full-field soft X-ray imaging in the water window is providing new insight on the nano scale of the cement hydration process, which leads to a nano-optimization of cement-based systems. Hard X-ray microtomography images of ice inside cement paste and cracking caused by the alkali-silica reaction (ASR) enable three-dimensional structural identification. The potential of neutron diffraction to determine reactive aggregates by measuring their residual strains and preferred orientation is studied. Results of experiments using these tools are shown in this paper.

  14. Characterizing the nano and micro structure of concrete to improve its durability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monteiro, P.J.M.; Kirchheim, A.P.; Chae, S.

    2008-10-22

    New and advanced methodologies have been developed to characterize the nano- and microstructure of cement paste and concrete exposed to aggressive environments. High-resolution full-field soft X-ray imaging in the water window is providing new insight into the nanoscale of the cement hydration process, which leads to a nano-optimization of cement-based systems. Hard X-ray microtomography imaging of ice inside cement paste and of cracking caused by the alkali-silica reaction (ASR) enables three-dimensional structural identification. The potential of neutron diffraction to determine reactive aggregates by measuring their residual strains and preferred orientation is studied. Results of experiments using these tools will be shown in this paper.

  15. Agricultural use of municipal wastewater treatment plant ...

    EPA Pesticide Factsheets

    Agricultural use of municipal wastewater treatment plant sewage sludge as a source of per- and polyfluoroalkyl substance (PFAS) contamination in the environment. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify and characterize the processes that link source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for the EPA.

  16. Morphological characteristics of polyvinyl chloride (PVC) dechlorination during pyrolysis process: Influence of PVC content and heating rate.

    PubMed

    Cao, Qiongmin; Yuan, Guoan; Yin, Lijie; Chen, Dezhen; He, Pinjing; Wang, Hai

    2016-12-01

    In this research, morphological techniques were used to characterize the dechlorination process of PVC within mixed waste plastics, and two important factors influencing this process, namely the proportion of PVC in the mixed plastics and the heating rate adopted in the pyrolysis process, were investigated. During pyrolysis of the mixed plastics containing PVC, the morphologic characteristics describing PVC dechlorination behavior were obtained with the help of a high-speed infrared camera and image-processing tools. At the same time, the emission of hydrogen chloride (HCl) was detected to determine the start and termination of HCl release. The PVC content in the mixed plastics varied from 0% to 12% by mass, and the heating rate for PVC was varied from 10 to 60°C/min. The morphologic parameters "bubble ratio" (BR) and "pixel area" (PA) were found to have obvious features matching the PVC dechlorination process and can therefore be used to characterize the dechlorination of PVC alone and in the mixed plastics. It was also found that the shape of the HCl emission curve is independent of the PVC proportion in the mixed plastics but shifts to the right with elevated heating rate, all of which can be quantitatively reflected in the morphologic parameter vs. temperature curves. Copyright © 2016 Elsevier Ltd. All rights reserved.
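
    A "bubble ratio" of the kind described above is, at heart, the fraction of a thresholded frame classified as bubble. A minimal sketch, assuming a grayscale frame held in a NumPy array; the threshold value and the synthetic frame are illustrative, not values from the study:

```python
import numpy as np

def bubble_ratio(frame, threshold=128):
    """Fraction of pixels classified as bubble (bright) in a grayscale frame.

    The threshold is an illustrative cutoff, not a value from the study.
    """
    bubbles = frame >= threshold
    return float(bubbles.sum()) / frame.size

# Synthetic 4x4 "infrared" frame: 4 of 16 pixels exceed the threshold.
frame = np.array([[200, 10, 10, 10],
                  [10, 210, 10, 10],
                  [10, 10, 220, 10],
                  [10, 10, 10, 230]])
print(bubble_ratio(frame))  # 0.25
```

    Tracking such a fraction frame by frame against sample temperature would yield the parameter-vs.-temperature curves the authors describe.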

  17. Heart rate variability in normal and pathological sleep.

    PubMed

    Tobaldini, Eleonora; Nobili, Lino; Strada, Silvia; Casali, Karina R; Braghiroli, Alberto; Montano, Nicola

    2013-10-16

    Sleep is a physiological process involving different biological systems, from the molecular to the organ level; its integrity is essential for maintaining health and homeostasis in human beings. Although sleep was long considered a quiescent state, experimental and clinical evidence suggests a noteworthy activation of different biological systems during sleep. A key role is played by the autonomic nervous system (ANS), whose modulation regulates cardiovascular functions during sleep onset and across the different sleep stages. Interest in the evaluation of autonomic cardiovascular control in health and disease is therefore growing, by means of linear and non-linear heart rate variability (HRV) analyses. The application of classical tools for ANS analysis, such as HRV during physiological sleep, showed that the rapid eye movement (REM) stage is characterized by a likely sympathetic predominance associated with vagal withdrawal, while the opposite trend is observed during non-REM sleep. More recently, the use of non-linear tools, such as entropy-derived indices, has provided new insight into cardiac autonomic regulation, revealing for instance changes in cardiovascular complexity during REM sleep and supporting the hypothesis of a reduced capability of the cardiovascular system to deal with stress challenges. Interestingly, different HRV tools have been applied to characterize autonomic cardiac control in different pathological conditions, from neurological sleep disorders to sleep-disordered breathing (SDB). In summary, linear and non-linear analyses of HRV are reliable approaches to assess changes of autonomic cardiac modulation during sleep, both in health and in disease. The use of these tools could provide important information of clinical and prognostic relevance.
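
    As a concrete illustration of the linear HRV tools mentioned above, here is a minimal sketch of two classical time-domain indices, SDNN and RMSSD, computed from RR intervals; the interval values are synthetic and the function names are our own:

```python
import numpy as np

def sdnn(rr_ms):
    """Standard deviation of RR intervals, a classical linear HRV index."""
    return float(np.std(rr_ms, ddof=1))

def rmssd(rr_ms):
    """Root mean square of successive RR differences (vagally mediated HRV)."""
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

rr = np.array([800.0, 810.0, 790.0, 805.0, 795.0])  # synthetic RR intervals (ms)
print(sdnn(rr), rmssd(rr))
```

    The non-linear, entropy-derived indices discussed in the abstract would be computed on the same RR series but quantify its regularity rather than its dispersion.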

  18. A Computable Definition of Sepsis Facilitates Screening and Performance Improvement Tracking.

    PubMed

    Alessi, Lauren J; Warmus, Holly R; Schaffner, Erin K; Kantawala, Sajel; Carcillo, Joseph; Rosen, Johanna; Horvat, Christopher M

    2018-03-01

    Sepsis kills almost 5,000 children annually, accounting for 16% of pediatric health care spending in the United States. We sought to identify sepsis within the electronic health record (EHR) of a quaternary children's hospital to characterize disease incidence, improve recognition and response, and track performance metrics. Methods are organized in a plan-do-study-act cycle. During the "plan" phase, electronic definitions of sepsis (blood culture and antibiotic within 24 hours) and septic shock (sepsis plus vasoactive medication) were created to establish benchmark data and track progress with statistical process control. The performance of a screening tool was evaluated in the emergency department. During the "do" phase, a novel inpatient workflow is being piloted, which involves regular sepsis screening by nurses using the tool and a regimented response to high-risk patients. Screening tool use in the emergency department reduced time to antibiotics (Fig. 1). Of the 6,159 admissions between July and December 2016, the EHR definitions identified 1,433 (23.3%) with sepsis, of which 159 (11.1%) had septic shock. Hospital mortality was 2.2% for all sepsis patients and 15.7% for septic shock (Table 1). These findings approximate epidemiologic studies of sepsis and severe sepsis, which report a prevalence range of 0.45-8.2% and a mortality range of 8.2-25% (Table 2) [1-5]. Implementation of a sepsis screening tool is associated with improved performance. The prevalence of sepsis conditions identified with electronic definitions approximates the epidemiologic landscape characterized by other point-prevalence and administrative studies, providing face validity to this approach and proving useful for tracking performance improvement.
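
    The computable definition above (blood culture plus antibiotic within 24 hours; septic shock adds a vasoactive medication) can be sketched against a toy encounter table; the column names and values are illustrative assumptions, not the hospital's actual EHR schema:

```python
import pandas as pd

# Toy encounter-level table; columns are assumed for illustration only.
enc = pd.DataFrame({
    "encounter_id": [1, 2, 3],
    "hours_culture_to_abx": [6.0, 30.0, 2.0],  # blood culture to antibiotic
    "vasoactive_given": [False, False, True],
})

# Electronic definitions: sepsis = culture and antibiotic within 24 h;
# septic shock = sepsis plus a vasoactive medication.
enc["sepsis"] = enc["hours_culture_to_abx"] <= 24
enc["septic_shock"] = enc["sepsis"] & enc["vasoactive_given"]
print(enc[["encounter_id", "sepsis", "septic_shock"]])
```

    Run over all admissions, flags like these yield the counts and mortality denominators tracked with statistical process control.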

  19. On the Use of Machine Learning Techniques for the Mechanical Characterization of Soft Biological Tissues.

    PubMed

    Cilla, M; Pérez-Rey, I; Martínez, M A; Peña, Estefania; Martínez, Javier

    2018-06-23

    Motivated by the search for new strategies for fitting a material model, a new approach is explored in the present work. The use of numerical algorithms based on machine learning techniques, such as support vector machines for regression, bagged decision trees and artificial neural networks, is proposed for the parameter identification of constitutive laws for soft biological tissues. First, the mathematical tools were trained with analytical uniaxial data (circumferential and longitudinal directions) as inputs and the corresponding material parameters of the Gasser-Ogden-Holzapfel strain energy function (SEF) as outputs. The training and test errors show that the tools are highly effective at finding correlations between inputs and outputs; moreover, the correlation coefficients were very close to 1. Second, the tool was validated with unseen observations of analytical circumferential and longitudinal uniaxial data. The results show excellent agreement between the predicted material parameters of the SEF and the analytical curves. Finally, data from real circumferential and longitudinal uniaxial tests on different cardiovascular tissues were fitted, and thus the material model of these tissues was predicted. We found that the method was able to consistently identify model parameters, and we believe that the use of these numerical tools could lead to an improvement in the characterization of soft biological tissues. This article is protected by copyright. All rights reserved.
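
    The inverse-fitting idea above, curves in and material parameters out, can be sketched with a toy one-parameter law standing in for the Gasser-Ogden-Holzapfel SEF; the regressor choice and all numbers are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
strain = np.linspace(0.0, 0.5, 20)

# Toy constitutive law sigma = c * strain**2 standing in for the SEF;
# each training sample is a whole stress curve, the target is its parameter c.
c_train = rng.uniform(1.0, 10.0, 200)
X_train = np.array([c * strain ** 2 for c in c_train])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, c_train)

# Identify the parameter of an unseen "analytical" curve.
c_true = 5.0
c_pred = model.predict((c_true * strain ** 2).reshape(1, -1))[0]
print(c_pred)
```

    The real problem differs only in scale: multi-parameter SEFs and two loading directions, but the same curve-to-parameter regression.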

  20. Designing tools for oil exploration using nuclear modeling

    NASA Astrophysics Data System (ADS)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time-consuming and costly design iterations. During tool response characterization, modeling is benchmarked against experimental data and then used to complement and expand the database, making it more detailed and inclusive of measurement environments that are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross-section databases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  1. Computer-aided tracking and characterization of homicides and sexual assaults (CATCH)

    NASA Astrophysics Data System (ADS)

    Kangas, Lars J.; Terrones, Kristine M.; Keppel, Robert D.; La Moria, Robert D.

    1999-03-01

    When a serial offender strikes, it usually means that the investigation is unprecedented for that police agency. The volume of incoming leads and pieces of information in the case(s) can be overwhelming, as evidenced by the thousands of leads gathered in the Ted Bundy Murders, Atlanta Child Murders, and the Green River Murders. Serial cases can be long-term investigations in which the suspect remains unknown and continues to perpetrate crimes. With state and local murder investigative systems beginning to crop up, it will become important to manage that information in a timely and efficient way by developing computer programs to assist in that task. One vital function will be to compare violent crime cases from different jurisdictions so investigators can approach the investigation knowing that similar cases exist. CATCH (Computer Aided Tracking and Characterization of Homicides) is being developed to assist crime investigations by assessing likely characteristics of unknown offenders, by relating a specific crime case to other cases, and by providing a tool for clustering similar cases that may be attributed to the same offenders. CATCH is a collection of tools that assist the crime analyst in the investigation process by providing advanced data mining and visualization capabilities. These tools include clustering maps, query tools, geographic maps, timelines, etc. Each tool is designed to give the crime analyst a different view of the case data. The clustering tools in CATCH are based on artificial neural networks (ANNs). The ANNs learn to cluster similar cases from approximately 5000 murders and 3000 sexual assaults residing in a database. The clustering algorithm is applied to parameters describing modus operandi (MO), signature characteristics of the offenders, and other parameters describing the victim and offender.
The proximity of cases within a two-dimensional representation of the clusters allows the analyst to identify similar or serial murders and sexual assaults.
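
    The case-clustering step can be illustrated on toy modus-operandi vectors. Here k-means stands in for the self-organizing ANN the system actually uses, and the binary features are invented for the sketch:

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy binary MO/signature vectors (1 = feature present); purely illustrative.
cases = np.array([
    [1, 1, 0, 0],   # cases 0-2 share one MO pattern
    [1, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],   # cases 3-4 share another
    [0, 0, 1, 0],
])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(cases)
print(labels)
```

    Cases landing in the same cluster are candidates for the same offender, which is exactly the signal the analyst inspects in the two-dimensional cluster map.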

  2. Integrating ecology into biotechnology.

    PubMed

    McMahon, Katherine D; Martin, Hector Garcia; Hugenholtz, Philip

    2007-06-01

    New high-throughput culture-independent molecular tools are allowing the scientific community to characterize and understand the microbial communities underpinning environmental biotechnology processes in unprecedented ways. By creatively leveraging these new data sources, microbial ecology has the potential to transition from a purely descriptive to a predictive framework, in which ecological principles are integrated and exploited to engineer systems that are biologically optimized for the desired goal. But to achieve this goal, ecology, engineering and microbiology curricula need to be changed from the very root to better promote interdisciplinarity.

  3. Application of modern surface analytical tools in the investigation of surface deterioration processes

    NASA Technical Reports Server (NTRS)

    Buckley, D. H.

    1983-01-01

    Surface profilometry and scanning electron microscopy were utilized to study changes in polymer surfaces during erosion. X-ray photoelectron spectroscopy (XPS) and depth-profile analysis indicate the corrosion of metal and ceramic surfaces and reveal the diffusion of certain species into the surface to produce a change in mechanical properties. Ion implantation, nitriding and plating, and their effects on the surface, are characterized. Auger spectroscopy analysis identified morphological properties of coatings applied to surfaces by sputter deposition.

  4. Modal control theory and application to aircraft lateral handling qualities design

    NASA Technical Reports Server (NTRS)

    Srinathkumar, S.

    1978-01-01

    A multivariable synthesis procedure based on eigenvalue/eigenvector assignment is reviewed and is employed to develop a systematic design procedure to meet the lateral handling qualities design objectives of a fighter aircraft over a wide range of flight conditions. The closed loop modal characterization developed provides significant insight into the design process and plays a pivotal role in the synthesis of robust feedback systems. The simplicity of the synthesis algorithm yields an efficient computer aided interactive design tool for flight control system synthesis.

  5. Stereoselectivity of supported alkene metathesis catalysts: a goal and a tool to characterize active sites.

    PubMed

    Copéret, Christophe

    2011-01-05

    Stereoselectivity in alkene metathesis is a challenge and can be used as a tool to study active sites under working conditions. This review describes the stereochemical relevance and problems in alkene metathesis (kinetic vs. thermodynamic issues), the use of the (E/Z) ratio at low conversions as a tool to characterize the active sites of heterogeneous catalysts, and finally proposes strategies to improve catalysts based on the current state of the art.

  6. Flow in the Proximity of the Pin-Tool in Friction Stir Welding and Its Relation to Weld Homogeneity

    NASA Technical Reports Server (NTRS)

    Nunes, Arthur C., Jr.

    2000-01-01

    In the Friction Stir Welding (FSW) process, a rotating pin inserted into a seam literally stirs the metal from each side of the seam together. It is proposed that the flow in the vicinity of the pin-tool comprises a primary rapid shear over a cylindrical envelope covering the pin-tool and a relatively slow secondary flow taking the form of a ring vortex about the tool circumference. This model is consistent with a plastic characterization of metal flow, in which discontinuities in shear flow are allowed but viscous effects are not. It is consistent with experiments employing several different kinds of tracer: atomic markers, shot, and wire. If a rotating disc with angular velocity omega is superposed on a translating continuum with linear velocity v, the trajectories of tracer points become circular arcs centered upon a point displaced laterally a distance v/omega from the center of rotation of the disc, in the direction of the advancing side of the disc. In the present model a stream of metal approaching the tool (taken as the coordinate system of observation) is sheared at the slip surface, rapidly rotated around the tool, sheared again on the opposite side of the tool, and deposited in the wake of the tool. Local shearing rates are high in this model, comparable to metal cutting. The flow patterns in the vicinity of the pin-tool determine the level of homogenization and dispersal of contaminants that occurs in the FSW process. The approaching metal streams enfold one another as they are rotated around the tool. Neglecting mixing, they return to the same lateral position in the wake of the tool, preserving lateral tracer positions as if the metal had flowed past the tool like an extrusion instead of being rotated around it. (The seam is, however, obliterated.) The metal stream, of thickness approximately that of the tool diameter D, is wiped past the tool at elevated temperatures and drawn out to a thickness of v/2(omega) in the wiping zone.
Mixing distances in the wiping zone are multiplied in the unfolded metal. Inhomogeneities on a smaller scale than the mixing length are obliterated, but structure on a larger scale may be transmitted to the wake of a FSW weld.
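
    The superposition described above, a uniform translation plus a rigid rotation, can be checked numerically: the combined velocity field vanishes at a point offset v/omega from the rotation axis, the center of the circular-arc tracer trajectories. The numeric values here are illustrative:

```python
import numpy as np

v, omega = 2.0, 4.0  # translation speed and angular velocity (illustrative)

def velocity(x, y):
    """Superposed field: uniform translation v in x plus rigid rotation omega about the origin."""
    return np.array([v - omega * y, omega * x])

# Stagnation point: lateral offset v/omega from the center of rotation.
stag = np.array([0.0, v / omega])
print(velocity(stag[0], stag[1]))
```

    Seeding tracer points near this field and integrating their paths reproduces the circular arcs centered on the stagnation point.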

  7. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets be assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (ViSiT, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected with sensors and analytical tools developed by multiple manufacturers, which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of the ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user model in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components.
PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical geophysical ingestion and processing, and for co-analysis and visualization of the raw and processed data with other data of interest (e.g., soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  8. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    PubMed

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During the manufacturing and storage processes, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability, or the pharmacokinetic and pharmacodynamic profile, and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which can pose difficulties when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate this method provides results comparable to the traditional assays. To ensure future application in the QC environment, the method was qualified according to International Conference on Harmonisation (ICH) guidelines and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates understanding of process impact on multiple quality attributes, while being QC friendly and cost-effective.
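
    At its core, each attribute reported by a multi-attribute method of this kind is a relative quantitation from extracted peak areas. A minimal sketch, with invented peak areas for an unmodified and a deamidated form of one peptide (not figures from the paper):

```python
# Invented peak areas for one tryptic peptide (arbitrary units).
area_native = 9.4e6
area_deamidated = 6.0e5

# Percent modification reported for this quality attribute.
pct_modified = 100 * area_deamidated / (area_native + area_deamidated)
print(pct_modified)  # 6.0
```

    Trending such percentages across lots and stability time points is what makes the method usable for QC monitoring.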

  9. In-silico identification and characterization of organic and inorganic chemical stress responding genes in yeast (Saccharomyces cerevisiae).

    PubMed

    Barozai, Muhammad Younas Khan; Bashir, Farrukh; Muzaffar, Shafia; Afzal, Saba; Behlil, Farida; Khan, Muzaffar

    2014-10-15

    Yeast (Saccharomyces cerevisiae) is a significant model organism for studying the life processes of eukaryotes, and one of the best models for studying gene responses at the transcriptional level. In a living organism, gene expression is changed by chemical stresses. Genes that respond to chemical stresses provide a good basis for strategies to engineer chemical-stress-resistant eukaryotic organisms. Microarray data obtained under chemical stresses such as lithium chloride, lactic acid, weak organic acids and tomatidine were studied using computational tools. Out of 9335 yeast genes, 388 chemical-stress-responding genes were identified and characterized under different chemical stresses. Some of these are: Enolases 1 and 2, heat shock protein-82, Yeast Elongation Factor 3, Beta Glucanase Protein, Histone H2A1 and Histone H2A2 Proteins, Benign Prostatic Hyperplasia, ras GTPase activating protein, Establishes Silent Chromatin protein, Mei5 Protein, Nondisjunction Protein and Specific Mitogen Activated Protein Kinase. These genes were also characterized on the basis of their molecular functions, biological processes and cellular components. Copyright © 2014 Elsevier B.V. All rights reserved.
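
    Identifying stress-responding genes from microarray data typically reduces to thresholding differential expression. A minimal sketch; the gene names, values and 2-fold cutoff are illustrative assumptions, not figures from the study:

```python
import pandas as pd

# Toy expression table: log2 fold change under a chemical stress.
expr = pd.DataFrame({
    "gene": ["ENO1", "HSP82", "YEF3", "ACT1"],
    "log2_fold_change": [2.5, 1.8, -2.1, 0.2],
})

# Keep genes with at least a 2-fold change in either direction.
responders = expr[expr["log2_fold_change"].abs() >= 1.0]
print(responders["gene"].tolist())
```

    Annotating the surviving genes by molecular function, biological process and cellular component then completes the characterization step described above.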

  10. Growth and characterization of amorphous selenium: An exploration into the glass transition temperature

    NASA Astrophysics Data System (ADS)

    Schaefers, Justin Kyle

    The glass transition temperature (Tg) of alpha-Se films and its correlation with percent As inclusion were explored using characterization tools such as Raman spectroscopy, spectroscopic ellipsometry, atomic force microscopy, and microphotography. The films were deposited under ultra-high-vacuum conditions in a dedicated molecular beam epitaxy chamber onto semi-insulating GaAs (100) substrates. After deposition, the samples were thermally annealed in 5°C increments until they began to crystallize, as evident in the characterizations performed. It was discovered that Tg is directly related not only to the percent As but also to the film thickness. The Tg values, higher than previously reported, were found to be 80°C for 0% As, 110°C for 2% As, and 125°C for 5% As. In addition, instead of producing polycrystalline films containing all the allotropes of Se as a result of the annealing process, films of the trigonal allotrope of crystalline selenium (t-Se) were produced through the annealing process. The transition from the amorphous phase to the trigonal phase had never been reported prior to this dissertation. Finally, it was also discovered that the MBE deposition of the films is truly epitaxial in nature.

  11. Statistical Tools And Artificial Intelligence Approaches To Predict Fracture In Bulk Forming Processes

    NASA Astrophysics Data System (ADS)

    Di Lorenzo, R.; Ingarao, G.; Fonti, V.

    2007-05-01

    The crucial task in the prevention of ductile fracture is the availability of a tool for predicting such defect occurrence. The technical literature presents a wide investigation of this topic, and many contributions have been made by authors following different approaches. The main class of approaches concerns the development of fracture criteria: generally, such criteria are expressed by determining a critical value of a damage function that depends on stress and strain paths; ductile fracture is assumed to occur when this critical value is reached during the analysed process. There is a relevant drawback related to the utilization of ductile fracture criteria: each criterion usually performs well in the prediction of fracture for particular stress-strain paths, i.e., it works very well for certain processes but may provide poor results for others. On the other hand, approaches based on damage mechanics formulations are very effective from a theoretical point of view, but they are complex and their proper calibration is quite difficult. In this paper, two different approaches are investigated to predict fracture occurrence in cold forming operations. The final aim of the proposed method is the achievement of a tool with general reliability, i.e., one able to predict fracture for different forming processes. The proposed approach represents a step forward within a research project focused on the utilization of innovative predictive tools for ductile fracture. The paper presents a comparison between an artificial neural network design procedure and an approach based on statistical tools; both approaches aim to predict fracture occurrence or absence based on a set of stress and strain path data. The proposed approach is based on the utilization of experimental data available, for a given material, on fracture occurrence in different processes.
In more detail, the approach consists of analyzing experimental tests in which fracture occurs, followed by numerical simulations of such processes in order to track the stress-strain paths in the workpiece region where fracture is expected. These data are used to build up a proper data set, which is utilized both to train an artificial neural network and to perform a statistical analysis aimed at predicting fracture occurrence. The developed statistical tool is properly designed and optimized and is able to recognize fracture occurrence. The reliability and predictive capability of the statistical method were compared with those of an artificial neural network developed to predict fracture occurrence. Moreover, the approach is also validated on forming processes characterized by complex fracture mechanics.
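
    The fracture/no-fracture prediction described above is, at heart, binary classification on features extracted from stress-strain paths. A minimal sketch with a synthetic scalar damage indicator standing in for the real path data; the 0.6 critical value and the classifier choice are illustrative assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic damage-indicator values summarizing stress-strain paths;
# fracture (label 1) is assumed when the indicator exceeds a critical value.
X = rng.uniform(0.0, 1.0, (200, 1))
y = (X[:, 0] > 0.6).astype(int)

clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.9], [0.1]]))  # expect fracture, then no fracture
```

    The paper's statistical approach and its neural network are compared on exactly this kind of labeled path data, just with richer features.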

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabhakaran, Venkateshkumar; Johnson, Grant E.; Wang, Bingbing

    Molecular-level understanding of electrochemical processes occurring at electrode-electrolyte interfaces (EEI) is key to the rational development of high-performance and sustainable electrochemical technologies. This article reports the development and first application of solid-state in situ electrochemical probes to study redox and catalytic processes occurring at well-defined EEI generated using soft-landing of mass- and charge-selected cluster ions (SL). In situ electrochemical probes with excellent mass transfer properties are fabricated using carefully-designed nanoporous ionic liquid membranes. SL enables deposition of pure active species that are not obtainable with other techniques onto electrode surfaces with precise control over charge state, composition, and kinetic energy. SL is, therefore, a unique tool for studying fundamental processes occurring at EEI. For the first time using an aprotic electrochemical probe, the effect of charge state (PMo12O40(3-/2-)) and the contribution of building blocks of Keggin polyoxometalate (POM) clusters to redox processes are characterized by populating EEI with novel POM anions generated by electrospray ionization and gas-phase dissociation. Additionally, a proton-conducting electrochemical probe has been developed to characterize the reactive electrochemistry (oxygen reduction activity) of bare Pt clusters (Pt40, ~1 nm diameter), thus demonstrating the capability of the probe for studying reactions in controlled gaseous environments. The newly developed in situ electrochemical probes combined with ion SL provide a versatile method to characterize the EEI in solid-state redox systems and reactive electrochemistry at precisely-defined conditions. This capability will advance molecular-level understanding of processes occurring at EEI that are critical to many energy-related technologies.

  13. Copper-Doped Bioactive Glass as Filler for PMMA-Based Bone Cements: Morphological, Mechanical, Reactivity, and Preliminary Antibacterial Characterization.

    PubMed

    Miola, Marta; Cochis, Andrea; Kumar, Ajay; Arciola, Carla Renata; Rimondini, Lia; Verné, Enrica

    2018-06-06

    To promote osteointegration and simultaneously limit bacterial contamination without using antibiotics, we designed innovative composite cements containing copper (Cu)-doped bioactive glass powders. Cu-doped glass powders were produced by a melt-and-quench process, followed by an ion-exchange process in a Cu salt aqueous solution. The Cu-doped glass was incorporated into commercial polymethyl methacrylate (PMMA)-based cements with different viscosities. The resulting composites were characterized in terms of morphology, composition, leaching ability, bioactivity, and mechanical and antibacterial properties. Glass powders appeared well distributed and exposed on the PMMA surface. Composite cements showed good bioactivity, evidencing hydroxyapatite precipitation on the sample surfaces after seven days of immersion in simulated body fluid. The leaching test demonstrated that composite cements released a significant amount of copper, with a noticeable antibacterial effect toward a Staphylococcus epidermidis strain. Thus, the proposed materials represent an innovative and multifunctional tool for orthopedic prosthesis fixation, temporary prostheses, and spinal surgery.

  14. Flame extinction limit and particulates formation in fuel blends

    NASA Astrophysics Data System (ADS)

    Subramanya, Mahesh

    Many fuels used in material processing and power generation applications are generally blends of various hydrocarbons. Although the combustion and aerosol formation dynamics of individual fuels are well understood, the flame dynamics of fuel blends are yet to be characterized. This research uses a twin-flame counterflow burner to measure flame velocity, flame extinction, particulate formation and particulate morphology of hydrogen fuel blend flames at different H2 concentrations, oscillation frequencies and stretch conditions. Phase-resolved spectroscopic measurements (emission spectra) of OH, H, O and CH radical/atom concentrations are used to characterize the heat release processes of the flame. In addition, flame-generated particulates are collected using a thermophoretic sampling technique and are qualitatively analyzed using Raman spectroscopy and SEM. Such measurements are essential for the development of advanced computational tools capable of predicting fuel blend flame characteristics at realistic combustor conditions. The data generated through these measurements are representative yet accurate, with unique, well-defined boundary conditions that can be reproduced in numerical computations for kinetic code validation.

  15. Modular Assembly of the Bacterial Large Ribosomal Subunit.

    PubMed

    Davis, Joseph H; Tan, Yong Zi; Carragher, Bridget; Potter, Clinton S; Lyumkis, Dmitry; Williamson, James R

    2016-12-01

    The ribosome is a complex macromolecular machine and serves as an ideal system for understanding biological macromolecular assembly. Direct observation of ribosome assembly in vivo is difficult, as few intermediates have been isolated and thoroughly characterized. Herein, we deploy a genetic system to starve cells of an essential ribosomal protein, which results in the accumulation of assembly intermediates that are competent for maturation. Quantitative mass spectrometry and single-particle cryo-electron microscopy reveal 13 distinct intermediates, which were each resolved to ∼4-5 Å resolution and could be placed in an assembly pathway. We find that ribosome biogenesis is a parallel process, that blocks of structured rRNA and proteins assemble cooperatively, and that the entire process is dynamic and can be "re-routed" through different pathways as needed. This work reveals the complex landscape of ribosome assembly in vivo and provides the requisite tools to characterize additional assembly pathways for ribosomes and other macromolecular machines. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Modular Assembly of the Bacterial Large Ribosomal Subunit

    PubMed Central

    Davis, Joseph H.; Tan, Yong Zi; Carragher, Bridget; Potter, Clinton S.; Lyumkis, Dmitry; Williamson, James R.

    2016-01-01

    SUMMARY The ribosome is a complex macromolecular machine and serves as an ideal system for understanding biological macromolecular assembly. Direct observation of ribosome assembly in vivo is difficult, as few intermediates have been isolated and thoroughly characterized. Herein, we deploy a genetic system to starve cells of an essential ribosomal protein, which results in the accumulation of assembly intermediates that are competent for maturation. Quantitative mass spectrometry and single-particle cryo-electron microscopy reveal 13 distinct intermediates, which were each resolved to ~4–5Å resolution and could be placed in an assembly pathway. We find that ribosome biogenesis is a parallel process, that blocks of structured rRNA and proteins assemble cooperatively, and that the entire process is dynamic and can be ‘re-routed’ through different pathways as needed. This work reveals the complex landscape of ribosome assembly in vivo and provides the requisite tools to characterize additional assembly pathways for ribosomes and other macromolecular machines. PMID:27912064

  17. Reproduction in Leishmania: A focus on genetic exchange.

    PubMed

    Rougeron, V; De Meeûs, T; Bañuls, A-L

    2017-06-01

    One key process in the life cycle of pathogens is their mode of reproduction. Indeed, this fundamental biological process conditions the multiplication and transmission of genes and thus the propagation of diseases in the environment. The reproductive strategies of protozoan parasites have been a subject of debate for many years, principally due to the difficulty of making direct observations of sexual reproduction (i.e. genetic recombination). Traditionally, these parasites were considered to be characterized by a predominantly clonal structure. Nevertheless, with the development of elaborate culture experiments, population genetics, and evolutionary and population genomics, several studies have suggested that most of these pathogens are also characterized by constitutive genetic recombination events. In this opinion article, we focus on Leishmania parasites, the pathogens responsible for leishmaniases, a major public health issue. We first discuss the evolutionary advantages of a mixed-mating reproductive strategy, then review the evidence of genetic exchange, and finally detail available tools to detect naturally occurring genetic recombination in Leishmania parasites and, more generally, in protozoan parasites. Copyright © 2016. Published by Elsevier B.V.

  18. Subvisible (2-100 μm) particle analysis during biotherapeutic drug product development: Part 2, experience with the application of subvisible particle analysis.

    PubMed

    Corvari, Vincent; Narhi, Linda O; Spitznagel, Thomas M; Afonina, Nataliya; Cao, Shawn; Cash, Patricia; Cecchini, Irene; DeFelippis, Michael R; Garidel, Patrick; Herre, Andrea; Koulov, Atanas V; Lubiniecki, Tony; Mahler, Hanns-Christian; Mangiagalli, Paolo; Nesta, Douglas; Perez-Ramirez, Bernardo; Polozova, Alla; Rossi, Mara; Schmidt, Roland; Simler, Robert; Singh, Satish; Weiskopf, Andrew; Wuchner, Klaus

    2015-11-01

    Measurement and characterization of subvisible particles (including proteinaceous and non-proteinaceous particulate matter) is an important aspect of the pharmaceutical development process for biotherapeutics. Health authorities have increased expectations for subvisible particle data beyond criteria specified in the pharmacopeia and covering a wider size range. In addition, subvisible particle data is being requested for samples exposed to various stress conditions and to support process/product changes. Consequently, subvisible particle analysis has expanded beyond routine testing of finished dosage forms using traditional compendial methods. Over the past decade, advances have been made in the detection and understanding of subvisible particle formation. This article presents industry case studies to illustrate the implementation of strategies for subvisible particle analysis as a characterization tool to assess the nature of the particulate matter and applications in drug product development, stability studies and post-marketing changes. Copyright © 2015 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  19. Overlapping gene expression profiles indicative of antigen processing and the interferon pathway characterize inflammatory fibrotic skin diseases.

    PubMed

    Limpers, Annelies; van Royen-Kerkhof, Annet; van Roon, Joel A G; Radstake, Timothy R D J; Broen, Jasper C A

    2014-02-01

    Inflammatory fibrotic disorders have been of high interest to both dermatologists and rheumatologists. Although the phenotypic end stage of this group of diseases is ultimately the same, namely fibrosis, patients present with different clinical features and are often treated with distinct therapeutic modalities. This review addresses whether there is evidence for different underlying molecular pathways in the various inflammatory fibrotic diseases such as localized scleroderma, pediatric lichen sclerosus, adult lichen sclerosus, eosinophilic fasciitis and systemic sclerosis. To investigate this, a large number of gene expression microarray studies performed on skin or fibroblasts from patients with these diseases were described, (re-)analysed, and compared. As expected given the heterogeneous phenotypes, most diseases showed unique gene expression features. Intriguingly, a clear overlap was observed between adult and pediatric lichen sclerosus and localized scleroderma in antigen processing and the interferon pathway. Delineating the cause and consequence of these pathways may generate novel tools to better characterize and more effectively treat these patients.

  20. Sequence analysis and molecular characterization of Wnt4 gene in metacestodes of Taenia solium.

    PubMed

    Hou, Junling; Luo, Xuenong; Wang, Shuai; Yin, Cai; Zhang, Shaohua; Zhu, Xueliang; Dou, Yongxi; Cai, Xuepeng

    2014-04-01

    Wnt proteins are a family of secreted glycoproteins that are evolutionarily conserved and considered to be involved in extensive developmental processes in metazoan organisms. The characterization of wnt genes may improve understanding of the parasite's development. In the present study, a wnt4 gene encoding 491 amino acids was amplified from cDNA of metacestodes of Taenia solium using reverse transcription PCR (RT-PCR). Bioinformatics tools were used for sequence analysis. The conserved domain of the wnt gene family was predicted. The expression profile of wnt4 was investigated using real-time PCR. Wnt4 expression was found to be dramatically increased in scolex-evaginated cysticerci when compared to invaginated cysticerci. In situ hybridization showed that the wnt4 gene was distributed at the posterior end of the worm along the primary body axis in evaginated cysticerci. These findings indicate that wnt4 may take part in the process of cysticerci evagination and play a role in scolex/bladder development of cysticerci of T. solium.
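    Relative expression differences of the kind reported here (evaginated vs. invaginated cysticerci) are commonly quantified from real-time PCR data with the 2^-ΔΔCt (Livak) method. A minimal sketch with hypothetical Ct values; the numbers are illustrative and not taken from the study:

    ```python
    def fold_change(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
        # deltaCt = Ct(target gene) - Ct(reference gene), per condition
        d_test = ct_target_test - ct_ref_test
        d_ctrl = ct_target_ctrl - ct_ref_ctrl
        # deltadeltaCt and fold change 2^-ddCt (Livak method)
        return 2 ** -(d_test - d_ctrl)

    # hypothetical Ct values: wnt4 in evaginated (test) vs invaginated (control)
    fc = fold_change(22.0, 18.0, 25.0, 18.0)
    print(fc)  # 8.0, i.e. an 8-fold increase in the test condition
    ```

    A lower Ct means earlier amplification and hence higher transcript abundance, which is why the fold change grows as the test-condition ΔCt shrinks.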

  1. An experimental analysis of process parameters to manufacture micro-channels in AISI H13 tempered steel by laser micro-milling

    NASA Astrophysics Data System (ADS)

    Teixidor, D.; Ferrer, I.; Ciurana, J.

    2012-04-01

    This paper reports the characterization of the laser machining (milling) process used to manufacture micro-channels, in order to understand the influence of process parameters on the final features. Selection of process operational parameters is highly critical for successful laser micromachining. A set of designed experiments is carried out in a pulsed Nd:YAG laser system using AISI H13 hardened tool steel as the work material. Several micro-channels have been manufactured as micro-mold cavities, varying parameters such as scanning speed (SS), pulse intensity (PI) and pulse frequency (PF). Results are obtained by evaluating the dimensions and the surface finish of the micro-channels. The dimensions and shape of the micro-channels produced with the laser micro-milling process exhibit variations. In general, the use of low scanning speeds improves the quality of the feature in terms of both surface finish and dimensional accuracy.

  2. Increasing rigor in NMR-based metabolomics through validated and open source tools

    PubMed Central

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2016-01-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism’s phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. PMID:27643760

  3. Microplastic Exposure Assessment in Aquatic Environments: Learning from Similarities and Differences to Engineered Nanoparticles.

    PubMed

    Hüffer, Thorsten; Praetorius, Antonia; Wagner, Stephan; von der Kammer, Frank; Hofmann, Thilo

    2017-03-07

    Microplastics (MPs) have been identified as contaminants of emerging concern in aquatic environments, and research into their behavior and fate has been increasing sharply in recent years. Nevertheless, significant gaps remain in our understanding of several crucial aspects of MP exposure and risk assessment, including the quantification of emissions, dominant fate processes, the types of analytical tools required for characterization and monitoring, and adequate laboratory protocols for analysis and hazard testing. This Feature aims at identifying transferable knowledge and experience from engineered nanoparticle (ENP) exposure assessment. This is achieved by comparing ENPs and MPs based on their similarities as particulate contaminants, while critically discussing their specific differences. We also highlight the most pressing research priorities to support an efficient development of tools and methods for MP environmental risk assessment.

  4. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    PubMed

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  5. Optical tools for intermixing diagnostic: application to InGaAs/InGaAsP microstructures

    NASA Astrophysics Data System (ADS)

    Peyre, H.; Alsina, F.; Juillaguet, S.; Massone, E.; Camassel, J.; Pascual, J.; Glew, R. W.

    1993-01-01

    InGaAs quantum wells (QWs), with either InP or InGaAsP barriers, are increasingly considered for optoelectronic device applications. Nevertheless, because interdiffusion across the interfaces (intermixing) results in unwanted modifications of the nominal properties, in-situ controls of the well composition (to be ultimately done during the processing sequences) are of fundamental interest. In this work, we use a single quantum well of InGaAs/InGaAsP as a prototype structure and investigate the respective advantages (and/or disadvantages) of both PL and Raman tools as non-destructive techniques. Provided careful analyses are done, we find that both determinations are in satisfactory agreement and constitute alternative but non-equivalent techniques for in-line characterization.

  6. Impacts of Lateral Boundary Conditions on US Ozone ...

    EPA Pesticide Factsheets

    Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, we perform annual simulations over North America with chemical boundary conditions prepared from two global models (GEOS-CHEM and Hemispheric CMAQ). Results indicate that the impacts of different boundary conditions on ozone can be significant throughout the year. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  7. Characterization of Heterobasidion occidentale transcriptomes reveals candidate genes and DNA polymorphisms for virulence variations.

    PubMed

    Liu, Jun-Jun; Shamoun, Simon Francis; Leal, Isabel; Kowbel, Robert; Sumampong, Grace; Zamany, Arezoo

    2018-05-01

    Characterization of genes involved in differentiation of pathogen species and isolates with variations of virulence traits provides valuable information to control tree diseases for meeting the challenges of sustainable forest health and phytosanitary trade issues. Lack of genetic knowledge and genomic resources hinders novel gene discovery, molecular mechanism studies and development of diagnostic tools in the management of forest pathogens. Here, we report on transcriptome profiling of Heterobasidion occidentale isolates with contrasting virulence levels. Comparative transcriptomic analysis identified orthologous groups exclusive to H. occidentale and its isolates, revealing biological processes involved in the differentiation of isolates. Further bioinformatics analyses identified an H. occidentale secretome, CYPome and other candidate effectors, from which genes with species- and isolate-specific expression were characterized. A large proportion of differentially expressed genes were revealed to have putative activities as cell wall modification enzymes and transcription factors, suggesting their potential roles in virulence and fungal pathogenesis. Next, large numbers of simple sequence repeats (SSRs) and single nucleotide polymorphisms (SNPs) were detected, including more than 14 000 interisolate non-synonymous SNPs. These polymorphic loci and species/isolate-specific genes may contribute to virulence variations and provide ideal DNA markers for development of diagnostic tools and investigation of genetic diversity. © 2018 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  8. Characterizing the Networks of Digital Information that Support Collaborative Adaptive Forest Management in Sierra Nevada Forests.

    PubMed

    Lei, Shufei; Iles, Alastair; Kelly, Maggi

    2015-07-01

    Some of the factors that can contribute to the success of collaborative adaptive management--such as social learning, open communication, and trust--are built upon a foundation of the open exchange of information about science and management between participants and the public. Despite the importance of information transparency, the use and flow of information in collaborative adaptive management has not been characterized in detail in the literature, and there currently exist opportunities to develop strategies for increasing the exchange of information, as well as to track information flow in such contexts. As digital information channels and networks have multiplied over the last decade, powerful new information monitoring tools have also evolved, allowing for the complete characterization of information products through their production, transport, use, and monitoring. This study uses these tools to investigate the use of various science and management information products in a case study--the Sierra Nevada Adaptive Management Project--using a mixed-method (citation analysis, web analytics, and content analysis) research approach borrowed from the information processing and management field. The results from our case study show that information technologies greatly facilitate the flow and use of digital information, leading to multiparty collaborations such as knowledge transfer and public participation in science research. We conclude with recommendations for expanding information exchange in collaborative adaptive management by taking advantage of available information technologies and networks.

  9. Characterizing the Networks of Digital Information that Support Collaborative Adaptive Forest Management in Sierra Nevada Forests

    NASA Astrophysics Data System (ADS)

    Lei, Shufei; Iles, Alastair; Kelly, Maggi

    2015-07-01

    Some of the factors that can contribute to the success of collaborative adaptive management—such as social learning, open communication, and trust—are built upon a foundation of the open exchange of information about science and management between participants and the public. Despite the importance of information transparency, the use and flow of information in collaborative adaptive management has not been characterized in detail in the literature, and there currently exist opportunities to develop strategies for increasing the exchange of information, as well as to track information flow in such contexts. As digital information channels and networks have multiplied over the last decade, powerful new information monitoring tools have also evolved, allowing for the complete characterization of information products through their production, transport, use, and monitoring. This study uses these tools to investigate the use of various science and management information products in a case study—the Sierra Nevada Adaptive Management Project—using a mixed-method (citation analysis, web analytics, and content analysis) research approach borrowed from the information processing and management field. The results from our case study show that information technologies greatly facilitate the flow and use of digital information, leading to multiparty collaborations such as knowledge transfer and public participation in science research. We conclude with recommendations for expanding information exchange in collaborative adaptive management by taking advantage of available information technologies and networks.

  10. Looking back to inform the future: The role of cognition in forest disturbance characterization from remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Bianchetti, Raechel Anne

    Remotely sensed images have become a ubiquitous part of our daily lives. From novice users aiding in search and rescue missions using tools such as TomNod, to trained analysts synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with the identification of land cover and land use change. Analysts participating in this research are currently working as part of a national-level analysis of land use change, and are well versed in the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts, as they improve awareness of the mental processes used during image interpretation. The study can also be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes, but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during their analysis was directly related to their amount of image analysis experience.
Additionally, results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.

  11. Designing the safety of healthcare. Participation of ergonomics to the design of cooperative systems in radiotherapy.

    PubMed

    Munoz, Maria Isabel; Bouldi, Nadia; Barcellini, Flore; Nascimento, Adelaide

    2012-01-01

    This communication deals with the involvement of ergonomists in a research-action design process for a software platform in radiotherapy. The goal of the design project is to enhance patient safety by designing workflow software that supports cooperation between the professionals producing treatment in radiotherapy. The general framework of our approach is the ergonomic management of a design process, which is based on activity analysis and grounded in participatory design. Two fields are concerned by the present action: a design environment, i.e. a participatory design process that involves software designers, caregivers as future users, and ergonomists; and a reference real-work setting in radiotherapy. Observations, semi-structured interviews and participatory workshops allow the characterization of activity in radiotherapy, covering the use of cooperative tools, sources of variability, and non-ruled strategies for managing the variability of situations. This production of knowledge about work seeks to enhance the articulation between technocentric and anthropocentric approaches, and helps clarify design requirements. One aim of this research-action is to develop a framework to define the parameters of the workflow tool and the conditions of its deployment.

  12. Evaluation of Improved Pushback Forecasts Derived from Airline Ground Operations Data

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Theis, Georg; Feron, Eric; Clarke, John-Paul

    2003-01-01

    Accurate and timely predictions of airline pushbacks can potentially lead to improved performance of automated decision-support tools for airport surface traffic, thus reducing the variability and average duration of costly airline delays. One factor which affects the realization of these benefits is the level of uncertainty inherent in the turn processes. To characterize this inherent uncertainty, three techniques are developed for predicting time-to-go until pushback as a function of available ground-time; elapsed ground-time; and the status (not-started/in-progress/completed) of individual turn processes (cleaning, fueling, etc.). These techniques are tested against a large and detailed dataset covering approximately 10^4 real-world turn operations obtained through collaboration with Deutsche Lufthansa AG. Even after the dataset is filtered to obtain a sample of turn operations with minimal uncertainty, the standard deviation of forecast error for all three techniques is lower-bounded away from zero, indicating that turn operations have a significant stochastic component. This lower-bound result shows that decision-support tools must be designed to incorporate robust mechanisms for coping with pushback demand stochasticity, rather than treating the pushback demand process as a known deterministic input.
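    The stochasticity result above rests on the empirical standard deviation of pushback forecast errors. A minimal sketch of that statistic, using hypothetical time-to-go values rather than the Lufthansa dataset:

    ```python
    import statistics

    def forecast_error_std(predicted, actual):
        # forecast error = predicted time-to-go minus observed time-to-go
        errors = [p - a for p, a in zip(predicted, actual)]
        # population standard deviation of the errors
        return statistics.pstdev(errors)

    # hypothetical time-to-go values (minutes) for five turn operations
    pred = [30, 25, 40, 35, 20]
    obs = [28, 27, 43, 33, 22]
    print(round(forecast_error_std(pred, obs), 2))  # 2.15
    ```

    If this statistic stays bounded away from zero no matter how the sample is filtered, the residual scatter is inherent to the turn process itself rather than to missing predictor variables.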

  13. Simulating the Composite Propellant Manufacturing Process

    NASA Technical Reports Server (NTRS)

    Williamson, Suzanne; Love, Gregory

    2000-01-01

    There is a strategic interest in understanding how the propellant manufacturing process contributes to military capabilities outside the United States. The paper will discuss how system dynamics (SD) has been applied to rapidly assess the capabilities and vulnerabilities of a specific composite propellant production complex. These facilities produce a commonly used solid propellant with military applications. The authors will explain how an SD model can be configured to match a specific production facility followed by a series of scenarios designed to analyze operational vulnerabilities. By using the simulation model to rapidly analyze operational risks, the analyst gains a better understanding of production complexities. There are several benefits of developing SD models to simulate chemical production. SD is an effective tool for characterizing complex problems, especially the production process where the cascading effect of outages quickly taxes common understanding. By programming expert knowledge into an SD application, these tools are transformed into a knowledge management resource that facilitates rapid learning without requiring years of experience in production operations. It also permits the analyst to rapidly respond to crisis situations and other time-sensitive missions. Most importantly, the quantitative understanding gained from applying the SD model lends itself to strategic analysis and planning.

  14. The current status of biomarkers for predicting toxicity

    PubMed Central

    Campion, Sarah; Aubrecht, Jiri; Boekelheide, Kim; Brewster, David W; Vaidya, Vishal S; Anderson, Linnea; Burt, Deborah; Dere, Edward; Hwang, Kathleen; Pacheco, Sara; Saikumar, Janani; Schomaker, Shelli; Sigman, Mark; Goodsaid, Federico

    2013-01-01

    Introduction There are significant rates of attrition in drug development. A number of compounds fail to progress past preclinical development due to limited tools that accurately monitor toxicity in preclinical studies and in the clinic. Research has focused on improving tools for the detection of organ-specific toxicity through the identification and characterization of biomarkers of toxicity. Areas covered This article reviews what we know about emerging biomarkers in toxicology, with a focus on the 2012 Northeast Society of Toxicology meeting titled ‘Translational Biomarkers in Toxicology.’ The areas covered in this meeting are summarized and include biomarkers of testicular injury and dysfunction, emerging biomarkers of kidney injury and translation of emerging biomarkers from preclinical species to human populations. The authors also provide a discussion about the biomarker qualification process and possible improvements to this process. Expert opinion There is currently a gap between the scientific work in the development and qualification of novel biomarkers for nonclinical drug safety assessment and how these biomarkers are actually used in drug safety assessment. A clear and efficient path to regulatory acceptance is needed so that breakthroughs in the biomarker toolkit for nonclinical drug safety assessment can be utilized to aid in the drug development process. PMID:23961847

  15. Microstructure characterization of the stir zone of submerged friction stir processed aluminum alloy 2219

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Xiuli, E-mail: feng.97@osu.edu; State Key Laboratory of Advanced Welding and Joining, Harbin Institute of Technology, Harbin 150001; Liu, Huijie, E-mail: liuhj@hit.edu.cn

    Aluminum alloy 2219-T6 was friction stir processed using a novel submerged processing technique to facilitate cooling. Processing was conducted at a constant tool traverse speed of 200 mm/min and spindle rotation speeds in the range from 600 to 800 rpm. The microstructural characteristics of the base metal and processed zone, including grain structure and precipitation behavior, were studied using optical microscopy (OM), scanning electron microscopy (SEM) and transmission electron microscopy (TEM). Microhardness maps were constructed on polished cross sections of as-processed samples. The effect of tool rotation speed on the microstructure and hardness of the stir zone was investigated. The average grain size of the stir zone was much smaller than that of the base metal, but the hardness was also lower due to the formation of equilibrium θ precipitates from the base metal θ′ precipitates. Stir zone hardness was found to decrease with increasing rotation speed (heat input). The effect of processing conditions on strength (hardness) was rationalized based on the competition between grain refinement strengthening and softening due to precipitate overaging. - Highlights: • SZ grain size (∼ 1 μm) is reduced by over one order of magnitude relative to the BM. • Hardness in the SZ is lower than that of the precipitation strengthened BM. • Metastable θ′ in the base metal transforms to equilibrium θ in the stir zone. • Softening in the SZ results from a decrease of precipitation strengthening.

  16. TSVdb: a web-tool for TCGA splicing variants analysis.

    PubMed

    Sun, Wenjie; Duan, Ting; Ye, Panmeng; Chen, Kelie; Zhang, Guanling; Lai, Maode; Zhang, Honghe

    2018-05-29

    Collaborative projects such as The Cancer Genome Atlas (TCGA) have generated various -omics and clinical data on cancer. Many computational tools have been developed to facilitate the study of the molecular characterization of tumors using data from the TCGA. Alternative splicing of a gene produces splicing variants, and accumulating evidence has revealed its essential role in cancer-related processes, implying the urgent need to discover tumor-specific isoforms and uncover their potential functions in tumorigenesis. We developed TSVdb, a web-based tool, to explore alternative splicing based on TCGA samples with 30 clinical variables from 33 tumors. TSVdb has an integrated and well-proportioned interface for visualization of the clinical data, gene expression, usage of exons/junctions and splicing patterns. Researchers can interpret the isoform expression variations between or across clinical subgroups and estimate the relationships between isoforms and patient prognosis. TSVdb is available at http://www.tsvdb.com, and the source code is available at https://github.com/wenjie1991/TSVdb. TSVdb will inspire oncologists and accelerate isoform-level advances in cancer research.

  17. Whither managerialism in the Italian National Health Service?

    PubMed

    Anessi-Pessina, Eugenio; Cantù, Elena

    2006-01-01

    In the last decade, the Italian National Health Service has been characterized by the introduction of managerial concepts and techniques, according to the New Public Management paradigm. Recently, these reforms have been increasingly criticized. This article examines the implementation of managerialism in an attempt to evaluate its overall achievements and shortcomings. Overall, managerialism seems to have made good progress: managerial skills are improving; several management tools have been adapted to health-care and public-sector peculiarities; health-care organizations have adopted a wide range of technical solutions to fit their specific needs. At the same time, managerial innovations have often focused on structures as opposed to processes, on the way the organization looks as opposed to the way it works, on the tools it has as opposed to those it actually needs and uses. We thus suggest that research, training and policy-making should stop focusing on the technical features and theoretical virtues of specific tools and should redirect their emphasis on change management.

  18. Genome engineering and plant breeding: impact on trait discovery and development.

    PubMed

    Nogué, Fabien; Mara, Kostlend; Collonnier, Cécile; Casacuberta, Josep M

    2016-07-01

    New tools for the precise modification of crops genes are now available for the engineering of new ideotypes. A future challenge in this emerging field of genome engineering is to develop efficient methods for allele mining. Genome engineering tools are now available in plants, including major crops, to modify in a predictable manner a given gene. These new techniques have a tremendous potential for a spectacular acceleration of the plant breeding process. Here, we discuss how genetic diversity has always been the raw material for breeders and how they have always taken advantage of the best available science to use, and when possible, increase, this genetic diversity. We will present why the advent of these new techniques gives to the breeders extremely powerful tools for crop breeding, but also why this will require the breeders and researchers to characterize the genes underlying this genetic diversity more precisely. Tackling these challenges should permit the engineering of optimized alleles assortments in an unprecedented and controlled way.

  19. Water facilities in retrospect and prospect: An illuminating tool for vehicle design

    NASA Technical Reports Server (NTRS)

    Erickson, G. E.; Peak, D. J.; Delfrate, J.; Skow, A. M.; Malcolm, G. N.

    1986-01-01

    Water facilities play a fundamental role in the design of air, ground, and marine vehicles by providing a qualitative, and sometimes quantitative, description of complex flow phenomena. Water tunnels, channels, and tow tanks used as flow-diagnostic tools have experienced a renaissance in recent years in response to the increased complexity of designs suitable for advanced technology vehicles. These vehicles are frequently characterized by large regions of steady and unsteady three-dimensional flow separation and ensuing vortical flows. The visualization and interpretation of the complicated fluid motions about isolated vehicle components and complete configurations in a time and cost effective manner in hydrodynamic test facilities is a key element in the development of flow control concepts, and, hence, improved vehicle designs. A historical perspective of the role of water facilities in the vehicle design process is presented. The application of water facilities to specific aerodynamic and hydrodynamic flow problems is discussed, and the strengths and limitations of these important experimental tools are emphasized.

  20. A feasible injection molding technique for the manufacturing of large diameter aspheric plastic lenses

    NASA Astrophysics Data System (ADS)

    Shieh, Jen-Yu; Wang, Luke K.; Ke, Shih-Ying

    2010-07-01

    A computer aided engineering (CAE) tool-assisted technique, using Moldex3D and aspheric analysis utility (AAU) software in a polycarbonate injection molding design, is proposed to manufacture large diameter aspheric plastic lenses. An experiment is conducted to verify the applicability/feasibility of the proposed technique. Using these two software tools, the crucial process parameters associated with the surface profile errors and birefringence of a molded lens can be obtained. The strategy adopted here is to use the actual quantity of shrinkage after an injection molding trial of an aspherical plastic lens as a reference to perform the core shaping job while keeping the coefficients of aspheric surface, radius, and conic constant unchanged. The design philosophy is characterized by using the CAE tool as a guideline to pursue the best symmetry condition, followed by injection molding trials, to accelerate a product’s developmental time. The advantages are less design complexity and shorter developmental time for a product.

  1. Copy Number Variations Detection: Unravelling the Problem in Tangible Aspects.

    PubMed

    do Nascimento, Francisco; Guimaraes, Katia S

    2017-01-01

    In the midst of the important genomic variants associated with susceptibility and resistance to complex diseases, Copy Number Variation (CNV) has emerged as a prevalent class of structural variation. Following the flood of next-generation sequencing data, numerous publicly available tools have been developed to provide computational strategies for identifying CNVs at improved accuracy. This review goes beyond scrutinizing the main approaches widely used for structural variant detection in general, including Split-Read, Paired-End Mapping, Read-Depth, and Assembly-based. In this paper, (1) we characterize the relevant technical details around the detection of CNVs, which can affect the estimation of breakpoints and number of copies, (2) we pinpoint the most important insights related to GC-content and mappability biases, and (3) we discuss the paramount caveats in the tool evaluation process. The points brought out in this study emphasize common assumptions, a variety of possible limitations, valuable insights, and directions for desirable contributions to the state-of-the-art in CNV detection tools.
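
    Of the four approaches listed, Read-Depth lends itself to a minimal sketch: bin the mapped-read positions along a chromosome, normalize by the mean depth, and round the depth ratio to an integer copy-number state. The GC-content and mappability corrections discussed in the review are deliberately omitted here, and all names, parameters, and simulated data are illustrative, not taken from any of the surveyed tools.

```python
import numpy as np

def read_depth_cnv(read_starts, chrom_len, bin_size=1000, ploidy=2):
    """Naive Read-Depth CNV caller: copy number ~ binned depth / mean depth."""
    bins = np.arange(0, chrom_len + bin_size, bin_size)
    counts, _ = np.histogram(read_starts, bins=bins)
    mean_depth = counts.mean()
    # Copy number estimate per bin, rounded to the nearest integer state.
    cn = np.rint(ploidy * counts / mean_depth).astype(int)
    return counts, cn

# Simulated reads: diploid background plus a duplicated (3-copy) segment.
rng = np.random.default_rng(0)
background = rng.uniform(0, 100_000, 20_000)
duplication = rng.uniform(40_000, 50_000, 1_500)
reads = np.concatenate([background, duplication])
counts, cn = read_depth_cnv(reads, 100_000)
```

    Real callers additionally segment the per-bin estimates and model the biases noted above; this sketch only shows why breakpoint resolution in Read-Depth methods is limited to the bin size.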

  2. Preliminary Exploration of Encounter During Transit Across Southern Africa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stroud, Phillip David; Cuellar-Hengartner, Leticia; Kubicek, Deborah Ann

    Los Alamos National Laboratory (LANL) is utilizing the Probability Effectiveness Methodology (PEM) tools, particularly the Pathway Analysis, Threat Response and Interdiction Options Tool (PATRIOT), to support the DNDO Architecture and Planning Directorate’s (APD) development of a multi-region terrorist risk assessment tool. The effort is divided into three stages. The first stage is an exploration of what can be done with PATRIOT essentially as is, to characterize encounter rate during transit across a single selected region. The second stage is to develop, condition, and implement required modifications to the data and conduct analysis to generate a well-founded assessment of the transit reliability across that selected region, and to identify any issues in the process. The final stage is to extend the work to a full multi-region global model. This document provides the results of the first stage, namely preliminary explorations with PATRIOT to assess the transit reliability across the region of southern Africa.

  3. Molecular epidemiology: new rules for new tools?

    PubMed

    Merlo, Domenico Franco; Sormani, Maria Pia; Bruzzi, Paolo

    2006-08-30

    Molecular epidemiology combines biological markers and epidemiological observations in the study of the environmental and genetic determinants of cancer and other diseases. The potential advantages associated with biomarkers are manifold and include: (a) increased sensitivity and specificity to carcinogenic exposures; (b) more precise evaluation of the interplay between genetic and environmental determinants of cancer; (c) earlier detection of carcinogenic effects of exposure; (d) characterization of disease subtypes-etiologies patterns; (e) evaluation of primary prevention measures. These, in turn, may translate into better tools for etiologic research, individual risk assessment, and, ultimately, primary and secondary prevention. An area that has not received sufficient attention concerns the validation of these biomarkers as surrogate endpoints for cancer risk. Validation of a candidate biomarker's surrogacy is the demonstration that it possesses the properties required for its use as a substitute for a true endpoint. The principles underlying the validation process underwent remarkable developments and discussion in therapeutic research. However, the challenges posed by the application of these principles to epidemiological research, where the basic tool for this validation (i.e., the randomized study) is seldom possible, have not been thoroughly explored. The validation process of surrogacy must be applied rigorously to intermediate biomarkers of cancer risk before using them as risk predictors at the individual as well as at the population level.

  4. Self-assembly kinetics of microscale components: A parametric evaluation

    NASA Astrophysics Data System (ADS)

    Carballo, Jose M.

    The goal of the present work is to develop and evaluate a parametric model of a basic microscale Self-Assembly (SA) interaction that provides scaling predictions of process rates as a function of key process variables. At the microscale, assembly by "grasp and release" is generally challenging. Recent research efforts have proposed adapting nanoscale self-assembly (SA) processes to the microscale. SA offers the potential for reduced equipment cost and increased throughput by harnessing attractive forces (most commonly, capillary) to spontaneously assemble components. However, there are challenges to implementing microscale SA as a commercial process. The existing lack of design tools prevents simple process optimization. Previous efforts have characterized specific aspects of the SA process; however, the existing microscale SA models do not characterize the inter-component interactions. All existing models have simplified the outcome of SA interactions as an experimentally derived value specific to a particular configuration, instead of evaluating the outcome as a function of component-level parameters (such as speed, geometry, bonding energy and direction). The present study parameterizes the outcome of interactions and evaluates the effect of key parameters. The present work closes the gap between existing microscale SA models to add a key piece toward a complete design tool for general microscale SA process modeling. First, this work proposes a simple model for defining the probability of assembly of basic SA interactions. A basic SA interaction is defined as the event where a single part arrives at an assembly site. The model describes the probability of assembly as a function of kinetic energy, binding energy, orientation and incidence angle for the component and the assembly site. Second, an experimental SA system was designed and implemented to create individual SA interactions while controlling process parameters independently.
SA experiments measured the outcome of SA interactions, while studying the independent effects of each parameter. As a first step towards a complete scaling model, experiments were performed to evaluate the effects of part geometry and part travel direction under low kinetic energy conditions. Experimental results show minimal dependence of assembly yield on the incidence angle of the parts, and significant effects induced by changes in part geometry. The results from this work indicate that SA could be modeled as an energy-based process due to the small path dependence effects. Assembly probability is linearly related to the orientation probability. The proportionality constant is based on the area fraction of the sites with an amplification factor. This amplification factor accounts for the ability of capillary forces to align parts with only very small areas of contact when they have a low kinetic energy. Results provide unprecedented insight about SA interactions. The present study is a key step towards completing a basic model of a general SA process. Moreover, the outcome from this work can complement existing SA process models, in order to create a complete design tool for microscale SA systems. In addition to SA experiments, Monte Carlo simulations of experimental part-site interactions were conducted. This study confirmed that a major contributor to experimental variation is the stochastic nature of experimental SA interactions and the limited sample size of the experiments. Furthermore, the simulations serve as a tool for defining an optimum sampling strategy to minimize the uncertainty in future SA experiments.
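
    The reported linear relation (assembly probability proportional to orientation probability, with a constant built from the site area fraction and an amplification factor) and the Monte Carlo treatment of part-site interactions can be sketched as a toy simulation. The functional form and every parameter value below are illustrative assumptions, not the fitted model from these experiments.

```python
import numpy as np

def assembly_probability(p_orient, area_fraction, amplification):
    """P(assembly) assumed linear in orientation probability (illustrative)."""
    return min(1.0, amplification * area_fraction * p_orient)

def monte_carlo_yield(n_parts, p_orient, area_fraction, amplification, seed=0):
    """Simulate independent part-site interactions; return the assembly yield."""
    rng = np.random.default_rng(seed)
    p = assembly_probability(p_orient, area_fraction, amplification)
    outcomes = rng.random(n_parts) < p   # each part either assembles or not
    return outcomes.mean()

# Illustrative run: the amplification factor stands in for the ability of
# capillary forces to align parts from small contact areas at low energy.
y = monte_carlo_yield(n_parts=10_000, p_orient=0.5,
                      area_fraction=0.3, amplification=4.0)
```

    Repeating such runs with different seeds also illustrates the study's point that finite sample size alone produces substantial run-to-run variation in measured yield.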

  5. Automated aerial image based CD metrology initiated by pattern marking with photomask layout data

    NASA Astrophysics Data System (ADS)

    Davis, Grant; Choi, Sun Young; Jung, Eui Hee; Seyfarth, Arne; van Doornmalen, Hans; Poortinga, Eric

    2007-05-01

    The photomask is a critical element in the lithographic image transfer process from the drawn layout to the final structures on the wafer. The non-linearity of the imaging process and the related MEEF impose a tight control requirement on the photomask critical dimensions. Critical dimensions can be measured in aerial images with hardware emulation. This is a more recent complement to the standard scanning electron microscope measurement of wafers and photomasks. Aerial image measurement includes non-linear, 3-dimensional, and materials effects on imaging that cannot be observed directly by SEM measurement of the mask. Aerial image measurement excludes the processing effects of printing and etching on the wafer. This presents a unique contribution to the difficult process control and modeling tasks in mask making. In the past, aerial image measurements have been used mainly to characterize the printability of mask repair sites. Development of photomask CD characterization with the AIMS™ tool was motivated by the benefit of MEEF sensitivity and the shorter feedback loop compared to wafer exposures. This paper describes a new application that includes: an improved interface for the selection of meaningful locations using the photomask and design layout data with the Calibre™ Metrology Interface, an automated recipe generation process, an automated measurement process, and automated analysis and result reporting on a Carl Zeiss AIMS™ system.

  6. Coal liquefaction process streams characterization and evaluation. Gold tube carbonization and reflectance microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, G.; Davis, A.; Burke, F.P.

    1991-12-01

    This study demonstrated the use of the gold tube carbonization technique and reflectance microscopy analysis for the examination of process-derived materials from direct coal liquefaction. The carbonization technique, which was applied to coal liquefaction distillation resids, yields information on the amounts of gas plus distillate, pyridine-soluble resid, and pyridine-insoluble material formed when a coal liquid sample is heated to 450 °C for one hour at 5000 psi in an inert atmosphere. The pyridine-insolubles are then examined by reflectance microscopy to determine the type, amount, and optical texture of isotropic and anisotropic carbon formed upon carbonization. Further development of these analytical methods as process development tools may be justified on the basis of these results.

  7. Genomic signal processing methods for computation of alignment-free distances from DNA sequences.

    PubMed

    Borrayo, Ernesto; Mendizabal-Ruiz, E Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P; Morales, J Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on the employment of doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and the use of descriptors for characterizing DNA fragments.
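
    A minimal sketch of the doublet idea follows: each overlapping dinucleotide is assigned a numeric amplitude, and two sequences are compared through the Euclidean distance between the magnitude spectra of the resulting signals. The doublet-value table and the spectral metric below are illustrative assumptions; GAFD's actual assignment and its three DSP metrics are defined in the paper itself.

```python
import numpy as np
from itertools import product

# Hypothetical doublet-value table: each of the 16 dinucleotides gets a
# distinct amplitude (GAFD's actual assignment is not reproduced here).
DOUBLET_VALUES = {"".join(d): i for i, d in enumerate(product("ACGT", repeat=2))}

def sequence_to_signal(seq):
    """Map a DNA sequence to a numeric signal via overlapping doublets."""
    return np.array([DOUBLET_VALUES[seq[i:i + 2]]
                     for i in range(len(seq) - 1)], dtype=float)

def alignment_free_distance(seq_a, seq_b, n_fft=64):
    """One simple DSP metric: distance between zero-padded magnitude spectra."""
    spec_a = np.abs(np.fft.rfft(sequence_to_signal(seq_a), n=n_fft))
    spec_b = np.abs(np.fft.rfft(sequence_to_signal(seq_b), n=n_fft))
    return float(np.linalg.norm(spec_a - spec_b))

d_same = alignment_free_distance("ACGTACGTACGT", "ACGTACGTACGT")
d_diff = alignment_free_distance("ACGTACGTACGT", "TTTTTTGGGGGG")
```

    Because the comparison happens in the spectral domain, no alignment step is needed, which is the practical appeal of such distances for large sequence sets.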

  8. Problems in characterizing barrier performance

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1988-01-01

    The barrier is a synchronization construct which is useful in separating a parallel program into parallel sections that are executed in sequence. The completion of a barrier requires cooperation among all executing processes. This requirement not only introduces the "wait for the slowest process" delay inherent in the definition of the synchronization, but also has implications for the efficient implementation and measurement of barrier performance in different systems. Types of barrier implementation and their relationship to different multiprocessor environments are described. Then the problem of measuring the performance of barrier implementations on specific machine architectures is discussed. The fact that barrier synchronization requires the cooperation of all processes makes the problem of performance measurement similarly global. Making non-intrusive measurements of sufficient accuracy can be tricky on systems offering only rudimentary measurement tools.
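
    The semantics described here (every process blocks at the barrier until the slowest arrives, and only then does the next section begin) can be illustrated with Python's threading.Barrier. This shows the construct's behavior only, not the machine-specific implementations the report evaluates.

```python
import threading
import time

N = 4
barrier = threading.Barrier(N)
order = []

def worker(tid, work_time):
    time.sleep(work_time)   # unequal "parallel section" durations
    barrier.wait()          # everyone blocks until the slowest arrives
    order.append(tid)       # the second section starts only after that

threads = [threading.Thread(target=worker, args=(i, 0.01 * i)) for i in range(N)]
start = time.perf_counter()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start
```

    The elapsed time is bounded below by the slowest worker's section, which is exactly the inherent delay the report describes; any measurement of barrier overhead has to be separated from this unavoidable wait.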

  9. Genomic Signal Processing Methods for Computation of Alignment-Free Distances from DNA Sequences

    PubMed Central

    Borrayo, Ernesto; Mendizabal-Ruiz, E. Gerardo; Vélez-Pérez, Hugo; Romo-Vázquez, Rebeca; Mendizabal, Adriana P.; Morales, J. Alejandro

    2014-01-01

    Genomic signal processing (GSP) refers to the use of digital signal processing (DSP) tools for analyzing genomic data such as DNA sequences. A possible application of GSP that has not been fully explored is the computation of the distance between a pair of sequences. In this work we present GAFD, a novel GSP alignment-free distance computation method. We introduce a DNA sequence-to-signal mapping function based on the employment of doublet values, which increases the number of possible amplitude values for the generated signal. Additionally, we explore the use of three DSP distance metrics as descriptors for categorizing DNA signal fragments. Our results indicate the feasibility of employing GAFD for computing sequence distances and the use of descriptors for characterizing DNA fragments. PMID:25393409

  10. Characterization of the biosolids composting process by hyperspectral analysis.

    PubMed

    Ilani, Talli; Herrmann, Ittai; Karnieli, Arnon; Arye, Gilboa

    2016-02-01

    Composted biosolids are widely used as a soil supplement to improve soil quality. However, the application of immature or unstable compost can cause the opposite effect. To date, compost maturation determination is time consuming and cannot be done at the composting site. Hyperspectral spectroscopy was suggested as a simple tool for assessing compost maturity and quality. Nevertheless, there is still a gap in knowledge regarding several compost maturation characteristics, such as dissolved organic carbon, NO3, and NH4 contents. In addition, this approach has not yet been tested on a sample at its natural water content. Therefore, in the current study, hyperspectral analysis was employed in order to characterize the biosolids composting process as a function of composting time. This goal was achieved by correlating the reflectance spectra in the range of 400–2400 nm, using the partial least squares-regression (PLS-R) model, with the chemical properties of wet and oven-dried biosolid samples. The results showed that the proposed method can be used as a reliable means to evaluate compost maturity and stability. Specifically, the PLS-R model was found to be an adequate tool to evaluate the biosolids' total carbon and dissolved organic carbon, total nitrogen and dissolved nitrogen, and nitrate content, as well as the absorbance ratio of 254/365 nm (E2/E3) and C/N ratios in the dry and wet samples. It failed, however, to predict the ammonium content in the dry samples since the ammonium evaporated during the drying process. It was found that in contrast to what is commonly assumed, the spectral analysis of the wet samples can also be successfully used to build a model for predicting the biosolids' compost maturity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Annotating novel genes by integrating synthetic lethals and genomic information

    PubMed Central

    Schöner, Daniel; Kalisch, Markus; Leisner, Christian; Meier, Lukas; Sohrmann, Marc; Faty, Mahamadou; Barral, Yves; Peter, Matthias; Gruissem, Wilhelm; Bühlmann, Peter

    2008-01-01

    Background Large scale screening for synthetic lethality serves as a common tool in yeast genetics to systematically search for genes that play a role in specific biological processes. Often the amounts of data resulting from a single large scale screen far exceed the capacities of experimental characterization of every identified target. Thus, there is need for computational tools that select promising candidate genes in order to reduce the number of follow-up experiments to a manageable size. Results We analyze synthetic lethality data for arp1 and jnm1, two spindle migration genes, in order to identify novel members in this process. To this end, we use an unsupervised statistical method that integrates additional information from biological data sources, such as gene expression, phenotypic profiling, RNA degradation and sequence similarity. Different from existing methods that require large amounts of synthetic lethal data, our method merely relies on synthetic lethality information from two single screens. Using a Multivariate Gaussian Mixture Model, we determine the best subset of features that assign the target genes to two groups. The approach identifies a small group of genes as candidates involved in spindle migration. Experimental testing confirms the majority of our candidates and we present she1 (YBL031W) as a novel gene involved in spindle migration. We applied the statistical methodology also to TOR2 signaling as another example. Conclusion We demonstrate the general use of Multivariate Gaussian Mixture Modeling for selecting candidate genes for experimental characterization from synthetic lethality data sets. For the given example, integration of different data sources contributes to the identification of genetic interaction partners of arp1 and jnm1 that play a role in the same biological process. PMID:18194531
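
    As a rough sketch of the approach, a two-component multivariate Gaussian mixture can be fit to a feature matrix that integrates several data sources, with the smaller cluster taken as the candidate group. The features and group structure below are simulated for illustration; they are not the arp1/jnm1 data, and the candidate-group heuristic is an assumption.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy feature matrix: rows are synthetic-lethal hits, columns are integrated
# data sources (e.g. expression correlation, phenotype similarity).
rng = np.random.default_rng(2)
background = rng.normal(0.0, 0.3, (40, 3))   # uninformative hits
candidates = rng.normal(1.5, 0.3, (10, 3))   # hits sharing a profile
X = np.vstack([background, candidates])

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = gmm.predict(X)
# The smaller cluster plays the role of the candidate group to follow up on.
candidate_label = int(np.argmin(np.bincount(labels)))
n_candidates = int((labels == candidate_label).sum())
```

    The point of the unsupervised mixture is exactly what the abstract argues: it needs no large synthetic-lethality training set, only the two screens plus the integrated features.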

  12. Temporal bone borehole accuracy for cochlear implantation influenced by drilling strategy: an in vitro study.

    PubMed

    Kobler, Jan-Philipp; Schoppe, Michael; Lexow, G Jakob; Rau, Thomas S; Majdani, Omid; Kahrs, Lüder A; Ortmaier, Tobias

    2014-11-01

    Minimally invasive cochlear implantation is a surgical technique which requires drilling a canal from the mastoid surface toward the basal turn of the cochlea. The choice of an appropriate drilling strategy is hypothesized to have significant influence on the achievable targeting accuracy. Therefore, a method is presented to analyze the contribution of the drilling process and drilling tool to the targeting error isolated from other error sources. The experimental setup to evaluate the borehole accuracy comprises a drill handpiece attached to a linear slide as well as a highly accurate coordinate measuring machine (CMM). Based on the specific requirements of the minimally invasive cochlear access, three drilling strategies, mainly characterized by different drill tools, are derived. The strategies are evaluated by drilling into synthetic temporal bone substitutes containing air-filled cavities to simulate mastoid cells. Deviations from the desired drill trajectories are determined based on measurements using the CMM. Using the experimental setup, a total of 144 holes were drilled for accuracy evaluation. Errors resulting from the drilling process depend on the specific geometry of the tool as well as the angle at which the drill contacts the bone surface. Furthermore, there is a risk of the drill bit deflecting due to synthetic mastoid cells. A single-flute gun drill combined with a pilot drill of the same diameter provided the best results for simulated minimally invasive cochlear implantation, based on an experimental method that may be used for testing further drilling process improvements.

  13. Vipie: web pipeline for parallel characterization of viral populations from multiple NGS samples.

    PubMed

    Lin, Jake; Kramna, Lenka; Autio, Reija; Hyöty, Heikki; Nykter, Matti; Cinek, Ondrej

    2017-05-15

    Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of parallel processing of virome sequencing data by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de-novo assembly, taxonomic classification of viruses as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human and bacterial ribosome references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable and searchable abundance table. The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hits tables, interactive population maps, alpha diversity measures and clustered heatmaps that are grouped in applicable custom sample categories. Known references such as human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable on modern browsers. Vipie is a freely available web-based tool whose code is open source.

  14. Concentration solar power optimization system and method of using the same

    DOEpatents

    Andraka, Charles E

    2014-03-18

    A system and method for optimizing at least one mirror of at least one CSP system is provided. The system has a screen for displaying light patterns for reflection by the mirror, a camera for receiving a reflection of the light patterns from the mirror, and a solar characterization tool. The solar characterization tool has a characterizing unit for determining at least one mirror parameter of the mirror based on an initial position of the camera and the screen, and a refinement unit for refining the determined parameter(s) based on an adjusted position of the camera and screen whereby the mirror is characterized. The system may also be provided with a solar alignment tool for comparing at least one mirror parameter of the mirror to a design geometry whereby an alignment error is defined, and at least one alignment unit for adjusting the mirror to reduce the alignment error.

  15. Requirements Development for the NASA Advanced Engineering Environment (AEE)

    NASA Technical Reports Server (NTRS)

    Rogers, Eric; Hale, Joseph P.; Zook, Keith; Gowda, Sanjay; Salas, Andrea O.

    2003-01-01

    The requirements development process for the Advanced Engineering Environment (AEE) is presented. This environment has been developed to allow NASA to perform independent analysis and design of space transportation architectures and technologies. Given the highly collaborative and distributed nature of AEE, a variety of organizations are involved in the development, operations and management of the system. Furthermore, there are additional organizations involved representing external customers and stakeholders. Thorough coordination and effective communication is essential to translate desired expectations of the system into requirements. Functional, verifiable requirements for this (and indeed any) system are necessary to fulfill several roles. Requirements serve as a contractual tool, configuration management tool, and as an engineering tool, sometimes simultaneously. The role of requirements as an engineering tool is particularly important because a stable set of requirements for a system provides a common framework of system scope and characterization among team members. Furthermore, the requirements provide the basis for checking completion of system elements and form the basis for system verification. Requirements are at the core of systems engineering. The AEE Project has undertaken a thorough process to translate the desires and expectations of external customers and stakeholders into functional system-level requirements that are captured with sufficient rigor to allow development planning, resource allocation and system-level design, development, implementation and verification. These requirements are maintained in an integrated, relational database that provides traceability to governing Program requirements and also to verification methods and subsystem-level requirements.

  16. 0.35-μm excimer DUV photolithography process

    NASA Astrophysics Data System (ADS)

    Arugu, Donald O.; Green, Kent G.; Nunan, Peter D.; Terbeek, Marcel; Crank, Sue E.; Ta, Lam; Capsuto, Elliott S.; Sethi, Satyendra S.

    1993-08-01

    It is becoming increasingly clear that DUV excimer-laser-based imaging will be one of the technologies for printing sub-half-micron devices. This paper reports the investigation of a 0.35 μm photolithography process using chemically amplified DUV resists on an organic anti-reflective coating (ARC). Production data from GCA XLS excimer DUV tools, with a nominal gate width of 0.35 μm lines and 0.45 μm spaces, were studied to demonstrate device production worthiness. These data included electrical yield information for device characterization. Exposure overlay was done by mixing and matching DUV and I-line GCA steppers for critical and noncritical levels, respectively. Working isolated transistors down to 0.2 μm have been demonstrated.

  17. Battlefield decision aid for acoustical ground sensors with interface to meteorological data sources

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Noble, John M.; VanAartsen, Bruce H.; Szeto, Gregory L.

    2001-08-01

    The performance of acoustical ground sensors depends heavily on the local atmospheric and terrain conditions. This paper describes a prototype physics-based decision aid, called the Acoustic Battlefield Aid (ABFA), for predicting these environmental effects. ABFA integrates advanced models for acoustic propagation, atmospheric structure, and array signal processing into a convenient graphical user interface. The propagation calculations are performed in the frequency domain on user-definable target spectra. The solution method involves a parabolic approximation to the wave equation combined with a terrain diffraction model. Sensor performance is characterized with Cramer-Rao lower bounds (CRLBs). The CRLB calculations include randomization of signal energy and wavefront orientation resulting from atmospheric turbulence. Available performance characterizations include signal-to-noise ratio, probability of detection, direction-finding accuracy for isolated receiving arrays, and location-finding accuracy for networked receiving arrays. A suite of integrated tools allows users to create new target descriptions from standard digitized audio files and to design new sensor array layouts. These tools optionally interface with the ARL Database/Automatic Target Recognition (ATR) Laboratory, providing access to an extensive library of target signatures. ABFA also includes a Java-based capability for network access of near real-time data from surface weather stations or forecasts from the Army's Integrated Meteorological System. As an example, the detection footprint of an acoustical sensor, as it evolves over a 13-hour period, is calculated.
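    The link between signal-to-noise ratio and probability of detection mentioned above can be sketched with the standard textbook relation for a matched-filter detector in Gaussian noise, Pd = Q(Q⁻¹(Pfa) − √(2·SNR)). This is a generic model for illustration, not ABFA's actual propagation-coupled calculation.

```python
from math import sqrt
from statistics import NormalDist

def detection_probability(snr_db: float, pfa: float = 1e-3) -> float:
    """Pd for a matched-filter detector in Gaussian noise (textbook sketch).

    Uses Pd = Q(Q^-1(Pfa) - sqrt(2*SNR)); not the model used inside ABFA.
    """
    n = NormalDist()
    snr = 10 ** (snr_db / 10)        # convert dB to a linear power ratio
    threshold = n.inv_cdf(1 - pfa)   # Q^-1(Pfa): detection threshold in sigma units
    return 1 - n.cdf(threshold - sqrt(2 * snr))

# Pd rises steeply with SNR for a fixed false-alarm rate.
for snr_db in (0, 5, 10, 15):
    print(f"SNR {snr_db:2d} dB -> Pd = {detection_probability(snr_db):.3f}")
```

    A decision aid like the one described would evaluate such a curve at each point of the predicted propagation loss map to obtain a detection footprint.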

  18. Investigating the Effects of Pin Tool Design on Friction Stir Welded Ti-6Al-4V

    NASA Technical Reports Server (NTRS)

    Rubisoff, H. A.; Querin, J. A.; Schneider, Judy A.; Magee, D.

    2009-01-01

    Friction stir welding (FSW), a solid-state joining technique, uses a non-consumable rotating pin tool to thermomechanically join materials. Heating of the weldment, caused by friction and deformation, is a function of the interaction between the pin tool and the workpiece. Therefore, the geometry of the pin tool is in part responsible for the resulting microstructure and mechanical properties. In this study, microwave-sintered tungsten carbide (WC) pin tools with tapers and flats were used to FSW Ti-6Al-4V. Transverse sections of welds were mechanically tested, and the microstructure was characterized using optical microscopy (OM) and scanning electron microscopy (SEM). X-ray diffraction (XRD) and electron backscatter diffraction (EBSD) were used to characterize the texture within the welds produced from the different pin tool designs.

  19. Measurement methods to build up the digital optical twin

    NASA Astrophysics Data System (ADS)

    Prochnau, Marcel; Holzbrink, Michael; Wang, Wenxin; Holters, Martin; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

    The realization of the Digital Optical Twin (DOT), in short the digital representation of the physical state of an optical system, is particularly useful in the context of an automated assembly process for optical systems. During assembly, the physical state of the optical system is continuously measured and compared with the digital model; in case of deviations, the latter is adapted to match the physical state. To reach this goal, in a first step, measurement and characterization technologies must be identified and evaluated for their suitability to generate a precise digital twin of an existing optical system. This paper gives an overview of possible characterization methods and shows first results of the evaluated and compared methods (e.g., spot radius, MTF, Zernike polynomials) for creating a DOT. The focus initially lies on the unambiguity of the optimization results as well as on the computational time required for the optimization to reach the characterized system state. Possible sources of error are the measurement accuracy (to characterize the system), the execution time of the measurement, the time needed to map the digital to the physical world (the optimization step), and the interface possibilities for integrating the measurement tool into an assembly cell. Moreover, it is discussed whether the measurement methods used are suitable for a `seamless' integration into an assembly cell.

  20. Cyclostationarity approach for monitoring chatter and tool wear in high speed milling

    NASA Astrophysics Data System (ADS)

    Lamraoui, M.; Thomas, M.; El Badaoui, M.

    2014-02-01

    Detection of chatter and tool wear is crucial in the machining process, and their monitoring is a key issue for (1) ensuring better surface quality, (2) increasing productivity, and (3) protecting both the machine and the workpiece. This paper presents an investigation of chatter and tool wear using the cyclostationary method to process vibration signals acquired from high-speed milling. Experimental cutting tests were performed on a slot milling operation on an aluminum alloy. The experimental set-up is designed for acquisition of accelerometer signals and encoding information picked up from an encoder. The encoder signal is used to re-sample the accelerometer signals in the angular domain using a specific algorithm developed in the LASPI laboratory. Cyclostationary analysis of the accelerometer signals was applied to monitor chatter and tool wear in high-speed milling. Cyclostationarity appears in the average properties (first order) of the signals and in the energetic properties (second order), and it generates spectral lines at cyclic frequencies in the spectral correlation. Angular power and kurtosis are used to analyze the chatter phenomenon. The formation of chatter is characterized by unstable, chaotic motion of the tool and strong anomalous fluctuations of the cutting forces. Results show that stable machining generates only very few second-order cyclostationary components, while chatter is strongly correlated with second-order cyclostationary components. When machining in the unstable region, chatter results in flat angular kurtosis and flat angular power, like a pseudo-random (white) signal with a flat spectrum. Results reveal that the spectral correlation and the Wigner-Ville spectrum (or integrated Wigner-Ville) issued from second-order cyclostationarity are efficient indicators for the early diagnosis of faults in high-speed machining, such as chatter, tool wear and bearing faults, compared to traditional stationary methods. The Wigner-Ville representation of the residual signal shows that the energy corresponding to the tooth passing decreases when the chatter phenomenon occurs. The effect of tool wear and of the number of broken teeth on the excitation of structure resonances also appears in the Wigner-Ville representation.
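    The first- and second-order indicators named above (synchronous average, angular power, angular kurtosis) can be sketched as follows, assuming the signal has already been resampled to the angular domain with a constant number of samples per revolution. This is a simplified illustration; the paper's LASPI resampling algorithm itself is not reproduced.

```python
import numpy as np

def angular_indicators(signal, samples_per_rev):
    """First- and second-order cyclostationary indicators of an angle-resampled
    signal. Simplified sketch: assumes constant samples per revolution."""
    revs = len(signal) // samples_per_rev
    x = np.asarray(signal[: revs * samples_per_rev]).reshape(revs, samples_per_rev)
    sync_avg = x.mean(axis=0)                      # first order: synchronous average
    residual = x - sync_avg                        # remove the deterministic part
    angular_power = (residual ** 2).mean(axis=0)   # second order: power vs. angle
    m4 = (residual ** 4).mean(axis=0)
    angular_kurtosis = m4 / np.maximum(angular_power ** 2, 1e-30) - 3.0
    return sync_avg, angular_power, angular_kurtosis

rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
# 50 revolutions of a tooth-passing harmonic buried in broadband noise.
sig = np.tile(np.sin(4 * theta), 50) + rng.normal(0, 0.1, 50 * 256)
avg, power, kurt = angular_indicators(sig, 256)
```

    For stable cutting the residual stays nearly Gaussian (angular kurtosis near zero and flat angular power), while chatter shows up as structured second-order components.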

  1. Hidden symmetries and equilibrium properties of multiplicative white-noise stochastic processes

    NASA Astrophysics Data System (ADS)

    González Arenas, Zochil; Barci, Daniel G.

    2012-12-01

    Multiplicative white-noise stochastic processes continue to attract attention in a wide area of scientific research. The variety of prescriptions available for defining them makes the development of general tools for their characterization difficult. In this work, we study equilibrium properties of Markovian multiplicative white-noise processes. For this, we define the time reversal transformation for such processes, taking into account that the asymptotic stationary probability distribution depends on the prescription. Representing the stochastic process in a functional Grassmann formalism, we avoid the necessity of fixing a particular prescription. In this framework, we analyze equilibrium properties and study hidden symmetries of the process. We show that, using a careful definition of the equilibrium distribution and taking into account the appropriate time reversal transformation, usual equilibrium properties are satisfied for any prescription. Finally, we present a detailed deduction of a covariant supersymmetric formulation of a multiplicative Markovian white-noise process and study some of the constraints that it imposes on correlation functions using Ward-Takahashi identities.

  2. Decision-making tools in prostate cancer: from risk grouping to nomograms.

    PubMed

    Fontanella, Paolo; Benecchi, Luigi; Grasso, Angelica; Patel, Vipul; Albala, David; Abbou, Claude; Porpiglia, Francesco; Sandri, Marco; Rocco, Bernardo; Bianchi, Giampaolo

    2017-12-01

    Prostate cancer (PCa) is the most common solid neoplasm and the second leading cause of cancer death in men. After the Partin tables were developed, a number of predictive and prognostic tools became available for risk stratification. These tools have allowed the urologist to better characterize this disease and have led to more confident treatment decisions for patients. The purpose of this study is to critically review the decision-making tools currently available to the urologist, from the moment when PCa is first diagnosed until patients experience metastatic progression and death. A systematic and critical analysis through Medline, EMBASE, Scopus and Web of Science databases was carried out in February 2016 as per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. The search was conducted using the following key words: "prostate cancer," "prediction tools," "nomograms." Seventy-two studies were identified in the literature search. We summarized the results into six sections: Tools for prediction of life expectancy (before treatment), Tools for prediction of pathological stage (before treatment), Tools for prediction of survival and cancer-specific mortality (before/after treatment), Tools for prediction of biochemical recurrence (before/after treatment), Tools for prediction of metastatic progression (after treatment) and, in the last section, biomarkers and genomics. The management of PCa patients requires a tailored approach to deliver a truly personalized treatment. The currently available tools greatly assist the urologist in the decision-making process. These tests perform very well in high-grade and low-grade disease, while for intermediate-grade disease further research is needed. Newly discovered markers, genomic tests, and advances in imaging acquisition through mpMRI will help in instilling confidence that the appropriate treatments are being offered to patients with prostate cancer.

  3. Process Damping and Cutting Tool Geometry in Machining

    NASA Astrophysics Data System (ADS)

    Taylor, C. M.; Sims, N. D.; Turner, S.

    2011-12-01

    Regenerative vibration, or chatter, limits the performance of machining processes. Consequences of chatter include tool wear and poor machined surface finish. Process damping by tool-workpiece contact can reduce chatter effects and improve productivity. Process damping occurs when the flank (also known as the relief face) of the cutting tool makes contact with waves on the workpiece surface, created by chatter motion. Tool edge features can act to increase the damping effect. This paper examines how a tool's edge condition combines with the relief angle to affect process damping. An analytical model of cutting with chatter leads to a two-section curve describing how process-damped vibration amplitude changes with surface speed for radiussed tools. The tool edge dominates the process damping effect at the lowest surface speeds, with the flank dominating at higher speeds. A similar curve is then proposed for tools with worn edges. Experimental data supports the notion of the two-section curve. A rule of thumb is proposed which could be useful to machine operators regarding tool wear and process damping. Finally, the question is addressed of whether a tool of a given geometry, used for a given application, should be considered sharp, radiussed or worn with regard to process damping.

  4. Bioinformatics tools for the analysis of NMR metabolomics studies focused on the identification of clinically relevant biomarkers.

    PubMed

    Puchades-Carrasco, Leonor; Palomino-Schätzlein, Martina; Pérez-Rambla, Clara; Pineda-Lucena, Antonio

    2016-05-01

    Metabolomics, a systems biology approach focused on the global study of the metabolome, offers a tremendous potential in the analysis of clinical samples. Among other applications, metabolomics enables mapping of biochemical alterations involved in the pathogenesis of diseases, and offers the opportunity to noninvasively identify diagnostic, prognostic and predictive biomarkers that could translate into early therapeutic interventions. Particularly, metabolomics by Nuclear Magnetic Resonance (NMR) has the ability to simultaneously detect and structurally characterize an abundance of metabolic components, even when their identities are unknown. Analysis of the data generated using this experimental approach requires the application of statistical and bioinformatics tools for the correct interpretation of the results. This review focuses on the different steps involved in the metabolomics characterization of biofluids for clinical applications, ranging from the design of the study to the biological interpretation of the results. Particular emphasis is devoted to the specific procedures required for the processing and interpretation of NMR data with a focus on the identification of clinically relevant biomarkers.

  5. Characterization of the honeybee AmNaV1 channel and tools to assess the toxicity of insecticides.

    PubMed

    Gosselin-Badaroudine, Pascal; Moreau, Adrien; Delemotte, Lucie; Cens, Thierry; Collet, Claude; Rousset, Matthieu; Charnet, Pierre; Klein, Michael L; Chahine, Mohamed

    2015-07-23

    Pollination is important for both agriculture and biodiversity. For a significant number of plants, this process is highly, and sometimes exclusively, dependent on the pollination activity of honeybees. The large numbers of honeybee colony losses reported in recent years have been attributed to colony collapse disorder. Various hypotheses, including pesticide overuse, have been suggested to explain the disorder. Using the Xenopus oocyte expression system and the two-microelectrode voltage clamp, we report the functional expression and the molecular, biophysical, and pharmacological characterization of the western honeybee's sodium channel (Apis mellifera NaV1). The NaV1 channel is the primary target for pyrethroid insecticides in insect pests. We further report that the honeybee's channel is also sensitive to permethrin and fenvalerate, type I and type II pyrethroid insecticides, respectively. Molecular docking of these insecticides revealed a binding site that is similar to sites previously identified in other insects. We describe in vitro and in silico tools that can be used to test chemical compounds. Our findings could be used to assess the risks that current and next-generation pesticides pose to honeybee populations.

  6. Two states or not two states: Single-molecule folding studies of protein L

    NASA Astrophysics Data System (ADS)

    Aviram, Haim Yuval; Pirchi, Menahem; Barak, Yoav; Riven, Inbal; Haran, Gilad

    2018-03-01

    Experimental tools of increasing sophistication have been employed in recent years to study protein folding and misfolding. Folding is considered a complex process, and one way to address it is by studying small proteins, which seemingly possess a simple energy landscape with essentially only two stable states, either folded or unfolded. The B1-IgG binding domain of protein L (PL) is considered a model two-state folder, based on measurements using a wide range of experimental techniques. We applied single-molecule fluorescence resonance energy transfer (FRET) spectroscopy in conjunction with a hidden Markov model analysis to fully characterize the energy landscape of PL and to extract the kinetic properties of individual molecules of the protein. Surprisingly, our studies revealed the existence of a third state, hidden under the two-state behavior of PL due to its small population, ~7%. We propose that this minority intermediate involves partial unfolding of the two C-terminal β strands of PL. Our work demonstrates that single-molecule FRET spectroscopy can be a powerful tool for a comprehensive description of the folding dynamics of proteins, capable of detecting and characterizing relatively rare metastable states that are difficult to observe in ensemble studies.
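    The way a hidden Markov analysis can expose a sparsely populated third state can be sketched with a generic forward-algorithm likelihood comparison on a synthetic FRET-efficiency trace. All parameters below (state means, transition rates, noise) are invented for illustration and are not the authors' fitted values.

```python
import numpy as np

def log_likelihood(obs, trans, means, sigma, init):
    """Forward-algorithm log-likelihood of a trace under an HMM with Gaussian
    emissions (generic sketch, not the authors' exact analysis)."""
    def emis(x):
        return np.exp(-0.5 * ((x - means) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    alpha = init * emis(obs[0])
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for x in obs[1:]:
        alpha = (alpha @ trans) * emis(x)   # propagate, then weight by emission
        s = alpha.sum()
        ll += np.log(s)
        alpha /= s                          # rescale to avoid underflow
    return ll

rng = np.random.default_rng(1)
# Synthetic 3-state trace with a sparsely visited intermediate near E ~ 0.55.
means3 = np.array([0.2, 0.55, 0.9])
T3 = np.array([[0.97, 0.01, 0.02],
               [0.07, 0.86, 0.07],
               [0.02, 0.01, 0.97]])
s, obs = 0, []
for _ in range(5000):
    obs.append(rng.normal(means3[s], 0.05))
    s = rng.choice(3, p=T3[s])
obs = np.array(obs)

two = log_likelihood(obs, np.array([[0.98, 0.02], [0.02, 0.98]]),
                     np.array([0.2, 0.9]), 0.05, np.array([0.5, 0.5]))
three = log_likelihood(obs, T3, means3, 0.05, np.ones(3) / 3)
```

    The two-state model is heavily penalized by the rare intermediate-efficiency dwells, so the three-state likelihood comes out clearly higher even though the intermediate occupies only a few percent of the trace.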

  7. Surprisal analysis of Glioblastoma Multiform (GBM) microRNA dynamics unveils tumor specific phenotype.

    PubMed

    Zadran, Sohila; Remacle, Francoise; Levine, Raphael

    2014-01-01

    Glioblastoma multiforme (GBM) is the most fatal form of all brain cancers in humans. Currently there are limited diagnostic tools for GBM detection. Here, we applied surprisal analysis, a theory grounded in thermodynamics, to unveil how biomolecule energetics, specifically a redistribution of free energy amongst microRNAs (miRNAs), results in a system deviating from a non-cancer state to the GBM cancer-specific phenotypic state. Utilizing global miRNA microarray expression data from normal and GBM patient tumors, surprisal analysis characterizes a miRNA system response capable of distinguishing GBM samples from normal tissue biopsy samples. We indicate that the miRNAs contributing to this system behavior define a disease phenotypic state specific to GBM, and therefore a unique GBM-specific thermodynamic signature. MiRNAs implicated in the regulation of stochastic signaling processes crucial in the hallmarks of human cancer dominate this GBM cancer phenotypic state. With this theory, we were able to distinguish GBM patients with high fidelity solely by monitoring the dynamics of miRNAs present in patients' biopsy samples. We anticipate that the GBM-specific thermodynamic signature will provide a critical translational tool in better characterizing cancer types and in the development of future therapeutics for GBM.
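    For context, surprisal analysis expands each measured expression level around a maximal-entropy baseline. In the standard form of the theory (the notation here is generic, not copied from this paper), the level of transcript i at time t is written as:

```latex
% Surprisal-analysis decomposition (standard form):
%   X_i^0(t)          maximal-entropy (baseline) expression level
%   \lambda_\alpha(t) time-dependent Lagrange multiplier of constraint \alpha
%   G_{i\alpha}       weight of transcript i in transcription pattern \alpha
X_i(t) \;=\; X_i^{0}(t)\,
  \exp\!\Bigl(-\sum_{\alpha \ge 1} \lambda_\alpha(t)\, G_{i\alpha}\Bigr)
```

    A disease-specific signature then corresponds to a constraint α whose multiplier λ_α separates tumor samples from normal ones.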

  8. Status of EUVL mask development in Europe (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Peters, Jan H.

    2005-06-01

    EUV lithography is the prime candidate for the next-generation lithography technology after 193 nm immersion lithography. The commercial onset for this technology is expected at the 45 nm half-pitch technology node or below. Several European and national projects, and quite a large number of companies and research institutions in Europe, work on various aspects of the technological challenges to make EUV a commercially viable technology in the near future. Here the development of EUV sources, the development of EUV exposure tools, metrology tools dedicated to mask characterization, the production of EUV mask blanks and the mask structuring itself are the key areas in which major activities can be found. In this talk we will primarily focus on those activities related to establishing an EUV mask supply chain with all its ingredients, from substrate production, polishing, deposition of EUV layers, blank characterization, mask patterning and the consecutive metrology and defect inspection, to shipping and handling from blank supply to usage in the wafer fab. The EUV mask related projects on the national level are primarily supported by the French Ministry of Economics and Finance (MinEFi) and the German Ministry of Education and Research (BMBF).

  9. Equipment characterization to mitigate risks during transfers of cell culture manufacturing processes.

    PubMed

    Sieblist, Christian; Jenzsch, Marco; Pohlscheidt, Michael

    2016-08-01

    The production of monoclonal antibodies by mammalian cell culture in bioreactors up to 25,000 L is state-of-the-art technology in the biotech industry. During the lifecycle of a product, several scale-up activities and technology transfers are typically executed to enable the supply chain strategy of a global pharmaceutical company. Given the sensitivity of mammalian cells to physicochemical culture conditions, process and equipment knowledge are critical to avoid impacts on timelines, product quantity and quality. In particular, the fluid dynamics of large-scale bioreactors versus small-scale models need to be described, and similarity demonstrated, in light of the Quality by Design approach promoted by the FDA. This approach comprises an associated design space which is established during process characterization and validation in bench-scale bioreactors. Therefore, the establishment of predictive models and simulation tools for major operating conditions of stirred vessels (mixing, mass transfer, and shear force), based on fundamental engineering principles, has experienced a renaissance in recent years. This work illustrates the systematic characterization of a large variety of bioreactor designs deployed in a global manufacturing network, ranging from small bench-scale equipment to large-scale production equipment (25,000 L). Several traditional methods to determine power input, mixing, mass transfer and shear force have been used to create a database and identify differences for various impeller types and configurations in operating ranges typically applied in cell culture processes at manufacturing scale. In addition, extrapolation of different empirical models, e.g. Cooke et al. (paper presented at the Proceedings of the 2nd International Conference on Bioreactor Fluid Dynamics, Cranfield, UK, 1988), has been assessed for validity in these operating ranges. Results for selected designs are shown and serve as examples of structured characterization to enable fast and agile process transfers, scale-up and troubleshooting.

  10. Characterizing caged molecules through flash photolysis and transient absorption spectroscopy.

    PubMed

    Kao, Joseph P Y; Muralidharan, Sukumaran

    2013-01-01

    Caged molecules are photosensitive molecules with latent biological activity. Upon exposure to light, they are rapidly transformed into bioactive molecules such as neurotransmitters or second messengers. They are thus valuable tools for using light to manipulate biology with exceptional spatial and temporal resolution. Since the temporal performance of the caged molecule depends critically on the rate at which bioactive molecules are generated by light, it is important to characterize the kinetics of the photorelease process. This is accomplished by initiating the photoreaction with a very brief but intense pulse of light (i.e., flash photolysis) and monitoring the course of the ensuing reactions through various means, the most common of which is absorption spectroscopy. Practical guidelines for performing flash photolysis and transient absorption spectroscopy are described in this chapter.
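    A common first step in analyzing a flash-photolysis transient is fitting a first-order decay to the absorbance change, ΔA(t) = ΔA₀·exp(−kt). The sketch below uses a log-linear least-squares fit on a synthetic trace; the rate constant and amplitude are hypothetical, and real traces would additionally need baseline correction and noise weighting.

```python
import numpy as np

def fit_first_order_rate(t, delta_a):
    """Estimate rate k and amplitude dA0 for dA(t) = dA0 * exp(-k t)
    via a log-linear least-squares fit (sketch only)."""
    slope, intercept = np.polyfit(t, np.log(delta_a), 1)
    return -slope, np.exp(intercept)

# Synthetic transient-absorption trace (hypothetical parameters).
t = np.linspace(0, 5e-6, 200)      # 5 microseconds of data
k_true = 1.2e6                     # s^-1, assumed photorelease decay rate
trace = 0.05 * np.exp(-k_true * t) # delta-absorbance, arbitrary amplitude
k_fit, a0 = fit_first_order_rate(t, trace)
```

    The fitted k is what sets the temporal resolution of the caged compound: photorelease must be fast relative to the biological process being triggered.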

  11. Far-Field High-Energy Diffraction Microscopy: A Non-Destructive Tool for Characterizing the Microstructure and Micromechanical State of Polycrystalline Materials

    DOE PAGES

    Park, Jun-Sang; Zhang, Xuan; Kenesei, Peter; ...

    2017-08-31

    A suite of non-destructive, three-dimensional X-ray microscopy techniques have recently been developed and used to characterize the microstructures of polycrystalline materials. These techniques utilize high-energy synchrotron radiation and include near-field and far-field diffraction microscopy (NF- and FF-HEDM, respectively) and absorption tomography. Several compatible sample environments have also been developed, enabling a wide range of 3D studies of material evolution. In this article, the FF-HEDM technique is described in detail, including its implementation at the 1-ID beamline of the Advanced Photon Source. Examples of how the information obtained from FF-HEDM can be used to deepen our understanding of structure-property-processing relationships in selected materials are presented.

  12. Synthesis and characterization of non-hydrolysable diphosphoinositol polyphosphate second messengers

    PubMed Central

    Wu, Mingxuan; Dul, Barbara E.; Trevisan, Alexandra J.; Fiedler, Dorothea

    2012-01-01

    The diphosphoinositol polyphosphates (PP-IPs) are a central group of eukaryotic second messengers. They regulate numerous processes, including cellular energy homeostasis and adaptation to environmental stresses. To date, most of the molecular details in PP-IP signalling have remained elusive, due to a lack of appropriate methods and reagents. Here we describe the expedient synthesis of methylene-bisphosphonate PP-IP analogues. Their characterization revealed that the analogues exhibit significant stability and mimic their natural counterparts very well. This was further confirmed in two independent biochemical assays, in which our analogues potently inhibited phosphorylation of the protein kinase Akt and hydrolytic activity of the Ddp1 phosphohydrolase. The non-hydrolysable PP-IPs thus emerge as important tools and hold great promise for a variety of applications. PMID:23378892

  13. Syntheses and characterization of liposome-incorporated adamantyl aminoguanidines.

    PubMed

    Šekutor, Marina; Štimac, Adela; Mlinarić-Majerski, Kata; Frkanec, Ruža

    2014-08-21

    A series of mono- and bis-aminoguanidinium adamantane derivatives has been synthesized and incorporated into liposomes. They combine two biomedically significant moieties, the adamantane moiety and the guanidinium group. The adamantane moiety possesses membrane-compatible features, while the cationic guanidinium subunit was recognized as a favourable structural feature for binding to complementary molecules comprising phosphate groups. The liposome formulations of adamantyl aminoguanidines were characterized and it was shown that the entrapment efficiency of the examined compounds is significant. In addition, it was demonstrated that liposomes with incorporated adamantyl aminoguanidines effectively recognized the complementary liposomes via the phosphate group. These results indicate that adamantane derivatives bearing guanidinium groups might be versatile tools for biomedical application, from studies of molecular recognition processes to usage in drug formulation and cell targeting.

  14. The "Vsoil Platform" : a tool to integrate the various physical, chemical and biological processes contributing to the soil functioning at the local scale.

    NASA Astrophysics Data System (ADS)

    Lafolie, François; Cousin, Isabelle; Mollier, Alain; Pot, Valérie; Moitrier, Nicolas; Balesdent, Jérome; Bruckler, Laurent; Moitrier, Nathalie; Nouguier, Cédric; Richard, Guy

    2014-05-01

    Models describing soil functioning are valuable tools for addressing challenging issues related to agricultural production, soil protection or biogeochemical cycles. Coupling models that address different scientific fields is required in order to develop numerical tools able to simulate the complex interactions and feedbacks occurring within a soil profile in interaction with climate and human activities. We present here a component-based modelling platform named "VSoil", which aims at designing, developing, implementing and coupling numerical representations of biogeochemical and physical processes in soil, from the aggregate to the profile scale. The platform consists of four software tools: i) Vsoil_Processes, dedicated to the conceptual description of processes and of their inputs and outputs; ii) Vsoil_Modules, devoted to the development of numerical representations of elementary processes as modules; iii) Vsoil_Models, which permits the coupling of modules to create models; and iv) Vsoil_Player, for running the model and the primary analysis of results. The platform is designed to be a collaborative tool, helping scientists to share not only their models but also the scientific knowledge on which the models are built. It is based on the idea that processes of any kind can be described and characterized by their inputs (required state variables) and their outputs. The links between the processes are automatically detected by the platform software. For any process, several numerical representations (modules) can be developed and made available to platform users. When developing modules, the platform takes care of many aspects of the development task so that the user can focus on numerical calculations. Fortran 2008 and C++ are the supported languages, and existing codes can easily be incorporated into platform modules. Building a model from the available modules simply requires selecting the processes to be accounted for and, for each process, a module. During this task, the platform displays the available modules and checks their compatibility. The model (main program) is automatically created when compatible modules have been selected for all processes. A GUI is automatically generated to help the user provide parameters and initial conditions. Numerical results can be immediately visualized, archived and exported. The platform also provides facilities to carry out sensitivity analyses. Parameter estimation and links with databases are being developed. The platform can be freely downloaded from the web site (http://www.inra.fr/sol_virtuel/) with a set of processes, variables, modules and models. However, it is designed so that any user can add their own components. These add-ons can be shared with co-workers by means of an e-mail-based export/import mechanism, and can also be made available to the whole community of platform users at the developer's request. A filtering tool is available to explore the content of the platform (processes, variables, modules, models).
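    The automatic link detection described above, matching each module's required inputs against the outputs declared by other modules, can be sketched as follows. The module and variable names are invented for illustration; this is not Vsoil's actual implementation.

```python
def detect_links(modules):
    """modules: dict name -> {'inputs': set, 'outputs': set}.
    Returns (links, unresolved): links are (producer, variable, consumer)
    triples; unresolved are (consumer, variable) pairs with no producer.
    Sketch of input/output matching, not Vsoil's actual code."""
    producers = {}
    for name, m in modules.items():
        for var in m['outputs']:
            producers.setdefault(var, []).append(name)
    links, unresolved = [], []
    for name, m in modules.items():
        for var in m['inputs']:
            if var in producers:
                links.extend((p, var, name) for p in producers[var])
            else:
                unresolved.append((name, var))  # must come from forcing data or a GUI
    return links, unresolved

# Two hypothetical soil-process modules: one produces soil moisture,
# the other consumes it for carbon mineralization.
modules = {
    'water_flow': {'inputs': {'rainfall'},
                   'outputs': {'soil_moisture'}},
    'c_mineral':  {'inputs': {'soil_moisture', 'temperature'},
                   'outputs': {'co2_flux'}},
}
links, unresolved = detect_links(modules)
```

    Unresolved inputs are exactly the variables a platform would have to request from the user (as forcings or parameters) before a model can be assembled.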

  15. Competitive annealing of multiple DNA origami: formation of chimeric origami

    NASA Astrophysics Data System (ADS)

    Majikes, Jacob M.; Nash, Jessica A.; LaBean, Thomas H.

    2016-11-01

    Scaffolded DNA origami are a robust tool for building discrete nanoscale objects at high yield. This strategy ensures, in the design process, that the desired nanostructure is the minimum free energy state for the designed set of DNA sequences. Despite aiming for the minimum free energy structure, the folding process which leads to that conformation is difficult to characterize, although it has been the subject of much research. In order to shed light on the molecular folding pathways, this study intentionally frustrates the folding process of these systems by simultaneously annealing the staple pools for multiple target or parent origami structures, forcing competition. A surprising result of these competitive, simultaneous anneals is the formation of chimeric DNA origami which inherit structural regions from both parent origami. By comparing the regions inherited from the parent origami, the relative stability of substructures was compared. This allowed examination of the folding process with typical characterization techniques and materials. Anneal curves were then used as a means to rapidly generate a phase diagram of anticipated behavior as a function of staple excess and parent staple ratio. This initial study shows that competitive anneals provide an exciting way to create diverse new nanostructures and may be used to examine the relative stability of various structural motifs.

  16. Process Analytical Technology for High Shear Wet Granulation: Wet Mass Consistency Reported by In-Line Drag Flow Force Sensor Is Consistent With Powder Rheology Measured by At-Line FT4 Powder Rheometer.

    PubMed

    Narang, Ajit S; Sheverev, Valery; Freeman, Tim; Both, Douglas; Stepaniuk, Vadim; Delancy, Michael; Millington-Smith, Doug; Macias, Kevin; Subramanian, Ganeshkumar

    2016-01-01

A drag flow force (DFF) sensor that measures the force exerted by wet mass in a granulator on a thin cylindrical probe was shown to be a promising process analytical technology for real-time, in-line, high-resolution monitoring of wet mass consistency during high shear wet granulation. Our previous studies indicated that this process analytical technology tool could be correlated to a granulation end point established independently through drug product critical quality attributes. In this study, the measurements of flow force by a DFF sensor, taken during wet granulation of 3 placebo formulations with different binder content, are compared with concurrent at-line FT4 Powder Rheometer characterization of wet granules collected at different time points of the processing. The wet mass consistency measured by the DFF sensor correlated well with the granulation's resistance to flow and interparticulate interactions as measured by the FT4 Powder Rheometer. This indicated that the force pulse magnitude measured by the DFF sensor was indicative of fundamental material properties (e.g., shear viscosity and granule size/density) as they changed during the granulation process. These studies indicate that the DFF sensor can be a valuable tool for wet granulation formulation and process development and scale-up, as well as for routine monitoring and control during manufacturing. Copyright © 2016. Published by Elsevier Inc.

  17. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made in tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.

  18. Proposal on How To Conduct a Biopharmaceutical Process Failure Mode and Effect Analysis (FMEA) as a Risk Assessment Tool.

    PubMed

    Zimmermann, Hartmut F; Hentschel, Norbert

    2011-01-01

    With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. 
The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
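The ranking of process parameters by risk rating described above is conventionally done with a risk priority number (RPN), the product of the three 1-to-10 ratings. The sketch below assumes the standard RPN formula and invented failure modes; the guideline's actual rating table is not reproduced here.

```python
# Classic FMEA risk priority number: RPN = occurrence x severity x detectability,
# each rated on a 1-to-10 scale (10 = worst); higher RPN = higher priority.

def rpn(occurrence, severity, detectability):
    for rating in (occurrence, severity, detectability):
        if not 1 <= rating <= 10:
            raise ValueError("ratings must lie on the 1-to-10 scale")
    return occurrence * severity * detectability

# Hypothetical failure modes for a biopharmaceutical unit operation
failure_modes = {
    "wrong buffer pH": rpn(occurrence=3, severity=8, detectability=4),
    "filter integrity breach": rpn(occurrence=2, severity=9, detectability=2),
    "temperature excursion": rpn(occurrence=5, severity=6, detectability=3),
}
# Rank failure modes by risk, highest first
ranked = sorted(failure_modes, key=failure_modes.get, reverse=True)
assert ranked[0] == "wrong buffer pH"  # RPN 96
```

Note how a rarely occurring but hard-to-detect failure can outrank a more frequent, easily detected one; this is exactly the prioritization behavior the FMEA ranking is meant to provide.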

  19. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

The first purpose of this study was to illustrate the contribution of statistical process control to improving the security of intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, which is characterized by pretreatment quality control results; it was therefore necessary to bring portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether the ionisation chamber could be replaced by portal dosimetry in order to reduce the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head-and-neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, the reference detector for absolute dose measurement, and with portal dosimetry for verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphic tools such as control charts to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the process's ability to meet specifications (the capability study). The study was performed on 450 head-and-neck beams and on 100 prostate beams. Control charts of the mean and standard deviation were established; they revealed both slow, weak drifts and strong, fast ones, and exposed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. 
The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting the ionisation chamber measurements with those performed with the EPID, and showed that statistical process control monitoring of the data provided a security guarantee. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
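The control-chart and capability tools mentioned in the abstract follow standard Shewhart SPC arithmetic. A minimal sketch, assuming 3-sigma individual-value limits and the usual Cp/Cpk definitions (the dose-deviation data and the +/-5% tolerance are illustrative, not the clinic's):

```python
import statistics

def control_limits(samples):
    """Center line and Shewhart-style 3-sigma control limits."""
    mean = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    return mean - 3 * sigma, mean, mean + 3 * sigma

def capability(samples, lsl, usl):
    """Cp and Cpk: how comfortably the process sits inside the tolerance band."""
    mean = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative per-beam dose deviations (%) against a +/-5% tolerance
deviations = [0.2, -0.5, 1.1, 0.3, -0.8, 0.6, -0.1, 0.9, -0.4, 0.0]
lcl, cl, ucl = control_limits(deviations)
cp, cpk = capability(deviations, lsl=-5.0, usl=5.0)
assert lcl < cl < ucl and cpk <= cp
```

Points falling outside (lcl, ucl) would flag a special cause, such as the multileaf-collimator leaf-gap shift described above; Cpk well above 1 indicates the process comfortably meets its tolerance.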

  20. WE-G-BRC-02: Risk Assessment for HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayadev, J.

    2016-06-15

Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effect analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.

  1. WE-G-BRC-01: Risk Assessment for Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, G.

    2016-06-15

Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effect analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.

  2. WE-G-BRC-03: Risk Assessment for Physics Plan Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, S.

    2016-06-15

Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effect analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.

  4. A Computable Definition of Sepsis Facilitates Screening and Performance Improvement Tracking

    PubMed Central

    Warmus, Holly R.; Schaffner, Erin K.; Kantawala, Sajel; Carcillo, Joseph; Rosen, Johanna; Horvat, Christopher M.

    2018-01-01

Background: Sepsis kills almost 5,000 children annually, accounting for 16% of pediatric health care spending in the United States. Objectives: We sought to identify sepsis within the Electronic Health Record (EHR) of a quaternary children’s hospital to characterize disease incidence, improve recognition and response, and track performance metrics. Methods: Methods are organized in a plan-do-study-act cycle. During the “plan” phase, electronic definitions of sepsis (blood culture and antibiotic within 24 hours) and septic shock (sepsis plus vasoactive medication) were created to establish benchmark data and track progress with statistical process control. The performance of a screening tool was evaluated in the emergency department. During the “do” phase, a novel inpatient workflow is being piloted, which involves regular sepsis screening by nurses using the tool and a regimented response to high-risk patients. Results: Screening tool use in the emergency department reduced time to antibiotics (Fig. 1). Of the 6,159 admissions between July and December 2016, the EHR definitions identified 1,433 (23.3%) with sepsis, of which 159 (11.1%) had septic shock. Hospital mortality was 2.2% for all sepsis patients and 15.7% for septic shock (Table 1). These findings approximate epidemiologic studies of sepsis and severe sepsis, which report a prevalence range of 0.45–8.2% and a mortality range of 8.2–25% (Table 2).1–5 Conclusions/Implications: Implementation of a sepsis screening tool is associated with improved performance. The prevalence of sepsis conditions identified with electronic definitions approximates the epidemiologic landscape characterized by other point-prevalence and administrative studies, providing face validity to this approach and proving useful for tracking performance improvement. PMID:29732457
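The computable definitions above (blood culture plus antibiotic within 24 hours for sepsis; an added vasoactive medication for septic shock) reduce to a window query over EHR event timestamps. A minimal sketch, assuming hypothetical event records rather than the hospital's actual schema:

```python
from datetime import datetime, timedelta

WINDOW = timedelta(hours=24)

def classify(admission_events):
    """Apply the paper's computable definitions to one admission.

    Returns 'septic shock', 'sepsis', or None. Events are (kind, timestamp)
    tuples with hypothetical kinds: 'blood_culture', 'antibiotic', 'vasoactive'.
    """
    times = {}
    for kind, ts in admission_events:
        times.setdefault(kind, []).append(ts)
    sepsis = any(
        abs(abx - bc) <= WINDOW
        for bc in times.get("blood_culture", [])
        for abx in times.get("antibiotic", [])
    )
    if not sepsis:
        return None
    return "septic shock" if times.get("vasoactive") else "sepsis"

t0 = datetime(2016, 7, 1, 8, 0)
events = [("blood_culture", t0), ("antibiotic", t0 + timedelta(hours=3))]
assert classify(events) == "sepsis"
assert classify(events + [("vasoactive", t0 + timedelta(hours=5))]) == "septic shock"
```

Because the definition is purely computable, the same function can be run over every admission to produce the benchmark counts tracked with statistical process control.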

  5. Flexible continuous manufacturing platforms for solid dispersion formulations

    NASA Astrophysics Data System (ADS)

    Karry-Rivera, Krizia Marie

In 2013, 16,000 people died in the US from overdoses of prescription drugs and synthetic narcotics. As of that same year, 90% of new molecular entities in the pharmaceutical drug pipeline were classified as poorly water-soluble. The work in this dissertation aims to design, develop and validate platforms that solubilize weak acids and can potentially deter drug abuse. These platforms are based on processing solid dispersions via solvent-casting and hot-melt extrusion methods to produce oral transmucosal films and melt tablets. To develop these platforms, nanocrystalline suspensions and glassy solutions were solvent-cast in the form of films after physicochemical characterization of drug-excipient interactions and design-of-experiments approaches. A second-order model was fitted to the emulsion diffusion process to predict average nanoparticle size and for process optimization. To further validate the manufacturing flexibility of the formulations, glassy solutions were also extruded and molded into tablets. This process included a systematic quality-by-design (QbD) approach that served to identify the factors affecting the critical quality attributes (CQAs) of the melt tablets. These products, due to their novelty, lack discriminatory performance tests that serve as predictors of their compliance and stability. Consequently, Process Analytical Technology (PAT) tools were integrated into the continuous manufacturing platform for films. Near-infrared (NIR) spectroscopy, including chemical imaging, combined with deconvolution algorithms was utilized for a holistic assessment of the effect of formulation and process variables on the product's CQAs. Biorelevant dissolution protocols were then established to improve the in-vivo in-vitro correlation of the oral transmucosal films. In conclusion, the work in this dissertation supports the delivery of poorly water-soluble drugs in products that may deter abuse. 
Drug nanocrystals ensured high bioavailability, while glassy solutions enabled drug solubilization in polymer matrices. PAT tools helped to characterize the micro- and macrostructure of the product and also served as a control strategy for manufacturing. The systematic QbD assessment enabled identification of the variables that significantly affected melt tablet performance and their potential as an abuse-deterrent product. Because these glassy products are novel systems, biorelevant protocols for testing the dissolution performance of the films were also developed.

  6. A review on characterization and bioremediation of pharmaceutical industries' wastewater: an Indian perspective

    NASA Astrophysics Data System (ADS)

    Rana, Rajender Singh; Singh, Prashant; Kandari, Vikash; Singh, Rakesh; Dobhal, Rajendra; Gupta, Sanjay

    2017-03-01

During the past few decades, pharmaceutical industries have registered a quantum jump, contributing to high economic growth, but this has also given rise to severe environmental pollution. Untreated or inadequately treated pharmaceutical industrial wastewater (PIWW) creates a need for regular assessment and characterization of discharged wastewater against the standards provided by the regulatory authorities. To control environmental pollution, pharmaceutical industries use different treatment schemes to treat and reuse wastewater. The characterization of PIWW using advanced and coupled techniques has progressed considerably, but in view of new developments in drug manufacture for emerging diseases and the complexities associated with them, more sophisticated instrumentation and methods of treatment are warranted. The bioremediation of PIWW has undergone intense investigation in the recent decade. It can result in the complete mineralization of pharmaceutical industries' wastewater, leaving no waste product, and its high efficiency and low operating cost make it an effective tool for the treatment of PIWW. The present review focuses on the characterization as well as the bioremediation of PIWW.

  7. Improving Planetary Rover Attitude Estimation via MEMS Sensor Characterization

    PubMed Central

    Hidalgo, Javier; Poulakis, Pantelis; Köhler, Johan; Del-Cerro, Jaime; Barrientos, Antonio

    2012-01-01

Micro Electro-Mechanical Systems (MEMS) are currently being considered in the space sector due to their suitable level of performance for spacecraft in terms of mechanical robustness with low power consumption, small mass and size, and their significant advantages in system design and accommodation. However, there is still a lack of understanding regarding the performance and testing of these new sensors, especially in planetary robotics. This paper presents what is missing in the field: a complete, directly applicable methodology for the characterization and modeling of MEMS sensors. A reproducible and complete approach including all the intermediate steps, tools and laboratory equipment is described. The process of sensor error characterization and modeling through to the final integration in the sensor fusion scheme is explained in detail. Although the concept of fusion is relatively easy to comprehend, carefully characterizing and filtering sensor information is not an easy task and is essential for good performance. The strength of the approach has been verified with representative tests of novel high-grade MEMS inertial sensors and exemplary planetary rover platforms, with promising results. PMID:22438761
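One standard ingredient of MEMS inertial-sensor error characterization is the Allan variance, which separates noise terms by averaging time. The abstract does not state the authors' exact procedure, so the following is a generic, textbook non-overlapping Allan deviation:

```python
import math
import random

def allan_deviation(samples, m):
    """Non-overlapping Allan deviation at cluster size m (tau = m * dt).

    A standard tool for characterizing MEMS gyro/accelerometer noise;
    this is the textbook form, not necessarily the paper's procedure.
    """
    n_clusters = len(samples) // m
    means = [sum(samples[i * m:(i + 1) * m]) / m for i in range(n_clusters)]
    diffs = [(means[i + 1] - means[i]) ** 2 for i in range(n_clusters - 1)]
    return math.sqrt(sum(diffs) / (2 * len(diffs)))

# Sanity check: for white noise the Allan deviation falls off as ~1/sqrt(tau)
random.seed(0)
rate = [random.gauss(0.0, 0.1) for _ in range(4096)]
assert allan_deviation(rate, 4) > allan_deviation(rate, 64)
```

Plotting the deviation against tau on log-log axes reveals the characteristic slopes of angle random walk, bias instability, and rate random walk, which is how such a curve feeds the error model used in a sensor fusion scheme.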

  8. 3D-liquid chromatography as a complex mixture characterization tool for knowledge-based downstream process development.

    PubMed

    Hanke, Alexander T; Tsintavi, Eleni; Ramirez Vazquez, Maria Del Pilar; van der Wielen, Luuk A M; Verhaert, Peter D E M; Eppink, Michel H M; van de Sandt, Emile J A X; Ottens, Marcel

    2016-09-01

Knowledge-based development of chromatographic separation processes requires efficient techniques to determine the physicochemical properties of the product and of the impurities to be removed. These characterization techniques are usually divided into approaches that determine molecular properties, such as charge, hydrophobicity and size, and approaches that determine molecular interactions with auxiliary materials, commonly in the form of adsorption isotherms. In this study we demonstrate the application of a three-dimensional liquid chromatography approach to a clarified cell homogenate containing a therapeutic enzyme. Each separation dimension determines a molecular property relevant to the chromatographic behavior of each component. Matching the peaks across the different separation dimensions and against a high-resolution reference chromatogram makes it possible to assign the determined parameters to pseudo-components and to identify the most promising technique for the removal of each impurity. More detailed process design using mechanistic models requires isotherm parameters. For this purpose, the second dimension consists of multiple linear gradient separations on columns in a high-throughput-screening-compatible format, which allow regression of isotherm parameters with an average standard error of 8%. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1283-1291, 2016. © 2016 American Institute of Chemical Engineers.
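Isotherm-parameter regression of the kind mentioned above can be illustrated with a simple case: fitting a Langmuir isotherm by linearized least squares. The model choice and the synthetic data are assumptions for illustration; the study's actual isotherm model and gradient-elution regression are not reproduced here.

```python
def fit_langmuir(c, q):
    """Fit q = qmax*K*c / (1 + K*c) via the linearized form
    c/q = c/qmax + 1/(qmax*K), using ordinary least squares on (c, c/q)."""
    x = c
    y = [ci / qi for ci, qi in zip(c, q)]
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    qmax = 1.0 / slope
    K = slope / intercept
    return qmax, K

# Synthetic noiseless data generated from qmax=50, K=2 should be recovered
c = [0.1, 0.5, 1.0, 2.0, 5.0]
q = [50 * 2 * ci / (1 + 2 * ci) for ci in c]
qmax, K = fit_langmuir(c, q)
assert abs(qmax - 50) < 1e-6 and abs(K - 2) < 1e-6
```

With real gradient-elution data the regression would be nonlinear and noisy, which is where the reported ~8% average standard error on the estimated parameters comes in.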

  9. Visualizing red blood cell sickling and the effects of inhibition of sphingosine kinase 1 using soft X-ray tomography

    DOE PAGES

    Darrow, Michele C.; Zhang, Yujin; Cinquin, Bertrand P.; ...

    2016-08-09

Sickle cell disease is a destructive genetic disorder characterized by the formation of fibrils of deoxygenated hemoglobin, leading to the red blood cell (RBC) morphology changes that underlie the clinical manifestations of this disease. Here, using cryogenic soft X-ray tomography (SXT), we characterized the morphology of sickled RBCs in terms of volume and the number of protrusions per cell. We were able to identify statistically a relationship between the number of protrusions and the volume of the cell, which is known to correlate with the severity of sickling. This structural polymorphism allows for the classification of the stages of the sickling process. Recent studies have shown that elevated sphingosine kinase 1 (Sphk1)-mediated sphingosine 1-phosphate production contributes to sickling. Here, we further demonstrate that compound 5C, an inhibitor of Sphk1, has anti-sickling properties. Additionally, the variation in cellular morphology upon treatment suggests that this drug acts to delay the sickling process. SXT is an effective tool that can be used to identify the morphology of the sickling process and assess the effectiveness of potential therapeutics.

  10. A Study of Drop-Microstructured Surface Interactions during Dropwise Condensation with Quartz Crystal Microbalance

    PubMed Central

    Su, Junwei; Charmchi, Majid; Sun, Hongwei

    2016-01-01

Dropwise condensation (DWC) on hydrophobic surfaces is attracting attention for its great potential in many industrial applications, such as steam power plants, water desalination, and de-icing of aerodynamic surfaces, to name a few. Direct dynamic characterization of liquid/solid interaction can significantly accelerate progress toward a full understanding of the thermal and mass transport mechanisms during DWC processes. This work reports a novel Quartz Crystal Microbalance (QCM) based method that can quantitatively analyze the interaction between water droplets and micropillar surfaces during different condensation states such as filmwise, Wenzel, and partial Cassie states. A combined nanoimprint lithography and chemical surface treatment approach was utilized to fabricate the micropillar-based superhydrophobic and superhydrophilic surfaces on the QCM substrates. The normalized frequency shift of the QCM device, together with microscopic observation of the corresponding drop motion, revealed the droplet growth and coalescence processes and clearly demonstrated the differences between the three aforementioned condensation states. In addition, the transition between Cassie and Wenzel states was successfully captured by this method. The newly developed QCM system provides a valuable tool for the dynamic characterization of different condensation processes. PMID:27739452
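QCM frequency shifts are conventionally related to areal mass loading through the Sauerbrey equation. That relation strictly holds for rigid, thin, uniform films, which condensing droplets are not, so the sketch below illustrates only the device's sensitivity scale, not the study's actual analysis.

```python
import math

RHO_Q = 2.648    # quartz density, g/cm^3
MU_Q = 2.947e11  # quartz shear modulus, g/(cm*s^2)

def sauerbrey_shift(f0_hz, delta_mass_g_per_cm2):
    """Sauerbrey frequency shift (Hz) for a rigid, thin, uniform mass layer:
    delta_f = -2 * f0^2 * delta_m / sqrt(rho_q * mu_q)."""
    return -2.0 * f0_hz ** 2 * delta_mass_g_per_cm2 / math.sqrt(RHO_Q * MU_Q)

# A 5 MHz crystal loses roughly 56.6 Hz per microgram/cm^2 of added mass
shift = sauerbrey_shift(5e6, 1e-6)
assert -57 < shift < -56
```

This sensitivity scale is what makes a QCM able to resolve individual droplet growth and coalescence events as measurable frequency excursions.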

  11. Challenges in Biomarker Discovery: Combining Expert Insights with Statistical Analysis of Complex Omics Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Wang, Jing; Mitchell, Hugh D.

    2013-01-01

The advent of high throughput technologies capable of comprehensive analysis of genes, transcripts, proteins and other significant biological molecules has provided an unprecedented opportunity for the identification of molecular markers of disease processes. However, it has simultaneously complicated the problem of extracting meaningful signatures of biological processes from these complex datasets. The process of biomarker discovery and characterization provides opportunities for both purely statistical and expert knowledge-based approaches and would benefit from improved integration of the two. Areas covered: In this review we present examples of current practices for biomarker discovery from complex omic datasets and the challenges that have been encountered. We then present a high-level review of data-driven (statistical) and knowledge-based methods applied to biomarker discovery, highlighting some current efforts to combine the two distinct approaches. Expert opinion: Effective, reproducible and objective tools for combining data-driven and knowledge-based approaches to biomarker discovery and characterization are key to future success in the biomarker field. We describe our recommendations of possible approaches to this problem, including metrics for the evaluation of biomarkers.

  12. Visualizing red blood cell sickling and the effects of inhibition of sphingosine kinase 1 using soft X-ray tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darrow, Michele C.; Zhang, Yujin; Cinquin, Bertrand P.

Sickle cell disease is a destructive genetic disorder characterized by the formation of fibrils of deoxygenated hemoglobin, leading to the red blood cell (RBC) morphology changes that underlie the clinical manifestations of this disease. Here, using cryogenic soft X-ray tomography (SXT), we characterized the morphology of sickled RBCs in terms of volume and the number of protrusions per cell. We were able to identify statistically a relationship between the number of protrusions and the volume of the cell, which is known to correlate with the severity of sickling. This structural polymorphism allows for the classification of the stages of the sickling process. Recent studies have shown that elevated sphingosine kinase 1 (Sphk1)-mediated sphingosine 1-phosphate production contributes to sickling. Here, we further demonstrate that compound 5C, an inhibitor of Sphk1, has anti-sickling properties. Additionally, the variation in cellular morphology upon treatment suggests that this drug acts to delay the sickling process. SXT is an effective tool that can be used to identify the morphology of the sickling process and assess the effectiveness of potential therapeutics.

  13. Droughts and water scarcity: facing challenges

    NASA Astrophysics Data System (ADS)

    Pereira, Luis S.

    2014-05-01

    Water scarcity characterizes large portions of the world, particularly the Mediterranean area. It is due to natural causes - climate aridity, which is permanent, and droughts, which are temporary - and to human causes - long-term desertification and short-term water shortages. Droughts aggravate water scarcity. Knowledge of all these processes is well developed, but management tools are still insufficient, as are the tools required to support appropriate planning and management. In particular, new approaches to tools for assessing related impacts in agriculture and other economic and social activities are required. Droughts occur in all climates, but their characteristics differ largely among regions in terms of frequency, duration and intensity. Research has already produced a large number of tools that allow appropriate monitoring of drought occurrence and intensity, including the dynamics of drought occurrence and its time evolution. Advances in drought prediction are already available, but we are still far from knowing when a drought will start, how it will evolve and when it will dissipate. New developments using teleconnections and GCMs are being considered. Climate change is a fact. Are drought occurrence and severity changing with global change? Opinions are divided on this subject, since the driving factors and processes are varied and the tools for the corresponding analysis are also various. In particular, weather data series are often too short for obtaining appropriate answers. In a domain where research is producing improved knowledge and innovative approaches, research nevertheless faces a variety of challenges. The main ones, addressed in this keynote, refer to concepts and definitions, use of monitoring indices, prediction of drought initiation and evolution, improved assessment of drought impacts, and the possible influence of climate change on drought occurrence and severity.
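    The monitoring indices mentioned in the abstract typically standardize a precipitation record against its own climatology. The sketch below is a deliberately crude, illustrative stand-in for operational indices such as the SPI (which fits a probability distribution rather than a plain z-score); the function name and the sample data are hypothetical.

    ```python
    from statistics import mean, pstdev

    def standardized_anomaly(series):
        """Z-score each value against the climatology of the whole series.

        Negative values flag drier-than-normal periods. This is only an
        illustrative stand-in for indices such as the SPI, which fit a
        distribution to the record instead of using a plain z-score.
        """
        mu, sigma = mean(series), pstdev(series)
        return [(x - mu) / sigma for x in series]

    # Hypothetical annual precipitation totals (mm) for one station.
    rain = [610, 480, 720, 390, 560]
    print(standardized_anomaly(rain))
    ```

    A real drought index would also aggregate over multiple time scales (3-, 6-, 12-month windows) to separate short shortages from long-term scarcity.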

  14. Electrostatic Levitation: A Tool to Support Materials Research in Microgravity

    NASA Technical Reports Server (NTRS)

    Rogers, Jan; SanSoucie, Mike

    2012-01-01

    Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts, and high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.

  15. System decontamination as a tool to control radiation fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riess, R.; Bertholdt, H.O.

    1995-03-01

    Since chemical decontamination of the Reactor Coolant System (RCS) and subsystems has the highest potential to reduce radiation fields in the short term, this technology has gained increasing importance. The decontamination process available at Siemens, i.e., the CORD process, will be described. It is characterized by the use of permanganic acid for preoxidation and diluted organic acid for the decontamination step. It is a regenerative process resulting in very low waste volumes. This technology has been used frequently in Europe and Japan in both RCS and subsystems. An overview will be given of the 1993 applications, including plant, scope, date of performance, system volume, special features of the process, removed activities, decontamination factor, time, waste volumes, and personnel dose during decontamination. This overview will be followed by an outlook on future developments in this area.

  16. Holographic digital microscopy in on-line process control

    NASA Astrophysics Data System (ADS)

    Osanlou, Ardeshir

    2011-09-01

    This article investigates the feasibility of real-time three-dimensional imaging of microscopic objects within various emulsions while being produced in specialized production vessels. The study is particularly relevant to on-line process monitoring and control in chemical, pharmaceutical, food, cleaning, and personal hygiene industries. Such processes are often dynamic and the materials cannot be measured once removed from the production vessel. The technique reported here is applicable to three-dimensional characterization analyses on stirred fluids in small reaction vessels. Relatively expensive pulsed lasers have been avoided through the careful control of the speed of the moving fluid in relation to the speed of the camera exposure and the wavelength of the continuous wave laser used. The ultimate aim of the project is to introduce a fully robust and compact digital holographic microscope as a process control tool in a full size specialized production vessel.

  17. The Impact of Solid Surface Features on Fluid-Fluid Interface Configuration

    NASA Astrophysics Data System (ADS)

    Araujo, J. B.; Brusseau, M. L. L.

    2017-12-01

    Pore-scale fluid processes in geological media are critical for a broad range of applications such as radioactive waste disposal, carbon sequestration, soil moisture distribution, subsurface pollution, land stability, and oil and gas recovery. The continued improvement of high-resolution image acquisition and processing has provided a means to test the usefulness of theoretical models developed to simulate pore-scale fluid processes, through the direct quantification of interfaces. High-resolution synchrotron X-ray microtomography is used in combination with advanced visualization tools to characterize fluid distributions in natural geologic media. The studies revealed the presence of fluid-fluid interfaces associated with macroscopic features on the surfaces of the solids, such as pits and crevices. These features and their respective fluid interfaces, which are not included in current theoretical or computational models, may have a significant impact on the accurate simulation and understanding of multi-phase flow, energy, heat and mass transfer processes.

  18. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
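    Two of the metrics the abstract names, duplication rate and GC content, can be illustrated with a toy computation over raw read sequences. This is a minimal sketch only; RNA-SeQC itself operates on aligned BAM files, and the function name and sample reads below are hypothetical.

    ```python
    from collections import Counter

    def qc_metrics(reads):
        """Toy versions of two read-level quality metrics: the
        duplicate-read rate and the mean GC fraction.

        Illustrative only; not RNA-SeQC's actual implementation.
        """
        counts = Counter(reads)
        total = len(reads)
        # Fraction of reads that repeat an already-seen identical sequence.
        dup_rate = (total - len(counts)) / total if total else 0.0
        # GC fraction pooled over all bases in all reads.
        gc = sum(seq.count("G") + seq.count("C") for seq in reads)
        bases = sum(len(seq) for seq in reads)
        return {"duplication_rate": dup_rate,
                "gc_fraction": gc / bases if bases else 0.0}

    reads = ["ACGT", "ACGT", "GGCC", "ATAT"]
    print(qc_metrics(reads))
    ```

    In practice such metrics are monitored across samples, so a library with an outlying duplication rate or GC bias can be excluded before downstream analysis.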

  19. PROTERAN: animated terrain evolution for visual analysis of patterns in protein folding trajectory.

    PubMed

    Zhou, Ruhong; Parida, Laxmi; Kapila, Kush; Mudur, Sudhir

    2007-01-01

    The mechanism of protein folding remains largely a mystery in molecular biology, despite the enormous effort from many groups in the past decades. Currently, the protein folding mechanism is often characterized by calculating the free energy landscape versus various reaction coordinates such as the fraction of native contacts, the radius of gyration and so on. In this paper, we present an integrated approach towards understanding the folding process via visual analysis of patterns of these reaction coordinates. The three disparate processes (1) protein folding simulation, (2) pattern elicitation and (3) visualization of patterns, work in tandem. Thus as the protein folds, the changing landscape in the pattern space can be viewed via the visualization tool, PROTERAN, a program we developed for this purpose. We first present an incremental (on-line) trie-based pattern discovery algorithm to elicit the patterns and then describe the terrain metaphor based visualization tool. Using two example small proteins, a beta-hairpin and a designed protein Trp-cage, we next demonstrate that this combined pattern discovery and visualization approach extracts crucial information about protein folding intermediates and mechanism.
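    The trie-based pattern discovery the abstract describes can be sketched as incremental insertion of fixed-length windows of a discretized trajectory into a nested-dict trie, with occurrence counts at the leaves. This is a simplified illustration under assumed details, not the PROTERAN algorithm itself; the function names and the symbol encoding are hypothetical.

    ```python
    def build_trie(sequence, k):
        """Incrementally insert every length-k window of a symbol sequence
        into a nested-dict trie, counting occurrences under the '#' key."""
        root = {}
        for i in range(len(sequence) - k + 1):
            node = root
            for sym in sequence[i:i + k]:
                node = node.setdefault(sym, {})
            node["#"] = node.get("#", 0) + 1
        return root

    def frequent_patterns(root, k, min_count, prefix=()):
        """Walk the trie and yield (pattern, count) for every length-k
        pattern observed at least min_count times."""
        if len(prefix) == k:
            if root.get("#", 0) >= min_count:
                yield prefix, root["#"]
            return
        for sym, child in root.items():
            if sym != "#":
                yield from frequent_patterns(child, k, min_count, prefix + (sym,))

    # Symbols could encode discretized reaction-coordinate values
    # (e.g. binned fraction of native contacts) along a trajectory.
    trie = build_trie("ABABAB", 2)
    print(dict(frequent_patterns(trie, 2, 2)))
    ```

    Because insertion is per-window, the trie can be updated on-line as the simulation produces new frames, matching the incremental setting the paper describes.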

  20. Development of the Thalamocortical Interactions: Past, Present and Future.

    PubMed

    López-Bendito, Guillermina

    2018-06-20

    For the past two decades, we have advanced in our understanding of the mechanisms implicated in the formation of brain circuits. The connection between the cortex and thalamus has deserved much attention, as thalamocortical connectivity is crucial for sensory processing and motor learning. Classical dye tracing studies in wild-type and knockout mice initially helped to characterize the developmental progression of this connectivity and revealed key transcription factors involved. With the recent advances in technical tools to specifically label subsets of projecting neurons, knock-down genes individually and/or modify their activity, the field has gained further understanding on the rules operating in thalamocortical circuit formation and plasticity. In this review, I will summarize the most relevant discoveries that have been made in this field, from development to early plasticity processes covering three major aspects: axon guidance, thalamic influence on sensory cortical specification, and the role of spontaneous thalamic activity. I will emphasize how the implementation of new tools has helped the field to progress and what I consider to be open questions and the perspective for the future. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
