Launch Vehicle Design Process Characterization Enables Design/Project Tool
NASA Technical Reports Server (NTRS)
Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.; Robinson, Nancy (Technical Monitor)
2001-01-01
The objectives of the project described in this viewgraph presentation included the following: (1) provide an overview characterization of the launch vehicle design process; and (2) delineate a design/project tool to identify, document, and track pertinent data.
Standardizing Exoplanet Analysis with the Exoplanet Characterization Tool Kit (ExoCTK)
NASA Astrophysics Data System (ADS)
Fowler, Julia; Stevenson, Kevin B.; Lewis, Nikole K.; Fraine, Jonathan D.; Pueyo, Laurent; Bruno, Giovanni; Filippazzo, Joe; Hill, Matthew; Batalha, Natasha; Wakeford, Hannah; Bushra, Rafia
2018-06-01
Exoplanet characterization depends critically on analysis tools, models, and spectral libraries that are constantly under development and have no single source or unified style and methods. The complexity of spectroscopic analysis and the initial time commitment required to become competitive are prohibitive to new researchers entering the field and remain an obstacle for established groups hoping to contribute in a manner comparable to their peers. As a solution, we are developing an open-source, modular data analysis package in Python and a public-facing web interface including tools that address atmospheric characterization, transit observation planning with JWST, JWST coronagraphy simulations, limb darkening, forward modeling, and data reduction, as well as libraries of stellar, planetary, and opacity models. The foundations of these software tools and libraries exist within pockets of the exoplanet community, but our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.
National Water-Quality Assessment (NAWQA) area-characterization toolbox
Price, Curtis V.; Nakagaki, Naomi; Hitt, Kerie J.
2010-01-01
This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells. These tools are built on top of standard functionality included in ArcGIS Desktop running at the ArcInfo license level. Most of the tools require a license for the ArcGIS Spatial Analyst extension. ArcGIS is a commercial GIS software system produced by ESRI, Inc. (http://www.esri.com). The NAWQA Area-Characterization Toolbox is not supported by ESRI, Inc. or its technical support staff. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.
Designing tools for oil exploration using nuclear modeling
NASA Astrophysics Data System (ADS)
Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike
2017-09-01
When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.
Graduate level design - Courses and projects: An untapped resource
NASA Technical Reports Server (NTRS)
Dubrawsky, Ido; Neff, Jon M.; Pinon, Elfego, III; Fowler, Wallace T.
1993-01-01
The authors describe their experiences at a major space engineering university (the University of Texas at Austin) in the use of graduate level design courses and projects to produce information and tools that are of use to undergraduate design classes, graduate students, and industry. The information produced to date includes a spacecraft subsystems information document, a mission design tool (a FORTRAN subroutine library), a series of space mission characterizations, and a set of spacecraft characterizations.
Combined experiment Phase 2 data characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, M.S.; Shipley, D.E.; Young, T.S.
1995-11-01
The National Renewable Energy Laboratory's "Combined Experiment" has yielded a large quantity of experimental data on the operation of a downwind horizontal-axis wind turbine under field conditions. To fully utilize this valuable resource and identify particular episodes of interest, a number of databases were created that characterize individual data events and rotational cycles over a wide range of parameters. Each of the 59 five-minute data episodes collected during Phase II of the Combined Experiment has been characterized by the mean, minimum, maximum, and standard deviation of all data channels, except the blade surface pressures. Inflow condition, aerodynamic force coefficient, and minimum leading-edge pressure coefficient databases have also been established, characterizing each of nearly 21,000 blade rotational cycles. In addition, a number of tools have been developed for searching these databases for particular episodes of interest. Due to their extensive size, only a portion of the episode characterization databases are included in an appendix, and examples of the cycle characterization databases are given. The search tools are discussed, and the FORTRAN or C code for each is included in appendices.
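The per-episode summaries described above (mean, minimum, maximum, and standard deviation of every data channel) can be sketched in a few lines of Python; the channel names and values below are illustrative stand-ins, not actual Combined Experiment data:

```python
import statistics

def characterize_episode(channels):
    """Summarize each data channel of an episode by its mean, minimum,
    maximum, and sample standard deviation."""
    summary = {}
    for name, samples in channels.items():
        summary[name] = {
            "mean": statistics.fmean(samples),
            "min": min(samples),
            "max": max(samples),
            "stdev": statistics.stdev(samples),
        }
    return summary

# Hypothetical episode channels (not actual experiment data)
episode = {
    "wind_speed_mps": [7.2, 8.1, 6.9, 7.5],
    "rotor_rpm": [71.8, 72.0, 71.9, 72.1],
}
stats = characterize_episode(episode)
print(stats["wind_speed_mps"]["max"])  # 8.1
```

Summaries like these can then be indexed in a database and searched to retrieve episodes of interest, e.g. all episodes whose mean wind speed exceeds a threshold.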
Shi, Wuxian; Chance, Mark R.
2010-01-01
About one-third of all proteins are associated with a metal. Metalloproteomics is defined as the structural and functional characterization of metalloproteins on a genome-wide scale. The methodologies utilized in metalloproteomics, including both forward (bottom-up) and reverse (top-down) technologies, to provide information on the identity, quantity and function of metalloproteins are discussed. Important techniques frequently employed in metalloproteomics include classical proteomics tools such as mass spectrometry and 2-D gels, immobilized-metal affinity chromatography, bioinformatics sequence analysis and homology modeling, X-ray absorption spectroscopy and other synchrotron radiation based tools. Combinative applications of these techniques provide a powerful approach to understand the function of metalloproteins. PMID:21130021
Extensional rheometry with a handheld mobile device
NASA Astrophysics Data System (ADS)
Marshall, Kristin A.; Liedtke, Aleesha M.; Todt, Anika H.; Walker, Travis W.
2017-06-01
The on-site characterization of complex fluids is important for a number of academic and industrial applications. Consequently, a need exists to develop portable rheometers that can provide in-the-field diagnostics and serve as tools for rapid quality assurance. With the advancement of smartphone technology and the widespread global ownership of smart devices, mobile applications are attractive as platforms for rheological characterization. The present work investigates the use of a smartphone device for the extensional characterization of a series of Boger fluids composed of glycerol/water and poly(ethylene oxide), taking advantage of the increasing high-speed video capabilities (currently up to a 240 Hz capture rate at 720p) of smartphone cameras. We report a noticeable difference in the characterization of samples with slight variations in polymer concentration and discuss current device limitations. Potential benefits of a handheld extensional rheometer include its use as a point-of-care diagnostic tool, especially in developing communities, as well as a simple and inexpensive tool for assessing product quality in industry.
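As a rough illustration of the kind of analysis such a device enables, the sketch below fits the elasto-capillary thinning law D(t) ~ D0 exp(-t/(3 lambda)) to filament-diameter measurements extracted from video frames. The 240 Hz frame times and the diameters are synthetic, and this is a generic fitting sketch, not the authors' actual processing pipeline:

```python
import math

def relaxation_time(times_s, diameters):
    """Estimate the extensional relaxation time lambda from elasto-capillary
    thinning, where D(t) ~ D0 * exp(-t / (3 * lambda)): fit a least-squares
    line to ln(D) versus t; the slope equals -1 / (3 * lambda)."""
    logs = [math.log(d) for d in diameters]
    n = len(times_s)
    t_mean = sum(times_s) / n
    l_mean = sum(logs) / n
    slope = (sum((t - t_mean) * (l - l_mean) for t, l in zip(times_s, logs))
             / sum((t - t_mean) ** 2 for t in times_s))
    return -1.0 / (3.0 * slope)

# Synthetic diameters sampled at a 240 Hz frame rate with lambda = 5 ms
lam_true = 0.005
times = [i / 240.0 for i in range(1, 6)]
diams = [1.0 * math.exp(-t / (3.0 * lam_true)) for t in times]
print(round(relaxation_time(times, diams), 4))  # 0.005
```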
Development of Chemical and Metabolite Sensors for Rhodococcus opacus PD630.
DeLorenzo, Drew M; Henson, William R; Moon, Tae Seok
2017-10-20
Rhodococcus opacus PD630 is a nonmodel, Gram-positive bacterium that possesses desirable traits for biomass conversion, including consumption capabilities for lignocellulose-based sugars and toxic lignin-derived aromatic compounds, significant triacylglycerol accumulation, relatively rapid growth rate, and genetic tractability. However, few genetic elements have been directly characterized in R. opacus, limiting its application for lignocellulose bioconversion. Here, we report the characterization and development of genetic tools for tunable gene expression in R. opacus, including: (1) six fluorescent reporters for quantifying promoter output, (2) three chemically inducible promoters for variable gene expression, and (3) two classes of metabolite sensors derived from native R. opacus promoters that detect nitrogen levels or aromatic compounds. Using these tools, we also provide insights into native aromatic consumption pathways in R. opacus. Overall, this work expands the ability to control and characterize gene expression in R. opacus for future lignocellulose-based fuel and chemical production.
Models, Measurements, and Local Decisions: Assessing and ...
This presentation combines modeling and measurement results to characterize near-source air quality in Newark, New Jersey, with consideration of how this information could be used to inform decision making to reduce the risk of health impacts. Decisions could include either exposure or emissions reductions and could involve a host of stakeholders, including residents, academics, NGOs, and local and federal agencies. This presentation includes results from the C-PORT modeling system and from a citizen science project in the local area. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision-support tools for use both within and outside of EPA.
Evaluation of equipment and methods to map lost circulation zones in geothermal wells
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, W.J.; Leon, P.A.; Pittard, G.
A study and evaluation of methods to locate, characterize, and quantify lost circulation zones are described. Twenty-five methods of mapping and quantifying lost circulation zones were evaluated, including electrical, acoustical, mechanical, radioactive, and optical systems. Each tool studied is described. The structured, numerical evaluation plan, used as the basis for comparing the 25 tools, and the resulting ranking among the tools is presented.
Wireline system for multiple direct push tool usage
Bratton, Wesley L.; Farrington, Stephen P.; Shinn, II, James D.; Nolet, Darren C.
2003-11-11
A tool latching and retrieval system allows the deployment and retrieval of a variety of direct push subsurface characterization tools through an embedded rod string during a single penetration without requiring withdrawal of the string from the ground. This enables the in situ interchange of different tools, as well as the rapid retrieval of soil core samples from multiple depths during a single direct push penetration. The system includes specialized rods that make up the rod string, a tool housing which is integral to the rod string, a lock assembly, and several tools which mate to the lock assembly.
ERIC Educational Resources Information Center
Pacilio, Julia E.; Tokarski, John T.; Quiñones, Rosalynn; Iuliucci, Robbie J.
2014-01-01
High-resolution solid-state NMR (SSNMR) spectroscopy has many advantages as a tool to characterize solid-phase material that finds applications in polymer chemistry, nanotechnology, materials science, biomolecular structure determination, and others, including the pharmaceutical industry. The technology associated with achieving high resolution…
Method and apparatus for characterizing and enhancing the functional performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F; Smith, Kevin Scott; Assaid, Thomas S; McFarland, Justin T; Tursky, David A; Woody, Bethany; Adams, David
2013-04-30
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include workpiece surface finish, and the ability to generate chips of the desired length.
Method and apparatus for characterizing and enhancing the dynamic performance of machine tools
Barkman, William E; Babelay, Jr., Edwin F
2013-12-17
Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include dynamic one axis positional accuracy of the machine tool, dynamic cross-axis stability of the machine tool, and dynamic multi-axis positional accuracy of the machine tool.
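A minimal sketch of the kind of excitation pattern the two disclosures describe: a one-axis command profile of repetitive short-distance displacements. The step size, repeat count, and dwell length here are arbitrary illustrative parameters, not values from the patents:

```python
def excitation_pattern(step_mm, repeats, dwell_samples):
    """Build a one-axis command position profile of repetitive
    short-distance displacements: step out, dwell, step back, dwell."""
    profile = []
    position = 0.0
    for _ in range(repeats):
        position += step_mm
        profile.extend([position] * dwell_samples)
        position -= step_mm
        profile.extend([position] * dwell_samples)
    return profile

# Hypothetical parameters: 10-micrometer steps, 3 repeats, 4-sample dwells
pattern = excitation_pattern(step_mm=0.01, repeats=3, dwell_samples=4)
print(len(pattern))  # 24
```

Driving a slide with such a profile and measuring the response (e.g., positional error or surface finish on a test part) yields the quantified merit the disclosures refer to.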
Chen, Guangbo; Zhao, Yufei; Shang, Lu; Waterhouse, Geoffrey I N; Kang, Xiaofeng; Wu, Li-Zhu; Tung, Chen-Ho; Zhang, Tierui
2016-07-01
Monovalent Zn+ (3d10 4s1) systems possess a special electronic structure that can be exploited in heterogeneous catalysis and photocatalysis, though it remains a challenge to synthesize Zn+-containing materials. By careful design, Zn+-related species can be synthesized in zeolite and layered double hydroxide systems, which in turn exhibit excellent catalytic potential in methane, CO, and CO2 activation. Furthermore, by utilizing advanced characterization tools, including electron spin resonance, X-ray absorption fine structure, and density functional theory calculations, the formation mechanism of the Zn+ species and their structure-performance relationships can be understood. Such advanced characterization tools guide the rational design of high-performance Zn+-containing catalysts for efficient energy conversion.
Chen, Guangbo; Zhao, Yufei; Shang, Lu; Waterhouse, Geoffrey I. N.; Kang, Xiaofeng; Wu, Li‐Zhu; Tung, Chen‐Ho
2016-01-01
Monovalent Zn+ (3d10 4s1) systems possess a special electronic structure that can be exploited in heterogeneous catalysis and photocatalysis, though it remains a challenge to synthesize Zn+‐containing materials. By careful design, Zn+‐related species can be synthesized in zeolite and layered double hydroxide systems, which in turn exhibit excellent catalytic potential in methane, CO, and CO2 activation. Furthermore, by utilizing advanced characterization tools, including electron spin resonance, X‐ray absorption fine structure, and density functional theory calculations, the formation mechanism of the Zn+ species and their structure‐performance relationships can be understood. Such advanced characterization tools guide the rational design of high‐performance Zn+‐containing catalysts for efficient energy conversion. PMID:27818902
PIPI: PTM-Invariant Peptide Identification Using Coding Method.
Yu, Fengchao; Li, Ning; Yu, Weichuan
2016-12-02
In computational proteomics, the identification of peptides with an unlimited number of post-translational modification (PTM) types is a challenging task. The computational cost associated with database search increases exponentially with respect to the number of modified amino acids and linearly with respect to the number of potential PTM types at each amino acid. The problem becomes intractable very quickly if we want to enumerate all possible PTM patterns. To address this issue, one group of methods, named restricted tools (including Mascot, Comet, and MS-GF+), only allows a small number of PTM types in the database search process. Alternatively, the other group of methods, named unrestricted tools (including MS-Alignment, ProteinProspector, and MODa), avoids enumerating PTM patterns with an alignment-based approach to localizing and characterizing modified amino acids. However, because of the large search space and the PTM localization issue, the sensitivity of these unrestricted tools is low. This paper proposes a novel method named PIPI to achieve PTM-invariant peptide identification. PIPI belongs to the category of unrestricted tools. It first codes peptide sequences into Boolean vectors and codes experimental spectra into real-valued vectors. For each coded spectrum, it then searches the coded sequence database to find the top-scored peptide sequences as candidates. After that, PIPI uses dynamic programming to localize and characterize modified amino acids in each candidate. We used simulation experiments and real data experiments to evaluate the performance in comparison with restricted tools (i.e., Mascot, Comet, and MS-GF+) and unrestricted tools (i.e., Mascot with error tolerant search, MS-Alignment, ProteinProspector, and MODa). Comparison with restricted tools shows that PIPI achieves comparable sensitivity and running speed. Comparison with unrestricted tools shows that PIPI has the highest sensitivity except for Mascot with error tolerant search and ProteinProspector.
These two tools simplify the task by only considering up to one modified amino acid in each peptide, which results in a higher sensitivity but has difficulty in dealing with multiple modified amino acids. The simulation experiments also show that PIPI has the lowest false discovery proportion, the highest PTM characterization accuracy, and the shortest running time among the unrestricted tools.
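The coding idea behind PIPI can be illustrated with a toy sketch: peptides are coded as Boolean vectors over m/z bins of their theoretical fragment masses, spectra as real-valued vectors of binned intensities, and candidates are ranked by a dot-product score. This is a simplified illustration (b-ions only, one charge state, a small residue-mass table), not PIPI's actual encoding:

```python
# Toy monoisotopic residue masses (Da); a small illustrative subset
RESIDUE_MASS = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
                "V": 99.06841, "L": 113.08406, "K": 128.09496, "R": 156.10111}

def code_peptide(peptide, bin_width=1.0, max_mz=2000.0):
    """Code a peptide as a Boolean vector over m/z bins by marking the bins
    of its theoretical b-ion fragment masses."""
    n_bins = int(max_mz / bin_width)
    vec = [False] * n_bins
    mass = 1.00794  # proton added to the b-ion prefix (approximate)
    for residue in peptide[:-1]:
        mass += RESIDUE_MASS[residue]
        idx = int(mass / bin_width)
        if idx < n_bins:
            vec[idx] = True
    return vec

def code_spectrum(peaks, bin_width=1.0, max_mz=2000.0):
    """Code a spectrum as a real-valued vector of binned peak intensities."""
    n_bins = int(max_mz / bin_width)
    vec = [0.0] * n_bins
    for mz, intensity in peaks:
        idx = int(mz / bin_width)
        if idx < n_bins:
            vec[idx] += intensity
    return vec

def score(spectrum_vec, peptide_vec):
    """Dot-product score: summed intensity in the bins the peptide predicts."""
    return sum(s for s, hit in zip(spectrum_vec, peptide_vec) if hit)

spectrum = code_spectrum([(58.5, 10.0), (129.2, 20.0), (216.4, 5.0)])
print(score(spectrum, code_peptide("GASK")))  # 35.0
print(score(spectrum, code_peptide("VLKR")))  # 0.0
```

Because a modified residue shifts a fragment by a constant mass, a coding of this kind can be made less sensitive to PTM patterns than exact mass matching, which is the motivation for PIPI's PTM-invariant search.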
Stand-Alone Measurements and Characterization | Photovoltaic Research |
The Science and Technology Facility cluster tools offer powerful capabilities for measuring and characterizing photovoltaic materials and devices. The stand-alone tools of the Measurements and Characterization tool suite are supplemented by the Integrated Measurements and Characterization (M&C) cluster tool. Samples can be moved to the Integrated M&C cluster tool using a mobile transport pod, which can keep samples under vacuum.
Development of Chemical and Metabolite Sensors for Rhodococcus opacus PD630
DeLorenzo, Drew M.; Henson, William R.; Moon, Tae Seok
2017-07-26
Rhodococcus opacus PD630 is a nonmodel, Gram-positive bacterium that possesses desirable traits for biomass conversion, including consumption capabilities for lignocellulose-based sugars and toxic lignin-derived aromatic compounds, significant triacylglycerol accumulation, relatively rapid growth rate, and genetic tractability. However, few genetic elements have been directly characterized in R. opacus, limiting its application for lignocellulose bioconversion. Here, we report the characterization and development of genetic tools for tunable gene expression in R. opacus, including: (1) six fluorescent reporters for quantifying promoter output, (2) three chemically inducible promoters for variable gene expression, and (3) two classes of metabolite sensors derived from native R. opacus promoters that detect nitrogen levels or aromatic compounds. Using these tools, we also provide insights into native aromatic consumption pathways in R. opacus. Overall, this work expands the ability to control and characterize gene expression in R. opacus for future lignocellulose-based fuel and chemical production.
Fundamental CRISPR-Cas9 tools and current applications in microbial systems.
Tian, Pingfang; Wang, Jia; Shen, Xiaolin; Rey, Justin Forrest; Yuan, Qipeng; Yan, Yajun
2017-09-01
Derived from the bacterial adaptive immune system, CRISPR technology has revolutionized conventional genetic engineering methods and unprecedentedly facilitated strain engineering. In this review, we outline the fundamental CRISPR tools that have been employed for strain optimization. These tools include CRISPR editing, CRISPR interference, CRISPR activation, and protein imaging. To further characterize the CRISPR technology, we present current applications of these tools in microbial systems, including model and non-model industrial microorganisms. Specifically, we point out the major challenges of the CRISPR tools when utilized for multiplex genome editing and sophisticated expression regulation. To address these challenges, we propose strategies that place emphasis on the amelioration of DNA repair efficiency through CRISPR-Cas9-assisted recombineering. Lastly, we propose multiple promising research directions, mainly focusing on CRISPR-based construction of microbial ecosystems toward high production of desired chemicals.
Intentional defect array wafers: their practical use in semiconductor control and monitoring systems
NASA Astrophysics Data System (ADS)
Emami, Iraj; McIntyre, Michael; Retersdorf, Michael
2003-07-01
In the competitive world of semiconductor manufacturing today, control of the process and manufacturing equipment is paramount to the success of the business. Consistent with the need for rapid development of process technology is a need for development with respect to equipment control, including defect metrology tools. Historical control methods for defect metrology tools included a raw count of defects detected on a characterized production or test wafer, with little or no regard to the attributes of the detected defects. Over time, these characterized wafers degrade with multiple passes on the tools and handling, requiring the tool owner to create and characterize new samples periodically. With the complex engineering software analysis systems used today, there is a strong reliance on the accuracy of defect size, location, and classification in order to provide the best value when correlating inline data to sort-type data. Intentional Defect Array (IDA) wafers were designed and manufactured at International SEMATECH (ISMT) in Austin, Texas, and are a product of collaboration between ISMT member companies and suppliers of advanced defect inspection equipment. These wafers provide the user with known defect types and sizes in predetermined locations across the entire wafer. The wafers are designed to incorporate several desired flows and use critical dimensions consistent with current and future technology nodes. This paper briefly describes the design of the IDA wafer and details many practical applications in the control of advanced defect inspection equipment.
ERIC Educational Resources Information Center
Borko, Hilda; Stecher, Brian; Kuffner, Karin
2007-01-01
This document includes the final data collection and scoring tools created by the "Scoop" project, a five-year project funded through the Center for Evaluation, Standards,and Student Testing (CRESST), to develop an alternative approach for characterizing classroom practice. The goal of the project was to use artifacts and related materials to…
Assessment of SOAP note evaluation tools in colleges and schools of pharmacy.
Sando, Karen R; Skoy, Elizabeth; Bradley, Courtney; Frenzel, Jeanne; Kirwin, Jennifer; Urteaga, Elizabeth
2017-07-01
To describe current methods used to assess SOAP notes in colleges and schools of pharmacy. Members of the American Association of Colleges of Pharmacy Laboratory Instructors Special Interest Group were invited to share assessment tools for SOAP notes. The content of submissions was evaluated to characterize overall qualities and how the tools assessed subjective, objective, assessment, and plan information. Thirty-nine assessment tools from 25 schools were evaluated. Twenty-nine (74%) of the tools were rubrics, and ten (26%) were checklists. All rubrics included analytic scoring elements, while two (7%) mixed holistic and analytic scoring elements. The most common rating scale among the rubrics was a four-item scale (35%). Substantial variability existed in how tools evaluated the subjective and objective sections. All tools included problem identification in the assessment section. Other assessment items included goals (82%) and rationale (69%). Seventy-seven percent assessed drug therapy; however, only 33% assessed non-drug therapy. Other plan items included education (59%) and follow-up (90%). There is a great deal of variation in the specific elements used to evaluate SOAP notes in colleges and schools of pharmacy. Improved consistency in assessment methods to evaluate SOAP notes may better prepare students to produce standardized documentation when entering practice.
Waters, Martha; McKernan, Lauralynn; Maier, Andrew; Jayjock, Michael; Schaeffer, Val; Brosseau, Lisa
2015-01-01
The fundamental goal of this article is to describe, define, and analyze the components of the risk characterization process for occupational exposures. Current methods are described for the probabilistic characterization of exposure, including newer techniques that have increasing applications for assessing data from occupational exposure scenarios. In addition, since the probability of health effects reflects variability in the exposure estimate as well as the dose-response curve—the integrated considerations of variability surrounding both components of the risk characterization provide greater information to the occupational hygienist. Probabilistic tools provide a more informed view of exposure as compared to use of discrete point estimates for these inputs to the risk characterization process. Active use of such tools for exposure and risk assessment will lead to a scientifically supported worker health protection program. Understanding the bases for an occupational risk assessment, focusing on important sources of variability and uncertainty enables characterizing occupational risk in terms of a probability, rather than a binary decision of acceptable risk or unacceptable risk. A critical review of existing methods highlights several conclusions: (1) exposure estimates and the dose-response are impacted by both variability and uncertainty and a well-developed risk characterization reflects and communicates this consideration; (2) occupational risk is probabilistic in nature and most accurately considered as a distribution, not a point estimate; and (3) occupational hygienists have a variety of tools available to incorporate concepts of risk characterization into occupational health and practice. PMID:26302336
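As a minimal illustration of the probabilistic view the authors advocate, the sketch below samples a lognormal exposure distribution by Monte Carlo and reports an exceedance probability rather than a binary acceptable/unacceptable decision. The geometric mean, geometric standard deviation, and limit used here are hypothetical values, not figures from the article:

```python
import math
import random

def exceedance_probability(gm, gsd, oel, n=100_000, seed=1):
    """Monte Carlo estimate of the probability that a lognormal exposure
    (geometric mean gm, geometric standard deviation gsd) exceeds the
    occupational exposure limit oel."""
    rng = random.Random(seed)
    mu, sigma = math.log(gm), math.log(gsd)
    exceed = sum(rng.lognormvariate(mu, sigma) > oel for _ in range(n))
    return exceed / n

# Hypothetical scenario: GM = 0.05 mg/m3, GSD = 2.0, OEL = 0.1 mg/m3
p = exceedance_probability(gm=0.05, gsd=2.0, oel=0.1)
print(round(p, 3))  # close to the analytic value of about 0.159
```

The same sampling scheme extends naturally to a probabilistic dose-response input, so that variability in both components propagates into the final risk distribution.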
Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures
2017-11-01
ARL-TR-8213 ● NOV 2017 ● US Army Research Laboratory
Characterization and measurement of polymer wear
NASA Technical Reports Server (NTRS)
Buckley, D. H.; Aron, P. R.
1984-01-01
Analytical tools that characterize the polymer wear process are discussed. The techniques discussed include: visual observation of polymer wear with SEM; quantification with surface profilometry and ellipsometry; study of the chemistry with AES, XPS, and SIMS; establishment of interfacial polymer orientation, and accordingly bonding, with QUARTIR; determination of polymer state with Raman spectroscopy; and measurement of stresses that develop in polymer films using an X-ray double-crystal camera technique.
NASA Astrophysics Data System (ADS)
DeArmond, Fredrick Michael
As optical microscopy techniques continue to improve, most notably with the development of super-resolution optical microscopy, which garnered the Nobel Prize in Chemistry in 2014, renewed emphasis has been placed on the development and use of fluorescence microscopy techniques. Of particular note is a renewed interest in multiphoton excitation due to a number of inherent properties of the technique, including simplified optical filtering, increased sample penetration, and inherently confocal operation. With this renewed interest in multiphoton fluorescence microscopy comes an increased demand for robust non-linear fluorescent markers and characterization of the associated tool set. These factors have led to an experimental setup that allows a systematized approach for identifying and characterizing properties of fluorescent probes, in the hope that the tool set will provide researchers with additional information to guide their efforts in developing novel fluorophores suitable for use in advanced optical microscopy techniques, as well as identifying trends for their synthesis. Hardware was set up around a software control system previously developed. Three experimental tool sets were set up, characterized, and applied over the course of this work. These tools include a scanning multiphoton fluorescence microscope with single-molecule sensitivity, an interferometric autocorrelator for precise determination of the bandwidth and pulse width of the ultrafast Titanium:Sapphire excitation source, and a simplified fluorescence microscope for the measurement of two-photon absorption cross sections. Resulting values for two-photon absorption cross sections and two-photon absorption action cross sections for two standardized fluorophores, four commercially available fluorophores, and ten novel fluorophores are presented, as well as absorption and emission spectra.
Friction stir weld tools having fine grain structure
Grant, Glenn J.; Frye, John G.; Kim, Jin Yong; Lavender, Curt A.; Weil, Kenneth Scott
2016-03-15
Tools for friction stir welding can be made with fewer process steps, lower cost techniques, and/or lower cost ingredients than other state-of-the-art processes by utilizing improved compositions and processes of fabrication. Furthermore, the tools resulting from the improved compositions and processes of fabrication can exhibit better distribution and homogeneity of chemical constituents, greater strength, and/or increased durability. In one example, a friction stir weld tool includes tungsten and rhenium and is characterized by carbide and oxide dispersoids, by carbide particulates, and by grains that comprise a solid solution of the tungsten and rhenium. The grains do not exceed 10 micrometers in diameter.
Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: Smagglce 2D
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.
2001-01-01
Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary demarcation, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work in progress, and planned features of the software toolkit are presented here.
POVME 2.0: An Enhanced Tool for Determining Pocket Shape and Volume Characteristics
2015-01-01
Analysis of macromolecular/small-molecule binding pockets can provide important insights into molecular recognition and receptor dynamics. Since its release in 2011, the POVME (POcket Volume MEasurer) algorithm has been widely adopted as a simple-to-use tool for measuring and characterizing pocket volumes and shapes. We here present POVME 2.0, which is an order of magnitude faster, has improved accuracy, includes a graphical user interface, and can produce volumetric density maps for improved pocket analysis. To demonstrate the utility of the algorithm, we use it to analyze the binding pocket of RNA editing ligase 1 from the unicellular parasite Trypanosoma brucei, the etiological agent of African sleeping sickness. The POVME analysis characterizes the full dynamics of a potentially druggable transient binding pocket and so may guide future antitrypanosomal drug-discovery efforts. We are hopeful that this new version will be a useful tool for the computational- and medicinal-chemist community. PMID:25400521
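The core idea behind grid-based pocket-volume tools like POVME is simple: count grid points inside a user-defined inclusion region that do not clash with receptor atoms, and multiply by the voxel volume. A minimal sketch of that idea follows; it is not POVME's actual implementation or API, and the function name, spacing default, and spherical inclusion region are assumptions.

```python
import numpy as np

def pocket_volume(atom_xyz, atom_radii, center, region_radius, spacing=0.5):
    """Grid-based pocket volume: grid points inside the inclusion sphere that
    do not fall within any atom's radius count toward the volume (in cubic
    units of the coordinate system, e.g. cubic Angstroms)."""
    c = np.asarray(center, dtype=float)
    r = float(region_radius)
    axes = [np.arange(ci - r, ci + r + spacing, spacing) for ci in c]
    gx, gy, gz = np.meshgrid(*axes, indexing="ij")
    pts = np.stack([gx.ravel(), gy.ravel(), gz.ravel()], axis=1)
    pts = pts[np.linalg.norm(pts - c, axis=1) <= r]          # inclusion region
    free = np.ones(len(pts), dtype=bool)
    for xyz, rad in zip(atom_xyz, atom_radii):               # exclude atom clashes
        free &= np.linalg.norm(pts - np.asarray(xyz, dtype=float), axis=1) > rad
    return free.sum() * spacing ** 3

v_empty = pocket_volume([], [], [0.0, 0.0, 0.0], 5.0)        # ~ (4/3)*pi*5**3
v_blocked = pocket_volume([[0.0, 0.0, 0.0]], [2.0], [0.0, 0.0, 0.0], 5.0)
```

With no atoms, the estimate converges to the sphere volume as the spacing shrinks; adding an atom at the center carves out roughly its own spherical volume.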
Electromagnetic nondestructive evaluation of tempering process in AISI D2 tool steel
NASA Astrophysics Data System (ADS)
Kahrobaee, Saeed; Kashefi, Mehrdad
2015-05-01
The present paper investigates the potential of using eddy current technique as a reliable nondestructive tool to detect microstructural changes during the different stages of tempering treatment in AISI D2 tool steel. Five stages occur in tempering of the steel: precipitation of ɛ carbides, formation of cementite, retained austenite decomposition, secondary hardening effect and spheroidization of carbides. These stages were characterized by destructive methods, including dilatometry, differential scanning calorimetry, X-ray diffraction, scanning electron microscopic observations, and hardness measurements. The microstructural changes alter the electrical resistivity/magnetic saturation, which, in turn, influence the eddy current signals. Two EC parameters, induced voltage sensed by pickup coil and impedance point detected by excitation coil, were evaluated as a function of tempering temperature to characterize the microstructural features, nondestructively. The study revealed that a good correlation exists between the EC parameters and the microstructural changes.
Tools to Assess Community-Based Cumulative Risk and Exposures
Multiple agents and stressors can interact in a given community to adversely affect human and ecological conditions. A cumulative risk assessment (CRA) analyzes, characterizes, and potentially quantifies the effects from multiple stressors, which include chemical agents (for exam...
Automated Measurement and Verification and Innovative Occupancy Detection Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Phillip; Nordman, Bruce; Piette, Mary Ann
In support of DOE’s sensors and controls research, the goal of this project is to move toward integrated building-to-grid systems by building on previous work to develop and demonstrate a set of load characterization measurement and evaluation tools that are envisioned to be part of a suite of applications for transactive efficient buildings, built upon data-driven load characterization and prediction models. This will include the ability to include occupancy data in the models, plus data collection and archival methods to include different types of occupancy data with existing networks and a taxonomy for naming these data within a Volttron agent platform.
Tool Wear Feature Extraction Based on Hilbert Marginal Spectrum
NASA Astrophysics Data System (ADS)
Guan, Shan; Song, Weijie; Pang, Hongyang
2017-09-01
In the metal cutting process, the signal contains a wealth of tool wear state information. A tool wear signal analysis and feature extraction method based on the Hilbert marginal spectrum is proposed. First, the tool wear signal was decomposed by the empirical mode decomposition (EMD) algorithm, and the intrinsic mode functions containing the main information were screened out by the correlation coefficient and the variance contribution rate. Second, the Hilbert transform was performed on the main intrinsic mode functions, yielding the Hilbert time-frequency spectrum and the Hilbert marginal spectrum. Finally, amplitude-domain indexes were extracted on the basis of the Hilbert marginal spectrum and used to construct the recognition feature vector of the tool wear state. The research results show that the extracted features can effectively characterize the different wear states of the tool, which provides a basis for monitoring tool wear condition.
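The pipeline described, screening IMFs by correlation with the raw signal and then accumulating instantaneous amplitude over instantaneous frequency to form the marginal spectrum, can be sketched as follows. The EMD step itself is omitted (implementations vary widely); two synthetic tones stand in for already-extracted IMFs, and the correlation threshold and bin count are illustrative assumptions.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (same construction as scipy.signal.hilbert)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(np.fft.fft(x) * h)

def screen_imfs(signal, imfs, corr_thresh=0.1):
    """Keep IMFs well-correlated with the raw signal (threshold is illustrative)."""
    return [imf for imf in imfs
            if abs(np.corrcoef(signal, imf)[0, 1]) > corr_thresh]

def marginal_spectrum(imfs, fs, n_bins=64):
    """Hilbert marginal spectrum: accumulate instantaneous amplitude per frequency bin."""
    edges = np.linspace(0.0, fs / 2.0, n_bins)
    spectrum = np.zeros(n_bins - 1)
    for imf in imfs:
        z = analytic_signal(imf)
        amp = np.abs(z)[1:]
        inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2.0 * np.pi)
        idx = np.clip(np.digitize(inst_freq, edges) - 1, 0, n_bins - 2)
        np.add.at(spectrum, idx, amp)
    return edges[:-1], spectrum

# Two synthetic tones stand in for IMFs already extracted by EMD.
fs = 1000.0
t = np.arange(0.0, 1.0, 1.0 / fs)
imfs = [np.sin(2 * np.pi * 50 * t), 0.5 * np.sin(2 * np.pi * 120 * t)]
kept = screen_imfs(sum(imfs), imfs)
freqs, ms = marginal_spectrum(kept, fs)
features = {"mean": ms.mean(), "peak": ms.max()}   # amplitude-domain indexes
```

For these tones, the marginal spectrum peaks near 50 Hz, the dominant component's frequency.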
Correlation Characterization of Particles in Volume Based on Peak-to-Basement Ratio
Vovk, Tatiana A.; Petrov, Nikolay V.
2017-01-01
We propose a new express method of the correlation characterization of the particles suspended in the volume of optically transparent medium. It utilizes inline digital holography technique for obtaining two images of the adjacent layers from the investigated volume with subsequent matching of the cross-correlation function peak-to-basement ratio calculated for these images. After preliminary calibration via numerical simulation, the proposed method allows one to quickly distinguish parameters of the particle distribution and evaluate their concentration. The experimental verification was carried out for the two types of physical suspensions. Our method can be applied in environmental and biological research, which includes analyzing tools in flow cytometry devices, express characterization of particles and biological cells in air and water media, and various technical tasks, e.g. the study of scattering objects or rapid determination of cutting tool conditions in mechanisms. PMID:28252020
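The peak-to-basement ratio at the heart of the method is straightforward to compute: cross-correlate the two layer images and compare the correlation peak against the mean level of the rest of the correlation plane. A hedged numpy sketch follows; the FFT-based cyclic correlation and the exact "basement" definition here are illustrative choices, not necessarily the authors' procedure.

```python
import numpy as np

def peak_to_basement(img_a, img_b):
    """Cyclic cross-correlation of two equal-size layer images via the FFT;
    returns the ratio of the correlation peak to the mean absolute level of
    the remaining correlation plane (the 'basement')."""
    a = img_a - img_a.mean()
    b = img_b - img_b.mean()
    cc = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
    mask = np.ones(cc.shape, dtype=bool)
    mask[np.unravel_index(np.abs(cc).argmax(), cc.shape)] = False
    return np.abs(cc).max() / np.abs(cc[mask]).mean()

rng = np.random.default_rng(0)
layer1 = rng.standard_normal((64, 64))
# Adjacent layer: a shifted copy of layer1 plus noise -> strongly correlated
layer2 = np.roll(layer1, (3, 5), axis=(0, 1)) + 0.2 * rng.standard_normal((64, 64))
similar = peak_to_basement(layer1, layer2)
unrelated = peak_to_basement(layer1, rng.standard_normal((64, 64)))
```

Correlated adjacent layers yield a much larger ratio than statistically independent ones, which is what makes the ratio usable as a discriminating statistic after calibration.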
Integrated Measurements and Characterization | Photovoltaic Research | NREL
The Integrated Measurements and Characterization cluster tool offers powerful capabilities with integrated tools. Basic cluster tool capabilities include sample handling via ultra-high-vacuum connections, so that a sample can be interchanged between tools, such as the Copper Indium Gallium Diselenide cluster tool.
XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.
2009-01-01
Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs is presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.
Wagner, James M; Alper, Hal S
2016-04-01
Coupling the tools of synthetic biology with traditional molecular genetic techniques can enable the rapid prototyping and optimization of yeast strains. While the era of yeast synthetic biology began in the well-characterized model organism Saccharomyces cerevisiae, it is swiftly expanding to include non-conventional yeast production systems such as Hansenula polymorpha, Kluyveromyces lactis, Pichia pastoris, and Yarrowia lipolytica. These yeasts already have roles in the manufacture of vaccines, therapeutic proteins, food additives, and biorenewable chemicals, but recent synthetic biology advances have the potential to greatly expand and diversify their impact on biotechnology. In this review, we summarize the development of synthetic biological tools (including promoters and terminators) and enabling molecular genetics approaches that have been applied in these four promising alternative biomanufacturing platforms. An emphasis is placed on synthetic parts and genome editing tools. Finally, we discuss examples of synthetic tools developed in other organisms that can be adapted or optimized for these hosts in the near future. Copyright © 2015 Elsevier Inc. All rights reserved.
CHEMINFORMATICS TOOLS FOR TOXICANT CHARACTERIZATION
Imaging and characterization of primary and secondary radiation in ion beam therapy
NASA Astrophysics Data System (ADS)
Granja, Carlos; Martisikova, Maria; Jakubek, Jan; Opalka, Lukas; Gwosch, Klaus
2016-07-01
Imaging in ion beam therapy is an essential and increasingly significant tool for treatment planning and radiation and dose deposition verification. Efforts aim at providing precise radiation field characterization and online monitoring of radiation dose distribution. A review is given of the research and methodology of quantum-imaging, composition, spectral and directional characterization of the mixed-radiation fields in proton and light ion beam therapy developed by the IEAP CTU Prague and HIT Heidelberg group. Results include non-invasive imaging of dose deposition and primary beam online monitoring.
A micromanipulation cell including a tool changer
NASA Astrophysics Data System (ADS)
Clévy, Cédric; Hubert, Arnaud; Agnus, Joël; Chaillet, Nicolas
2005-10-01
This paper deals with the design, fabrication and characterization of a tool changer for micromanipulation cells. This tool changer is part of a manipulation cell including a three linear axes robot and a piezoelectric microgripper. All these parts are designed to perform micromanipulation tasks in confined spaces such as a microfactory or in the chamber of a scanning electron microscope (SEM). The tool changer principle is to fix a pair of tools (i.e. the gripper tips) either on the tips of the microgripper actuator (piezoceramic bulk) or on a tool magazine. The temperature control of a thermal glue enables one to fix or release this pair of tools. Liquefaction and solidification are generated by surface mounted device (SMD) resistances fixed on the surface of the actuator or magazine. Based on this principle, the tool changer can be adapted to other kinds of micromanipulation cells. Hundreds of automatic tool exchanges were performed with a maximum positioning error between two consecutive tool exchanges of 3.2 µm, 2.3 µm and 2.8 µm on the X, Y and Z axes respectively (Z refers to the vertical axis). Finally, temperature measurements achieved under atmospheric pressure and in a vacuum environment and pressure measurements confirm the possibility of using this device in the air as well as in a SEM.
Developments in label-free microfluidic methods for single-cell analysis and sorting.
Carey, Thomas R; Cotner, Kristen L; Li, Brian; Sohn, Lydia L
2018-04-24
Advancements in microfluidic technologies have led to the development of many new tools for both the characterization and sorting of single cells without the need for exogenous labels. Label-free microfluidics reduce the preparation time, reagents needed, and cost of conventional methods based on fluorescent or magnetic labels. Furthermore, these devices enable analysis of cell properties such as mechanical phenotype and dielectric parameters that cannot be characterized with traditional labels. Some of the most promising technologies for current and future development toward label-free, single-cell analysis and sorting include electronic sensors such as Coulter counters and electrical impedance cytometry; deformation analysis using optical traps and deformation cytometry; hydrodynamic sorting such as deterministic lateral displacement, inertial focusing, and microvortex trapping; and acoustic sorting using traveling or standing surface acoustic waves. These label-free microfluidic methods have been used to screen, sort, and analyze cells for a wide range of biomedical and clinical applications, including cell cycle monitoring, rapid complete blood counts, cancer diagnosis, metastatic progression monitoring, HIV and parasite detection, circulating tumor cell isolation, and point-of-care diagnostics. Because of the versatility of label-free methods for characterization and sorting, the low-cost nature of microfluidics, and the rapid prototyping capabilities of modern microfabrication, we expect this class of technology to continue to be an area of high research interest going forward. New developments in this field will contribute to the ongoing paradigm shift in cell analysis and sorting technologies toward label-free microfluidic devices, enabling new capabilities in biomedical research tools as well as clinical diagnostics. This article is categorized under: Diagnostic Tools > Biosensing Diagnostic Tools > Diagnostic Nanodevices. © 2018 Wiley Periodicals, Inc.
High energy PIXE: A tool to characterize multi-layer thick samples
NASA Astrophysics Data System (ADS)
Subercaze, A.; Koumeir, C.; Métivier, V.; Servagent, N.; Guertin, A.; Haddad, F.
2018-02-01
High energy PIXE is a useful and non-destructive tool to characterize multi-layer thick samples such as cultural heritage objects. In a previous work, we demonstrated the possibility to perform quantitative analysis of simple multi-layer samples using high energy PIXE, without any assumption on their composition. In this work an in-depth study of the parameters involved in the method previously published is proposed. Its extension to more complex samples with a repeated layer is also presented. Experiments have been performed at the ARRONAX cyclotron using 68 MeV protons. The thicknesses and sequences of a multi-layer sample including two different layers of the same element have been determined. Performances and limits of this method are presented and discussed.
Radar polarimetry - Analysis tools and applications
NASA Technical Reports Server (NTRS)
Evans, Diane L.; Farr, Tom G.; Van Zyl, Jakob J.; Zebker, Howard A.
1988-01-01
The authors have developed several techniques to analyze polarimetric radar data from the NASA/JPL airborne SAR for earth science applications. The techniques determine the heterogeneity of scatterers with subregions, optimize the return power from these areas, and identify probable scattering mechanisms for each pixel in a radar image. These techniques are applied to the discrimination and characterization of geologic surfaces and vegetation cover, and it is found that their utility varies depending on the terrain type. It is concluded that there are several classes of problems amenable to single-frequency polarimetric data analysis, including characterization of surface roughness and vegetation structure, and estimation of vegetation density. Polarimetric radar remote sensing can thus be a useful tool for monitoring a set of earth science parameters.
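The power-optimization and mechanism-identification steps described rest on the standard polarimetric relation P = |h_r^T S h_t|^2 between the 2x2 complex scattering matrix and the transmit/receive polarization states. A small illustration with canonical scattering matrices follows; the matrices are textbook idealizations, not data from the NASA/JPL airborne SAR.

```python
import numpy as np

def received_power(S, h_t, h_r):
    """Received power P = |h_r^T S h_t|^2 for a 2x2 complex scattering
    matrix S and transmit/receive polarization (Jones) vectors h_t, h_r."""
    return float(np.abs(h_r @ S @ h_t) ** 2)

h = np.array([1.0, 0.0])           # horizontal linear polarization
v = np.array([0.0, 1.0])           # vertical linear polarization
S_odd = np.eye(2)                  # canonical odd-bounce (surface-like) scatterer
co_pol = received_power(S_odd, h, h)     # HH return
cross_pol = received_power(S_odd, h, v)  # HV return: zero for this mechanism
```

Comparing co-polarized and cross-polarized returns per pixel in this way is one route to labeling the probable scattering mechanism, since idealized surface, dihedral, and volume scatterers produce distinct power signatures.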
MoCha: Molecular Characterization of Unknown Pathways.
Lobo, Daniel; Hammelman, Jennifer; Levin, Michael
2016-04-01
Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.
Implementation of Systematic Review Tools in IRIS | Science ...
Currently, the number of chemicals present in the environment exceeds the ability of public health scientists to efficiently screen the available data in order to produce well-informed human health risk assessments in a timely manner. For this reason, the US EPA’s Integrated Risk Information System (IRIS) program has started implementing new software tools into the hazard characterization workflow. These automated tools aid in multiple phases of the systematic review process, including scoping and problem formulation, literature search, and identification and screening of available published studies. The increased availability of these tools lays the foundation for automating or semi-automating multiple phases of the systematic review process. Some of these software tools include modules to facilitate a structured approach to study quality evaluation of human and animal data, although approaches are generally lacking for assessing complex mechanistic information, in particular “omics”-based evidence; tools are starting to become available to evaluate these types of studies. We will highlight how new software programs, online tools, and approaches for assessing study quality can be better integrated to allow for a more efficient and transparent workflow of the risk assessment process, as well as identify tool gaps that would benefit future risk assessments. Disclaimer: The views expressed here are those of the authors and do not necessarily represent the view
USDA-ARS?s Scientific Manuscript database
Resistance against specific diseases is affecting profitability in fish production systems including rainbow trout. Limited information is known about functions and mechanisms of the immune gene pathways in teleosts. Immunogenomics are powerful tools to determine immune-related genes/gene pathways a...
Cryptosporidium Source Tracking in the Potomac River Watershed
To better characterize the presence of Cryptosporidium in the Potomac River watershed, a PCR-based genotyping tool was used to analyze 64 base-flow and 28 storm-flow samples from five sites within the watershed. These sites included two water treatment plant intakes as well as t...
USDA-ARS?s Scientific Manuscript database
High-density single nucleotide polymorphism (SNP) genotyping chips are a powerful tool for studying genomic patterns of diversity, inferring ancestral relationships among individuals in populations and studying marker-trait associations in mapping experiments. We developed a genotyping array includ...
Understanding and reduction of defects on finished EUV masks
NASA Astrophysics Data System (ADS)
Liang, Ted; Sanchez, Peter; Zhang, Guojing; Shu, Emily; Nagpal, Rajesh; Stivers, Alan
2005-05-01
To reduce the risk of EUV lithography adoption for the 32nm technology node in 2009, Intel has operated an EUV mask Pilot Line since early 2004. The Pilot Line integrates all the necessary process modules, including common tool sets shared with current photomask production as well as EUV-specific tools. This integrated endeavor ensures a comprehensive understanding of any issues, and development of solutions for the eventual fabrication of defect-free EUV masks. Two enabling modules for "defect-free" masks are pattern inspection and repair, which have been integrated into the Pilot Line. This is the first time we are able to look at real defects originating from multilayer blanks and the patterning process on finished masks over the entire mask area. In this paper, we describe our efforts in the qualification of DUV pattern inspection and electron beam mask repair tools for Pilot Line operation, including inspection tool sensitivity, defect classification and characterization, and defect repair. We will discuss the origins of each of the five classes of defects as seen by the DUV pattern inspection tool on finished masks, and present solutions for eliminating and mitigating them.
SMARTe Site Characterization Tool. In: SMARTe20ll, EPA/600/C-10/007
The purpose of the Site Characterization Tool is to: (1) develop a sample design for collecting site characterization data and (2) perform data analysis on uploaded data. The sample design part helps to determine how many samples should be collected to characterize a site with ...
Acoustic/seismic signal propagation and sensor performance modeling
NASA Astrophysics Data System (ADS)
Wilson, D. Keith; Marlin, David H.; Mackay, Sean
2007-04-01
Performance, optimal employment, and interpretation of data from acoustic and seismic sensors depend strongly and in complex ways on the environment in which they operate. Software tools for guiding non-expert users of acoustic and seismic sensors are therefore much needed. However, such tools require that many individual components be constructed and correctly connected together. These components include the source signature and directionality, representation of the atmospheric and terrain environment, calculation of the signal propagation, characterization of the sensor response, and mimicking of the data processing at the sensor. Selection of an appropriate signal propagation model is particularly important, as there are significant trade-offs between output fidelity and computation speed. Attenuation of signal energy, random fading, and (for array systems) variations in wavefront angle-of-arrival should all be considered. Characterization of the complex operational environment is often the weak link in sensor modeling: important issues for acoustic and seismic modeling activities include the temporal/spatial resolution of the atmospheric data, knowledge of the surface and subsurface terrain properties, and representation of ambient background noise and vibrations. Design of software tools that address these challenges is illustrated with two examples: a detailed target-to-sensor calculation application called the Sensor Performance Evaluator for Battlefield Environments (SPEBE) and a GIS-embedded approach called Battlefield Terrain Reasoning and Awareness (BTRA).
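As a toy stand-in for the propagation component such a tool chains together, a screening-level transmission-loss model combines spherical spreading with a linear absorption term. The sketch below is a deliberately simple illustration of that trade-off between fidelity and speed; the default absorption coefficient is an arbitrary illustrative value, not a calibrated one, and real models add refraction, terrain, and fading effects.

```python
import math

def transmission_loss_db(r_m, alpha_db_per_m=0.005, r_ref_m=1.0):
    """Spherical spreading (20*log10(r/r_ref)) plus linear atmospheric
    absorption (alpha * r), both in decibels."""
    return 20.0 * math.log10(r_m / r_ref_m) + alpha_db_per_m * r_m

def snr_db(source_level_db, noise_level_db, r_m, **kwargs):
    """Predicted sensor SNR: source level minus propagation loss minus noise floor."""
    return source_level_db - transmission_loss_db(r_m, **kwargs) - noise_level_db

tl_100m = transmission_loss_db(100.0)    # 40 dB spreading + 0.5 dB absorption
snr_100m = snr_db(120.0, 60.0, 100.0)    # 120 - 40.5 - 60 = 19.5 dB
```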
European solvent industry group generic exposure scenario risk and exposure tool
Zaleski, Rosemary T; Qian, Hua; Zelenka, Michael P; George-Ares, Anita; Money, Chris
2014-01-01
The European Solvents Industry Group (ESIG) Generic Exposure Scenario (GES) Risk and Exposure Tool (EGRET) was developed to facilitate the safety evaluation of consumer uses of solvents, as required by the European Union Registration, Evaluation and Authorization of Chemicals (REACH) Regulation. This exposure-based risk assessment tool provides estimates of both exposure and risk characterization ratios for consumer uses. It builds upon the consumer portion of the European Center for Ecotoxicology and Toxicology of Chemicals (ECETOC) Targeted Risk Assessment (TRA) tool by implementing refinements described in ECETOC TR107. Technical enhancements included the use of additional data to refine scenario defaults and the ability to include additional parameters in exposure calculations. Scenarios were also added to cover all frequently encountered consumer uses of solvents. The TRA tool structure was modified to automatically determine conditions necessary for safe use. EGRET reports results using specific standard phrases in a format consistent with REACH exposure scenario guidance, in order that the outputs can be readily assimilated within safety data sheets and similar information technology systems. Evaluation of tool predictions for a range of commonly encountered consumer uses of solvents found it provides reasonable yet still conservative exposure estimates. PMID:23361440
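The central arithmetic of exposure-based risk tools like EGRET is simple: a scenario-specific exposure estimate is divided by a no-effect level to give the risk characterization ratio (RCR), with RCR < 1 indicating safe use under the assessed conditions. A generic sketch follows; the parameter names and the inhalation formula are illustrative assumptions, not EGRET's internal algorithm or defaults.

```python
def risk_characterization_ratio(exposure_mg_kg_day, dnel_mg_kg_day):
    """RCR = estimated exposure / derived no-effect level (DNEL);
    RCR < 1 indicates safe use under the assessed conditions."""
    return exposure_mg_kg_day / dnel_mg_kg_day

def inhalation_exposure(conc_mg_m3, inhal_rate_m3_h, hours_per_day, body_weight_kg):
    """Screening-level daily inhalation dose (hypothetical simplification:
    assumes 100% absorption of the inhaled amount)."""
    return conc_mg_m3 * inhal_rate_m3_h * hours_per_day / body_weight_kg

daily_dose = inhalation_exposure(10.0, 1.25, 2.0, 60.0)   # mg/kg/day
rcr = risk_characterization_ratio(0.5, 2.0)               # 0.25 < 1: safe use indicated
```

Refining scenario defaults, as the abstract describes, amounts to tightening the inputs to calculations of this shape so the exposure estimate stays conservative but less overstated.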
Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing
NASA Astrophysics Data System (ADS)
Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander
2005-09-01
The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements, and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high aspect ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address the challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high density dynamic random access memory (DRAM) chips. 
We will explore the capability of the tool for characterizing multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilcox, S.
2013-08-01
Under this Agreement, NREL will work with Participant to improve concentrating solar power system performance characterizations. This work includes, but is not limited to, research and development of methods for acquiring renewable resource characterization information using site-specific measurements of solar radiation and meteorological conditions; collecting system performance data; and developing tools for improving the design, installation, operation, and maintenance of solar energy conversion systems. This work will be conducted at NREL and Participant facilities.
Integrated Exoplanet Modeling with the GSFC Exoplanet Modeling & Analysis Center (EMAC)
NASA Astrophysics Data System (ADS)
Mandell, Avi M.; Hostetter, Carl; Pulkkinen, Antti; Domagal-Goldman, Shawn David
2018-01-01
Our ability to characterize the atmospheres of extrasolar planets will be revolutionized by JWST, WFIRST and future ground- and space-based telescopes. In preparation, the exoplanet community must develop an integrated suite of tools with which we can comprehensively predict and analyze observations of exoplanets, in order to characterize the planetary environments and ultimately search them for signs of habitability and life.The GSFC Exoplanet Modeling and Analysis Center (EMAC) will be a web-accessible high-performance computing platform with science support for modelers and software developers to host and integrate their scientific software tools, with the goal of leveraging the scientific contributions from the entire exoplanet community to improve our interpretations of future exoplanet discoveries. Our suite of models will include stellar models, models for star-planet interactions, atmospheric models, planet system science models, telescope models, instrument models, and finally models for retrieving signals from observational data. By integrating this suite of models, the community will be able to self-consistently calculate the emergent spectra from the planet whether from emission, scattering, or in transmission, and use these simulations to model the performance of current and new telescopes and their instrumentation.The EMAC infrastructure will not only provide a repository for planetary and exoplanetary community models, modeling tools and intermodal comparisons, but it will include a "run-on-demand" portal with each software tool hosted on a separate virtual machine. The EMAC system will eventually include a means of running or “checking in” new model simulations that are in accordance with the community-derived standards. 
Additionally, the results of intermodel comparisons will be used to produce open-source publications that quantify the model comparisons and provide an overview of community consensus on model uncertainties for the climates of various planetary targets.
Alternative toxicity assessment methods to characterize the hazards of chemical substances have been proposed to reduce animal testing and screen thousands of chemicals in an efficient manner. Resources to accomplish these goals include utilizing large in vitro chemical screening...
The People Unite: Learning Meaningful Civics Online
ERIC Educational Resources Information Center
Pitts, Annette Boyd; Dziuban, Charles; Cornett, Jeffrey W.
2011-01-01
Throughout the world, today's students are being characterized as digital natives, the "net generation." This twenty-first-century student cohort is adept at multi-tasking and at using a variety of tools and resources including electronic search engines, blogs, wikis, visual images, videos, gaming platforms, and social networking.…
Microarray technology is a powerful tool to investigate the gene expression profiles for thousands of genes simultaneously. In recent years, microarrays have been used to characterize environmental pollutants and identify molecular mode(s) of action of chemicals including endocri...
This study focused on identifying impaired and unimpaired areas (i.e., reference) within the Ukrainian portion of the Danube Delta using modern environmental diagnostic approaches and tools. To characterize the state of the areas under study, a triad approach was used including c...
NWTC Helps Guide U.S. Offshore R&D; NREL (National Renewable Energy Laboratory)
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2015-07-01
The National Wind Technology Center (NWTC) at the National Renewable Energy Laboratory (NREL) is helping guide our nation's research-and-development effort in offshore renewable energy, which includes: Design, modeling, and analysis tools; Device and component testing; Resource characterization; Economic modeling and analysis; Grid integration.
VETA x ray data acquisition and control system
NASA Technical Reports Server (NTRS)
Brissenden, Roger J. V.; Jones, Mark T.; Ljungberg, Malin; Nguyen, Dan T.; Roll, John B., Jr.
1992-01-01
We describe the X-ray Data Acquisition and Control System (XDACS), used together with the X-ray Detection System (XDS) to characterize the X-ray image during testing of the AXAF P1/H1 mirror pair at the MSFC X-ray Calibration Facility. A variety of X-ray data were acquired, analyzed, and archived during the testing, including mirror alignment, encircled energy, effective area, point spread function, system housekeeping, and proportional counter window uniformity data. The system architecture is presented with emphasis placed on key features, which include a layered UNIX tool approach, dedicated subsystem controllers, real-time X-window displays, flexibility in combining tools, network connectivity, and system extensibility. The VETA test data archive is also described.
The Exoplanet Characterization ToolKit (ExoCTK)
NASA Astrophysics Data System (ADS)
Stevenson, Kevin; Fowler, Julia; Lewis, Nikole K.; Fraine, Jonathan; Pueyo, Laurent; Valenti, Jeff; Bruno, Giovanni; Filippazzo, Joseph; Hill, Matthew; Batalha, Natasha E.; Bushra, Rafia
2018-01-01
The success of exoplanet characterization depends critically on a patchwork of analysis tools and spectroscopic libraries that currently require extensive development and lack a centralized support system. Due to the complexity of spectroscopic analyses and the initial time commitment required to become productive, there are currently a limited number of teams actively advancing the field. New teams with significant expertise, but without the proper tools, face prohibitively steep hills to climb before they can contribute. As a solution, we are developing an open-source, modular data analysis package in Python and a publicly facing web interface focused primarily on atmospheric characterization of exoplanets and exoplanet transit observation planning with JWST. The foundation of these software tools and libraries exists within pockets of the exoplanet community. Our project will gather these seedling tools and grow a robust, uniform, and well-maintained exoplanet characterization toolkit.
NASA Astrophysics Data System (ADS)
Salim, S.; Agusnar, H.; Wirjosentono, B.; Tamrin; Marpaung, H.; Rihayat, T.; Nurhanifa; Adriana
2018-03-01
Plastic polymer is among the most widely used materials in daily human activities because it is multifunctional, light, strong, and corrosion-resistant, and is therefore easy to apply in a wide range of equipment. Plastic is generally derived from petroleum and is thus nonbiodegradable. This study therefore aims to produce a natural, biodegradable plastic material, poly(lactic acid) (PLA), from plant starch (pisang kepok banana starch) with the help of three types of acid (HNO3, HCl, and H2SO4). The PLA is enhanced by mixing with a clay material at 1, 3, and 5% composition to form a PLA/clay nanocomposite expected to have superior properties resembling those of conventional plastics. Several types of characterization were performed to assess the quality of the resulting material, including a tensile strength test using a universal testing machine (UTM), a thermal endurance test using thermogravimetric analysis (TGA), a morphological structure test using SEM, and an additional test of filler clay quality by XRD. Based on the tensile and thermal characterization, the 5B nanocomposite, prepared with 5% clay and HCl, showed the best tensile strength of 36 MPa and the highest thermal stability at 446.63 °C. Morphological analysis of the best sample (5B) showed good interfacial bonding. Filler analysis showed opening of the clay layer d-spacing at 0.355 nm.
NASA Technical Reports Server (NTRS)
Maluf, David A. (Inventor); Bell, David G. (Inventor); Gurram, Mohana M. (Inventor); Gawdiak, Yuri O. (Inventor)
2009-01-01
A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as a monthly report, a task plan report, a budget report and a risk management report, are generated and made available for display or further analysis. An extensible database allows searching for information based upon context and upon content.
Characterizing health plan price estimator tools: findings from a national survey.
Higgins, Aparna; Brainard, Nicole; Veselovskiy, German
2016-02-01
Policy makers have growing interest in price transparency and in the kinds of tools available to consumers. Health plans have implemented price estimator tools that make provider pricing information available to members; however, systematic data on the prevalence and characteristics of such tools are limited. The purpose of this study was to describe the characteristics of price estimator tools offered by health plans to their members and to identify potential trends, challenges, and opportunities for advancing the utility of these tools. Study design: national Web-based survey. Between 2014 and 2015, we conducted a national Web-based survey of health plans with commercial enrollment (100 plans, 43% response rate). Descriptive analyses were conducted using the survey data. Health plan members have access to a variety of price estimator tool capabilities for commonly used procedures. These tools take into account member characteristics, including member zip code and benefit design. Despite outreach to members, however, challenges remain with respect to member uptake of such tools. Our study found that health plans share price and provider performance data with their members.
Development and characterization of microsatellite markers for Berberis thunbergii (Berberidaceae).
Allen, Jenica M; Obae, Samuel G; Brand, Mark H; Silander, John A; Jones, Kenneth L; Nunziata, Schyler O; Lance, Stacey L
2012-05-01
Microsatellite markers were isolated and characterized in Berberis thunbergii, an invasive and ornamental shrub in the eastern United States, to assess genetic diversity among populations and potentially identify horticultural cultivars. A total of 12 loci were identified for the species. Eight of the loci were polymorphic and were screened in 24 individuals from two native (Tochigi and Ibaraki prefectures, Japan) and one invasive (Connecticut, USA) population and 21 horticultural cultivars. The number of alleles per locus ranged from three to seven, and observed heterozygosity ranged from 0.048 to 0.636. These new markers will provide tools for examining genetic relatedness of B. thunbergii plants in the native and invasive range, including phylogeographic studies and assessment of rapid evolution in the invasive range. These markers may also provide tools for examining hybridization with other related species in the invasive range.
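The observed heterozygosity reported above is simply the fraction of genotyped individuals carrying two different alleles at a locus. A toy illustration of that proportion; the genotypes below are invented, not data from this study:

```python
# Hypothetical genotypes at one microsatellite locus: each tuple is one
# individual's two alleles. Heterozygotes carry two different alleles.
genotypes = [("a1", "a2"), ("a1", "a1"), ("a2", "a3"), ("a3", "a3")]

# Observed heterozygosity: heterozygous individuals / total individuals.
Ho = sum(a != b for a, b in genotypes) / len(genotypes)
print(Ho)  # 0.5
```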
Narhi, Linda O; Corvari, Vincent; Ripple, Dean C; Afonina, Nataliya; Cecchini, Irene; Defelippis, Michael R; Garidel, Patrick; Herre, Andrea; Koulov, Atanas V; Lubiniecki, Tony; Mahler, Hanns-Christian; Mangiagalli, Paolo; Nesta, Douglas; Perez-Ramirez, Bernardo; Polozova, Alla; Rossi, Mara; Schmidt, Roland; Simler, Robert; Singh, Satish; Spitznagel, Thomas M; Weiskopf, Andrew; Wuchner, Klaus
2015-06-01
Measurement and characterization of subvisible particles (defined here as those ranging in size from 2 to 100 μm), including proteinaceous and nonproteinaceous particles, is an important part of every stage of protein therapeutic development. The tools used, and the ways in which the information generated is applied, depend on the particular product development stage, the amount of material, and the time available for the analysis. In order to compare results across laboratories and products, it is important to harmonize nomenclature, experimental protocols, data analysis, and interpretation. In this manuscript on perspectives on subvisible particles in protein therapeutic drug products, we focus on the tools available for detection, characterization, and quantification of these species and the strategy around their application. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project
NASA Technical Reports Server (NTRS)
Colantonio, Ron
2011-01-01
Engine Icing Characterization and Simulation Capability: develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing. Airframe Icing Simulation and Engineering Tool Capability: develop and demonstrate 3-D capability to simulate and model airframe ice accretion and the related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain. Atmospheric Hazard Sensing and Mitigation Technology Capability: improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.
Mass spectrometry as a quantitative tool in plant metabolomics
Jorge, Tiago F.; Mata, Ana T.
2016-01-01
Metabolomics is a research field used to acquire comprehensive information on the composition of a metabolite pool in order to provide a functional screen of the cellular state. Studies of the plant metabolome include the analysis of a wide range of chemical species with very diverse physico-chemical properties, and therefore powerful analytical tools are required for the separation, characterization, and quantification of the vast compound diversity present in plant matrices. In this review, challenges in the use of mass spectrometry (MS) as a quantitative tool in plant metabolomics experiments are discussed, and important criteria for the development and validation of MS-based analytical methods are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644967
Tool for use in lifting pin supported objects
NASA Technical Reports Server (NTRS)
Marzek, R. A.; Read, W. S. (Inventor)
1974-01-01
A tool for use in lifting a pin-supported electronic package mounted in juxtaposition with the surface of an electronic circuit board is described. The tool is configured to be received beneath a pin-supported package and is characterized by a manually operable linkage, including an elongated, rigid link supported for axial reciprocation and a pivotal link pinned to the body and supported for oscillation induced in response to axial motion imparted to the rigid link. A lifting plate is pivotally coupled to the distal end of the pivotal link so that oscillatory motion imparted to the pivotal link serves to move the plate vertically, elevating the plate into lifting engagement with the electronic package positioned above it.
NASA Astrophysics Data System (ADS)
Baer, Donald R.
2018-05-01
Nanoparticles in a variety of forms are increasingly important in fundamental research, technological and medical applications, and environmental or toxicology studies. Physical and chemical drivers that lead to multiple types of particle instability complicate the ability to produce, appropriately characterize, and consistently deliver well-defined particles, frequently leading to inconsistencies and conflicts in the published literature. This perspective suggests that provenance information, beyond that often recorded or reported, and application of a set of core characterization methods, including a surface-sensitive technique, consistently applied at critical times, can serve as tools in the effort to minimize reproducibility issues.
Characterization of diode-laser stacks for high-energy-class solid state lasers
NASA Astrophysics Data System (ADS)
Pilar, Jan; Sikocinski, Pawel; Pranowicz, Alina; Divoky, Martin; Crump, P.; Staske, R.; Lucianetti, Antonio; Mocek, Tomas
2014-03-01
In this work, we present a comparative study of high-power diode stacks produced by the world's leading manufacturers, such as DILAS, Jenoptik, and Quantel. The diode-laser stacks are characterized by a central wavelength around 939 nm, a duty cycle of 1%, and a maximum repetition rate of 10 Hz. The characterization includes peak power, electrical-to-optical efficiency, central wavelength, and full width at half maximum (FWHM) as functions of diode current and cooling temperature. A cross-check of measurements performed at HiLASE-IoP and the Ferdinand-Braun-Institut (FBH) shows very good agreement between the results. Our study also reveals the presence of discontinuities in the spectra of two diode stacks. We consider the results presented here a valuable tool for optimizing pump sources for ultra-high average power lasers, including laser fusion facilities.
Optical Spectroscopy of New Materials
NASA Technical Reports Server (NTRS)
White, Susan M.; Arnold, James O. (Technical Monitor)
1993-01-01
Composites are currently used for a rapidly expanding number of applications, including aircraft structures, rocket nozzles, thermal protection of spacecraft, high-performance ablative surfaces, sports equipment such as skis, tennis rackets, and bicycles, lightweight automobile components, cutting tools, and optical-grade mirrors. Composites are formed from two or more insoluble materials to produce a material with properties superior to either component. Composites range from dispersion-hardened alloys to advanced fiber-reinforced composites. UV/VIS and FTIR spectroscopy are currently used to evaluate the bonding between the matrix and the fibers, monitor the curing process of a polymer, measure surface contamination, characterize the interphase material, monitor anion transport in polymer phases, characterize void formation (voids must be minimized because, like cracks in a bulk material, they lead to failure), characterize the surface of the fiber component, and measure the overall optical properties for energy balances.
Baptista, Marco A S; Dave, Kuldip D; Sheth, Niketa P; De Silva, Shehan N; Carlson, Kirsten M; Aziz, Yasmin N; Fiske, Brian K; Sherer, Todd B; Frasier, Mark A
2013-11-01
Progress in Parkinson's disease (PD) research and therapeutic development is hindered by many challenges, including a need for robust preclinical animal models. Limited availability of these tools is due to technical hurdles, patent issues, licensing restrictions and the high costs associated with generating and distributing these animal models. Furthermore, the lack of standardization of phenotypic characterization and use of varying methodologies has made it difficult to compare outcome measures across laboratories. In response, The Michael J. Fox Foundation for Parkinson's Research (MJFF) is directly sponsoring the generation, characterization and distribution of preclinical rodent models, enabling increased access to these crucial tools in order to accelerate PD research. To date, MJFF has initiated and funded the generation of 30 different models, which include transgenic or knockout models of PD-relevant genes such as Park1 (also known as Park4 and SNCA), Park8 (LRRK2), Park7 (DJ-1), Park6 (PINK1), Park2 (Parkin), VPS35, EiF4G1 and GBA. The phenotypic characterization of these animals is performed in a uniform and streamlined manner at independent contract research organizations. Finally, MJFF created a central repository at The Jackson Laboratory (JAX) that houses both non-MJFF and MJFF-generated preclinical animal models. Funding from MJFF, which subsidizes the costs involved in transfer, rederivation and colony expansion, has directly resulted in over 2500 rodents being distributed to the PD community for research use.
Resilience of arctic mycorrhizal fungal communities after wildfire facilitated by resprouting shrubs
Rebecca E. Hewitt; Elizabeth Bent; Teresa N. Hollingsworth; F. Stuart Chapin; D. Lee Taylor
2013-01-01
Climate-induced changes in the tundra fire regime are expected to alter shrub abundance and distribution across the Arctic. However, little is known about how fire may indirectly impact shrub performance by altering mycorrhizal symbionts. We used molecular tools, including ARISA and ITS sequencing, to characterize the mycorrhizal communities on resprouting ...
A multi-objective sampling design has been implemented through R-EMAP support of a cooperative agreement with the state of West Virginia. Goals of the project include: 1) development and testing of a temperature-adjusted fish IBI for the Central Appalachian Plateau and Western Al...
The School Leader's Tool for Assessing and Improving School Culture
ERIC Educational Resources Information Center
Wagner, Christopher R.
2006-01-01
School culture consists of "the beliefs, attitudes, and behaviors which characterize a school" (Phillips, 1996, p. 1). It is the shared experiences both in school and out of school (traditions and celebrations) that create a sense of community, family, and team membership. It affects everything that happens in a school, including student…
Automated clustering-based workload characterization
NASA Technical Reports Server (NTRS)
Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena
1996-01-01
The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization, which can be a laborious and time-consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination: histograms of system activity are generated and presented to the user for peak-period determination. (2) Automatic clustering analysis: the data collected from the mass storage system logs are clustered using clustering algorithms and tightness measures to limit the number of generated clusters. (3) Reporting of varied file statistics: the tool computes several statistics on file sizes, such as average, standard deviation, minimum, maximum, and frequency, as well as average transfer time; these statistics are given on a per-cluster basis. (4) Portability: the tool can easily be used to characterize the workload in mass storage systems of different vendors; the user specifies, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. Section four presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
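The clustering analysis in item (2) can be sketched in a few lines. The sample file sizes, the choice of k = 3, and clustering in log-space are illustrative assumptions for this sketch, not details taken from the paper's implementation:

```python
import math
import random
import statistics

def kmeans_1d(values, k, iters=50, seed=0):
    """Simple 1-D k-means used to group file-transfer records into workload classes."""
    random.seed(seed)
    centers = random.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            i = min(range(k), key=lambda j: abs(v - centers[j]))
            clusters[i].append(v)
        # Keep a center in place if its cluster emptied out.
        centers = [statistics.mean(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical file sizes (bytes) drawn from a mass-storage transfer log.
sizes = [512, 600, 700, 4_000_000, 5_000_000, 900, 4_500_000, 80_000_000_000]
# Cluster in log-space so the huge dynamic range of file sizes is handled.
logs = [math.log10(s) for s in sizes]
centers, clusters = kmeans_1d(logs, k=3)
for c in clusters:
    if c:
        b = [10 ** v for v in c]  # back to bytes for per-cluster statistics
        print(f"n={len(b)}  mean={statistics.mean(b):.3g} B  "
              f"min={min(b):.3g} B  max={max(b):.3g} B")
```

Per-cluster statistics like these (count, mean, min, max) correspond to the tool's per-cluster file-size reports.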
Preparing and Analyzing Iced Airfoils
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.; Choo, Yung K.; Coroneos, Rula M.; Pennline, James A.; Hackenberg, Anthony W.; Schilling, Herbert W.; Slater, John W.;
2004-01-01
SmaggIce version 1.2 is a computer program for preparing and analyzing iced airfoils. It includes interactive tools for (1) measuring ice-shape characteristics, (2) controlled smoothing of ice shapes, (3) curve discretization, (4) generation of artificial ice shapes, and (5) detection and correction of input errors. Measurements of ice shapes are essential for establishing relationships between the characteristics of ice and the effects of ice on airfoil performance. The shape-smoothing tool helps prepare ice shapes for use with already available grid-generation and computational-fluid-dynamics software for studying the aerodynamic effects of smoothed ice on airfoils. The artificial ice-shape generation tool supports parametric studies, since ice-shape parameters can easily be controlled with artificial ice. In such studies, artificial shapes generated by this program can supplement simulated ice obtained from icing research tunnels and real ice obtained from flight tests under icing weather conditions. SmaggIce also automatically detects geometry errors, such as tangles or duplicate points in the boundary, which may be introduced by digitization, and provides tools to correct them. By use of the interactive tools included in SmaggIce version 1.2, one can easily characterize ice shapes and prepare iced airfoils for grid generation and flow simulations.
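The duplicate-point detection mentioned above amounts to a tolerance test on consecutive boundary coordinates. A minimal sketch of that idea; the coordinates and function name are invented for illustration, not SmaggIce internals:

```python
def find_duplicate_points(boundary, tol=1e-9):
    """Flag consecutive boundary points closer than tol -- the kind of
    digitization artifact ice-shape preprocessing must catch."""
    dups = []
    for i in range(len(boundary) - 1):
        (x0, y0), (x1, y1) = boundary[i], boundary[i + 1]
        if (x1 - x0) ** 2 + (y1 - y0) ** 2 < tol ** 2:
            dups.append(i + 1)  # index of the redundant point
    return dups

# Toy digitized shape with one duplicated point.
airfoil = [(0.0, 0.0), (0.5, 0.1), (0.5, 0.1), (1.0, 0.0)]
print(find_duplicate_points(airfoil, tol=1e-6))  # -> [2]
```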
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jensen, David S.; Kanyal, Supriya S.; Madaan, Nitesh
Herein we apply a suite of surface/materials analytical tools to characterize some of the materials created in the production of microfabricated thin layer chromatography plates. Techniques used include X-ray photoelectron spectroscopy (XPS), valence band spectroscopy, static time-of-flight secondary ion mass spectrometry (ToF-SIMS) in both positive and negative ion modes, Rutherford backscattering spectrometry (RBS), and helium ion microscopy (HIM). Materials characterized include: the Si(100) substrate with native oxide: Si/SiO2; alumina (35 nm) deposited as a diffusion barrier on the Si/SiO2: Si/SiO2/Al2O3; iron (6 nm) thermally evaporated on the Al2O3: Si/SiO2/Al2O3/Fe; the iron film annealed in H2 to make Fe catalyst nanoparticles: Si/SiO2/Al2O3/Fe(NP); and carbon nanotubes (CNTs) grown from the Fe nanoparticles: Si/SiO2/Al2O3/Fe(NP)/CNT. The Fe thin films and nanoparticles are found in an oxidized state. Some of the analyses of the CNTs/CNT forests reported appear to be unique: the CNT forest appears to exhibit an interesting 'channeling' phenomenon by RBS; we observe an odd-even effect in the ToF-SIMS spectra of Cn- species for n = 1–6, with ions at even n showing greater intensity than the neighboring signals and ions with n ≥ 6 showing a steady decrease in intensity; and valence band characterization of CNTs using X-radiation is reported. The information obtained from the combination of the different analytical tools provides a more complete understanding of our materials than any single technique, which is analogous to the story of 'The Blind Men and the Elephant'. (Of course there is increasing emphasis on the use of multiple characterization tools in surface and materials analysis.) The raw XPS and ToF-SIMS spectra from this study will be submitted to Surface Science Spectra for archiving.
Ultrasonic Characterization of Aerospace Composites
NASA Technical Reports Server (NTRS)
Leckey, Cara; Johnston, Patrick; Haldren, Harold; Perey, Daniel
2015-01-01
Composite materials have seen increased use in aerospace in recent years, and this trend is expected to continue due to the benefits of reduced weight, increased strength, and other factors. Ongoing work at NASA involves the investigation of the large-scale use of composites for spacecraft structures (SLS components, the Orion Composite Crew Module, etc.). NASA is also involved in work to enable the use of composites in advanced aircraft structures through the Advanced Composites Project (ACP). In both areas (space and aeronautics) there is a need for new nondestructive evaluation and materials characterization techniques appropriate for composite materials. This paper will present an overview of NASA's needs for characterizing aerospace composites, including a description of planned and ongoing work under ACP for the detection of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking. The research approaches include investigation of angle array, guided wave, and phase-sensitive ultrasonic methods. The use of ultrasonic simulation tools for optimizing and developing methods will also be discussed.
NASA Astrophysics Data System (ADS)
Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.
2014-12-01
X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil, and rock. Given that most synchrotron facilities have user programs that grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high-performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices as high as $24,000 USD for a single-user license. As construction of the nation's newest synchrotron accelerator nears completion, a significant effort is being made at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required to perform sophisticated quantitative analysis of multidimensional porous media data sets collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite in development at BNL, including major design decisions; a demonstration of several test cases illustrating currently available quantitative tools for analysis and characterization of multidimensional porous media image data sets; and plans for future development.
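As a flavor of the quantitative measures such open-source toolboxes provide, here is a minimal stdlib-only sketch computing porosity and a 4-connected pore-cluster count on a toy segmented slice. The array and function names are illustrative assumptions, not part of the BNL suite:

```python
from collections import deque

# Toy binary "tomography slice": 1 = pore (void), 0 = solid grain.
# In practice this would be a segmented slice from an x-ray CT volume.
slice_ = [
    [1, 1, 0, 0, 0, 1],
    [1, 0, 0, 1, 0, 1],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 1, 0, 0, 1],
]

def porosity(img):
    """Void fraction: pore pixels over total pixels."""
    total = sum(len(row) for row in img)
    voids = sum(sum(row) for row in img)
    return voids / total

def count_pore_clusters(img):
    """Count 4-connected pore regions with a BFS flood fill."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    clusters = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] == 1 and not seen[r][c]:
                clusters += 1
                q = deque([(r, c)])
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return clusters

print(f"porosity = {porosity(slice_):.3f}")          # 10 voids / 24 pixels
print(f"pore clusters = {count_pore_clusters(slice_)}")
```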
Validation of a new device to quantify groundwater-surface water exchange
NASA Astrophysics Data System (ADS)
Cremeans, Mackenzie M.; Devlin, J. F.
2017-11-01
Distributions of flow across the groundwater-surface water interface should be expected to be as complex as the geologic deposits associated with stream or lake beds and their underlying aquifers. In these environments, the conventional Darcy-based method of characterizing flow systems (near streams) has significant limitations, including reliance on parameters with high uncertainties (e.g., hydraulic conductivity), the common use of drilled wells in the case of streambank investigations, and potentially lengthy measurement times for aquifer characterization and water level measurements. Less logistically demanding tools for quantifying exchanges across streambeds have been developed and include drive-point mini-piezometers, seepage meters, and temperature profiling tools. This project adds to that toolbox by introducing the Streambed Point Velocity Probe (SBPVP), a reusable tool designed to quantify groundwater-surface water interactions (GWSWI) at the interface with high density sampling, which can effectively, rapidly, and accurately complement conventional methods. The SBPVP is a direct push device that measures in situ water velocities at the GWSWI with a small-scale tracer test on the probe surface. Tracer tests do not rely on hydraulic conductivity or gradient information, nor do they require long equilibration times. Laboratory testing indicated that the SBPVP has an average accuracy of ± 3% and an average precision of ± 2%. Preliminary field testing, conducted in the Grindsted Å in Jutland, Denmark, yielded promising agreement between groundwater fluxes determined by conventional methods and those estimated from the SBPVP tests executed at similar scales. These results suggest the SBPVP is a viable tool to quantify groundwater-surface water interactions in high definition in sandy streambeds.
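The Darcy-based limitation noted above, reliance on hydraulic conductivity, can be seen directly in the specific-discharge estimate q = K·i: an order-of-magnitude uncertainty in K propagates straight into q. A hedged sketch with hypothetical sand-aquifer values (not data from this study):

```python
# Conventional Darcy estimate of specific discharge: q = K * i,
# where K is hydraulic conductivity (m/s) and i the head gradient (dimensionless).
# Values are hypothetical; they only illustrate why K's uncertainty dominates.

def darcy_flux(K, gradient):
    return K * gradient

gradient = 0.01                            # head drop of 1 cm per metre
K_low, K_best, K_high = 1e-5, 1e-4, 1e-3   # plausible spread for a sandy bed

for K in (K_low, K_best, K_high):
    q = darcy_flux(K, gradient)
    print(f"K = {K:.0e} m/s  ->  q = {q:.0e} m/s")
```

The two-order-of-magnitude spread in K yields the same spread in q, which is the motivation for direct in situ velocity measurement at the interface.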
Selection and application of microbial source tracking tools for water-quality investigations
Stoeckel, Donald M.
2005-01-01
Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.
T7 lytic phage-displayed peptide libraries: construction and diversity characterization.
Krumpe, Lauren R H; Mori, Toshiyuki
2014-01-01
In this chapter, we describe the construction of T7 bacteriophage (phage)-displayed peptide libraries and the diversity analyses of random amino acid sequences obtained from the libraries. We used commercially available reagents, Novagen's T7Select system, to construct the libraries. Using a combination of biotinylated extension primer and streptavidin-coupled magnetic beads, we were able to prepare library DNA without applying gel purification, resulting in extremely high ligation efficiencies. Further, we describe the use of bioinformatics tools to characterize library diversity. Amino acid frequency and positional amino acid diversity and hydropathy are estimated using the REceptor LIgand Contacts website http://relic.bio.anl.gov. Peptide net charge analysis and peptide hydropathy analysis are conducted using the Genetics Computer Group Wisconsin Package computational tools. A comprehensive collection of the estimated number of recombinants and titers of T7 phage-displayed peptide libraries constructed in our lab is included.
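The diversity metrics described, positional amino-acid frequency and peptide net charge, reduce to simple counting. The sketch below uses a simplified charge table and invented 7-mer clones for illustration; it is not the RELIC or Wisconsin Package implementation:

```python
from collections import Counter

# Approximate side-chain charges near pH 7 (a common simplification).
CHARGE = {"D": -1, "E": -1, "K": +1, "R": +1, "H": +0.1}

def net_charge(peptide):
    """Crude net charge estimate: sum of side-chain charges at ~pH 7."""
    return sum(CHARGE.get(aa, 0) for aa in peptide)

def positional_frequency(peptides):
    """Amino-acid counts at each randomized position across a library sample."""
    length = len(peptides[0])
    return [Counter(p[i] for p in peptides) for i in range(length)]

library_sample = ["ACDEKGH", "ACDEKGW", "GHKRDEW"]  # hypothetical 7-mer clones
print(net_charge("ACDEKGH"))
freqs = positional_frequency(library_sample)
print(freqs[0])  # distribution of residues at position 1
```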
NDE and SHM Simulation for CFRP Composites
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Parker, F. Raymond
2014-01-01
Ultrasound-based nondestructive evaluation (NDE) is a common technique for damage detection in composite materials. There is a need for advanced NDE that goes beyond damage detection to damage quantification and characterization in order to enable data-driven prognostics. The damage types that exist in carbon fiber-reinforced polymer (CFRP) composites include microcracking and delaminations, and can be initiated and grown via impact forces (due to ground vehicles, tool drops, bird strikes, etc.), fatigue, and extreme environmental changes. X-ray microfocus computed tomography data, among other methods, have shown that these damage types often result in voids/discontinuities of a complex volumetric shape. The specific damage geometry and location within ply layers affect damage growth. Realistic three-dimensional NDE and structural health monitoring (SHM) simulations can aid in the development and optimization of damage quantification and characterization techniques. This paper is an overview of ongoing work towards realistic NDE and SHM simulation tools for composites, and also discusses NASA's need for such simulation tools in aeronautics and spaceflight. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with realistic three-dimensional damage in CFRP composites. The custom code uses the elastodynamic finite integration technique and is parallelized to run efficiently on computing clusters or multicore machines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Donald R.
Nanoparticles in a variety of forms are of increasing importance in fundamental research, technological and medical applications, and environmental or toxicology studies. Physical and chemical drivers that lead to multiple types of particle instabilities complicate both the ability to produce and consistently deliver well-defined particles and their appropriate characterization, frequently leading to inconsistencies and conflicts in the published literature. This perspective suggests that provenance information, beyond that often recorded or reported, and application of a set of core characterization methods, including a surface-sensitive technique, consistently applied at critical times, can serve as tools in the effort to minimize reproducibility issues.
USDA-ARS?s Scientific Manuscript database
Fusarium Link is a genus including ubiquitous plant-pathogenic fungi that may cause severe crop losses. The Fusarium genus is divided in species complexes; the species are grouped by physiological, biological, ecological and genetic similarity. The Fusarium fujikuroi species complex (FFSC) is one of...
ERIC Educational Resources Information Center
Fonger, Nicole L.
2012-01-01
Representational fluency (RF) includes an ability to interpret, create, move within and among, and connect tool-based representations of mathematical objects. Taken as an indicator of conceptual understanding, there is a need to better support school algebra students' RF in learning environments that utilize both computer algebra systems…
CCS Activities Being Performed by the U.S. DOE
Dressel, Brian; Deel, Dawn; Rodosta, Traci; Plasynski, Sean; Litynski, John; Myer, Larry
2011-01-01
The United States Department of Energy (DOE) is the lead federal agency for the development and deployment of carbon sequestration technologies. Its mission includes promoting scientific and technological innovations and transfer of knowledge for safe and permanent storage of CO2 in the subsurface. To accomplish its mission, DOE is characterizing and classifying potential geologic storage reservoirs in basins throughout the U.S. and Canada, and developing best practices for project developers, to help ensure the safety of future geologic storage projects. DOE’s Carbon Sequestration Program, through its Regional Carbon Sequestration Partnership (RCSP) Initiative administered by the National Energy Technology Laboratory (NETL), is identifying, characterizing, and testing potential injection formations. The RCSP Initiative consists of collaborations among government, industry, universities, and international organizations. Through this collaborative effort, a series of integrated knowledge-based tools have been developed to help potential sequestration project developers. They are the Carbon Sequestration Atlas of the United States and Canada; the National Carbon Sequestration Database and Geographic Information System (NATCARB); and best practice manuals for CCS, including Depositional Reservoir Classification for CO2; Public Outreach and Education for Carbon Storage Projects; Monitoring, Verification, and Accounting of CO2 Stored in Deep Geologic Formations; and Site Screening, Site Selection, and Initial Characterization of CO2 Storage in Deep Geologic Formations. DOE’s future research will help with refinement of these tools and additional best practice manuals (BPM) which focus on other technical aspects of project development. PMID:21556188
Manufacture and Preparation of Test Specimens for Johnson-Cook Material Characterization
2013-01-01
modeling and simulation, and will be included in the Elastic Plastic Impact Code (EPIC) library. This report describes the welding and machining...used by the government for ballistic, blast and other types of modeling and simulation, and will be included in the Elastic Plastic Impact Code (EPIC...made of H13 tool steel with a scrolled pin and shoulder (See Figure 2-3) was used however the different heat requirements of the materials required
Biokinetics of Nanomaterials: the Role of Biopersistence.
Laux, Peter; Riebeling, Christian; Booth, Andy M; Brain, Joseph D; Brunner, Josephine; Cerrillo, Cristina; Creutzenberg, Otto; Estrela-Lopis, Irina; Gebel, Thomas; Johanson, Gunnar; Jungnickel, Harald; Kock, Heiko; Tentschert, Jutta; Tlili, Ahmed; Schäffer, Andreas; Sips, Adriënne J A M; Yokel, Robert A; Luch, Andreas
2017-04-01
Nanotechnology risk management strategies and environmental regulations continue to rely on hazard and exposure assessment protocols developed for bulk materials, including larger size particles, even as commercial application of nanomaterials (NMs) increases. In order to support and corroborate risk assessment of NMs for workers, consumers, and the environment, it is crucial to establish the impact of biopersistence of NMs at realistic doses. In the future, such data will allow a more refined categorization of NMs. Despite many experiments on NM characterization and numerous in vitro and in vivo studies, several questions remain unanswered, including the influence of biopersistence on the toxicity of NMs. It is unclear which criteria to apply to characterize a NM as biopersistent. Detection and quantification of NMs, especially determination of their state, i.e., dissolution, aggregation, and agglomeration within biological matrices and other environments, are still challenging tasks; moreover, mechanisms of nanoparticle (NP) translocation and persistence remain critical knowledge gaps. This review summarizes the current understanding of NM biokinetics, focusing on determinants of biopersistence. Thorough particle characterization in different exposure scenarios and biological matrices requires use of suitable analytical methods and is a prerequisite to understand biopersistence and for the development of appropriate dosimetry. Analytical tools that can potentially facilitate elucidation of key NM characteristics, such as ion beam microscopy (IBM) and time-of-flight secondary ion mass spectrometry (ToF-SIMS), are discussed in relation to their potential to advance the understanding of biopersistent NM kinetics. We conclude that a major requirement for future nanosafety research is the development and application of analytical tools to characterize NPs in different exposure scenarios and biological matrices.
Automatic Fault Characterization via Abnormality-Enhanced Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G; Laguna, I; de Supinski, B R
Enterprise and high-performance computing systems are growing extremely large and complex, employing hundreds to hundreds of thousands of processors and software/hardware stacks built by many people across many organizations. As the growing scale of these machines increases the frequency of faults, system complexity makes these faults difficult to detect and to diagnose. Current system management techniques, which focus primarily on efficient data access and query mechanisms, require system administrators to examine the behavior of various system services manually. Growing system complexity is making this manual process unmanageable: administrators require more effective management tools that can detect faults and help to identify their root causes. System administrators need timely notification when a fault is manifested that includes the type of fault, the time period in which it occurred, and the processor on which it originated. Statistical modeling approaches can accurately characterize system behavior. However, the complex effects of system faults make these tools difficult to apply effectively. This paper investigates the application of classification and clustering algorithms to fault detection and characterization. We show experimentally that naively applying these methods achieves poor accuracy. Further, we design novel techniques that combine classification algorithms with information on the abnormality of application behavior to improve detection and characterization accuracy. Our experiments demonstrate that these techniques can detect and characterize faults with 65% accuracy, compared to just 5% accuracy for naive approaches.
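A minimal sketch of the abnormality-enhanced idea described above: score how far a sample deviates from normal-run behavior, and only classify samples flagged as abnormal. The metric names, baseline runs, thresholds, and fault signatures below are hypothetical illustrations, not from the paper:

```python
# Sketch: combine an abnormality score (z-score vs. normal baseline) with a
# nearest-signature classifier. All numbers and labels are hypothetical.
import statistics

normal_runs = [[0.50, 100.0], [0.55, 110.0], [0.45, 95.0]]   # [cpu_util, msgs/s]
fault_signatures = {"cpu_hog": [0.99, 100.0], "net_drop": [0.50, 5.0]}

means = [statistics.mean(col) for col in zip(*normal_runs)]
stdevs = [statistics.stdev(col) for col in zip(*normal_runs)]

def abnormality(sample):
    # Maximum per-metric z-score: distance from normal behavior
    return max(abs(x - m) / s for x, m, s in zip(sample, means, stdevs))

def characterize(sample, threshold=3.0):
    if abnormality(sample) < threshold:
        return "normal"
    # Abnormal sample: label it with the nearest (Euclidean) fault signature
    return min(fault_signatures, key=lambda k: sum(
        (x - y) ** 2 for x, y in zip(sample, fault_signatures[k])))
```

The paper's actual techniques use richer classifiers and clustering over many behavioral metrics; this sketch only shows why gating classification on an abnormality measure avoids mislabeling normal behavior.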
Mirand, Amy L; Beehler, Gregory P; Kuo, Christina L; Mahoney, Martin C
2002-01-01
Background A practice intervention must have its basis in an understanding of the physician and practice to secure its benefit and relevancy. We used a formative process to characterize primary care physician attitudes, needs, and practice obstacles regarding primary prevention. The characterization will provide the conceptual framework for the development of a practice tool to facilitate routine delivery of primary preventive care. Methods A focus group of primary care physician Opinion Leaders was audio-taped, transcribed, and qualitatively analyzed to identify emergent themes that described physicians' perceptions of prevention in daily practice. Results The conceptual worth of primary prevention, including behavioral counseling, was high, but its practice was significantly countered by the predominant clinical emphasis on and rewards for secondary care. In addition, lack of health behavior training, perceived low self-efficacy, and patient resistance to change were key deterrents to primary prevention delivery. Also, the preventive focus in primary care is not on cancer, but on predominant chronic nonmalignant conditions. Conclusions The success of the future practice tool will be largely dependent on its ability to "fit" primary prevention into the clinical culture of diagnoses and treatment sustained by physicians, patients, and payers. The tool's message output must be formatted to facilitate physician delivery of patient-tailored behavioral counseling in an accurate, confident, and efficacious manner. Also, the tool's health behavior messages should be behavior-specific, not disease-specific, to draw on shared risk behaviors of numerous diseases and increase the likelihood of perceived salience and utility of the tool in primary care. PMID:12204096
Composite Characterization Using Ultrasonic Wavefield Techniques
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Juarez, Peter D.; Seebo, Jeffrey P.
2016-01-01
The large-scale use of composite components in aerospace applications is expected to continue due to the benefits of composite materials, such as reduced weight, increased strength, and tailorability. NASA's Advanced Composites Project (ACP) has the goals of reducing the timeline for certification of composite materials and enabling the expanded use of advanced composite materials. A key technical challenge area for accomplishing these goals is the need for nondestructive evaluation and materials characterization techniques that are optimized for rapid inspection and detailed defect/damage characterization in composite materials. This presentation will discuss ongoing research investigating the use of ultrasonic wavefield techniques for the characterization of defects such as fiber waviness and delamination damage. Ongoing work includes the development of realistic ultrasonic simulation tools for use in predicting the inspectability of composites and optimizing inspection methodologies. Recent studies on detecting/characterizing delamination damage and fiber waviness via wavefield methods will be described.
NAPL detection with ground-penetrating radar (Invited)
NASA Astrophysics Data System (ADS)
Bradford, J. H.
2013-12-01
Non-polar organic compounds are common contaminants and are collectively referred to as nonaqueous-phase liquids (NAPLs). NAPL contamination problems occur in virtually every environment on or near the earth's surface and therefore a robust suite of geophysical tools is required to accurately characterize NAPL spills and monitor their remediation. NAPLs typically have low dielectric permittivity and low electric conductivity relative to water. Thus a zone of anomalous electrical properties often occurs when NAPL displaces water in the subsurface pore space. Such electric property anomalies make it possible to detect NAPL in the subsurface using electrical or electromagnetic geophysical methods including ground-penetrating radar (GPR). The GPR signature associated with the presence of NAPL is manifest in essentially three ways. First, the decrease in dielectric permittivity results in increased EM propagation velocity. Second, the decrease in permittivity can significantly change reflectivity. Finally, electric conductivity anomalies lead to anomalous GPR signal attenuation. The conductivity anomaly may be either high or low depending on the state of NAPL degradation, but with either high or low conductivity, GPR attenuation analysis can be a useful tool for identifying contaminated-zones. Over the past 15 years I have conducted numerous modeling, laboratory, and field tests to investigate the ability to use GPR to measure NAPL induced anomalies. The emphasis of this work has been on quantitative analysis to characterize critical source zone parameters such as NAPL concentration. Often, the contaminated zones are below the conventional resolution of the GPR signal and require thin layer analysis. Through a series of field examples, I demonstrate 5 key GPR analysis tools that can help identify and quantify NAPL contaminants. 
These tools include 1) GPR velocity inversion from multi-fold data, 2) amplitude vs offset analysis, 3) spectral decomposition, 4) frequency dependent attenuation analysis, and 5) reflectivity inversion. Examples are taken from a variety of applications that include oil spills on the ocean, oil spills on and under sea ice, and both LNAPL and DNAPL contaminated groundwater systems. Many factors conspire to complicate field data analysis, yet careful analysis and integration of multiple techniques has proven robust. Use of these methods in practical application has been slow to take root. Nonetheless, a best practices working model integrates geophysics from the outset and mirrors the approach utilized in hydrocarbon exploration. This model ultimately minimizes site characterization and remediation costs.
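The velocity and reflectivity anomalies underlying tools 1) and 2) follow from standard low-loss EM relations. A short sketch with illustrative relative-permittivity values (assumed, not taken from the field studies above):

```python
# Sketch: GPR velocity and normal-incidence reflectivity from relative
# dielectric permittivity. Permittivity values are illustrative only.
c = 0.3  # speed of light in vacuum, m/ns

def velocity(eps_r):
    # Low-loss approximation: v = c / sqrt(relative permittivity)
    return c / eps_r ** 0.5

def reflectivity(eps1, eps2):
    # Normal-incidence reflection coefficient at a dielectric interface
    return (eps1 ** 0.5 - eps2 ** 0.5) / (eps1 ** 0.5 + eps2 ** 0.5)

water_sat = 25.0  # water-saturated sediment (illustrative)
napl_sat = 8.0    # NAPL-saturated sediment (illustrative, lower permittivity)

# NAPL displacing water raises velocity and changes interface reflectivity,
# producing the anomalies the analysis tools above are designed to detect
v_anomaly = velocity(napl_sat) - velocity(water_sat)
r = reflectivity(water_sat, napl_sat)
```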
WE-G-BRC-02: Risk Assessment for HDR Brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayadev, J.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness, and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review, and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one’s own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: Understand the general concept of failure mode and effect analysis. Learn how to characterize new equipment for safety. Be able to identify potential failure modes for specific procedures and learn mitigation techniques. Be able to customize FMEA examples and templates for use in any clinic.
WE-G-BRC-01: Risk Assessment for Radiosurgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, G.
2016-06-15
(Session abstract identical to that of WE-G-BRC-02: Risk Assessment for HDR Brachytherapy, above.)
WE-G-BRC-03: Risk Assessment for Physics Plan Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, S.
2016-06-15
(Session abstract identical to that of WE-G-BRC-02: Risk Assessment for HDR Brachytherapy, above.)
USDA-ARS?s Scientific Manuscript database
The genome-wide association study (GWAS) is a useful tool for detecting and characterizing traits of interest including those associated with disease resistance in soybean. The availability of 50,000 single nucleotide polymorphism (SNP) markers (SoySNP50K iSelect BeadChip; www.soybase.org) on 19,652...
ERIC Educational Resources Information Center
Hawley, Patricia H.
2011-01-01
Adolescence is a period characterized by well-documented growth and change, including reproductive, social, and cognitive development. Though not unheard of, modern evolutionary approaches to adolescence are still relatively uncommon. Recent treatises in developmental biology, however, have yielded new tools through which to explore human…
Characterization of technical surfaces by structure function analysis
NASA Astrophysics Data System (ADS)
Kalms, Michael; Kreis, Thomas; Bergmann, Ralf B.
2018-03-01
The structure function is a tool for characterizing technical surfaces that offers a number of advantages over Fourier-based analysis methods, making it well suited to analyzing the height distributions of surfaces measured by full-field non-contacting methods. The structure function is thus a useful means of extracting global or local criteria, such as periodicities, waviness, lay, or roughness, to analyze and evaluate technical surfaces. After defining the line- and area-structure functions and presenting effective procedures for their calculation, this paper gives examples using simulated and measured data of technical surfaces, including aircraft parts.
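As a minimal illustration, the line-structure function S(τ) = ⟨(z(x+τ) − z(x))²⟩ can be computed directly from a sampled height profile. The synthetic periodic profile below is an assumption for demonstration, not the paper's measured data:

```python
# Sketch: line-structure function of a sampled height profile,
# S(tau) = mean of (z(x + tau) - z(x))^2, at integer sample lags.
# The height profile is synthetic (a pure sinusoid with 20-sample period).
import math

z = [math.sin(2 * math.pi * 0.05 * i) for i in range(200)]

def structure_function(z, lag):
    # Mean squared height difference at a given lag (in samples)
    diffs = [(z[i + lag] - z[i]) ** 2 for i in range(len(z) - lag)]
    return sum(diffs) / len(diffs)

# A periodic surface yields minima of S at lags matching the period,
# which is how the structure function reveals periodicities
S = [structure_function(z, k) for k in range(1, 40)]
```

For roughness analysis one would instead examine the small-lag behavior of S, which for many surfaces follows a power law whose exponent relates to the fractal character of the surface.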
Borehole Tool for the Comprehensive Characterization of Hydrate-bearing Sediments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Sheng; Santamarina, J. Carlos
Reservoir characterization and simulation require reliable parameters to anticipate the response and production rates of hydrate deposits. Acquisition of the required fundamental properties currently relies on wireline logging, pressure core testing, and/or laboratory observations of synthesized specimens, which are limited by testing capabilities and innate sampling disturbances. The project reviews hydrate-bearing sediment properties and inherent sampling effects, albeit lessened by developments in pressure core technology, in order to develop robust correlations with index parameters. The resulting information is incorporated into a tool for optimal field characterization and parameter selection with uncertainty analyses. Ultimately, the project develops a borehole tool for the comprehensive characterization of hydrate-bearing sediments in situ, with a design that recognizes past developments and characterization experience and benefits from the inspiration of nature and sensor miniaturization.
Applications of capillary electrophoresis in characterizing recombinant protein therapeutics.
Zhao, Shuai Sherry; Chen, David D Y
2014-01-01
The use of recombinant protein for therapeutic applications has increased significantly in the last three decades. The heterogeneity of these proteins, often caused by the complex biosynthesis pathways and the subsequent PTMs, poses a challenge for drug characterization to ensure its safety, quality, integrity, and efficacy. CE, with its simple instrumentation, superior separation efficiency, small sample consumption, and short analysis time, is a well-suited analytical tool for therapeutic protein characterization. Different separation modes, including CIEF, SDS-CGE, CZE, and CE-MS, provide complementary information of the proteins. The CE applications for recombinant therapeutic proteins from the year 2000 to June 2013 are reviewed and technical concerns are discussed in this article. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Mayorga, E.
2013-12-01
Practical, problem oriented software developed by scientists and graduate students in domains lacking a strong software development tradition is often balkanized into the scripting environments provided by dominant, typically proprietary tools. In environmental fields, these tools include ArcGIS, Matlab, SAS, Excel and others, and are often constrained to specific operating systems. While this situation is the outcome of rational choices, it limits the dissemination of useful tools and their integration into loosely coupled frameworks that can meet wider needs and be developed organically by groups addressing their own needs. Open-source dynamic languages offer the advantages of an accessible programming syntax, a wealth of pre-existing libraries, multi-platform access, linkage to community libraries developed in lower level languages such as C or FORTRAN, and access to web service infrastructure. Python in particular has seen a large and increasing uptake in scientific communities, as evidenced by the continued growth of the annual SciPy conference. Ecosystems with distinctive physical structures and organization, and mechanistic processes that are well characterized, are both factors that have often led to the grass-roots development of useful code meeting the needs of a range of communities. In aquatic applications, examples include river and watershed analysis tools (River Tools, Taudem, etc), and geochemical modules such as CO2SYS, PHREEQ and LOADEST. I will review the state of affairs and explore the potential offered by a Python tool ecosystem in supporting aquatic biogeochemistry and water quality research. This potential is multi-faceted and broadly involves accessibility to lone grad students, access to a wide community of programmers and problem solvers via online resources such as StackExchange, and opportunities to leverage broader cyberinfrastructure efforts and tools, including those from widely different domains. 
Collaborative development of such tools can provide the additional advantage of enhancing cohesion and communication across specific research areas, and reducing research obstacles in a range of disciplines.
Interpolity exchange of basalt tools facilitated via elite control in Hawaiian archaic states
Kirch, Patrick V.; Mills, Peter R.; Lundblad, Steven P.; Sinton, John; Kahn, Jennifer G.
2012-01-01
Ethnohistoric accounts of late precontact Hawaiian archaic states emphasize the independence of chiefly controlled territories (ahupua‘a) based on an agricultural, staple economy. However, elite control of unevenly distributed resources, such as high-quality volcanic rock for adze production, may have provided an alternative source of economic power. To test this hypothesis we used nondestructive energy-dispersive X-ray fluorescence (ED-XRF) analysis of 328 lithic artifacts from 36 archaeological features in the Kahikinui district, Maui Island, to geochemically characterize the source groups. This process was followed by a limited sampling using destructive wavelength-dispersive X-ray fluorescence (WD-XRF) analysis to more precisely characterize certain nonlocal source groups. Seventeen geochemical groups were defined, eight of which represent extra-Maui Island sources. Although the majority of stone tools were derived from Maui Island sources (71%), a significant quantity (27%) of tools derived from extraisland sources, including the large Mauna Kea quarry on Hawai‘i Island as well as quarries on O‘ahu, Moloka‘i, and Lāna‘i islands. Importantly, tools quarried from extralocal sources are found in the highest frequency in elite residential features and in ritual contexts. These results suggest a significant role for a wealth economy based on the control and distribution of nonagricultural goods and resources during the rise of the Hawaiian archaic states. PMID:22203984
National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox
Price, Curtis
2010-01-01
This is release 1.0 of the National Water-Quality Assessment (NAWQA) Area-Characterization Toolbox. These tools are designed to be accessed using ArcGIS Desktop software (versions 9.3 and 9.3.1). The toolbox is composed of a collection of custom tools that implement geographic information system (GIS) techniques used by the NAWQA Program to characterize aquifer areas, drainage basins, and sampled wells.
Survey of Ambient Air Pollution Health Risk Assessment Tools.
Anenberg, Susan C; Belova, Anna; Brandt, Jørgen; Fann, Neal; Greco, Sue; Guttikunda, Sarath; Heroux, Marie-Eve; Hurley, Fintan; Krzyzanowski, Michal; Medina, Sylvia; Miller, Brian; Pandey, Kiran; Roos, Joachim; Van Dingenen, Rita
2016-09-01
Designing air quality policies that improve public health can benefit from information about air pollution health risks and impacts, which include respiratory and cardiovascular diseases and premature death. Several computer-based tools help automate air pollution health impact assessments and are being used in a variety of contexts. Expanding information gathered for a May 2014 World Health Organization expert meeting, we survey 12 multinational air pollution health impact assessment tools, categorize them according to key technical and operational characteristics, and identify limitations and challenges. Key characteristics include spatial resolution, pollutants and health effect outcomes evaluated, and method for characterizing population exposure, as well as tool format, accessibility, complexity, and degree of peer review and application in policy contexts. While many of the tools use common data sources for concentration-response associations, population, and baseline mortality rates, they vary in the exposure information source, format, and degree of technical complexity. We find that there is an important tradeoff between technical refinement and accessibility for a broad range of applications. Analysts should apply tools that provide the appropriate geographic scope, resolution, and maximum degree of technical rigor for the intended assessment, within resource constraints. A systematic intercomparison of the tools' inputs, assumptions, calculations, and results would be helpful to determine the appropriateness of each for different types of assessment. Future work would benefit from accounting for multiple uncertainty sources and integrating ambient air pollution health impact assessment tools with those addressing other related health risks (e.g., smoking, indoor pollution, climate change, vehicle accidents, physical activity). © 2016 Society for Risk Analysis.
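Most of the surveyed tools estimate attributable health burden with some form of log-linear concentration-response function combining population, baseline mortality rate, and a concentration-response coefficient. A minimal sketch of that calculation follows; the function name and the illustrative inputs are assumptions, not taken from any specific tool.

```python
import math

def attributable_deaths(pop, baseline_rate, beta, delta_c):
    """Log-linear health impact function of the general form used by many
    air pollution health impact assessment tools.

    pop           -- exposed population
    baseline_rate -- baseline mortality rate (deaths per person per year)
    beta          -- concentration-response coefficient (per ug/m3)
    delta_c       -- pollutant concentration change (ug/m3)
    """
    # Attributable fraction: 1 - exp(-beta * delta_c)
    return pop * baseline_rate * (1.0 - math.exp(-beta * delta_c))

# Illustrative (made-up) inputs: 1M people, 0.8% baseline mortality rate,
# beta = 0.0058 per ug/m3, and a 10 ug/m3 concentration reduction.
print(round(attributable_deaths(1_000_000, 0.008, 0.0058, 10.0), 1))
```

The same structure underlies tools that differ mainly in where the exposure and baseline-rate inputs come from.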
Frontiers of in situ electron microscopy
Zheng, Haimei; Zhu, Yimei; Meng, Shirley Ying
2015-01-01
In situ transmission electron microscopy (TEM) has become an increasingly important tool for materials characterization. It provides key information on the structural dynamics of a material during transformations and the correlation between structure and properties of materials. With recent advances in instrumentation, including aberration-corrected optics, sample environment control, the sample stage, and fast and sensitive data acquisition, in situ TEM characterization has become increasingly powerful. This article briefly reviews the current status and future opportunities of in situ TEM and introduces the six articles in this issue of MRS Bulletin that explore the frontiers of in situ electron microscopy, including liquid and gas environmental TEM, dynamic four-dimensional TEM, nanomechanics, ferroelectric domain switching studied by in situ TEM, and state-of-the-art atomic imaging of light elements (i.e., carbon atoms) and individual defects.
Galeotti, Francesco; Barile, Elisa; Lanzotti, Virginia; Dolci, Marcello; Curir, Paolo
2008-01-01
One flavone-C-glycoside and two flavonol-O-glycosides were recognized and isolated as the main flavonoid components in nine different carnation cultivars, and their chemical structures were determined by spectroscopic methods, including UV detection, MS, and NMR. The distribution of these three compounds in flowers, leaves, stems, young sprouts, and roots of each cultivar was evaluated by a simple HPLC-UV method: the graphic representation of their content in the different tissues allows unambiguous identification and characterization of each carnation cultivar considered. The method presented could be an easy, inexpensive, and reliable tool for carnation cultivar discrimination.
3-D interactive visualisation tools for HI spectral line imaging
NASA Astrophysics Data System (ADS)
van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.
2017-06-01
Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.
Park, Jun-Sang; Zhang, Xuan; Kenesei, Peter; ...
2017-08-31
A suite of non-destructive, three-dimensional X-ray microscopy techniques has recently been developed and used to characterize the microstructures of polycrystalline materials. These techniques utilize high-energy synchrotron radiation and include near-field and far-field diffraction microscopy (NF- and FF-HEDM, respectively) and absorption tomography. Several compatible sample environments have also been developed, enabling a wide range of 3D studies of material evolution. In this article, the FF-HEDM technique is described in detail, including its implementation at the 1-ID beamline of the Advanced Photon Source. Examples of how the information obtained from FF-HEDM can be used to deepen our understanding of structure-property-processing relationships in selected materials are presented.
de Almeida-Leite, Camila Megale; Arantes, Rosa Maria Esteves
2010-12-15
Central nervous system glial cells such as astrocytes and microglia have been investigated in vitro, and many intracellular pathways have been clarified upon various stimuli. Peripheral glial cells, however, are not as deeply investigated in vitro despite their important role in inflammatory and neurodegenerative diseases. Based on our previous experience of culturing neuronal cells, our objective was to standardize and morphologically characterize a primary culture of mouse superior cervical ganglion glial cells in order to obtain a useful tool to study peripheral glial cell biology. Superior cervical ganglia from neonatal C57BL6 mice were enzymatically and mechanically dissociated and cells were plated on diluted Matrigel-coated wells at a final concentration of 10,000 cells/well. Five to 8 days post plating, glial cell cultures were fixed for morphological and immunocytochemical characterization. Glial cells showed a flat and irregular shape, two or three long cytoplasmic processes, and round, oval, or elongated nuclei with regular outlines. Cell proliferation and mitosis were detected both qualitatively and quantitatively. Glial cells were able to maintain their phenotype in our culture model, including immunoreactivity against the glial cell marker GFAP. This is the first description of immunocytochemical characterization of mouse sympathetic cervical ganglion glial cells in primary culture. This work discusses the uses and limitations of our model as a tool to study many aspects of peripheral glial cell biology. Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Lam, N.; Qiu, H.-I.; Quattrochi, Dale A.; Zhao, Wei
1997-01-01
With the rapid increase in spatial data, especially in the NASA-EOS (Earth Observing System) era, it is necessary to develop efficient and innovative tools to handle and analyze these data so that environmental conditions can be assessed and monitored. A main difficulty facing geographers and environmental scientists in environmental assessment and measurement is that spatial analytical tools are not easily accessible. We have recently developed a remote sensing/GIS software module called Image Characterization and Modeling System (ICAMS) to provide specialized spatial analytical tools for the measurement and characterization of satellite and other forms of spatial data. ICAMS runs on both the Intergraph-MGE and Arc/info UNIX and Windows-NT platforms. The main techniques in ICAMS include fractal measurement methods, variogram analysis, spatial autocorrelation statistics, textural measures, aggregation techniques, normalized difference vegetation index (NDVI), and delineation of land/water and vegetated/non-vegetated boundaries. In this paper, we demonstrate the main applications of ICAMS on the Intergraph-MGE platform using Landsat Thematic Mapper images from the city of Lake Charles, Louisiana. While the utilities of ICAMS' spatial measurement methods (e.g., fractal indices) in assessing environmental conditions remain to be researched, making the software available to a wider scientific community can permit the techniques in ICAMS to be evaluated and used for a diversity of applications. The findings from these various studies should lead to improved algorithms and more reliable models for environmental assessment and monitoring.
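Of the techniques listed in ICAMS, NDVI is the simplest to illustrate: it contrasts near-infrared and red reflectance on a per-pixel basis. A minimal sketch, with band values invented for illustration:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel:
    (NIR - Red) / (NIR + Red), in [-1, 1]; higher values indicate
    denser, healthier vegetation."""
    if nir + red == 0:
        return 0.0  # guard against division by zero for dark pixels
    return (nir - red) / (nir + red)

# Hypothetical TM band 4 (NIR) and band 3 (red) values for a
# vegetated pixel:
print(ndvi(120, 40))  # -> 0.5
```

Applied over a whole raster, the same formula yields the vegetated/non-vegetated delineations mentioned in the abstract.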
Jiang, Likun; You, Weiwei; Zhang, Xiaojun; Xu, Jian; Jiang, Yanliang; Wang, Kai; Zhao, Zixia; Chen, Baohua; Zhao, Yunfeng; Mahboob, Shahid; Al-Ghanim, Khalid A; Ke, Caihuan; Xu, Peng
2016-02-01
The small abalone (Haliotis diversicolor) is one of the most important aquaculture species in East Asia. To facilitate gene cloning and characterization, genome analysis, and genetic breeding of this species, we constructed a large-insert bacterial artificial chromosome (BAC) library, an important genetic tool for advanced genetics and genomics research. The small abalone BAC library includes 92,610 clones with an average insert size of 120 Kb, equivalent to approximately 7.6× coverage of the small abalone genome. We set up three-dimensional pools and super pools of 18,432 BAC clones for target gene screening using a PCR-based method. To assess the approach, we screened 12 target genes in these 18,432 BAC clones and identified 16 positive BAC clones. Eight positive BAC clones were then sequenced and assembled with a next-generation sequencing platform. The assembled contigs representing these 8 BAC clones spanned 928 Kb of the small abalone genome, providing the first batch of genome sequences for genome evaluation and characterization. The average GC content of the small abalone genome was estimated as 40.33%. A total of 21 protein-coding genes, including 7 target genes, were annotated in the 8 BACs, which proved the feasibility of the PCR screening approach with three-dimensional pools in the small abalone BAC library. One hundred fifty microsatellite loci were also identified from the sequences for future marker development. The BAC library and clone pools provide valuable resources and tools for genetic breeding and conservation of H. diversicolor.
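The three-dimensional pooling strategy works because a true positive clone must sit at the intersection of one positive plate pool, one positive row pool, and one positive column pool, so on the order of 3n reactions can screen n³ clones. A hypothetical sketch of the decoding step (not the authors' code; pool indices are invented):

```python
import itertools

def decode_positives(plate_hits, row_hits, col_hits):
    """Candidate positive clone coordinates from 3-D pool PCR results.
    Each argument is the set of pool indices that amplified the target.
    A single true positive yields exactly one candidate; multiple
    positives create extra candidates requiring clone-by-clone PCR."""
    return sorted(itertools.product(plate_hits, row_hits, col_hits))

# One target gene amplified in plate pool 7, row pool 2, column pool 11:
print(decode_positives({7}, {2}, {11}))  # -> [(7, 2, 11)]
```

With two positives per axis the candidate list grows to eight triples, which is why the abstract's 12-gene screen still needed confirmation of individual clones.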
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunshah, R.F.; Shabaik, A.H.
The process of activated reactive evaporation is used to synthesize superhard materials such as carbides, oxides, nitrides, and ultrafine-grained cermets. The deposits are characterized by hardness, microstructure, and lattice parameter measurements. The synthesis and characterization of TiC-Ni cermets, Al2O3, and VC-TiC alloy carbides are described. Tools of different coating characteristics are tested for machining performance at different speeds and feeds. The machining evaluation and the selection of coatings are based on the rate of deterioration of the coating, tool temperature, and cutting forces. Tool life tests show that coated high-speed steel tools exhibit a 300% improvement in tool life.
NASA Research Center Contributions to Space Shuttle Return to Flight (SSRTF)
NASA Technical Reports Server (NTRS)
Cockrell, Charles E., Jr.; Barnes, Robert S.; Belvin, Harry L.; Allmen, John; Otero, Angel
2005-01-01
Contributions provided by the NASA Research Centers to key Space Shuttle return-to-flight milestones, with an emphasis on debris and Thermal Protection System (TPS) damage characterization, are described herein. Several CAIB recommendations and Space Shuttle Program directives deal with the mitigation of external tank foam insulation as a debris source, including material characterization as well as potential design changes, and an understanding of Orbiter TPS material characteristics, damage scenarios, and repair options. Ames, Glenn, and Langley Research Centers have performed analytic studies, conducted experimental testing, and developed new technologies, analysis tools, and hardware to contribute to each of these recommendations. For the External Tank (ET), these include studies of spray-on foam insulation (SOFI), investigations of potential design changes, and applications of advanced non-destructive evaluation (NDE) technologies to understand ET TPS shedding during liftoff and ascent. The end-to-end debris assessment included transport analysis to determine the probabilities of impact for various debris sources. For the Orbiter, methods were developed, and validated through experimental testing, to determine thresholds for potential damage of Orbiter TPS components. Analysis tools were developed and validated for on-orbit TPS damage assessments, especially in the area of aerothermal environments. Advanced NDE technologies were also applied to the Orbiter TPS components, including sensor technologies to detect wing leading edge impacts during liftoff and ascent. Work is continuing to develop certified TPS repair options and to develop improved methodologies for reinforced carbon-carbon (RCC) damage progression to assist in on-orbit repair decision philosophy.
Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds
NASA Astrophysics Data System (ADS)
Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.
2016-04-01
A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimates of rock mass properties. Although several advanced methodologies have been developed over recent decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow fast and accurate acquisition of dense 3D point clouds, which has promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often require user supervision of algorithm parameters that can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density, and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploratory data analysis. The identification and geometrical characterization of discontinuity features is divided into steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size.
Then, discontinuity set orientation is calculated using Kernel Density Estimation and principal vector similarity criteria. Poles to points are assigned to individual discontinuity objects using custom vector clustering and Jaccard distance approaches, and each object is segmented into planar clusters using an improved version of the DBSCAN algorithm. Modal set orientations are then recomputed by cluster-based orientation statistics to avoid the effects of biases related to cluster size and density heterogeneity of the point cloud. Finally, spacing values are measured between individual discontinuity clusters along scanlines parallel to modal pole vectors, whereas individual feature size (persistence) is measured using 3D convex hull bounding boxes. Spacing and size are provided both as raw population data and as summary statistics. The tool is optimized for parallel computing on 64-bit systems, and a graphical user interface (GUI) has been developed to manage data processing and provide several outputs, including reclassified point clouds, tables, plots, derived fracture intensity parameters, and export to modelling software tools. We present test applications performed both on synthetic 3D data (simple 3D solids) and real case studies, validating the results with existing geomechanical datasets.
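The coplanar-surface identification step rests on a standard result: for a locally planar patch of points, the covariance eigenvector with the smallest eigenvalue approximates the facet normal. A minimal sketch of that step in Python with NumPy (the authors' tool is in Matlab, so this is only an illustration of the principle):

```python
import numpy as np

def facet_normal(points):
    """Unit normal of a near-planar patch of 3-D points via PCA:
    the eigenvector of the covariance matrix with the smallest
    eigenvalue is normal to the best-fit plane."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues ascending
    n = eigvecs[:, 0]                       # smallest-variance direction
    return n / np.linalg.norm(n)

# Points scattered on the plane z = 0 -> normal along +/- z:
patch = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0), (0.5, 0.3, 0)]
print(np.round(np.abs(facet_normal(patch)), 3))  # -> [0. 0. 1.]
```

Running this per k-nearest-neighbor patch and clustering the resulting normals is the essence of the set-identification stage described above.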
Enabling functional genomics with genome engineering
Hilton, Isaac B.; Gersbach, Charles A.
2015-01-01
Advances in genome engineering technologies have made the precise control over genome sequence and regulation possible across a variety of disciplines. These tools can expand our understanding of fundamental biological processes and create new opportunities for therapeutic designs. The rapid evolution of these methods has also catalyzed a new era of genomics that includes multiple approaches to functionally characterize and manipulate the regulation of genomic information. Here, we review the recent advances of the most widely adopted genome engineering platforms and their application to functional genomics. This includes engineered zinc finger proteins, TALEs/TALENs, and the CRISPR/Cas9 system as nucleases for genome editing, transcription factors for epigenome editing, and other emerging applications. We also present current and potential future applications of these tools, as well as their current limitations and areas for future advances. PMID:26430154
Remote Sensing for Inland Water Quality Monitoring: A U.S. Army Corps of Engineers Perspective
2011-10-01
outlined in Water Quality Management Plans, including traditional field sampling (water, sediment, and biological) and measurement of physical ... at one time, a more comprehensive historical record or trend analysis, a planning tool for prioritizing field surveying and sampling, and accurate ... estimations of optically active constituents used to characterize water quality. Furthermore, when utilized in water quality management planning
Ultrasonic NDE Simulation for Composite Manufacturing Defects
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Juarez, Peter D.
2016-01-01
The increased use of composites in aerospace components is expected to continue into the future. The large-scale use of composites in aerospace necessitates the development of composite-appropriate nondestructive evaluation (NDE) methods to quantitatively characterize defects in as-manufactured parts and damage incurred during or post manufacturing. Ultrasonic techniques are one of the most common approaches for defect/damage detection in composite materials. One key technical challenge area included in NASA's Advanced Composites Project is to develop optimized rapid inspection methods for composite materials. Common manufacturing defects in carbon fiber reinforced polymer (CFRP) composites include fiber waviness (in-plane and out-of-plane), porosity, and disbonds, among others. This paper is an overview of ongoing work to develop ultrasonic wavefield based methods for characterizing manufacturing waviness defects. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with in-plane fiber waviness (also known as marcelling). Wavefield data processing methods are applied to the simulation data to explore possible routes for quantitative defect characterization.
Vleeshouwers, Vivianne G A A; Oliver, Richard P
2014-03-01
One of the most important challenges in plant breeding is improving resistance to the plethora of pathogens that threaten our crops. The ever-growing world population, changing pathogen populations, and fungicide resistance issues have increased the urgency of this task. In addition to a vital inflow of novel resistance sources into breeding programs, the functional characterization and deployment of resistance also need improvement. Therefore, plant breeders need to adopt new strategies and techniques. In modern resistance breeding, effectors are emerging as tools to accelerate and improve the identification, functional characterization, and deployment of resistance genes. Since genome-wide catalogues of effectors have become available for various pathogens, including biotrophs as well as necrotrophs, effector-assisted breeding has been shown to be successful for various crops. "Effectoromics" has contributed to classical resistance breeding as well as to genetically modified approaches. Here, we present an overview of how effector-assisted breeding and deployment are being exploited for various pathosystems.
An Aerodynamic Simulation Process for Iced Lifting Surfaces and Associated Issues
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Vickerman, Mary B.; Hackenberg, Anthony W.; Rigby, David L.
2003-01-01
This paper discusses technologies and software tools that are being implemented in a software toolkit currently under development at NASA Glenn Research Center. Its purpose is to help study the effects of icing on airfoil performance and assist with the aerodynamic simulation process which consists of characterization and modeling of ice geometry, application of block topology and grid generation, and flow simulation. Tools and technologies for each task have been carefully chosen based on their contribution to the overall process. For the geometry characterization and modeling, we have chosen an interactive rather than automatic process in order to handle numerous ice shapes. An Appendix presents features of a software toolkit developed to support the interactive process. Approaches taken for the generation of block topology and grids, and flow simulation, though not yet implemented in the software, are discussed with reasons for why particular methods are chosen. Some of the issues that need to be addressed and discussed by the icing community are also included.
Huang, Ying; Chen, Shi-Yi; Deng, Feilong
2016-01-01
In silico analysis of DNA sequences is an important area of computational biology in the post-genomic era. Over the past two decades, computational approaches for ab initio prediction of gene structure from genome sequence alone have largely facilitated our understanding of a variety of biological questions. Although the computational prediction of protein-coding genes is already well established, robustly identifying non-coding RNA genes, such as miRNAs and lncRNAs, remains a challenge. The two main aspects of ab initio gene prediction are the computed values used to describe sequence features and the algorithm used to train the discriminant function; different combinations of the two are employed in various bioinformatic tools. Herein, we briefly review these well-characterized sequence features in eukaryote genomes and their application to ab initio gene prediction. The main purpose of this article is to provide an overview for beginners who aim to develop related bioinformatic tools.
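Two of the classic compositional features used by ab initio predictors, GC content and open reading frame length, can be computed directly from the sequence. A minimal sketch with toy sequences (forward strand only, standard stop codons):

```python
def gc_content(seq):
    """Fraction of G/C bases, a basic compositional feature for
    gene prediction."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def longest_orf(seq):
    """Length in nucleotides of the longest open reading frame on the
    forward strand (ATG ... stop codon) in any of the three frames."""
    stops = {"TAA", "TAG", "TGA"}
    best = 0
    seq = seq.upper()
    for frame in range(3):
        codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
        start = None
        for j, codon in enumerate(codons):
            if codon == "ATG" and start is None:
                start = j  # first start codon in this frame
            elif codon in stops and start is not None:
                best = max(best, (j - start + 1) * 3)
                start = None
    return best

print(gc_content("ATGC"))        # -> 0.5
print(longest_orf("ATGAAATAG"))  # -> 9
```

Real predictors combine many such features (codon usage, hexamer statistics, splice-site signals) inside the trained discriminant functions the abstract refers to.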
Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostrouchov, George; Doll, William E.; Beard, Les P.
2009-01-01
Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process, the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods that are needed to address the estimation problems are known but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.
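As a toy illustration of the spatial-intensity idea, anomaly detections can be binned into quadrats to give a crude counts-per-area surface. The paper's point-process estimators are far more sophisticated (and incorporate prior and covariate information), so treat this only as a stand-in sketch with invented coordinates:

```python
def quadrat_intensity(points, xmax, ymax, nx, ny):
    """Crude spatial intensity estimate for anomaly locations:
    counts per unit area on an nx-by-ny grid over [0,xmax) x [0,ymax)."""
    counts = [[0] * nx for _ in range(ny)]
    for x, y in points:
        i = min(int(x / xmax * nx), nx - 1)
        j = min(int(y / ymax * ny), ny - 1)
        counts[j][i] += 1
    cell_area = (xmax / nx) * (ymax / ny)
    return [[c / cell_area for c in row] for row in counts]

# Four detections clustered in the lower-left quarter of a 10 m x 10 m site:
pts = [(1, 1), (2, 1), (1, 2), (2, 2)]
grid = quadrat_intensity(pts, 10.0, 10.0, 2, 2)
print(grid[0][0])  # all four points fall in the lower-left 5 m x 5 m cell
```

A full point-process treatment would smooth this surface and attach uncertainty to every cell, which is what makes the representation usable for regulatory decisions.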
New Laboratory Tools for Emerging Bacterial Challenges.
Fournier, Pierre-Edouard; Drancourt, Michel; Raoult, Didier
2017-08-15
Since its creation, the Méditerranée-Infection foundation has aimed at optimizing the management of infectious diseases and surveying the local and global epidemiology. This pivotal role was permitted by the development of rational sampling, point-of-care tests, and extended automation as well as new technologies, including mass spectrometry for colony identification, real-time genomics for isolate characterization, and the development of versatile and permissive culture systems. By identifying and characterizing emerging microbial pathogens, these developments provided significant breakthroughs in infectious diseases. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
Copéret, Christophe
2011-01-05
Stereoselectivity in alkene metathesis is a challenge and can be used as a tool to study active sites under working conditions. This review describes the stereochemical relevance of and problems in alkene metathesis (kinetic vs. thermodynamic issues) and the use of the (E/Z) ratio at low conversions as a tool to characterize the active sites of heterogeneous catalysts, and finally proposes strategies to improve catalysts based on the current state of the art.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, built under an integrated evaluation model framework, is named the LOCA Toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
Optical Fabrication and Measurement AXAF and CIRS
NASA Technical Reports Server (NTRS)
Engelhaupt, Darell
1997-01-01
This paper presents a final report on Optical Fabrication and Measurement for AXAF (Advanced X-Ray Astrophysics Facility) and CIRS (Composite Infrared Spectrometer), covering July 12, 1994 to August 16, 1996. The specific tasks performed are as follows: 1) Preparation and Characterization of Zerodur Glass Samples; 2) Develop and Fabricate AXAF and CIRS Metrology Tooling; 3) Update AXAF Technical Data Base; and 4) Perform Fabrication-Related Metrology Tasks for CIRS. The paper also includes final activities from the July 1996 report through August 1996.
Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.
Stockton, David B; Santamaria, Fidel
2017-10-01
We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
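The "local specialized database of raw data plus extracted features" could be organized along the following lines with the standard library's sqlite3 module. The schema, cell ID, and feature names below are hypothetical; the real tools rely on ABI's own feature extraction code and database layout:

```python
import sqlite3

# Hypothetical local store for extracted sweep features; feature dicts
# stand in for the output of ABI's feature extraction code.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sweep_features (
                    cell_id INTEGER, sweep INTEGER,
                    spike_count INTEGER, mean_rate_hz REAL)""")

def store_sweep(cell_id, sweep, features):
    conn.execute("INSERT INTO sweep_features VALUES (?, ?, ?, ?)",
                 (cell_id, sweep,
                  features["spike_count"], features["mean_rate_hz"]))

store_sweep(123456789, 45, {"spike_count": 12, "mean_rate_hz": 12.0})
store_sweep(123456789, 46, {"spike_count": 30, "mean_rate_hz": 30.0})

# Query the local database instead of re-downloading raw traces:
row = conn.execute("SELECT MAX(mean_rate_hz) FROM sweep_features "
                   "WHERE cell_id = ?", (123456789,)).fetchone()
print(row[0])  # -> 30.0
```

Keeping extracted features queryable locally is what lets a workflow engine like NeuroManager select cells and sweeps without touching the raw data again.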
Advanced Stoichiometric Analysis of Metabolic Networks of Mammalian Systems
Orman, Mehmet A.; Berthiaume, Francois; Androulakis, Ioannis P.; Ierapetritou, Marianthi G.
2013-01-01
Metabolic engineering tools have been widely applied to living organisms to gain a comprehensive understanding about cellular networks and to improve cellular properties. Metabolic flux analysis (MFA), flux balance analysis (FBA), and metabolic pathway analysis (MPA) are among the most popular tools in stoichiometric network analysis. Although application of these tools into well-known microbial systems is extensive in the literature, various barriers prevent them from being utilized in mammalian cells. Limited experimental data, complex regulatory mechanisms, and the requirement of more complex nutrient media are some major obstacles in mammalian cell systems. However, mammalian cells have been used to produce therapeutic proteins, to characterize disease states or related abnormal metabolic conditions, and to analyze the toxicological effects of some medicinally important drugs. Therefore, there is a growing need for extending metabolic engineering principles to mammalian cells in order to understand their underlying metabolic functions. In this review article, advanced metabolic engineering tools developed for stoichiometric analysis including MFA, FBA, and MPA are described. Applications of these tools in mammalian cells are discussed in detail, and the challenges and opportunities are highlighted. PMID:22196224
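At its core, FBA is a linear program: maximize an objective flux subject to the steady-state mass balance S·v = 0 and flux bounds. A minimal sketch on a toy three-reaction network, assuming SciPy is available (genome-scale work uses dedicated packages and curated models):

```python
from scipy.optimize import linprog

# Toy network: uptake -> A (v1), A -> B (v2), B -> biomass (v3).
# Rows of S are metabolite balances (A, B); columns are the fluxes.
S = [[1, -1, 0],   # A: produced by v1, consumed by v2
     [0, 1, -1]]   # B: produced by v2, consumed by v3
c = [0, 0, -1]     # linprog minimizes, so maximize v3 via -v3
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10

res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds)
print(res.x.round(6))  # optimal flux distribution: all fluxes at the cap
```

The mammalian-cell challenges the review discusses show up here as harder-to-measure bounds and far larger, less certain S matrices, not as a different optimization problem.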
Solid-State NMR Study of the Cicada Wing.
Gullion, John D; Gullion, Terry
2017-08-17
Wings of flying insects are part of the cuticle which forms the exoskeleton. The primary molecular components of cuticle are protein, chitin, and lipid. How these components interact with one another to form the exoskeleton is not completely understood. The difficulty in characterizing the cuticle arises because it is insoluble and noncrystalline. These properties severely limit the experimental tools that can be used for molecular characterization. Solid-state nuclear magnetic resonance experiments have been used in the past to characterize the exoskeleton of beetles and have found that chitin and protein make comparable contributions to the molecular matrix. However, little work has been done to characterize the components of the wing, which includes vein and membrane. In this work, solid-state NMR was used to characterize the wing of the 17-year cycle cicada (Magicicada cassini) that appeared in northern West Virginia during the summer of 2016. The NMR results show noticeable differences between the molecular components of the vein and membrane.
IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.
This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems, including physics-based end-use and distributed generation models (many instances of GridLAB-D[TM]). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price-responsive load scenarios.
NASA Astrophysics Data System (ADS)
Iskhakova, K.; Murzakhanov, F.; Mamin, G.; Putlyaev, V.; Klimashina, E.; Fadeeva, I.; Fomin, A.; Barinov, S.; Maltsev, A.; Bakhteev, S.; Yusupov, R.; Gafurov, M.; Orlinskii, S.
2018-05-01
Calcium phosphates (CaP) are exploited in many fields of science, including geology, chemistry, biology, and medicine, due to their abundance in nature and presence in living organisms. Various analytical and biochemical methods are used for controlling their chemical content, structure, morphology, etc. Unfortunately, magnetic resonance techniques are usually not even considered as necessary tools for CaP inspection. Some applications of commercially available electron paramagnetic resonance (EPR) approaches are demonstrated for the characterization of CaP powders and ceramics (including nanosized materials), such as hydroxyapatite and tricalcium phosphates of biogenic and synthetic origins containing intrinsic impurities or intentional dopants. The key features and advantages of EPR techniques for the characterization of CaP-based materials, which could complement data obtained with recognized analytical methods, are pointed out.
Rosen, Jacob; Brown, Jeffrey D; Barreca, Marco; Chang, Lily; Hannaford, Blake; Sinanan, Mika
2002-01-01
Minimally invasive surgery (MIS) involves a multi-dimensional series of tasks requiring a synthesis between visual information and the kinematics and dynamics of the surgical tools. Analysis of these sources of information is a key step in mastering MIS but may also be used to define objective criteria for characterizing surgical performance. The BlueDRAGON is a new system for acquiring the kinematics and dynamics of two endoscopic tools along with the visual view of the surgical scene. It includes two four-bar mechanisms equipped with position and force/torque sensors for measuring the positions and orientations (P/O) of two endoscopic tools along with the forces and torques (F/T) applied by the surgeon's hands. The methodology for decomposing the surgical task is based on a fully connected, finite-state (28 states) Markov model, where each state corresponds to a fundamental tool/tissue interaction based on the tool kinematics and associated with a unique F/T signature. The experimental protocol included seven MIS tasks performed on an animal model (pig) by 30 surgeons at different levels of their residency training. Preliminary analysis of these data showed that the major differences between residents at different skill levels were: (i) the types of tool/tissue interactions being used, (ii) the transitions between tool/tissue interactions applied by each hand, (iii) the time spent performing each tool/tissue interaction, (iv) the overall completion time, and (v) the F/T magnitudes applied by the subjects through the endoscopic tools. Systems such as surgical robots or virtual reality simulators that inherently measure the kinematics and dynamics of the surgical tool may benefit from inclusion of the proposed methodology for analysis of efficacy and objective evaluation of surgical skills during training.
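The Markov-model decomposition described above rests on estimating transition probabilities between tool/tissue interaction states from observed sequences. A minimal sketch of that estimation step, with hypothetical state labels standing in for the paper's 28 states:

```python
# Sketch: estimate Markov transition probabilities from labeled tool/tissue
# state sequences. The state names are hypothetical stand-ins, not the
# actual 28 states of the BlueDRAGON model.
from collections import Counter, defaultdict

def transition_matrix(sequences):
    """Count state-to-state transitions and normalize each row to probabilities."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: n / sum(row.values()) for b, n in row.items()}
            for a, row in counts.items()}

# two toy task recordings
demo = [["idle", "grasp", "pull", "idle"],
        ["idle", "grasp", "grasp", "idle"]]
P = transition_matrix(demo)
print(P["grasp"])  # distribution over states following "grasp"
```

Comparing such row distributions between novice and expert recordings is one way to quantify the skill-level differences in transitions that the study reports.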
Species richness and variety of life in Arizona’s ponderosa pine forest type
David R. Patton; Richard W. Hofstetter; John D. Bailey; Mary Ann Benoit
2014-01-01
Species richness (SR) is a tool that managers can use to include diversity in planning and decision-making and is a convenient and useful way to characterize the first level of biological diversity. A richness list derived from existing inventories enhances a manager's understanding of the complexity of the plant and animal communities they manage. Without a list of...
Process Definition and Modeling Guidebook. Version 01.00.02
1992-12-01
material (and throughout the guidebook) process definition is considered to be the act of representing the important characteristics of a process in a...characterized by software standards and guidelines, software inspections and reviews, and more formalized testing (including test plans, test support tools...paper-based approach works well for training, examples, and possibly even small pilot projects and case studies. However, large projects will benefit from
2011-09-01
project research addresses our long-term goal to develop an analytical suite of the Advanced Laser Fluorescence (ALF) methods and instruments to improve...demonstrated ALF utility as an integrated tool for aquatic research and observations. The ALF integration into the major oceanographic programs is...currently in progress, including the California Current Ecosystem Long Term Ecological Research (CCE LTER, NSF) and California Cooperative Oceanic
Automatically Detecting Failures in Natural Language Processing Tools for Online Community Text.
Park, Albert; Hartzler, Andrea L; Huh, Jina; McDonald, David W; Pratt, Wanda
2015-08-31
The prevalence and value of patient-generated health text are increasing, but processing such text remains problematic. Although existing biomedical natural language processing (NLP) tools are appealing, most were developed to process clinician- or researcher-generated text, such as clinical notes or journal articles. In addition to being constructed for different types of text, other challenges of using existing NLP tools include constantly changing technologies, source vocabularies, and characteristics of text. These continuously evolving challenges warrant the need for applying low-cost systematic assessment. However, the primarily accepted evaluation method in NLP, manual annotation, requires tremendous effort and time. The primary objective of this study is to explore an alternative approach: using low-cost, automated methods to detect failures (eg, incorrect boundaries, missed terms, mismapped concepts) when processing patient-generated text with existing biomedical NLP tools. We first characterize common failures that NLP tools can make in processing online community text. We then demonstrate the feasibility of our automated approach in detecting these common failures using one of the most popular biomedical NLP tools, MetaMap. Using 9657 posts from an online cancer community, we explored our automated failure detection approach in two steps: (1) to characterize the failure types, we first manually reviewed MetaMap's commonly occurring failures, grouped the inaccurate mappings into failure types, and then identified causes of the failures through iterative rounds of manual review using open coding, and (2) to automatically detect these failure types, we then explored combinations of existing NLP techniques and dictionary-based matching for each failure cause. Finally, we manually evaluated the automatically detected failures.
From our manual review, we characterized three types of failure: (1) boundary failures, (2) missed term failures, and (3) word ambiguity failures. Within these three failure types, we discovered 12 causes of inaccurate concept mappings. Using automated methods, we detected almost half of MetaMap's 383,572 mappings as problematic. Word sense ambiguity failure was the most widely occurring, comprising 82.22% of failures. Boundary failure was the second most frequent, amounting to 15.90% of failures, while missed term failures were the least common, making up 1.88% of failures. The automated failure detection achieved precision, recall, accuracy, and F1 score of 83.00%, 92.57%, 88.17%, and 87.52%, respectively. We illustrate the challenges of processing patient-generated online health community text and characterize failures of NLP tools on this patient-generated health text, demonstrating the feasibility of our low-cost approach to automatically detecting those failures. Our approach shows the potential for scalable and effective solutions to automatically assess the constantly evolving NLP tools and source vocabularies used to process patient-generated text.
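One of the failure types above, boundary failures, lends itself to a simple dictionary-based check: flag any concept mapping whose span covers only part of a longer term found in a reference vocabulary. The term list and mapping format below are hypothetical illustrations, not MetaMap's actual output:

```python
# Minimal sketch of dictionary-based boundary-failure detection: flag a
# concept mapping whose matched span splits a longer multiword term present
# in a reference vocabulary. Terms and mapping tuples are hypothetical.
MULTIWORD_TERMS = {"breast cancer", "radiation therapy", "side effect"}

def boundary_failures(text, mappings):
    """mappings: list of (start, end, concept) character spans mapped to concepts."""
    text_lower = text.lower()
    failures = []
    for start, end, concept in mappings:
        for term in MULTIWORD_TERMS:
            i = text_lower.find(term)
            # a dictionary term overlaps the mapped span, but the span does
            # not cover the whole term -> likely boundary failure
            if i != -1 and i < end and start < i + len(term) \
                    and not (start <= i and i + len(term) <= end):
                failures.append((start, end, concept, term))
    return failures

text = "She started radiation therapy last week."
hits = boundary_failures(text, [(12, 21, "Radiation")])  # maps only "radiation"
print(hits)
```

A real detector would also need tokenization, morphological normalization, and the other failure causes the study enumerates; this only shows the shape of one dictionary-matching check.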
Molecular and Nanoscale Engineering of High Efficiency Excitonic Solar Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenekhe, Samson A.; Ginger, David S.; Cao, Guozhong
We combined the synthesis of new polymers and organic-inorganic hybrid materials with new experimental characterization tools to investigate bulk heterojunction (BHJ) polymer solar cells and hybrid organic-inorganic solar cells during the 2007-2010 period (phase I) of this project. We showed that the bulk morphology of polymer/fullerene blend solar cells could be controlled by using either self-assembled polymer semiconductor nanowires or diblock poly(3-alkylthiophenes) as the light-absorbing and hole-transport component. We developed new characterization tools in-house, including photoinduced absorption (PIA) spectroscopy, time-resolved electrostatic force microscopy (TR-EFM), and conductive and photoconductive atomic force microscopy (c-AFM and pc-AFM), and used them to investigate charge transfer and recombination dynamics in polymer/fullerene BHJ solar cells, hybrid polymer-nanocrystal (PbSe) devices, and dye-sensitized solar cells (DSSCs); we thus showed in detail how the bulk photovoltaic properties are connected to the nanoscale structure of BHJ polymer solar cells. We created various oxide semiconductor (ZnO, TiO2) nanostructures by solution-processing routes, including hierarchical aggregates and nanorods/nanotubes, and showed that the nanostructured photoanodes resulted in substantially enhanced light harvesting and charge transport, leading to enhanced power conversion efficiency of dye-sensitized solar cells.
Phytophthora database 2.0: update and future direction.
Park, Bongsoo; Martin, Frank; Geiser, David M; Kim, Hye-Seon; Mansfield, Michele A; Nikolaeva, Ekaterina; Park, Sook-Young; Coffey, Michael D; Russo, Joseph; Kim, Seong H; Balci, Yilmaz; Abad, Gloria; Burgess, Treena; Grünwald, Niklaus J; Cheong, Kyeongchae; Choi, Jaeyoung; Lee, Yong-Hwan; Kang, Seogchan
2013-12-01
The online community resource Phytophthora database (PD) was developed to support accurate and rapid identification of Phytophthora and to help characterize and catalog the diversity and evolutionary relationships within the genus. Since its release in 2008, the sequence database has grown to cover 1 to 12 loci for ≈2,600 isolates (representing 138 described and provisional species). Sequences of multiple mitochondrial loci were added to complement nuclear loci-based phylogenetic analyses and diagnostic tool development. Key characteristics of most newly described and provisional species have been summarized. Other additions to improve the PD functionality include: (i) geographic information system tools that enable users to visualize the geographic origins of chosen isolates on a global-scale map, (ii) a tool for comparing genetic similarity between isolates via microsatellite markers to support population genetic studies, (iii) a comprehensive review of molecular diagnostics tools and relevant references, (iv) sequence alignments used to develop polymerase chain reaction-based diagnostics tools to support their utilization and new diagnostic tool development, and (v) an online community forum for sharing and preserving experience and knowledge accumulated in the global Phytophthora community. Here we present how these improvements can support users and discuss the PD's future direction.
Kohmoto, Tomohiro; Masuda, Kiyoshi; Naruto, Takuya; Tange, Shoichiro; Shoda, Katsutoshi; Hamada, Junichi; Saito, Masako; Ichikawa, Daisuke; Tajima, Atsushi; Otsuji, Eigo; Imoto, Issei
2017-01-01
High-throughput next-generation sequencing is a powerful tool to identify the genotypic landscapes of somatic variants and therapeutic targets in various cancers including gastric cancer, forming the basis for personalized medicine in the clinical setting. Although the advent of many computational algorithms leads to higher accuracy in somatic variant calling, no standard method exists due to the limitations of each method. Here, we constructed a new pipeline. We combined two different somatic variant callers with different algorithms, Strelka and VarScan 2, and evaluated performance using whole exome sequencing data obtained from 19 Japanese cases with gastric cancer (GC); then, we characterized these tumors based on identified driver molecular alterations. More single nucleotide variants (SNVs) and small insertions/deletions were detected by Strelka and VarScan 2, respectively. SNVs detected by both tools showed higher accuracy for estimating somatic variants compared with those detected by only one of the two tools and accurately showed the mutation signature and mutations of driver genes reported for GC. Our combinatorial pipeline may have an advantage in detection of somatic mutations in GC and may be useful for further genomic characterization of Japanese patients with GC to improve the efficacy of GC treatments. J. Med. Invest. 64: 233-240, August, 2017.
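The combinatorial step the pipeline describes, keeping variants reported by both callers, can be sketched as a set intersection keyed on chromosome, position, and alleles. The records below are illustrative stand-ins, not actual Strelka or VarScan 2 output:

```python
# Sketch: intersect SNV calls from two callers keyed on (chrom, pos, ref, alt).
# Variants seen by both tools are kept as higher-confidence calls, mirroring
# the pipeline's rationale. The records are illustrative, not real caller output.
def to_keys(calls):
    return {(c["chrom"], c["pos"], c["ref"], c["alt"]) for c in calls}

strelka = [{"chrom": "chr1", "pos": 100, "ref": "A", "alt": "T"},
           {"chrom": "chr2", "pos": 500, "ref": "G", "alt": "C"}]
varscan = [{"chrom": "chr1", "pos": 100, "ref": "A", "alt": "T"},
           {"chrom": "chr7", "pos": 900, "ref": "C", "alt": "G"}]

consensus = to_keys(strelka) & to_keys(varscan)
print(sorted(consensus))  # [('chr1', 100, 'A', 'T')]
```

In practice the inputs would be parsed from each caller's VCF output, and indels would need left-normalization before keys from different callers are comparable.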
Friction Stir Spot Welding of Advanced High Strength Steels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hovanski, Yuri; Grant, Glenn J.; Santella, M. L.
Friction stir spot welding techniques were developed to successfully join several advanced high strength steels. Two distinct tool materials were evaluated to determine the effect of tool materials on the process parameters and joint properties. Welds were characterized primarily via lap shear, microhardness, and optical microscopy. Friction stir spot welds were compared to the resistance spot welds in similar strength alloys by using the AWS standard for resistance spot welding high strength steels. As further comparison, a primitive cost comparison between the two joining processes was developed, which included an evaluation of the future cost prospects of friction stir spot welding in advanced high strength steels.
Phased Array Antenna Testbed Development at the NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Lambert, Kevin M.; Kubat, Gregory; Johnson, Sandra K.; Anzic, Godfrey
2003-01-01
Ideal phased array antennas offer advantages for communication systems, such as wide-angle scanning and multibeam operation, which can be utilized in certain NASA applications. However, physically realizable, electronically steered, phased array antennas introduce additional system performance parameters, which must be included in the evaluation of the system. The NASA Glenn Research Center (GRC) is currently conducting research to identify these parameters and to develop the tools necessary to measure them. One of these tools is a testbed where phased array antennas may be operated in an environment that simulates their use. This paper describes the development of the testbed and its use in characterizing a particular K-Band, phased array antenna.
Gray, Kathleen M.
2018-01-01
Environmental health literacy (EHL) is a relatively new framework for conceptualizing how people understand and use information about potentially harmful environmental exposures and their influence on health. As such, information on the characterization and measurement of EHL is limited. This review provides an overview of EHL as presented in peer-reviewed literature and aggregates studies based on whether they represent individual level EHL or community level EHL or both. A range of assessment tools has been used to measure EHL, with many studies relying on pre-/post-assessment; however, a broader suite of assessment tools may be needed to capture community-wide outcomes. This review also suggests that the definition of EHL should explicitly include community change or collective action as an important longer-term outcome and proposes a refinement of previous representations of EHL as a theoretical framework, to include self-efficacy. PMID:29518955
Approaches for Defining the Hsp90-dependent Proteome
Hartson, Steven D.; Matts, Robert L.
2011-01-01
Hsp90 is the target of ongoing drug discovery studies seeking new compounds to treat cancer, neurodegenerative diseases, and protein folding disorders. To better understand Hsp90’s roles in cellular pathologies and in normal cells, numerous studies have utilized proteomics assays and related high-throughput tools to characterize its physical and functional protein partnerships. This review surveys these studies, and summarizes the strengths and limitations of the individual attacks. We also include downloadable spreadsheets compiling all of the Hsp90-interacting proteins identified in more than 23 studies. These tools include cross-references among gene aliases, human homologues of yeast Hsp90-interacting proteins, hyperlinks to database entries, summaries of canonical pathways that are enriched in the Hsp90 interactome, and additional bioinformatic annotations. In addition to summarizing Hsp90 proteomics studies performed to date and the insights they have provided, we identify gaps in our current understanding of Hsp90-mediated proteostasis. PMID:21906632
Moscow Test Well, INEL Oversight Program: Aqueous geochemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurry, M.; Fromm, J.; Welhan, J.
1992-09-29
This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho, during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle-packer sampling tool in a previously hydrologically well-characterized and simple sampling environment, and to compare analytical results from two independent labs for reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analysis of waters from a large distilled-water tank (utilized for all field laboratory purposes as "pure" stock water), of water that passed through a steamer used to clean the packer, and of rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.
Enabling functional genomics with genome engineering.
Hilton, Isaac B; Gersbach, Charles A
2015-10-01
Advances in genome engineering technologies have made the precise control over genome sequence and regulation possible across a variety of disciplines. These tools can expand our understanding of fundamental biological processes and create new opportunities for therapeutic designs. The rapid evolution of these methods has also catalyzed a new era of genomics that includes multiple approaches to functionally characterize and manipulate the regulation of genomic information. Here, we review the recent advances of the most widely adopted genome engineering platforms and their application to functional genomics. This includes engineered zinc finger proteins, TALEs/TALENs, and the CRISPR/Cas9 system as nucleases for genome editing, transcription factors for epigenome editing, and other emerging applications. We also present current and potential future applications of these tools, as well as their current limitations and areas for future advances. © 2015 Hilton and Gersbach; Published by Cold Spring Harbor Laboratory Press.
NASA Technical Reports Server (NTRS)
Grillenbeck, Anton M.; Dillinger, Stephan A.; Elliott, Kenny B.
1998-01-01
Theoretical and experimental studies have been performed to investigate the potential and limitations of the modal characterization of a typical spacecraft bus structure by means of active structure elements. The aim of these studies has been to test and advance tools for performing an accurate on-orbit modal identification, which may be characterized by the availability of a generally very limited test instrumentation, autonomous excitation capabilities by active structure elements, and a zero-g environment. The NASA LaRC CSI Evolutionary Testbed provided an excellent object for the experimental part of this study program. The main subjects of investigation were: (1) the selection of optimum excitation and measurement locations to unambiguously identify modes of interest; (2) the applicability of different types of excitation means, with focus on active structure elements; and (3) the assessment of the modal identification potential of different types of excitation functions and modal analysis tools. Conventional as well as dedicated modal analysis tools were applied to determine modal parameters and mode shapes. The results will be presented and discussed based on orthogonality checks as well as on suitable indicators for the quality of the acquired modes with respect to modal purity. In particular, the suitability for modal analysis of the acquired frequency response functions, as obtained by excitation with active structure elements, will be demonstrated with the help of reciprocity checks. Finally, the results will be summarized in a procedure to perform an on-orbit modal identification, including an indication of limitations to be observed.
U.S. Geological Survey: A synopsis of Three-dimensional Modeling
Jacobsen, Linda J.; Glynn, Pierre D.; Phelps, Geoff A.; Orndorff, Randall C.; Bawden, Gerald W.; Grauch, V.J.S.
2011-01-01
The U.S. Geological Survey (USGS) is a multidisciplinary agency that provides assessments of natural resources (geological, hydrological, biological), the disturbances that affect those resources, and the disturbances that affect the built environment, natural landscapes, and human society. Until now, USGS map products have been generated and distributed primarily as 2-D maps, occasionally providing cross sections or overlays, but rarely providing the ability to characterize and understand 3-D systems, how they change over time (4-D), and how they interact. And yet, technological advances in monitoring natural resources and the environment, the ever-increasing diversity of information needed for holistic assessments, and the intrinsic 3-D/4-D nature of the information obtained increase the need to generate, verify, analyze, interpret, confirm, store, and distribute scientific information and products using 3-D/4-D visualization, analysis, modeling tools, and information frameworks. Today, USGS scientists use 3-D/4-D tools to (1) visualize and interpret geological information, (2) verify the data, and (3) verify their interpretations and models. 3-D/4-D visualization can be a powerful quality control tool in the analysis of large, multidimensional data sets. USGS scientists use 3-D/4-D technology for 3-D surface (i.e., 2.5-D) visualization as well as for 3-D volumetric analyses. Examples of geological mapping in 3-D include characterization of the subsurface for resource assessments, such as aquifer characterization in the central United States, and for input into process models, such as seismic hazards in the western United States.
NMR Methods, Applications and Trends for Groundwater Evaluation and Management
NASA Astrophysics Data System (ADS)
Walsh, D. O.; Grunewald, E. D.
2011-12-01
Nuclear magnetic resonance (NMR) measurements have a tremendous potential for improving groundwater characterization, as they provide direct detection and measurement of groundwater and unique information about pore-scale properties. NMR measurements, commonly used in chemistry and medicine, are utilized in geophysical investigations through non-invasive surface NMR (SNMR) or downhole NMR logging measurements. Our recent and ongoing research has focused on improving the performance and interpretation of NMR field measurements for groundwater characterization. Engineering advancements have addressed several key technical challenges associated with SNMR measurements. Susceptibility of SNMR measurements to environmental noise has been dramatically reduced through the development of multi-channel acquisition hardware and noise-cancellation software. Multi-channel instrumentation (up to 12 channels) has also enabled more efficient 2D and 3D imaging. Previous limitations in measuring NMR signals from water in silt, clay and magnetic geology have been addressed by shortening the instrument dead-time from 40 ms to 4 ms, and increasing the power output. Improved pulse sequences have been developed to more accurately estimate NMR relaxation times and their distributions, which are sensitive to pore size distributions. Cumulatively, these advancements have vastly expanded the range of environments in which SNMR measurements can be obtained, enabling detection of groundwater in smaller pores, in magnetic geology, in the unsaturated zone, and nearby to infrastructure (presented here in case studies). NMR logging can provide high-resolution estimates of bound and mobile water content and pore size distributions. While NMR logging has been utilized in oil and gas applications for decades, its use in groundwater investigations has been limited by the large size and high cost of oilfield NMR logging tools and services. 
Recently, engineering efforts funded by the US Department of Energy have produced an NMR logging tool that is much smaller and less costly than comparable oilfield NMR logging tools. This system is specifically designed for near-surface groundwater investigations, incorporates small-diameter probes (as small as 1.67 inches in diameter) and man-portable surface stations, and provides NMR data and information content on par with oilfield NMR logging tools. A direct-push variant of this logging tool has also been developed. Key challenges associated with small-diameter tools include inherently lower SNR and logging speeds, the desire to extend the sensitive zone as far as possible into unconsolidated formations, and simultaneously maintaining high power and signal fidelity. Our ongoing research in groundwater NMR aims to integrate surface and borehole measurements for regional-scale permeability mapping, and to develop in-place NMR sensors for long-term monitoring of contaminant and remediation processes. In addition to groundwater resource characterization, promising new applications of NMR include assessing water content in ice and permafrost, management of groundwater in mining operations, and evaluation and management of groundwater in civil engineering applications.
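The relaxation-time estimation mentioned above can be illustrated with the simplest possible case: fitting a mono-exponential decay s(t) = s0·exp(-t/T2) by log-linear least squares. Real groundwater NMR signals are multi-exponential (reflecting a distribution of pore sizes), so this is only an illustrative sketch:

```python
# Sketch: estimate a single T2 relaxation time from echo-train amplitudes by
# log-linear least squares on s(t) = s0 * exp(-t / T2). Real groundwater NMR
# data are multi-exponential; this mono-exponential fit is only illustrative.
import math

def fit_t2(times, signals):
    ys = [math.log(s) for s in signals]
    n = len(times)
    tm = sum(times) / n
    ym = sum(ys) / n
    # ordinary least-squares slope of log(signal) vs. time
    slope = sum((t - tm) * (y - ym) for t, y in zip(times, ys)) \
            / sum((t - tm) ** 2 for t in times)
    return -1.0 / slope  # slope of log s is -1/T2

# synthetic noise-free decay with T2 = 0.2 s
times = [0.004 * i for i in range(1, 11)]
signals = [math.exp(-t / 0.2) for t in times]
print(round(fit_t2(times, signals), 3))  # → 0.2
```

Inverting for a full T2 distribution, as pore-size estimation requires, is a regularized inverse problem rather than a single-parameter fit.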
Concentration solar power optimization system and method of using the same
Andraka, Charles E
2014-03-18
A system and method for optimizing at least one mirror of at least one CSP system is provided. The system has a screen for displaying light patterns for reflection by the mirror, a camera for receiving a reflection of the light patterns from the mirror, and a solar characterization tool. The solar characterization tool has a characterizing unit for determining at least one mirror parameter of the mirror based on an initial position of the camera and the screen, and a refinement unit for refining the determined parameter(s) based on an adjusted position of the camera and screen whereby the mirror is characterized. The system may also be provided with a solar alignment tool for comparing at least one mirror parameter of the mirror to a design geometry whereby an alignment error is defined, and at least one alignment unit for adjusting the mirror to reduce the alignment error.
Investigating the Effects of Pin Tool Design on Friction Stir Welded Ti-6Al-4V
NASA Technical Reports Server (NTRS)
Rubisoff, H. A.; Querin, J. A.; Schneider, Judy A.; Magee, D.
2009-01-01
Friction stir welding (FSW), a solid-state joining technique, uses a non-consumable rotating pin tool to thermomechanically join materials. Heating of the weldment caused by friction and deformation is a function of the interaction between the pin tool and the workpiece. Therefore, the geometry of the pin tool is in part responsible for the resulting microstructure and mechanical properties. In this study, microwave-sintered tungsten carbide (WC) pin tools with tapers and flats were used to FSW Ti-6Al-4V. Transverse sections of welds were mechanically tested, and the microstructure was characterized using optical microscopy (OM) and scanning electron microscopy (SEM). X-ray diffraction (XRD) and electron backscatter diffraction (EBSD) were used to characterize the texture within the welds produced from the different pin tool designs.
Toutios, Asterios; Narayanan, Shrikanth S
2016-01-01
Real-time magnetic resonance imaging (rtMRI) of the moving vocal tract during running speech production is an important emerging tool for speech production research providing dynamic information of a speaker's upper airway from the entire mid-sagittal plane or any other scan plane of interest. There have been several advances in the development of speech rtMRI and corresponding analysis tools, and their application to domains such as phonetics and phonological theory, articulatory modeling, and speaker characterization. An important recent development has been the open release of a database that includes speech rtMRI data from five male and five female speakers of American English each producing 460 phonetically balanced sentences. The purpose of the present paper is to give an overview and outlook of the advances in rtMRI as a tool for speech research and technology development.
Different Strokes for Different Folks: Visual Presentation Design between Disciplines
Gomez, Steven R.; Jianu, Radu; Ziemkiewicz, Caroline; Guo, Hua; Laidlaw, David H.
2015-01-01
We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard “chalk talks”. We found design differences in slideshows using two methods – coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant’s own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information. PMID:26357149
Recent Advances in Cardiac Computed Tomography: Dual Energy, Spectral and Molecular CT Imaging
Danad, Ibrahim; Fayad, Zahi A.; Willemink, Martin J.; Min, James K.
2015-01-01
Computed tomography (CT) has evolved into a powerful diagnostic tool, and it is impossible to imagine current clinical practice without CT imaging. Due to its widespread availability, ease of clinical application, superb sensitivity for the detection of CAD, and non-invasive nature, CT has become a valuable tool within the armamentarium of the cardiologist. In the last few years, numerous technological advances in CT have occurred, including dual-energy CT (DECT), spectral CT, and CT-based molecular imaging. By harnessing these advances in technology, cardiac CT has advanced beyond the mere evaluation of coronary stenosis to an imaging modality that permits accurate plaque characterization, assessment of myocardial perfusion, and even probing of molecular processes involved in coronary atherosclerosis. Novel innovations in CT contrast agents and pre-clinical spectral CT devices have paved the way for CT-based molecular imaging. PMID:26068288
THE DURABILITY OF LARGE-SCALE ADDITIVE MANUFACTURING COMPOSITE MOLDS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Post, Brian K; Love, Lonnie J; Duty, Chad
2016-01-01
Oak Ridge National Laboratory's Big Area Additive Manufacturing (BAAM) technology permits the rapid production of thermoplastic composite molds using a carbon fiber filled Acrylonitrile-Butadiene-Styrene (ABS) thermoplastic. Demonstration tools (0.965 m × 0.559 m × 0.152 m) for composite part fabrication have been printed, coated, and finished with a traditional tooling gel. We present validation results demonstrating the stability of thermoplastic printed molds for room temperature Vacuum Assisted Resin Transfer Molding (VARTM) processes. Arkema's Elium thermoplastic resin was investigated with a variety of reinforcement materials. Experimental results include dimensional characterization of the tool surface using laser scanning techniques following demolding of 10 parts. Thermoplastic composite molds offer rapid production compared to traditionally built thermoset molds in that near-net deposition allows direct digital production of the net geometry at a production rate of 45 kg/hr.
Social media use and educational preferences among first-year pharmacy students.
Clauson, Kevin A; Singh-Franco, Devada; Sircar-Ramsewak, Feroza; Joseph, Shine; Sandars, John
2013-01-01
Social media may offer a means to engage students, facilitate collaborative learning, and tailor educational delivery for diverse learning styles. The purpose of this study is to characterize social media awareness among pharmacy students and determine perceptions toward integrating these tools in education. A 23-item survey was administered to 1st-year students at a multicampus college of pharmacy. Students (95% response rate; N = 196) most commonly used wikis (97%), social networking (91%), and videosharing (84%). Tools reported as never used or unknown included social bookmarking (89%), collaborative writing (84%), and RSS readers (73%). Respondents indicated that educational integration of social media would impact their ability to learn in a positive/very positive manner (75%) and make them feel connected/very connected (68%). Selectively targeting social media for educational integration and instructing pharmacy students how to employ a subset of these tools may be useful in engaging them and encouraging lifelong learning.
Different Strokes for Different Folks: Visual Presentation Design between Disciplines.
Gomez, S R; Jianu, R; Ziemkiewicz, C; Guo, Hua; Laidlaw, D H
2012-12-01
We present an ethnographic study of design differences in visual presentations between academic disciplines. Characterizing design conventions between users and data domains is an important step in developing hypotheses, tools, and design guidelines for information visualization. In this paper, disciplines are compared at a coarse scale between four groups of fields: social, natural, and formal sciences; and the humanities. Two commonplace presentation types were analyzed: electronic slideshows and whiteboard "chalk talks". We found design differences in slideshows using two methods - coding and comparing manually-selected features, like charts and diagrams, and an image-based analysis using PCA called eigenslides. In whiteboard talks with controlled topics, we observed design behaviors, including using representations and formalisms from a participant's own discipline, that suggest authors might benefit from novel assistive tools for designing presentations. Based on these findings, we discuss opportunities for visualization ethnography and human-centered authoring tools for visual information.
Verification and Validation of NASA-Supported Enhancements to Decision Support Tools of PECAD
NASA Technical Reports Server (NTRS)
Ross, Kenton W.; McKellip, Rodney; Moore, Roxzana F.; Fendley, Debbie
2005-01-01
This section of the evaluation report summarizes the verification and validation (V&V) of recently implemented, NASA-supported enhancements to the decision support tools of the Production Estimates and Crop Assessment Division (PECAD). The implemented enhancements include operationally tailored Moderate Resolution Imaging Spectroradiometer (MODIS) products and products of the Global Reservoir and Lake Monitor (GRLM). The MODIS products are currently made available through two separate decision support tools: the MODIS Image Gallery and the U.S. Department of Agriculture (USDA) Foreign Agricultural Service (FAS) MODIS Normalized Difference Vegetation Index (NDVI) Database. Both the Global Reservoir and Lake Monitor and MODIS Image Gallery provide near-real-time products through PECAD's CropExplorer. This discussion addresses two areas: 1. Assessments of the standard NASA products on which these enhancements are based. 2. Characterizations of the performance of the new operational products.
TOUTIOS, ASTERIOS; NARAYANAN, SHRIKANTH S.
2016-01-01
Real-time magnetic resonance imaging (rtMRI) of the moving vocal tract during running speech production is an important emerging tool for speech production research, providing dynamic information of a speaker's upper airway from the entire mid-sagittal plane or any other scan plane of interest. There have been several advances in the development of speech rtMRI and corresponding analysis tools, and their application to domains such as phonetics and phonological theory, articulatory modeling, and speaker characterization. An important recent development has been the open release of a database that includes speech rtMRI data from five male and five female speakers of American English, each producing 460 phonetically balanced sentences. The purpose of the present paper is to give an overview and outlook of the advances in rtMRI as a tool for speech research and technology development. PMID:27833745
Small Angle X-ray Scattering for Nanoparticle Research
Li, Tao; Senesi, Andrew J.; Lee, Byeongdu
2016-04-07
X-ray scattering is a structural characterization tool that has impacted diverse fields of study. It is unique in its ability to examine materials in real time and under realistic sample environments, enabling researchers to understand morphology at nanometer and ångström length scales using complementary small and wide angle X-ray scattering (SAXS, WAXS), respectively. Herein, we focus on the use of SAXS to examine nanoscale particulate systems. We provide a theoretical foundation for X-ray scattering, considering both form factor and structure factor, as well as the use of correlation functions, which may be used to determine a particle's size, size distribution, shape, and organization into hierarchical structures. The theory is expanded upon with contemporary use cases. Both transmission and reflection (grazing incidence) geometries are addressed, as well as the combination of SAXS with other X-ray and non-X-ray characterization tools. Furthermore, we conclude with an examination of several key areas of research where X-ray scattering has played a pivotal role, including in situ nanoparticle synthesis, nanoparticle assembly, and in operando studies of catalysts and energy storage materials. Throughout this review we highlight the unique capabilities of X-ray scattering for structural characterization of materials in their native environment.
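The form-factor analysis the review refers to can be made concrete with the textbook scattering amplitude of a homogeneous sphere. A minimal sketch, with an illustrative radius and q-range that are not taken from the review:

```python
import numpy as np

def sphere_form_factor(q, radius):
    """Normalized scattering amplitude F(q) of a homogeneous sphere.

    F(q) = 3 [sin(qR) - qR cos(qR)] / (qR)^3, with F(0) = 1.
    For a dilute suspension the scattered intensity is I(q) ~ F(q)^2.
    """
    qr = np.asarray(q, dtype=float) * radius
    f = np.ones_like(qr)                # the qR -> 0 limit is 1
    nz = qr != 0
    f[nz] = 3.0 * (np.sin(qr[nz]) - qr[nz] * np.cos(qr[nz])) / qr[nz] ** 3
    return f

# Illustrative: dilute, monodisperse 5 nm spheres over a typical SAXS q-range
q = np.linspace(1e-3, 3.0, 500)         # 1/nm
intensity = sphere_form_factor(q, radius=5.0) ** 2

# The first intensity minimum of a sphere sits near qR ~ 4.493, a classic
# way to read a particle radius directly off a SAXS curve.
first_min = q[np.argmin(intensity[q * 5.0 < 6.0])]
```

Polydispersity and a structure factor, both discussed in the review, would smear and modulate these sharp minima; this sketch covers only the ideal single-sphere case.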
Rommel, Karl-Philipp; Lücke, Christian; Lurz, Philipp
2017-10-01
Heart failure with preserved ejection fraction (HFpEF) presents a major challenge in modern cardiology. Although this syndrome is of increasing prevalence and is associated with unfavorable outcomes, treatment trials have failed to establish effective therapies. Currently, solutions to this dilemma are being investigated, including categorizing and characterizing patients more diversely to individualize treatment. In this regard, new imaging techniques might provide important information. Diastolic dysfunction is a diagnostic and pathophysiological cornerstone in HFpEF and is believed to be caused by systemic inflammation with the development of interstitial myocardial fibrosis and myocardial stiffening. Cardiac magnetic resonance (CMR) T1 mapping is a novel tool, which allows noninvasive quantification of the extracellular space and diffuse myocardial fibrosis. This review provides an overview of the potential of myocardial tissue characterization with CMR T1 mapping in HFpEF patients, outlining its diagnostic and prognostic implications and discussing future directions. We conclude that CMR T1 mapping is potentially an effective tool for patient characterization in large-scale epidemiological, diagnostic, and therapeutic HFpEF trials beyond traditional imaging parameters. Copyright © 2017 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
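The "noninvasive quantification of the extracellular space" mentioned above is commonly reported as the extracellular volume fraction (ECV) computed from pre- and post-contrast T1. A minimal sketch of the standard formula, with made-up example values rather than patient data from the review:

```python
def ecv_fraction(t1_myo_native, t1_myo_post,
                 t1_blood_native, t1_blood_post, hematocrit):
    """Extracellular volume fraction from pre/post-contrast T1 times (ms).

    ECV = (1 - Hct) * dR1_myocardium / dR1_blood, where R1 = 1/T1 and
    dR1 is the contrast-induced change in relaxation rate.
    """
    dr1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_native
    dr1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_native
    return (1.0 - hematocrit) * dr1_myo / dr1_blood

# Illustrative (hypothetical) values in ms; hematocrit as a fraction
ecv = ecv_fraction(t1_myo_native=980.0, t1_myo_post=450.0,
                   t1_blood_native=1550.0, t1_blood_post=280.0,
                   hematocrit=0.42)    # ~0.24, near the healthy range
```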
Preserved dopaminergic homeostasis and dopamine-related behaviour in hemizygous TH-Cre mice.
Runegaard, Annika H; Jensen, Kathrine L; Fitzpatrick, Ciarán M; Dencker, Ditte; Weikop, Pia; Gether, Ulrik; Rickhag, Mattias
2017-01-01
Cre-driver mouse lines have been extensively used as genetic tools to target and manipulate genetically defined neuronal populations by expression of Cre recombinase under selected gene promoters. This approach has greatly advanced neuroscience but interpretations are hampered by the fact that most Cre-driver lines have not been thoroughly characterized. Thus, a phenotypic characterization is of major importance to reveal potential aberrant phenotypes prior to implementation and usage to selectively inactivate or induce transgene expression. Here, we present a biochemical and behavioural assessment of the dopaminergic system in hemizygous tyrosine hydroxylase (TH)-Cre mice in comparison to wild-type (WT) controls. Our data show that TH-Cre mice display preserved dopaminergic homeostasis with unaltered levels of TH and dopamine as well as unaffected dopamine turnover in striatum. TH-Cre mice also show preserved dopamine transporter expression and function supporting sustained dopaminergic transmission. In addition, TH-Cre mice demonstrate normal responses in basic behavioural paradigms related to dopaminergic signalling including locomotor activity, reward preference and anxiolytic behaviour. Our results suggest that TH-Cre mice represent a valid tool to study the dopamine system, though careful characterization must always be performed to prevent false interpretations following Cre-dependent transgene expression and manipulation of selected neuronal pathways. © 2016 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Battlefield decision aid for acoustical ground sensors with interface to meteorological data sources
NASA Astrophysics Data System (ADS)
Wilson, D. Keith; Noble, John M.; VanAartsen, Bruce H.; Szeto, Gregory L.
2001-08-01
The performance of acoustical ground sensors depends heavily on the local atmospheric and terrain conditions. This paper describes a prototype physics-based decision aid, called the Acoustic Battlefield Aid (ABFA), for predicting these environmental effects. ABFA integrates advanced models for acoustic propagation, atmospheric structure, and array signal processing into a convenient graphical user interface. The propagation calculations are performed in the frequency domain on user-definable target spectra. The solution method involves a parabolic approximation to the wave equation combined with a terrain diffraction model. Sensor performance is characterized with Cramér-Rao lower bounds (CRLBs). The CRLB calculations include randomization of signal energy and wavefront orientation resulting from atmospheric turbulence. Available performance characterizations include signal-to-noise ratio, probability of detection, direction-finding accuracy for isolated receiving arrays, and location-finding accuracy for networked receiving arrays. A suite of integrated tools allows users to create new target descriptions from standard digitized audio files and to design new sensor array layouts. These tools optionally interface with the ARL Database/Automatic Target Recognition (ATR) Laboratory, providing access to an extensive library of target signatures. ABFA also includes a Java-based capability for network access of near real-time data from surface weather stations or forecasts from the Army's Integrated Meteorological System. As an example, the detection footprint of an acoustical sensor, as it evolves over a 13-hour period, is calculated.
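The Cramér-Rao bounds used above for direction-finding accuracy can be illustrated with the standard deterministic-signal bound for bearing estimation on a uniform linear array. This is a simplified stand-in for ABFA's own calculations (which add turbulence effects); the array geometry and SNR below are hypothetical:

```python
import numpy as np

def bearing_crlb_deg(n_sensors, spacing, wavelength, snr_linear, bearing_deg):
    """Root-CRLB on bearing (degrees) for one narrowband plane wave on a
    uniform linear array, deterministic-signal model, complex white noise.

    Phase at sensor n: phi_n = 2*pi*(d/lambda)*n*sin(theta); the Fisher
    information is I(theta) = 2*SNR*sum_n (dphi_n/dtheta)^2.
    """
    theta = np.radians(bearing_deg)
    n = np.arange(n_sensors)
    dphi = 2.0 * np.pi * (spacing / wavelength) * n * np.cos(theta)
    fisher = 2.0 * snr_linear * np.sum(dphi ** 2)
    return np.degrees(np.sqrt(1.0 / fisher))

# Half-wavelength spacing; the bound tightens with more sensors and more SNR
b4 = bearing_crlb_deg(4, 0.5, 1.0, snr_linear=10.0, bearing_deg=20.0)
b8 = bearing_crlb_deg(8, 0.5, 1.0, snr_linear=10.0, bearing_deg=20.0)
```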
Using Micro-Synchrophasor Data for Advanced Distribution Grid Planning and Operations Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Emma; Kiliccote, Sila; McParland, Charles
2014-07-01
This report reviews the potential for distribution-grid phase-angle data that will be available from new micro-synchrophasors (µPMUs) to be utilized in existing distribution-grid planning and operations analysis. This data could augment the current diagnostic capabilities of grid analysis software, used in both planning and operations for applications such as fault location, and provide data for more accurate modeling of the distribution system. µPMUs are new distribution-grid sensors that will advance measurement and diagnostic capabilities and provide improved visibility of the distribution grid, enabling analysis of the grid's increasingly complex loads that include features such as large volumes of distributed generation (DG). Large volumes of DG lead to concerns about continued reliable operation of the grid, due to changing power flow characteristics and active generation with its own protection and control capabilities. Using µPMU data on change in voltage phase angle between two points in conjunction with new and existing distribution-grid planning and operational tools is expected to enable model validation, state estimation, fault location, and renewable resource/load characterization. Our findings include: data measurement is outstripping the processing capabilities of planning and operational tools; not every tool can visualize a voltage phase-angle measurement to the degree of accuracy measured by advanced sensors, and the degree of accuracy in measurement required for the distribution grid is not defined; solving methods cannot handle the high volumes of data generated by modern sensors, so new models and solving methods (such as graph trace analysis) are needed; standardization of sensor-data communications platforms in planning and applications tools would allow integration of different vendors' sensors and advanced measurement devices.
In addition, data from advanced sources such as µPMUs could be used to validate models to improve/ensure accuracy, providing information on normally estimated values such as underground conductor impedance, and characterization of complex loads. Although the input of high-fidelity data to existing tools will be challenging, µPMU data on phase angle (as well as other data from advanced sensors) will be useful for basic operational decisions that are based on a trend of changing data.
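The report's premise that voltage phase-angle differences carry operational information can be illustrated with the classic two-bus power-flow relation. A toy sketch with hypothetical per-unit values, not part of the µPMU toolchain described in the report:

```python
import math

def line_power_flow_mw(v1_pu, v2_pu, angle1_deg, angle2_deg,
                       x_pu, s_base_mva=100.0):
    """Active power across a lossless line from the two-bus equation
    P = (V1*V2/X) * sin(delta1 - delta2), in per-unit, scaled to MW."""
    delta = math.radians(angle1_deg - angle2_deg)
    return (v1_pu * v2_pu / x_pu) * math.sin(delta) * s_base_mva

# Hypothetical feeder: a 0.5 degree angle difference across X = 0.1 pu
p = line_power_flow_mw(1.0, 0.99, angle1_deg=0.5, angle2_deg=0.0, x_pu=0.1)
```

Distribution-level angle differences are tiny (fractions of a degree), which is why the report stresses measurement accuracy: here a half-degree difference already corresponds to several MW.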
NASA Astrophysics Data System (ADS)
Budde, M. E.; Rowland, J.; Anthony, M.; Palka, S.; Martinez, J.; Hussain, R.
2017-12-01
The U.S. Geological Survey (USGS) supports the use of Earth observation data for food security monitoring through its role as an implementing partner of the Famine Early Warning Systems Network (FEWS NET). The USGS Earth Resources Observation and Science (EROS) Center has developed tools designed to aid food security analysts in developing assumptions of agro-climatological outcomes. There are four primary steps to developing agro-climatology assumptions: 1) understanding the climatology, 2) evaluating current climate modes, 3) interpretation of forecast information, and 4) incorporation of monitoring data. Analysts routinely forecast outcomes well in advance of the growing season, which relies on knowledge of climatology. A few months prior to the growing season, analysts can assess large-scale climate modes that might influence seasonal outcomes. Within two months of the growing season, analysts can evaluate seasonal forecast information as indicators. Once the growing season begins, monitoring data, based on remote sensing and field information, can characterize the start of season and remain integral monitoring tools throughout the duration of the season. Each subsequent step in the process can lead to modifications of the original climatology assumption. To support such analyses, we have created an agro-climatology analysis tool that characterizes each step in the assumption building process. Satellite-based rainfall and normalized difference vegetation index (NDVI)-based products support both the climatology and monitoring steps, sea-surface temperature data and knowledge of the global climate system inform the climate modes, and precipitation forecasts at multiple scales support the interpretation of forecast information. Organizing these data for a user-specified area provides a valuable tool for food security analysts to better formulate agro-climatology assumptions that feed into food security assessments.
We have also developed a knowledge base for over 80 countries that provides rainfall and NDVI-based products, including annual and seasonal summaries, historical anomalies, coefficient of variation, and number of years below 70% of annual or seasonal averages. These products provide a quick look for analysts to assess the agro-climatology of a country.
Toward a Personalized Approach in Prebiotics Research.
Dey, Moul
2017-01-26
Recent characterization of the human microbiome and its influences on health have led to dramatic conceptual shifts in dietary bioactives research. Prebiotic foods that include many dietary fibers and resistant starches are perceived as beneficial for maintaining a healthy gut microbiota. This article brings forward some current perspectives in prebiotic research to discuss why reporting of individual variations in response to interventions will be important to discern suitability of prebiotics as a disease prevention tool.
NASA Technical Reports Server (NTRS)
1991-01-01
Various papers on supercomputing are presented. The general topics addressed include: program analysis/data dependence, memory access, distributed memory code generation, numerical algorithms, supercomputer benchmarks, latency tolerance, parallel programming, applications, processor design, networks, performance tools, mapping and scheduling, characterization affecting performance, parallelism packaging, computing climate change, combinatorial algorithms, hardware and software performance issues, system issues. (No individual items are abstracted in this volume)
VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox
NASA Astrophysics Data System (ADS)
Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.
2016-12-01
VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
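Of the three families of metrics above, the derivative-based elementary effects (Morris approach) are the simplest to sketch. The following is a minimal one-at-a-time illustration of the idea, not VARS-TOOL's own implementation (which uses full Morris trajectory designs within the VARS framework):

```python
import numpy as np

def morris_mu_star(model, n_params, n_trajectories=50, delta=0.1, seed=0):
    """Mean absolute elementary effect (mu*) per parameter on [0, 1]^k.

    For each random base point, perturb one parameter at a time by delta
    and record EE_i = (f(x + delta*e_i) - f(x)) / delta.  Large mu*
    flags influential parameters; a screening sketch only.
    """
    rng = np.random.default_rng(seed)
    effects = np.zeros((n_trajectories, n_params))
    for t in range(n_trajectories):
        x = rng.uniform(0.0, 1.0 - delta, size=n_params)
        fx = model(x)
        for i in range(n_params):
            xp = x.copy()
            xp[i] += delta
            effects[t, i] = (model(xp) - fx) / delta
    return np.abs(effects).mean(axis=0)

# Toy model: strongly driven by x0, weakly by x1, and ignores x2 entirely
mu = morris_mu_star(lambda x: 2.0 * x[0] + 0.1 * x[1], n_params=3)
```

For this linear toy model every elementary effect equals the corresponding coefficient, so mu* ranks the parameters exactly; nonlinear models additionally need the spread of the effects to diagnose interactions.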
NASA Technical Reports Server (NTRS)
Fusaro, Robert L. (Editor); Achenbach, J. D. (Editor)
1993-01-01
The present volume on tribological materials and NDE discusses liquid lubricants for advanced aircraft engines, a liquid lubricant for space applications, solid lubricants for aeronautics, and thin solid-lubricant films in space. Attention is given to the science and technology of NDE, tools for an NDE engineering base, experimental techniques in ultrasonics for NDE and material characterization, and laser ultrasonics. Topics addressed include thermal methods of NDE and quality control, digital radiography in the aerospace industry, materials characterization by ultrasonic methods, and NDE of ceramics and ceramic composites. Also discussed are smart materials and structures, intelligent processing of materials, implementation of NDE technology on flight structures, and solid-state weld evaluation.
Timmins, Peter; Desai, Divyakant; Chen, Wei; Wray, Patrick; Brown, Jonathan; Hanley, Sarah
2016-08-01
Approaches to characterizing and developing understanding of the mechanisms that control the release of drugs from hydrophilic matrix tablets are reviewed. While historical context is provided and direct physical characterization methods are described, recent advances, including the role of percolation thresholds and the application of magnetic resonance and other spectroscopic imaging techniques, are considered. The influence of polymer and dosage form characteristics is reviewed. The utility of mathematical modeling is described. Finally, it is proposed how all the information derived from applying the mechanistic understanding developed with these tools can be brought together to develop a robust and reliable hydrophilic matrix extended-release tablet formulation.
On the Use of Accelerated Test Methods for Characterization of Advanced Composite Materials
NASA Technical Reports Server (NTRS)
Gates, Thomas S.
2003-01-01
A rational approach to the problem of accelerated testing for material characterization of advanced polymer matrix composites is discussed. The experimental and analytical methods provided should be viewed as a set of tools useful in the screening of material systems for long-term engineering properties in aerospace applications. Consideration is given to long-term exposure in extreme environments that include elevated temperature, reduced temperature, moisture, oxygen, and mechanical load. Analytical formulations useful for predictive models that are based on the principles of time-based superposition are presented. The need for reproducible mechanisms, indicator properties, and real-time data are outlined as well as the methodologies for determining specific aging mechanisms.
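One widely used embodiment of the time-based superposition principles mentioned above is the WLF time-temperature shift factor. A minimal sketch using the conventional "universal" constants; the abstract does not specify which superposition model the author adopts, so this is illustrative only:

```python
def wlf_shift_factor(temp_c, ref_temp_c, c1=17.44, c2=51.6):
    """log10 of the WLF shift factor a_T.

    log10(a_T) = -c1*(T - Tref) / (c2 + T - Tref).  The 'universal'
    constants c1, c2 nominally apply when Tref is the glass transition
    temperature; real materials need fitted constants.
    """
    dt = temp_c - ref_temp_c
    return -c1 * dt / (c2 + dt)

# Time at an elevated test temperature maps to a much longer equivalent
# time at the reference temperature: t_ref = t_test / a_T.
log_at = wlf_shift_factor(temp_c=120.0, ref_temp_c=100.0)
accel = 10.0 ** (-log_at)    # acceleration factor gained by testing hotter
```

This is the sense in which an accelerated test at 120 °C can stand in for a far longer exposure at 100 °C, provided the degradation mechanism stays the same, which is exactly the "reproducible mechanisms" caveat the abstract raises.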
Genomics of foodborne pathogens for microbial food safety.
Allard, Marc W; Bell, Rebecca; Ferreira, Christina M; Gonzalez-Escalona, Narjol; Hoffmann, Maria; Muruvanda, Tim; Ottesen, Andrea; Ramachandran, Padmini; Reed, Elizabeth; Sharma, Shashi; Stevens, Eric; Timme, Ruth; Zheng, Jie; Brown, Eric W
2018-02-01
Whole genome sequencing (WGS) has been broadly used to provide detailed characterization of foodborne pathogens. These genomes for diverse species including Salmonella, Escherichia coli, Listeria, Campylobacter and Vibrio have provided great insight into the genetic make-up of these pathogens. Numerous government agencies, industry and academia have developed new applications in food safety using WGS approaches such as outbreak detection and characterization, source tracking, determining the root cause of a contamination event, profiling of virulence and pathogenicity attributes, antimicrobial resistance monitoring, quality assurance for microbiology testing, as well as many others. The future looks bright for additional applications that come with the new technologies and tools in genomics and metagenomics. Published by Elsevier Ltd.
The Advancement of Public Awareness, Concerning TRU Waste Characterization, Using a Virtual Document
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, T. B.; Burns, T. P.; Estill, W. G.
2002-02-28
Building public trust and confidence through openness is a goal of the DOE Carlsbad Field Office for the Waste Isolation Pilot Plant (WIPP). The objective of the virtual document described in this paper is to give the public an overview of the waste characterization steps, an understanding of how waste characterization instrumentation works, and the type and amount of data generated from a batch of drums. The document is intended to be published on a web page and/or distributed at public meetings on CDs. Users may gain as much information as they desire regarding the transuranic (TRU) waste characterization program, starting at the highest level requirements (drivers) and progressing to more and more detail regarding how the requirements are met. Included are links to: drivers (which include laws, permits and DOE Orders); various characterization steps required for transportation and disposal under WIPP's Hazardous Waste Facility Permit; physical/chemical basis for each characterization method; types of data produced; and quality assurance process that accompanies each measurement. Examples of each type of characterization method in use across the DOE complex are included. The original skeleton of the document was constructed in a PowerPoint presentation and included descriptions of each section of the waste characterization program. This original document had a brief overview of Acceptable Knowledge, Non-Destructive Examination, Non-Destructive Assay, Small Quantity sites, and the National Certification Team. A student intern was assigned the project of converting the document to a virtual format and to discuss each subject in depth. The resulting product is a fully functional virtual document that works in a web browser and functions like a web page. All documents that were referenced, linked to, or associated, are included on the virtual document's CD. WIPP has been engaged in a variety of Hazardous Waste Facility Permit modification activities.
During the public meetings, discussion centered on proposed changes to the characterization program. The philosophy behind the virtual document is to show the characterization process as a whole, rather than as isolated parts. In addition to public meetings, other uses for the information might be as a training tool for new employees at the WIPP facility to show them where their activities fit into the overall scheme, as well as an employee review to help prepare for waste certification audits.
Visualization of metallodrugs in single cells by secondary ion mass spectrometry imaging.
Wu, Kui; Jia, Feifei; Zheng, Wei; Luo, Qun; Zhao, Yao; Wang, Fuyi
2017-07-01
Secondary ion mass spectrometry, including nanoscale secondary ion mass spectrometry (NanoSIMS) and time-of-flight secondary ion mass spectrometry (ToF-SIMS), has emerged as a powerful tool for biological imaging, especially for single cell imaging. SIMS imaging can provide information on subcellular distribution of endogenous and exogenous chemicals, including metallodrugs, from membrane through to cytoplasm and nucleus without labeling, and with high spatial resolution and chemical specificity. In this mini-review, we summarize recent progress in the field of SIMS imaging, particularly in the characterization of the subcellular distribution of metallodrugs. We anticipate that the SIMS imaging method will be widely applied to visualize subcellular distributions of drugs and drug candidates in single cells, exerting significant influence on early drug evaluation and metabolism in medicinal and pharmaceutical chemistry.
Podbielska, Maria; Levery, Steven B; Hogan, Edward L
2011-01-01
A family of neutral glycosphingolipids containing a 3-O-acetyl-sphingosine galactosylceramide (3-SAG) has been characterized. Seven new derivatives of galactosylceramide (GalCer), designated as fast-migrating cerebrosides (FMCs) by TLC retention factor, have been identified. The simplest compounds – FMC-1 and FMC-2 – of this series have been characterized as the 3-SAG containing nonhydroxy and hydroxy fatty acyl, respectively. The next two – FMC-3 and FMC-4 – add 6-O-acetyl-galactose and the most complex glycosphingolipids, FMC-5, -6 and -7, are 2,3,4,6-tetra-O-acetyl-3-SAG. These hydrophobic myelin lipid biomarkers coappear with GalCer during myelinogenesis and disappear along with GalCer in de- or dys-myelinating disorders. Myelin lipid antigens, including FMCs, are keys to myelin biology, opening the possibility of new and novel immune modulatory tools for treatment of autoimmune diseases including multiple sclerosis. PMID:22701512
Discovery and Characterization of Small Planets from the K2 Mission
NASA Astrophysics Data System (ADS)
Howard, Andrew
The K2 mission offers a unique opportunity to find substantial numbers of new transiting planets with host stars much brighter than those found by Kepler -- ideal targets for measurements of planetary atmospheres (with HST and JWST) and planetary masses and densities (with Doppler spectroscopy). The K2 data present unique challenges compared to the Kepler mission. We propose to build on our team's demonstrated successes with the Kepler photometry and in finding exciting new planetary systems in K2 data. We will search for transiting planets in photometry of all stellar K2 targets in each of the first three K2 Campaigns (Fields C0, C1, and C2). We will adapt and enhance our TERRA transit search tool to detect transits in the K2 photometry, and we will assess candidate transiting planets with a suite of K2-specific vetting tools including pixel-level inspection for transit localization, centroid motion tests, and secondary eclipse searches. We will publicly release TERRA and our pixel-level diagnostics for use by other teams in future analyses of K2 and TESS photometry. We will also develop FreeBLEND, a free and open source tool to robustly quantify the probability of false positive detections for individual planet candidates given reduced photometry, constraints from the K2 pixel-level data, adaptive optics imaging, high-resolution stellar spectroscopy, and radial velocity measurements. This tool will be similar to BLENDER for Kepler, but (a) more computationally efficient and usable on the wide range of galactic latitudes that K2 samples and (b) available for use by the entire community. With these tools we will publicly release high-quality (low-noise) reduced photometry of the K2 target stars as well as catalogs of the transiting planets. Host stars in our planet catalogs will be characterized by medium and high-resolution spectroscopy (as appropriate) to yield accurate planet parameters.
For a handful of planets in the sample, we will measure masses using Keck-HIRES to constrain the planets' bulk densities and compositions. This project is relevant to the ADA Program as it focuses on archived K2 mission data. It supports NASA's strategic goals to characterize the diverse population of small exoplanets, identify targets that maximize JWST's exoplanet science yield, and develop community tools for use with K2, TESS, and other future missions.
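The core of a transit search of the kind TERRA performs can be illustrated with a simple box-model depth scan over trial periods. The sketch below is a minimal stand-in on synthetic data with toy parameters (3.0-day period, 1% depth), not the TERRA pipeline itself:

```python
import numpy as np

def box_search(time, flux, trial_periods, duration):
    """Score each trial period by phase-folding the light curve and measuring
    the depth of the deepest box of width `duration`; return the best period."""
    baseline = np.median(flux)
    best_score, best_period = 0.0, None
    for period in trial_periods:
        phase = (time % period) / period
        width = duration / period
        score = 0.0
        # Slide a box across phase in half-width steps and record its depth.
        for p0 in np.arange(0.0, 1.0 - width, width / 2.0):
            in_box = (phase >= p0) & (phase < p0 + width)
            if in_box.sum() >= 3:
                score = max(score, baseline - flux[in_box].mean())
        if score > best_score:
            best_score, best_period = score, period
    return best_period

# Synthetic K2-like light curve: ~80-day campaign, 30-min cadence,
# one planet with a 3.0-day period and a 1% transit depth.
rng = np.random.default_rng(0)
t = np.arange(0.0, 80.0, 0.02)
f = 1.0 + 5e-4 * rng.standard_normal(t.size)
f[(t % 3.0) < 0.1] -= 0.01
periods = np.arange(1.0, 6.0, 0.01)
best = box_search(t, f, periods, duration=0.1)
print(round(best, 2))
```

A real search would use a matched-filter statistic (e.g., box least squares) and detrend systematics first; the brute-force scan above only conveys the phase-fold-and-score idea.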
Games, Patrícia Dias; daSilva, Elói Quintas Gonçalves; Barbosa, Meire de Oliveira; Almeida-Souza, Hebréia Oliveira; Fontes, Patrícia Pereira; deMagalhães, Marcos Jorge; Pereira, Paulo Roberto Gomes; Prates, Maura Vianna; Franco, Gloria Regina; Faria-Campos, Alessandra; Campos, Sérgio Vale Aguiar; Baracat-Pereira, Maria Cristina
2016-12-15
Antimicrobial peptides from plants exhibit mechanisms of action that differ from those of conventional defense agents. They are under-explored but have potential as commercial antimicrobials. Bell pepper leaves ('Magali R') are discarded after the fruit is harvested and are a source of bioactive peptides. This work reports the isolation, by peptidomics tools, and the identification and partial characterization, by computational tools, of an antimicrobial peptide from bell pepper leaves, and demonstrates the usefulness of database records and in silico analysis for the study of plant peptides intended for biotechnological uses. Aqueous extracts from leaves were enriched in peptides by salt fractionation and ultrafiltration. An antimicrobial peptide was isolated by tandem chromatographic procedures. Mass spectrometry, automated peptide sequencing, and bioinformatics tools were used alternately for the identification and partial characterization of the Hevein-like peptide, named HEV-CANN. The computational tools that assisted in the identification of the peptide included BlastP, PSI-Blast, ClustalOmega, PeptideCutter, and ProtParam; conventional protein databases (DBs) such as Mascot, Protein-DB, GenBank-DB, RefSeq, Swiss-Prot, and UniProtKB; peptide-specific DBs such as Amper, APD2, CAMP, LAMPs, and PhytAMP; other tools included in ExPASy for Proteomics; The Bioactive Peptide Databases; and The Pepper Genome Database. The HEV-CANN sequence comprises 40 amino acid residues, 4258.8 Da, a theoretical pI of 8.78, and four disulfide bonds. It was stable, and it inhibited the growth of phytopathogenic bacteria and a fungus. HEV-CANN presents a chitin-binding domain in its sequence. HEV-CANN showed high identity and positive alignment in various databases, but never complete identity, suggesting that it is produced by ribosomal synthesis, in accordance with its constitutive nature.
Computational tools for proteomics and their databases are not adjusted for short sequences, which hampered HEV-CANN identification. Adjusting the statistical tests used with large protein databases is one alternative for achieving significant identification of peptides. The development of specific DBs for plant antimicrobial peptides, containing peptide sequences, functional genomic data, structural motifs and domains of molecules, functional domains, and peptide-biomolecule interactions, is valuable and necessary.
Advanced MRI in Multiple Sclerosis: Current Status and Future Challenges
Fox, Robert J.; Beall, Erik; Bhattacharyya, Pallab; Chen, Jacqueline; Sakaie, Ken
2011-01-01
Magnetic resonance imaging (MRI) has rapidly become a leading research tool in the study of multiple sclerosis (MS). Conventional imaging is useful in diagnosis and management of the inflammatory stages of MS, but has limitations in describing the degree of tissue injury as well as the cause of progressive disability seen in the later stages of disease. Advanced MRI techniques hold promise to fill this void. Magnetization transfer imaging is a widely available technique that can characterize demyelination and may be useful in measuring putative remyelinating therapies. Diffusion tensor imaging describes the three-dimensional diffusion of water and holds promise in characterizing neurodegeneration and putative neuroprotective therapies. Spectroscopy measures the imbalance of cellular metabolites and could help unravel the pathogenesis of neurodegeneration in MS. Functional MRI (fMRI) can be used to understand the functional consequences of MS injury, including the impact on cortical function and compensatory mechanisms. These imaging tools hold great promise to increase our understanding of MS pathogenesis and provide greater insight into the efficacy of new MS therapies. PMID:21439446
Phase imaging of mechanical properties of live cells (Conference Presentation)
NASA Astrophysics Data System (ADS)
Wax, Adam
2017-02-01
The mechanisms by which cells respond to mechanical stimuli are essential for cell function yet not well understood. Many rheological tools have been developed to characterize cellular viscoelastic properties but these typically require direct mechanical contact, limiting their throughput. We have developed a new approach for characterizing the organization of subcellular structures using a label free, noncontact, single-shot phase imaging method that correlates to measured cellular mechanical stiffness. The new analysis approach measures refractive index variance and relates it to disorder strength. These measurements are compared to cellular stiffness, measured using the same imaging tool to visualize nanoscale responses to flow shear stimulus. The utility of the technique is shown by comparing shear stiffness and phase disorder strength across five cellular populations with varying mechanical properties. An inverse relationship between disorder strength and shear stiffness is shown, suggesting that cell mechanical properties can be assessed in a format amenable to high throughput studies using this novel, non-contact technique. Further studies will be presented which include examination of mechanical stiffness in early carcinogenic events and investigation of the role of specific cellular structural proteins in mechanotransduction.
NASA Astrophysics Data System (ADS)
Gaitan, S.; ten Veldhuis, J. A. E.
2015-06-01
Cities worldwide are challenged by increasing urban flood risks. Precise and realistic measures are required to reduce flooding impacts. However, currently implemented sewer and topographic models do not provide realistic predictions of local flooding occurrence during heavy rain events. Assessing other factors, such as spatially distributed rainfall, socioeconomic characteristics, and social sensing, may help to explain the probability and impacts of urban flooding. Several spatial datasets have recently been made available in the Netherlands, including rainfall-related incident reports made by citizens, spatially distributed rain depths, semi-distributed socioeconomic information, and building age. The potential of these data to explain the occurrence of rainfall-related incidents has not yet been examined. Multivariate analysis tools for describing communities and environmental patterns have previously been developed and used in the field of ecology. The objective of this paper is to outline opportunities for these tools to explore urban flood risk patterns in the datasets mentioned. To that end, a cluster analysis is performed. Results indicate that the incidence of rainfall-related impacts is higher in areas characterized by older infrastructure and higher population density.
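A cluster analysis of this kind can be as simple as k-means over standardized neighborhood features. The sketch below uses hypothetical features (infrastructure age, population density) and synthetic data, not the Dutch datasets described in the abstract:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means with farthest-point initialization: assign each point
    to its nearest centroid, then recompute centroids as cluster means."""
    centroids = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[d.argmax()])
    centroids = np.array(centroids)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Hypothetical neighborhood features: [infrastructure age (years),
# population density (1000 persons/km^2)], standardized before clustering.
rng = np.random.default_rng(1)
old_dense = rng.normal([80.0, 9.0], [5.0, 1.0], size=(50, 2))
new_sparse = rng.normal([20.0, 3.0], [5.0, 1.0], size=(50, 2))
X = np.vstack([old_dense, new_sparse])
X = (X - X.mean(axis=0)) / X.std(axis=0)
labels, _ = kmeans(X, k=2)
# The two synthetic neighborhood types should separate cleanly.
print(set(labels[:50].tolist()), set(labels[50:].tolist()))
```

Ecological ordination methods (the tools the paper points to) go further than k-means, but the standardize-then-cluster step is common to both.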
X-ray photoelectron spectroscopy for characterization of wood surfaces in adhesion studies
James F. Beecher; Charles R. Frihart
2005-01-01
X-ray photoelectron spectroscopy (XPS) is one of a set of tools that have been used to characterize wood surfaces. Among the advantages of XPS are surface sensitivity, identification of nearly all elements, and frequently, discrimination of bonding states. For these reasons, XPS seemed to be an appropriate tool to help explain the differences in bond strength under wet...
Monte Carlo simulation of the resolution volume for the SEQUOIA spectrometer
NASA Astrophysics Data System (ADS)
Granroth, G. E.; Hahn, S. E.
2015-01-01
Monte Carlo ray tracing simulations of direct geometry spectrometers have been particularly useful in instrument design and characterization. However, these tools can also be useful for experiment planning and analysis. To this end, the McStas Monte Carlo ray tracing model of SEQUOIA, the fine-resolution Fermi chopper spectrometer at the Spallation Neutron Source (SNS) of Oak Ridge National Laboratory (ORNL), has been modified to include the time-of-flight resolution sample and detector components. With these components, the resolution ellipsoid can be calculated for any detector pixel and energy bin of the instrument. The simulation is split into two pieces. First, the incident beamline up to the sample is simulated for 1 × 10^11 neutron packets (4 days on 30 cores). This provides a virtual source for the backend that includes the resolution sample and monitor components. Next, a series of detector and energy pixels are computed in parallel. It takes on the order of 30 s to calculate a single resolution ellipsoid on a single core. Python scripts have been written to transform the ellipsoid into the space of an oriented single crystal and to characterize the ellipsoid in various ways. Though this tool is under development as a planning tool, we have successfully used it to provide the resolution function for convolution with theoretical models. Specifically, theoretical calculations of the spin waves in YFeO3 were compared to measurements taken on SEQUOIA. Though the overall features of the spectra can be explained while neglecting resolution effects, the variation in intensity of the modes is well described once the resolution is included. As this was a single sharp mode, the simulated half-intensity value of the resolution ellipsoid was used to provide the resolution width. A description of the simulation, its use, and paths forward for this technique will be discussed.
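The final convolution step can be sketched as smearing a sharp theoretical mode with a Gaussian whose width comes from the simulated resolution ellipsoid. The 2 meV FWHM and 40 meV mode energy below are illustrative assumptions, not SEQUOIA or YFeO3 values:

```python
import numpy as np

def gaussian_kernel(x, fwhm):
    """Normalized Gaussian resolution kernel with the given FWHM."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

# A single sharp (delta-like) magnon mode at E0 = 40 meV on a 0.1 meV grid.
E = np.arange(20.0, 60.0, 0.1)
model = np.zeros_like(E)
model[np.argmin(np.abs(E - 40.0))] = 1.0

# Resolution width taken from the simulated ellipsoid (assumed 2 meV FWHM).
kernel = gaussian_kernel(np.linspace(-5.0, 5.0, 101), fwhm=2.0)
smeared = np.convolve(model, kernel, mode="same")

# The convolved peak keeps its position but acquires the instrument width.
peak_E = float(E[smeared.argmax()])
print(round(peak_E, 1))
```

In practice the resolution width varies with energy transfer and detector pixel, so the kernel would be recomputed per pixel from the simulated ellipsoids rather than held fixed as here.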
Advanced Power System Analysis Capabilities
NASA Technical Reports Server (NTRS)
1997-01-01
As a continuing effort to assist in the design and characterization of space power systems, the NASA Lewis Research Center's Power and Propulsion Office developed a powerful computerized analysis tool called System Power Analysis for Capability Evaluation (SPACE). This year, SPACE was used extensively in analyzing detailed operational timelines for the International Space Station (ISS) program. SPACE was developed to analyze the performance of space-based photovoltaic power systems such as that being developed for the ISS. It is a highly integrated tool that combines numerous factors in a single analysis, providing a comprehensive assessment of the power system's capability. Factors particularly critical to the ISS include the orientation of the solar arrays toward the Sun and the shadowing of the arrays by other portions of the station.
The Air Quality Model Evaluation International Initiative ...
This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Humpolickova, Jana; Mejdrová, Ivana; Matousova, Marika; Nencka, Radim; Boura, Evzen
2017-01-12
The lipid kinase phosphatidylinositol 4-kinase IIIβ (PI4KB) is an essential host factor for many positive-sense single-stranded RNA (+RNA) viruses, including the human pathogens hepatitis C virus (HCV), severe acute respiratory syndrome (SARS) coronavirus, coxsackieviruses, and rhinoviruses. Inhibitors of PI4KB are considered to be potential broad-spectrum virostatics, and it is therefore critical to develop a biochemical understanding of the kinase. Here, we present highly potent and selective fluorescent inhibitors that we show to be useful chemical biology tools, especially in the determination of dissociation constants. Moreover, we show that the coumarin-labeled inhibitor can be used to image PI4KB in cells using fluorescence-lifetime imaging microscopy (FLIM).
Whole-genome CNV analysis: advances in computational approaches.
Pirooznia, Mehdi; Goes, Fernando S; Zandi, Peter P
2015-01-01
Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development.
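As a minimal illustration of the read-depth class of CNV callers this review surveys, the sketch below bins coverage, normalizes by the median (diploid) depth, rounds to integer copy number, and merges consecutive non-diploid windows. Real tools add GC correction, mappability filters, and statistical segmentation; the data here are simulated:

```python
import numpy as np

def call_cnvs(depth):
    """Round normalized per-window read depth to integer copy number, then
    merge consecutive windows sharing the same non-diploid copy number."""
    copy = np.rint(2.0 * depth / np.median(depth)).astype(int)
    calls = []
    i = 0
    while i < len(copy):
        if copy[i] != 2:
            j = i
            while j < len(copy) and copy[j] == copy[i]:
                j += 1
            calls.append((i, j, int(copy[i])))  # [start, end) windows, copy no.
            i = j
        else:
            i += 1
    return calls

# Simulated 100-window chromosome: a heterozygous deletion and a duplication.
rng = np.random.default_rng(2)
depth = rng.normal(30.0, 1.0, 100)   # diploid baseline ~30x coverage
depth[20:30] *= 0.5                  # copy number 1 (deletion)
depth[60:70] *= 1.5                  # copy number 3 (duplication)
print(call_cnvs(depth))
```

Read depth is only one of the signal types discussed; paired-end, split-read, and assembly-based callers detect events (e.g., balanced or small CNVs) that depth alone misses.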
Initial Assessment of X-Ray Computer Tomography image analysis for material defect microstructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kane, Joshua James; Windes, William Enoch
2016-06-01
The original development work leading to this report was focused on the nondestructive three-dimensional (3-D) characterization of nuclear graphite as a means to better understand the nature of the inherent pore structure. The pore structure of graphite, and its evolution under various environmental factors such as irradiation, mechanical stress, and oxidation, plays an important role in the material's observed properties and characteristics. If we are to transition from an empirical understanding of graphite behavior to a truly predictive mechanistic understanding, the pore structure must be well characterized and understood. As the pore structure within nuclear graphite is highly interconnected and truly 3-D in nature, 3-D characterization techniques are critical. While 3-D characterization has been an excellent tool for graphite pore characterization, it is applicable to a broad range of materials systems over many length scales. Given the wide range of applications and the highly quantitative nature of the tool, it is quite surprising to discover how few materials researchers appreciate how valuable a tool 3-D image processing and analysis can be. Ultimately, this report is intended to encourage broader use of 3-D image processing and analysis in materials science and engineering applications, more specifically nuclear-related materials applications, by providing interested readers with enough familiarity to explore its vast potential in identifying microstructure changes. To encourage this broader use, the report is divided into two main sections. Section 2 provides an overview of some of the key principles and concepts needed to extract a wide variety of quantitative metrics from a 3-D representation of a material microstructure. The discussion includes a brief overview of segmentation methods, connected components, morphological operations, distance transforms, and skeletonization.
Section 3 focuses on the application of concepts from Section 2 to relevant materials at Idaho National Laboratory. In this section, image analysis examples featuring nuclear graphite are discussed in detail. Additionally, example analyses from the Transient Reactor Test Facility low-enriched uranium conversion, Advanced Gas Reactor-like compacts, and tristructural isotropic (TRISO) particles are shown to give a broader perspective of the applicability to relevant materials of interest.
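One of the Section 2 concepts, connected-component labeling, can be sketched in 2-D with a breadth-first flood fill. The report's pore analyses are 3-D (and production work would use dedicated image-processing libraries), but the idea is identical; the toy "pore" image below is illustrative:

```python
import numpy as np
from collections import deque

def label_components(binary):
    """4-connected component labeling of a 2-D binary array via BFS flood fill."""
    labels = np.zeros(binary.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(binary)):
        if labels[seed]:
            continue                      # already swallowed by an earlier fill
        current += 1
        labels[seed] = current
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < binary.shape[0] and 0 <= cc < binary.shape[1]
                        and binary[rr, cc] and not labels[rr, cc]):
                    labels[rr, cc] = current
                    queue.append((rr, cc))
    return labels, current

# Toy image: two separate "pores" in a solid matrix (1 = pore, 0 = solid).
img = np.array([[1, 1, 0, 0],
                [0, 1, 0, 1],
                [0, 0, 0, 1],
                [0, 0, 0, 1]])
labels, n = label_components(img)
print(n)
```

Once each pore carries a unique label, per-pore metrics (volume, surface area, connectivity) follow directly, which is exactly the kind of quantitative microstructure characterization the report advocates.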
Computer-aided tracking and characterization of homicides and sexual assaults (CATCH)
NASA Astrophysics Data System (ADS)
Kangas, Lars J.; Terrones, Kristine M.; Keppel, Robert D.; La Moria, Robert D.
1999-03-01
When a serial offender strikes, it usually means that the investigation is unprecedented for that police agency. The volume of incoming leads and pieces of information in the case(s) can be overwhelming, as evidenced by the thousands of leads gathered in the Ted Bundy Murders, Atlanta Child Murders, and the Green River Murders. Serial cases can be long-term investigations in which the suspect remains unknown and continues to perpetrate crimes. With state and local murder investigative systems beginning to crop up, it will become important to manage that information in a timely and efficient way by developing computer programs to assist in that task. One vital function will be to compare violent crime cases from different jurisdictions so investigators can approach the investigation knowing that similar cases exist. CATCH (Computer Aided Tracking and Characterization of Homicides) is being developed to assist crime investigations by assessing likely characteristics of unknown offenders, by relating a specific crime case to other cases, and by providing a tool for clustering similar cases that may be attributed to the same offenders. CATCH is a collection of tools that assist the crime analyst in the investigation process by providing advanced data mining and visualization capabilities. These tools include clustering maps, query tools, geographic maps, timelines, etc. Each tool is designed to give the crime analyst a different view of the case data. The clustering tools in CATCH are based on artificial neural networks (ANNs). The ANNs learn to cluster similar cases from approximately 5000 murders and 3000 sexual assaults residing in a database. The clustering algorithm is applied to parameters describing modus operandi (MO), signature characteristics of the offenders, and other parameters describing the victim and offender.
The proximity of cases within a two-dimensional representation of the clusters allows the analyst to identify similar or serial murders and sexual assaults.
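An ANN-based clustering map of the kind described, where proximity on a low-dimensional map signals case similarity, can be sketched as a one-dimensional self-organizing map. The feature vectors below are hypothetical numeric stand-ins for MO/signature parameters, not CATCH's actual encoding:

```python
import numpy as np

def train_som(X, n_units=10, epochs=200, lr0=0.5, radius0=3.0, seed=0):
    """1-D self-organizing map: each sample pulls its best-matching unit
    (and map neighbors, within a shrinking radius) toward itself."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n_units, X.shape[1]))
    units = np.arange(n_units)
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac) + 0.01        # decaying learning rate
        radius = radius0 * (1.0 - frac) + 0.5  # shrinking neighborhood
        for x in X[rng.permutation(len(X))]:
            bmu = np.linalg.norm(W - x, axis=1).argmin()
            h = np.exp(-((units - bmu) ** 2) / (2.0 * radius ** 2))
            W += lr * h[:, None] * (x - W)
    return W

def best_units(X, W):
    """Map index of each sample's best-matching unit."""
    return np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2).argmin(axis=1)

# Hypothetical case features: two "offenders" with distinct MO signatures.
rng = np.random.default_rng(4)
series_a = rng.normal(2.0, 0.3, size=(30, 4))
series_b = rng.normal(-2.0, 0.3, size=(30, 4))
W = train_som(np.vstack([series_a, series_b]))
bmu_a, bmu_b = best_units(series_a, W), best_units(series_b, W)
# Cases from the same "offender" land on the same region of the map.
print(sorted(set(bmu_a.tolist())), sorted(set(bmu_b.tolist())))
```

CATCH's maps are two-dimensional, which is what makes visual inspection of neighboring cases practical for an analyst; the 1-D version above only demonstrates the training rule.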
SMARTE'S SITE CHARACTERIZATION TOOL
Site Characterization involves collecting environmental data to evaluate the nature and extent of contamination. Environmental data could consist of chemical analyses of soil, sediment, water or air samples. Typically site characterization data are statistically evaluated for thr...
New methodology to baseline and match AME polysilicon etcher using advanced diagnostic tools
NASA Astrophysics Data System (ADS)
Poppe, James; Shipman, John; Reinhardt, Barbara E.; Roussel, Myriam; Hedgecock, Raymond; Fonda, Arturo
1999-09-01
As process controls tighten in the semiconductor industry, the need to understand the variables that determine system performance becomes more important. For plasma etch systems, process success depends on the control of key parameters such as vacuum integrity, pressure, gas flows, and RF power. It is imperative to baseline, monitor, and control these variables. This paper presents an overview of the methods and tools used by the Motorola BMC fabrication facility to characterize an Applied Materials polysilicon etcher. Tool performance data obtained from our traditional measurement techniques are limited in their scope and do not provide a complete picture of ultimate tool performance. Presently, the BMC traditional characterization tools provide a snapshot of the static operation of the equipment under test (EUT); however, the dynamic performance cannot be fully evaluated without the aid of specialized diagnostic equipment. To provide a complete system baseline evaluation of the polysilicon etcher, three diagnostic tools were utilized: the Lucas Labs Vacuum Diagnostic System, a Residual Gas Analyzer, and the ENI Voltage/Impedance Probe. The diagnostic methodology used to baseline and match key parameters of qualified production equipment has had an immense impact on other equipment characterization in the facility. It has also resulted in reduced cycle time for new equipment introduction.
Jorge, Taissa Ricciardi; Mosimann, Ana Luiza Pamplona; Noronha, Lucia de; Maron, Angela; Duarte Dos Santos, Claudia Nunes
2017-02-01
During a series of epizootics caused by Yellow fever virus (YFV) in Brazil between 2007 and 2009, a monkey was found dead (May 2009) in a sylvatic area in the State of Paraná. Brain samples from this animal were used for immunohistochemical analysis and for isolation of a wild-type YFV strain. This viral strain was characterized, and sequence analyses demonstrated that it is closely related to YFV strains of the recently identified subclade 1E of South American genotype I. Further characterization included indirect immunofluorescence of different infected cell lines and analysis of the kinetics of virus replication and of infectivity inhibition by type I IFN. The generated data contribute to the knowledge of YFV evolution and phylogeny. Additionally, the reagents generated and characterized during this study, such as a panel of monoclonal antibodies, are useful tools for further studies on YFV. Lastly, this case stresses the importance of yellow fever surveillance through sentinel monkeys. Copyright © 2016 Elsevier B.V. All rights reserved.
Clinical microbiology informatics.
Rhoads, Daniel D; Sintchenko, Vitali; Rauch, Carol A; Pantanowitz, Liron
2014-10-01
The clinical microbiology laboratory has responsibilities ranging from characterizing the causative agent in a patient's infection to helping detect global disease outbreaks. All of these processes are increasingly becoming partnered more intimately with informatics. Effective application of informatics tools can increase the accuracy, timeliness, and completeness of microbiology testing while decreasing the laboratory workload, which can lead to optimized laboratory workflow and decreased costs. Informatics is poised to be increasingly relevant in clinical microbiology, with the advent of total laboratory automation, complex instrument interfaces, electronic health records, clinical decision support tools, and the clinical implementation of microbial genome sequencing. This review discusses the diverse informatics aspects that are relevant to the clinical microbiology laboratory, including the following: the microbiology laboratory information system, decision support tools, expert systems, instrument interfaces, total laboratory automation, telemicrobiology, automated image analysis, nucleic acid sequence databases, electronic reporting of infectious agents to public health agencies, and disease outbreak surveillance. The breadth and utility of informatics tools used in clinical microbiology have made them indispensable to contemporary clinical and laboratory practice. Continued advances in technology and development of these informatics tools will further improve patient and public health care in the future. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Martin, Markus; Dressing, Andrea; Bormann, Tobias; Schmidt, Charlotte S M; Kümmerer, Dorothee; Beume, Lena; Saur, Dorothee; Mader, Irina; Rijntjes, Michel; Kaller, Christoph P; Weiller, Cornelius
2017-08-01
The study aimed to elucidate areas involved in recognizing tool-associated actions, and to characterize the relationship between recognition and active performance of tool use. We performed voxel-based lesion-symptom mapping in a prospective cohort of 98 acute left-hemisphere ischemic stroke patients (68 male; age mean ± standard deviation, 65 ± 13 years; examined 4.4 ± 2 days post-stroke). In a video-based test, patients distinguished correct tool-related actions from actions with spatio-temporal (incorrect grip, kinematics, or tool orientation) or conceptual errors (incorrect tool-recipient matching, e.g., spreading jam on toast with a paintbrush). Moreover, spatio-temporal and conceptual errors were determined during actual tool use. Deficient spatio-temporal error discrimination followed lesions within a dorsal network in which the inferior parietal lobule (IPL) and the lateral temporal cortex (LTC) were specifically relevant for assessing functional hand postures and kinematics, respectively. Conversely, impaired recognition of conceptual errors resulted from damage to ventral stream regions including the anterior temporal lobe. Furthermore, LTC and IPL lesions impacted differently on action recognition and active tool use, respectively. In summary, recognition of tool-associated actions relies on a componential network. Our study particularly highlights the dissociable roles of the LTC and IPL for the recognition of action kinematics and functional hand postures, respectively. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Kubis, Michael; Wise, Rich; Reijnen, Liesbeth; Viatkina, Katja; Jaenen, Patrick; Luca, Melisa; Mernier, Guillaume; Chahine, Charlotte; Hellin, David; Kam, Benjamin; Sobieski, Daniel; Vertommen, Johan; Mulkens, Jan; Dusa, Mircea; Dixit, Girish; Shamma, Nader; Leray, Philippe
2016-03-01
With shrinking design rules, the overall patterning requirements are getting aggressively tighter. For the 7-nm node and below, allowable CD uniformity variations are entering the Angstrom region (ref. [1]). Optimizing inter- and intra-field CD uniformity of the final pattern requires holistic tuning of all process steps. In previous work, CD control with either litho-cluster or etch-tool corrections has been discussed. Today, we present a holistic CD control approach, combining the correction capability of the etch tool with the correction capability of the exposure tool. The study is done on 10-nm logic node wafers, processed with a test-vehicle stack patterning sequence. We include wafer-to-wafer and lot-to-lot variation and apply optical scatterometry to characterize the fingerprints. Making use of all available correction capabilities (lithography and etch), we investigated single application of exposure tool corrections and of etch tool corrections, as well as combinations of both, to reach the lowest CD uniformity. Results of the final pattern uniformity based on single and combined corrections are shown. We conclude with the application of this holistic lithography and etch optimization to 7-nm high-volume manufacturing, paving the way to ultimate within-wafer CD uniformity control.
Harburguer, Laura; Licastro, Susana; Masuh, Héctor; Zerba, Eduardo
2016-05-01
Aedes aegypti (L.) is a species of international concern because of its ability to transmit serious human arboviral diseases including yellow fever, dengue, and chikungunya, which have spread to all continents. Ovitraps are containers constructed to imitate Aedes' natural breeding sites and have been used for many decades as a sensitive and inexpensive surveillance tool for detecting the presence of container-inhabiting mosquitoes. In addition to their value for vector surveillance, various ovitrap devices have been evaluated as tools for suppressing Ae. aegypti populations. In this study, we performed a biological and chemical characterization of a new ovitrap prototype manufactured by injection molding of low-density polyethylene (LDPE) with the larvicide pyriproxyfen. Our research shows that pyriproxyfen was immediately released from the LDPE into the water of the ovitrap and led to an emergence inhibition of 100% for over 30 weeks. In addition, ovitraps continued to show a high larvicidal activity after over 20 washes. Pyriproxyfen was detectable in the water after 20 s and reached a peak after 24 h. Our results show that this ovitrap can be an effective, inexpensive, and low-maintenance tool for Ae. aegypti surveillance and control. © The Authors 2016. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Priming of plant resistance by natural compounds. Hexanoic acid as a model
Aranega-Bou, Paz; de la O Leyva, Maria; Finiti, Ivan; García-Agustín, Pilar; González-Bosch, Carmen
2014-01-01
Some alternative control strategies for currently emerging plant diseases are based on the use of resistance inducers. This review highlights the recent advances made in the characterization of natural compounds that induce resistance by a priming mechanism. These include vitamins, chitosans, oligogalacturonides, volatile organic compounds, and azelaic and pipecolic acid, among others. Overall, besides providing novel disease control strategies that meet environmental regulations, natural priming agents are valuable tools to help unravel the complex mechanisms underlying the induced resistance (IR) phenomenon. The data presented in this review reflect the novel contributions made from studying these natural plant inducers, with special emphasis placed on hexanoic acid (Hx), proposed herein as a model tool for this research field. Hx is a potent natural priming agent of proven efficiency in a wide range of host plants and pathogens. It can activate broad-spectrum defenses early by inducing callose deposition and the salicylic acid (SA) and jasmonic acid (JA) pathways. Later, it can prime pathogen-specific responses according to the pathogen's lifestyle. Interestingly, Hx primes redox-related genes to produce an antioxidant protective effect, which might be critical for limiting the infection of necrotrophs. Our Hx-IR findings also strongly suggest that it is an attractive tool for the molecular characterization of the plant alarmed state, with the added advantage of being a natural compound. PMID:25324848
WEC Design Response Toolbox v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey
2016-03-30
The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
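The short-term extreme-response methods the toolbox offers can be illustrated generically with a block-maxima approach. The sketch below is not WDRT code; it is a minimal stand-in that fits a Gumbel distribution to response maxima by the method of moments and evaluates a design return level (all names and parameter values are illustrative):

```python
import math
import random

def gumbel_fit(maxima):
    """Fit a Gumbel distribution to block maxima by the method of moments."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi     # scale parameter
    mu = mean - 0.5772156649 * beta           # location (Euler-Mascheroni constant)
    return mu, beta

def return_level(mu, beta, p_exceed):
    """Response level exceeded with probability p_exceed per block."""
    return mu - beta * math.log(-math.log(1.0 - p_exceed))

random.seed(1)
# Toy per-sea-state response maxima of a WEC (arbitrary units).
maxima = [max(random.gauss(0.0, 1.0) for _ in range(200)) for _ in range(500)]
mu, beta = gumbel_fit(maxima)
x100 = return_level(mu, beta, 0.01)   # 1-in-100-block design response
```

Production tools layer environmental characterization (joint wave height/period statistics) on top of this kind of extreme-value fit to obtain long-term design responses.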
Toward a Personalized Approach in Prebiotics Research
Dey, Moul
2017-01-01
Recent characterization of the human microbiome and its influences on health have led to dramatic conceptual shifts in dietary bioactives research. Prebiotic foods that include many dietary fibers and resistant starches are perceived as beneficial for maintaining a healthy gut microbiota. This article brings forward some current perspectives in prebiotic research to discuss why reporting of individual variations in response to interventions will be important to discern suitability of prebiotics as a disease prevention tool. PMID:28134778
2013-01-01
material models to describe the behavior of fibers and structures under high-rate loading conditions. With the utility of the CAE methods and tools largely...phenylene terephthalamide (PPTA), available commercially as Kevlar, Twaron, Technora, and so forth, are characterized by high specific axial stiffness...and high specific tensile strength. These fibers are often referred to as “ballistic fibers” since they are commonly used in different ballistic- and
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bunshah, R.F.; Shabaik, A.H.
The process of Activated Reactive Evaporation is used to synthesize superhard materials such as carbides, oxides, nitrides, and ultrafine grain cermets. The deposits are characterized by hardness, microstructure, microprobe analysis for chemistry, and lattice parameter measurements. The synthesis and characterization of TiC-Ni cermets and Al2O3 are given. High speed steel tools coated with TiC, TiC-Ni, and TaC are tested for machining performance at different speeds and feeds. The machining evaluation and the selection of coatings are based on the rate of deterioration of the coating, tool temperature, and cutting forces. Tool life tests show coated high speed steel tools having 150 to 300% improvement in tool life compared to uncoated tools. Variability in the quality of the ground edge on high speed steel inserts produces a great scatter in the machining evaluation data.
X-ray techniques for innovation in industry
Lawniczak-Jablonska, Krystyna; Cutler, Jeffrey
2014-01-01
The smart specialization declared in the European program Horizon 2020, and the increasing cooperation between research and development found in companies and researchers at universities and research institutions, have created a new paradigm where many calls for proposals require participation and funding from public and private entities. This has created a unique opportunity for large-scale facilities, such as synchrotron research laboratories, to participate in and support applied research programs. Scientific staff at synchrotron facilities have developed many advanced tools that make optimal use of the characteristics of the light generated by the storage ring. These tools have been exceptionally valuable for materials characterization, including X-ray absorption spectroscopy, diffraction, tomography and scattering, and have been key in solving many research and development issues. Progress in optics and detectors, as well as a large effort put into the improvement of data analysis codes, have resulted in the development of reliable and reproducible procedures for materials characterization. Research with photons has contributed to the development of a wide variety of products such as plastics, cosmetics, chemicals, building materials, packaging materials and pharmaceuticals. This review highlights a few examples of successful cooperation that led to solutions for a variety of industrial technological problems exploited by industry, including lessons learned from the Science Link project, supported by the European Commission, as a new approach to increasing the number of commercial users at large-scale research infrastructures. PMID:25485139
Multispectral analysis tools can increase utility of RGB color images in histology
NASA Astrophysics Data System (ADS)
Fereidouni, Farzad; Griffin, Croix; Todd, Austin; Levenson, Richard
2018-04-01
Multispectral imaging (MSI) is increasingly finding application in the study and characterization of biological specimens. However, the methods typically used come with challenges on both the acquisition and the analysis front. MSI can be slow and photon-inefficient, leading to long imaging times and possible phototoxicity and photobleaching. The resulting datasets can be large and complex, prompting the development of a number of mathematical approaches for segmentation and signal unmixing. We show that under certain circumstances, just three spectral channels provided by standard color cameras, coupled with multispectral analysis tools, including a more recent spectral phasor approach, can efficiently provide useful insights. These findings are supported with a mathematical model relating spectral bandwidth and spectral channel number to achievable spectral accuracy. The utility of 3-band RGB and MSI analysis tools are demonstrated on images acquired using brightfield and fluorescence techniques, as well as a novel microscopy approach employing UV-surface excitation. Supervised linear unmixing, automated non-negative matrix factorization and phasor analysis tools all provide useful results, with phasors generating particularly helpful spectral display plots for sample exploration.
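The spectral phasor approach mentioned above maps each pixel's channel intensities to the coordinates of the first Fourier harmonic taken across the channel index, so that pixels with different spectral shapes land at different points of a 2D plot. A minimal sketch for a 3-channel (RGB) pixel, purely illustrative and not the authors' implementation:

```python
import math

def spectral_phasor(channels):
    """Map a pixel's spectral channel intensities to phasor (g, s) coordinates
    using the first Fourier harmonic across the channel index."""
    n = len(channels)
    total = sum(channels)
    if total == 0:
        return 0.0, 0.0
    g = sum(c * math.cos(2 * math.pi * k / n) for k, c in enumerate(channels)) / total
    s = sum(c * math.sin(2 * math.pi * k / n) for k, c in enumerate(channels)) / total
    return g, s

# Pixels dominated by different channels land at distinct phasor positions,
# which is what makes the phasor plot useful for segmentation.
red_pixel = spectral_phasor([255, 0, 0])    # channel 0 dominant -> (1.0, 0.0)
blue_pixel = spectral_phasor([0, 0, 255])   # channel 2 dominant -> (cos 4pi/3, sin 4pi/3)
```

With only three channels the phasor collapses the spectrum to two numbers per pixel, which is why the paper's model of spectral bandwidth versus channel count matters for accuracy.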
Characterizing Aerosols over Southeast Asia using the AERONET Data Synergy Tool
NASA Technical Reports Server (NTRS)
Giles, David M.; Holben, Brent N.; Eck, Thomas F.; Slutsker, Ilya; Welton, Ellsworth J.; Chin, Mian; Kucsera, Thomas; Schmaltz, Jeffery E.; Diehl, Thomas
2007-01-01
Biomass burning, urban pollution and dust aerosols have significant impacts on the radiative forcing of the atmosphere over Asia. In order to better quantify these aerosol characteristics, the Aerosol Robotic Network (AERONET) has established over 200 sites worldwide with an emphasis in recent years on the Asian continent - specifically Southeast Asia. A total of approximately 15 AERONET sun photometer instruments have been deployed to China, India, Pakistan, Thailand, and Vietnam. Sun photometer spectral aerosol optical depth measurements as well as microphysical and optical aerosol retrievals over Southeast Asia will be analyzed and discussed with supporting ground-based instrument, satellite, and model data sets, which are freely available via the AERONET Data Synergy tool at the AERONET web site (http://aeronet.gsfc.nasa.gov). This web-based data tool provides access to ground-based (AERONET and MPLNET), satellite (MODIS, SeaWiFS, TOMS, and OMI) and model (GOCART and back trajectory analyses) databases via one web portal. Future development of the AERONET Data Synergy Tool will include the expansion of current data sets as well as the implementation of other Earth Science data sets pertinent to advancing aerosol research.
Tribology in secondary wood machining
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ko, P.L.; Hawthorne, H.M.; Andiappan, J.
Secondary wood manufacturing covers a wide range of products from furniture, cabinets, doors and windows, to musical instruments. Many of these are now mass produced in sophisticated, high speed numerical controlled machines. The performance and the reliability of the tools are key to an efficient and economical manufacturing process as well as to the quality of the finished products. A program concerned with three aspects of tribology of wood machining, namely, tool wear, tool-wood friction characteristics and wood surface quality characterization, was set up in the Integrated Manufacturing Technologies Institute (IMTI) of the National Research Council of Canada. The studies include friction and wear mechanism identification and modeling, wear performance of surface-engineered tool materials, friction-induced vibration and cutting efficiency, and the influence of wear and friction on finished products. This research program underlines the importance of tribology in secondary wood manufacturing and at the same time adds new challenges to tribology research since wood is a complex, heterogeneous material and its behavior during machining is highly sensitive to the surrounding environments and to the moisture content in the workpiece.
Final Documentation: Incident Management And Probabilities Courses of action Tool (IMPACT).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Donna M.; Ray, Jaideep; Tucker, Mark D.
This report pulls together the documentation produced for the IMPACT tool, a software-based decision support tool that provides situational awareness, incident characterization, and guidance on public health and environmental response strategies for an unfolding bio-terrorism incident.
Nepolean, Thirunavukkarsau; Kaul, Jyoti; Mukri, Ganapati; Mittal, Shikha
2018-01-01
Breeding science has contributed immensely to global food security. Several varieties and hybrids of different food crops, including maize, have been released through conventional breeding. The ever-growing population, decreasing agricultural land, lowering water table, changing climate, and other variables pose a tremendous challenge to researchers to improve the production and productivity of food crops. Drought is one of the major obstacles to sustaining and improving the productivity of food crops, including maize, in tropical and subtropical production systems. With the advent of novel genomics and breeding tools, the way breeding is done has changed tremendously in the last two decades. Drought tolerance is a combination of several component traits with a quantitative mode of inheritance. Rapid DNA and RNA sequencing tools and high-throughput SNP genotyping techniques, trait mapping, functional characterization, genomic selection, rapid generation advancement, and other tools are now available to understand the genetics of drought tolerance and to accelerate the breeding cycle. Informatics plays a complementary role by managing the big data generated from large-scale genomics and breeding experiments. Genome editing is the latest technique to alter specific genes to improve trait expression. Integration of novel genomics, next-generation breeding, and informatics tools will accelerate the stress breeding process and increase the genetic gain under different production systems. PMID:29696027
Dale, Elizabeth L.; Mueller, Melissa A.; Wang, Li; Fogerty, Mary D.; Guy, Jeffrey S.; Nthumba, Peter M.
2013-01-01
Introduction In order to implement effective burn prevention strategies, the WHO has called for improved data collection to better characterize burn injuries in low and middle income countries (LMIC). This study was designed to gather information on burn injury in Kenya and to test a model for such data collection. Methods The study was designed as a retrospective case series study utilizing an electronic data collection tool to assess the scope of burn injuries requiring operation at Kijabe Hospital from January 2006 to May 2010. Data were entered into a web-based tool to test its utility as the potential Kenya Burn Repository (KBR). Results 174 patients were included. The median age was 10 years. There was a male predominance (59% vs. 41%). Findings included that timing of presentation was associated with burn etiology (p = 0.009). Length of stay (LOS) was associated with burn etiology (p < 0.001). Etiology differed depending on the age group, with scald being most prominent in children (p = 0.002). Conclusions Burn injuries in Kenya show similarities with other LMIC in etiology and pediatric predominance. Late presentation for care and prolonged LOS are areas for further investigation. The web-based database is an effective tool for data collection and international collaboration. PMID:23040425
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
Urbanski, William M; Condie, Brian G
2009-12-01
Textpresso Site Specific Recombinases (http://ssrc.genetics.uga.edu/) is a text-mining web server for searching a database of more than 9,000 full-text publications. The papers and abstracts in this database represent a wide range of topics related to site-specific recombinase (SSR) research tools. Included in the database are most of the papers that report the characterization or use of mouse strains that express Cre recombinase as well as papers that describe or analyze mouse lines that carry conditional (floxed) alleles or SSR-activated transgenes/knockins. The database also includes reports describing SSR-based cloning methods such as the Gateway or the Creator systems, papers reporting the development or use of SSR-based tools in systems such as Drosophila, bacteria, parasites, stem cells, yeast, plants, zebrafish, and Xenopus as well as publications that describe the biochemistry, genetics, or molecular structure of the SSRs themselves. Textpresso Site Specific Recombinases is the only comprehensive text-mining resource available for the literature describing the biology and technical applications of SSRs.
Detecting and Characterizing Semantic Inconsistencies in Ported Code
NASA Technical Reports Server (NTRS)
Ray, Baishakhi; Kim, Miryung; Person, Suzette J.; Rungta, Neha
2013-01-01
Adding similar features and bug fixes often requires porting program patches from reference implementations and adapting them to target implementations. Porting errors may result from faulty adaptations or inconsistent updates. This paper investigates (1) the types of porting errors found in practice, and (2) how to detect and characterize potential porting errors. Analyzing version histories, we define five categories of porting errors, including incorrect control- and data-flow, code redundancy, inconsistent identifier renamings, etc. Leveraging this categorization, we design a static control- and data-dependence analysis technique, SPA, to detect and characterize porting inconsistencies. Our evaluation on code from four open-source projects shows that SPA can detect porting inconsistencies with 65% to 73% precision and 90% recall, and identify inconsistency types with 58% to 63% precision and 92% to 100% recall. In a comparison with two existing error detection tools, SPA improves precision by 14 to 17 percentage points.
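One of the error categories above, inconsistent identifier renamings, can be illustrated with a toy check. This is not SPA itself (SPA uses static control- and data-dependence analysis); the sketch below merely infers the rename map implied by the reference patch and flags identifiers the ported patch renamed differently. All names are hypothetical:

```python
def rename_map(before_tokens, after_tokens):
    """Infer an identifier rename map from aligned token sequences of a patch."""
    renames = {}
    for old, new in zip(before_tokens, after_tokens):
        if old != new:
            if old in renames and renames[old] != new:
                raise ValueError(f"conflicting rename for {old}")
            renames[old] = new
    return renames

def inconsistent_renames(reference, target):
    """Return identifiers renamed differently in the target port than in the
    reference patch -- one symptom of a porting error."""
    ref, tgt = rename_map(*reference), rename_map(*target)
    return {old for old in ref if old in tgt and ref[old] != tgt[old]}

# The reference patch renames `buf` -> `buffer`; the ported patch renames
# the same identifier to `data` instead, which the check flags.
ref = (["memcpy", "buf", "len"], ["memcpy", "buffer", "len"])
tgt = (["memmove", "buf", "len"], ["memmove", "data", "len"])
bad = inconsistent_renames(ref, tgt)   # -> {"buf"}
```

A real tool would align tokens via the patch diff and track data-flow to avoid false positives, which is where the reported precision/recall trade-offs come from.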
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spurgeon, Steven R.; Chambers, Scott A.
Scanning transmission electron microscopy (STEM) has become one of the fundamental tools to characterize oxide interfaces and superlattices. Atomic-scale structure, chemistry, and composition mapping can now be conducted on a wide variety of materials systems thanks to the development of aberration-correctors and advanced detectors. STEM imaging and diffraction, coupled with electron energy loss (EELS) and energy-dispersive X-ray (EDS) spectroscopies, offer unparalleled, high-resolution analysis of structure-property relationships. In this chapter we highlight investigations into key phenomena, including interfacial conductivity in oxide superlattices, charge screening effects in magnetoelectric heterostructures, the design of high-quality iron oxide interfaces, and the complex physics governing atomic-scale chemical mapping. These studies illustrate how unique insights from STEM characterization can be integrated with other techniques and first-principles calculations to develop better models for the behavior of functional oxides.
Tools for the functional interpretation of metabolomic experiments.
Chagoyen, Monica; Pazos, Florencio
2013-11-01
The so-called 'omics' approaches used in modern biology aim at massively characterizing the molecular repertoires of living systems at different levels. Metabolomics is one of the latest additions to the 'omics' family and it deals with the characterization of the set of metabolites in a given biological system. As metabolomic techniques become more massive and allow characterizing larger sets of metabolites, automatic methods for analyzing these sets in order to obtain meaningful biological information are required. Only recently have the first tools specifically designed for this task in metabolomics appeared. They are based on approaches previously used in transcriptomics and other 'omics', such as annotation enrichment analysis. These, together with generic tools for metabolic analysis and visualization not specifically designed for metabolomics, will surely be in the toolbox of researchers performing metabolomic experiments in the near future.
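Annotation enrichment analysis, mentioned above, typically reduces to a one-sided hypergeometric test: given a study set of metabolites, is a pathway annotation over-represented relative to the background? A minimal sketch, illustrative rather than the implementation of any specific tool:

```python
from math import comb

def hypergeom_enrichment_p(total, annotated, selected, overlap):
    """One-sided hypergeometric p-value: probability of observing `overlap`
    or more annotated metabolites in a random draw of `selected` metabolites
    from a background of `total`, of which `annotated` carry the annotation."""
    numerator = 0
    for k in range(overlap, min(annotated, selected) + 1):
        numerator += comb(annotated, k) * comb(total - annotated, selected - k)
    return numerator / comb(total, selected)

# 500 background metabolites, 40 annotated to a pathway; a study set of 20
# metabolites contains 8 of them (expected ~1.6 by chance), so the
# annotation is strongly enriched and the p-value is small.
p = hypergeom_enrichment_p(total=500, annotated=40, selected=20, overlap=8)
```

In practice such p-values are computed for every annotation term and corrected for multiple testing before being reported to the user.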
Self-Report Dietary Assessment Tools Used in Canadian Research: A Scoping Review
Vanderlee, Lana; Raffoul, Amanda; Stapleton, Jackie; Csizmadi, Ilona; Boucher, Beatrice A; Massarelli, Isabelle; Rondeau, Isabelle
2017-01-01
Choosing the most appropriate dietary assessment tool for a study can be a challenge. Through a scoping review, we characterized self-report tools used to assess diet in Canada to identify patterns in tool use and to inform strategies to strengthen nutrition research. The research databases Medline, PubMed, PsycINFO, and CINAHL were used to identify Canadian studies published from 2009 to 2014 that included a self-report assessment of dietary intake. The search returned 2358 records that were screened to identify those that reported on self-report dietary intake among nonclinical, non-Aboriginal adult populations. A pool of 189 articles (reflecting 92 studies) was examined in depth to assess the dietary assessment tools used. Food-frequency questionnaires (FFQs) and screeners were used in 64% of studies, whereas food records and 24-h recalls were used in 18% and 14% of studies, respectively. Three studies (3%) used a single question to assess diet, and for 3 studies the tool used was not clear. A variety of distinct FFQs and screeners, including those developed and/or adapted for use in Canada and those developed elsewhere, were used. Some tools were reported to have been evaluated previously in terms of validity or reliability, but details of psychometric testing were often lacking. Energy and fat were the most commonly studied dietary components, reported by 42% and 39% of studies, respectively. For ∼20% of studies, dietary data were used to assess dietary quality or patterns, whereas close to half assessed ≤5 dietary components. A variety of dietary assessment tools are used in Canadian research. Strategies to improve the application of current evidence on best practices in dietary assessment have the potential to support a stronger and more cohesive literature on diet and health. Such strategies could benefit from national and global collaboration. PMID:28298272
Liu, Jun-Jun; Shamoun, Simon Francis; Leal, Isabel; Kowbel, Robert; Sumampong, Grace; Zamany, Arezoo
2018-05-01
Characterization of genes involved in differentiation of pathogen species and isolates with variations of virulence traits provides valuable information to control tree diseases for meeting the challenges of sustainable forest health and phytosanitary trade issues. Lack of genetic knowledge and genomic resources hinders novel gene discovery, molecular mechanism studies and development of diagnostic tools in the management of forest pathogens. Here, we report on transcriptome profiling of Heterobasidion occidentale isolates with contrasting virulence levels. Comparative transcriptomic analysis identified orthologous groups exclusive to H. occidentale and its isolates, revealing biological processes involved in the differentiation of isolates. Further bioinformatics analyses identified an H. occidentale secretome, CYPome and other candidate effectors, from which genes with species- and isolate-specific expression were characterized. A large proportion of differentially expressed genes were revealed to have putative activities as cell wall modification enzymes and transcription factors, suggesting their potential roles in virulence and fungal pathogenesis. Next, large numbers of simple sequence repeats (SSRs) and single nucleotide polymorphisms (SNPs) were detected, including more than 14 000 interisolate non-synonymous SNPs. These polymorphic loci and species/isolate-specific genes may contribute to virulence variations and provide ideal DNA markers for development of diagnostic tools and investigation of genetic diversity.
Oral biopharmaceutics tools - time for a new initiative - an introduction to the IMI project OrBiTo.
Lennernäs, H; Aarons, L; Augustijns, P; Beato, S; Bolger, M; Box, K; Brewster, M; Butler, J; Dressman, J; Holm, R; Julia Frank, K; Kendall, R; Langguth, P; Sydor, J; Lindahl, A; McAllister, M; Muenster, U; Müllertz, A; Ojala, K; Pepin, X; Reppas, C; Rostami-Hodjegan, A; Verwei, M; Weitschies, W; Wilson, C; Karlsson, C; Abrahamsson, B
2014-06-16
OrBiTo is a new European project within the IMI programme in the area of oral biopharmaceutics tools that includes world leading scientists from nine European universities, one regulatory agency, one non-profit research organization, four SMEs together with scientists from twelve pharmaceutical companies. The OrBiTo project will address key gaps in our knowledge of gastrointestinal (GI) drug absorption and deliver a framework for rational application of predictive biopharmaceutics tools for oral drug delivery. This will be achieved through novel prospective investigations to define new methodologies as well as refinement of existing tools. Extensive validation of novel and existing biopharmaceutics tools will be performed using active pharmaceutical ingredient (API), formulations and supporting datasets from industry partners. A combination of high quality in vitro or in silico characterizations of API and formulations will be integrated into physiologically based in silico biopharmaceutics models capturing the full complexity of GI drug absorption. This approach gives an unparalleled opportunity to initiate a transformational change in industrial research and development to achieve model-based pharmaceutical product development in accordance with the Quality by Design concept. Benefits include an accelerated and more efficient drug candidate selection, formulation development process, particularly for challenging projects such as low solubility molecules (BCS II and IV), enhanced and modified-release formulations, as well as allowing optimization of clinical product performance for patient benefit. In addition, the tools emerging from OrBiTo are expected to significantly reduce demand for animal experiments in the future as well as reducing the number of human bioequivalence studies required to bridge formulations after manufacturing or composition changes.
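The physiologically based absorption models referenced above generally build on compartmental pharmacokinetics. As a point of reference, the simplest case is a one-compartment model with first-order absorption and elimination, which has a closed form; the sketch below is generic textbook pharmacokinetics, not an OrBiTo deliverable, and the parameter values are illustrative:

```python
import math

def plasma_conc(t, dose, f_abs, vd, ka, ke):
    """One-compartment model with first-order absorption (ka) and
    elimination (ke):
    C(t) = F*D*ka / (Vd*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))."""
    return f_abs * dose * ka / (vd * (ka - ke)) * (
        math.exp(-ke * t) - math.exp(-ka * t)
    )

# The peak concentration occurs at tmax = ln(ka/ke) / (ka - ke).
ka, ke = 1.2, 0.15   # absorption and elimination rate constants (1/h), illustrative
tmax = math.log(ka / ke) / (ka - ke)
cmax = plasma_conc(tmax, dose=100.0, f_abs=0.8, vd=40.0, ka=ka, ke=ke)
```

PBPK frameworks replace the single absorption rate constant with mechanistic GI transit, dissolution, and permeability submodels, which is exactly where the in vitro characterizations described in the project feed in.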
Bobst, Cedric E.; Kaltashov, Igor A.
2012-01-01
Mass spectrometry has already become an indispensable tool in the analytical armamentarium of the biopharmaceutical industry, although its current uses are limited to characterization of the covalent structure of recombinant protein drugs. However, the scope of applications of mass spectrometry-based methods is beginning to expand to include characterization of the higher order structure and dynamics of biopharmaceutical products, a development catalyzed by recent progress in mass spectrometry-based methods to study higher order protein structure. The two particularly promising methods that are likely to have the most significant and lasting impact in many areas of biopharmaceutical analysis, direct ESI MS and hydrogen/deuterium exchange, are the focus of this article. PMID:21542797
Analysis and characterization of high-resolution and high-aspect-ratio imaging fiber bundles.
Motamedi, Nojan; Karbasi, Salman; Ford, Joseph E; Lomakin, Vitaliy
2015-11-10
High-contrast imaging fiber bundles (FBs) are characterized and modeled for wide-angle and high-resolution imaging applications. Scanning electron microscope images of FB cross sections are taken to measure physical parameters and verify the variations of irregular fibers due to the fabrication process. Modal analysis tools are developed that include irregularities in the fiber core shapes and provide results in agreement with experimental measurements. The modeling demonstrates that the irregular fibers significantly outperform a perfectly regular "ideal" array. Using this method, FBs are designed that can provide high contrast with core pitches of only a few wavelengths of the guided light. Structural modifications of the commercially available FB can reduce the core pitch by 60% for higher resolution image relay.
NASA Technical Reports Server (NTRS)
Rauscher, Bernard J.; Bolcar, Matthew R.; Clampin, Mark; Domagal-Goldman, Shawn D.; McElwain, Michael W.; Moseley, S. H.; Stahle, Carl; Stark, Christopher C.; Thronson, Harley A.
2015-01-01
Are we alone? Answering this ageless question will be a major focus for astrophysics in coming decades. Our tools will include unprecedentedly large UV-Optical-IR space telescopes working with advanced coronagraphs and starshades. Yet, these facilities will not live up to their full potential without better detectors than we have today. To inform detector development, this paper provides an overview of visible and near-IR (VISIR; lambda = 0.4 - 1.8 micrometers) detector needs for the Advanced Technology Large Aperture Space Telescope (ATLAST), specifically for spectroscopic characterization of atmospheric biosignature gases. We also provide a brief status update on some promising detector technologies for meeting these needs in the context of a passively cooled ATLAST.
Qi, Fengxia; Yao, Lun; Tan, Xiaoming; Lu, Xuefeng
2013-10-01
An integrative gene expression system has been constructed for the directional assembly of biological components in Synechocystis PCC6803. We characterized 11 promoter parts with various expression efficiencies for genetic engineering of Synechocystis toward the production of fatty alcohols. Improved production was achieved by integrating several genetic modifications, including the expression of multiple copies of fatty acyl-CoA reductase (FAR) under the control of strong promoters, disruption of the competing pathways for poly-β-hydroxybutyrate and glycogen synthesis, and peptide truncation of the FAR. In shake-flask cultures, the production of fatty alcohols was significantly improved, with a yield of 761 ± 216 μg/g cell dry weight in Synechocystis, the highest reported to date.
Wu, Mingxuan; Dul, Barbara E; Trevisan, Alexandra J; Fiedler, Dorothea
2013-01-01
The diphosphoinositol polyphosphates (PP-IPs) are a central group of eukaryotic second messengers. They regulate numerous processes, including cellular energy homeostasis and adaptation to environmental stresses. To date, most of the molecular details in PP-IP signalling have remained elusive, due to a lack of appropriate methods and reagents. Here we describe the expedient synthesis of methylene-bisphosphonate PP-IP analogues. Their characterization revealed that the analogues exhibit significant stability and mimic their natural counterparts very well. This was further confirmed in two independent biochemical assays, in which our analogues potently inhibited phosphorylation of the protein kinase Akt and hydrolytic activity of the Ddp1 phosphohydrolase. The non-hydrolysable PP-IPs thus emerge as important tools and hold great promise for a variety of applications.
Challenges to diagnosis of HIV-associated wasting.
Kotler, Donald
2004-12-01
There is a wide variability in the clinical presentation of the protein energy malnutrition often characterized as wasting in patients infected with HIV. Moreover, the clinical presentation has evolved over time. Initially, protein energy malnutrition was characterized by profound weight loss and depletion of body cell mass (BCM). Recently, unrelated concurrent metabolic abnormalities, such as lipodystrophy, may complicate the diagnosis of HIV wasting. Although measures of BCM are relatively accurate for the diagnosis of HIV wasting, the optimal tools for assessing BCM are not necessarily available to the clinician. From the practical standpoint, HIV wasting may be a self-evident diagnosis in advanced stages, but effective interpretation of the early signs of HIV wasting requires familiarity with other complications included in the differential diagnosis.
Technology of machine tools. Volume 4. Machine tool controls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-10-01
The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.
Technology of machine tools. Volume 3. Machine tool mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tlusty, J.
1980-10-01
The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.
Technology of machine tools. Volume 5. Machine tool accuracy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hocken, R.J.
1980-10-01
The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.
Side Flow Effect on Surface Generation in Nano Cutting
NASA Astrophysics Data System (ADS)
Xu, Feifei; Fang, Fengzhou; Zhang, Xiaodong
2017-05-01
The side flow of material in nano cutting is one of the most important factors that deteriorate the machined surface quality. The effects of the crystallographic orientation, feed, and the cutting tool geometry, including tool edge radius, rake angle, and inclination angle, on the side flow are investigated employing molecular dynamics simulation. The results show that a stagnation region is formed in front of the tool edge, characterized by the stagnation radius R_s and the stagnation height h_s. The side flow forms because material at or under the stagnation region is extruded by the tool edge to flow to the side of the tool. A higher stagnation height increases the size of the side flow. The anisotropic nature of the material, which partly determines the stagnation region, also influences the side flow through the different deformation mechanisms under the action of the tool edge. The size of the side flow differs greatly between cutting directions, which ultimately affects the machined surface quality. The cutting directions {100}<011>, {110}<001>, and {110}<1-10> are beneficial for obtaining a better surface quality with small side flow. In addition, the side flow can be suppressed by reducing the feed and optimizing the cutting tool geometry. A cutting tool with a small edge radius, a large positive rake angle, and a large inclination angle decreases the side flow and consequently improves the machined surface quality.
Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.
2012-01-01
The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.
Redesigning a Large-Enrollment Introductory Biology Course
Ueckert, Catherine; Adams, Alison; Lock, Judith
2011-01-01
Using an action research model, biology faculty examined, implemented, and evaluated learner-centered instructional strategies to reach the goal of increasing the level of student achievement in the introductory biology course BIO 181: Unity of Life I, which was characterized by both high enrollments and a high DFW rate. Outcomes included the creation and implementation of an assessment tool for biology content knowledge and attitudes, development and implementation of a common syllabus, modification of the course to include learner-centered instructional strategies, and the collection and analysis of data to evaluate the success of the modifications. The redesigned course resulted in greater student success, as measured by grades (reduced %DFW and increased %AB) as well as by achievement in the course assessment tool. In addition, the redesigned course led to increased student satisfaction and greater consistency among different sections. These findings have important implications for both students and institutions, as the significantly lower DFW rate means that fewer students have to retake the course. PMID:21633065
Automatic Estimation of the Radiological Inventory for the Dismantling of Nuclear Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Bermejo, R.; Felipe, A.; Gutierrez, S.
The estimation of the radiological inventory of nuclear facilities to be dismantled is a process that includes information related to the physical inventory of the whole plant and the radiological survey. The radiological inventory for all the components and civil structures of the plant can be estimated with mathematical models using a statistical approach. A computer application has been developed in order to obtain the radiological inventory automatically. Results: A computer application has been developed that estimates the radiological inventory from the radiological measurements of the characterization program. The application includes the statistical functions needed for the estimation of central tendency and variability, e.g. mean, median, variance, confidence intervals, and coefficients of variation. This computer application is a necessary tool for estimating the radiological inventory of a nuclear facility and a powerful aid to decision-making in future sampling surveys.
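The statistical summary described above (mean, median, variance, confidence intervals, coefficients of variation) can be sketched in a few lines of Python; the function name and the 95% z-value are illustrative assumptions, not part of the application described in the record:

```python
import math
import statistics

def summarize_activity(measurements, z=1.96):
    """Central tendency and variability summary for a set of
    radiological survey measurements (hypothetical helper, not
    the application's actual code)."""
    n = len(measurements)
    mean = statistics.fmean(measurements)
    median = statistics.median(measurements)
    sd = statistics.stdev(measurements)      # sample standard deviation
    sem = sd / math.sqrt(n)                  # standard error of the mean
    return {
        "mean": mean,
        "median": median,
        "variance": sd ** 2,
        "cv": sd / mean,                     # coefficient of variation
        # approximate 95% confidence interval for the mean
        "ci95": (mean - z * sem, mean + z * sem),
    }

stats = summarize_activity([12.1, 9.8, 11.4, 10.7, 13.0, 10.2])
```

In practice a t-multiplier would replace the fixed z for small samples, which is why the interval above is labeled approximate.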
Interim Draft: Biological Sampling and Analysis Plan Outline ...
Standard Operating Procedures. This interim sampling and analysis plan (SAP) outline was developed specifically as an outline of the output that will be generated by a developing on-line tool called the MicroSAP. The goal of the MicroSAP tool is to assist users with the development of SAPs needed for the site characterization, verification sampling, and post-decontamination sampling stages of biological sampling and analysis activities in which the EPA would be responsible for conducting sampling. These activities could include sampling and analysis for a biological contamination incident, a research study, or an exercise. The development of this SAP outline did not consider the initial response to an incident, as it is assumed that the initial response would have been completed by another agency, or the clearance phase, as it is assumed that a separate committee would be established to make decisions regarding clearing a site. This outline also includes considerations for capturing the associated data quality objectives in the SAP.
Integrating Local Green Assets into Brownfields Redevelopment: Tools and Examples
EnviroAtlas is a free, online public mapping tool that characterizes green infrastructure and its connection to human health and wellness. The high resolution data contained in this tool can be used to incorporate local green infrastructure into Brownfields redevelopment to benef...
Influence of Process Parameters on the Process Efficiency in Laser Metal Deposition Welding
NASA Astrophysics Data System (ADS)
Güpner, Michael; Patschger, Andreas; Bliedtner, Jens
Conventionally manufactured tools are often constructed entirely of a high-alloyed, expensive tool steel. An alternative way to manufacture tools is the combination of a cost-efficient mild steel and a functional coating in the interaction zone of the tool. Thermal processing methods like laser metal deposition are always accompanied by thermal distortion, and the resistance against thermal distortion decreases as the material thickness is reduced. Consequently, a special process management is necessary for the laser-based coating of thin parts or tools. The experimental approach in the present paper is to keep the energy and the mass per unit length constant by varying the laser power, the feed rate, and the powder mass flow. The typical seam parameters are measured in order to characterize the cladding process, define process limits, and evaluate the process efficiency. Ways to optimize dilution, angular distortion, and clad height are presented.
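The constant energy-per-unit-length and mass-per-unit-length approach can be sketched as a small parameter-planning helper: for each feed rate v, laser power and powder mass flow are scaled so that P/v and m_dot/v stay fixed. Function names, units, and the example values are illustrative assumptions, not the paper's actual parameters:

```python
def deposition_settings(energy_per_length, mass_per_length, feed_rates):
    """For each feed rate v, choose laser power P and powder mass flow
    m_dot such that energy per unit length (P/v) and mass per unit
    length (m_dot/v) remain constant across the experiment."""
    return [
        {"feed_rate": v,
         "laser_power": energy_per_length * v,   # P = E_l * v
         "powder_flow": mass_per_length * v}     # m_dot = m_l * v
        for v in feed_rates
    ]

# assumed example: E_l = 60 J/mm, m_l = 0.01 g/mm, feed rates in mm/s
runs = deposition_settings(60.0, 0.01, [5.0, 10.0, 20.0])
```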
Development of Next Generation Synthetic Biology Tools for Use in Streptomyces venezuelae
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phelan, Ryan M.; Sachs, Daniel; Petkiewicz, Shayne J.
Streptomyces have a rich history as producers of important natural products, and this genus of bacteria has recently garnered attention for its potential applications in the broader context of synthetic biology. However, the dearth of genetic tools available to control and monitor protein production precludes the rapid and predictable metabolic engineering that is possible in hosts such as Escherichia coli or Saccharomyces cerevisiae. In an effort to improve genetic tools for Streptomyces venezuelae, we developed a suite of standardized, orthogonal integration vectors and an improved method to monitor protein production in this host. These tools were applied to characterize heterologous promoters and various attB chromosomal integration sites. A final study leveraged the characterized toolset to demonstrate its use in producing the biofuel precursor bisabolene using a chromosomally integrated expression system. In conclusion, these tools advance S. venezuelae as a practical host for future metabolic engineering efforts.
Semiconductor Characterization: from Growth to Manufacturing
NASA Astrophysics Data System (ADS)
Colombo, Luigi
The successful growth and/or deposition of materials for any application requires a basic understanding of the materials physics for a given device. At the beginning, the first and most obvious characterization tool is visual observation; this is particularly true for single crystal growth. Characterization tools are usually prioritized in order of ease of measurement, and have become especially sophisticated as we have moved from the characterization of macroscopic crystals and films to atomically thin materials and nanostructures. While much attention is devoted to characterization and understanding of materials physics at the nano level, the characterization of single crystals as substrates or active components is still critically important. In this presentation, I will review and discuss the basic materials characterization techniques used to get at the materials physics needed to bring crystals and thin films from research to manufacturing in the fields of infrared detection, non-volatile memories, and transistors. Finally, I will present and discuss metrology techniques used to understand the physics and chemistry of atomically thin two-dimensional materials for future device applications.
NASA Astrophysics Data System (ADS)
Caltaru, M.; Badicioiu, M.; Ripeanu, R. G.; Dinita, A.; Minescu, M.; Laudacescu, E.
2018-01-01
Drill pipe is a seamless steel pipe with upset ends fitted with special threaded ends known as tool joints. During drilling operations, the wall thickness of the drill pipe and the outside diameter of the tool joints are gradually reduced by wear. The present research investigates the possibility of reconditioning drill pipe tool joints by hardbanding with a new metal-cored coppered flux-cored wire, Cr-Mo alloyed, using the gas metal active welding process, taking into consideration two different hardbanding technologies: hardbanding the drill pipe tool joints after removing the old hardbanding material and reconstructing the surface with a compensation material (case A), and hardbanding the drill pipe tool joints without removing the old hardbanding material (case B). The present paper reports experimental research on the tribological characterization of the reconditioned drill pipe tool joints, comprising macroscopic analyses, metallographic analyses, Vickers hardness measurements, chemical composition measurements, and wear tests conducted on ball-on-disk friction couples, in order to certify the quality of the hardbanding obtained by the different technological approaches and to validate the optimum technology.
Electromagnetic Scattering from Realistic Targets
NASA Technical Reports Server (NTRS)
Lee, Shung- Wu; Jin, Jian-Ming
1997-01-01
The general goal of the project is to develop computational tools for calculating radar signature of realistic targets. A hybrid technique that combines the shooting-and-bouncing-ray (SBR) method and the finite-element method (FEM) for the radiation characterization of microstrip patch antennas in a complex geometry was developed. In addition, a hybridization procedure to combine moment method (MoM) solution and the SBR method to treat the scattering of waveguide slot arrays on an aircraft was developed. A list of journal articles and conference papers is included.
Recent Advances in Cardiovascular Magnetic Resonance Techniques and Applications
Salerno, Michael; Sharif, Behzad; Arheden, Håkan; Kumar, Andreas; Axel, Leon; Li, Debiao; Neubauer, Stefan
2018-01-01
Cardiovascular magnetic resonance imaging has become the gold standard for evaluating myocardial function, volumes, and scarring. Additionally, cardiovascular magnetic resonance imaging is unique in its comprehensive tissue characterization, including assessment of myocardial edema, myocardial siderosis, myocardial perfusion, and diffuse myocardial fibrosis. Cardiovascular magnetic resonance imaging has become an indispensable tool in the evaluation of congenital heart disease, heart failure, cardiac masses, pericardial disease, and coronary artery disease. This review will highlight some recent novel cardiovascular magnetic resonance imaging techniques, concepts, and applications. PMID:28611116
Attitude measurement: Principles and sensors
NASA Technical Reports Server (NTRS)
Duchon, P.; Vermande, M. P.
1981-01-01
Tools used in the measurement of satellite attitude are described. Attention is given to the elements that characterize an attitude sensor, the references employed (stars, moon, Sun, Earth, magnetic fields, etc.), and the detectors (optical, magnetic, and inertial). Several examples of attitude sensors are described, including sun sensors, star sensors, earth sensors, triaxial magnetometers, and gyrometers. Finally, sensor combinations that make it possible to determine a complete attitude are considered; the SPOT attitude measurement system and a combined CCD star sensor-gyrometer system are discussed.
Tohira, Hideo; Jacobs, Ian; Mountain, David; Gibson, Nick; Yeo, Allen
2011-01-01
The Abbreviated Injury Scale (AIS) was revised in 2005 and updated in 2008 (AIS 2008). We aimed to compare the outcome prediction performance of AIS-based injury severity scoring tools using AIS 2008 and AIS 98. We used all major trauma patients admitted to the Royal Perth Hospital between 1994 and 2008. We selected five AIS-based injury severity scoring tools: the Injury Severity Score (ISS), New Injury Severity Score (NISS), modified Anatomic Profile (mAP), Trauma and Injury Severity Score (TRISS), and A Severity Characterization of Trauma (ASCOT). We selected survival after injury as the target outcome and used the area under the Receiver Operating Characteristic curve (AUROC) as the performance measure. First, we compared the five tools using all cases whose records included all variables for the TRISS (complete dataset) with 10-fold cross-validation. Second, we compared the ISS and NISS for AIS 98 and AIS 2008 using all subjects (whole dataset). We identified 1,269 and 4,174 cases for the complete and whole datasets, respectively. With the 10-fold cross-validation, there were no clear differences in the AUROCs between the AIS 98- and AIS 2008-based scores. With the second comparison, the AIS 98-based ISS performed significantly worse than the AIS 2008-based ISS (p<0.0001), while there was no significant difference between the AIS 98- and AIS 2008-based NISSs. Researchers should be aware of these findings when selecting an injury severity scoring tool for their studies.
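The AUROC performance measure used in the study can be illustrated with a minimal rank-based implementation via the Mann-Whitney identity: the AUROC equals the probability that a randomly chosen survivor receives a higher predicted-survival score than a randomly chosen non-survivor, with ties counting one half. The function and sample data below are hypothetical, not the study's code:

```python
def auroc(scores, outcomes):
    """Area under the ROC curve for a score against a binary outcome
    (1 = survived, 0 = died), via pairwise comparisons. O(n^2) but
    sufficient for illustration."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5          # ties count half
    return wins / (len(pos) * len(neg))

# perfect separation: every survivor scores above every non-survivor
a = auroc([0.9, 0.8, 0.7, 0.3, 0.2], [1, 1, 1, 0, 0])  # -> 1.0
```

Production analyses would typically use an optimized library routine, but the pairwise definition above is the quantity being estimated either way.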
Fonseca, Luciana Mara Monti; Aredes, Natália Del' Angelo; Dias, Danielle Monteiro Vilela; Scochi, Carmen Gracinda Silvan; Martins, José Carlos Amado; Rodrigues, Manuel Alves
2015-01-01
To evaluate students' opinions of the e-Baby educational technology. This was an exploratory, descriptive study with a sample of 14 Portuguese nursing students who used the e-Baby digital educational technology in an extracurricular course. Data were collected through a Likert-scale opinion instrument that allowed students to add comments; data characterizing the participants were also collected. Students rated the e-Baby game very favorably, from acceptance of its usability to suggestions for extending the game to other nursing topics. The serious game e-Baby can be considered a didactic innovation and a motivating learning tool. Moreover, its interface proved adequate in terms of design and educational function, fostering intense interaction between user and computational tool.
Viruses and Antiviral Immunity in Drosophila
Xu, Jie; Cherry, Sara
2013-01-01
Viral pathogens present many challenges to organisms, driving the evolution of a myriad of antiviral strategies to combat infections. A wide variety of viruses infect invertebrates, including both natural pathogens that are insect-restricted, and viruses that are transmitted to vertebrates. Studies using the powerful tools available in the model organism Drosophila have expanded our understanding of antiviral defenses against diverse viruses. In this review, we will cover three major areas. First, we will describe the tools used to study viruses in Drosophila. Second, we will survey the major viruses that have been studied in Drosophila. And lastly, we will discuss the well-characterized mechanisms that are active against these diverse pathogens, focusing on non-RNAi mediated antiviral mechanisms. Antiviral RNAi is discussed in another paper in this issue. PMID:23680639
Hüffer, Thorsten; Praetorius, Antonia; Wagner, Stephan; von der Kammer, Frank; Hofmann, Thilo
2017-03-07
Microplastics (MPs) have been identified as contaminants of emerging concern in aquatic environments, and research into their behavior and fate has been increasing sharply in recent years. Nevertheless, significant gaps remain in our understanding of several crucial aspects of MP exposure and risk assessment, including the quantification of emissions, dominant fate processes, the types of analytical tools required for characterization and monitoring, and adequate laboratory protocols for analysis and hazard testing. This Feature aims at identifying transferable knowledge and experience from engineered nanoparticle (ENP) exposure assessment. This is achieved by comparing ENPs and MPs based on their similarities as particulate contaminants, while critically discussing specific differences. We also highlight the most pressing research priorities to support an efficient development of tools and methods for MP environmental risk assessment.
Norman, Laura; Tallent-Halsell, Nita; Labiosa, William; Weber, Matt; McCoy, Amy; Hirschboeck, Katie; Callegary, James; van Riper, Charles; Gray, Floyd
2010-01-01
Using respective strengths of the biological, physical, and social sciences, we are developing an online decision support tool, the Santa Cruz Watershed Ecosystem Portfolio Model (SCWEPM), to help promote the use of information relevant to water allocation and land management in a binational watershed along the U.S.-Mexico border. The SCWEPM will include an ES valuation system within a suite of linked regional driver-response models and will use a multicriteria scenario-evaluation framework that builds on GIS analysis and spatially-explicit models that characterize important ecological, economic, and societal endpoints and consequences that are sensitive to climate patterns, regional water budgets, and regional LULC change in the SCW.
NASA Astrophysics Data System (ADS)
Shi, Dawei; Wang, Rui
2017-12-01
In this study, to address the poor water resistance and low mechanical properties of starch, a mixed-starch composite matrix containing glycerol, sorbitol, and urea was prepared via single-screw extrusion; oil flax was then added to improve the physical and mechanical properties of the matrix for use as a biodegradable plastic material. The composite matrix was systematically characterized using various analytical tools, including XRD, SEM, and TG. The composite showed a maximum tensile strength of 18.11 MPa and moisture absorption of 17.67%, whereas the original starch matrix showed only 12.51 MPa and 24.98%, respectively.
Inspecting Engineering Samples
2017-12-08
Goddard's Ritsko Wins 2011 SAVE Award: The winner of the 2011 SAVE Award is Matthew Ritsko, a Goddard financial manager. His tool lending library would track and enable sharing of expensive space-flight tools and hardware after projects no longer need them. This set of images represents the types of tools used at NASA. To read more go to: www.nasa.gov/topics/people/features/ritsko-save.html Dr. Doug Rabin (Code 671) and PI La Vida Cooper (Code 564) inspect engineering samples of the HAS-2 imager, which will be tested and read out using a custom ASIC with a 16-bit ADC (analog-to-digital converter) and CDS (correlated double sampling) circuit designed by the Code 564 ASIC group as part of an FY10 IRAD. The purpose of the IRAD was to develop a high-resolution digitizer for Heliophysics applications such as imaging. Future goals for the collaboration include characterization testing and eventually a sounding rocket flight of the integrated system. *ASIC = Application Specific Integrated Circuit. NASA/GSFC/Chris Gunn
Genome engineering and plant breeding: impact on trait discovery and development.
Nogué, Fabien; Mara, Kostlend; Collonnier, Cécile; Casacuberta, Josep M
2016-07-01
New tools for the precise modification of crop genes are now available for the engineering of new ideotypes. A future challenge in this emerging field of genome engineering is to develop efficient methods for allele mining. Genome engineering tools are now available in plants, including major crops, to modify a given gene in a predictable manner. These new techniques have tremendous potential for a spectacular acceleration of the plant breeding process. Here, we discuss how genetic diversity has always been the raw material for breeders, and how they have always taken advantage of the best available science to use, and when possible increase, this genetic diversity. We present why the advent of these new techniques gives breeders extremely powerful tools for crop breeding, but also why their use will require breeders and researchers to characterize the genes underlying this genetic diversity more precisely. Tackling these challenges should permit the engineering of optimized allele assortments in an unprecedented and controlled way.
Copy Number Variations Detection: Unravelling the Problem in Tangible Aspects.
do Nascimento, Francisco; Guimaraes, Katia S
2017-01-01
Among the important genomic variants associated with susceptibility and resistance to complex diseases, Copy Number Variations (CNVs) have emerged as a prevalent class of structural variation. Following the flood of next-generation sequencing data, numerous publicly available tools have been developed to provide computational strategies for identifying CNVs with improved accuracy. This review goes beyond scrutinizing the main approaches widely used for structural variant detection in general, including Split-Read, Paired-End Mapping, Read-Depth, and Assembly-based methods. In this paper, (1) we characterize the relevant technical details around the detection of CNVs, which can affect the estimation of breakpoints and number of copies, (2) we pinpoint the most important insights related to GC-content and mappability biases, and (3) we discuss the paramount caveats in the tool evaluation process. The points brought out in this study emphasize common assumptions, a variety of possible limitations, valuable insights, and directions for desirable contributions to the state of the art in CNV detection tools.
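The Read-Depth strategy named in the review can be sketched as a naive caller that converts per-window sequencing coverage into integer copy numbers relative to a diploid baseline. All names and values below are assumptions for illustration; real callers additionally correct for the GC-content and mappability biases the review discusses, and segment adjacent windows before calling:

```python
def read_depth_copy_number(window_depths, baseline, ploidy=2):
    """Naive read-depth CNV caller: estimate the copy number of each
    genomic window as the depth ratio against a diploid baseline,
    rounded to the nearest integer. CN > ploidy suggests a
    duplication, CN < ploidy a deletion."""
    return [round(ploidy * depth / baseline) for depth in window_depths]

# assumed baseline diploid coverage of 30x;
# one duplicated (44.8x) and one deleted (15.2x) window
cn = read_depth_copy_number([31.0, 44.8, 15.2, 29.1], baseline=30.0)
```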
Microchip-Based Single-Cell Functional Proteomics for Biomedical Applications
Lu, Yao; Yang, Liu; Wei, Wei; Shi, Qihui
2017-01-01
Cellular heterogeneity has been widely recognized, but only recently have single-cell tools become available that allow characterizing heterogeneity at the genomic and proteomic levels. We review the technological advances in microchip-based toolkits for single-cell functional proteomics. Each of these tools has distinct advantages and limitations, and a few have advanced toward being applied to biological or clinical problems that cannot be addressed by traditional population-based methods. High-throughput single-cell proteomic assays generate high-dimensional data sets that contain new information and thus require a new analytical framework to extract new biology. In this review article, we highlight a few biological and clinical applications in which microchip-based single-cell proteomic tools provide unique advantages. The examples include resolving functional heterogeneity and dynamics of immune cells, dissecting cell-cell interactions by creating well-controlled on-chip microenvironments, capturing high-resolution snapshots of immune system function in patients for better immunotherapy, and elucidating phosphoprotein signaling networks in cancer cells to guide effective molecularly targeted therapies. PMID:28280819
A study of the relationship between the performance and dependability of a fault-tolerant computer
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This thesis studies the relationship between performance and dependability by creating a tool (FTAPE) that integrates a high-stress workload generator with fault injection, and by using the tool to evaluate system performance under error conditions. The workloads are composed of processes formed from atomic components that represent CPU, memory, and I/O activity. The fault injector is software-implemented and is capable of injecting faults into any memory-addressable location, including special registers and caches. This tool has been used to study a Tandem Integrity S2 computer. Workloads with varying numbers of processes and varying compositions of CPU, memory, and I/O activity are first characterized in terms of performance. Then faults are injected into these workloads. The results show that as the number of concurrent processes increases, the mean fault latency initially increases due to increased contention for the CPU. However, for even higher numbers of processes (more than 3 processes), the mean latency decreases because long-latency faults are paged out before they can be activated.
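The notion of software-implemented fault injection and activation latency can be caricatured as flipping a bit at a chosen memory-addressable location and counting workload accesses until the faulted location is touched; this toy sketch (the bytearray "memory" and the access trace are invented) illustrates the latency concept only, not FTAPE's actual mechanism:

```python
def inject_fault(memory: bytearray, address: int, bit: int) -> None:
    """Flip one bit at a memory-addressable location (simplified injector)."""
    memory[address] ^= (1 << bit)

def fault_latency(memory: bytearray, address: int, bit: int, accesses):
    """Latency = number of workload accesses until the faulted byte is read.
    `accesses` is the workload's sequence of touched addresses; a fault that
    is never accessed (e.g., paged out first) is never activated."""
    inject_fault(memory, address, bit)
    for latency, addr in enumerate(accesses, start=1):
        if addr == address:          # fault activated by the workload
            return latency
    return None                      # fault never activated
```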
Kriegel, Fabian L; Köhler, Ralf; Bayat-Sarmadi, Jannike; Bayerl, Simon; Hauser, Anja E; Niesner, Raluca; Luch, Andreas; Cseresnyes, Zoltan
2018-03-01
Cells in their natural environment often exhibit complex kinetic behavior and radical adjustments of their shapes. This enables them to adapt to short- and long-term changes in their surroundings under physiological and pathological conditions. Intravital multi-photon microscopy is a powerful tool to record this complex behavior. Traditionally, cell behavior is characterized by tracking the cells' movements, which yields numerous parameters describing the spatiotemporal characteristics of cells. Cells can be classified according to their tracking behavior using all or a subset of these kinetic parameters. This categorization can be supported by the a priori knowledge of experts. While such an approach provides an excellent starting point for analyzing complex intravital imaging data, faster methods are required for automated and unbiased characterization. In addition to their kinetic behavior, the 3D shape of these cells also provides essential clues about the cells' status and functionality. New approaches that also include the study of cell shape may allow the discovery of correlations amongst the track- and shape-describing parameters. In the current study, we examine the applicability of a set of Fourier components produced by Discrete Fourier Transform (DFT) as a tool for more efficient and less biased classification of complex cell shapes. By carrying out a number of 3D-to-2D projections of surface-rendered cells, the applied method reduces the more complex 3D shape characterization to a series of 2D DFTs. The resulting shape factors are used to train a Self-Organizing Map (SOM), which provides an unbiased estimate for the best clustering of the data, thereby characterizing groups of cells according to their shape. We propose and demonstrate that such shape characterization is a powerful addition to, or a replacement for, kinetic analysis.
This would make it especially useful in situations where live kinetic imaging is less practical or not possible at all. © 2017 International Society for Advancement of Cytometry.
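The use of DFT magnitudes as shape factors can be illustrated with a minimal 1D analogue: a boundary-radius signature whose low-order harmonic magnitudes, normalized by the DC term, serve as scale-tolerant descriptors (taking magnitudes also discards the starting-point phase). This is a simplified stand-in for the paper's 3D-to-2D projection pipeline, not its actual implementation:

```python
import cmath

def dft(signal):
    """Discrete Fourier Transform of a real-valued sequence."""
    n = len(signal)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, x in enumerate(signal)) for k in range(n)]

def shape_factors(radii, n_factors=4):
    """Shape descriptors from a boundary-radius signature: magnitudes of the
    first harmonics, normalized by the DC term for scale invariance."""
    spectrum = dft(radii)
    dc = abs(spectrum[0])
    return [abs(spectrum[k]) / dc for k in range(1, n_factors + 1)]
```

A circular outline (constant radius) yields factors that are all essentially zero, while a signature oscillating between two radii concentrates its energy in a single harmonic; vectors like these would be the inputs to the SOM clustering step.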
NASA Astrophysics Data System (ADS)
Maity, Debotyam
This study is aimed at an improved understanding of unconventional reservoirs, which include tight reservoirs (such as shale oil and gas plays), geothermal developments, etc. We provide a framework for improved fracture zone identification and mapping of the subsurface for a geothermal system by integrating data from different sources. The proposed ideas and methods were tested primarily on data obtained from the North Brawley geothermal field and The Geysers geothermal field, in addition to synthetic datasets, which were used to test new algorithms before actual application to the real datasets. The study has resulted in novel or improved algorithms for use at specific stages of data acquisition and analysis, including an improved phase detection technique for passive seismic (and teleseismic) data as well as optimization of passive seismic surveys for the best possible processing results. The proposed workflow makes use of novel integration methods as a means of making the best use of the available geophysical data for fracture characterization. The methodology incorporates soft computing tools such as hybrid neural networks (neuro-evolutionary algorithms) as well as geostatistical simulation techniques to improve the property estimates and the overall characterization efficacy. The basic elements of the proposed characterization workflow involve using seismic and microseismic data to characterize structural and geomechanical features within the subsurface. We use passive seismic data to model geomechanical properties, which are combined with other properties evaluated from seismic data and well logs to derive both qualitative and quantitative fracture zone identifiers. The study has resulted in a broad framework highlighting a new technique for utilizing geophysical data (seismic and microseismic) for unconventional reservoir characterization.
It provides an opportunity to optimally develop the resources in question by incorporating data from different sources and using their temporal and spatial variability as a means to better understand the reservoir behavior. As part of this study, we have developed the following elements which are discussed in the subsequent chapters: 1. An integrated characterization framework for unconventional settings with adaptable workflows for all stages of data processing, interpretation and analysis. 2. A novel autopicking workflow for noisy passive seismic data used for improved accuracy in event picking as well as for improved velocity model building. 3. Improved passive seismic survey design optimization framework for better data collection and improved property estimation. 4. Extensive post-stack seismic attribute studies incorporating robust schemes applicable in complex reservoir settings. 5. Uncertainty quantification and analysis to better quantify property estimates over and above the qualitative interpretations made and to validate observations independently with quantified uncertainties to prevent erroneous interpretations. 6. Property mapping from microseismic data including stress and anisotropic weakness estimates for integrated reservoir characterization and analysis. 7. Integration of results (seismic, microseismic and well logs) from analysis of individual data sets for integrated interpretation using predefined integration framework and soft computing tools.
Tyler, Ludmila; Fangel, Jonatan U; Fagerström, Alexandra Dotson; Steinwand, Michael A; Raab, Theodore K; Willats, William Gt; Vogel, John P
2014-01-14
The model grass Brachypodium distachyon is increasingly used to study various aspects of grass biology. A large and genotypically diverse collection of B. distachyon germplasm has been assembled by the research community. The natural variation in this collection can serve as a powerful experimental tool for many areas of inquiry, including investigating biomass traits. We surveyed the phenotypic diversity in a large collection of inbred lines and then selected a core collection of lines for more detailed analysis with an emphasis on traits relevant to the use of grasses as biofuel and grain crops. Phenotypic characters examined included plant height, growth habit, stem density, flowering time, and seed weight. We also surveyed differences in cell wall composition using near infrared spectroscopy (NIR) and comprehensive microarray polymer profiling (CoMPP). In all cases, we observed extensive natural variation including a two-fold variation in stem density, four-fold variation in ferulic acid bound to hemicellulose, and 1.7-fold variation in seed mass. These characterizations can provide the criteria for selecting diverse lines for future investigations of the genetic basis of the observed phenotypic variation.
Visualization and Analytics Tools for Infectious Disease Epidemiology: A Systematic Review
Carroll, Lauren N.; Au, Alan P.; Detwiler, Landon Todd; Fu, Tsung-chieh; Painter, Ian S.; Abernethy, Neil F.
2014-01-01
Background: A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) Identify public health user needs and preferences for infectious disease information visualization tools; (2) Identify existing infectious disease information visualization tools and characterize their architecture and features; (3) Identify commonalities among approaches applied to different data types; and (4) Describe tool usability evaluation efforts and barriers to the adoption of such tools. Methods: We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. Results: A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications.
Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool use. Discussion and Conclusion: As the volume and complexity of infectious disease data increases, public health professionals must synthesize highly disparate data to facilitate communication with the public and inform decisions regarding measures to protect the public's health. Our review identified several themes: consideration of users' needs, preferences, and computer literacy; integration of tools into routine workflow; complications associated with understanding and use of visualizations; and the role of user trust and organizational support in the adoption of these tools. Interoperability also emerged as a prominent theme, highlighting challenges associated with the increasingly collaborative and interdisciplinary nature of infectious disease control and prevention. Future work should address methods for representing uncertainty and missing data to avoid misleading users as well as strategies to minimize cognitive overload. PMID:24747356
Visualization and analytics tools for infectious disease epidemiology: a systematic review.
Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F
2014-10-01
A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. 
Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool use. As the volume and complexity of infectious disease data increases, public health professionals must synthesize highly disparate data to facilitate communication with the public and inform decisions regarding measures to protect the public's health. Our review identified several themes: consideration of users' needs, preferences, and computer literacy; integration of tools into routine workflow; complications associated with understanding and use of visualizations; and the role of user trust and organizational support in the adoption of these tools. Interoperability also emerged as a prominent theme, highlighting challenges associated with the increasingly collaborative and interdisciplinary nature of infectious disease control and prevention. Future work should address methods for representing uncertainty and missing data to avoid misleading users as well as strategies to minimize cognitive overload. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Technology of machine tools. Volume 2. Machine tool systems management and utilization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomson, A.R.
1980-10-01
The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.
NASA Astrophysics Data System (ADS)
Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan
2004-05-01
We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low-frequency noise measurement setup with special high-current capabilities thanks to an accurate and original calibration. It also relies on a simulation tool based on the drift-diffusion equations and linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.
Molecular Characterization of Growth Hormone-producing Tumors in the GC Rat Model of Acromegaly.
Martín-Rodríguez, Juan F; Muñoz-Bravo, Jose L; Ibañez-Costa, Alejandro; Fernandez-Maza, Laura; Balcerzyk, Marcin; Leal-Campanario, Rocío; Luque, Raúl M; Castaño, Justo P; Venegas-Moreno, Eva; Soto-Moreno, Alfonso; Leal-Cerro, Alfonso; Cano, David A
2015-11-09
Acromegaly is a disorder resulting from excessive production of growth hormone (GH) and the consequent increase of insulin-like growth factor 1 (IGF-I), most frequently caused by pituitary adenomas. Elevated GH and IGF-I levels result in a wide range of somatic, cardiovascular, endocrine, metabolic, and gastrointestinal morbidities. Subcutaneous implantation of the GH-secreting GC cell line in rats leads to the formation of tumors. GC tumor-bearing rats develop characteristics that resemble human acromegaly, including gigantism and visceromegaly. However, GC tumors remain poorly characterized at the molecular level. In the present work, we report a detailed histological and molecular characterization of GC tumors using immunohistochemistry, molecular biology, and imaging techniques. GC tumors display histopathological and molecular features of human GH-producing tumors, including hormone production, cell architecture, senescence activation, and alterations in cell cycle gene expression. Furthermore, GC tumor cells displayed sensitivity to somatostatin analogues, drugs that are currently used in the treatment of human GH-producing adenomas, thus supporting the GC tumor model as a translational tool to evaluate therapeutic agents. The information obtained would help to maximize the usefulness of the GC rat model for research and preclinical studies in GH-secreting tumors.
Molecular Characterization of Growth Hormone-producing Tumors in the GC Rat Model of Acromegaly
Martín-Rodríguez, Juan F.; Muñoz-Bravo, Jose L.; Ibañez-Costa, Alejandro; Fernandez-Maza, Laura; Balcerzyk, Marcin; Leal-Campanario, Rocío; Luque, Raúl M.; Castaño, Justo P.; Venegas-Moreno, Eva; Soto-Moreno, Alfonso; Leal-Cerro, Alfonso; Cano, David A.
2015-01-01
Acromegaly is a disorder resulting from excessive production of growth hormone (GH) and the consequent increase of insulin-like growth factor 1 (IGF-I), most frequently caused by pituitary adenomas. Elevated GH and IGF-I levels result in a wide range of somatic, cardiovascular, endocrine, metabolic, and gastrointestinal morbidities. Subcutaneous implantation of the GH-secreting GC cell line in rats leads to the formation of tumors. GC tumor-bearing rats develop characteristics that resemble human acromegaly, including gigantism and visceromegaly. However, GC tumors remain poorly characterized at the molecular level. In the present work, we report a detailed histological and molecular characterization of GC tumors using immunohistochemistry, molecular biology, and imaging techniques. GC tumors display histopathological and molecular features of human GH-producing tumors, including hormone production, cell architecture, senescence activation, and alterations in cell cycle gene expression. Furthermore, GC tumor cells displayed sensitivity to somatostatin analogues, drugs that are currently used in the treatment of human GH-producing adenomas, thus supporting the GC tumor model as a translational tool to evaluate therapeutic agents. The information obtained would help to maximize the usefulness of the GC rat model for research and preclinical studies in GH-secreting tumors. PMID:26549306
Photoacoustic microscopy imaging for microneedle drug delivery
NASA Astrophysics Data System (ADS)
Moothanchery, Mohesh; Seeni, Razina Z.; Xu, Chenjie; Pramanik, Manojit
2018-02-01
The recent development of novel transdermal drug delivery systems (TDDS) using microneedle technology allows micron-sized conduits to be formed within the outermost skin layers, attracting keen interest in skin as an interface for localized and systemic delivery of therapeutics. In light of this, researchers are using microneedles as tools to deliver nanoparticle formulations to targeted sites for effective therapy. However, such studies employ traditional histological methods for characterization, which do not allow in vivo visualization of the drug delivery mechanism. Hence, this study presents a novel imaging technology to characterize microneedle-based nanoparticle delivery systems using optical resolution-photoacoustic microscopy (OR-PAM). In vivo transdermal delivery of gold nanoparticles using microneedles in the mouse ear was successfully illustrated, along with the spatial distribution of the nanoparticles in the tissue. Parameters relevant to drug delivery studies, such as penetration depth and the efficiency of gold nanoparticle delivery, were monitored using the system. Photoacoustic microscopy proves to be an ideal tool for characterizing microneedle properties, and the study shows that microneedles are well suited for precise and controlled drug delivery.
Enhanced project management tool
NASA Technical Reports Server (NTRS)
Hsu, Chen-Jung (Inventor); Patel, Hemil N. (Inventor); Maluf, David A. (Inventor); Moh Hashim, Jairon C. (Inventor); Tran, Khai Peter B. (Inventor)
2012-01-01
A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as one or more of a monthly report, a task plan report, a schedule report, a budget report and a risk management report, are generated and made available for display or further analysis or collection into a customized report template. An extensible database allows searching for information based upon context and upon content. Seven different types of project risks are addressed, including non-availability of required skill mix of workers. The system can be configured to exchange data and results with corresponding portions of similar project analyses, and to provide user-specific access to specified information.
Unifying Spectral and Timing Studies of Relativistic Reflection in Active Galactic Nuclei
NASA Astrophysics Data System (ADS)
Reynolds, Christopher
X-ray observations of active galactic nuclei (AGN) contain a wealth of information relevant for understanding the structure of AGN, the process of accretion, and the gravitational physics of supermassive black holes. A particularly exciting development over the past four years has been the discovery and subsequent characterization of time delays between variability of the X-ray power-law continuum and the inner disk reflection spectrum including the broad iron line. The fact that the broad iron line shows this echo, or reverberation, in XMM-Newton, Suzaku and NuSTAR data is a strong confirmation of the disk reflection paradigm and has already been used to place constraints on the extent and geometry of the X-ray corona. However, current studies of AGN X-ray variability, including broad iron line reverberation, are only scratching the surface of the available data. At the present time, essentially all studies conduct temporal analyses in a manner that is largely divorced from detailed spectroscopy; consistency between timing results (e.g., conclusions regarding the location of the primary X-ray source) and detailed spectral fits is examined after the fact. We propose to develop and apply new analysis tools for conducting a truly unified spectral-timing analysis of the X-ray properties of AGN. Operationally, this can be thought of as spectral fitting except with additional parameters that access the temporal properties of the dataset. Our first set of tools will be based on Fourier techniques (via the construction and fitting of the energy- and frequency-dependent cross-spectrum) and most readily applicable to long observations of AGN with XMM-Newton. Later, we shall develop more general schemes (of a more Bayesian nature) that can operate on irregularly sampled data or quasi-simultaneous data from multiple instruments. These shall be applied to the long joint XMM-Newton/NuSTAR and Suzaku/NuSTAR AGN campaigns as well as Swift monitoring campaigns.
Another important dimension of our work is the introduction of spectral and spectral-timing models of X-ray reflection from black hole disks that include realistic disk thickness (as opposed to the razor-thin disks assumed in current analysis tools). The astrophysical implications of our work are: - The first rigorous decomposition of the time lags into those from reverberation and those from intrinsic continuum processes. - A new method for determining the density of photoionized (warm) absorbers in AGN through a measurement of the recombination time lags. - AGN black hole mass estimates obtained purely from X-ray data, and hence complementary to (observationally expensive) optical broad line reverberation campaigns. - The best possible characterization of strong gravity signatures in the reflected disk emission. - Detection and characterization of non-trivial accretion disk structure. Each of our tools and data products will be made available to the community/public upon the publication of the first results with that tool. The proposed work is in direct support of the NASA Science Plan, and is of direct relevance and support to NASA's fleet of X-ray observatories.
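At its simplest, the cross-spectrum fitting described above reduces to measuring a frequency-dependent phase lag between two light curves; a minimal single-frequency sketch on synthetic, evenly sampled signals (the `delayed_sine` helper is purely illustrative, not part of the proposed tools):

```python
import cmath
import math

def dft_bin(signal, k):
    """Single DFT coefficient X[k] of an evenly sampled light curve."""
    n = len(signal)
    return sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
               for i, x in enumerate(signal))

def time_lag(x, y, k):
    """Lag of y relative to x, in samples, from the cross-spectrum phase at
    frequency bin k: C = conj(X[k]) * Y[k] has phase -2*pi*k*lag/n."""
    n = len(x)
    c = dft_bin(x, k).conjugate() * dft_bin(y, k)
    return -cmath.phase(c) * n / (2 * math.pi * k)

def delayed_sine(n, k, lag):
    """Illustrative light curve: a sinusoid at bin k, delayed by `lag` samples."""
    return [math.sin(2 * math.pi * k * (i - lag) / n) for i in range(n)]
```

Fitting the run of such lags with energy and frequency, rather than extracting them after the fact, is what distinguishes the unified spectral-timing approach.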
EJSCREEN: Environmental Justice Screening and Mapping Tool
EJSCREEN is an environmental justice screening and mapping tool that provides EPA and the public with a nationally consistent approach to characterizing potential areas that may warrant further consideration, analysis, or outreach.
Comprehensive benchmarking and ensemble approaches for metagenomic classifiers.
McIntyre, Alexa B R; Ounit, Rachid; Afshinnekoo, Ebrahim; Prill, Robert J; Hénaff, Elizabeth; Alexander, Noah; Minot, Samuel S; Danko, David; Foox, Jonathan; Ahsanuddin, Sofia; Tighe, Scott; Hasan, Nur A; Subramanian, Poorani; Moffat, Kelly; Levy, Shawn; Lonardi, Stefano; Greenfield, Nick; Colwell, Rita R; Rosen, Gail L; Mason, Christopher E
2017-09-21
One of the main challenges in metagenomics is the identification of microorganisms in clinical and environmental samples. While an extensive and heterogeneous set of computational tools is available to classify microorganisms using whole-genome shotgun sequencing data, comprehensive comparisons of these methods are limited. In this study, we use the largest-to-date set of laboratory-generated and simulated controls across 846 species to evaluate the performance of 11 metagenomic classifiers. Tools were characterized on the basis of their ability to identify taxa at the genus, species, and strain levels, quantify relative abundances of taxa, and classify individual reads to the species level. Strikingly, the number of species identified by the 11 tools can differ by over three orders of magnitude on the same datasets. Various strategies can ameliorate taxonomic misclassification, including abundance filtering, ensemble approaches, and tool intersection. Nevertheless, these strategies were often insufficient to completely eliminate false positives from environmental samples, which are especially important where they concern medically relevant species. Overall, pairing tools with different classification strategies (k-mer, alignment, marker) can combine their respective advantages. This study provides positive and negative controls, titrated standards, and a guide for selecting tools for metagenomic analyses by comparing ranges of precision, accuracy, and recall. We show that proper experimental design and analysis parameters can reduce false positives, provide greater resolution of species in complex metagenomic samples, and improve the interpretation of results.
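The tool-intersection and ensemble strategies evaluated here reduce, in their simplest form, to set operations over each classifier's reported taxa; a sketch with invented tool names and calls:

```python
from collections import Counter

def intersect_calls(calls_by_tool):
    """Taxa reported by every tool: conservative, fewest false positives."""
    sets = [set(calls) for calls in calls_by_tool.values()]
    return set.intersection(*sets)

def majority_vote(calls_by_tool, min_tools=2):
    """Ensemble: keep taxa reported by at least `min_tools` classifiers,
    trading a little of the intersection's precision for recall."""
    counts = Counter(t for calls in calls_by_tool.values() for t in set(calls))
    return {taxon for taxon, n in counts.items() if n >= min_tools}
```

Pairing tools with different classification strategies (k-mer, alignment, marker) before intersecting or voting is what lets their respective error modes cancel.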
Characterization of the honeybee AmNaV1 channel and tools to assess the toxicity of insecticides.
Gosselin-Badaroudine, Pascal; Moreau, Adrien; Delemotte, Lucie; Cens, Thierry; Collet, Claude; Rousset, Matthieu; Charnet, Pierre; Klein, Michael L; Chahine, Mohamed
2015-07-23
Pollination is important for both agriculture and biodiversity. For a significant number of plants, this process is highly, and sometimes exclusively, dependent on the pollination activity of honeybees. The large numbers of honeybee colony losses reported in recent years have been attributed to colony collapse disorder. Various hypotheses, including pesticide overuse, have been suggested to explain the disorder. Using the Xenopus oocyte expression system and the two-microelectrode voltage-clamp technique, we report the functional expression and the molecular, biophysical, and pharmacological characterization of the western honeybee's sodium channel (Apis mellifera NaV1). The NaV1 channel is the primary target for pyrethroid insecticides in insect pests. We further report that the honeybee's channel is also sensitive to permethrin and fenvalerate, respectively type I and type II pyrethroid insecticides. Molecular docking of these insecticides revealed a binding site that is similar to sites previously identified in other insects. We describe in vitro and in silico tools that can be used to test chemical compounds. Our findings could be used to assess the risks that current and next generation pesticides pose to honeybee populations.
General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark
2010-01-01
Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
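IMS's approach of characterizing nominal behavior from operations data and scoring real-time data by its deviation from that characterization can be caricatured with a single per-parameter min/max envelope; the real IMS learns many clusters of such ranges, so this is a rough sketch, not its actual algorithm:

```python
def learn_envelope(nominal_vectors):
    """Characterize normal system behavior as per-parameter [min, max] ranges
    learned from archived nominal operations data."""
    lows = [min(col) for col in zip(*nominal_vectors)]
    highs = [max(col) for col in zip(*nominal_vectors)]
    return lows, highs

def deviation(vector, envelope):
    """Anomaly score for a real-time data vector: total distance outside the
    nominal envelope (0.0 means the vector looks nominal)."""
    lows, highs = envelope
    return sum(max(lo - v, 0.0) + max(v - hi, 0.0)
               for v, lo, hi in zip(vector, lows, highs))
```

A monitoring loop would flag vectors whose score exceeds a tuned threshold as anomalous signatures worth an operator's attention.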
Explicit B-spline regularization in diffeomorphic image registration
Tustison, Nicholas J.; Avants, Brian B.
2013-01-01
Diffeomorphic mappings are central to image registration due largely to their topological properties and success in providing biologically plausible solutions to deformation and morphological estimation problems. Popular diffeomorphic image registration algorithms include those characterized by time-varying and constant velocity fields, and symmetrical considerations. Prior information in the form of regularization is used to enforce transform plausibility taking the form of physics-based constraints or through some approximation thereof, e.g., Gaussian smoothing of the vector fields [a la Thirion's Demons (Thirion, 1998)]. In the context of the original Demons' framework, the so-called directly manipulated free-form deformation (DMFFD) (Tustison et al., 2009) can be viewed as a smoothing alternative in which explicit regularization is achieved through fast B-spline approximation. This characterization can be used to provide B-spline “flavored” diffeomorphic image registration solutions with several advantages. Implementation is open source and available through the Insight Toolkit and our Advanced Normalization Tools (ANTs) repository. A thorough comparative evaluation with the well-known SyN algorithm (Avants et al., 2008), implemented within the same framework, and its B-spline analog is performed using open labeled brain data and open source evaluation tools. PMID:24409140
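The Gaussian-smoothing regularization referenced above (a la Thirion's Demons) amounts to convolving the displacement field with a Gaussian kernel after each update; a minimal 1D sketch with replicated edge handling (kernel width and field values are illustrative):

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized discrete Gaussian weights over [-radius, radius]."""
    w = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    s = sum(w)
    return [x / s for x in w]

def regularize(field, sigma=1.0, radius=2):
    """Demons-style regularization: smooth the displacement field so the
    estimated transform stays plausible (edge samples are replicated)."""
    k = gaussian_kernel(sigma, radius)
    n = len(field)
    return [sum(k[j + radius] * field[min(max(i + j, 0), n - 1)]
                for j in range(-radius, radius + 1)) for i in range(n)]
```

The DMFFD alternative discussed in the paper replaces this Gaussian convolution with a fast B-spline approximation of the field, which is what yields the B-spline "flavored" solutions.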
On the road to a stronger public health workforce: visual tools to address complex challenges.
Drehobl, Patricia; Stover, Beth H; Koo, Denise
2014-11-01
The public health workforce is vital to protecting the health and safety of the public, yet for years, state and local governmental public health agencies have reported substantial workforce losses and other challenges to the workforce that threaten the public's health. These challenges are complex, often involve multiple influencing or related causal factors, and demand comprehensive solutions. However, proposed solutions often focus on selected factors and might be fragmented rather than comprehensive. This paper describes approaches to characterizing the situation more comprehensively and includes two visual tools: (1) a fishbone, or Ishikawa, diagram that depicts multiple factors affecting the public health workforce; and (2) a roadmap that displays key elements (goals and strategies) to strengthen the public health workforce, thus moving from the problems depicted in the fishbone toward solutions. The visual tools aid thinking about ways to strengthen the public health workforce through collective solutions and to help leverage resources and build on each other's work. The strategic roadmap is intended to serve as a dynamic tool for partnership, prioritization, and gap assessment. These tools reflect and support CDC's commitment to working with partners on the highest priorities for strengthening the workforce to improve the public's health. Published by Elsevier Inc.
Top-attack modeling and automatic target detection using synthetic FLIR scenery
NASA Astrophysics Data System (ADS)
Weber, Bruce A.; Penn, Joseph A.
2004-09-01
A series of experiments have been performed to verify the utility of algorithmic tools for the modeling and analysis of cold-target signatures in synthetic, top-attack, FLIR video sequences. The tools include: MuSES/CREATION for the creation of synthetic imagery with targets, an ARL target detection algorithm to detect imbedded synthetic targets in scenes, and an ARL scoring algorithm, using Receiver-Operating-Characteristic (ROC) curve analysis, to evaluate detector performance. Cold-target detection variability was examined as a function of target emissivity, surrounding clutter type, and target placement in non-obscuring clutter locations. Detector metrics were also individually scored so as to characterize the effect of signature/clutter variations. Results show that using these tools, a detailed, physically meaningful, target detection analysis is possible and that scenario specific target detectors may be developed by selective choice and/or weighting of detector metrics. However, developing these tools into a reliable predictive capability will require the extension of these results to the modeling and analysis of a large number of data sets configured for a wide range of target and clutter conditions. Finally, these tools should also be useful for the comparison of competitive detection algorithms by providing well defined, and controllable target detection scenarios, as well as for the training and testing of expert human observers.
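The Receiver-Operating-Characteristic analysis used to score detector performance can be sketched minimally: sweep a detection threshold over detector scores, trace (false-positive rate, true-positive rate) points, and integrate the area under the curve. The internals of the actual ARL scoring algorithm are not given in the abstract; the code below is a generic ROC/AUC computation with illustrative names.

```python
# Generic ROC curve and AUC computation (not the ARL scoring algorithm).
def roc_points(scores, labels):
    """ROC points from detector scores and 0/1 truth labels.
    Sweeps the threshold from high to low; score ties are broken arbitrarily."""
    pairs = sorted(zip(scores, labels), reverse=True)
    P = sum(labels)            # number of true targets
    N = len(labels) - P        # number of non-targets (clutter)
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / N, tp / P))
    return pts

def auc(pts):
    """Area under the ROC curve by trapezoidal integration."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

if __name__ == "__main__":
    # A detector that ranks both targets above all clutter is perfect (AUC = 1).
    print(auc(roc_points([0.9, 0.8, 0.4, 0.2], [1, 1, 0, 0])))
```

Scoring individual detector metrics this way, as the abstract describes, amounts to computing one such curve per metric and comparing the resulting AUCs.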
Dietary screening tool identifies nutritional risk in older adults
Miller, Paige E; Mitchell, Diane C; Hartman, Terryl J; Lawrence, Frank R; Sempos, Christopher T; Smiciklas-Wright, Helen
2009-01-01
Background: No rapid methods exist for screening overall dietary intakes in older adults. Objective: The purpose of this study was to develop and evaluate a scoring system for a diet screening tool to identify nutritional risk in community-dwelling older adults. Design: This cross-sectional study in older adults (n = 204) who reside in rural areas examined nutrition status by using an in-person interview, biochemical measures, and four 24-h recalls that included the use of dietary supplements. Results: The dietary screening tool was able to characterize 3 levels of nutritional risk: at risk, possible risk, and not at risk. Individuals classified as at nutritional risk had significantly lower indicators of diet quality (Healthy Eating Index and Mean Adequacy Ratio) and intakes of protein, most micronutrients, dietary fiber, fruit, and vegetables. The at-risk group had higher intakes of fats and oils and refined grains. The at-risk group also had the lowest serum vitamin B-12, folate, β-cryptoxanthin, lutein, and zeaxanthin concentrations. The not-at-nutritional-risk group had significantly higher lycopene and β-carotene and lower homocysteine and methylmalonic acid concentrations. Conclusion: The dietary screening tool is a simple and practical tool that can help to detect nutritional risk in older adults. PMID:19458013
Microfabricated X-Ray Optics Technology Development for the Constellation-X Mission
NASA Technical Reports Server (NTRS)
Schattenburg, Mark L.
2003-01-01
During the period of this Cooperative Agreement, MIT developed advanced methods for applying silicon micro-structures to the precision assembly of foil x-ray optics in support of the Constellation-X Spectroscopy X-ray Telescope (SXT) development effort at Goddard Space Flight Center (GSFC). MIT developed improved methods for fabricating and characterizing the precision silicon micro-combs. MIT also developed and characterized assembly tools and several types of metrology tools in order to characterize and reduce the errors associated with precision assembly of foil optics. Results of this effort were published and presented to the scientific community and the GSFC SXT team.
Characterization of Cytokinetic Mutants Using Small Fluorescent Probes.
Smertenko, Andrei; Moschou, Panagiotis; Zhang, Laining; Fahy, Deirdre; Bozhkov, Peter
2016-01-01
Cytokinesis is a powerful paradigm for addressing fundamental questions of plant biology including molecular mechanisms of development, cell division, cell signaling, membrane trafficking, cell wall synthesis, and cytoskeletal dynamics. Genetics was instrumental in identification of proteins regulating cytokinesis. Characterization of mutant lines generated using forward or reverse genetics includes microscopic analysis for defects in cell division. Typically, failure of cytokinesis results in appearance of multinucleate cells, formation of cell wall stubs, and isotropic cell expansion in the root elongation zone. Small fluorescent probes served as a very effective tool for the detection of cytokinetic defects. Such probes stain living or formaldehyde-fixed specimens avoiding complex preparatory steps. Although resolution of the fluorescence probes is inferior to electron microscopy, the procedure is fast, easy, and does not require expensive materials or equipment. This chapter describes techniques for staining DNA with the probes DAPI and SYTO82, for staining membranes with FM4-64, and for staining cell wall with propidium iodide.
Simbaqueba, Jaime; Sánchez, Pilar; Sanchez, Erika; Núñez Zarantes, Victor Manuel; Chacon, Maria Isabel; Barrero, Luz Stella; Mariño-Ramírez, Leonardo
2011-01-01
Physalis peruviana, commonly known as Cape gooseberry, is an Andean Solanaceae fruit with high nutritional value and interesting medicinal properties. In the present study we report the development and characterization of microsatellite loci from a P. peruviana commercial Colombian genotype. We identified 932 imperfect and 201 perfect Simple Sequence Repeats (SSR) loci in untranslated regions (UTRs) and 304 imperfect and 83 perfect SSR loci in coding regions from the assembled Physalis peruviana leaf transcriptome. The UTR SSR loci were used for the development of 162 primers for amplification. The efficiency of these primers was tested via PCR in a panel of seven P. peruviana accessions including Colombia, Kenya and Ecuador ecotypes and one closely related species Physalis floridana. We obtained an amplification rate of 83% and a polymorphic rate of 22%. Here we report the first P. peruviana specific microsatellite set, a valuable tool for a wide variety of applications, including functional diversity, conservation and improvement of the species. PMID:22039540
NASA Astrophysics Data System (ADS)
Geertsema, Marten; Blais-Stevens, Andrée; Kwoll, Eva; Menounos, Brian; Venditti, Jeremy G.; Grenier, Alain; Wiebe, Kelsey
2018-02-01
The Lakelse Lake area in northwestern British Columbia, Canada, has a long history, and prehistory, of rapid sensitive clay landslides moving on very low gradients. However, until now, many landslides have gone undetected. We use an array of modern tools to identify hitherto unknown or poorly known landslide deposits, including acoustic subbottom profiles, multibeam sonar, and LiDAR. The combination of these methods reveals not only landslide deposits, but also geomorphic and sedimentologic structures that give clues about landslide type and mode of emplacement. LiDAR and bathymetric data reveal the areal extent of landslide deposits as well as the orientation of ridges that differentiate between spreading and flowing kinematics. The subbottom profiles show two-dimensional structures of disturbed landslide deposits, including horsts and grabens indicative of landslides classified as spreads. A preliminary computer tomography (CT) scan of a sediment core confirms the structures of one subbottom profile. We also use archival data from the Ministry of Transportation and Infrastructure and resident interviews to better characterize historic landslides.
Plan for conducting an international machine tool task force
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sutton, G.P.; McClure, E.R.; Schuman, J.F.
1978-08-28
The basic objectives of the Machine Tool Task Force (MTTF) are to characterize and summarize the state of the art of cutting machine tool technology and to identify promising areas of future R and D. These goals will be accomplished with a series of multidisciplinary teams of prominent experts and individuals experienced in the specialized technologies of machine tools or in the management of machine tool operations. Experts will be drawn from all areas of the machine tool community: machine tool users or buyer organizations, builders, and R and D establishments including universities and government laboratories, both domestic and foreign. A plan for accomplishing this task is presented. The area of machine tool technology has been divided into about two dozen technology subjects on which teams of one or more experts will work. These teams are, in turn, organized into four principal working groups dealing, respectively, with machine tool accuracy, mechanics, control, and management systems/utilization. Details are presented on specific subjects to be covered, the organization of the Task Force and its four working groups, and the basic approach to determining the state of the art of technology and the future directions of this technology. The planned review procedure, the potential benefits, our management approach, and the schedule, as well as the key participating personnel and their background, are discussed. The initial meeting of MTTF members will be held at a plenary session on October 16 and 17, 1978, in Scottsdale, AZ. The MTTF study will culminate in a conference on September 1, 1980, in Chicago, IL, immediately preceding the 1980 International Machine Tool Show. At this time, our results will be released to the public; a series of reports will be published in late 1980.
NASA Astrophysics Data System (ADS)
Perez-Moreno, Javier
2015-09-01
Understanding the fundamental mechanisms behind the radiation resistance of polymers and molecules would allow us to tailor new materials with enhanced performance in space and adverse environments. Previous studies of radiation effects on polymer-based photonic materials indicate that they are very dependent on the choice of polymer host and guest chromophores. The best results have been reported for the combination of CLD1 as a guest chromophore doped in APC as the host polymer, where improvement of the performance was observed upon gamma irradiation at moderate doses. In this paper, we report on the complementary tools used to characterize the origin of this enhancement: characterization of the linear and nonlinear response, characterization of chemical properties, and application of an all-optical protocol. We derive some general conclusions by contrasting the results of each characterization, and propose complementary experiments based on microscopy techniques.
Hamad, Rita; Modrek, Sepideh; Kubo, Jessica; Goldstein, Benjamin A; Cullen, Mark R
2015-01-01
Investigators across many fields often struggle with how best to capture an individual's overall health status, with options including both subjective and objective measures. With the increasing availability of "big data," researchers can now take advantage of novel metrics of health status. These predictive algorithms were initially developed to forecast and manage expenditures, yet they represent an underutilized tool that could contribute significantly to health research. In this paper, we describe the properties and possible applications of one such "health risk score," the DxCG Intelligence tool. We link claims and administrative datasets on a cohort of U.S. workers during the period 1996-2011 (N = 14,161). We examine the risk score's association with incident diagnoses of five disease conditions, and we link employee data with the National Death Index to characterize its relationship with mortality. We review prior studies documenting the risk score's association with other health and non-health outcomes, including healthcare utilization, early retirement, and occupational injury. We find that the risk score is associated with outcomes across a variety of health and non-health domains. These examples demonstrate the broad applicability of this tool in multiple fields of research and illustrate its utility as a measure of overall health status for epidemiologists and other health researchers.
Defining a Computational Framework for the Assessment of ...
The Adverse Outcome Pathway (AOP) framework describes the effects of environmental stressors across multiple scales of biological organization and function. This includes an evaluation of the potential for each key event to occur across a broad range of species in order to determine the taxonomic applicability of each AOP. Computational tools are needed to facilitate this process. Recently, we developed a tool that uses sequence homology to evaluate the applicability of molecular initiating events across species (Lalone et al., Toxicol. Sci., 2016). To extend our ability to make computational predictions at higher levels of biological organization, we have created the AOPdb. This database links molecular targets associated with key events in the AOPwiki to publicly available data (e.g., gene-protein, pathway, species orthology, ontology, chemical, disease), including ToxCast assay information. The AOPdb combines different data types in order to characterize the impacts of chemicals on human health and the environment and serves as a decision support tool for case study development in the area of taxonomic applicability. As a proof of concept, the AOPdb allows identification of relevant molecular targets, biological pathways, and chemical and disease associations across species for four AOPs from the AOP-Wiki (https://aopwiki.org): Estrogen receptor antagonism leading to reproductive dysfunction (Aop:30); Aromatase inhibition leading to reproductive d
Using music as a therapy tool to motivate troubled adolescents.
Keen, Alexander W
2004-01-01
Children and adolescents with emotional disorders may often be characterized by having problems in peer and adult relations and in display of inappropriate behaviours. These include suicide attempts, anger, withdrawal from family, social isolation from peers, aggression, school failure, running away, and alcohol and/or drug abuse. A lack of self-concept and self-esteem is often central to these difficulties. Traditional treatment methods with young people usually include cognitive-behavioural approaches with psychotherapy. Unfortunately, these children often lack a solid communication base, creating a block to successful treatment. In my private clinical practice, I have endeavoured to break through these communication barriers by using music as a therapy tool. This paper describes and discusses my use of music as a therapy tool with troubled adolescents. Pre- and post-testing of the effectiveness of this intervention technique by using the Psychosocial Functioning Inventory for Primary School Children (PFI-PSC) has yielded positive initial results, lending support to its continued use. Music has often been successful in helping these adolescents engage in the therapeutic process with minimised resistance, as they relate to the music and the therapist becomes a safe and trusted adult. Various techniques are employed, such as song discussion, listening, writing lyrics, composing music, and performing music.
Near Real Time Tools for ISS Plasma Science and Engineering Applications
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Willis, Emily M.; Parker, Linda Neergaard; Shim, Ja Soon; Kuznetsova, Maria M.; Pulkkinen, Antti A.
2013-01-01
The International Space Station (ISS) program utilizes a plasma environment forecast for estimating electrical charging hazards for crews during extravehicular activity (EVA). The process uses ionospheric electron density (Ne) and temperature (Te) measurements from the ISS Floating Potential Measurement Unit (FPMU) instrument suite, with the assumption that the plasma conditions will remain constant for one to fourteen days and with a low probability of a space weather event that would significantly change the environment before an EVA. FPMU data is typically not available during EVAs; therefore, the most recent FPMU data available for characterizing the state of the ionosphere during an EVA is typically from a day or two before the start of the EVA, or from after the EVA has been completed. Three near real time space weather tools under development for ISS applications are described here, including: (a) Ne from ground-based ionosonde measurements of foF2; (b) Ne from near real time satellite radio occultation measurements of electron density profiles; (c) Ne and Te from a physics-based ionosphere model. These applications are used to characterize the ISS space plasma environment during EVA periods when FPMU data is not available, to monitor for large changes in ionosphere density that could render the ionosphere forecast and plasma hazard assessment invalid, and to validate the "persistence of conditions" forecast assumption. In addition, the tools are useful for providing space environment input to science payloads on ISS and to anomaly investigations during periods when the FPMU is not operating.
Cruciani, Federica; Biagi, Elena; Severgnini, Marco; Consolandi, Clarissa; Calanni, Fiorella; Donders, Gilbert; Brigidi, Patrizia
2015-01-01
The healthy vaginal microbiota is generally dominated by lactobacilli that confer antimicrobial protection and play a crucial role in health. Bacterial vaginosis (BV) is the most prevalent lower genital tract infection in women in reproductive age and is characterized by a shift in the relative abundances of Lactobacillus spp. to a greater abundance of strictly anaerobic bacteria. In this study, we designed a new phylogenetic microarray-based tool (VaginArray) that includes 17 probe sets specific for the most representative bacterial groups of the human vaginal ecosystem. This tool was implemented using the ligase detection reaction-universal array (LDR-UA) approach. The entire probe set properly recognized the specific targets and showed an overall sensitivity of 6 to 12 ng per probe. The VaginArray was applied to assess the efficacy of rifaximin vaginal tablets for the treatment of BV, analyzing the vaginal bacterial communities of 22 BV-affected women treated with rifaximin vaginal tablets at a dosage of 25 mg/day for 5 days. Our results showed the ability of rifaximin to reduce the growth of various BV-related bacteria (Atopobium vaginae, Prevotella, Megasphaera, Mobiluncus, and Sneathia spp.), with the highest antibiotic susceptibility for A. vaginae and Sneathia spp. Moreover, we observed an increase of Lactobacillus crispatus levels in the subset of women who maintained remission after 1 month of therapy, opening new perspectives for the treatment of BV. PMID:25733514
PyRhO: A Multiscale Optogenetics Simulation Platform
Evans, Benjamin D.; Jarvis, Sarah; Schultz, Simon R.; Nikolic, Konstantin
2016-01-01
Optogenetics has become a key tool for understanding the function of neural circuits and controlling their behavior. An array of directly light driven opsins have been genetically isolated from several families of organisms, with a wide range of temporal and spectral properties. In order to characterize, understand and apply these opsins, we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. This process may thereby guide not only experimental design and opsin choice but also alterations of the opsin genetic code in a neuro-engineering feed-back loop. In this way, we expect PyRhO will help to significantly advance optogenetics as a tool for transforming biological sciences. PMID:27148037
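A minimal sketch of the three-state photocycle model mentioned above (closed, open, desensitized populations) integrated with forward Euler can convey what PyRhO fits and simulates. The rate constants, pulse protocol, and function names below are illustrative placeholders, not PyRhO's fitted parameters or its API.

```python
# Three-state opsin kinetics sketch: closed -> open -> desensitized -> closed.
# Rate constants (per second) are illustrative, not PyRhO's fitted values.
def simulate_three_state(light_on, dt=1e-4, t_end=0.5,
                         ka=50.0, kd=10.0, kr=5.0):
    """Forward-Euler integration of the photocycle.
    ka: light-driven activation (acts only while illuminated),
    kd: open -> desensitized, kr: desensitized -> closed recovery.
    Returns a list of (t, C, O, D) state samples."""
    C, O, D = 1.0, 0.0, 0.0
    t, trace = 0.0, []
    while t < t_end:
        act = ka * C if light_on(t) else 0.0
        dC = kr * D - act
        dO = act - kd * O
        dD = kd * O - kr * D
        C += dC * dt
        O += dO * dt
        D += dD * dt
        trace.append((t, C, O, D))
        t += dt
    return trace

if __name__ == "__main__":
    # 200 ms light pulse; the open-state fraction O is a proxy for photocurrent.
    trace = simulate_three_state(lambda t: t < 0.2)
    print(max(o for _, _, o, _ in trace))
```

Fitting such a model, as PyRhO does automatically, means adjusting the rate constants until the simulated open-state trace matches recorded photocurrents; the four- and six-state variants add states and light-dependent rates but keep this structure.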
Diagnosing schistosomiasis: where are we?
Gomes, Luciana Inácia; Enk, Martin Johannes; Rabello, Ana
2014-01-01
In light of the World Health Organization's initiative to extend schistosomiasis morbidity and mortality control programs by including a disease elimination strategy in low endemic settings, this paper reviews diagnostic tools described during the last decades and provides an overview of ongoing efforts in making an efficient diagnostic tool available worldwide. A literature search on PubMed using the search criteria schistosomiasis and diagnosis within the period from 1978 to 2013 was carried out. Articles with an abstract in English and that used laboratory techniques specifically developed for the detection of schistosomiasis in humans were included. Publications were categorized according to the methodology applied (parasitological, immunological, or molecular) and stage of development (in-house development, limited field testing, or large scale field testing). The initial search generated 4,535 publications, of which only 643 met the inclusion criteria. The vast majority (537) of the publications focused on immunological techniques; 81 focused on parasitological diagnosis, and 25 focused on molecular diagnostic methods. Regarding the stage of development, 307 papers referred to in-house development, 202 referred to limited field tests, and 134 referred to large scale field testing. The data obtained show that promising new diagnostic tools, especially for Schistosoma antigen and deoxyribonucleic acid (DNA) detection, which are characterized by high sensitivity and specificity, are being developed. In combination with international funding initiatives, these tools may result in a significant step forward in successful disease elimination and surveillance, which is to make efficient tests accessible and their large scale use self-sustainable for control programs in endemic countries.
Elmasri, Wael A; Zhu, Rui; Peng, Wenjing; Al-Hariri, Moustafa; Kobeissy, Firas; Tran, Phat; Hamood, Abdul N; Hegazy, Mohamed F; Paré, Paul W; Mechref, Yehia
2017-07-07
Growth inhibition of the pathogen Staphylococcus aureus with currently available antibiotics is problematic in part due to bacterial biofilm protection. Although recently characterized natural products, including 3',4',5-trihydroxy-6,7-dimethoxy-flavone [1], 3',4',5,6,7-pentahydroxy-flavone [2], and 5-hydroxy-4',7-dimethoxy-flavone [3], exhibit both antibiotic and biofilm inhibitory activities, the mode of action of such hydroxylated flavonoids with respect to S. aureus inhibition is yet to be characterized. Enzymatic digestion and high-resolution MS analysis of differentially expressed proteins from S. aureus with and without exposure to antibiotic flavonoids (1-3) allowed for the characterization of global protein alterations induced by metabolite treatment. A total of 56, 92, and 110 proteins were differentially expressed with bacterial exposure to 1, 2, or 3, respectively. The connectivity of the identified proteins was characterized using a search tool for the retrieval of interacting genes/proteins (STRING) with multitargeted S. aureus inhibition of energy metabolism and biosynthesis by the assayed flavonoids. Identifying the mode of action of natural products as antibacterial agents is expected to provide insight into the potential use of flavonoids alone or in combination with known therapeutic agents to effectively control S. aureus infection.
The EPIC-MOS Particle-Induced Background Spectra
NASA Technical Reports Server (NTRS)
Kuntz, K. D.; Snowden, S. L.
2007-01-01
In order to analyse diffuse emission that fills the field of view, one must accurately characterize the instrumental backgrounds. For the XMM-Newton EPIC instrument these backgrounds include a temporally variable "quiescent" component, as well as the strongly variable soft proton contamination. We have characterized the spectral and spatial response of the EPIC detectors to these background components and have developed tools to remove these backgrounds from observations. The "quiescent" component was characterized using a combination of the filter-wheel-closed data and a database of unexposed-region data. The soft proton contamination was characterized by differencing images and spectra taken during flared and flare-free intervals. After application of our modeled backgrounds, the differences between independent observations of the same region of "blank sky" are consistent with the statistical uncertainties except when there is clear spectral evidence of solar wind charge exchange (SWCX) emission. Using a large sample of blank sky data, we show that strong magnetospheric SWCX emission requires elevated solar wind fluxes; observations through the densest part of the magnetosheath are not necessarily strongly contaminated with SWCX emission.
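The flare-differencing step described above can be sketched as exposure-normalized spectrum subtraction: convert each count spectrum to a rate using its exposure time, then subtract the flare-free interval from the flared one. The function below is a hypothetical simplification of that idea, not the actual XMM-Newton EPIC analysis tools.

```python
# Sketch of soft-proton background estimation by interval differencing
# (illustrative only; not the EPIC background-modeling software).
def soft_proton_spectrum(flared_counts, quiet_counts, exp_flared, exp_quiet):
    """Per-channel count rate excess of the flared interval over the
    flare-free interval, after normalizing each spectrum by its exposure."""
    return [f / exp_flared - q / exp_quiet
            for f, q in zip(flared_counts, quiet_counts)]

if __name__ == "__main__":
    # two spectral channels; exposures in seconds
    print(soft_proton_spectrum([100.0, 60.0], [40.0, 30.0], 10.0, 20.0))
```

The residual rate spectrum isolates the component present only during flares, which is attributed to soft protons; the quiescent component, by contrast, must be built from filter-wheel-closed and unexposed-region data as the abstract notes.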
Improving Planetary Rover Attitude Estimation via MEMS Sensor Characterization
Hidalgo, Javier; Poulakis, Pantelis; Köhler, Johan; Del-Cerro, Jaime; Barrientos, Antonio
2012-01-01
Micro Electro-Mechanical Systems (MEMS) are currently being considered in the space sector due to their suitable level of performance for spacecraft in terms of mechanical robustness with low power consumption, small mass and size, and significant advantages in system design and accommodation. However, there is still a lack of understanding regarding the performance and testing of these new sensors, especially in planetary robotics. This paper presents what is missing in the field: a complete methodology regarding the characterization and modeling of MEMS sensors with direct application. A reproducible and complete approach including all the intermediate steps, tools and laboratory equipment is described. The process of sensor error characterization and modeling through to the final integration in the sensor fusion scheme is explained in detail. Although the concept of fusion is relatively easy to comprehend, carefully characterizing and filtering sensor information is not an easy task and is essential for good performance. The strength of the approach has been verified with representative tests of novel high-grade MEMS inertial sensors and exemplary planetary rover platforms with promising results. PMID:22438761
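One common building block in MEMS inertial sensor error characterization is the Allan variance, which separates noise terms by averaging time (white noise shows a falling slope of Allan deviation versus averaging time, bias instability a flat region). The abstract does not specify the paper's exact pipeline, so the sketch below is a generic non-overlapping Allan variance computation with illustrative names.

```python
# Generic non-overlapping Allan variance sketch for static sensor logs
# (illustrative; the paper's actual characterization pipeline may differ).
def allan_variance(samples, m, dt):
    """Allan variance at averaging time tau = m * dt.
    samples: raw sensor readings logged at a fixed rate (period dt),
    m: cluster size in samples. Returns (tau, allan_variance)."""
    k = len(samples) // m
    if k < 2:
        raise ValueError("need at least two clusters of size m")
    # average consecutive blocks of m samples, then difference neighbors
    means = [sum(samples[i * m:(i + 1) * m]) / m for i in range(k)]
    diffs = [(b - a) ** 2 for a, b in zip(means, means[1:])]
    return m * dt, sum(diffs) / (2 * len(diffs))
```

Computing this over a sweep of cluster sizes m and plotting Allan deviation against tau on log-log axes yields the curve from which noise parameters are read off and fed into the filter's sensor model.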
This page provides access to a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases,
NASA Astrophysics Data System (ADS)
Robert-Perron, Etienne; Blais, Carl; Pelletier, Sylvain; Thomas, Yannig
2007-06-01
The green machining process is an interesting approach for solving the mediocre machining behavior of high-performance powder metallurgy (PM) steels. This process appears to be a promising method for extending tool life and reducing machining costs. Recent improvements in binder/lubricant technologies have led to high green strength systems that enable green machining. So far, tool wear has been considered negligible when characterizing the machinability of green PM specimens. This inaccurate assumption may lead to the selection of suboptimum cutting conditions. The first part of this study involves the optimization of the machining parameters to minimize the effects of tool wear on the machinability in turning of green PM components. The second part of our work compares the sintered mechanical properties of components machined in the green state with others machined after sintering.
Diffusion-weighted Breast MRI: Clinical Applications and Emerging Techniques
Partridge, Savannah C.; Nissan, Noam; Rahbar, Habib; Kitsch, Averi E.; Sigmund, Eric E.
2016-01-01
Diffusion-weighted MRI (DWI) holds potential to improve the detection and biological characterization of breast cancer. DWI is increasingly being incorporated into breast MRI protocols to address some of the shortcomings of routine clinical breast MRI. Potential benefits include improved differentiation of benign and malignant breast lesions, assessment and prediction of therapeutic efficacy, and non-contrast detection of breast cancer. The breast presents a unique imaging environment with significant physiologic and inter-subject variations, as well as specific challenges to achieving reliable, high-quality diffusion-weighted MR images. Technical innovations are helping to overcome many of the image quality issues that have limited widespread use of DWI for breast imaging. Advanced modeling approaches to further characterize tissue perfusion, complexity, and glandular organization may expand knowledge and yield improved diagnostic tools. PMID:27690173
The Utility of Nanopore Technology for Protein and Peptide Sensing.
Robertson, Joseph W F; Reiner, Joseph E
2018-06-28
Resistive-pulse nanopore sensing enables label-free single-molecule analysis of a wide range of analytes. An increasing number of studies have demonstrated the feasibility and usefulness of nanopore sensing for protein and peptide characterization. Nanopores offer the potential to study a variety of protein-related phenomena that include unfolding kinetics, differences in unfolding pathways, protein structure stability, and free energy profiles of DNA-protein and RNA-protein binding. In addition to providing a tool for fundamental protein characterization, nanopores have also been used as highly selective protein detectors in various solution mixtures and conditions. This review highlights these and other developments in the area of nanopore-based protein and peptide detection. This article is protected by copyright. All rights reserved.
Synthesis and characterization of non-hydrolysable diphosphoinositol polyphosphate second messengers
Wu, Mingxuan; Dul, Barbara E.; Trevisan, Alexandra J.; Fiedler, Dorothea
2012-01-01
The diphosphoinositol polyphosphates (PP-IPs) are a central group of eukaryotic second messengers. They regulate numerous processes, including cellular energy homeostasis and adaptation to environmental stresses. To date, most of the molecular details in PP-IP signalling have remained elusive, due to a lack of appropriate methods and reagents. Here we describe the expedient synthesis of methylene-bisphosphonate PP-IP analogues. Their characterization revealed that the analogues exhibit significant stability and mimic their natural counterparts very well. This was further confirmed in two independent biochemical assays, in which our analogues potently inhibited phosphorylation of the protein kinase Akt and hydrolytic activity of the Ddp1 phosphohydrolase. The non-hydrolysable PP-IPs thus emerge as important tools and hold great promise for a variety of applications. PMID:23378892
Losh, Molly; Gordon, Peter C.
2014-01-01
Autism Spectrum Disorder (ASD) is characterized by difficulties with social communication and functioning, and ritualistic/repetitive behaviors (American Psychiatric Association, 2013). While substantial heterogeneity exists in symptom expression, impairments in language discourse skills, including narrative, are universally observed (Tager-Flusberg, Paul, & Lord, 2005). This study applied a computational linguistic tool, Latent Semantic Analysis (LSA), to objectively characterize narrative performance in ASD across two narrative contexts differing in interpersonal and cognitive demands. Results indicated that individuals with ASD produced narratives comparable in semantic content to those from controls when narrating from a picture book, but produced narratives diminished in semantic quality in a more demanding narrative recall task. Results are discussed in terms of the utility of LSA as a quantitative, objective, and efficient measure of narrative ability. PMID:24915929
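The semantic-similarity core of approaches like the LSA analysis above can be sketched simply. Full LSA projects term-frequency vectors through an SVD of a large corpus matrix; as a simplified stand-in, the scoring step reduces to a cosine between document vectors. A minimal sketch (the toy documents and the bag-of-words shortcut are illustrative, not from the study):

```python
import math
from collections import Counter

def term_vector(text):
    # bag-of-words term-frequency vector (full LSA would project
    # this through an SVD-derived latent space first)
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    vocab = set(a) | set(b)
    dot = sum(a[t] * b[t] for t in vocab)  # Counter returns 0 for missing terms
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# score a participant's narrative against a reference narrative
narrative = term_vector("the frog looked in the jar")
reference = term_vector("the frog looked inside the jar")
score = cosine_similarity(narrative, reference)
```

A similarity near 1 indicates semantic content comparable to the reference; lower scores flag diminished semantic quality of the kind the study reports for the demanding recall task.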
Young, Carissa L; Britton, Zachary T; Robinson, Anne S
2012-05-01
Protein fusion tags are indispensable tools used to improve recombinant protein expression yields, enable protein purification, and accelerate the characterization of protein structure and function. Solubility-enhancing tags, genetically engineered epitopes, and recombinant endoproteases have resulted in a versatile array of combinatorial elements that facilitate protein detection and purification in microbial hosts. In this comprehensive review, we evaluate the most frequently used solubility-enhancing and affinity tags. Furthermore, we provide summaries of well-characterized purification strategies that have been used to increase product yields and have widespread application in many areas of biotechnology including drug discovery, therapeutics, and pharmacology. This review serves as an excellent literature reference for those working on protein fusion tags. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Spotting the differences in two-dimensional materials - the Raman scattering perspective.
Zhang, Shishu; Zhang, Na; Zhao, Yan; Cheng, Ting; Li, Xiaobo; Feng, Rui; Xu, Hua; Liu, Zhirong; Zhang, Jin; Tong, Lianming
2018-05-08
Two-dimensional (2D) layered materials have attracted tremendous attention and led to rapid development in both fundamental investigation and device applications in various fields, such as nanoelectronics, flexible devices, sustainable energy and catalysis. Precise characterization of the structure and properties of 2D materials is urgently needed. Raman scattering spectroscopy is one of the most popular characterization tools: it is convenient, rapid and non-invasive. It provides information on both the lattice structure, from the frequency of phonon modes, and the electronic band structure, through intensity changes due to electronic resonance Raman scattering. Although a few morphological characterization tools can image 2D materials with atomic resolution, Raman scattering measurements are more tolerant of sample-preparation conditions such as the substrate, are less technically demanding, and have become one of the routine tools for the characterization of 2D materials. In this review, we focus on the characterization of 2D materials using Raman scattering spectroscopy, in particular on revealing differences from pristine 2D materials, such as defects, doping effects, van der Waals heterostructures and interactions with molecules. The characteristic Raman features of such differences and the corresponding interpretation will be discussed. We hope that this review will be useful for the wide research communities of materials science, physics, chemistry and engineering.
NASA Astrophysics Data System (ADS)
Huang, Chien-Jung; White, Susan; Huang, Shao-Ching; Mallya, Sanjay; Eldredge, Jeff
2016-11-01
Obstructive sleep apnea (OSA) is a medical condition characterized by repetitive partial or complete occlusion of the airway during sleep. The soft tissues in the upper airway of OSA patients are prone to collapse under the low pressure loads incurred during breathing. The ultimate goal of this research is the development of a versatile numerical tool for simulation of air-tissue interactions in patient-specific upper airway geometries. This tool is expected to capture several phenomena, including flow-induced vibration (snoring) and large deformations during airway collapse of the complex airway geometry in respiratory flow conditions. Here, we present our ongoing progress toward this goal. To avoid mesh regeneration, the flow model uses a sharp-interface embedded boundary method on Cartesian grids to resolve the fluid-structure interface, while the structural model uses a cut-cell finite element method. To properly resolve large displacements, a non-linear elasticity model is used. The fluid and structure solvers are connected by a strongly coupled iterative algorithm. Parallel computation is achieved with the numerical library PETSc. Some two- and three-dimensional preliminary results are shown to demonstrate the capabilities of this tool.
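The strongly coupled iteration mentioned above can be illustrated on a toy one-degree-of-freedom system: within each step, the fluid and structure solves are repeated, with under-relaxation, until the interface state stops changing. The linear fluid-load and spring models below are illustrative stand-ins, not the paper's solvers:

```python
def coupled_solve(a=1.0, b=0.5, k=2.0, omega=0.5, tol=1e-10, max_iter=200):
    """Fixed-point FSI iteration on a 1-DOF toy problem.

    Toy fluid model:     load f = a - b * d  (load drops as the wall deflects)
    Toy structure model: deflection d = f / k  (linear spring)
    omega: under-relaxation factor for stability of the coupling loop.
    Exact coupled solution: d* = a / (k + b).
    """
    d = 0.0
    for _ in range(max_iter):
        f = a - b * d             # "fluid" solve at the current deflection
        d_new = f / k             # "structure" solve under the current load
        if abs(d_new - d) < tol:  # interface state has converged
            return d_new
        d += omega * (d_new - d)  # relaxed interface update
    return d
```

Without the relaxation step, strongly coupled loops like this can diverge when the added-mass effect of the fluid on the tissue is large, which is why iterating to convergence at each time step matters for collapsing-airway problems.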
Mission Systems Open Architecture Science and Technology (MOAST) program
NASA Astrophysics Data System (ADS)
Littlejohn, Kenneth; Rajabian-Schwart, Vahid; Kovach, Nicholas; Satterthwaite, Charles P.
2017-04-01
The Mission Systems Open Architecture Science and Technology (MOAST) program is an AFRL effort that is developing and demonstrating Open System Architecture (OSA) component prototypes, along with methods and tools, to strategically evolve current OSA standards and technical approaches, promote affordable capability evolution, reduce integration risk, and address emerging challenges [1]. Within the context of open architectures, the program is conducting advanced research and concept development in the following areas: (1) Evolution of standards; (2) Cyber-resiliency; (3) Emerging concepts and technologies; (4) Risk reduction studies and experimentation; and (5) Advanced technology demonstrations. Current research includes the development of methods, tools, and techniques to characterize the performance of OMS data interconnection methods for representative mission system applications; of particular interest are the OMS Critical Abstraction Layer (CAL), the Avionics Service Bus (ASB), and the Bulk Data Transfer interconnects. The program also aims to develop and demonstrate cybersecurity countermeasure techniques to detect and mitigate cyberattacks against open-architecture-based mission systems and ensure continued mission operations. Focus is on cybersecurity techniques that augment traditional cybersecurity controls and those currently defined within the Open Mission Systems and UCI standards. AFRL is also developing code generation tools and simulation tools to support evaluation and experimentation of OSA-compliant implementations.
Wang, Fan; Wang, Xiangzhao; Ma, Mingying
2006-08-20
As feature sizes decrease, degradation of image quality caused by wavefront aberrations of the projection optics in lithographic tools has become a serious problem in the low-k1 process. We propose a novel measurement technique for in situ characterization of the aberrations of projection optics in lithographic tools. Considering the impact of partially coherent illumination, we introduce a novel algorithm that accurately describes the pattern displacement and focus shift induced by aberrations. Employing this algorithm, the measurement condition is extended from three-beam interference to two-, three-, and hybrid-beam interference. Experiments were performed to measure the aberrations of the projection optics in an ArF scanner.
Developing Decontamination Tools and Approaches to Address Indoor Pesticide Contamination from Improper Bed Bug Treatments
The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify and characterize the processes that link source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for the EPA.
Microstructural Characterization of Friction Stir Welded Aluminum-Steel Joints
NASA Astrophysics Data System (ADS)
Patterson, Erin E.; Hovanski, Yuri; Field, David P.
2016-06-01
This work focuses on the microstructural characterization of aluminum-to-steel friction stir welded joints. A lap weld configuration coupled with scribe technology for the weld tool has produced joints of adequate quality, despite the significant differences in hardness and melting temperature of the alloys. Common to friction stir processes, especially those joining dissimilar alloys, are microstructural gradients including grain size, crystallographic texture, and precipitation of intermetallic compounds. Because of the significant influence that intermetallic compound formation has on mechanical and ballistic behavior, the characterization of the specific intermetallic phases and the degree to which they form in the weld microstructure is critical to predicting weld performance. This study used electron backscatter diffraction, energy dispersive spectroscopy, scanning electron microscopy, and Vickers micro-hardness indentation to explore and characterize the microstructures of lap friction stir welds between an applique 6061-T6 aluminum armor plate alloy and a rolled homogeneous armor (RHA) steel plate alloy. Macroscopic defects such as micro-cracks were observed in the cross-sectional samples, and binary intermetallic compound layers were found at the aluminum-steel interfaces of the steel particles stirred into the aluminum weld matrix and across the interfaces of the weld joints. Energy dispersive spectroscopy chemical analysis identified the intermetallic layer as monoclinic Al3Fe. Dramatic decreases in grain size in the thermo-mechanically affected zones and weld zones evidenced grain refinement through plastic deformation and recrystallization. Crystallographic grain orientation and texture were examined using electron backscatter diffraction. Striated regions in the orientations of the aluminum alloy were determined to be the result of the severe deformation induced by the complex weld tool geometry.
Many of the textures observed in the weld zone and thermo-mechanically affected zones exhibited shear texture components; however, there were many textures that deviated from ideal simple shear. Factors affecting the microstructure which are characteristic of the friction stir welding process, such as post-recrystallization deformation and complex deformation induced by tool geometry were discussed as causes for deviation from simple shear textures.
Development and characterization of a guinea pig model for Marburg virus.
Wong, Gary; Cao, Wen-Guang; He, Shi-Hua; Zhang, Zi-Rui; Zhu, Wen-Jun; Moffat, Estella; Ebihara, Hideki; Embury-Hyatt, Carissa; Qiu, Xiang-Guo
2018-01-18
The Angolan strain of Marburg virus (MARV/Ang) can cause lethal disease in humans with a case fatality rate of up to 90%, but infection of immunocompetent rodents does not result in any observable symptoms. Our previous work includes the development and characterization of a MARV/Ang variant that causes lethal disease in mice (MARV/Ang-MA), with the aim of using this tool to screen for promising prophylactic and therapeutic candidates. An intermediate animal model is needed to confirm any findings from mouse studies before testing in the gold-standard non-human primate (NHP) model. In this study, we serially passaged the clinical isolate of MARV/Ang in the livers and spleens of guinea pigs until a variant emerged that causes 100% lethality in guinea pigs (MARV/Ang-GA). Animals infected with MARV/Ang-GA showed signs of filovirus infection including lymphocytopenia, thrombocytopenia, and high viremia leading to spread to major organs, including the liver, spleen, lungs, and kidneys. The MARV/Ang-GA guinea pigs died between 7 and 9 days after infection, and the LD50 was calculated to be 1.1×10^-1 TCID50 (median tissue culture infective dose). Mutations in MARV/Ang-GA were identified and compared to sequences of known rodent-adapted MARV/Ang variants, which may benefit future studies characterizing important host adaptation sites in the MARV/Ang viral genome.
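The abstract reports an LD50 endpoint but does not state the calculation method; the Reed-Muench method is the classic choice for titration endpoints like this, so here is a sketch under that assumption (the dose-response counts below are made up for illustration, not the study's data):

```python
def reed_muench_ld50(log10_doses, dead, alive):
    """Reed-Muench 50% lethal endpoint; returns log10(LD50).

    log10_doses: ascending log10 doses; dead/alive: counts per dose group.
    Assumes an animal dead at a dose would die at any higher dose, and one
    surviving a dose would survive any lower dose.
    """
    n = len(log10_doses)
    # cumulative dead accumulates upward in dose, cumulative alive downward
    cum_dead = [sum(dead[:i + 1]) for i in range(n)]
    cum_alive = [sum(alive[i:]) for i in range(n)]
    pct = [100.0 * cd / (cd + ca) for cd, ca in zip(cum_dead, cum_alive)]
    if pct[0] > 50.0:
        raise ValueError("lowest dose already exceeds 50% mortality")
    for i in range(n):
        if pct[i] == 50.0:
            return log10_doses[i]
        if pct[i] > 50.0:
            # proportionate distance between the two bracketing doses
            pd = (50.0 - pct[i - 1]) / (pct[i] - pct[i - 1])
            return log10_doses[i - 1] + pd * (log10_doses[i] - log10_doses[i - 1])
    raise ValueError("mortality never reaches 50%")
```

The function returns the endpoint on the log10 scale; the titer itself is `10 ** result` in the same units as the input doses.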
Chen, Chao; Leavey, Shannon; Krasner, Stuart W; Mel Suffet, I H
2014-06-15
Certain nitrosamines in water are disinfection byproducts that are probable human carcinogens. Nitrosamines have diverse and complex precursors that include effluent organic matter, some anthropogenic chemicals, and natural (likely non-humic) substances. An easy and selective tool was first developed to characterize nitrosamine precursors in treated wastewaters, including different process effluents. This tool takes advantage of the polarity rapid assessment method (PRAM) and ultrafiltration (UF) (molecular weight distribution) to locate the fractions with the strongest contributions to the nitrosamine precursor pool in the effluent organic matter. Strong cation exchange (SCX) and C18 solid-phase extraction cartridges were used for their high selectivity for nitrosamine precursors. The details of PRAM operation, such as cartridge clean-up, capacity, pH influence, and quality control, are included in this paper, as well as the main parameters of UF operation. Preliminary testing of the PRAM/UF method with effluents from one wastewater treatment plant gave very informative results. SCX retained 45-90% of the N-nitrosodimethylamine (NDMA) formation potential (FP)-a measure of the precursors-in secondary and tertiary wastewater effluents. These results are consistent with NDMA precursors likely having a positively charged amine group. C18 adsorbed 30-45% of the NDMAFP, which indicates that a substantial portion of these precursors were non-polar. The small molecular weight (MW) (<1 kDa) and large MW (>10 kDa) fractions obtained from UF were the primary contributors to NDMAFP. The combination of PRAM and UF provides important information on the characteristics of nitrosamine precursors in water with easy operation. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.
2018-01-01
The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the reasons for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the results compare favorably with previous literature for hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.
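Hyperspectral identification of hydrosilicates like those above typically relies on diagnostic absorption-band parameters; a common one is continuum-removed band depth. A minimal sketch of that standard calculation follows (the wavelengths and reflectances are illustrative, and this is not the PlanetServer API):

```python
def band_depth(wavelengths, reflectances, left, center, right):
    """Band depth at `center` relative to a linear continuum from `left` to `right`.

    wavelengths/reflectances: matched lists describing one spectrum.
    left/center/right: wavelengths of the two continuum shoulders and the band.
    """
    def at(w):
        # nearest-sample lookup; real pipelines interpolate between channels
        i = min(range(len(wavelengths)), key=lambda j: abs(wavelengths[j] - w))
        return wavelengths[i], reflectances[i]

    (wl, rl), (wc, rc), (wr, rr) = at(left), at(center), at(right)
    # linear continuum evaluated at the band center
    continuum = rl + (rr - rl) * (wc - wl) / (wr - wl)
    return 1.0 - rc / continuum
```

A band depth near zero means no absorption at that wavelength; positive values grow with the strength of the diagnostic band, which is how pixels are flagged as candidate chlorite, prehnite, or kaolinite detections.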
Doing accelerator physics using SDDS, UNIX, and EPICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borland, M.; Emery, L.; Sereno, N.
1995-12-31
The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Control System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of the application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization.
Grubb, Stephen C.; Maddatu, Terry P.; Bult, Carol J.; Bogue, Molly A.
2009-01-01
The Mouse Phenome Database (MPD; http://www.jax.org/phenome) is an open source, web-based repository of phenotypic and genotypic data on commonly used and genetically diverse inbred strains of mice and their derivatives. MPD is also a facility for query, analysis and in silico hypothesis testing. Currently MPD contains about 1400 phenotypic measurements contributed by research teams worldwide, including phenotypes relevant to human health such as cancer susceptibility, aging, obesity, susceptibility to infectious diseases, atherosclerosis, blood disorders and neurosensory disorders. Electronic access to centralized strain data enables investigators to select optimal strains for many systems-based research applications, including physiological studies, drug and toxicology testing, modeling disease processes and complex trait analysis. The ability to select strains for specific research applications by accessing existing phenotype data can bypass the need to (re)characterize strains, precluding major investments of time and resources. This functionality, in turn, accelerates research and leverages existing community resources. Since our last NAR reporting in 2007, MPD has added more community-contributed data covering more phenotypic domains and implemented several new tools and features, including a new interactive Tool Demo available through the MPD homepage (quick link: http://phenome.jax.org/phenome/trytools). PMID:18987003
Sharan, Malvika; Förstner, Konrad U; Eulalio, Ana; Vogel, Jörg
2017-06-20
RNA-binding proteins (RBPs) have been established as core components of several post-transcriptional gene regulation mechanisms. Experimental techniques such as cross-linking and co-immunoprecipitation have enabled the large-scale identification of RBPs, RNA-binding domains (RBDs) and their regulatory roles in eukaryotic species such as human and yeast. In contrast, our knowledge of the number and potential diversity of RBPs in bacteria is poorer, due to the technical challenges associated with existing global screening approaches. We introduce APRICOT, a computational pipeline for the sequence-based identification and characterization of proteins using RBDs known from experimental studies. The pipeline identifies functional motifs in protein sequences using position-specific scoring matrices and Hidden Markov Models of the functional domains and statistically scores them based on a series of sequence-based features. Subsequently, APRICOT identifies putative RBPs and characterizes them by several biological properties. Here we demonstrate the application and adaptability of the pipeline on large-scale protein sets, including the bacterial proteome of Escherichia coli. APRICOT showed better performance on various datasets than other existing tools for the sequence-based prediction of RBPs, achieving an average sensitivity and specificity of 0.90 and 0.91, respectively. The command-line tool and its documentation are available at https://pypi.python.org/pypi/bio-apricot. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
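The sensitivity and specificity figures quoted above follow the standard confusion-matrix definitions; a minimal sketch of that evaluation step (the label vectors are illustrative, with 1 = RBP and 0 = non-RBP):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# known RBP/non-RBP labels vs. a predictor's calls
sens, spec = sensitivity_specificity([1, 1, 1, 1, 0, 0, 0, 0],
                                     [1, 1, 1, 0, 0, 0, 0, 1])
```

Reporting both numbers matters for RBP prediction: a tool that calls everything an RBP has perfect sensitivity but zero specificity, so only the pair characterizes performance.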
Multi-faceted characterization of battery reactions: the case of spinel hosts for Mg-ion batteries
NASA Astrophysics Data System (ADS)
Cabana, Jordi
2015-03-01
Electrochemical energy storage was an important enabler of the wireless revolution, and it is touted as a key component of a society that shifts away from its dependence on fossil fuels. Batteries are the primary technology when high-energy devices are required. They are complex reactors in which multiple physico-chemical phenomena are concurrent in time and space. As a result, it is increasingly clear that holistic approaches to defining such phenomena require a breadth of characterization tools. I will exemplify this need in the context of our quest for hosts that are able to reversibly intercalate Mg2+ ions. Systems based on the intercalation of multivalent ions are promoted as next-generation devices because, while they can resemble systems using Li+ ions, they can store more charge per mole of intercalated species and adopt metals as the anode. Using a combination of characterization tools, including X-ray diffraction, spectroscopy and scattering, electron microscopy and nuclear magnetic resonance, we ascertained that spinel oxides are able to reversibly and extensively accommodate Mg2+. The mechanisms of this reaction were also elucidated. The rationale for the choice of techniques and the key pieces they provided to complete the picture will be discussed. This work was supported as part of the Joint Center for Energy Storage Research (JCESR), an Energy Innovation Hub funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences.
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Kim, Kang; Wagner, William R
2016-03-01
With the rapid expansion of biomaterial development and coupled efforts to translate such advances toward the clinic, non-invasive and non-destructive imaging tools to evaluate implants in situ in a timely manner are critically needed. The required multi-level information is comprehensive, including structural, mechanical, and biological changes such as scaffold degradation, mechanical strength, cell infiltration, extracellular matrix formation and vascularization to name a few. With its inherent advantages of non-invasiveness and non-destructiveness, ultrasound imaging can be an ideal tool for both preclinical and clinical uses. In this review, currently available ultrasound imaging technologies that have been applied in vitro and in vivo for tissue engineering and regenerative medicine are discussed and some new emerging ultrasound technologies and multi-modality approaches utilizing ultrasound are introduced.
Towards the assessment and management of contaminated dredged materials.
Agius, Suzanne J; Porebski, Linda
2008-04-01
Environment Canada's Disposal at Sea Programme hosted the Contaminated Dredged Material Management Decisions Workshop in Montreal, Quebec, Canada, on 28-30 November 2006. The workshop brought together over 50 sediment assessment and management experts from academic, industrial, and regulatory backgrounds and charged them with drafting a potential framework to assess contaminated dredged materials and compare the risks of various disposal alternatives. This article summarizes the recommendations made during the workshop concerning the development of sediment assessment tools, the interpretation of these tools, and the essential attributes of a comparative risk assessment process. The major outcomes of the workshop include a strong recommendation to develop a national dredging or sediment management strategy, a potential decision-making framework for the assessment of dredged materials and comparative risk assessment of disposal options, and the expansion of minimum sediment characterization requirements for nonroutine disposal permit applications.
Molecular plant breeding: methodology and achievements.
Varshney, Rajeev K; Hoisington, Dave A; Nayak, Spurthi N; Graner, Andreas
2009-01-01
The progress made in DNA marker technology has been remarkable and exciting in recent years. DNA markers have proved valuable tools in various analyses in plant breeding, for example, early generation selection, enrichment of complex F(1)s, choice of donor parent in backcrossing, recovery of recurrent parent genotype in backcrossing, and linkage block analysis and selection. Other main areas of application of molecular markers in plant breeding include germplasm characterization/fingerprinting, determining seed purity, systematic sampling of germplasm, and phylogenetic analysis. Molecular markers have thus proved powerful tools for replacing bioassays, and there are now many examples available to show the efficacy of such markers. We illustrate some basic concepts and the methodology of applying molecular markers for enhancing selection efficiency in plant breeding. Some successful examples of product development through molecular breeding are also presented.
Kim, Kang; Wagner, William R.
2015-01-01
With the rapid expansion of biomaterial development and coupled efforts to translate such advances toward the clinic, non-invasive and non-destructive imaging tools to evaluate implants in situ in a timely manner are critically needed. The required multilevel information is comprehensive, including structural, mechanical, and biological changes such as scaffold degradation, mechanical strength, cell infiltration, extracellular matrix formation and vascularization to name a few. With its inherent advantages of non-invasiveness and non-destructiveness, ultrasound imaging can be an ideal tool for both preclinical and clinical uses. In this review, currently available ultrasound imaging technologies that have been applied in vitro and in vivo for tissue engineering and regenerative medicine are discussed and some new emerging ultrasound technologies and multi-modality approaches utilizing ultrasound are introduced. PMID:26518412
Ezeonu, Chukwuma S.; Tagbo, Richard; Anike, Ephraim N.; Oje, Obinna A.; Onwurah, Ikechukwu N. E.
2012-01-01
The environment is a very important component necessary for the existence of both man and other biotic organisms. The degree of sustainability of the physical environment is an index of the survival and well-being of the entire set of components in it. It is not sufficient to dispose of toxic/deleterious substances by just any known method. The best method of sustaining the environment is one that returns all components (wastes) in a recyclable way, so that the waste becomes useful and helps the biotic and abiotic relationship to maintain the aesthetic and healthy equilibrium that characterizes an ideal environment. In this study, the methods investigated include biological methods of environmental sustainability, surveying the various biotechnological tools (biotools) in current use and those under investigation for future use. PMID:22611499
A computational continuum model of poroelastic beds
Zampogna, G. A.
2017-01-01
Despite the ubiquity of fluid flows interacting with porous and elastic materials, we lack a validated non-empirical macroscale method for characterizing the flow over and through a poroelastic medium. We propose a computational tool to describe such configurations by deriving and validating a continuum model for the poroelastic bed and its interface with the free fluid above. We show that, using a stress continuity condition and a slip velocity condition at the interface, the effective model captures the effects of small changes in the microstructure anisotropy correctly and predicts the overall behaviour in a physically consistent and controllable manner. Moreover, we show that the effective model is accurate by validating it against fully resolved microscopic simulations. The proposed computational tool can be used for investigations in a wide range of fields, including mechanical engineering, bio-engineering and geophysics. PMID:28413355
NASA Tools for Climate Impacts on Water Resources
NASA Technical Reports Server (NTRS)
Toll, David; Doorn, Brad
2010-01-01
Climate and environmental change are expected to fundamentally alter the nation's hydrological cycle and water availability. Satellite instruments provide global or near-global coverage, allowing for consistent, well-calibrated, and equivalent-quality data of the Earth system. A major goal for NASA climate and environmental change research is to create multi-instrument data sets that span the multi-decadal time scales of climate change, and to combine these data with those from modeling and surface-based observing systems to improve process understanding and predictions. NASA Earth science data and analyses will ultimately enable more accurate climate prediction and characterization of uncertainties. NASA's Applied Sciences Program works with other groups, including other federal agencies, to transition demonstrated observational capabilities into operational capabilities. A summary of some of NASA's tools for improved water resources management will be presented.
Regression Models for Identifying Noise Sources in Magnetic Resonance Images
Zhu, Hongtu; Li, Yimei; Ibrahim, Joseph G.; Shi, Xiaoyan; An, Hongyu; Chen, Yashen; Gao, Wei; Lin, Weili; Rowe, Daniel B.; Peterson, Bradley S.
2009-01-01
Stochastic noise, susceptibility artifacts, magnetic field and radiofrequency inhomogeneities, and other noise components in magnetic resonance images (MRIs) can introduce serious bias into any measurements made with those images. We formally introduce three regression models, including a Rician regression model and two associated normal models, to characterize stochastic noise in various magnetic resonance imaging modalities, including diffusion-weighted imaging (DWI) and functional MRI (fMRI). Estimation algorithms are introduced to maximize the likelihood function of the three regression models. We also develop a diagnostic procedure for systematically exploring MR images to identify noise components other than simple stochastic noise, and to detect discrepancies between the fitted regression models and MRI data. The diagnostic procedure includes goodness-of-fit statistics, measures of influence, and tools for graphical display. The goodness-of-fit statistics can assess the key assumptions of the three regression models, whereas the measures of influence can isolate outliers caused by certain noise components, including motion artifacts. The graphical tools permit visualization of the values of the goodness-of-fit statistics and influence measures. Finally, we conduct simulation studies to evaluate the performance of these methods, and we analyze a real dataset to illustrate how our diagnostic procedure localizes subtle image artifacts by detecting intravoxel variability that is not captured by the regression models. PMID:19890478
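To make the Rician model concrete: the magnitude of a complex signal whose real and imaginary channels carry Gaussian noise follows a Rician distribution, and its two parameters can be recovered by maximizing the likelihood. A minimal NumPy sketch, assuming a simple grid-search maximizer (the paper's actual estimation algorithms and diagnostics are not reproduced here, and all numbers are illustrative):

```python
import numpy as np

def rician_loglik(x, nu, sigma):
    """Log-likelihood of magnitude data x under a Rician(nu, sigma) model:
    f(x) = x/sigma^2 * exp(-(x^2 + nu^2)/(2 sigma^2)) * I0(x nu / sigma^2)."""
    s2 = sigma ** 2
    return np.sum(np.log(x / s2) - (x ** 2 + nu ** 2) / (2 * s2)
                  + np.log(np.i0(x * nu / s2)))

rng = np.random.default_rng(0)
true_nu, true_sigma = 3.0, 1.0
n = 5000
# The magnitude of a complex signal with Gaussian noise in each channel is Rician
re = true_nu + true_sigma * rng.standard_normal(n)
im = true_sigma * rng.standard_normal(n)
x = np.hypot(re, im)

# Coarse grid-search MLE (a real fitter would use gradient-based optimization)
nus = np.linspace(0.5, 6.0, 56)        # step 0.1
sigmas = np.linspace(0.5, 2.0, 31)     # step 0.05
ll = np.array([[rician_loglik(x, nu, s) for s in sigmas] for nu in nus])
i, j = np.unravel_index(np.argmax(ll), ll.shape)
nu_hat, sigma_hat = nus[i], sigmas[j]
```

With 5000 samples the grid maximum lands close to the true parameters; at low signal-to-noise the Rician model departs strongly from the Gaussian approximation, which is the motivation for fitting it explicitly.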
Advances in diagnostic and treatment options in patients with fibromyalgia syndrome
Gur, Ali; Oktayoglu, Pelin
2009-01-01
Fibromyalgia (FM) is characterized as a chronic, painful, noninflammatory syndrome affecting the musculoskeletal system. In addition to pain, common comorbid symptoms associated with FM include sleep disturbances, fatigue, morning stiffness, affective disorders, chronic daily headache, dyscognition, irritable bowel syndrome, and irritable bladder. Fibromyalgia is usually classified by application of the American College of Rheumatology (ACR) criteria. Although these criteria are accepted among investigators who agree with the concept of fibromyalgia, they do so with some reservations: tender points and widespread pain alone do not capture the essence of fibromyalgia. New diagnostic tools incorporating clinical or radiological components are being studied to address these problems. Although various pharmacological treatments have been studied for fibromyalgia, no single drug or group of drugs has proved to be useful in treating fibromyalgia patients. Recently, three drugs, pregabalin, duloxetine, and milnacipran, were approved for the treatment of FM by the US Food and Drug Administration (FDA). Novel therapeutic approaches to the management of FM include cannabinoids, sodium channel blockade, and new-generation antiepileptics. This review evaluates new diagnostic tools, including clinical and radiological approaches, and highlights the efficacy of medicinal and nonmedicinal treatments and new therapeutic approaches in the management of FM from a broad perspective. PMID:27789991
Cognitive learning: a machine learning approach for automatic process characterization from design
NASA Astrophysics Data System (ADS)
Foucher, J.; Baderot, J.; Martinez, S.; Dervilllé, A.; Bernard, G.
2018-03-01
Cutting-edge innovation requires accurate and fast process control to obtain a fast learning rate and industry adoption. The tools currently available for this task are mainly manual and user dependent. In this paper we present cognitive learning, a new machine-learning-based technique that facilitates and speeds up complex characterization by using the design as input, providing fast training and detection times. We focus on the machine learning framework that enables object detection, defect traceability, and automatic measurement tools.
MemAxes: Visualization and Analytics for Characterizing Complex Memory Performance Behaviors.
Gimenez, Alfredo; Gamblin, Todd; Jusufi, Ilir; Bhatele, Abhinav; Schulz, Martin; Bremer, Peer-Timo; Hamann, Bernd
2018-07-01
Memory performance is often a major bottleneck for high-performance computing (HPC) applications. Deepening memory hierarchies, complex memory management, and non-uniform access times have made memory performance behavior difficult to characterize, and users require novel, sophisticated tools to analyze and optimize this aspect of their codes. Existing tools target only specific factors of memory performance, such as hardware layout, allocations, or access instructions; they do not suffice to characterize the complex relationships between these factors, and they require advanced expertise to be used effectively. We present MemAxes, a tool based on a novel approach to analytics-driven visualization of memory performance data. MemAxes uniquely allows users to analyze the different aspects of memory performance by providing multiple visual contexts for a centralized dataset. We define mappings of sampled memory access data to new and existing visual metaphors, each of which enables a user to perform different analysis tasks. We present methods to guide user interaction by scoring subsets of the data based on known performance problems; this scoring provides visual cues and automatically extracts clusters of interest. We designed MemAxes in collaboration with experts in HPC and demonstrate its effectiveness in case studies.
Tohira, Hideo; Jacobs, Ian; Mountain, David; Gibson, Nick; Yeo, Allen
2011-01-01
The Abbreviated Injury Scale (AIS) was revised in 2005 and updated in 2008 (AIS 2008). We aimed to compare the outcome prediction performance of AIS-based injury severity scoring tools using AIS 2008 and AIS 98. We included all major trauma patients admitted to the Royal Perth Hospital between 1994 and 2008. We selected five AIS-based injury severity scoring tools: the Injury Severity Score (ISS), New Injury Severity Score (NISS), modified Anatomic Profile (mAP), Trauma and Injury Severity Score (TRISS), and A Severity Characterization of Trauma (ASCOT). We selected survival after injury as the target outcome and used the area under the Receiver Operating Characteristic curve (AUROC) as the performance measure. First, we compared the five tools using all cases whose records included all variables for the TRISS (complete dataset), using 10-fold cross-validation. Second, we compared the ISS and NISS for AIS 98 and AIS 2008 using all subjects (whole dataset). We identified 1,269 and 4,174 cases for the complete and whole datasets, respectively. With the 10-fold cross-validation, there were no clear differences in the AUROCs between the AIS 98- and AIS 2008-based scores. With the second comparison, the AIS 98-based ISS performed significantly worse than the AIS 2008-based ISS (p<0.0001), while there was no significant difference between the AIS 98- and AIS 2008-based NISSs. Researchers should be aware of these findings when they select an injury severity scoring tool for their studies. PMID:22105401
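The AUROC used as the performance measure here can be computed without building an explicit ROC curve, via the Mann-Whitney identity: it equals the probability that a randomly chosen non-survivor receives a higher severity score than a randomly chosen survivor (ties counted as half). A small sketch with hypothetical scores, not the study's data:

```python
import numpy as np

def auroc(severity, died):
    """AUROC for predicting death from a severity score, via the
    Mann-Whitney identity (higher score should mean higher risk)."""
    severity = np.asarray(severity, float)
    died = np.asarray(died, bool)
    pos = severity[died]       # non-survivors
    neg = severity[~died]      # survivors
    # Pairwise comparison; O(n*m) memory, fine for illustration
    diff = pos[:, None] - neg[None, :]
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))

# Hypothetical ISS values for six patients: the two who died scored highest,
# so discrimination is perfect and the AUROC is 1.0
iss = [75, 50, 25, 16, 9, 4]
died = [1, 1, 0, 0, 0, 0]
auc = auroc(iss, died)
```

A score with no discriminating power gives 0.5, which is why the study reports AUROC differences rather than raw values alone.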
Building the European Seismological Research Infrastructure: results from 4 years NERIES EC project
NASA Astrophysics Data System (ADS)
van Eck, T.; Giardini, D.
2010-12-01
The EC Research Infrastructure (RI) project Network of Research Infrastructures for European Seismology (NERIES) implemented a comprehensive, scalable, and sustainable European integrated RI for earthquake seismological data. NERIES opened access to a significant amount of additional seismological data, integrated distributed data archives, and implemented and produced advanced analysis tools and software packages. A seismic data portal provides a single access point and overview of the European seismological data available to the earth science research community. Additional data access tools and sites have been implemented to meet user and robustness requirements, notably those at the EMSC and ORFEUS. The datasets compiled in NERIES and available through the portal include, among others: - The expanded Virtual European Broadband Seismic Network (VEBSN), with real-time access to more than 500 stations from more than 53 observatories. These data are continuously monitored, quality controlled, and archived in the European Integrated Distributed waveform Archive (EIDA). - A unique integration of acceleration datasets from seven networks in seven European or associated countries, centrally accessible in a homogeneous format, thus forming the core comprehensive European acceleration database. Standardized parameter analyses and the associated software are included in the database. - A Distributed Archive of Historical Earthquake Data (AHEAD) for research purposes, containing among others a comprehensive European Macroseismic Database and Earthquake Catalogue (1000-1963, M ≥ 5.8), including analysis tools. - Data from three one-year ocean-bottom seismometer (OBS) deployments at three sites (Atlantic, Ionian, and Ligurian Seas) in the general SEED format, thus creating the core integrated database for ocean-, sea-, and land-based seismological observatories.
Tools to facilitate analysis and data mining of the RI datasets are: - A comprehensive set of European seismological velocity reference models, including a standardized model description, with several visualisation tools currently being adapted on a global scale. - An integrated approach to seismic hazard modelling and forecasting, a community-accepted approach to forecast testing and model validation, and the core hazard portal, developed with the same technologies as the NERIES data portal. - Homogeneous shakemap estimation tools implemented at several large European observatories, plus a complementary new loss estimation software tool. - A comprehensive set of new techniques for geotechnical site characterization, with the relevant software packages documented and maintained (www.geopsy.org). - A set of software packages for data mining, data reduction, data exchange, and information management in seismology, serving as research and observatory analysis tools. NERIES has a long-term impact and is coordinated with the related US initiatives IRIS and EarthScope. The follow-up EC project of NERIES, NERA (2010-2014), is funded and will integrate the seismological and earthquake engineering infrastructures. NERIES also provided the proof of concept for the ESFRI 2008 initiative, the European Plate Observing System (EPOS), whose preparatory phase (2010-2014) is also funded by the EC.
Goy, C B; Dominguez, J M; Gómez López, M A; Madrid, R E; Herrera, M C
2013-08-01
The ambulatory monitoring of biosignals involves the use of sensors, electrodes, actuators, processing tools, and wireless communication modules. When a garment includes these elements for the purpose of recording vital signs and responding to specific situations, it is called a 'Smart Wearable System'. In recent years several authors have suggested that conductive textile materials (e-textiles) could serve as electrodes for these systems. This work implements an electrical characterization of e-textiles and evaluates their ability to act as textile electrodes for lower extremity venous occlusion plethysmography (LEVOP). The e-textile electrical characterization is carried out using two experimental set-ups (in vitro evaluation). In addition, LEVOP records are obtained from healthy volunteers (in vivo evaluation). Standard Ag/AgCl electrodes are used for comparison in all tests. Results show that the proposed e-textiles are suitable for LEVOP recording, with good agreement between the in vivo and in vitro evaluations.
Characterization and Differentiation of Petroleum-Derived Products by E-Nose Fingerprints
Ferreiro-González, Marta; Palma, Miguel; Ayuso, Jesús; Álvarez, José A.; Barroso, Carmelo G.
2017-01-01
Characterization of petroleum-derived products is an area of continuing importance in environmental science, mainly in relation to fuel spills. In this study, a non-separative analytical method based on an E-Nose (Electronic Nose) is presented as a rapid alternative for the characterization of several different petroleum-derived products, including gasoline, diesel, aromatic solvents, and ethanol samples, which were poured onto different surfaces (wood, cork, and cotton). Headspace generation was carried out at 145 °C for 10 min. Mass spectrometric data (45–200 m/z) combined with chemometric tools, namely hierarchical cluster analysis (HCA) followed by principal component analysis (PCA) and linear discriminant analysis (LDA), allowed for full discrimination of the samples. A characteristic fingerprint for each product can be used for discrimination or identification. The E-Nose can be considered a green technique, and it is rapid and easy to use in routine analysis, thus providing a good alternative to currently used methods. PMID:29113069
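The chemometric pipeline (PCA for dimensionality reduction, then a discriminant step) can be sketched on synthetic fingerprints. This illustration substitutes nearest-centroid classification in PC space for LDA, and the "fingerprints" are simulated intensity vectors, not E-Nose data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical m/z intensity fingerprints: two product classes,
# 20 samples each, 156 channels (spanning 45-200 m/z)
base_a = rng.random(156)
base_b = rng.random(156)
X = np.vstack([base_a + 0.05 * rng.standard_normal((20, 156)),
               base_b + 0.05 * rng.standard_normal((20, 156))])
y = np.array([0] * 20 + [1] * 20)

# PCA by SVD of the mean-centered data matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T          # project onto first two principal components

# Nearest-centroid discrimination in PC space (a stand-in for LDA)
centroids = np.array([scores[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(scores[:, None] - centroids, axis=2), axis=1)
accuracy = (pred == y).mean()
```

Because the between-class difference dominates the within-class noise, the first principal component aligns with the class separation and the discrimination is complete, which is the same qualitative behavior the study reports for its real fingerprints.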
Recent advances in the development and application of nanoelectrodes.
Fan, Yunshan; Han, Chu; Zhang, Bo
2016-10-07
Nanoelectrodes have key advantages compared to electrodes of conventional size and are the tool of choice for numerous applications in both fundamental electrochemistry research and bioelectrochemical analysis. This Minireview summarizes recent advances in the development, characterization, and use of nanoelectrodes in nanoscale electroanalytical chemistry. Methods of nanoelectrode preparation include laser-pulled glass-sealed metal nanoelectrodes, mass-produced nanoelectrodes, carbon nanotube based and carbon-filled nanopipettes, and tunneling nanoelectrodes. Several new topics of their recent application are covered, which include the use of nanoelectrodes for electrochemical imaging at ultrahigh spatial resolution, imaging with nanoelectrodes and nanopipettes, electrochemical analysis of single cells, single enzymes, and single nanoparticles, and the use of nanoelectrodes to understand single nanobubbles.
Biomass Characterization | Bioenergy | NREL
Analytical methods for biomass characterization are available for download, including the biomass compositional methods and molecular beam mass spectrometry. We develop new methods and tools to understand the chemical composition of raw biomass.
Characterizing Task-Based OpenMP Programs
Muddukrishna, Ananya; Jonsson, Peter A.; Brorsson, Mats
2015-01-01
Programmers struggle to understand the performance of task-based OpenMP programs since profiling tools report only thread-based performance. Performance tuning also requires task-based performance information in order to balance per-task memory hierarchy utilization against exposed task parallelism. We provide a cost-effective method to extract detailed task-based performance information from OpenMP programs. We demonstrate the utility of our method by quickly diagnosing performance problems and characterizing the exposed task parallelism and per-task instruction profiles of benchmarks in the widely used Barcelona OpenMP Tasks Suite. Using our method, programmers can tune performance faster and understand performance tradeoffs more effectively than with existing tools. PMID:25860023
Fabrication and Metrology of High-Precision Foil Mirror Mounting Elements
NASA Technical Reports Server (NTRS)
Schattenburg, Mark L.
2002-01-01
During the period of this Cooperative Agreement, MIT (Massachusetts Institute of Technology) developed advanced methods for applying silicon microstructures for the precision assembly of foil x-ray optics in support of the Constellation-X Spectroscopy X-ray Telescope (SXT) development effort at Goddard Space Flight Center (GSFC). MIT developed improved methods for fabricating and characterizing the precision silicon micro-combs. MIT also developed and characterized assembly tools and several types of metrology tools in order to characterize and reduce the errors associated with precision assembly of foil optics. Results of this effort were published and presented to the scientific community and the GSFC SXT team. A bibliography of papers and presentations is offered.
Technology of machine tools. Volume 1. Executive summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sutton, G.P.
1980-10-01
The Machine Tool Task Force (MTTF) was formed to characterize the state of the art of machine tool technology and to identify promising future directions of this technology. This volume is one of a five-volume series that presents the MTTF findings; reports on various areas of the technology were contributed by experts in those areas.
Exposure Assessment Tools by Media - Air
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Routes - Inhalation
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Chemical Classes
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Routes
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Media - Food
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Media
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Routes - Ingestion
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Approaches
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Stressors - Biological
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Routes - Dermal
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Stressors - Physical
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Effects - Aquatic
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Exposure Pathways
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Effects - References
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Stressors - References
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Stressors - Chemical
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Receptors - Biota
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Receptors - References
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Effects - Terrestrial
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that presents comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Design of compound libraries for fragment screening
NASA Astrophysics Data System (ADS)
Blomberg, Niklas; Cosgrove, David A.; Kenny, Peter W.; Kolmodin, Karin
2009-08-01
Approaches to the design of libraries for fragment screening are illustrated with reference to a 20 k generic fragment screening library and a 1.2 k generic NMR screening library. Tools and methods for library design that have been developed within AstraZeneca are described, including Foyfi fingerprints and the Flush program for neighborhood characterization. It will be shown how Flush and the BigPicker, which selects maximally diverse sets of compounds, are used to apply the Core and Layer method for library design. Approaches to partitioning libraries into cocktails are also described.
Development of POINTS as a planetology instrument
NASA Technical Reports Server (NTRS)
Reasenberg, Robert D.
1994-01-01
During the reporting period, we carried out investigations required to enhance our design of POINTS as a tool for the search for and characterization of extra-solar planetary systems. The results of that work were included in a paper on POINTS as well as one on Newcomb, which will soon appear in the proceedings of SPIE Conference 2200. (Newcomb is a spinoff of POINTS. It is a small astrometric interferometer now being developed jointly by SAO and the U.S. Navy. It could help establish some of the technology needed for POINTS.) These papers are appended.
NASA Technical Reports Server (NTRS)
Ravenhall, R.; Salemme, C. T.
1977-01-01
A total of 38 quiet, clean, short-haul experimental engine (QCSEE) under-the-wing (UTW) composite fan blades were manufactured for various component tests, process and tooling checkout, and use in the QCSEE UTW engine. The component tests included frequency characterization, strain distribution, bench fatigue, platform static load, whirligig high cycle fatigue, whirligig low cycle fatigue, whirligig strain distribution, and whirligig over-speed. All tests were successfully completed. All blades planned for use in the engine were subjected to and passed a whirligig proof spin test.
In situ nanomechanical testing of twinned metals in a transmission electron microscope
Li, Nan; Wang, Jiangwei; Mao, Scott; ...
2016-04-01
This paper focuses on in situ transmission electron microscope (TEM) characterization to explore twins in face-centered-cubic and body-centered-cubic monolithic metals, and their impact on the overall mechanical performance. Taking advantage of simultaneous nanomechanical deformation and nanoscale imaging using versatile in situ TEM tools, direct correlation of these unique microscopic defects with macroscopic mechanical performance becomes possible. This article summarizes recent evidence to support the mechanisms related to strengthening and plasticity in metals, including nanotwinned Cu, Ni, Al, Au, and others in bulk, thin film, and nanowire forms.
Recent advances in synthetic biology of cyanobacteria.
Sengupta, Annesha; Pakrasi, Himadri B; Wangikar, Pramod P
2018-05-09
Cyanobacteria are attractive hosts that can be engineered for the photosynthetic production of fuels, fine chemicals, and proteins from CO2. Moreover, the responsiveness of these photoautotrophs to different environmental signals, such as light, CO2, the diurnal cycle, and metals, makes them potential hosts for the development of biosensors. However, engineering these hosts proves to be a challenging and lengthy process. Synthetic biology can make the process of biological engineering more predictable through the use of standardized, well-characterized biological parts and tools to assemble them. While significant progress has been made with model heterotrophic organisms, many of the parts and tools are not portable to cyanobacteria. Therefore, efforts are underway to develop and characterize parts derived from cyanobacteria. In this review, we discuss the reported parts and tools with the objective of developing cyanobacteria as cell factories or biosensors. We also discuss issues related to characterization, tunability, and portability, and the need to develop enabling technologies to engineer this "green" chassis.
NASA Astrophysics Data System (ADS)
Islam, Amina; Chevalier, Sylvie; Sassi, Mohamed
2018-04-01
With advances in imaging techniques and computational power, Digital Rock Physics (DRP) is becoming an increasingly popular tool for characterizing reservoir samples and determining their internal structure and flow properties. In this work, we present details of the imaging, segmentation, and numerical simulation of single-phase flow through a standard homogeneous Silurian dolomite core plug sample as well as a heterogeneous sample from a carbonate reservoir. We develop a procedure that integrates experimental results into the segmentation step to calibrate the porosity. We also compare two numerical tools for the simulation: Avizo Fire Xlab Hydro, which solves the Stokes equations via the finite volume method, and Palabos, which solves the same equations using the lattice Boltzmann method. Representative elementary volume (REV) and isotropy studies are conducted on the two samples, and we show how DRP can be a useful tool for characterizing rock properties that are time consuming and costly to obtain experimentally.
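A basic DRP step, computing porosity and checking for a representative elementary volume (REV) by growing concentric subvolumes, can be sketched on a synthetic segmented image. The random voxel field below is a stand-in for a real segmented micro-CT volume (it has no connected pore network), so only the porosity/REV bookkeeping is illustrated:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical segmented volume: True = pore voxel, False = solid voxel
target_porosity = 0.15
volume = rng.random((128, 128, 128)) < target_porosity

def porosity(v):
    """Porosity = pore-voxel fraction of a boolean segmented volume."""
    return v.mean()

# REV check: porosity of concentric cubic subvolumes should converge
# to a stable value as the subvolume grows
center = np.array(volume.shape) // 2
half_sizes = [8, 16, 32, 64]
phis = []
for h in half_sizes:
    sub = volume[center[0]-h:center[0]+h,
                 center[1]-h:center[1]+h,
                 center[2]-h:center[2]+h]
    phis.append(porosity(sub))
```

For an uncorrelated field the estimate converges quickly; real carbonate samples, with heterogeneity at many scales, are precisely the cases where the REV study in the paper becomes nontrivial.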
Maturity of hospital information systems: Most important influencing factors.
Vidal Carvalho, João; Rocha, Álvaro; Abreu, António
2017-07-01
Maturity models facilitate organizational management, including information systems management, and hospital organizations are no exception. This article presents a study carried out with a group of experts in the field of hospital information systems management, with a view to identifying the main influencing factors to be included in an encompassing maturity model for hospital information systems management. The study builds on the results of a literature review, which identified maturity models in the health field and their relevant influencing factors. The development of this model is justified because the available maturity models for hospital information systems management have multiple limitations, including lack of detail, absence of tools to determine maturity, and lack of characterization of the maturity stages structured by different influencing factors.
EMU battery/SMM power tool characterization study
NASA Technical Reports Server (NTRS)
Palandati, C.
1982-01-01
The power tool that will be used to replace the attitude control system in the SMM spacecraft was modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery was tested for the power tool application. Results show that the EMU battery is capable of operating the power tool within a pulse-current range of 2.0 to 15.0 amperes and a battery temperature range of -10 to 40 degrees Celsius.
[Present-day metal-cutting tools and working conditions].
Kondratiuk, V P
1990-01-01
Multifunctional machine tools of the machining-center type offer a number of hygienic advantages over universal machine tools. However, the low degree of mechanization and automation of some auxiliary processes, together with design defects that degrade the tools' ergonomic characteristics, make multi-machine operation labour intensive. The article specifies techniques for assessing allowable noise levels and proposes hygienic recommendations, some of which have been put into practice.
Analysis of biosurfaces by neutron reflectometry: From simple to complex interfaces
Junghans, Ann; Watkins, Erik B.; Barker, Robert D.; ...
2015-03-16
Because of its high sensitivity to light elements and the ability to manipulate scattering contrast via isotopic substitution, neutron reflectometry (NR) is an excellent tool for studying the structure of soft condensed matter. Such materials include model biophysical systems as well as in situ living tissue at the solid–liquid interface. The penetrability of neutrons makes NR suitable for probing thin films with thicknesses of 5–5000 Å at various buried (for example, solid–liquid) interfaces [J. Daillant and A. Gibaud, Lect. Notes Phys. 770, 133 (2009); G. Fragneto-Cusani, J. Phys.: Condens. Matter 13, 4973 (2001); J. Penfold, Curr. Opin. Colloid Interface Sci. 7, 139 (2002)]. Over the past two decades, NR has evolved to become a key tool in the characterization of biological and biomimetic thin films. Highlighted in the current report are some of the authors' recent accomplishments in utilizing NR to study highly complex systems, including in situ experiments. Such studies will result in a much better understanding of complex biological problems, have significant medical impact by suggesting innovative treatments, and advance the development of highly functionalized biomimetic materials.
NASA Astrophysics Data System (ADS)
Li, Xinlong; Reber, Melanie A. R.; Corder, Christopher; Chen, Yuning; Zhao, Peng; Allison, Thomas K.
2016-09-01
We present a detailed description of the design, construction, and performance of high-power ultrafast Yb:fiber laser frequency combs in operation in our laboratory. We discuss two such laser systems: an 87 MHz, 9 W, 85 fs laser operating at 1060 nm and an 87 MHz, 80 W, 155 fs laser operating at 1035 nm. Both are constructed using low-cost, commercially available components, and can be assembled using only basic tools for cleaving and splicing single-mode fibers. We describe practical methods for achieving and characterizing low-noise single-pulse operation and long-term stability from Yb:fiber oscillators based on nonlinear polarization evolution. Stabilization of the combs using a variety of transducers, including a new method for tuning the carrier-envelope offset frequency, is discussed. High average power is achieved through chirped-pulse amplification in simple fiber amplifiers based on double-clad photonic crystal fibers. We describe the use of these combs in several applications, including ultrasensitive femtosecond time-resolved spectroscopy and cavity-enhanced high-order harmonic generation.
Zhou, Mowei; Paša-Tolić, Ljiljana; Stenoien, David L
2017-02-03
As histones play central roles in most chromosomal functions including regulation of DNA replication, DNA damage repair, and gene transcription, both their basic biology and their roles in disease development have been the subject of intense study. Because multiple post-translational modifications (PTMs) along the entire protein sequence are potential regulators of histones, a top-down approach, where intact proteins are analyzed, is ultimately required for complete characterization of proteoforms. However, significant challenges remain for top-down histone analysis primarily because of deficiencies in separation/resolving power and effective identification algorithms. Here we used state-of-the-art mass spectrometry and a bioinformatics workflow for targeted data analysis and visualization. The workflow uses ProMex for intact mass deconvolution, MSPathFinder as a search engine, and LcMsSpectator as a data visualization tool. When complemented with the open-modification tool TopPIC, this workflow enabled identification of novel histone PTMs including tyrosine bromination on histone H4 and H2A, H3 glutathionylation, and mapping of conventional PTMs along the entire protein for many histone subunits.
Refsgaard, A; Jacobsen, T; Jacobsen, B; Ørum, J-E
2007-01-01
The EU Water Framework Directive (WFD) requires an integrated approach to river basin management in order to meet environmental and ecological objectives. This paper presents concepts and a full-scale application of an integrated modelling framework. The Ringkoebing Fjord basin is characterized by intensive agricultural production, and leakage of nitrate constitutes a major pollution problem with respect to groundwater aquifers (drinking water), fresh surface water systems (water quality of lakes) and coastal receiving waters (eutrophication). The case study presented illustrates an advanced modelling approach applied in river basin management. Point sources (e.g. sewage treatment plant discharges) and distributed diffuse sources (nitrate leakage) are included to provide a modelling tool capable of simulating pollution transport from source to recipient, in order to analyse the effects of specific, localized basin water management plans. The paper also includes a land rent modelling approach which can be used to choose the most cost-effective measures and the locations of these measures. As a forerunner to the use of basin-scale models in WFD basin water management plans, this project demonstrates the potential and limitations of comprehensive, integrated modelling tools.
Andraski, Brian J.; Stonestrom, David A.; Nicholson, T.J.; Arlt, H.D.
2011-01-01
In 1976 the U.S. Geological Survey (USGS) began studies of unsaturated zone hydrology next to the Nation’s first commercial disposal facility for low-level radioactive waste (LLRW) near Beatty, NV. Recognizing the need for long-term data collection, the USGS in 1983 established research management areas in the vicinity of the waste-burial facility through agreements with the Bureau of Land Management and the State of Nevada. Within this framework, the Amargosa Desert Research Site (ADRS; http://nevada.usgs.gov/adrs/) is serving as a field laboratory for the sustained study of water-, gas-, and contaminant-transport processes, and the development of models and methods to characterize flow and transport. The research is built on multiple lines of data that include: micrometeorology; evapotranspiration; plant metrics; soil and sediment properties; unsaturated-zone moisture, temperature, and gas composition; geology and geophysics; and groundwater. Contaminant data include tritium, radiocarbon, volatile-organic compounds (VOCs), and elemental mercury. Presented here is a summary of monitoring tools and techniques that are being applied in studies of waste isolation and contaminant migration.
SUPIN: A Computational Tool for Supersonic Inlet Design
NASA Technical Reports Server (NTRS)
Slater, John W.
2016-01-01
A computational tool named SUPIN is being developed to design and analyze the aerodynamic performance of supersonic inlets. The inlet types available include the axisymmetric pitot, three-dimensional pitot, axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flow-field is divided into parts to provide a framework for the geometry and aerodynamic modeling. Each part of the inlet is defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods which provide for quick design and analysis. SUPIN provides inlet geometry in the form of coordinates, surface angles, and cross-sectional areas. SUPIN can generate inlet surface grids and three-dimensional, structured volume grids for use with higher-fidelity computational fluid dynamics (CFD) analysis. Capabilities highlighted in this paper include the design and analysis of streamline-traced external-compression inlets, modeling of porous bleed, and the design and analysis of mixed-compression inlets. CFD analyses are used to verify the SUPIN results.
Hu, Long; Xu, Zhiyu; Hu, Boqin; Lu, Zhi John
2017-01-09
Recent genomic studies suggest that novel long non-coding RNAs (lncRNAs) are specifically expressed and far outnumber annotated lncRNA sequences. To identify and characterize novel lncRNAs in RNA sequencing data from new samples, we have developed COME, a coding potential calculation tool based on multiple features. It integrates multiple sequence-derived and experiment-based features using a decompose-compose method, which makes it more accurate and robust than other well-known tools. We also showed that COME was able to substantially improve the consistency of prediction results from other coding potential calculators. Moreover, COME annotates and characterizes each predicted lncRNA transcript with multiple lines of supporting evidence, which are not provided by other tools. Remarkably, we found that one subgroup of lncRNAs classified by such supporting features (i.e. conserved local RNA secondary structure) was highly enriched in a well-validated database (lncRNAdb). We further found that the conserved structural domains on lncRNAs had a better chance than other RNA regions to interact with RNA binding proteins, based on the recent eCLIP-seq data in human, indicating their potential regulatory roles. Overall, we present COME as an accurate, robust and multiple-feature supported method for the identification and characterization of novel lncRNAs. The software implementation is available at https://github.com/lulab/COME. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
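The abstract does not detail COME's decompose-compose scoring rule, so the following is only a minimal sketch of the general idea of integrating multiple sequence-derived and experiment-based features into one coding-potential call; the feature names, weights, and threshold are hypothetical:

```python
def coding_potential(features, weights, threshold=0.5):
    """Combine normalized feature scores (each in 0..1) into a single
    coding-potential score by weighted averaging, then threshold it.
    COME's actual method is more elaborate; this only illustrates
    multi-feature integration."""
    total = sum(weights.values())
    score = sum(weights[k] * features[k] for k in weights) / total
    label = "coding" if score >= threshold else "noncoding"
    return label, score

# Hypothetical feature scores for one transcript
features = {"orf_coverage": 0.9, "conservation": 0.8, "expression": 0.2}
weights = {"orf_coverage": 1.0, "conservation": 1.0, "expression": 1.0}
label, score = coding_potential(features, weights)
```

A transcript with strong ORF and conservation evidence but low expression support would still score above the (assumed) 0.5 threshold here, which is why the weighting of each feature matters in practice.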
Wenisch, Robert; Lungwitz, Frank; Hanf, Daniel; Heller, René; Zscharschuch, Jens; Hübner, René; von Borany, Johannes; Abrasonis, Gintautas; Gemming, Sibylle; Escobar-Galindo, Ramon; Krause, Matthias
2018-06-13
A new cluster tool for in situ real-time processing and depth-resolved compositional, structural and optical characterization of thin films at temperatures from -100 to 800 °C is described. The implemented techniques comprise magnetron sputtering, ion irradiation, Rutherford backscattering spectrometry, Raman spectroscopy, and spectroscopic ellipsometry. The capability of the cluster tool is demonstrated for a layer stack MgO/amorphous Si (∼60 nm)/Ag (∼30 nm), deposited at room temperature and crystallized with partial layer exchange by heating up to 650 °C. Its initial and final composition, stacking order, and structure were monitored in situ in real time and a reaction progress was defined as a function of time and temperature.
Phases of ERA - Risk Characterization
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Chemical Classes - Nanomaterials
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Chemical Classes - Other Organics
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Lifestages and Populations - Lifestages
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Lifestages and Populations
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Media - Consumer Products
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Media - Water and Sediment
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Tiers and Types
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Media - Soil and Dust
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Chemical Classes - Pesticides
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Media - Aquatic Biota
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process, from collection of samples through their analysis.
Exposure Assessment Tools by Media - Soil and Dust
2017-02-13
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Exposure Assessment Tools by Chemical Classes - Other ...
2017-02-13
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Exposure Pathways - Soil
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Exposure Pathways - Food Chains
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Exposure Pathways - References
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Receptors - Habitats and Ecosystems
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Effects - Effects In ERA
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Stressors - Stressors in ERA
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Receptors - Receptors in ERA
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA EcoBox Tools by Exposure Pathways - Air
EPA EcoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
EPA ExpoBox: Submit Tool Information
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment databases.
Modelling and interpreting spectral energy distributions of galaxies with BEAGLE
NASA Astrophysics Data System (ADS)
Chevallard, Jacopo; Charlot, Stéphane
2016-10-01
We present a new-generation tool to model and interpret spectral energy distributions (SEDs) of galaxies, which incorporates in a consistent way the production of radiation and its transfer through the interstellar and intergalactic media. This flexible tool, named BEAGLE (for BayEsian Analysis of GaLaxy sEds), allows one to build mock galaxy catalogues as well as to interpret any combination of photometric and spectroscopic galaxy observations in terms of physical parameters. The current version of the tool includes versatile modelling of the emission from stars and photoionized gas, attenuation by dust, and different instrumental effects, such as spectroscopic flux calibration and line spread function. We show a first application of the BEAGLE tool to the interpretation of broad-band SEDs of a published sample of ~10^4 galaxies at redshifts 0.1 ≲ z ≲ 8. We find that the constraints derived on photometric redshifts using this multipurpose tool are comparable to those obtained using public, dedicated photometric-redshift codes and quantify this result in a rigorous statistical way. We also show how the post-processing of BEAGLE output data with the PYTHON extension PYP-BEAGLE allows the characterization of systematic deviations between models and observations, in particular through posterior predictive checks. The modular design of the BEAGLE tool allows easy extensions to incorporate, for example, the absorption by neutral galactic and circumgalactic gas, and the emission from an active galactic nucleus, dust and shock-ionized gas. Information about public releases of the BEAGLE tool will be maintained on http://www.jacopochevallard.org/beagle.
RNA motif search with data-driven element ordering.
Rampášek, Ladislav; Jimenez, Randi M; Lupták, Andrej; Vinař, Tomáš; Brejová, Broňa
2016-05-18
In this paper, we study the problem of RNA motif search in long genomic sequences. This approach uses a combination of sequence and structure constraints to uncover new distant homologs of known functional RNAs. The problem is NP-hard and is traditionally solved by backtracking algorithms. We have designed a new algorithm for RNA motif search and implemented a new motif search tool RNArobo. The tool enhances the RNAbob descriptor language, allowing insertions in helices, which enables better characterization of ribozymes and aptamers. A typical RNA motif consists of multiple elements and the running time of the algorithm is highly dependent on their ordering. By approaching the element ordering problem in a principled way, we demonstrate more than 100-fold speedup of the search for complex motifs compared to previously published tools. We have developed a new method for RNA motif search that allows for a significant speedup of the search of complex motifs that include pseudoknots. Such speed improvements are crucial at a time when the rate of DNA sequencing outpaces growth in computing. RNArobo is available at http://compbio.fmph.uniba.sk/rnarobo .
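The abstract attributes RNArobo's speedup to a principled ordering of motif elements but does not give the heuristic itself. As a toy sketch of the general "data-driven element ordering" idea, one could estimate each element's match frequency on representative sequence data and schedule the rarest element first, so the backtracking search prunes early; the element strings and training sequence below are hypothetical:

```python
def data_driven_order(elements, training_seq):
    # Estimate how often each motif element matches real sequence data,
    # then schedule the rarest (most selective) element first so the
    # backtracking search fails fast on non-matching positions.
    freq = {e: training_seq.count(e) for e in elements}
    # Tie-break on the element string for a deterministic order.
    return sorted(elements, key=lambda e: (freq[e], e))

order = data_driven_order(["AT", "GGGC", "GC"], "ATATATGCGC")
```

Here "GGGC" never occurs in the training sequence, so it would be searched first; the common dinucleotides come last. RNArobo's actual ordering is more sophisticated, but the principle that selectivity estimated from data should drive the search order is the same.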
Advanced Tools for River Science: EAARL and MD_SWMS: Chapter 3
Kinzel, Paul J.
2009-01-01
Disruption of flow regimes and sediment supplies, induced by anthropogenic or climatic factors, can produce dramatic alterations in river form, vegetation patterns, and associated habitat conditions. To improve habitat in these fluvial systems, resource managers may choose from a variety of treatments including flow and/or sediment prescriptions, vegetation management, or engineered approaches. Monitoring protocols developed to assess the morphologic response of these treatments require techniques that can measure topographic changes above and below the water surface efficiently, accurately, and in a standardized, cost-effective manner. Similarly, modeling of flow, sediment transport, habitat, and channel evolution requires characterization of river morphology for model input and verification. Recent developments by the U.S. Geological Survey with regard to both remotely sensed methods (the Experimental Advanced Airborne Research LiDAR; EAARL) and computational modeling software (the Multi-Dimensional Surface-Water Modeling System; MD_SWMS) have produced advanced tools for spatially explicit monitoring and modeling in aquatic environments. In this paper, we present a pilot study conducted along the Platte River, Nebraska, that demonstrates the combined use of these river science tools.
Chemical and Biological Tools for the Preparation of Modified Histone Proteins
Howard, Cecil J.; Yu, Ruixuan R.; Gardner, Miranda L.; Shimko, John C.; Ottesen, Jennifer J.
2016-01-01
Eukaryotic chromatin is a complex and dynamic system in which the DNA double helix is organized and protected by interactions with histone proteins. This system is regulated through a large network of dynamic post-translational modifications (PTMs) that ensures proper gene transcription, DNA repair, and other processes involving DNA. Homogeneous protein samples with precisely characterized modification sites are necessary to better understand the functions of modified histone proteins. Here, we discuss sets of chemical and biological tools that have been developed for the preparation of modified histones, with a focus on the appropriate choice of tool for a given target. We start with genetic approaches for the creation of modified histones, including the incorporation of genetic mimics of histone modifications, chemical installation of modification analogs, and the use of the expanded genetic code to incorporate modified amino acids. Additionally, we will cover the chemical ligation techniques that have been invaluable in the generation of complex modified histones that are indistinguishable from their natural counterparts. Finally, we will end with a prospectus on future directions of synthetic chromatin in living systems. PMID:25863817
Allen, Carrie; Zarowitz, Barbara; O'Shea, Terrence; Peterson, Edward; Yonan, Charles; Waterman, Fanta
Pseudobulbar Affect (PBA) is a neurologic condition characterized by involuntary outbursts of crying and/or laughing disproportionate to patient mood or social context. Although an estimated 9% of nursing home residents have symptoms suggestive of PBA, they are not routinely screened. Our goal was to develop an electronic screening tool based upon characteristics common to nursing home residents with PBA identified through medical record data. Nursing home residents with PBA treated with dextromethorphan hydrobromide/quinidine sulfate (n = 140) were compared to age-, gender-, and dementia-diagnosis-matched controls without PBA or treatment (n = 140). Comparative categories included diagnoses, medication use and symptom documentation. Using a multivariable regression and best decision rule analysis, we found PBA in nursing home residents was associated with chart documentation of uncontrollable crying, presence of a neurologic disorder (e.g., Parkinson's disease), or by the documented presence of at least 2 of the following: stroke, severe cognitive impairment, and schizophrenia. Based on these risk factors, an electronic screening tool was created. Copyright © 2017 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin Leigh; Smith, Curtis Lee; Burns, Douglas Edward
This report describes the development plan for a new multi-partner External Hazards Experimental Group (EHEG) coordinated by Idaho National Laboratory (INL) within the Risk-Informed Safety Margin Characterization (RISMC) technical pathway of the Light Water Reactor Sustainability Program. Currently, there is limited data available for development and validation of the tools and methods being developed in the RISMC Toolkit. The EHEG is being developed to obtain high-quality, small- and large-scale experimental data for validation of RISMC tools and methods in a timely and cost-effective way. The group of universities and national laboratories that will eventually form the EHEG (which is ultimately expected to include both the initial participants and other universities and national laboratories that have been identified) has the expertise and experimental capabilities needed to both obtain and compile existing data archives and perform additional seismic and flooding experiments. The data developed by EHEG will be stored in databases for use within RISMC. These databases will be used to validate the advanced external hazard tools and methods.
Complex Networks Analysis of Manual and Machine Translations
NASA Astrophysics Data System (ADS)
Amancio, Diego R.; Antiqueira, Lucas; Pardo, Thiago A. S.; da F. Costa, Luciano; Oliveira, Osvaldo N.; Nunes, Maria G. V.
Complex networks have been increasingly used in text analysis, including in connection with natural language processing tools, as important text features appear to be captured by the topology and dynamics of the networks. Following previous works that apply complex networks concepts to text quality measurement, summary evaluation, and author characterization, we now focus on machine translation (MT). In this paper we assess the possible representation of texts as complex networks to evaluate cross-linguistic issues inherent in manual and machine translation. We show that different quality translations generated by MT tools can be distinguished from their manual counterparts by means of metrics such as in- (ID) and out-degrees (OD), clustering coefficient (CC), and shortest paths (SP). For instance, we demonstrate that the average OD in networks of automatic translations consistently exceeds the values obtained for manual ones, and that the CC values of source texts are not preserved for manual translations, but are for good automatic translations. This probably reflects the text rearrangements humans perform during manual translation. We envisage that such findings could lead to better MT tools and automatic evaluation metrics.
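The metrics named in the abstract (out-degree, clustering coefficient, shortest paths) are computed on word-adjacency networks built from the texts. As a minimal, stdlib-only sketch of the representation, assuming the simplest construction (one directed edge per consecutive word pair, which ignores the co-occurrence windows and weighting a real study would use):

```python
from collections import defaultdict

def word_adjacency(text):
    # Directed word-adjacency network: an edge w1 -> w2 for each
    # consecutive word pair in the (lowercased) text.
    words = text.lower().split()
    edges = defaultdict(set)
    for a, b in zip(words, words[1:]):
        edges[a].add(b)
    return edges

def avg_out_degree(edges):
    # Average out-degree over all nodes, including pure sinks
    # (words that appear only as edge targets).
    nodes = set(edges) | {w for targets in edges.values() for w in targets}
    return sum(len(targets) for targets in edges.values()) / len(nodes)

avg = avg_out_degree(word_adjacency("the cat sat on the mat"))
```

Comparing such averages between a source text, its manual translation, and an MT output is the kind of measurement the paper reports, e.g. its finding that automatic translations show consistently higher average out-degree than manual ones.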
NASA Astrophysics Data System (ADS)
Wanare, S. P.; Kalyankar, V. D.
2018-04-01
Friction stir welding is emerging as a promising technique for joining lighter metal alloys because of its several advantages over conventional fusion welding processes, such as low thermal distortion, good mechanical properties, and fine weld joint microstructure. This review article mainly focuses on analysis of the microstructure and mechanical properties of friction stir welded joints. Various microstructure characterization techniques used by previous researchers, such as optical microscopy, x-ray diffraction, electron probe microanalysis, transmission electron microscopy, scanning electron microscopy with electron backscattered diffraction, and energy dispersive spectroscopy, are thoroughly overviewed and their results are discussed. The effects of friction stir welding process parameters, such as tool rotational speed, welding speed, tool plunge depth, axial force, the ratio of tool shoulder diameter to tool pin diameter, and tool geometry, on the microstructure and mechanical properties of welded joints are studied and critical observations are noted. The microstructure examinations carried out by previous researchers on various zones of welded joints, such as the weld zone, heat-affected zone, and base metal, are studied and critical remarks are presented. The mechanical performance of friction stir welded joints, based on tensile tests, micro-hardness tests, etc., is discussed. This article includes an exhaustive literature review of standard research articles, which may serve as a ready reference for subsequent researchers to establish their line of action.
Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges
NASA Technical Reports Server (NTRS)
Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam
2014-01-01
As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. 
The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal values plus increments for each flight condition. This paper presents the status of testing in the BR&T water tunnel, analysis of the resulting data, and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.
Ask Pete, software planning and estimation through project characterization
NASA Technical Reports Server (NTRS)
Kurtz, T.
2001-01-01
Ask Pete, was developed by NASA to provide a tool for integrating the estimation and planning activities for a software development effort. It incorporates COCOMO II estimating with NASA's software development practices and IV&V criteria to characterize a project. This characterization is then used to generate estimates and tailored planning documents.
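Ask Pete builds its estimates on COCOMO II. As a minimal sketch of the COCOMO II post-architecture effort equation, Effort = A · Size^E · Π(EM) with E = B + 0.01 · ΣSF, using the published nominal constants A = 2.94 and B = 0.91 (the project inputs below are purely illustrative, not taken from Ask Pete):

```python
import math

def cocomo2_effort(ksloc, scale_factors, effort_multipliers, a=2.94, b=0.91):
    # COCOMO II post-architecture model: effort in person-months.
    # The scale exponent E grows with the project's scale factors (SF);
    # the effort multipliers (EM) adjust for cost drivers such as
    # product complexity and personnel capability.
    e = b + 0.01 * sum(scale_factors)
    return a * (ksloc ** e) * math.prod(effort_multipliers)

# Illustrative project: 10 KSLOC, scale factors summing to 9 (so E = 1.0),
# all cost drivers nominal (every EM = 1.0) -> 29.4 person-months
effort = cocomo2_effort(10, [9], [1.0])
```

A tool like Ask Pete layers project characterization on top of such a formula: the questionnaire answers select the scale factors and effort multipliers, and the resulting estimate feeds the tailored planning documents.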
Fabrication and Characterization of High Temperature Resin/Carbon Nanofiber Composites
NASA Technical Reports Server (NTRS)
Ghose, Sayata; Watson, Kent A.; Working, Dennis C.; Criss, Jim M.; Siochi, Emilie J.; Connell, John W.
2005-01-01
As part of ongoing efforts to develop multifunctional advanced composites, blends of PETI-330 and carbon nanofibers (CNF) were prepared and characterized. Dry mixing techniques were employed and the effect of CNF loading level on melt viscosity was determined. The resulting powders were characterized for degree of mixing and for thermal and rheological properties. Based on the characterization results, samples containing 30 and 40 wt% CNF were scaled up to approximately 300 g and used to fabricate moldings 10.2 cm x 15.2 cm x 0.32 cm thick. The moldings were fabricated by injecting the mixtures at 260-280 C into a stainless steel tool followed by curing for 1 h at 371 C. The tool was designed to impart high shear during the injection process in an attempt to achieve some alignment of CNFs in the flow direction. The moldings obtained were subsequently characterized for thermal, mechanical and electrical properties. The degree of dispersion and alignment of CNFs were investigated using high-resolution scanning electron microscopy. The preparation and preliminary characterization of PETI-330/CNF composites will be discussed.
Vuong, Helen E.; de Sevilla Müller, Luis Pérez; Hardi, Claudia N.; McMahon, Douglas G.; Brecha, Nicholas C.
2015-01-01
Transgenic mouse lines are essential tools for understanding the connectivity, physiology and function of neuronal circuits, including those in the retina. This report compares transgene expression in the retina of a tyrosine hydroxylase (TH)-red fluorescent protein (RFP) line with three catecholamine-related Cre recombinase lines [TH-bacterial artificial chromosome (BAC)-, TH-, and dopamine transporter (DAT)-Cre] that were crossed with a ROSA26-tdTomato reporter line. Retinas were evaluated and immunostained with commonly used antibodies including those directed to TH, GABA and glycine to characterize the RFP or tdTomato fluorescent-labeled amacrine cells, and an antibody directed to RNA-binding protein with multiple splicing to identify ganglion cells. In TH-RFP retinas, types 1 and 2 dopamine (DA) amacrine cells were identified by their characteristic cellular morphology and type 1 DA cells by their expression of TH immunoreactivity. In the TH-BAC-, TH-, and DAT-tdTomato retinas, less than 1%, ~6%, and 0%, respectively, of the fluorescent cells were the expected type 1 DA amacrine cells. Instead, in the TH-BAC-tdTomato retinas, fluorescently labeled AII amacrine cells were predominant, with some medium somal diameter ganglion cells. In TH-tdTomato retinas, fluorescence was in multiple neurochemical amacrine cell types, including four types of polyaxonal amacrine cells. In DAT-tdTomato retinas, fluorescence was in GABA immunoreactive amacrine cells, including two types of bistratified and two types of monostratified amacrine cells. Although each of the Cre lines was generated with the intent to specifically label DA cells, our findings show a cellular diversity in Cre expression in the adult retina and indicate the importance of careful characterization of transgene labeling patterns. These mouse lines with their distinctive cellular labeling patterns will be useful tools for future studies of retinal function and visual processing. PMID:26335381
Vuong, H E; Pérez de Sevilla Müller, L; Hardi, C N; McMahon, D G; Brecha, N C
2015-10-29
Transgenic mouse lines are essential tools for understanding the connectivity, physiology and function of neuronal circuits, including those in the retina. This report compares transgene expression in the retina of a tyrosine hydroxylase (TH)-red fluorescent protein (RFP) mouse line with three catecholamine-related Cre recombinase mouse lines [TH-bacterial artificial chromosome (BAC)-, TH-, and dopamine transporter (DAT)-Cre] that were crossed with a ROSA26-tdTomato reporter line. Retinas were evaluated and immunostained with commonly used antibodies including those directed to TH, GABA and glycine to characterize the RFP or tdTomato fluorescent-labeled amacrine cells, and an antibody directed to RNA-binding protein with multiple splicing to identify ganglion cells. In TH-RFP retinas, types 1 and 2 dopamine (DA) amacrine cells were identified by their characteristic cellular morphology and type 1 DA cells by their expression of TH immunoreactivity. In the TH-BAC-, TH-, and DAT-tdTomato retinas, less than 1%, ∼ 6%, and 0%, respectively, of the fluorescent cells were the expected type 1 DA amacrine cells. Instead, in the TH-BAC-tdTomato retinas, fluorescently labeled AII amacrine cells were predominant, with some medium diameter ganglion cells. In TH-tdTomato retinas, fluorescence was in multiple neurochemical amacrine cell types, including four types of polyaxonal amacrine cells. In DAT-tdTomato retinas, fluorescence was in GABA immunoreactive amacrine cells, including two types of bistratified and two types of monostratified amacrine cells. Although each of the Cre lines was generated with the intent to specifically label DA cells, our findings show a cellular diversity in Cre expression in the adult retina and indicate the importance of careful characterization of transgene labeling patterns. These mouse lines with their distinctive cellular labeling patterns will be useful tools for future studies of retinal function and visual processing. 
Published by Elsevier Ltd.
Fafetine, J M; Domingos, A; Antunes, S; Esteves, A; Paweska, J T; Coetzer, J A W; Rutten, V P M G; Neves, L
2013-11-01
Due to the unpredictable and explosive nature of Rift Valley fever (RVF) outbreaks, rapid and accurate diagnostic assays for low-resource settings are urgently needed. To improve existing diagnostic assays, monoclonal antibodies (MAbs) specific for the nucleocapsid protein of RVF virus (RVFV) were produced and characterized. Four IgG2a MAbs showed specific binding to denatured nucleocapsid protein, both from a recombinant source and from inactivated RVFV, in Western blot analysis and in an enzyme-linked immunosorbent assay (ELISA). Cross-reactivity with genetically related and non-related arboviruses including Bunyamwera and Calovo viruses (Bunyaviridae family), West Nile and Dengue-2 viruses (Flaviviridae family), and Sindbis and Chikungunya viruses (Togaviridae family) was not detected. These MAbs represent a useful tool for the development of rapid diagnostic assays for early recognition of RVF. © 2013 Blackwell Verlag GmbH.
GeoLab's First Field Trials, 2010 Desert RATS: Evaluating Tools for Early Sample Characterization
NASA Technical Reports Server (NTRS)
Evans, Cindy A.; Bell, M. S.; Calaway, M. J.; Graff, Trevor; Young, Kelsey
2011-01-01
As part of an accelerated prototyping project to support science operations tests for future exploration missions, we designed and built a geological laboratory, GeoLab, that was integrated into NASA's first generation Habitat Demonstration Unit-1/Pressurized Excursion Module (HDU1-PEM). GeoLab includes a pressurized glovebox for transferring and handling samples collected on geological traverses, and a suite of instruments for collecting preliminary data to help characterize those samples. The GeoLab and the HDU1-PEM were tested for the first time as part of the 2010 Desert Research and Technology Studies (DRATS), NASA's analog field exercise for testing mission technologies. The HDU1-PEM and GeoLab participated in two weeks of joint operations in northern Arizona with two crewed rovers and the DRATS science team.
Kim, Unyong; Oh, Myung Jin; Seo, Youngsuk; Jeon, Yinae; Eom, Joon-Ho; An, Hyun Joo
2017-09-01
Glycosylation of recombinant human erythropoietins (rhEPOs) is significantly associated with a drug's quality and potency. Thus, comprehensive characterization of glycosylation is vital to assessing biotherapeutic quality and establishing the equivalency of biosimilar rhEPOs. However, current glycan analysis mainly focuses on N-glycans because of the absence of analytical tools to liberate O-glycans with high sensitivity. We developed a selective and sensitive method to profile native O-glycans on rhEPOs. O-glycosylation on rhEPO, including O-acetylation on a sialic acid, was comprehensively characterized, with details such as O-glycan structure and O-acetyl-modification site obtained from tandem MS. This method may be applied to QC and batch analysis not only of rhEPOs but also of other biotherapeutics bearing multiple O-glycosylations.
Atomization and vaporization characteristics of airblast fuel injection inside a venturi tube
NASA Technical Reports Server (NTRS)
Sun, H.; Chue, T.-H.; Lai, M.-C.; Tacina, R. R.
1993-01-01
This paper describes the experimental and numerical characterization of capillary fuel injection, atomization, dispersion, and vaporization of liquid fuel in a coflowing air stream inside a single venturi tube. The experimental techniques used are all laser-based. A phase Doppler analyzer was used to characterize the atomization and vaporization process. Planar laser-induced fluorescence visualizations give a good qualitative picture of the fuel droplet and vapor distribution; limited quantitative capabilities of the technique are also demonstrated. A modified version of KIVA-II was used to simulate the entire spray process, including breakup and vaporization. The advantage of the venturi nozzle is demonstrated in terms of better atomization, more uniform F/A distribution, and lower pressure drop. Multidimensional spray calculations can be used as a design tool only if care is taken with the breakup model and the wall impingement process.
Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials
Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.
2015-01-01
Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347
Development and Characterization of Titanium Compound Nanostructures
NASA Astrophysics Data System (ADS)
Zhou, Zhou
The development and characterization of titanium compound nanostructures have been achieved for potential applications in the energy industry. Oil and gas, a traditional industrial field, shows growing demand for the active implementation of nanotechnology, given the numerous advantages that nanomaterials can bring to both product performance and field operations. Using chemical vapor deposition and liquid exfoliation, various titanium compound nanostructures were synthesized in this project; attractively, these two fabrication methods are recognized as industrially friendly in terms of cost efficiency and productivity. The development of nanostructures aimed at oil and gas field applications presents novel solutions for existing issues, such as the low durability of drilling tools, high friction in mechanical operations, and ineffective heat dissipation. Titanium compound nanostructures, including titanium borides, nitrides, and sulfides, are therefore investigated for applications such as protective coating, lubrication, and thermal management.
Response to Paper III Economics in the Civics Curriculum. A Reaction to Andrew F. Brimmer.
ERIC Educational Resources Information Center
Schug, Mark C.
According to the document, Dr. Andrew Brimmer did an excellent job of identifying emerging economic concerns. Dr. Brimmer's characterization of economics as a tool kit can help young people examine important social questions using principles of economics as the tool for analysis. One way to build an economics tool kit is by placing more stress on…
Machine tools error characterization and compensation by on-line measurement of artifact
NASA Astrophysics Data System (ADS)
Wahid Khan, Abdul; Chen, Wuyi; Wu, Lili
2009-11-01
Most manufacturing machine tools are utilized for mass or batch production with high accuracy under a deterministic manufacturing principle. The volumetric accuracy of a machine tool depends on the positional accuracy of the cutting tool, probe, or end effector relative to the workpiece in the workspace volume. In this paper, a methodology is presented for the volumetric calibration of machine tools by on-line measurement of an artifact or an object of a similar type. The machine tool's geometric errors were characterized through a standard or an artifact with geometry similar to the mass- or batch-production product. The artifact was measured at an arbitrary position in the volumetric workspace with a calibrated Renishaw touch-trigger probe system, and the positional errors were stored in a computer so that the manufacturing batch could subsequently be run through compensated codes. The methodology proved effective for manufacturing high-precision components with greater dimensional accuracy and reliability. Calibration by on-line measurement improves the manufacturing process through the deterministic manufacturing principle and was found efficient and economical, but it is limited to the workspace or envelope surface of the measured artifact's geometry or profile.
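The compensation step described above can be sketched in a few lines. This is an illustrative assumption, not the authors' implementation: the error table keyed by measured nominal positions and the nearest-neighbor lookup are simplifications of whatever interpolation the real system uses.

```python
# Hedged sketch of error compensation from an on-line artifact measurement:
# positional errors measured at calibrated points are stored in a table,
# and each commanded coordinate is corrected by subtracting the error at
# the nearest measured point (nearest-neighbor lookup is an assumption;
# a production system would interpolate over the workspace).

def compensate(command, error_table):
    """command:     (x, y, z) nominal commanded position.
    error_table: dict mapping measured (x, y, z) points to (ex, ey, ez)."""
    nearest = min(error_table,
                  key=lambda p: sum((a - b) ** 2 for a, b in zip(p, command)))
    err = error_table[nearest]
    return tuple(c - e for c, e in zip(command, err))

# Two measured points with their observed positional errors (mm).
table = {(0.0, 0.0, 0.0): (0.01, 0.0, -0.005),
         (100.0, 0.0, 0.0): (0.02, -0.01, 0.0)}
corrected = compensate((99.0, 1.0, 0.0), table)  # uses the (100, 0, 0) entry
```

Note the limitation stated in the abstract: the table only covers the envelope of the measured artifact, so commands outside that region cannot be reliably compensated.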
Elastography: current status, future prospects, and making it work for you.
Garra, Brian S
2011-09-01
Elastography has emerged as a useful adjunct tool for ultrasound diagnosis. Elastograms are images of tissue stiffness and may be in color, grayscale, or a combination of the two. The first and most common application of elastography is for the diagnosis of breast lesions where studies have shown an area under the receiver operating characteristic curve of 0.88 to 0.95 for distinguishing cancer from benign lesions. The technique is also useful for the diagnosis of complex cysts, although different scanners may vary in how they display such lesions. Recent advances in elastography include quantification using strain ratios, acoustic radiation force impulse imaging, and shear wave velocity estimation. These are useful not only for characterizing focal masses but also for diagnosing diffuse organ diseases such as liver cirrhosis. Other near term potential applications for elastography include characterization of thyroid nodules and lymph node evaluation for metastatic disease. Prostate cancer detection is also a potential application, but obtaining high-quality elastograms may be difficult. This area is evolving. Other promising applications include atheromatous plaque and arterial wall evaluation, venous thrombus evaluation, graft rejection, and monitoring of tumor ablation therapy. When contemplating the acquisition of a system with elastography in this rapidly evolving field, a clear picture of the manufacturer's plans for future upgrades (including quantification) should be obtained.
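Of the quantification methods mentioned above, the strain ratio is the simplest to illustrate. The sketch below assumes the common quasi-static definition (mean strain in a reference fat region divided by mean strain in the lesion); the function name and toy arrays are illustrative, not from any particular scanner.

```python
import numpy as np

# Hedged sketch of a strain-ratio computation for quasi-static elastography.
# Stiffer lesions deform less than reference fat under compression, so a
# reference/lesion ratio well above 1 suggests a stiffer, more suspicious mass.
def strain_ratio(strain_image, lesion_mask, reference_mask):
    return strain_image[reference_mask].mean() / strain_image[lesion_mask].mean()

# Toy 2x2 strain field: top row is the lesion ROI, bottom row reference fat.
strain = np.array([[0.010, 0.012],
                   [0.040, 0.038]])
lesion = np.array([[True, True], [False, False]])
fat = ~lesion
ratio = strain_ratio(strain, lesion, fat)  # ~3.5: lesion strains far less
```

Shear wave methods differ in that they report an absolute tissue property (wave speed or an elastic modulus estimate) rather than this relative ratio.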
Connecting Projects to Complete the In Situ Resource Utilization Paradigm
NASA Technical Reports Server (NTRS)
Linne, Diane L.; Sanders, Gerald B.
2017-01-01
Resource knowledge needs (for each category, identify specifics and which part of ISRU needs them): Terrain - slope, rockiness, traction parameters; Physical/Geotechnical - hardness, density, cohesion, etc. (e.g., excavation needs to know hardness and density; soil processing needs to know density and cohesion); Mineral, Volatile, Atmosphere, and Environment - identify specifics for each. Resource Characterization. What: develop an instrument suite to locate and evaluate the physical, mineral, and volatile resources at the lunar poles: a neutron spectrometer and near-infrared (IR) instrument to locate subsurface hydrogen/surface water, near IR for mineral identification, an auger drill for sample removal down to 1 m, and an oven with gas chromatograph/mass spectrometer to quantify the volatiles present. ISRU relevance: water/volatile resource characterization and subsurface material access and removal. Site Evaluation/Resource Mapping. What: develop and utilize new data products and tools for evaluating potential exploration sites for selection, and overlay mission data to map terrain, environment, and resource information (e.g., new techniques applied to generate digital elevation maps (DEMs) at the native scale of the images, 1 m/pxl). ISRU relevance: resource mapping and estimation with terrain and environment information is needed for extraction planning. Mission Planning and Operations. What: develop and utilize tools and procedures for planning mission operations and real-time changes; planning tools include detailed engineering models (e.g., power and data) of surface segment systems, allowing evaluation of designs. ISRU relevance: allows iterative engineering as a function of environment and hardware performance.
Jourdren, Laurent; Delaveau, Thierry; Marquenet, Emelie; Jacq, Claude; Garcia, Mathilde
2010-07-01
Recent improvements in microscopy technology allow detection of single molecules of RNA, but tools for large-scale automatic analyses of particle distributions are lacking. An increasing number of imaging studies emphasize the importance of mRNA localization in the definition of cell territory or the biogenesis of cell compartments. CORSEN is a new tool dedicated to three-dimensional (3D) distance measurements from imaging experiments especially developed to access the minimal distance between RNA molecules and cellular compartment markers. CORSEN includes a 3D segmentation algorithm allowing the extraction and the characterization of the cellular objects to be processed--surface determination, aggregate decomposition--for minimal distance calculations. CORSEN's main contribution lies in exploratory statistical analysis, cell population characterization, and high-throughput assays that are made possible by the implementation of a batch process analysis. We highlighted CORSEN's utility for the study of relative positions of mRNA molecules and mitochondria: CORSEN clearly discriminates mRNA localized to the vicinity of mitochondria from those that are translated on free cytoplasmic polysomes. Moreover, it quantifies the cell-to-cell variations of mRNA localization and emphasizes the necessity for statistical approaches. This method can be extended to assess the evolution of the distance between specific mRNAs and other cellular structures in different cellular contexts. CORSEN was designed for the biologist community with the concern to provide an easy-to-use and highly flexible tool that can be applied for diverse distance quantification issues.
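CORSEN's core measurement, the minimal 3D distance between RNA particles and compartment markers, can be sketched directly. The function below is an assumption-laden simplification: it takes already-segmented particle centroids and compartment surface voxels (CORSEN performs the 3D segmentation itself) and scales coordinates by voxel size, since microscopy stacks are usually anisotropic in z.

```python
import numpy as np

# Hedged sketch of a minimal-distance measurement in the spirit of CORSEN:
# for each RNA particle centroid, find the smallest 3D euclidean distance
# to any voxel of a compartment marker (e.g., a mitochondrial surface).
def min_distances(particles, surface_voxels, voxel_size=(1.0, 1.0, 1.0)):
    p = np.asarray(particles, float) * voxel_size      # scale to physical units
    s = np.asarray(surface_voxels, float) * voxel_size
    # Pairwise distance matrix, shape (n_particles, n_surface_voxels).
    d = np.linalg.norm(p[:, None, :] - s[None, :, :], axis=2)
    return d.min(axis=1)

# One particle at the origin; nearest surface voxel is a 3-4-0 triangle away.
dists = min_distances([[0, 0, 0]], [[3, 4, 0], [10, 10, 10]])
```

The per-cell distributions of such distances are what enable the statistical comparison of, say, mitochondria-bound versus free-polysome mRNAs described in the abstract.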
NASA Astrophysics Data System (ADS)
Shahamatnia, Ehsan; Dorotovič, Ivan; Fonseca, Jose M.; Ribeiro, Rita A.
2016-03-01
Developing specialized software tools is essential to support studies of solar activity evolution. With new space missions such as Solar Dynamics Observatory (SDO), solar images are being produced in unprecedented volumes. To capitalize on that huge data availability, the scientific community needs a new generation of software tools for automatic and efficient data processing. In this paper a prototype of a modular framework for solar feature detection, characterization, and tracking is presented. To develop an efficient system capable of automatic solar feature tracking and measuring, a hybrid approach combining specialized image processing, evolutionary optimization, and soft computing algorithms is being followed. The specialized hybrid algorithm for tracking solar features allows automatic feature tracking while gathering characterization details about the tracked features. The hybrid algorithm takes advantage of the snake model, a specialized image processing algorithm widely used in applications such as boundary delineation, image segmentation, and object tracking. Further, it exploits the flexibility and efficiency of Particle Swarm Optimization (PSO), a stochastic population-based optimization algorithm. PSO has been used successfully in a wide range of applications including combinatorial optimization, control, clustering, robotics, scheduling, and image processing and video analysis applications. The proposed tool, denoted PSO-Snake model, was already successfully tested in other works for tracking sunspots and coronal bright points. In this work, we discuss the application of the PSO-Snake algorithm for calculating the sidereal rotational angular velocity of the solar corona. To validate the results we compare them with published manual results performed by an expert.
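The PSO component of the hybrid algorithm follows a standard update rule, which can be sketched on a toy objective. The weights below (inertia w = 0.72, cognitive/social c1 = c2 = 1.49) are common textbook defaults, not the paper's tuned values, and minimizing a sphere function stands in for the paper's snake-energy objective.

```python
import random

# Hedged sketch of the canonical PSO velocity/position update that the
# PSO-Snake model builds on, demonstrated on a simple sphere function.
def pso_minimize(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
                 w=0.72, c1=1.49, c2=1.49):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [f(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=pbest_val.__getitem__)][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < f(gbest):
                    gbest = pos[i][:]
    return gbest

random.seed(0)
best = pso_minimize(lambda x: sum(v * v for v in x), dim=2)
```

In the PSO-Snake model, each particle would instead encode candidate snake control points, and f would be the snake's energy over the solar image.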
Towards Accurate Application Characterization for Exascale (APEX)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammond, Simon David
Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.
Boyce, Richard D.; Handler, Steven M.; Karp, Jordan F.; Perera, Subashan; Reynolds, Charles F.
2016-01-01
Introduction: A potential barrier to nursing home research is the limited availability of research quality data in electronic form. We describe a case study of converting electronic health data from five skilled nursing facilities (SNFs) to a research quality longitudinal dataset by means of open-source tools produced by the Observational Health Data Sciences and Informatics (OHDSI) collaborative. Methods: The Long-Term Care Minimum Data Set (MDS), drug dispensing, and fall incident data from five SNFs were extracted, translated, and loaded into version 4 of the OHDSI common data model. Quality assurance involved identifying errors using the Achilles data characterization tool and comparing both quality measures and drug exposures in the new database for concordance with externally available sources. Findings: Records for a total of 4,519 patients (95.1%) made it into the final database. Achilles identified 10 different types of errors that were addressed in the final dataset. Drug exposures based on dispensing were generally accurate when compared with medication administration data from the pharmacy services provider. Quality measures were generally concordant between the new database and Nursing Home Compare for measures with a prevalence ≥ 10%. Fall data recorded in MDS were found to be more complete than data from fall incident reports. Conclusions: The new dataset is ready to support observational research on topics of clinical importance in the nursing home, including patient-level prediction of falls. The extraction, translation, and loading process enabled the use of OHDSI data characterization tools that improved the quality of the final dataset. PMID:27891528
Vernooij, Robin W M; Willson, Melina; Gagliardi, Anna R
2016-04-14
Self-management is an important component of care for patients or consumers (henceforth termed patients) with chronic conditions. Research shows that patients view guidelines as potential sources of self-management support. However, few guidelines provide such support. The primary purpose of this study was to characterize effective types of self-management interventions that could be packaged as resources in (i.e., appendices) or with guidelines (i.e., accompanying products). We conducted a meta-review of systematic reviews that evaluated self-management interventions. MEDLINE, EMBASE, and the Cochrane Library were searched from 2005 to 2014 for English language systematic reviews. Data were extracted on study characteristics, intervention (content, delivery, duration, personnel, single or multifaceted), and outcomes. Interventions were characterized by the type of component for different domains (inform, activate, collaborate). Summary statistics were used to report the characteristics, frequency, and impact of the types of self-management components. A Measurement Tool to Assess Systematic Reviews (AMSTAR) was used to assess the methodological quality of included reviews. Seventy-seven studies were included (14 low, 44 moderate, 18 high risk of bias). Reviews addressed numerous clinical topics, most frequently diabetes (23, 30 %). Fifty-four focused on single (38 educational, 16 self-directed) and 21 on multifaceted interventions. Support for collaboration with providers was the least frequently used form of self-management. Most conditions featured multiple types of self-management components. The most frequently occurring type of self-management component across all studies was lifestyle advice (72 %), followed by psychological strategies (69 %), and information about the condition (49 %). In most reviews, the intervention both informed and activated patients (57, 76 %). 
Among the reviews that achieved positive results, 83 % of interventions involved activation alone, 94 % in combination with information, and 95 % in combination with information and collaboration. No trends in the characteristics and impact of self-management by condition were observed. This study revealed numerous opportunities for enhancing guidelines with resources for both patients and providers to support self-management, including single resources that provide information and/or prompt activation. Further research is needed to more firmly establish the statistical association between the characteristics of self-management support and outcomes, and to optimize the design of self-management resources that are included in or with guidelines, in particular resources that prompt collaboration with providers.
Exposure Assessment Tools by Approaches - Indirect Estimation (Scenario Evaluation)
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases, mode
System Sketch: A Visualization Tool to Improve Community Decision Making
Making decisions in coastal and estuarine management requires a comprehensive understanding of the linkages between environmental, social, and economic systems. SystemSketch is a web-based scoping tool designed to assist resource managers in characterizing their systems, explorin...
Exposure Assessment Tools by Lifestages and Populations - General Population
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Lifestages and Populations - Occupational Workers
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Lifestages and Populations - Residential Consumers
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Tiers and Types - Aggregate and Cumulative
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
LANDSCAPE ASSESSMENT TOOLS FOR WATERSHED CHARACTERIZATION
A combination of process-based, empirical and statistical models has been developed to assist states in their efforts to assess water quality, locate impairments over large areas, and calculate TMDL allocations. By synthesizing outputs from a number of these tools, LIPS demonstr...
Exposure Assessment Tools by Chemical Classes - Inorganics and Fibers
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Exposure Pathways - Water and Sediment
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
EPA EcoBox Tools by Exposure Pathways - Exposure Pathways In ERA
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Decoding the genome with an integrative analysis tool: combinatorial CRM Decoder.
Kang, Keunsoo; Kim, Joomyeong; Chung, Jae Hoon; Lee, Daeyoup
2011-09-01
The identification of genome-wide cis-regulatory modules (CRMs) and characterization of their associated epigenetic features are fundamental steps toward the understanding of gene regulatory networks. Although integrative analysis of available genome-wide information can provide new biological insights, the lack of novel methodologies has become a major bottleneck. Here, we present a comprehensive analysis tool called combinatorial CRM decoder (CCD), which utilizes the publicly available information to identify and characterize genome-wide CRMs in a species of interest. CCD first defines a set of the epigenetic features which is significantly associated with a set of known CRMs as a code called 'trace code', and subsequently uses the trace code to pinpoint putative CRMs throughout the genome. Using 61 genome-wide data sets obtained from 17 independent mouse studies, CCD successfully catalogued ∼12 600 CRMs (five distinct classes) including polycomb repressive complex 2 target sites as well as imprinting control regions. Interestingly, we discovered that ∼4% of the identified CRMs belong to at least two different classes named 'multi-functional CRM', suggesting their functional importance for regulating spatiotemporal gene expression. From these examples, we show that CCD can be applied to any potential genome-wide datasets and therefore will shed light on unveiling genome-wide CRMs in various species.
Lopez, Xavier Moles; Debeir, Olivier; Maris, Calliope; Rorive, Sandrine; Roland, Isabelle; Saerens, Marco; Salmon, Isabelle; Decaestecker, Christine
2012-09-01
Whole-slide scanners allow the digitization of an entire histological slide at very high resolution. This new acquisition technique opens a wide range of possibilities for addressing challenging image analysis problems, including the identification of tissue-based biomarkers. In this study, we use whole-slide scanner technology for imaging the proliferating activity patterns in tumor slides based on Ki67 immunohistochemistry. Faced with large images, pathologists require tools that can help them identify tumor regions that exhibit high proliferating activity, called "hot-spots" (HSs). Pathologists need tools that can quantitatively characterize these HS patterns. To respond to this clinical need, the present study investigates various clustering methods with the aim of identifying Ki67 HSs in whole tumor slide images. This task requires a method capable of identifying an unknown number of clusters, which may be highly variable in terms of shape, size, and density. We developed a hybrid clustering method, referred to as Seedlink. Compared to manual HS selections by three pathologists, we show that Seedlink provides an efficient way of detecting Ki67 HSs and improves the agreement among pathologists when identifying HSs. Copyright © 2012 International Society for Advancement of Cytometry.
Visualizing chemical functionality in plant cell walls
Zeng, Yining; Himmel, Michael E.; Ding, Shi-You
2017-11-30
Understanding plant cell wall cross-linking chemistry and polymeric architecture is key to the efficient utilization of biomass in all prospects, from rational genetic modification to downstream chemical and biological conversion to produce fuels and value chemicals. In fact, the bulk properties of cell wall recalcitrance are collectively determined by its chemical features over a wide range of length scales, from tissue and cellular to polymeric architectures. Microscopic visualization of cell walls from the nanometer to the micrometer scale offers an in situ approach to study their chemical functionality, considering its spatial and chemical complexity, and in particular offers the capability of characterizing biomass non-destructively and in real time during conversion processes. Microscopic characterization has revealed heterogeneity in the distribution of chemical features, which would otherwise be hidden in bulk analysis. Key microscopic features include cell wall type, wall layering, and wall composition, especially cellulose and lignin distributions. Microscopic tools, such as atomic force microscopy, stimulated Raman scattering microscopy, and fluorescence microscopy, have been applied to investigations of cell wall structure and chemistry, from the native wall to walls treated by thermochemical pretreatment and enzymatic hydrolysis. While advancing our current understanding of plant cell wall recalcitrance and deconstruction, microscopic tools with improved spatial resolution will steadily enhance our fundamental understanding of cell wall function.
Sma3s: A universal tool for easy functional annotation of proteomes and transcriptomes.
Casimiro-Soriguer, Carlos S; Muñoz-Mérida, Antonio; Pérez-Pulido, Antonio J
2017-06-01
The current cheapening of next-generation sequencing has led to an enormous growth in the number of sequenced genomes and transcriptomes, allowing wet labs to obtain sequences from their organisms of study. To make the most of these data, one of the first steps is the functional annotation of the protein-coding genes, which has traditionally been a slow and tedious task that can involve the characterization of thousands of sequences. Sma3s is an accurate computational tool for annotating proteins in an unattended way. We have now developed a completely new version, which includes functionalities that will be useful for both fundamental and applied science. The results provide functional categories such as biological processes, which are useful both for characterizing particular sequence datasets and for comparing results from different projects. One of the most important innovations is its low computational requirements: the complete annotation of a proteome or transcriptome usually takes around 24 hours on a personal computer. Sma3s has been tested with a large number of complete proteomes and transcriptomes, and it has demonstrated its potential in health science and other specific projects. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Fan, Ming; Thongsri, Tepwitoon; Axe, Lisa; Tyson, Trevor A
2005-06-01
A probabilistic approach was applied in an ecological risk assessment (ERA) to characterize risk and address uncertainty, employing Monte Carlo simulations to assess parameter and risk probability distributions. The simulation tool includes a Windows-based interface, an interactive and modifiable database management system (DBMS) that addresses a food web at multiple trophic levels, and a comprehensive evaluation of exposure pathways. To illustrate the model, ecological risks from depleted uranium (DU) exposure at the US Army Yuma Proving Ground (YPG) and Aberdeen Proving Ground (APG) were assessed and characterized. Probabilistic distributions showed that at YPG, a reduction in plant root weight is likely to occur (98% likelihood) from exposure to DU; for most terrestrial animals, the likelihood of adverse reproductive effects ranges from 0.1% to 44%. However, for the lesser long-nosed bat, effects are expected to occur (>99% likelihood) through a reduction in the size and weight of offspring. Based on available DU data for the firing range at APG, DU uptake is unlikely to affect the survival of aquatic plants and animals (<0.1% likelihood). Based on field and laboratory studies conducted at APG and YPG on pocket mice, kangaroo rats, white-throated woodrats, deer, and milfoil, the observed body burden concentrations fall within the distributions simulated at both sites.
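The Monte Carlo approach to risk characterization described in this record can be sketched in a few lines. The distributions, parameter values, and hazard-quotient threshold below are illustrative assumptions, not values from the YPG/APG assessment:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical parameter distributions (illustrative only): daily DU intake
# and a toxicity reference value (TRV), both in mg/kg/day.
intake = rng.lognormal(mean=np.log(0.5), sigma=0.6, size=N)
trv = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=N)

# Hazard quotient: values above 1 indicate a potential adverse effect.
hazard_quotient = intake / trv
likelihood = (hazard_quotient > 1).mean()
print(f"likelihood of adverse effect: {likelihood:.1%}")
```

Propagating full distributions instead of point estimates is what lets such an assessment report likelihoods like "98%" or "<0.1%" rather than a single hazard quotient.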
Chang, Bo
2016-01-01
Leber's congenital amaurosis (LCA) is an inherited retinal degenerative disease characterized by severe loss of vision in the first year of life. In addition to early vision loss, a variety of other eye-related abnormalities, including roving eye movements, deep-set eyes, and sensitivity to bright light, also occur with this disease. Many animal models of LCA are available, and their study has led to a better understanding of the pathology of the disease and to the development of therapeutic strategies aimed at curing or slowing it down. Mouse models, with their well-developed genetics and similarity to human physiology and anatomy, serve as powerful tools with which to investigate the etiology of human LCA. Such mice provide reproducible experimental systems for elucidating pathways of normal development and function, and for designing strategies and testing compounds for translational research and gene-based therapies aimed at delaying disease progression. In this chapter, I describe tools used in the discovery and evaluation of mouse models of LCA, including a Phoenix Image-Guided Optical Coherence Tomography (OCT) system and a Diagnosys Espion Visual Electrophysiology System. Three mouse models are described: the rd3 mouse model for LCA12 and LCA1, the rd12 mouse model for LCA2, and the rd16 mouse model for LCA10.
Joint properties of a tool machining process to guarantee fluid-proof abilities
NASA Astrophysics Data System (ADS)
Bataille, C.; Deltombe, R.; Jourani, A.; Bigerelle, M.
2017-12-01
This study addressed the impact of rod surface topography in contact with reciprocating seals. Rods were tooled with and without centreless grinding. All rods tooled with centreless grinding were fluid-proof, in contrast to rods tooled without centreless grinding, which either leaked or were fluid-proof. A method was developed to analyse the machining signature, and the software Mesrug™ was used to discriminate roughness parameters that can characterize the sealing functionality. According to this surface roughness analysis, a fluid-proof rod tooled without centreless grinding presents aperiodic large plateaus, and the relevant roughness parameter for characterizing the sealing functionality is the density of summits, Sds. Increasing the density of summits counteracts leakage, which may be because motif decomposition, via the Wolf pruning algorithm, integrates three topographical components: circularity (perpendicular long-wave roughness), longitudinal waviness, and roughness. A 3D analytical contact model was applied to analyse the contact area of each type of sample with the seal surface. This model provides a leakage probability, and the results were consistent with the interpretation of the topographical analysis.
ESH assessment of advanced lithography materials and processes
NASA Astrophysics Data System (ADS)
Worth, Walter F.; Mallela, Ram
2004-05-01
The ESH Technology group at International SEMATECH is conducting environment, safety, and health (ESH) assessments in collaboration with lithography technologists evaluating the performance of an increasing number of new materials and technologies being considered for advanced lithography, such as 157 nm photoresist and extreme ultraviolet (EUV) lithography. By performing data searches for 75 critical data types, emissions characterizations, and industrial hygiene (IH) monitoring during the use of the resist candidates, it has been shown that the best performing resist formulations, so far, appear to be free of potential ESH concerns. The ESH assessment of the EUV lithography tool being developed for SEMATECH has identified several features of the tool that are of ESH concern: high energy consumption, poor energy conversion efficiency, tool complexity, potential ergonomic and safety-interlock issues, use of high-powered lasers, generation of ionizing radiation (soft X-rays), the need for adequate shielding, and characterization of the debris formed by the extreme temperature of the plasma. By bringing these ESH challenges to the attention of the technologists and tool designers, it is hoped that the processes and tools can be made more ESH friendly.
Elliott, Caroline M.; Jacobson, Robert B.; Chojnacki, Kimberly A.
2006-01-01
Hydroacoustic tools were used to map depth, elevation, and substrate on DeSoto Lake in March 2006. DeSoto Lake, located on the DeSoto National Wildlife Refuge in Iowa and Nebraska, is one of the largest oxbow lakes of the Missouri River system. It is used by over 500,000 migratory birds each fall and spring and is also an important aquatic resource for anglers. Management concerns at the lake include the effects of erosion and sedimentation, aquatic vegetation establishment, shorebird habitat availability at different lake levels, and fish habitat structure. DeSoto Lake was cut off from the Missouri River in 1960, and the current mapping updates previous lower-resolution bathymetric maps created from lake surveys in 1967 and 1979. The new maps provide managers tools to assess aquatic habitats and provide a baseline for future monitoring of lake sedimentation and erosion.
Donated chemical probes for open science
Ackloo, Suzanne; Arrowsmith, Cheryl H; Bauser, Marcus; Baryza, Jeremy L; Blagg, Julian; Böttcher, Jark; Bountra, Chas; Brown, Peter J; Bunnage, Mark E; Carter, Adrian J; Damerell, David; Dötsch, Volker; Drewry, David H; Edwards, Aled M; Edwards, James; Elkins, Jon M; Fischer, Christian; Frye, Stephen V; Gollner, Andreas; Grimshaw, Charles E; IJzerman, Adriaan; Hanke, Thomas; Hartung, Ingo V; Hitchcock, Steve; Howe, Trevor; Hughes, Terry V; Laufer, Stefan; Li, Volkhart MJ; Liras, Spiros; Marsden, Brian D; Matsui, Hisanori; Mathias, John; O'Hagan, Ronan C; Owen, Dafydd R; Pande, Vineet; Rauh, Daniel; Rosenberg, Saul H; Roth, Bryan L; Schneider, Natalie S; Scholten, Cora; Singh Saikatendu, Kumar; Simeonov, Anton; Takizawa, Masayuki; Tse, Chris; Thompson, Paul R; Treiber, Daniel K; Viana, Amélia YI; Wells, Carrow I; Willson, Timothy M; Zuercher, William J; Knapp, Stefan
2018-01-01
Potent, selective and broadly characterized small molecule modulators of protein function (chemical probes) are powerful research reagents. The pharmaceutical industry has generated many high-quality chemical probes and several of these have been made available to academia. However, probe-associated data and control compounds, such as inactive structurally related molecules and their associated data, are generally not accessible. The lack of data and guidance makes it difficult for researchers to decide which chemical tools to choose. Several pharmaceutical companies (AbbVie, Bayer, Boehringer Ingelheim, Janssen, MSD, Pfizer, and Takeda) have therefore entered into a pre-competitive collaboration to make available a large number of innovative high-quality probes, including all probe-associated data, control compounds and recommendations on use (https://openscienceprobes.sgc-frankfurt.de/). Here we describe the chemical tools and target-related knowledge that have been made available, and encourage others to join the project.
Design and Analysis Tool for External-Compression Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.
2012-01-01
A computational tool named SUPIN has been developed to design and analyze external-compression supersonic inlets for aircraft at cruise speeds from Mach 1.6 to 2.0. The available inlet types include the axisymmetric outward-turning, two-dimensional single-duct, two-dimensional bifurcated-duct, and streamline-traced Busemann inlets. The aerodynamic performance is characterized by the flow rates, total pressure recovery, and drag. The inlet flowfield is divided into parts to provide a framework for the geometry and aerodynamic modeling, and the parts are defined in terms of geometric factors. The low-fidelity aerodynamic analysis and design methods are based on analytic, empirical, and numerical methods that provide for quick analysis. SUPIN produces inlet geometry in the form of coordinates and surface grids usable by grid generation methods for higher-fidelity computational fluid dynamics (CFD) analysis. SUPIN is demonstrated through a series of design studies, and CFD analyses were performed to verify some of the analysis results.
Graph theory for feature extraction and classification: a migraine pathology case study.
Jorge-Hernandez, Fernando; Garcia Chimeno, Yolanda; Garcia-Zapirain, Begonya; Cabrera Zubizarreta, Alberto; Gomez Beldarrain, Maria Angeles; Fernandez-Ruanova, Begonya
2014-01-01
Graph theory is widely used as a representational form and characterization of brain connectivity networks, as is machine learning for classifying groups based on the features extracted from images. Many such studies use different techniques for preprocessing, correlations, features, or algorithms. This paper proposes an automatic tool that performs a standard pipeline on magnetic resonance imaging (MRI) data. The process includes pre-processing, building a graph per subject using different correlations and atlases, extracting features deemed relevant in the literature, and finally applying a set of machine learning algorithms that produce results analyzable by physicians or specialists. To verify the process, a set of images from prescription drug abusers and patients with migraine was used. The proper functioning of the tool was thereby demonstrated, yielding success rates of 87% and 92%, depending on the classifier used.
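The pipeline of graph construction, feature extraction, and classification described in this record can be sketched with synthetic data standing in for MRI-derived connectivity matrices. All sizes, thresholds, group labels, and the nearest-centroid classifier below are illustrative assumptions, not the study's actual methods:

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_subj = 20, 30

def synth_matrix(strength):
    """Synthetic symmetric 'connectivity' matrix with a given coupling strength."""
    m = rng.normal(strength, 0.1, size=(n_nodes, n_nodes))
    m = (m + m.T) / 2
    np.fill_diagonal(m, 0.0)
    return m

def graph_features(m, thr=0.5):
    """Threshold the matrix into an adjacency graph; return node-degree statistics."""
    adj = (np.abs(m) > thr).astype(int)
    deg = adj.sum(axis=0)
    return np.array([deg.mean(), deg.std()])

# Two synthetic groups (e.g. patients vs. controls) with different coupling.
X = np.array([graph_features(synth_matrix(s))
              for s in [0.4] * n_subj + [0.6] * n_subj])
y = np.array([0] * n_subj + [1] * n_subj)

# Nearest-centroid classification on the graph features.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[:, None, :] - centroids) ** 2).sum(axis=2), axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.0%}")
```

In the study itself the matrices come from MRI correlations and the classifiers from standard machine learning libraries; the nearest-centroid step here is just the simplest possible stand-in.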
Aeroelastic Optimization Study Based on X-56A Model
NASA Technical Reports Server (NTRS)
Li, Wesley; Pak, Chan-Gi
2014-01-01
A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. Two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center were presented. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements including strength, buckling, and flutter. A hybrid and discretization optimization approach was implemented to improve accuracy and computational efficiency of a global optimization algorithm. The second study presents a flutter mass balancing optimization study. The results provide guidance to modify the fabricated flexible wing design and move the design flutter speeds back into the flight envelope so that the original objective of X-56A flight test can be accomplished.
[Biomechanical modeling of pelvic organ mobility: towards personalized medicine].
Cosson, Michel; Rubod, Chrystèle; Vallet, Alexandra; Witz, Jean-François; Brieu, Mathias
2011-11-01
Female pelvic mobility is crucial for urinary, bowel and sexual function and for vaginal delivery. This mobility is ensured by a complex organ suspension system composed of ligaments, fascia and muscles. Impaired pelvic mobility affects one in three women of all ages and can be incapacitating. Surgical management has a high failure rate, largely owing to poor knowledge of the organ support system, including the barely discernible ligamentous system. We propose a 3D digital model of the pelvic cavity based on MRI images and quantitative tools, designed to locate the pelvic ligaments. We thus obtain a coherent anatomical and functional model which can be used to analyze pelvic pathophysiology. This work represents a first step towards creating a tool for localizing and characterizing the source of pelvic imbalance. We examine possible future applications of this model, in terms of personalized therapy and prevention.
Kobayashi, Takehito; Yagi, Yusuke; Nakamura, Takahiro
2016-01-01
The pentatricopeptide repeat (PPR) motif is a sequence-specific RNA/DNA-binding module. Elucidation of the RNA/DNA recognition mechanism has enabled engineering of PPR motifs as new RNA/DNA manipulation tools in living cells, including for genome editing. However, the biochemical characteristics of PPR proteins remain poorly understood, mostly due to the instability and/or unfolding propensities of PPR proteins in heterologous expression systems such as bacteria and yeast. To overcome this issue, we constructed reporter systems using cultured animal cells. The cell-based system has highly attractive features for PPR engineering: robust eukaryotic gene expression; availability of various vectors, reagents, and antibodies; a highly efficient DNA delivery ratio (>80%); and rapid, high-throughput data production. In this chapter, we introduce an example of such a reporter system: a PPR-based sequence-specific translational activation system. The cell-based reporter system can be applied to characterize plant genes of interest and to PPR engineering.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Juenger, Thomas; Wolfrum, Ed
Our DOE funded project focused on characterizing natural variation in C4 perennial grasses including switchgrass (Panicum virgatum) and Hall's panicgrass (Panicum hallii). The main theme of our project was to better understand traits linked with plant performance and that impact the utility of plant biomass as a biofuel feedstock. In addition, our project developed tools and resources for studying genetic variation in Panicum hallii. Our project successfully screened both Panicum virgatum and Panicum hallii diverse natural collections for a host of phenotypes, developed genetic mapping populations for both species, completed genetic mapping for biofuel related traits, and helped in the development of genomic resources of Panicum hallii. Together, these studies have improved our understanding of the role of genetic and environmental factors in impacting plant performance. This information, along with new tools, will help foster the improvement of perennial grasses for feedstock applications.
Principles of Metamorphic Petrology
NASA Astrophysics Data System (ADS)
Williams, Michael L.
2009-05-01
The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.
The History of Electromagnetic Induction Techniques in Soil Survey
NASA Astrophysics Data System (ADS)
Brevik, Eric C.; Doolittle, Jim
2014-05-01
Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales.
Temporal Characterization of Aircraft Noise Sources
NASA Technical Reports Server (NTRS)
Grosveld, Ferdinand W.; Sullivan, Brenda M.; Rizzi, Stephen A.
2004-01-01
Current aircraft source noise prediction tools yield time-independent frequency spectra as functions of directivity angle. Realistic evaluation and human assessment of aircraft fly-over noise require the temporal characteristics of the noise signature. The purpose of the current study is to analyze empirical data from broadband jet and tonal fan noise sources and to provide the temporal information required for prediction-based synthesis. Noise sources included a one-tenth-scale engine exhaust nozzle and a one-fifth-scale turbofan engine. A methodology was developed to characterize the low frequency fluctuations employing the Short Time Fourier Transform in a MATLAB computing environment. It was shown that a trade-off is necessary between frequency and time resolution in the acoustic spectrogram. The procedure requires careful evaluation and selection of the data analysis parameters, including the data sampling frequency, Fourier Transform window size, associated time period and frequency resolution, and time period window overlap. Low frequency fluctuations were applied to the synthesis of broadband noise, with the resulting records sounding virtually indistinguishable from the measured data in initial subjective evaluations. Amplitude fluctuations of blade passage frequency (BPF) harmonics were successfully characterized for conditions equivalent to take-off and approach. The data demonstrated that the fifth harmonic of the BPF varied more in frequency than the BPF itself and exhibited larger amplitude fluctuations over the duration of the time record. Frequency fluctuations were found to be imperceptible in the current characterization of tonal components.
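The frequency/time resolution trade-off in the spectrogram can be illustrated with a short-time Fourier transform. The signal below is a synthetic stand-in for an amplitude-fluctuating fan tone, not the measured data, and the sampling rate and window sizes are illustrative choices:

```python
import numpy as np
from scipy.signal import stft

fs = 8000                          # sampling frequency (Hz)
t = np.arange(0, 1.0, 1 / fs)
# 1 kHz tone with a slow 5 Hz amplitude fluctuation (illustrative BPF stand-in).
x = (1 + 0.3 * np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * 1000 * t)

for nperseg in (256, 2048):        # short vs. long analysis window
    f, tt, Z = stft(x, fs=fs, nperseg=nperseg)
    df, dt = f[1] - f[0], tt[1] - tt[0]
    print(f"window={nperseg:4d}: frequency step {df:7.3f} Hz, time step {dt * 1e3:6.1f} ms")
```

Lengthening the window refines the frequency step but coarsens the time step by the same factor, which is precisely the trade-off the abstract describes.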
NASA Technical Reports Server (NTRS)
1985-01-01
The second task in the Space Station Data System (SSDS) Analysis/Architecture Study is the development of an information base that will support the conduct of trade studies and provide sufficient data to make key design/programmatic decisions. This volume identifies the preferred options in the technology category and characterizes these options with respect to performance attributes, constraints, cost, and risk. The technology category includes advanced materials, processes, and techniques that can be used to enhance the implementation of SSDS design structures. The specific areas discussed are mass storage, including space and ground on-line storage and off-line storage; man/machine interface; data processing hardware, including flight computers and advanced/fault-tolerant computer architectures; and software, including data compression algorithms, on-board high-level languages, and software tools. Also discussed are artificial intelligence applications and hard-wire communications.
Dong, Yang; He, Honghui; He, Chao; Zhou, Jialing; Zeng, Nan; Ma, Hui
2016-08-10
Silk fibers suffer from microstructural changes due to various external environmental conditions including daily washings. In this paper, we take the backscattering Mueller matrix images of silk samples for non-destructive and real-time quantitative characterization of the wavelength-scale microstructure and examination of the effects of washing by different detergents. The 2D images of the 16 Mueller matrix elements are reduced to the frequency distribution histograms (FDHs) whose central moments reveal the dominant structural features of the silk fibers. A group of new parameters are also proposed to characterize the wavelength-scale microstructural changes of the silk samples during the washing processes. Monte Carlo (MC) simulations are carried out to better understand how the Mueller matrix parameters are related to the wavelength-scale microstructure of silk fibers. The good agreement between experiments and simulations indicates that the Mueller matrix polarimetry and FDH based parameters can be used to quantitatively detect the wavelength-scale microstructural features of silk fibers. Mueller matrix polarimetry may be used as a powerful tool for non-destructive and in situ characterization of the wavelength-scale microstructures of silk based materials.
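The reduction of a Mueller matrix element image to a frequency distribution histogram (FDH) and its central moments can be sketched as follows. The synthetic image below stands in for a measured element image; its size, distribution, and binning are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in for one Mueller matrix element image, 256x256 pixels.
img = rng.normal(loc=0.6, scale=0.05, size=(256, 256))

# Frequency distribution histogram (FDH) of the pixel values.
counts, edges = np.histogram(img, bins=100, range=(0, 1))
centers = (edges[:-1] + edges[1:]) / 2
p = counts / counts.sum()                     # normalized FDH

# Central moments of the FDH: mean, variance, skewness, kurtosis.
mean = (p * centers).sum()
var = (p * (centers - mean) ** 2).sum()
skew = (p * (centers - mean) ** 3).sum() / var ** 1.5
kurt = (p * (centers - mean) ** 4).sum() / var ** 2
print(f"mean={mean:.3f} var={var:.5f} skew={skew:.2f} kurt={kurt:.2f}")
```

Moments of this kind are the sort of scalar summary parameters that could then be compared across samples, e.g. before and after washing treatments.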
Thermal characterization of TiCxOy thin films
NASA Astrophysics Data System (ADS)
Fernandes, A. C.; Vaz, F.; Gören, A.; Junge, K. H.; Gibkes, J.; Bein, B. K.; Macedo, F.
2008-01-01
Thermal wave characterization of thin films used in industrial applications can be a useful tool, not just for obtaining information on the films' thermal properties, but also for obtaining information on structural-physical parameters, e.g. crystalline structure and surface roughness, and on the film deposition conditions, since the thermal film properties are directly related to the structural-physical parameters and to the deposition conditions. Different sets of TiCxOy thin films, deposited by reactive magnetron sputtering on steel, were prepared, changing only one deposition parameter at a time. Here, the effect of the oxygen flow on the thermal film properties is studied. The thermal waves were measured by modulated IR radiometry, and the phase lag data were interpreted using an extremum method by which the thermal coating parameters are directly related to the values and modulation frequencies of the relative extrema of the inverse calibrated thermal wave phases. Structural/morphological characterization was done using X-ray diffraction (XRD) and atomic force microscopy (AFM). The characterization of the films also includes thickness, hardness, and electrical resistivity measurements. The results obtained so far indicate strong correlations between the thermal diffusivity and conductivity, on the one hand, and the oxygen flow, on the other.
The multilocus sequence typing network: mlst.net.
Aanensen, David M; Spratt, Brian G
2005-07-01
The unambiguous characterization of strains of a pathogen is crucial for addressing questions relating to its epidemiology, population and evolutionary biology. Multilocus sequence typing (MLST), which defines strains from the sequences at seven housekeeping loci, has become the method of choice for molecular typing of many bacterial and fungal pathogens (and non-pathogens), and MLST schemes and strain databases are available for a growing number of prokaryotic and eukaryotic organisms. Sequence data are ideal for strain characterization as they are unambiguous, meaning strains can readily be compared between laboratories via the Internet. Laboratories undertaking MLST can quickly progress from sequencing the seven gene fragments to characterizing their strains and relating them to those submitted by others and to the population as a whole. We provide the gateway to a number of MLST schemes, each of which contains a set of tools for the initial characterization of strains, and methods for relating query strains to other strains of the species, including clustering based on differences in allelic profiles, phylogenetic trees based on concatenated sequences, and a recently developed method (eBURST) for identifying clonal complexes within a species and displaying the overall structure of the population. This network of MLST websites is available at http://www.mlst.net.
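The allelic-profile clustering mentioned above can be illustrated with a small sketch: a clonal complex is approximated here by linking sequence types that are single-locus variants (SLVs) of one another. This is a simplified stand-in for eBURST, which additionally infers founders from SLV counts; the ST names and profiles below are made up.

```python
from itertools import combinations

def profile_distance(p, q):
    """Number of loci at which two 7-locus allelic profiles differ."""
    return sum(a != b for a, b in zip(p, q))

def single_locus_variant_groups(profiles):
    """Group sequence types into eBURST-style clonal complexes by linking
    profiles that differ at exactly one locus (union-find over SLV links)."""
    sts = list(profiles)
    parent = {st: st for st in sts}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for a, b in combinations(sts, 2):
        if profile_distance(profiles[a], profiles[b]) == 1:
            parent[find(a)] = find(b)
    groups = {}
    for st in sts:
        groups.setdefault(find(st), set()).add(st)
    return list(groups.values())

# Toy profiles: ST1/ST2 are single-locus variants; ST3 is unrelated
profiles = {
    "ST1": (1, 1, 1, 1, 1, 1, 1),
    "ST2": (1, 1, 1, 2, 1, 1, 1),
    "ST3": (4, 5, 6, 7, 8, 9, 2),
}
complexes = single_locus_variant_groups(profiles)
```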
NASA Astrophysics Data System (ADS)
Lu, Junpeng; Liu, Hongwei
2018-01-01
Accurately illustrating the photocarrier dynamics and photoelectrical properties of two-dimensional (2D) materials is crucial in the development of 2D material-based optoelectronic devices. Considering this requirement, terahertz (THz) spectroscopy has emerged as a fitting characterization tool to provide deep insights into the carrier dynamics and measurements of the electrical/photoelectrical conductivity of 2D materials. THz spectroscopic measurements provide information on the transient behavior of carriers with high accuracy in a nondestructive and noncontact manner. In this article, we present a comprehensive review of recent research efforts on investigations of 2D materials, specifically graphene and transition metal dichalcogenides (TMDs), using THz spectroscopy. A brief introduction to THz time-domain spectroscopy (THz-TDS) and optical pump-THz probe spectroscopy (OPTP) is provided. The characterization of the electron transport of graphene at equilibrium and its transient behavior at non-equilibrium is reviewed. We also review the characterization of TMDs including MoS2 and WSe2. Finally, we summarize recent reports and offer a perspective on how THz characterization can guide the design and optimization of 2D material-based optoelectronic devices.
Gilbert, Jack A; Dick, Gregory J; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R M; DeLong, Edward F
2014-06-15
The National Science Foundation's EarthCube End User Workshop was held at USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community that is focusing on microbial and physical oceanography research with a particular emphasis on 'omic research. The assembled researchers outlined the existing concerns regarding the vast data resources that are being generated, and how we will deal with these resources as their volume and diversity increases. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, as well as development of shared, interoperable, "big-data capable" analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyber infrastructure constraints, (ii) the current and future ocean 'omics science grand challenges and questions, and (iii) data management, analytical, and associated cyber-infrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting and the outcome of this report is a definition of the 'omics tools, technologies and infrastructures that facilitate continued advances in ocean science biology, marine biogeochemistry, and biological oceanography.
Graph-based optimization of epitope coverage for vaccine antigen design
Theiler, James Patrick; Korber, Bette Tina Marie
2017-01-29
Epigraph is a recently developed algorithm that enables the computationally efficient design of single or multi-antigen vaccines to maximize the potential epitope coverage for a diverse pathogen population. Potential epitopes are defined as short contiguous stretches of proteins, comparable in length to T-cell epitopes. This optimal coverage problem can be formulated in terms of a directed graph, with candidate antigens represented as paths that traverse this graph. Epigraph protein sequences can also be used as the basis for designing peptides for experimental evaluation of immune responses in natural infections to highly variable proteins. The epigraph tool suite also enables rapid characterization of populations of diverse sequences from an immunological perspective. Fundamental distance measures are based on immunologically relevant shared potential epitope frequencies, rather than simple Hamming or phylogenetic distances. Here, we provide a mathematical description of the epigraph algorithm, including a comparison of different heuristics that can be used when graphs are not acyclic, and describe an additional tool we have added to the web-based epigraph tool suite that provides frequency summaries of all distinct potential epitopes in a population. Lastly, we show examples of the graphical output and summary tables that can be generated using the epigraph tool suite and explain their content and applications.
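As a rough illustration of the graph formulation described above, the sketch below scores each potential epitope (k-mer) by its population frequency and greedily extends a path through the k-mer overlap graph. The published epigraph algorithm solves the coverage problem optimally on a directed acyclic graph; this greedy walk and the toy sequences are only illustrative.

```python
from collections import Counter

def greedy_epigraph(sequences, k=9):
    """Score each potential epitope (k-mer) by its frequency in the
    population, then greedily extend a path through the k-mer overlap
    graph (edges connect k-mers overlapping by k-1 residues) to favor
    high-coverage epitopes. Greedy stand-in for the optimal path search."""
    freq = Counter(kmer for s in sequences
                   for kmer in (s[i:i + k] for i in range(len(s) - k + 1)))
    # successors: k-mers indexed by their (k-1)-residue prefix
    starts = {}
    for kmer in freq:
        starts.setdefault(kmer[:k - 1], []).append(kmer)
    # begin at the most frequent k-mer appearing at any sequence start
    current = max((s[:k] for s in sequences), key=lambda m: freq[m])
    path = current
    seen = {current}
    while True:
        candidates = [m for m in starts.get(path[-(k - 1):], []) if m not in seen]
        if not candidates:
            break
        current = max(candidates, key=lambda m: freq[m])
        seen.add(current)
        path += current[-1]
    return path, freq

# Toy population: two identical sequences plus one single-residue variant
seqs = ["MKTAYIAKQR", "MKTAYIAKQR", "MKTAYLAKQR"]
antigen, freq = greedy_epigraph(seqs, k=4)
```

On this toy input the greedy walk recovers the majority sequence, since its potential epitopes are the most frequent in the population.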
Exposure Assessment Tools by Tiers and Types - Screening-Level and Refined
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Exposure Assessment Tools by Approaches - Direct Measurement (Point-of-Contact Measurement)
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases, mode
Exposure Assessment Tools by Tiers and Types - Deterministic and Probabilistic Assessments
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
To address this need, new tools have been created for characterizing, simulating, and evaluating chemical biokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissu...
Advancing Exposure Characterization for Chemical Evaluation and Risk Assessment
A new generation of scientific tools has emerged to rapidly measure signals from cells, tissues, and organisms following exposure to chemicals. High-visibility efforts to apply these tools for efficient toxicity testing raise important research questions in exposure science. As v...
Tool for the Reduction and Assessment of Chemical and other Environmental Impacts
TRACI, the Tool for the Reduction and Assessment of Chemical and other environmental Impacts, has been developed by the US Environmental Protection Agency’s National Risk Management Research Laboratory to facilitate the characterization of stressors that have potential effects, ...
EPA EcoBox Tools by Receptors - Endangered, Threatened or Other Species of Concern
Eco-Box is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
New Tools for Investigating Chemical and Product Use
- The timely characterization of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge - High throughput (HT) risk prioritization relies on hazard and exposure characterization - While advances have been made ...
Trinh, Cong T.; Wlaschin, Aaron; Srienc, Friedrich
2010-01-01
Elementary Mode Analysis is a useful Metabolic Pathway Analysis tool to identify the structure of a metabolic network that links the cellular phenotype to the corresponding genotype. The analysis can decompose the intricate metabolic network, comprised of highly interconnected reactions, into uniquely organized pathways. These pathways, each consisting of a minimal set of enzymes that can support steady-state operation of cellular metabolism, represent independent cellular physiological states. Such pathway definition provides a rigorous basis to systematically characterize cellular phenotypes, metabolic network regulation, robustness, and fragility, facilitating understanding of cell physiology and implementation of metabolic engineering strategies. This mini-review gives an overview of the development and application of elementary mode analysis as a metabolic pathway analysis tool in studying cell physiology and as a basis of metabolic engineering. PMID:19015845
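For a toy network, the pathway decomposition described above can be enumerated by brute force: an elementary flux mode is a support-minimal nonnegative flux vector in the null space of the stoichiometric matrix. The sketch below assumes all reactions are irreversible and is exponential in the number of reactions; production tools use the double description method instead.

```python
import numpy as np
from itertools import combinations

def elementary_modes(S, tol=1e-9):
    """Brute-force enumeration of elementary flux modes for a small
    network of irreversible reactions: support-minimal vectors v >= 0
    with S v = 0 (illustration only; exponential in reaction count)."""
    m, n = S.shape
    modes = []
    for size in range(1, n + 1):
        for support in combinations(range(n), size):
            if any(set(sup) < set(support) for sup, _ in modes):
                continue  # contains a smaller mode's support: not elementary
            sub = S[:, list(support)]
            _, sv, vt = np.linalg.svd(sub)
            null_dim = sum(s < tol for s in sv) + max(0, size - len(sv))
            if null_dim != 1:
                continue
            v = vt[-1]  # one-dimensional null space direction
            if np.all(v > tol) or np.all(v < -tol):
                flux = np.zeros(n)
                flux[list(support)] = np.abs(v)
                modes.append((support, flux / np.abs(v).max()))
    return [flux for _, flux in modes]

# Toy network:  ->A (R1), A->B (R2), A->C (R3), B-> (R4), C-> (R5)
S = np.array([[1, -1, -1,  0,  0],
              [0,  1,  0, -1,  0],
              [0,  0,  1,  0, -1]], dtype=float)
efms = elementary_modes(S)
```

The two modes found correspond to the two minimal routes through the network, A->B and A->C, each an independent physiological state in the sense used above.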
Highly stable multi-anchored magnetic nanoparticles for optical imaging within biofilms
Stone, R. C.; Fellows, B. D.; Qi, B.; ...
2015-08-05
Magnetic nanoparticles are an emerging tool for medical diagnosis and treatment in many different biomedical applications, including magnetic hyperthermia as an alternative treatment for cancer and bacterial infections, as well as the disruption of biofilms. The colloidal stability of the magnetic nanoparticles in a biological environment is crucial for efficient delivery. An easily modifiable surface can also improve the delivery and imaging properties of the magnetic nanoparticle by adding targeting and imaging moieties, providing a platform for additional modification. The strategy presented in this paper includes multiple nitroDOPA anchors for robust binding to the surface, tied to the same polymer backbone as multiple poly(ethylene oxide) chains for steric stability. This approach provides biocompatibility and enhanced stability in fetal bovine serum (FBS) and phosphate buffer saline (PBS). As a proof of concept, these polymer-particle complexes were then modified with a near infrared dye and utilized in characterizing the integration of magnetic nanoparticles in biofilms. Finally, the work presented in this manuscript describes the synthesis and characterization of a nontoxic platform for the labeling of near IR-dyes for bioimaging.
EMU Battery/module Service Tool Characterization Study
NASA Technical Reports Server (NTRS)
Palandati, C. F.
1984-01-01
The power tool which will be used to replace the attitude control system in the SMM spacecraft is being modified to operate from a self-contained battery. The extravehicular mobility unit (EMU) battery, a silver-zinc battery, was tested for the power tool application. The results obtained during testing show that the EMU battery is capable of operating the power tool within the pulse current range of 2.0 to 15.0 amperes and the battery temperature range of -10 to 40 degrees Celsius.
Yang, Xue-Dong; Tan, Hua-Wei; Zhu, Wei-Min
2016-01-01
Spinach (Spinacia oleracea L.), which originated in central and western Asia, belongs to the family Amaranthaceae. Spinach is one of the most important leafy vegetables, with a high nutritional value, as well as an ideal research material for plant sex chromosome models. With the completion of genome assembly and gene prediction for spinach, we developed SpinachDB (http://222.73.98.124/spinachdb) to store, annotate, mine and analyze genomics and genetics datasets efficiently. In this study, all 21,702 spinach genes were annotated. A total of 15,741 spinach genes were catalogued into 4,351 families, including identification of a substantial number of transcription factors. To construct a high-density genetic map, a total of 131,592 SSRs and 1,125,743 potential SNPs located in 548,801 loci of the spinach genome were identified in 11 cultivated and wild spinach cultivars. Expression profiles were also generated from RNA-seq data using the FPKM method, enabling comparison of gene expression levels. Paralogs in spinach and the orthologous genes in Arabidopsis, grape, sugar beet and rice were identified for comparative genome analysis. Finally, the SpinachDB website contains seven main sections, including the homepage; the GBrowse map that integrates genome, gene, SSR and SNP marker information; the Blast alignment service; the gene family classification search tool; the orthologous and paralogous gene pair search tool; and the download section with contact information. SpinachDB will be continually expanded to include newly generated robust genomics and genetics data sets along with the associated data mining and analysis tools.
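The FPKM normalization mentioned above divides fragment counts by transcript length (in kilobases) and by the total mapped reads (in millions). A minimal sketch, with hypothetical gene names, counts, and lengths:

```python
def fpkm(counts, gene_lengths_bp, total_mapped_reads):
    """Fragments Per Kilobase of transcript per Million mapped reads.
    counts: fragments mapped to each gene; gene_lengths_bp: transcript
    length in base pairs. Illustrative of the FPKM normalization only."""
    scale = total_mapped_reads / 1e6  # per-million scaling factor
    return {g: counts[g] / (gene_lengths_bp[g] / 1e3) / scale for g in counts}

# Hypothetical genes: equal counts, but GeneA's transcript is twice as long,
# so its normalized expression is half that of GeneB.
values = fpkm({"GeneA": 500, "GeneB": 500},
              {"GeneA": 2000, "GeneB": 1000},
              total_mapped_reads=10_000_000)
```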
2013-01-01
Animal models of disease states are valuable tools for developing new treatments and investigating underlying mechanisms. They should mimic the symptoms and pathology of the disease and importantly be predictive of effective treatments. Fibromyalgia is characterized by chronic widespread pain with associated co-morbid symptoms that include fatigue, depression, anxiety and sleep dysfunction. In this review, we present different animal models that mimic the signs and symptoms of fibromyalgia. These models are induced by a wide variety of methods that include repeated muscle insults, depletion of biogenic amines, and stress. All potential models produce widespread and long-lasting hyperalgesia without overt peripheral tissue damage and thus mimic the clinical presentation of fibromyalgia. We describe the methods for induction of the model, pathophysiological mechanisms for each model, and treatment profiles. PMID:24314231
Metabolic Network Modeling of Microbial Communities
Biggs, Matthew B.; Medlock, Gregory L.; Kolling, Glynis L.
2015-01-01
Genome-scale metabolic network reconstructions and constraint-based analysis are powerful methods that have the potential to make functional predictions about microbial communities. Current use of genome-scale metabolic networks to characterize the metabolic functions of microbial communities includes species compartmentalization, separating species-level and community-level objectives, dynamic analysis, the “enzyme-soup” approach, multi-scale modeling, and others. There are many challenges inherent to the field, including a need for tools that accurately assign high-level omics signals to individual community members, new automated reconstruction methods that rival manual curation, and novel algorithms for integrating omics data and engineering communities. As technologies and modeling frameworks improve, we expect that there will be proportional advances in the fields of ecology, health science, and microbial community engineering. PMID:26109480
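Constraint-based analysis of the kind reviewed above reduces, in its simplest form, to a linear program: maximize an objective flux subject to steady state and flux bounds. A minimal flux balance analysis sketch using SciPy; the three-reaction model is hypothetical, and real community models add species compartments and community-level objectives.

```python
import numpy as np
from scipy.optimize import linprog

def fba(S, lb, ub, objective):
    """Flux balance analysis: maximize an objective flux subject to the
    steady-state constraint S v = 0 and per-reaction flux bounds."""
    c = -np.asarray(objective, dtype=float)  # linprog minimizes, so negate
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=list(zip(lb, ub)), method="highs")
    return res.x, -res.fun

# Toy model: ->A (uptake, capped at 10), A->B, B-> (objective "biomass")
S = np.array([[1, -1,  0],
              [0,  1, -1]], dtype=float)
v, growth = fba(S, lb=[0, 0, 0], ub=[10, 100, 100], objective=[0, 0, 1])
```

At steady state all three fluxes must be equal, so the uptake cap of 10 limits the objective to 10.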
Tebani, Abdellah; Afonso, Carlos; Bekri, Soumeya
2018-05-01
This work reports the second part of a review intended to give the state of the art of major metabolic phenotyping strategies. It particularly deals with inherent advantages and limits regarding data analysis issues and biological information retrieval tools, along with translational challenges. This part starts by introducing the main data preprocessing strategies for the different metabolomics data. It then describes the main data analysis techniques, including univariate and multivariate aspects. It also addresses the challenges related to metabolite annotation and characterization. Finally, functional analysis, including pathway and network strategies, is discussed. The last section of this review is devoted to practical considerations and current challenges and pathways to bring metabolomics into clinical environments.
NASA Astrophysics Data System (ADS)
Bürmen, Miran; Usenik, Peter; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan
2011-03-01
Dental caries is a disease characterized by demineralization of enamel crystals leading to the penetration of bacteria into the dentin and pulp. If left untreated, the disease can lead to pain, infection and tooth loss. Early detection of enamel demineralization resulting in increased enamel porosity, commonly known as white spots, is a difficult diagnostic task. Several papers have reported near infrared (NIR) spectroscopy to be a potentially useful noninvasive technique for early detection of caries lesions. However, the conducted studies were mostly qualitative and did not include critical assessment of the spectral variability of the sound and carious dental tissues or the influence of water content. Such assessment is essential for development and validation of reliable qualitative and especially quantitative diagnostic tools based on NIR spectroscopy. In order to characterize the described spectral variability, a standardized diffuse reflectance hyper-spectral database was constructed by imaging 12 extracted human teeth with natural lesions of various degrees in the spectral range from 900 to 1700 nm with spectral resolution of 10 nm. Additionally, all the teeth were imaged by a digital color camera. The influence of water content on the acquired spectra was characterized by monitoring the teeth during the drying process. The images were assessed by an expert, thereby obtaining the gold standard. By analyzing the acquired spectra we were able to accurately model the spectral variability of the sound dental tissues and identify the advantages and limitations of NIR hyper-spectral imaging.
NASA Astrophysics Data System (ADS)
Houskeeper, H. F.; Kudela, R. M.
2016-12-01
Ocean color sensors have enabled daily, global monitoring of phytoplankton productivity in the world's oceans. However, to observe key structures such as food webs, or to identify regime shifts of dominant species, tools capable of distinguishing between phytoplankton functional types using satellite remote sensing reflectance are necessary. One such tool developed by Alvain et al. (2005), PHYSAT, successfully linked four phytoplankton functional types to chlorophyll-normalized remote sensing spectra, or radiance anomalies, in case-1 waters. Yet this tool was unable to characterize dinoflagellates because of their ubiquitous background presence in the open ocean. We employ a radiance anomaly technique based on PHYSAT to target phytoplankton functional types in Monterey Bay, a region where dinoflagellate populations are larger and more variable than in open ocean waters, and thus where they may be viable targets for satellite remote sensing characterization. We compare with an existing Santa Cruz Wharf photo-pigment time series spanning from 2006 to the present to regionally ground-truth the method's predictions, and we assess its accuracy in characterizing dinoflagellates, a phytoplankton group that impacts the region's fish stocks and water quality. For example, an increase in dinoflagellate abundance beginning in 2005 led to declines in commercially important fish stocks that persisted throughout the following year. Certain species of dinoflagellates in Monterey Bay are also responsible for some of the harmful algal bloom events that negatively impact the shellfish industry. Moving toward better tools to characterize phytoplankton blooms is important for understanding ecosystem shifts, as well as protecting human health in the surrounding areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.
VOSED: a tool for the characterization of developing planetary systems
NASA Astrophysics Data System (ADS)
Solano, E.; Gutiérrez, R.; Delgado, A.; Sarro, L. M.; Merín, B.
2007-08-01
The transition phase from optically thick disks around young pre-main sequence stars to optically thin debris disks around Vega-type stars is not well understood and plays an important role in the theory of planet formation. One of the most promising methods to characterize this process is the fitting of the observed SED with disk models. However, despite its potential, this technique is affected by two major problems if a non-VO methodology is used: on the one hand, building SEDs requires access to a variety of astronomical services which provide, in most cases, heterogeneous information. On the other hand, model fitting demands a tremendous amount of work and time, which makes it very inefficient even for a modest dataset. This is an important issue considering the large volume of data that missions like Spitzer are producing. In the framework of the Spanish Virtual Observatory (SVO) we have developed VOSED
NASA Astrophysics Data System (ADS)
Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian
2017-03-01
The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control mostly physico-chemical properties as a function of applications. Among these properties, we can list physical properties such as: size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure), solubility; and chemical properties such as: structural formula/molecular structure, composition (including degree of purity, known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential), hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration…) and the industrial application (semiconductor, cosmetics, chemistry, automotive…), a fleet of complementary characterization instruments must be used in synergy for accurate process tuning and high production yield. This synergy between instruments, so-called hybrid metrology, consists of using the strength of each technique in order to reduce the global uncertainty for better and faster process control. The only way to succeed in this exercise is to use data fusion methodology. In this paper, we will introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnology process control. The first part will be dedicated to process flow modeling related to a fleet of metrology tools. The second part will introduce the concept of an entity model, which describes the various parameters that have to be extracted. The entity model is fed with data analysis as a function of the application (automatic analysis or semi-automated analysis).
The final part will introduce two ways of doing data fusion on real data coming from imaging (SEM, TEM, AFM) and non-imaging techniques (SAXS). The first approach is dedicated to high level fusion, which is the art of combining various populations of results from homogeneous or heterogeneous tools, taking into account the precision and repeatability of each of them to obtain a new, more accurate result. The second approach is dedicated to deep level fusion, which is the art of combining raw data from various tools in order to create a new raw dataset. We will introduce a new concept of a virtual tool creator based on deep level fusion. As a conclusion we will discuss the implementation of hybrid metrology in a semiconductor environment for advanced process control.
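The high level fusion described above can be sketched as inverse-variance weighting: each tool's measurement contributes in proportion to its precision, and the fused uncertainty is smaller than any single tool's. The tool names and numbers below are hypothetical.

```python
def fuse(measurements):
    """Combine (value, 1-sigma uncertainty) pairs from several metrology
    tools by inverse-variance weighting; returns the fused value and its
    standard uncertainty. Sketch of high level fusion, not the platform's
    actual algorithm."""
    weights = [1.0 / (u ** 2) for _, u in measurements]
    fused = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return fused, sigma

# Hypothetical linewidth (nm) from CD-SEM, AFM, and SAXS measurements
fused, sigma = fuse([(25.4, 0.5), (25.0, 0.3), (25.2, 0.2)])
```

The fused uncertainty here is below the best individual tool's 0.2 nm, which is the sense in which combining instruments reduces the global uncertainty.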
EPA ExpoBox is a toolbox for exposure assessors. Its purpose is to provide a compendium of exposure assessment and risk characterization tools that will present comprehensive step-by-step guidance and links to relevant exposure assessment data bases
Technical Note: Estimation of Micro-Watershed Topographic Parameters Using Earth Observatory Tools
The study set out to analyze the feasibility of using Earth observatory tools to derive elevations to characterize topographic parameters of slope gradient and area useful in predicting erosion and for natural resources engineering education and instruction. Earth obseravtory too...
TRACI - THE TOOL FOR THE REDUCTION AND ASSESSMENT OF CHEMICAL AND OTHER ENVIRONMENTAL IMPACTS
TRACI, The Tool for the Reduction and Assessment of Chemical and other environmental Impacts, is described along with its history, the underlying research, methodologies, and insights within individual impact categories. TRACI facilitates the characterization of stressors that ma...
NASA Astrophysics Data System (ADS)
Rutigliani, Vito; Lorusso, Gian Francesco; De Simone, Danilo; Lazzarino, Frederic; Rispens, Gijsbert; Papavieros, George; Gogolides, Evangelos; Constantoudis, Vassilios; Mack, Chris A.
2018-03-01
Power spectral density (PSD) analysis is playing an increasingly critical role in the understanding of line-edge roughness (LER) and linewidth roughness (LWR) in a variety of applications across the industry. It is an essential step in obtaining an unbiased LWR estimate, as well as an extremely useful tool for process and material characterization. However, the PSD estimate can be affected by both random and systematic artifacts caused by image acquisition and measurement settings, which could irremediably alter its information content. In this paper, we report on the impact of various setting parameters (smoothing image processing filters, pixel size, and SEM noise levels) on the PSD estimate. We also discuss the use of the PSD analysis tool in a variety of cases. Looking beyond the basic roughness estimate, we use PSD and autocorrelation analysis to characterize resist blur[1], as well as low and high frequency roughness contents, and we apply this technique to guide the EUV material stack selection. Our results clearly indicate that, if properly used, PSD methodology is a very sensitive tool to investigate material and process variations.
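A minimal version of the PSD estimate discussed above: detrend the edge profile and take the periodogram via FFT. Real LER analysis averages many line scans and subtracts the SEM noise floor to obtain the unbiased roughness; the synthetic edge with a 10 nm roughness period below is only for illustration.

```python
import numpy as np

def edge_psd(edge_positions, pixel_size):
    """One-sided PSD estimate (periodogram) of a line-edge profile.
    edge_positions: edge displacement per scan line; pixel_size: sampling
    step along the line. Sketch only; no noise-floor subtraction."""
    x = np.asarray(edge_positions, float)
    x = x - x.mean()                               # remove the mean edge position
    n = len(x)
    psd = (np.abs(np.fft.rfft(x)) ** 2) * pixel_size / n
    freqs = np.fft.rfftfreq(n, d=pixel_size)
    return freqs, psd

# Synthetic edge: 10 nm period roughness sampled at 1 nm pixels, plus noise
rng = np.random.default_rng(1)
edge = 0.5 * np.sin(2 * np.pi * np.arange(256) / 10.0) + 0.05 * rng.normal(size=256)
freqs, psd = edge_psd(edge, pixel_size=1.0)
peak_freq = freqs[np.argmax(psd[1:]) + 1]          # skip the DC bin
```

The PSD peak lands at the injected 0.1 nm^-1 spatial frequency, illustrating how the frequency content separates roughness sources from the white SEM noise plateau.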
Sandra, Koen; Vandenheede, Isabel; Sandra, Pat
2014-03-28
Protein biopharmaceuticals such as monoclonal antibodies and therapeutic proteins are currently in widespread use for the treatment of various life-threatening diseases including cancer, autoimmune disorders, diabetes and anemia. The complexity of protein therapeutics far exceeds that of small-molecule drugs; hence, unraveling this complexity represents an analytical challenge. The current review provides the reader with state-of-the-art chromatographic and mass spectrometric tools available to dissect primary and higher order structures, post-translational modifications, purity and impurity profiles, and pharmacokinetic properties of protein therapeutics. Copyright © 2013 Elsevier B.V. All rights reserved.
Rappaz, Benjamin; Cano, Elena; Colomb, Tristan; Kühn, Jonas; Depeursinge, Christian; Simanis, Viesturs; Magistretti, Pierre J; Marquet, Pierre
2009-01-01
Digital holography microscopy (DHM) is an optical technique which provides phase images yielding quantitative information about cell structure and cellular dynamics. Furthermore, the quantitative phase images allow the derivation of other parameters, including dry mass production, density, and spatial distribution. We have applied DHM to study the dry mass production rate and the dry mass surface density in wild-type and mutant fission yeast cells. Our study demonstrates the applicability of DHM as a tool for label-free quantitative analysis of the cell cycle and opens the possibility for its use in high-throughput screening.
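The dry-mass derivation mentioned above follows from the proportionality between optical phase shift and dry mass surface density. A sketch with typical literature values; the wavelength and specific refractive increment are generic assumptions, not taken from this study.

```python
import numpy as np

def dry_mass_pg(phase_map_rad, pixel_area_um2, wavelength_um=0.683,
                alpha_um3_per_pg=0.18):
    """Dry mass (pg) from a quantitative phase image:
        m = (lambda / (2*pi*alpha)) * sum(phase) * pixel_area
    where alpha is the specific refractive increment (~0.18 um^3/pg, a
    typical value for protein). Illustrative sketch of the DHM-derived
    dry mass computation."""
    total_phase = np.sum(phase_map_rad)
    return wavelength_um * total_phase * pixel_area_um2 / (2 * np.pi * alpha_um3_per_pg)

# Hypothetical cell region: 100 pixels of 0.1 um^2, uniform 1 rad phase shift
phase = np.ones((10, 10))
m = dry_mass_pg(phase, pixel_area_um2=0.1)
```

Tracking m over time for a segmented cell gives the dry mass production rate, and dividing by the projected area gives the dry mass surface density studied in the paper.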