Sample records for sensitivity analysis capability

  1. Multidisciplinary Analysis and Optimal Design: As Easy as it Sounds?

    NASA Technical Reports Server (NTRS)

    Moore, Greg; Chainyk, Mike; Schiermeier, John

    2004-01-01

    The viewgraph presentation examines optimal design for precision, large aperture structures. Discussion focuses on aspects of design optimization, code architecture and current capabilities, and planned activities and collaborative area suggestions. The discussion of design optimization examines design sensitivity analysis; practical considerations; and new analytical environments including finite element-based capability for high-fidelity multidisciplinary analysis, design sensitivity, and optimization. The discussion of code architecture and current capabilities includes basic thermal and structural elements, nonlinear heat transfer solutions and process, and optical modes generation.

  2. Development of a structural optimization capability for the aeroelastic tailoring of composite rotor blades with straight and swept tips

    NASA Technical Reports Server (NTRS)

    Friedmann, P. P.; Venkatesan, C.; Yuan, K.

    1992-01-01

    This paper describes the development of a new structural optimization capability aimed at the aeroelastic tailoring of composite rotor blades with straight and swept tips. The primary objective is to reduce vibration levels in forward flight without diminishing the aeroelastic stability margins of the blade. In the course of this research activity a number of complicated tasks have been addressed: (1) development of a new aeroelastic stability and response analysis; (2) formulation of a new comprehensive sensitivity analysis, which facilitates the generation of the appropriate approximations for the objective and the constraints; (3) physical understanding of the new model and, in particular, determination of its potential for aeroelastic tailoring; and (4) combination of the newly developed analysis capability, the sensitivity derivatives, and the optimizer into a comprehensive optimization capability. The first three tasks have been completed and the fourth task is in progress.

  3. SCALE 6.2 Continuous-Energy TSUNAMI-3D Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M; Rearden, Bradley T

    2015-01-01

    The TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation) capabilities within the SCALE code system make use of sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different systems, quantifying computational biases, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved ease of use and fidelity and the desire to extend TSUNAMI analysis to advanced applications have motivated the development of a SCALE 6.2 module for calculating sensitivity coefficients using three-dimensional (3D) continuous-energy (CE) Monte Carlo methods: CE TSUNAMI-3D. This paper provides an overview of the theory, implementation, and capabilities of the CE TSUNAMI-3D sensitivity analysis methods. CE TSUNAMI contains two methods for calculating sensitivity coefficients in eigenvalue sensitivity applications: (1) the Iterated Fission Probability (IFP) method and (2) the Contributon-Linked eigenvalue sensitivity/Uncertainty estimation via Track length importance CHaracterization (CLUTCH) method. This work also presents the GEneralized Adjoint Response in Monte Carlo method (GEAR-MC), a first-of-its-kind approach for calculating adjoint-weighted, generalized response sensitivity coefficients, such as flux responses or reaction rate ratios, in CE Monte Carlo applications. The accuracy and efficiency of the CE TSUNAMI-3D eigenvalue sensitivity methods are assessed from a user perspective in a companion publication, and the accuracy and features of the CE TSUNAMI-3D GEAR-MC methods are detailed in this paper.

  4. Sensitivity analysis of a wing aeroelastic response

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.; Eldred, Lloyd B.; Barthelemy, Jean-Francois M.

    1991-01-01

    A variation of Sobieski's Global Sensitivity Equations (GSE) approach is implemented to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model. The formulation is quite general and accepts any aerodynamics and structural analysis capability. An interface code is written to convert one analysis's output to the other's input, and vice versa. Local sensitivity derivatives are calculated by either analytic methods or finite difference techniques. A program to combine the local sensitivities, such as the sensitivity of the stiffness matrix or the aerodynamic kernel matrix, into global sensitivity derivatives is developed. The aerodynamic analysis package FAST, using a lifting surface theory, and a structural package, ELAPS, implementing Giles' equivalent plate model are used.
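
    A minimal sketch of the GSE idea described above, for a hypothetical two-discipline coupling: local partial derivatives (placeholder values, not from the paper) are assembled into a linear system whose solution is the set of global (total) sensitivity derivatives.

    ```python
    import numpy as np

    # Sketch of Sobieski's Global Sensitivity Equations (GSE) for a
    # two-discipline coupled system (e.g., aerodynamics <-> structures).
    # All partial-derivative values are hypothetical placeholders.
    dA_dS = np.array([[0.3]])  # d(aero out)/d(struct out) at fixed x
    dS_dA = np.array([[0.5]])  # d(struct out)/d(aero out) at fixed x
    dA_dx = np.array([[1.2]])  # d(aero out)/dx
    dS_dx = np.array([[0.4]])  # d(struct out)/dx

    n = dA_dS.shape[0]
    I = np.eye(n)

    # GSE system:  [ I      -dA_dS ] [dA/dx]   [dA_dx]
    #              [-dS_dA   I     ] [dS/dx] = [dS_dx]
    lhs = np.block([[I, -dA_dS], [-dS_dA, I]])
    rhs = np.vstack([dA_dx, dS_dx])

    total = np.linalg.solve(lhs, rhs)  # global (total) sensitivity derivatives
    print("d(aero)/dx   =", total[:n].ravel())
    print("d(struct)/dx =", total[n:].ravel())
    ```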

  5. Monte Carlo capabilities of the SCALE code system

    DOE PAGES

    Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...

    2014-09-12

    SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.

  6. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility, using different grouping strategies for uncertainty components. The variance-based sensitivity analysis is thus able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process, the flow and reactive transport process). For test and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport modeling case with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty source formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and help decision-makers formulate policies and strategies.
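
    For illustration only (this is the underlying variance-based index, not the authors' Bayesian-network framework), a minimal pick-and-freeze Monte Carlo estimate of first-order Sobol' indices on a toy model standing in for a groundwater model:

    ```python
    import numpy as np

    # Pick-and-freeze estimate of first-order variance-based (Sobol')
    # sensitivity indices; the toy model is a hypothetical stand-in.
    rng = np.random.default_rng(0)

    def model(x):
        return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

    n, d = 100_000, 3
    A = rng.uniform(0.0, 1.0, (n, d))
    B = rng.uniform(0.0, 1.0, (n, d))
    yA, yB = model(A), model(B)

    for i in range(d):
        BAi = B.copy()
        BAi[:, i] = A[:, i]          # freeze factor i at sample A's values
        # First-order index S_i = Var(E[Y|X_i]) / Var(Y) (Saltelli estimator)
        S_i = np.mean(yA * (model(BAi) - yB)) / np.var(yA)
        print(f"S_{i} = {S_i:.3f}")
    ```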

  7. Evaluation of prediction capability, robustness, and sensitivity in non-linear landslide susceptibility models, Guantánamo, Cuba

    NASA Astrophysics Data System (ADS)

    Melchiorre, C.; Castellanos Abella, E. A.; van Westen, C. J.; Matteucci, M.

    2011-04-01

    This paper describes a procedure for landslide susceptibility assessment based on artificial neural networks, and focuses on the estimation of the prediction capability, robustness, and sensitivity of susceptibility models. The study is carried out in the Guantanamo Province of Cuba, where 186 landslides were mapped using photo-interpretation. Twelve conditioning factors were mapped, including geomorphology, geology, soils, land use, slope angle, slope direction, internal relief, drainage density, distance from roads and faults, rainfall intensity, and ground peak acceleration. A methodology was used that subdivided the database into 3 subsets. A training set was used for updating the weights, a validation set was used to stop the training procedure when the network started losing generalization capability, and a test set was used to calculate the performance of the network. A 10-fold cross-validation was performed in order to show that the results are repeatable. The prediction capability, the robustness analysis, and the sensitivity analysis were tested on 10 mutually exclusive datasets. The results show that by means of artificial neural networks it is possible to obtain models with high prediction capability and high robustness, and that an exploration of the effect of the individual variables is possible, even though the networks are considered a black-box model.

  8. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting the disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype in an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS) capable operating system environments.

  9. SVDS plume impingement modeling development. Sensitivity analysis supporting level B requirements

    NASA Technical Reports Server (NTRS)

    Chiu, P. B.; Pearson, D. J.; Muhm, P. M.; Schoonmaker, P. B.; Radar, R. J.

    1977-01-01

    A series of sensitivity analyses (trade studies) performed to select features and capabilities to be implemented in the plume impingement model is described. Sensitivity analyses were performed in study areas pertaining to geometry, flowfield, impingement, and dynamical effects. Recommendations based on these analyses are summarized.

  10. Experiences on p-Version Time-Discontinuous Galerkin's Method for Nonlinear Heat Transfer Analysis and Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    2004-01-01

    The focus of this research is on the development of analysis and sensitivity analysis equations for nonlinear, transient heat transfer problems modeled by a p-version, time-discontinuous finite element approximation. The resulting matrix equation of the state equation is simply of the form A(x)x = c, representing a single-step, time-marching scheme. The Newton-Raphson method is used to solve the nonlinear equation. Examples are first provided to demonstrate the accuracy characteristics of the resultant finite element approximation. A direct differentiation approach is then used to compute the thermal sensitivities of a nonlinear heat transfer problem. The report shows that only minimal coding effort is required to enhance the analysis code with the sensitivity analysis capability.
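
    A minimal scalar sketch of the scheme just described, under a purely illustrative conductivity model a(x; p) = k0 + p*x: Newton-Raphson solves R(x) = A(x)x - c = 0, and the converged Jacobian is reused for a direct-differentiation sensitivity dx/dp.

    ```python
    # Newton-Raphson solve of a nonlinear state equation A(x) x = c,
    # followed by direct differentiation for the sensitivity dx/dp.
    # The model a(x; p) = k0 + p * x is a hypothetical stand-in.
    k0, p, c = 2.0, 0.3, 5.0

    def residual(x):           # R(x) = A(x) x - c
        return (k0 + p * x) * x - c

    def jacobian(x):           # dR/dx = k0 + 2 p x
        return k0 + 2.0 * p * x

    x = 1.0                    # initial guess
    for _ in range(20):        # Newton-Raphson iteration
        dx = -residual(x) / jacobian(x)
        x += dx
        if abs(dx) < 1e-12:
            break

    # Direct differentiation: (dR/dx) dx/dp = -dR/dp, with dR/dp = x**2
    dxdp = -(x ** 2) / jacobian(x)
    print(f"state x = {x:.6f}, sensitivity dx/dp = {dxdp:.6f}")
    ```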

  11. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
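
    A minimal sketch of one of the global methods named above, the partial rank correlation coefficient (PRCC), computed on a toy model standing in for an SBML reaction network (this is a generic implementation, not SBML-SAT's code):

    ```python
    import numpy as np
    from scipy.stats import rankdata, pearsonr

    # PRCC: partial correlation on rank-transformed inputs and output.
    rng = np.random.default_rng(1)
    n, d = 2000, 3
    X = rng.uniform(0.5, 2.0, (n, d))               # sampled parameters
    y = X[:, 0] / (1.0 + X[:, 1]) + 0.1 * X[:, 2]   # toy model output

    R = np.column_stack([rankdata(col) for col in X.T])
    ry = rankdata(y)

    def residuals(target, others):
        # Residuals of a least-squares fit of `target` on `others` + intercept.
        A = np.column_stack([others, np.ones(len(target))])
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        return target - A @ coef

    for i in range(d):
        others = np.delete(R, i, axis=1)
        prcc, _ = pearsonr(residuals(R[:, i], others), residuals(ry, others))
        print(f"PRCC(parameter {i}) = {prcc:+.3f}")
    ```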

  12. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.

  13. SPR based hybrid electro-optic biosensor for β-lactam antibiotics determination in water

    NASA Astrophysics Data System (ADS)

    Galatus, Ramona; Feier, Bogdan; Cristea, Cecilia; Cennamo, Nunzio; Zeni, Luigi

    2017-09-01

    The present work aims to provide a hybrid platform capable of complementary and sensitive detection of β-lactam antibiotics, ampicillin in particular. The use of an aptamer specific to ampicillin assures good selectivity and sensitivity for the detection of ampicillin from different matrices. This new approach is intended for a portable, remote sensing platform based on a low-cost, small-size, and low-power-consumption solution. The simple experimental hybrid platform integrates the results from the D-shape surface plasmon resonance plastic optical fiber (SPR-POF) and from the electrochemical (bio)sensor for the analysis of ampicillin, delivering sensitive and reliable results. The SPR-POF, already used in many previous applications, is embedded in a new experimental setup with fluorescent fiber emitters, for broadband wavelength analysis, low power consumption, and low heating of the sensing platform.

  14. GLSENS: A Generalized Extension of LSENS Including Global Reactions and Added Sensitivity Analysis for the Perfectly Stirred Reactor

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1996-01-01

    A generalized version of the NASA Lewis general kinetics code, LSENS, is described. The new code allows the use of global reactions as well as molecular processes in a chemical mechanism. The code also incorporates the capability of performing sensitivity analysis calculations for a perfectly stirred reactor rapidly and conveniently at the same time that the main kinetics calculations are being done. The GLSENS code has been extensively tested and has been found to be accurate and efficient. Nine example problems are presented and complete user instructions are given for the new capabilities. This report is to be used in conjunction with the documentation for the original LSENS code.

  15. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    NASA Technical Reports Server (NTRS)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
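
    A minimal sketch of the variance-reduction idea, assuming a toy response function and Gaussian inputs (none of this comes from the paper): a first-order Taylor expansion built from sensitivity derivatives has a known mean, so it can serve as a control variate for the Monte Carlo estimate.

    ```python
    import numpy as np

    # Control-variate Monte Carlo using cheap sensitivity derivatives.
    rng = np.random.default_rng(2)

    def f(x):                       # "expensive" response (toy stand-in)
        return np.sin(x[:, 0]) + x[:, 1] ** 2

    mu = np.array([0.5, 1.0])       # mean of the input distribution
    sigma = np.array([0.1, 0.2])
    grad = np.array([np.cos(mu[0]), 2.0 * mu[1]])  # sensitivity derivatives at mu

    n = 10_000
    X = mu + sigma * rng.standard_normal((n, 2))

    f_mu = f(mu[None, :])[0]
    plain = f(X)
    taylor = f_mu + (X - mu) @ grad        # linear control variate
    cv = plain - taylor + f_mu             # E[taylor] = f(mu) since E[X - mu] = 0

    print("plain MC:", plain.mean(), "+/-", plain.std(ddof=1) / np.sqrt(n))
    print("with CV :", cv.mean(), "+/-", cv.std(ddof=1) / np.sqrt(n))
    ```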

  16. Rapid solution of large-scale systems of equations

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.

    1994-01-01

    The analysis and design of complex aerospace structures requires the rapid solution of large systems of linear and nonlinear equations, eigenvalue extraction for buckling, vibration and flutter modes, structural optimization, and design sensitivity calculation. Computers with multiple processors and vector capabilities can offer substantial computational advantages over traditional scalar computers for these analyses. These computers fall into two categories: shared memory computers and distributed memory computers. This presentation covers general-purpose, highly efficient algorithms for generation/assembly of element matrices, solution of systems of linear and nonlinear equations, eigenvalue and design sensitivity analysis, and optimization. All algorithms are coded in FORTRAN for shared memory computers and many are adapted to distributed memory computers. The capability and numerical performance of these algorithms will be addressed.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henning, C.

    This report contains papers on the following topics: conceptual design; radiation damage of ITER magnet systems; insulation system of the magnets; critical current density and strain sensitivity; toroidal field coil structural analysis; stress analysis for the ITER central solenoid; and volt-second capabilities and PF magnet configurations.

  18. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  19. Adjoint-Based Sensitivity and Uncertainty Analysis for Density and Composition: A User’s Guide

    DOE PAGES

    Favorite, Jeffrey A.; Perko, Zoltan; Kiedrowski, Brian C.; ...

    2017-03-01

    The ability to perform sensitivity analyses using adjoint-based first-order sensitivity theory has existed for decades. This paper provides guidance on how adjoint sensitivity methods can be used to predict the effect of material density and composition uncertainties in critical experiments, including when these uncertain parameters are correlated or constrained. Two widely used Monte Carlo codes, MCNP6 (Ref. 2) and SCALE 6.2 (Ref. 3), are both capable of computing isotopic density sensitivities in continuous energy and angle. Additionally, Perkó et al. have shown how individual isotope density sensitivities, easily computed using adjoint methods, can be combined to compute constrained first-order sensitivities that may be used in the uncertainty analysis. This paper provides details on how the codes are used to compute first-order sensitivities and how the sensitivities are used in an uncertainty analysis. Constrained first-order sensitivities are computed in a simple example problem.

  20. [Meta-analysis of diagnostic capability of frequency-doubling technology (FDT) for primary glaucoma].

    PubMed

    Liu, Ting; He, Xiang-ge

    2006-05-01

    To evaluate the overall diagnostic capabilities of frequency-doubling technology (FDT) in patients with primary glaucoma, with standard automated perimetry (SAP) and/or optic disc appearance as the gold standard. A comprehensive electronic search in MEDLINE, EMBASE, Cochrane Library, BIOSIS, Previews, HMIC, IPA, OVID, CNKI, CBMdisc, VIP information, CMCC, CCPD, SSreader and 21dmedia and a manual search in related textbooks, journals, congress articles and their references were performed to identify relevant English- and Chinese-language articles. Criteria for adaptability were established according to validity criteria for diagnostic research published by the Cochrane Methods Group on Screening and Diagnostic Tests. The quality of the included articles was assessed and relevant materials were extracted for study. Statistical analysis was performed with Meta Test version 0.6 software. Heterogeneity of the included articles was tested and used to select the appropriate effect model to calculate pooled weighted sensitivity and specificity. A Summary Receiver Operating Characteristic (SROC) curve was established and the area under the curve (AUC) was calculated. Finally, sensitivity analysis was performed. Fifteen English articles (21 studies) of 206 retrieved articles were included in the present study, with a total of 3172 patients. The reported sensitivity of FDT ranged from 0.51 to 1.00, and specificity from 0.58 to 1.00. The pooled weighted sensitivity and specificity for FDT with 95% confidence intervals (95% CI) after correction for standard error were 0.86 (0.80-0.90) and 0.87 (0.81-0.91), respectively. The AUC of the SROC was 93.01%. Sensitivity analysis demonstrated no disproportionate influence of any individual study. The included articles are of good quality, and FDT can be a highly efficient diagnostic test for primary glaucoma based on this meta-analysis. However, a high-quality prospective study is still required for further analysis.
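
    For illustration only, a minimal sketch of fixed-effect pooling of diagnostic sensitivities on the logit scale with inverse-variance weights; the two study counts below are hypothetical, not taken from this meta-analysis.

    ```python
    import numpy as np

    # Fixed-effect pooling of study sensitivities on the logit scale.
    tp = np.array([80, 45])          # true positives per study (hypothetical)
    fn = np.array([12, 9])           # false negatives per study (hypothetical)

    sens = tp / (tp + fn)
    logit = np.log(sens / (1.0 - sens))
    var = 1.0 / tp + 1.0 / fn        # delta-method variance of a logit proportion
    w = 1.0 / var

    pooled = np.sum(w * logit) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se

    expit = lambda z: 1.0 / (1.0 + np.exp(-z))
    print(f"pooled sensitivity {expit(pooled):.3f} "
          f"(95% CI {expit(lo):.3f}-{expit(hi):.3f})")
    ```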

  1. Single-molecule detection: applications to ultrasensitive biochemical analysis

    NASA Astrophysics Data System (ADS)

    Castro, Alonso; Shera, E. Brooks

    1995-06-01

    Recent developments in laser-based detection of fluorescent molecules have made possible the implementation of very sensitive techniques for biochemical analysis. We present and discuss our experiments on the applications of our recently developed technique of single-molecule detection to the analysis of molecules of biological interest. These newly developed methods are capable of detecting and identifying biomolecules at the single-molecule level of sensitivity. In one case, identification is based on measuring fluorescence brightness from single molecules. In another, molecules are classified by determining their electrophoretic velocities.

  2. Pressure sensitivity analysis of fiber Bragg grating sensors

    NASA Astrophysics Data System (ADS)

    Mrad, Nezih; Sridharan, Vasant; Kazemi, Alex

    2014-09-01

    Recent development in fiber optic sensing technology has mainly focused on discrete sensing, particularly sensing systems with potential multiplexing and multi-parameter capabilities. Bragg grating fiber optic sensors have emerged as the undisputed champion for multiplexing and simultaneous multi-parameter sensing for emerging high-value structural components, advanced processing and manufacturing capabilities, and increased critical infrastructure resilience applications. Although the number of potential applications for this sensing technology is large and spans the domains of medicine, manufacturing, aerospace, and public safety, critical issues such as fatigue life, sensitivity, accuracy, embeddability, material/sensor interface integrity, and universal demodulation systems still need to be addressed. The purpose of this paper is primarily to evaluate the sensitivity to pressure, often neglected in several applications, of Commercial-Off-The-Shelf (COTS) Fiber Bragg Grating (FBG) sensors. The COTS fiber sensitivity to pressure is further evaluated for two types of coatings (Polyimide and Acrylate) and different arrangements (arrayed and single).

  3. Computational simulation and aerodynamic sensitivity analysis of film-cooled turbines

    NASA Astrophysics Data System (ADS)

    Massa, Luca

    A computational tool is developed for the time-accurate sensitivity analysis of the stage performance of hot-gas, unsteady turbine components. An existing turbomachinery internal flow solver is adapted to the high-temperature environment typical of the hot section of jet engines. A real gas model and film cooling capabilities are successfully incorporated in the software. The modifications to the existing algorithm are described; both the theoretical model and the numerical implementation are validated. The accuracy of the code in evaluating turbine stage performance is tested using a turbine geometry typical of the last stage of aeronautical jet engines. The results of the performance analysis show that the predictions differ from the experimental data by less than 3%. A reliable grid generator, applicable to the domain discretization of the internal flow field of axial flow turbines, is developed. A sensitivity analysis capability is added to the flow solver, rendering it able to accurately evaluate the derivatives of the time-varying output functions. The complex Taylor's series expansion (CTSE) technique is reviewed. Two variants of it are used to demonstrate the accuracy and time dependency of the differentiation process. The results are compared with finite difference (FD) approximations. The CTSE is more accurate than the FD, but less efficient. A "black box" differentiation of the source code, resulting from the automated application of the CTSE, generates high-fidelity sensitivity algorithms, but with low computational efficiency and high memory requirements. New formulations of the CTSE are proposed and applied. Selective differentiation of the method for solving the non-linear implicit residual equation leads to sensitivity algorithms with the same accuracy but improved run time. The time-dependent sensitivity derivatives are computed in run times comparable to those required by the FD approach.
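
    A minimal sketch of the CTSE (complex-step) derivative referenced above, using the classic test function from the complex-step literature rather than anything from this thesis: f'(x) is approximated by Im(f(x + ih))/h, which avoids the subtractive cancellation of finite differences.

    ```python
    import numpy as np

    # Complex-step differentiation vs. a central finite difference.
    def f(x):                        # analytic test function (Squire & Trapp)
        return np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)

    x0, h = 1.5, 1e-200              # step can be tiny: no cancellation error
    cstep = np.imag(f(x0 + 1j * h)) / h
    fdiff = (f(x0 + 1e-6) - f(x0 - 1e-6)) / 2e-6

    print(f"complex step: {cstep:.15f}")
    print(f"central FD  : {fdiff:.15f}")
    ```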

  4. Identifying significant environmental features using feature recognition.

    DOT National Transportation Integrated Search

    2015-10-01

    The Department of Environmental Analysis at the Kentucky Transportation Cabinet has expressed an interest in feature-recognition capability because it may help analysts identify environmentally sensitive features in the landscape, : including those r...

  5. Sensitive Amino Acid Composition and Chirality Analysis with the Mars Organic Analyzer (MOA)

    NASA Technical Reports Server (NTRS)

    Skelley, Alison M.; Scherer, James R.; Aubrey, Andrew D.; Grover, William H.; Ivester, Robin H. C.; Ehrenfreund, Pascale; Grunthaner, Frank J.; Bada, Jeffrey L.; Mathies, Richard A.

    2005-01-01

    Detection of life on Mars requires definition of a suitable biomarker and development of sensitive yet compact instrumentation capable of performing in situ analyses. Our studies are focused on amino acid analysis because amino acids are more resistant to decomposition than other biomolecules, and because amino acid chirality is a well-defined biomarker. Amino acid composition and chirality analysis has been previously demonstrated in the lab using microfabricated capillary electrophoresis (CE) chips. To analyze amino acids in the field, we have developed the Mars Organic Analyzer (MOA), a portable analysis system that consists of a compact instrument and a novel multi-layer CE microchip.

  6. Comparative assessment of amphibious hearing in pinnipeds.

    PubMed

    Reichmuth, Colleen; Holt, Marla M; Mulsow, Jason; Sills, Jillian M; Southall, Brandon L

    2013-06-01

    Auditory sensitivity in pinnipeds is influenced by the need to balance efficient sound detection in two vastly different physical environments. Previous comparisons between aerial and underwater hearing capabilities have considered media-dependent differences relative to auditory anatomy, acoustic communication, ecology, and amphibious life history. New data for several species, including recently published audiograms and previously unreported measurements obtained in quiet conditions, necessitate a re-evaluation of amphibious hearing in pinnipeds. Several findings related to underwater hearing are consistent with earlier assessments, including an expanded frequency range of best hearing in true seals that spans at least six octaves. The most notable new results indicate markedly better aerial sensitivity in two seals (Phoca vitulina and Mirounga angustirostris) and one sea lion (Zalophus californianus), likely attributable to improved ambient noise control in test enclosures. An updated comparative analysis alters conventional views and demonstrates that these amphibious pinnipeds have not necessarily sacrificed aerial hearing capabilities in favor of enhanced underwater sound reception. Despite possessing underwater hearing that is nearly as sensitive as fully aquatic cetaceans and sirenians, many seals and sea lions have retained acute aerial hearing capabilities rivaling those of terrestrial carnivores.

  7. Shape optimization of three-dimensional stamped and solid automotive components

    NASA Technical Reports Server (NTRS)

    Botkin, M. E.; Yang, R.-J.; Bennett, J. A.

    1987-01-01

    The shape optimization of realistic, 3-D automotive components is discussed. The integration of the major parts of the total process (modeling, mesh generation, finite element and sensitivity analysis, and optimization) is stressed. Stamped components and solid components are treated separately. For stamped parts a highly automated capability was developed. The problem description is based upon a parameterized boundary design element concept for the definition of the geometry. Automatic triangulation and adaptive mesh refinement are used to provide an automated analysis capability which requires only boundary data and takes into account the sensitivity of the solution accuracy to boundary shape. For solid components a general extension of the 2-D boundary design element concept has not been achieved. In this case, the parameterized surface shape is provided using a generic modeling concept based upon isoparametric mapping patches which also serves as the mesh generator. Emphasis is placed upon the coupling of optimization with a commercially available finite element program. To do this it is necessary to modularize the program architecture and obtain shape design sensitivities using the material derivative approach so that only boundary solution data is needed.

  8. High order statistical signatures from source-driven measurements of subcritical fissile systems

    NASA Astrophysics Data System (ADS)

    Mattingly, John Kelly

    1998-11-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements.
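
    A minimal sketch of the lowest-order example of such counting statistics, the Feynman-Y (variance-to-mean) statistic on time-gated detector counts; the pure Poisson event stream below is a hypothetical stand-in, and for a multiplying fissile system correlated fission chains drive Y above zero.

    ```python
    import numpy as np

    # Feynman-Y statistic on counts binned into fixed time gates.
    rng = np.random.default_rng(3)

    t_total, rate, gate = 100.0, 1e4, 1e-3      # seconds, counts/s, gate width
    n_events = rng.poisson(rate * t_total)
    events = rng.uniform(0.0, t_total, n_events)

    edges = np.arange(0.0, t_total + gate, gate)
    counts, _ = np.histogram(events, bins=edges)

    Y = counts.var(ddof=1) / counts.mean() - 1.0  # ~0 for a pure Poisson source
    print(f"mean counts/gate = {counts.mean():.2f}, Feynman-Y = {Y:+.4f}")
    ```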

  9. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron - with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photo-electron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger electron spectroscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.

  10. Initial Results: An Ultra-Low-Background Germanium Crystal Array

    DTIC Science & Technology

    2010-09-01

    data (focused on γ-γ coincidence signatures) (Smith et al., 2004) and the Multi-Isotope Coincidence Analysis code (MICA) (Warren et al., 2006). ... The follow-on "CASCADES" project aims to develop a multicoincidence data-analysis package and make robust fission-product demonstration measurements ... sensitivity. This effort is focused on improving gamma analysis capabilities for nuclear detonation detection (NDD) applications, e.g., nuclear treaty ...

  11. An error analysis of the recovery capability of the relative sea-surface profile over the Puerto Rican trench from multi-station and ship tracking of GEOS-2

    NASA Technical Reports Server (NTRS)

    Stanley, H. R.; Martin, C. F.; Roy, N. A.; Vetter, J. R.

    1971-01-01

    Error analyses were performed to examine the height error in a relative sea-surface profile as determined by a combination of land-based multistation C-band radars and optical lasers and one ship-based radar tracking the GEOS 2 satellite. It was shown that two relative profiles can be obtained: one using available south-to-north passes of the satellite and one using available north-to-south type passes. An analysis of multi-station tracking capability determined that only Antigua and Grand Turk radars are required to provide satisfactory orbits for south-to-north type satellite passes, while a combination of Merritt Island, Bermuda, and Wallops radars provide secondary orbits for north-to-south passes. Analysis of ship tracking capabilities shows that high elevation single pass range-only solutions are necessary to give only moderate sensitivity to systematic error effects.

  12. Characterizing Wheel-Soil Interaction Loads Using Meshfree Finite Element Methods: A Sensitivity Analysis for Design Trade Studies

    NASA Technical Reports Server (NTRS)

    Contreras, Michael T.; Trease, Brian P.; Bojanowski, Cezary; Kulakx, Ronald F.

    2013-01-01

    A wheel experiencing sinkage and slippage events poses a high risk to planetary rover missions, as evidenced by the mobility challenges endured by the Mars Exploration Rover (MER) project. Current wheel design practice utilizes loads derived from a series of events in the life cycle of the rover which do not include (1) failure metrics related to wheel sinkage and slippage and (2) performance trade-offs based on grouser placement/orientation. Wheel designs are rigorously tested experimentally through a variety of drive scenarios and simulated soil environments; however, a robust simulation capability is still in development due to the myriad of complex interaction phenomena that contribute to wheel sinkage and slippage conditions, such as soil composition, large-deformation soil behavior, wheel geometry, nonlinear contact forces, terrain irregularity, etc. For the purposes of modeling wheel sinkage and slippage at an engineering scale, meshfree finite element approaches enable simulations that capture sufficient detail of wheel-soil interaction while remaining computationally feasible. This study implements the JPL wheel-soil benchmark problem in a commercial code environment utilizing the large-deformation modeling capability of Smooth Particle Hydrodynamics (SPH) meshfree methods. The nominal, benchmark wheel-soil interaction model that produces numerically stable and physically realistic results is presented, and simulations are shown for both wheel traverse and wheel sinkage cases. A sensitivity analysis developing the capability and framework for future flight applications is conducted to illustrate the importance of perturbations to critical material properties and parameters. Implementation of the proposed soil-wheel interaction simulation capability and associated sensitivity framework has the potential to reduce experimentation cost and improve the early-stage wheel design process.

  13. Stochastic sensitivity measure for mistuned high-performance turbines

    NASA Technical Reports Server (NTRS)

    Murthy, Durbha V.; Pierre, Christophe

    1992-01-01

    A stochastic measure of sensitivity is developed in order to predict the effects of small random blade mistuning on the dynamic aeroelastic response of turbomachinery blade assemblies. This sensitivity measure is based solely on the nominal system design (i.e., on tuned system information), which makes it extremely easy and inexpensive to calculate. The measure has the potential to become a valuable design tool that will enable designers to evaluate mistuning effects at a preliminary design stage and thus assess the need for a full mistuned rotor analysis. The predictive capability of the sensitivity measure is illustrated by examining the effects of mistuning on the aeroelastic modes of the first stage of the oxidizer turbopump in the Space Shuttle Main Engine. Results from a full analysis of mistuned systems confirm that the simple stochastic sensitivity measure consistently predicts the drastic changes due to mistuning and the localization of aeroelastic vibration to a few blades.

  14. Addressing Curse of Dimensionality in Sensitivity Analysis: How Can We Handle High-Dimensional Problems?

    NASA Astrophysics Data System (ADS)

    Safaei, S.; Haghnegahdar, A.; Razavi, S.

    2016-12-01

    Complex environmental models are now the primary tool to inform decision makers for the current or future management of environmental resources under climate and environmental changes. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in saving computational burden when applied to systems with an extra-large number of input factors (~100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important vs. unimportant input factors.
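
    A minimal sketch of the idea behind VARS (a simplified directional variogram, not the full VARS framework): gamma_i(h) = 0.5 * E[(y(x + h e_i) - y(x))^2] per input factor, where a steeper variogram flags a more sensitive factor; the toy function is hypothetical.

    ```python
    import numpy as np

    # Directional variograms of a response surface, one per input factor.
    rng = np.random.default_rng(4)

    def model(x):
        return np.sin(2 * np.pi * x[:, 0]) + 0.2 * x[:, 1] + 0.01 * x[:, 2]

    d, n, h = 3, 5000, 0.1
    X = rng.uniform(0.0, 1.0 - h, (n, d))
    y0 = model(X)

    for i in range(d):
        Xh = X.copy()
        Xh[:, i] += h                                  # perturb factor i by lag h
        gamma = 0.5 * np.mean((model(Xh) - y0) ** 2)   # directional variogram
        print(f"gamma_{i}(h={h}) = {gamma:.4f}")
    ```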

  15. New features and improved uncertainty analysis in the NEA nuclear data sensitivity tool (NDaST)

    NASA Astrophysics Data System (ADS)

    Dyrda, J.; Soppera, N.; Hill, I.; Bossant, M.; Gulliford, J.

    2017-09-01

    Following the release and initial testing period of the NEA's Nuclear Data Sensitivity Tool [1], new features have been designed and implemented in order to expand its uncertainty analysis capabilities. The aim is to provide a free online tool for integral benchmark testing, that is both efficient and comprehensive, meeting the needs of the nuclear data and benchmark testing communities. New features include access to P1 sensitivities for neutron scattering angular distribution [2] and constrained Chi sensitivities for the prompt fission neutron energy sampling. Both of these are compatible with covariance data accessed via the JANIS nuclear data software, enabling propagation of the resultant uncertainties in keff to a large series of integral experiment benchmarks. These capabilities are available using a number of different covariance libraries e.g., ENDF/B, JEFF, JENDL and TENDL, allowing comparison of the broad range of results it is possible to obtain. The IRPhE database of reactor physics measurements is now also accessible within the tool in addition to the criticality benchmarks from ICSBEP. Other improvements include the ability to determine and visualise the energy dependence of a given calculated result in order to better identify specific regions of importance or high uncertainty contribution. Sorting and statistical analysis of the selected benchmark suite is now also provided. Examples of the plots generated by the software are included to illustrate such capabilities. Finally, a number of analytical expressions, for example Maxwellian and Watt fission spectra will be included. This will allow the analyst to determine the impact of varying such distributions within the data evaluation, either through adjustment of parameters within the expressions, or by comparison to a more general probability distribution fitted to measured data. The impact of such changes is verified through calculations which are compared to a `direct' measurement found by adjustment of the original ENDF format file.
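
    A minimal sketch of the propagation such a tool performs, assuming a hypothetical 3-group sensitivity vector and relative covariance matrix (not data from NDaST or any library): the "sandwich rule" var(k) = S^T C S.

    ```python
    import numpy as np

    # Sandwich-rule propagation of nuclear data covariance to keff.
    S = np.array([0.12, 0.30, 0.05])       # hypothetical sensitivities (%/%)
    C = np.array([[4.0, 1.0, 0.0],         # hypothetical relative covariance
                  [1.0, 2.5, 0.5],         # matrix, in %^2
                  [0.0, 0.5, 1.0]])

    var_k = S @ C @ S
    print(f"keff uncertainty = {np.sqrt(var_k):.3f} %")
    ```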

  16. Effects of slow breathing rate on heart rate variability and arterial baroreflex sensitivity in essential hypertension.

    PubMed

    Li, Changjun; Chang, Qinghua; Zhang, Jia; Chai, Wenshu

    2018-05-01

    This study investigates the effects of slow breathing on heart rate variability (HRV) and arterial baroreflex sensitivity in essential hypertension. We studied 60 patients with essential hypertension and 60 healthy controls. All subjects underwent controlled breathing at 8 and 16 breaths per minute. Electrocardiogram, respiratory, and blood pressure signals were recorded simultaneously. We studied the effects of slow breathing on heart rate, blood pressure, respiratory peak, high-frequency (HF) power, low-frequency (LF) power, and the LF/HF ratio of HRV with traditional and corrected spectral analysis. We also tested whether slow breathing was capable of modifying baroreflex sensitivity in hypertensive subjects. Slow breathing, compared with 16 breaths per minute, decreased the heart rate and blood pressure (all P < .05) and shifted the respiratory peak toward the left (P < .05). Compared to 16 breaths per minute, traditional spectral analysis showed increased LF power and LF/HF ratio and decreased HF power of HRV at 8 breaths per minute (P < .05). As breathing rate decreased, corrected spectral analysis showed increased HF power and decreased LF power and LF/HF ratio of HRV (P < .05). Compared to controls, resting baroreflex sensitivity was decreased in hypertensive subjects. Slow breathing increased baroreflex sensitivity in hypertensive subjects (from 59.48 ± 6.39 to 78.93 ± 5.04 ms/mm Hg, P < .05) and controls (from 88.49 ± 6.01 to 112.91 ± 7.29 ms/mm Hg, P < .05). Slow breathing can increase HF power and decrease LF power and the LF/HF ratio in essential hypertension, and it increased baroreflex sensitivity in hypertensive subjects. These results demonstrate that slow breathing is indeed capable of shifting the sympatho-vagal balance toward vagal activity and increasing baroreflex sensitivity, suggesting a safe therapeutic approach for essential hypertension.
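
    A minimal sketch of the frequency-domain HRV analysis used in such studies, assuming an evenly resampled RR-interval series (the synthetic two-tone series below is a hypothetical stand-in for real data): Welch power spectral density integrated over the LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands.

    ```python
    import numpy as np
    from scipy.signal import welch
    from scipy.integrate import trapezoid

    fs = 4.0                                   # resampling rate, Hz
    t = np.arange(0, 300, 1 / fs)              # 5-minute record
    rr = (0.8 + 0.03 * np.sin(2 * np.pi * 0.10 * t)     # LF component
              + 0.02 * np.sin(2 * np.pi * 0.25 * t))    # HF component, seconds

    f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=256)

    def band_power(lo, hi):
        m = (f >= lo) & (f < hi)
        return trapezoid(pxx[m], f[m])

    lf, hf = band_power(0.04, 0.15), band_power(0.15, 0.40)
    print(f"LF = {lf:.2e} s^2, HF = {hf:.2e} s^2, LF/HF = {lf / hf:.2f}")
    ```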

  17. Rotordynamics on the PC: Further Capabilities of ARDS

    NASA Technical Reports Server (NTRS)

    Fleming, David P.

    1997-01-01

    Rotordynamics codes for personal computers are now becoming available. One of the most capable codes is Analysis of RotorDynamic Systems (ARDS), which uses the component mode synthesis method to analyze a system of up to 5 rotating shafts. ARDS was originally written for a mainframe computer but has been successfully ported to a PC; its basic capabilities for steady-state and transient analysis were reported in an earlier paper. Additional functions have now been added to the PC version of ARDS. These functions include: 1) Estimation of the peak response following blade loss without resorting to a full transient analysis; 2) Calculation of response sensitivity to input parameters; 3) Formulation of optimum rotor and damper designs to place critical speeds in desirable ranges or minimize bearing loads; 4) Production of Poincaré plots so the presence of chaotic motion can be ascertained. ARDS produces printed and plotted output. The executable code uses the full array sizes of the mainframe version and fits on a high-density floppy disc. Examples of all program capabilities are presented and discussed.

  18. Indoor air quality inspection and analysis system based on gas sensor array

    NASA Astrophysics Data System (ADS)

    Gao, Xiang; Wang, Mingjiang; Fan, Binwen

    2017-08-01

    A detection and analysis system capable of measuring the concentrations of four major gases in indoor air is designed. Four gas sensors form a gas sensor array that measures the concentrations of the four indoor gases; the measured data are then further processed to reduce the cross-sensitivity between the gas sensors and improve the accuracy of detection.
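
    A minimal sketch of one common way to reduce such cross-sensitivity (the abstract does not specify the authors' method): if each sensor responds approximately linearly to every gas, a calibrated response matrix can be inverted to recover the individual concentrations. All numbers below are hypothetical.

    ```python
    import numpy as np

    # Linear decoupling of a 4-sensor array: r = M c  =>  c = M^-1 r.
    M = np.array([[1.00, 0.08, 0.03, 0.01],   # response of sensor j to gas i
                  [0.05, 1.00, 0.10, 0.02],   # (hypothetical calibration data)
                  [0.02, 0.06, 1.00, 0.04],
                  [0.01, 0.03, 0.07, 1.00]])

    r = np.array([0.52, 1.10, 0.35, 0.80])    # raw sensor readings
    c = np.linalg.solve(M, r)                 # estimated gas concentrations
    print("estimated concentrations:", np.round(c, 3))
    ```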

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornemann, Andrea, E-mail: andrea.hornemann@ptb.de; Hoehl, Arne, E-mail: arne.hoehl@ptb.de; Ulm, Gerhard, E-mail: gerhard.ulm@ptb.de

    Bio-diagnostic assays of high complexity rely on nanoscaled assay recognition elements that can provide unique selectivity and design-enhanced sensitivity features. High-throughput performance requires the simultaneous detection of various analytes combined with appropriate bioassay components. Nanoparticle-induced sensitivity enhancement and subsequently multiplex-capable Surface-Enhanced InfraRed Absorption (SEIRA) assay formats fit these purposes well. SEIRA constitutes an ideal platform to isolate the vibrational signatures of targeted bioassay and active molecules. The potential of several targeted biolabels, here fluorophore-labeled antibody conjugates, chemisorbed onto low-cost biocompatible gold nano-aggregate substrates, has been explored for use in assay platforms. Dried films were analyzed by synchrotron-radiation-based FTIR/SEIRA spectro-microscopy, and the resulting complex hyperspectral datasets were submitted to automated statistical analysis, namely Principal Components Analysis (PCA). The relationships between molecular fingerprints were examined to highlight their spectral discrimination capabilities. We demonstrate that robust spectral encoding via SEIRA fingerprints opens up new opportunities for fast, reliable, and multiplexed high-end screening, not only in biodiagnostics but also in in vitro biochemical imaging.
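
    A minimal sketch of the PCA step applied to a hyperspectral matrix (spectra by channels), using a random matrix as a hypothetical stand-in for SEIRA spectro-microscopy data: mean-center, take the SVD, and keep the leading components.

    ```python
    import numpy as np

    # PCA via SVD of a mean-centered (spectra x channels) data matrix.
    rng = np.random.default_rng(6)
    X = rng.standard_normal((200, 600))      # 200 spectra, 600 channels (toy)

    Xc = X - X.mean(axis=0)                  # mean-center each channel
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    scores = U[:, :3] * s[:3]                # sample scores on the first 3 PCs
    explained = s ** 2 / np.sum(s ** 2)
    print("variance explained by PC1-3:", np.round(explained[:3], 3))
    ```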

  20. Simultaneous Solid Phase Extraction and Derivatization of Aliphatic Primary Amines Prior to Separation and UV-Absorbance Detection

    PubMed Central

    Felhofer, Jessica L.; Scida, Karen; Penick, Mark; Willis, Peter A.; Garcia, Carlos D.

    2013-01-01

    To overcome the problem of poor sensitivity of capillary electrophoresis-UV absorbance for the detection of aliphatic amines, a solid phase extraction and derivatization scheme was developed. This work demonstrates successful coupling of amines to a chromophore immobilized on a solid phase and subsequent cleavage and analysis. Although the analysis of many types of amines is relevant for myriad applications, this paper focuses on the derivatization and separation of amines with environmental relevance. This work aims to provide the foundations for future developments of an integrated sample preparation microreactor capable of performing simultaneous derivatization, preconcentration, and sample cleanup for sensitive analysis of primary amines. PMID:24054648

  1. Computer-aided communication satellite system analysis and optimization

    NASA Technical Reports Server (NTRS)

    Stagl, T. W.; Morgan, N. H.; Morley, R. E.; Singh, J. P.

    1973-01-01

    The capabilities and limitations of the various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. A Satellite Telecommunication Analysis and Modeling Program (STAMP), used for costing and sensitivity analysis in the application of communication satellites to educational development, is described. The modifications made to STAMP include: extension of the six-beam capability to eight; addition of the generation of multiple beams from a single reflector system with an array of feeds; improved system costing that reflects the time value of money and growth in the earth terminal population over time, and accounts for various measures of system reliability; inclusion of a model for scintillation at microwave frequencies in the communication link loss model; and an updated technological environment.

  2. Analysis of the passive stabilization of the long duration exposure facility

    NASA Technical Reports Server (NTRS)

    Siegel, S. H.; Vishwanath, N. S.

    1977-01-01

    The nominal Long Duration Exposure Facility (LDEF) configurations and the anticipated orbit parameters are presented. A linear steady state analysis was performed using these parameters. The effects of orbit eccentricity, solar pressure, aerodynamic pressure, magnetic dipole, and the magnetically anchored rate damper were evaluated to determine the configuration sensitivity to variations in these parameters. The worst case conditions for steady state errors were identified, and the performance capability calculated. Garber instability bounds were evaluated for the range of configuration and damping coefficients under consideration. The transient damping capabilities of the damper were examined, and the time constant as a function of damping coefficient and spacecraft moment of inertia determined. The capture capabilities of the damper were calculated, and the results combined with steady state, transient, and Garber instability analyses to select damper design parameters.

  3. Results of an integrated structure-control law design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1988-01-01

    Next generation air and space vehicle designs are driven by increased performance requirements, demanding a high level of design integration between traditionally separate design disciplines. Interdisciplinary analysis capabilities have been developed, for aeroservoelastic aircraft and large flexible spacecraft control for instance, but the requisite integrated design methods are only beginning to be developed. One integrated design method which has received attention is based on hierarchal problem decompositions, optimization, and design sensitivity analyses. This paper highlights a design sensitivity analysis method for Linear Quadratic Cost, Gaussian (LQG) optimal control laws, which predicts the change in the optimal control law due to changes in fixed problem parameters using analytical sensitivity equations. Numerical results of a design sensitivity analysis for a realistic aeroservoelastic aircraft example are presented. In this example, the sensitivity of the optimally controlled aircraft's response to various problem formulation and physical aircraft parameters is determined. These results are used to predict the aircraft's new optimally controlled response if a parameter were to have some other nominal value during the control law design process. The sensitivity results are validated by recomputing the optimal control law for discrete variations in parameters, computing the new actual aircraft response, and comparing with the predicted response. These results show an improvement in sensitivity accuracy for integrated design purposes over methods which do not include changes in the optimal control law. Use of the analytical LQG sensitivity expressions is also shown to be more efficient than finite difference methods for the computation of the equivalent sensitivity information.

  4. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic because of the complexity and large size of the model, meaning that each analysis produces a single-valued result for power capability. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  5. iTOUGH2 v7.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FINSTERLE, STEFAN; JUNG, YOOJIN; KOWALSKY, MICHAEL

    2016-09-15

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. iTOUGH2 performs sensitivity analyses, data-worth analyses, parameter estimation, and uncertainty propagation analyses in geosciences, reservoir engineering, and other application areas. iTOUGH2 supports a number of different combinations of fluids and components (equation-of-state (EOS) modules). In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files using the PEST protocol. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulations for uncertainty propagation analyses. A detailed residual and error analysis is provided. This upgrade includes (a) global sensitivity analysis methods, (b) dynamic memory allocation, (c) additional input features and output analyses, (d) increased forward simulation capabilities, (e) parallel execution on multicore PCs and Linux clusters, and (f) bug fixes. More details can be found at http://esd.lbl.gov/iTOUGH2.
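    The core inverse-modeling idea, minimizing a nonlinear objective of weighted differences between model output and observations, can be sketched in a few lines. The forward model, data, and parameter names below are illustrative stand-ins, not part of iTOUGH2 or TOUGH2.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Toy "forward model": exponential pressure decline with an unknown
    # permeability-like parameter k and a storage-like parameter s.
    def forward(params, t):
        k, s = params
        return s * np.exp(-k * t)

    t_obs = np.linspace(0.0, 10.0, 20)
    true = (0.3, 5.0)
    rng = np.random.default_rng(0)
    sigma = 0.1                                   # observation standard error
    y_obs = forward(true, t_obs) + rng.normal(0.0, sigma, t_obs.size)

    # Objective: weighted differences between model output and observations;
    # least_squares minimizes 0.5 * sum(residuals**2) (Levenberg-Marquardt
    # being one of the gradient-based local algorithms mentioned above).
    def residuals(params):
        return (forward(params, t_obs) - y_obs) / sigma

    fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")
    print("estimated parameters:", fit.x)

    # A linearized residual/error analysis from the Jacobian at the optimum.
    J = fit.jac
    cov = np.linalg.inv(J.T @ J)                  # parameter covariance estimate
    print("parameter standard errors:", np.sqrt(np.diag(cov)))
    ```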

  6. [Parameter sensitivity of simulating net primary productivity of Larix olgensis forest based on BIOME-BGC model].

    PubMed

    He, Li-hong; Wang, Hai-yan; Lei, Xiang-dong

    2016-02-01

    Models based on vegetation ecophysiological processes contain many parameters, and reasonable parameter values will greatly improve simulation ability. Sensitivity analysis, as an important method to screen out the sensitive parameters, can comprehensively analyze how model parameters affect the simulation results. In this paper, we conducted a parameter sensitivity analysis of the BIOME-BGC model with a case study of simulating net primary productivity (NPP) of Larix olgensis forest in Wangqing, Jilin Province. First, with a contrastive analysis between field measurement data and the simulation results, we tested the BIOME-BGC model's capability of simulating the NPP of L. olgensis forest. Then, the Morris and EFAST sensitivity methods were used to screen the sensitive parameters that had a strong influence on NPP. On this basis, we also quantitatively estimated the sensitivity of the screened parameters, and calculated the global, first-order and second-order sensitivity indices. The results showed that the BIOME-BGC model could well simulate the NPP of L. olgensis forest in the sample plot. The Morris sensitivity method provided a reliable parameter sensitivity analysis result under the condition of a relatively small sample size. The EFAST sensitivity method could quantitatively measure the impact of a single parameter on the simulation result as well as the interaction between parameters in the BIOME-BGC model. The sensitive parameters influencing L. olgensis forest NPP were the new stem carbon to new leaf carbon allocation ratio and the leaf carbon to nitrogen ratio, and the effect of their interaction was significantly greater than that of the other parameter interactions.
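    A minimal, self-contained version of the Morris elementary-effects screening used above can be sketched on a stand-in function; a real analysis would call the BIOME-BGC NPP simulation in place of the toy model below, and the trajectory count and step size are illustrative choices.

    ```python
    import numpy as np

    # Stand-in for an ecophysiological model run: y = f(params). The real
    # BIOME-BGC NPP simulation would replace this function.
    def model(x):
        return 3.0 * x[0] + x[1] ** 2 + 0.1 * x[2] + 2.0 * x[0] * x[1]

    def morris_screening(model, n_params, n_traj=50, delta=0.2, seed=1):
        """One-at-a-time elementary effects; returns mu* and sigma per parameter."""
        rng = np.random.default_rng(seed)
        effects = [[] for _ in range(n_params)]
        for _ in range(n_traj):
            x = rng.uniform(0.0, 1.0 - delta, n_params)  # random base point
            y = model(x)
            for i in rng.permutation(n_params):          # perturb one input at a time
                x_new = x.copy()
                x_new[i] += delta
                y_new = model(x_new)
                effects[i].append((y_new - y) / delta)   # elementary effect
                x, y = x_new, y_new
        mu_star = [np.mean(np.abs(e)) for e in effects]  # overall influence
        sigma = [np.std(e) for e in effects]             # nonlinearity/interaction
        return mu_star, sigma

    mu_star, sigma = morris_screening(model, n_params=3)
    for i, (m, s) in enumerate(zip(mu_star, sigma)):
        print(f"param {i}: mu* = {m:.3f}, sigma = {s:.3f}")
    ```

    A large mu* flags an influential parameter, while a large sigma relative to mu* signals nonlinearity or interaction, the kind of effect EFAST then quantifies with first- and second-order indices.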

  7. High-Sensitivity Fast Neutron Detector KNK-2-8M

    NASA Astrophysics Data System (ADS)

    Koshelev, A. S.; Dovbysh, L. Ye.; Ovchinnikov, M. A.; Pikulina, G. N.; Drozdov, Yu. M.; Chuklyaev, S. V.; Pepyolyshev, Yu. N.

    2017-12-01

    The design of the fast neutron detector KNK-2-8M is outlined. The results of the detector study in the pulse counting mode with pulses from 238U nuclei fission in the radiator of the neutron-sensitive section and in the current mode with separation of functional section currents are presented. The possibilities of determination of the effective number of 238U nuclei in the radiator of the neutron-sensitive section are considered. The diagnostic capabilities of the detector in the counting mode are demonstrated, as exemplified by the analysis of reference data on characteristics of neutron fields in the BR-1 reactor hall. The diagnostic capabilities of the detector in the current mode are demonstrated, as exemplified by the results of measurements of 238U fission intensity in the power startup of the BR-K1 reactor in the fission pulse generation mode with delayed neutrons and the detector placed in the reactor cavity in conditions of large-scale variation of the reactor radiation fields.

  8. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  9. Theoretical Noise Analysis on a Position-sensitive Metallic Magnetic Calorimeter

    NASA Technical Reports Server (NTRS)

    Smith, Stephen J.

    2007-01-01

    We report on the theoretical noise analysis for a position-sensitive Metallic Magnetic Calorimeter (MMC), consisting of MMC read-out at both ends of a large X-ray absorber. Such devices are under consideration as alternatives to other cryogenic technologies for future X-ray astronomy missions. We use a finite-element model (FEM) to numerically calculate the signal and noise response at the detector outputs and investigate the correlations between the noise measured at each MMC coupled by the absorber. We then calculate, using the optimal filter concept, the theoretical energy and position resolution across the detector and discuss the trade-offs involved in optimizing the detector design for energy resolution, position resolution and count rate. The results show that, theoretically, the position-sensitive MMC concept offers impressive spectral and spatial resolving capabilities compared to pixel arrays and similar position-sensitive cryogenic technologies using Transition Edge Sensor (TES) read-out.
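    For readers unfamiliar with the optimal filter concept mentioned above, the sketch below evaluates the standard optimal-filter baseline energy resolution, dE_rms = [integral of 4|s(f)|^2/N(f) df]^(-1/2), for an illustrative single-pole pulse template and noise spectrum; the pulse shape and noise numbers are invented for the example and are not taken from the paper's finite-element model.

    ```python
    import numpy as np

    # Illustrative pulse template and noise PSD: s(f) is the signal
    # spectrum per unit energy, N(f) the total noise power spectral density.
    f = np.linspace(0.1, 1e5, 200000)          # Hz
    tau = 1e-3                                  # pulse decay time, s
    s = tau / (1.0 + 2j * np.pi * f * tau)      # Fourier transform of e^{-t/tau}
    N = 1e-8 * (1.0 + (f / 1e4) ** 2)           # toy noise PSD (arbitrary units)

    # Optimal-filter baseline resolution:
    # dE_rms = [ integral 4 |s(f)|^2 / N(f) df ]^{-1/2}
    nep_integrand = 4.0 * np.abs(s) ** 2 / N
    dE_rms = 1.0 / np.sqrt(np.trapz(nep_integrand, f))
    print("baseline resolution (FWHM, arbitrary units):", 2.355 * dE_rms)
    ```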

  10. Single-indicator-based Multidimensional Sensing: Detection and Identification of Heavy Metal Ions and Understanding the Foundations from Experiment to Simulation

    PubMed Central

    Leng, Yumin; Qian, Sihua; Wang, Yuhui; Lu, Cheng; Ji, Xiaoxu; Lu, Zhiwen; Lin, Hengwei

    2016-01-01

    Multidimensional sensing offers advantages in accuracy, diversity and capability for the simultaneous detection and discrimination of multiple analytes; however, previous reports usually require complicated synthesis/fabrication processes and/or a variety of techniques (or instruments) to acquire signals. Therefore, to take full advantage of this concept, simple designs are highly desirable. Herein, a novel concept is conceived to construct multidimensional sensing platforms based on a single indicator that is capable of showing diverse color/fluorescence responses with the addition of different analytes. Through extracting hidden information from these responses, such as red, green and blue (RGB) alterations, a triple-channel-based multidimensional sensing platform could consequently be fabricated, and the RGB alterations are further applicable to standard statistical methods. As a proof-of-concept study, a triple-channel sensing platform is fabricated solely using dithizone with the assistance of cetyltrimethylammonium bromide (CTAB) for hyperchromicity and sensitization, which demonstrates superior capabilities in the detection and identification of ten common heavy metal ions at the standard concentrations of China's wastewater-discharge regulations. Moreover, this sensing platform exhibits promising applications in semi-quantitative and even quantitative analysis of individual heavy metal ions with high sensitivity as well. Finally, density functional theory calculations are performed to reveal the foundations of this analysis. PMID:27146105
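    The triple-channel idea can be illustrated as follows: treat the red, green and blue alterations as a three-dimensional feature vector and apply a standard statistical classifier. The data below are synthetic and the ion labels are placeholders; the study's actual responses come from the dithizone/CTAB system.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Synthetic stand-in data: each analyte (three hypothetical metal ions)
    # shifts the indicator's red/green/blue channels differently.
    rng = np.random.default_rng(0)
    centers = {"Hg2+": (40, -10, 5), "Pb2+": (5, 30, -20), "Cd2+": (-15, 8, 25)}
    X, y = [], []
    for ion, mu in centers.items():
        X.append(rng.normal(mu, 4.0, size=(30, 3)))   # 30 replicate RGB readings
        y += [ion] * 30
    X = np.vstack(X)

    lda = LinearDiscriminantAnalysis()
    lda.fit(X, y)

    # Identify an unknown sample from its RGB alteration alone.
    unknown = np.array([[38.0, -8.0, 6.0]])
    print(lda.predict(unknown))        # expected: ['Hg2+']
    print(lda.predict_proba(unknown))  # class membership probabilities
    ```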

  11. Computer-Aided Design/Manufacturing (CAD/M) for High-Speed Interconnect.

    DTIC Science & Technology

    1981-10-01

    are frequency sensitive and hence lend themselves to frequency domain analysis. Most of the classical microwave analysis is handled in the frequency ...capability integrated into a time-domain analysis program. This approach allows determination of frequency-dependent transmission line (interconnect...the items to consider in any interconnect study is that of the frequency range of interest. This determines whether the interconnections must be treated

  12. Establishing the Capability of a 1D SVAT Modelling Scheme in Predicting Key Biophysical Vegetation Characterisation Parameters

    NASA Astrophysics Data System (ADS)

    Ireland, Gareth; Petropoulos, George P.; Carlson, Toby N.; Purdy, Sarah

    2015-04-01

    Sensitivity analysis (SA) is an integral and important validation check of a computer simulation model before it is used to perform any kind of analysis. In the present work, we present the results of a SA performed on the SimSphere Soil Vegetation Atmosphere Transfer (SVAT) model utilising a cutting-edge and robust Global Sensitivity Analysis (GSA) approach, based on the use of the Gaussian Emulation Machine for Sensitivity Analysis (GEM-SA) tool. The sensitivity of the following model outputs was evaluated: the ambient CO2 concentration and the rate of CO2 uptake by the plant, the ambient O3 concentration, the flux of O3 from the air to the plant/soil boundary, and the flux of O3 taken up by the plant alone. The most sensitive model inputs for the majority of model outputs were related to the structural properties of vegetation, namely the Leaf Area Index, Fractional Vegetation Cover, Cuticle Resistance and Vegetation Height. The external CO2 concentration in the leaf and the O3 concentration in the air also exhibited significant influence on model outputs. This work presents a very important step towards an all-inclusive evaluation of SimSphere. Indeed, results from this study contribute decisively towards establishing its capability as a useful teaching and research tool in modelling Earth's land surface interactions. This is of considerable importance in the light of the rapidly expanding use of this model worldwide, which also includes research conducted by various Space Agencies examining its synergistic use with Earth Observation data towards the development of operational products at a global scale. This research was supported by the European Commission Marie Curie Re-Integration Grant "TRANSFORM-EO". SimSphere is currently maintained and freely distributed by the Department of Geography and Earth Sciences at Aberystwyth University (http://www.aber.ac.uk/simsphere). Keywords: CO2 flux, ambient CO2, O3 flux, SimSphere, Gaussian process emulators, BACCO GEM-SA, TRANSFORM-EO.

  13. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
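    The cost argument above can be made concrete with a toy linear "simulation": for a scalar output, the adjoint method recovers the gradient with respect to all parameters from a single extra solve, matching the result of one forward solve per parameter. The random system below is purely illustrative and stands in for a CFD residual operator.

    ```python
    import numpy as np

    # Toy stand-in for a large simulation: output J = c^T u with A(p) u = b,
    # where many parameters p_k enter the operator A.
    n, n_params = 50, 200
    rng = np.random.default_rng(0)
    A0 = 4.0 * np.eye(n) + rng.normal(0.0, 0.1, (n, n))
    dA = [rng.normal(0.0, 0.01, (n, n)) for _ in range(n_params)]  # dA/dp_k
    b = rng.normal(size=n)
    c = rng.normal(size=n)

    u = np.linalg.solve(A0, b)                   # one forward "simulation"

    # Adjoint method: ONE extra solve gives dJ/dp_k for ALL k.
    # From A u = b: A du + (dA/dp_k) u = 0, so dJ/dp_k = -lam^T (dA/dp_k) u,
    # with the adjoint state lam solving A^T lam = c.
    lam = np.linalg.solve(A0.T, c)
    grad_adjoint = np.array([-lam @ (dAk @ u) for dAk in dA])

    # Forward (direct) method: one extra solve PER parameter.
    grad_forward = np.array([c @ np.linalg.solve(A0, -(dAk @ u)) for dAk in dA])

    print(np.allclose(grad_adjoint, grad_forward))  # True: same gradient
    ```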

  14. A low power ADS for transmutation studies in fast systems

    NASA Astrophysics Data System (ADS)

    Panza, Fabio; Firpo, Gabriele; Lomonaco, Guglielmo; Osipenko, Mikhail; Ricco, Giovanni; Ripani, Marco; Saracco, Paolo; Viberti, Carlo Maria

    2017-12-01

    In this work, we report studies on a fast low power accelerator driven system model as a possible experimental facility, focusing on its capabilities in terms of measurement of relevant integral nuclear quantities. In particular, we performed Monte Carlo simulations of minor actinides and fission products irradiation and estimated the fission rate within fission chambers in the reactor core and the reflector, in order to evaluate the transmutation rates and the measurement sensitivity. We also performed a photo-peak analysis of available experimental data from a research reactor, in order to estimate the expected sensitivity of this analysis method on the irradiation of samples in the ADS considered.

  15. Non-volatile analysis in fruits by laser resonant ionization spectrometry: application to resveratrol (3,5,4'-trihydroxystilbene) in grapes

    NASA Astrophysics Data System (ADS)

    Montero, C.; Orea, J. M.; Soledad Muñoz, M.; Lobo, R. F. M.; González Ureña, A.

    A technique coupling laser desorption (LD) with resonance-enhanced multiphoton ionisation (REMPI) and time-of-flight mass spectrometry (TOFMS) for trace analysis of non-volatile compounds is presented. Its essential features are: (a) an enhanced desorption yield due to the mixing of metal powder with the analyte during sample preparation, and (b) high resolution, great sensitivity and a low detection limit due to laser resonant ionisation and mass spectrometric detection. Application to the resveratrol content of grapes demonstrated the capability of the analytical method, with a sensitivity of 0.2 pg per single laser shot and a detection limit of 5 ppb.

  16. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  17. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  18. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. A sketch of the portfolio-selection idea appears below. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
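    As a simplified stand-in for START's expected-utility portfolio selection (the task names, costs, probabilities, and utilities below are invented, and START's actual optimization is more elaborate), a budget-constrained choice of technology tasks can be framed as a 0/1 knapsack over expected utility:

    ```python
    # Minimal expected-utility portfolio selection: choose the subset of
    # technology tasks maximizing expected utility within a budget.
    def best_portfolio(tasks, budget):
        """0/1 knapsack by dynamic programming over integer costs."""
        best = {0: (0.0, [])}                      # spent -> (utility, chosen)
        for name, cost, p_success, utility in tasks:
            expected = p_success * utility         # expected utility of task
            for spent, (u, chosen) in sorted(best.items(), reverse=True):
                s2 = spent + cost
                if s2 <= budget and expected + u > best.get(s2, (-1.0, []))[0]:
                    best[s2] = (u + expected, chosen + [name])
        return max(best.values())

    tasks = [  # (name, cost $M, success probability, utility if successful)
        ("advanced-solar-array", 4, 0.8, 10.0),
        ("autonomous-nav",       6, 0.5, 18.0),
        ("high-rate-comm",       3, 0.9,  6.0),
        ("cryo-cooler",          5, 0.6, 12.0),
    ]
    utility, chosen = best_portfolio(tasks, budget=10)
    print(chosen, utility)   # best subset within the $10M budget
    ```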

  19. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlomore » radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.« less

  20. HF Propagation sensitivity study and system performance analysis with the Air Force Coverage Analysis Program (AFCAP)

    NASA Astrophysics Data System (ADS)

    Caton, R. G.; Colman, J. J.; Parris, R. T.; Nickish, L.; Bullock, G.

    2017-12-01

    The Air Force Research Laboratory, in collaboration with NorthWest Research Associates, is developing advanced software capabilities for high fidelity simulations of high frequency (HF) sky wave propagation and performance analysis of HF systems. Based on the HiCIRF (High-frequency Channel Impulse Response Function) platform [Nickisch et al., doi:10.1029/2011RS004928], the new Air Force Coverage Analysis Program (AFCAP) provides the modular capabilities necessary for a comprehensive sensitivity study of the large number of variables which define simulations of HF propagation modes. In this paper, we report on an initial exercise of AFCAP to analyze the sensitivities of the tool to various environmental and geophysical parameters. Through examination of the channel scattering function and amplitude-range-Doppler output on two-way propagation paths with injected target signals, we compare simulated returns over a range of geophysical conditions as well as varying definitions for environmental noise, meteor clutter, and sea state models for Bragg backscatter. We also investigate the impacts of including clutter effects due to field-aligned backscatter from small-scale ionization structures at varied levels of severity as defined by the climatological WideBand Model (WBMOD). In the absence of additional user-provided information, AFCAP relies on the International Reference Ionosphere (IRI) model to define the ionospheric state for use in 2D ray tracing algorithms. Because the AFCAP architecture includes the option for insertion of a user-defined gridded ionospheric representation, we compare output from the tool using the IRI and ionospheric definitions from assimilative models such as GPSII (GPS Ionospheric Inversion).

  1. Application, evaluation and sensitivity analysis of the coupled WRF-CMAQ system from regional to urban scales

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science chemical transport model (CTM) capable of simulating the emission, transport and fate of numerous air pollutants. Similarly, the Weather Research and Forecasting (WRF) model is a state-of-the-science meteorological model...

  2. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  3. User’s Manual for SEEK TALK Full Scale Engineering Development Life Cycle Cost (LCC) Model. Volume II. Model Equations and Model Operations.

    DTIC Science & Technology

    1981-04-01

    LIFE CYCLE COST (LCC), LCC SENSITIVITY ANALYSIS, LCC MODEL, REPAIR LEVEL ANALYSIS (RLA)... level analysis capability. Next it provides values for Air Force input parameters and instructions for contractor inputs, general operating... Maintenance Manhour Requirements; Calculation of Repair Level Fractions; Cost Element Equations; Production Cost Element

  4. Computational and Mathematical Modeling of Coupled Superconducting Quantum Interference Devices

    NASA Astrophysics Data System (ADS)

    Berggren, Susan Anne Elizabeth

    This research focuses on conducting an extensive computational investigation and mathematical analysis into the average voltage response of arrays of Superconducting Quantum Interference Devices (SQUIDs). These arrays will serve as the basis for the development of a sensitive, low noise, significantly lower Size, Weight and Power (SWaP) antenna integrated with Low-Noise Amplifier (LNA) using the SQUID technology. The goal for this antenna is to be capable of meeting all requirements for Guided Missile Destroyers (DDG) 1000 class ships for Information Operations/Signals Intelligence (IO/SIGINT) applications in Very High Frequency/Ultra High Frequency (V/UHF) bands. The device will increase the listening capability of receivers by moving technology into a new regime of energy detection allowing wider band, smaller size, more sensitive, stealthier systems. The smaller size and greater sensitivity will allow for ships to be “de-cluttered” of their current large dishes and devices, replacing everything with fewer and smaller SQUID antenna devices. The fewer devices present on the deck of a ship, the more invisible the ship will be to enemy forces. We invent new arrays of SQUIDs, optimized for signal detection with very high dynamic range and excellent spur-free dynamic range, while maintaining extreme small size (and low radar cross section), wide bandwidth, and environmentally noise limited sensitivity, effectively shifting the bottle neck of receiver systems forever away from the antenna itself deeper into the receiver chain. To accomplish these goals we develop and validate mathematical models for different designs of SQUID arrays and use them to invent a new device and systems design. This design is capable of significantly exceeding, per size weight and power, state-of-the-art receiver system measures of performance, such as bandwidth, sensitivity, dynamic range, and spurious-free dynamic range.

  5. Liquid contrabands classification based on energy dispersive X-ray diffraction and hybrid discriminant analysis

    NASA Astrophysics Data System (ADS)

    YangDai, Tianyi; Zhang, Li

    2016-02-01

    Energy dispersive X-ray diffraction (EDXRD) combined with hybrid discriminant analysis (HDA) has been utilized for the first time to classify liquid materials. The XRD spectra of 37 kinds of liquid contrabands and daily supplies were obtained using an EDXRD test bed facility. The unique spectra of different samples reveal XRD's capability to distinguish liquid contrabands from daily supplies. In order to create a system to detect liquid contrabands, the diffraction spectra were subjected to HDA, which is the combination of principal components analysis (PCA) and linear discriminant analysis (LDA). Experiments based on the leave-one-out method demonstrate that HDA is a practical method with higher classification accuracy and lower noise sensitivity than the other methods in this application. The study shows the great capability and potential of the combination of XRD and HDA for liquid contraband classification.
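    The HDA pipeline described above, PCA for dimension reduction followed by LDA, evaluated with the leave-one-out method, can be sketched with synthetic stand-in spectra (the real data are EDXRD spectra, and the peak positions, class sizes, and component count below are invented):

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    # Synthetic stand-in "diffraction spectra": two classes with different
    # peak positions plus noise.
    rng = np.random.default_rng(0)
    channels = np.arange(300)

    def spectrum(peak):
        return np.exp(-0.5 * ((channels - peak) / 8.0) ** 2) \
            + rng.normal(0, 0.05, channels.size)

    X = np.array([spectrum(110) for _ in range(20)]
                 + [spectrum(150) for _ in range(20)])
    y = np.array([0] * 20 + [1] * 20)   # 0 = daily supply, 1 = contraband

    # HDA: PCA for dimension reduction and noise suppression, then LDA on
    # the retained principal components.
    hda = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())

    # Leave-one-out evaluation, matching the experiment in the abstract.
    scores = cross_val_score(hda, X, y, cv=LeaveOneOut())
    print("leave-one-out accuracy:", scores.mean())
    ```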

  6. Sensitivity analysis of infectious disease models: methods, advances and their application

    PubMed Central

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol’ methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
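    One of the classical methods surveyed above, Latin hypercube sampling with partial rank correlation coefficients (LHS-PRCC), can be sketched on a toy transmission-like function; the model form and parameter ranges below are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import qmc, rankdata

    # Toy stand-in for an epidemic model output, e.g. some outbreak measure
    # as a function of transmission, recovery, and mixing parameters.
    def model(beta, gamma, mixing):
        return beta / gamma + 0.1 * mixing

    # Latin hypercube sample of the parameter space.
    sampler = qmc.LatinHypercube(d=3, seed=0)
    u = sampler.random(n=500)
    beta = 0.1 + 0.9 * u[:, 0]
    gamma = 0.05 + 0.45 * u[:, 1]
    mixing = u[:, 2]
    y = model(beta, gamma, mixing)

    def prcc(X, y):
        """Partial rank correlation of each column of X with y."""
        R = np.column_stack([rankdata(c) for c in X.T])   # rank-transform
        ry = rankdata(y)
        out = []
        for i in range(R.shape[1]):
            others = np.delete(R, i, axis=1)
            Z = np.column_stack([np.ones(len(y)), others])
            # residuals after regressing out the other (ranked) parameters
            res_x = R[:, i] - Z @ np.linalg.lstsq(Z, R[:, i], rcond=None)[0]
            res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
            out.append(np.corrcoef(res_x, res_y)[0, 1])
        return out

    print(prcc(np.column_stack([beta, gamma, mixing]), y))
    ```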

  7. Integrating Flight Dynamics & Control Analysis and Simulation in Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben; Berger, Tom; Tischler, Mark B.; Theodore, Colin R; Elmore, Josh; Gallaher, Andrew; Tobias, Eric L.

    2016-01-01

    The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design'), is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs, including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show the sensitivity of the stability and control characteristics and demonstrate an approach to highlighting potential weight savings by identifying over-design.

  8. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  9. Strategies and Approaches to TPS Design

    NASA Technical Reports Server (NTRS)

    Kolodziej, Paul

    2005-01-01

    Thermal protection systems (TPS) insulate planetary probes and Earth re-entry vehicles from the aerothermal heating experienced during hypersonic deceleration to the planet's surface. The systems are typically designed with some additional capability to compensate both for variations in the TPS material and for uncertainties in the heating environment. This additional capability, or robustness, also provides a surge capability for operating under abnormally severe conditions for a short period of time, and for unexpected events, such as meteoroid impact damage, that would detract from the nominal performance. Strategies and approaches to developing robust designs must also minimize mass, because an extra kilogram of TPS displaces one kilogram of payload. Because aircraft structures must likewise be optimized for minimum mass, reliability-based design approaches that minimize mass already exist for mechanical components. Adapting these existing approaches to TPS component design takes advantage of the extensive work, knowledge, and experience from nearly fifty years of reliability-based design of mechanical components. A Non-Dimensional Load Interference (NDLI) method for calculating the thermal reliability of TPS components is presented in this lecture and applied to several examples. A sensitivity analysis from an existing numerical simulation of a carbon phenolic TPS provides insight into the effects of the various design parameters, and is used to demonstrate how sensitivity analysis may be used with NDLI to develop reliability-based designs of TPS components.
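    The load-interference idea underlying NDLI reduces, for independent normally distributed load and capability, to a closed-form reliability index; the sketch below uses invented thermal numbers to show how added mean capability (and hence TPS mass) buys reliability.

    ```python
    from math import sqrt
    from statistics import NormalDist

    # Classic load-interference reliability: the TPS component survives when
    # its thermal capability C exceeds the applied thermal load L. With C and
    # L treated as independent normals, reliability is Phi(beta) with
    # beta = (mu_C - mu_L) / sqrt(sd_C^2 + sd_L^2).
    mu_C, sd_C = 1800.0, 60.0   # capability, e.g. allowable temperature (K)
    mu_L, sd_L = 1600.0, 80.0   # load, e.g. predicted peak temperature (K)

    beta = (mu_C - mu_L) / sqrt(sd_C**2 + sd_L**2)
    reliability = NormalDist().cdf(beta)
    print(f"reliability index beta = {beta:.2f}, R = {reliability:.5f}")

    # Sensitivity of reliability to design margin: a modest increase in mean
    # capability (added TPS thickness, hence mass) raises beta.
    for extra in (0.0, 50.0, 100.0):
        b = (mu_C + extra - mu_L) / sqrt(sd_C**2 + sd_L**2)
        print(extra, NormalDist().cdf(b))
    ```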

  10. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 provides many new capabilities and significant improvements of existing features.

    New capabilities include:
    • ENDF/B-VII.1 nuclear data libraries (CE and MG) with enhanced group structures,
    • Neutron covariance data based on ENDF/B-VII.1 and supplemented with ORNL data,
    • Covariance data for fission product yields and decay constants,
    • Stochastic uncertainty and correlation quantification for any SCALE sequence with Sampler,
    • Parallel calculations with KENO,
    • Problem-dependent temperature corrections for CE calculations,
    • CE shielding and criticality accident alarm system analysis with MAVRIC,
    • CE depletion with TRITON (T5-DEPL/T6-DEPL),
    • CE sensitivity/uncertainty analysis with TSUNAMI-3D,
    • Simplified and efficient LWR lattice physics with Polaris,
    • Large scale detailed spent fuel characterization with ORIGAMI and ORIGAMI Automator,
    • Advanced fission source convergence acceleration capabilities with Sourcerer,
    • Nuclear data library generation with AMPX, and
    • Integrated user interface with Fulcrum.

    Enhanced capabilities include:
    • Accurate and efficient CE Monte Carlo methods for eigenvalue and fixed source calculations,
    • Improved MG resonance self-shielding methodologies and data,
    • Resonance self-shielding with modernized and efficient XSProc integrated into most sequences,
    • Accelerated calculations with TRITON/NEWT (generally 4x faster than SCALE 6.1),
    • Spent fuel characterization with 1470 new reactor-specific libraries for ORIGEN,
    • Modernization of ORIGEN (Chebyshev Rational Approximation Method [CRAM] solver, API for high-performance depletion, new keyword input format),
    • Extension of the maximum mixture number from the previous limit of 2147 to ~2 billion,
    • Nuclear data formats enabling the use of more than 999 energy groups,
    • Updated standard composition library to provide more accurate use of natural abundances, and
    • Numerous other enhancements for improved usability and stability.

  11. The electrophotonic silicon biosensor

    NASA Astrophysics Data System (ADS)

    Juan-Colás, José; Parkin, Alison; Dunn, Katherine E.; Scullion, Mark G.; Krauss, Thomas F.; Johnson, Steven D.

    2016-09-01

    The emergence of personalized and stratified medicine requires label-free, low-cost diagnostic technology capable of monitoring multiple disease biomarkers in parallel. Silicon photonic biosensors combine high-sensitivity analysis with scalable, low-cost manufacturing, but they tend to measure only a single biomarker and provide no information about their (bio)chemical activity. Here we introduce an electrochemical silicon photonic sensor capable of highly sensitive and multiparameter profiling of biomarkers. Our electrophotonic technology consists of microring resonators optimally n-doped to support high Q resonances alongside electrochemical processes in situ. The inclusion of electrochemical control enables site-selective immobilization of different biomolecules on individual microrings within a sensor array. The combination of photonic and electrochemical characterization also provides additional quantitative information and unique insight into chemical reactivity that is unavailable with photonic detection alone. By exploiting both the photonic and the electrical properties of silicon, the sensor opens new modalities for sensing on the microscale.

  12. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
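    The idea of the index, the variance in the output explained by one process's model choice and parameters taken together, can be sketched with a nested Monte Carlo estimator on a toy two-process system; the model forms, weights, and parameter distributions below are illustrative, not those of the paper's groundwater example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two competing "recharge" models and two competing "geology" models,
    # each with its own random parameter (a toy analog of the abstract's
    # synthetic study; forms and model weights are invented).
    recharge_models = [lambda a: 0.2 * a, lambda a: 0.1 * a**2]
    geology_models = [lambda k: 1.0 + k, lambda k: np.exp(0.5 * k)]
    w_rech, w_geo = [0.6, 0.4], [0.5, 0.5]

    def output(rm, a, gm, k):
        # System response combining the two processes.
        return recharge_models[rm](a) * geology_models[gm](k)

    def sample_process(weights):
        return rng.choice(len(weights), p=weights)

    # Nested Monte Carlo estimate of the process sensitivity index for the
    # recharge process: variance of the conditional mean over (model,
    # parameter) draws for recharge, divided by total output variance.
    N, M = 400, 400
    cond_means, all_y = [], []
    for _ in range(N):
        rm, a = sample_process(w_rech), rng.uniform(1.0, 3.0)
        ys = [output(rm, a, sample_process(w_geo), rng.normal(1.0, 0.3))
              for _ in range(M)]
        cond_means.append(np.mean(ys))
        all_y.extend(ys)

    ps_recharge = np.var(cond_means) / np.var(all_y)
    print("process sensitivity index (recharge):", round(ps_recharge, 3))
    ```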

  13. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  14. Using Optically Stimulated Electron Emission as an Inspection Method to Monitor Surface Contamination

    NASA Technical Reports Server (NTRS)

    Lingbloom, Mike S.

    2008-01-01

    During redesign of the Space Shuttle reusable solid rocket motor (RSRM), NASA amended the contract with ATK Launch Systems (then Morton Thiokol Inc.) with Change Order 966 to implement a contamination control and cleanliness verification method. The change order required: (1) a quantitative inspection method, (2) a written record of actual contamination levels versus a known reject level, and (3) a method more sensitive than the existing methods of visual and black light inspection. Black light inspection is only useful for inspection of contaminants that fluoresce near the 365 nm spectral line and is not useful for inspection of most silicones, which will not produce strong fluorescence. Black light inspection conducted by a qualified inspector under controlled light is capable of detecting Conoco HD-2 grease in gross amounts and is very subjective due to operator sensitivity. Optically stimulated electron emission (OSEE), developed at the Materials and Process Laboratory at Marshall Space Flight Center (MSFC), was selected to satisfy Change Order 966. OSEE offers several important advantages over existing laboratory methods of similar sensitivity (e.g., spectroscopy and nonvolatile residue sampling): faster turnaround time, real-time capability, and full-coverage inspection capability. Laboratory methods require sample gathering and in-lab analysis, which sometimes takes several days to return results; this is not practical in a production environment. In addition, these methods do not offer full-coverage inspection of large components.

  15. Potentials and capabilities of the Extracellular Vesicle (EV) Array.

    PubMed

    Jørgensen, Malene Møller; Bæk, Rikke; Varming, Kim

    2015-01-01

    Extracellular vesicles (EVs) and exosomes are difficult to enrich or purify from biofluids, hence quantification and phenotyping of these are tedious and inaccurate. The multiplexed, highly sensitive and high-throughput platform of the EV Array presented by Jørgensen et al. (J Extracell Vesicles, 2013; 2: 10) has been refined regarding the capabilities of the method for characterization and molecular profiling of EV surface markers. Here, we present an extended microarray platform to detect and phenotype plasma-derived EVs (optimized for exosomes) for up to 60 antigens without any enrichment or purification prior to analysis.

  16. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    USGS Publications Warehouse

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching a set of optimal parameters. Nonetheless, the R-SWAT-FME is more attractive due to its instant visualization, and potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics). The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty analysis.

  17. Esophageal cancer detection based on tissue surface-enhanced Raman spectroscopy and multivariate analysis

    NASA Astrophysics Data System (ADS)

    Feng, Shangyuan; Lin, Juqiang; Huang, Zufang; Chen, Guannan; Chen, Weisheng; Wang, Yue; Chen, Rong; Zeng, Haishan

    2013-01-01

    The capability of using silver nanoparticle based near-infrared surface enhanced Raman scattering (SERS) spectroscopy combined with principal component analysis (PCA) and linear discriminate analysis (LDA) to differentiate esophageal cancer tissue from normal tissue was presented. Significant differences in Raman intensities of prominent SERS bands were observed between normal and cancer tissues. PCA-LDA multivariate analysis of the measured tissue SERS spectra achieved diagnostic sensitivity of 90.9% and specificity of 97.8%. This exploratory study demonstrated great potential for developing label-free tissue SERS analysis into a clinical tool for esophageal cancer detection.

  18. Detector Development for the MARE Neutrino Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galeazzi, M.; Bogorin, D.; Molina, R.

    2009-12-16

    The MARE experiment is designed to measure the mass of the neutrino with sub-eV sensitivity by measuring the beta decay of {sup 187}Re with cryogenic microcalorimeters. A preliminary analysis shows that, to achieve the necessary statistics, between 10,000 and 50,000 detectors are likely necessary. We have fabricated and characterized Iridium transition edge sensors with high reproducibility and uniformity for such a large scale experiment. We have also started a full scale simulation of the experimental setup for MARE, including thermalization in the absorber, detector response, and optimum filter analysis, to understand the issues related to reaching a sub-eV sensitivity and to optimize the design of the MARE experiment. We present our characterization of the Ir devices, including reproducibility, uniformity, and sensitivity, and we discuss the implementation and capabilities of our full scale simulation.

  19. Development of computer-aided design system of elastic sensitive elements of automatic metering devices

    NASA Astrophysics Data System (ADS)

    Kalinkina, M. E.; Kozlov, A. S.; Labkovskaia, R. I.; Pirozhnikova, O. I.; Tkalich, V. L.; Shmakov, N. A.

    2018-05-01

    The object of this research is the element base of control and automation system devices, including annular elastic sensitive elements, methods for their modeling, calculation algorithms, and software complexes for automating their design processes. The article is devoted to the development of a computer-aided design system for elastic sensitive elements used in weight- and force-measuring automation devices. Based on mathematical modeling of deformation processes in a solid, as well as the results of static and dynamic analysis, the calculation of elastic elements is carried out using the capabilities of modern software systems based on numerical simulation. In the course of the simulation, the model was divided into a hexagonal grid of finite elements with a maximum size not exceeding 2.5 mm. The results of the modal and dynamic analysis are presented in this article.

  20. Analyzing Feedback Control Systems

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.

    1987-01-01

    Interactive controls analysis (INCA) program developed to provide user-friendly environment for design and analysis of linear control systems, primarily feedback control. Designed for use with both small- and large-order systems. Using interactive-graphics capability, INCA user quickly plots root locus, frequency response, or time response of either continuous-time system or sampled-data system. Configuration and parameters easily changed, allowing user to design compensation networks and perform sensitivity analyses in very convenient manner. Written in Pascal and FORTRAN.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadgu, Teklu; Appel, Gordon John

    Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and knowledge capability for total system performance assessment (TSPA) type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014) and Hadgu et al. (2015). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5 and 11.1.6 was installed on the cluster head node, and its distributed processing capability was mapped on the cluster processors. Other supporting software were tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included verification of the TSPA-LA uncertainty and sensitivity analyses, and preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest version 11.1. All the TSPA-LA uncertainty and sensitivity analyses modeling cases were successfully tested and verified for model reproducibility on the upgraded 2014 server cluster (CL2014). The uncertainty and sensitivity analyses used TSPA-LA modeling cases output generated in FY15 based on GoldSim Version 9.60.300 documented in Hadgu et al. (2015). The model upgrade task successfully converted the Nominal Modeling case to GoldSim Version 11.1. Upgrade of the remaining modeling cases and distributed processing tasks will continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.

  2. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability

    PubMed Central

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P.; Kumar, G. Manoj

    2012-01-01

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real world applications, e.g. quality assurance and process monitoring. Specifically, variability in sample, system and experimental parameters in LIBS studies present a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a non-linear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), due to its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples as well as in related areas of forensic and biological sample analysis. PMID:22292496
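    A sketch of the comparison described above, an RBF-kernel SVM against a PLS-DA-style linear baseline, is given below on synthetic spectra with a nonlinear saturation-like distortion standing in for shot-to-shot self-absorption variability; the data, features, and settings are illustrative, not the study's LIBS measurements.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score, StratifiedKFold

    # Synthetic stand-in for LIBS spectra of two formulations, with a
    # nonlinear "self-absorption"-like distortion varying shot to shot.
    rng = np.random.default_rng(0)
    wl = np.linspace(0, 1, 120)

    def spectra(center, n):
        peaks = np.exp(-0.5 * ((wl - center) / 0.03) ** 2)
        absorb = rng.uniform(0.3, 1.5, (n, 1))          # variable saturation
        return 1.0 - np.exp(-absorb * peaks) + rng.normal(0, 0.02, (n, wl.size))

    X = np.vstack([spectra(0.35, 40), spectra(0.40, 40)])
    y = np.array([0] * 40 + [1] * 40)

    # Nonlinear classifier: RBF-kernel support vector machine.
    svm = SVC(kernel="rbf", C=10.0, gamma="scale")
    print("SVM accuracy:", cross_val_score(svm, X, y, cv=5).mean())

    # PLS-DA-style linear baseline: PLS regression on the class label,
    # thresholded at 0.5.
    def plsda_score(X, y, cv=5):
        accs = []
        for tr, te in StratifiedKFold(cv, shuffle=True, random_state=0).split(X, y):
            pls = PLSRegression(n_components=5).fit(X[tr], y[tr])
            accs.append(np.mean((pls.predict(X[te]).ravel() > 0.5) == y[te]))
        return np.mean(accs)

    print("PLS-DA accuracy:", plsda_score(X, y))
    ```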

  3. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability.

    PubMed

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P; Kumar Gundawar, Manoj

    2012-03-20

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g., quality assurance and process monitoring. Specifically, variability in sample, system, and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a nonlinear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that the application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), because of its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples, as well as in related areas of forensic and biological sample analysis.
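
    As an illustration of the nonlinear-versus-linear comparison described in this record, the sketch below trains an RBF-kernel SVM and a linear classifier on synthetic "spectra" with multiplicative shot-to-shot variability of the kind that ablation fluctuations produce. It uses scikit-learn; the data, class structure, and hyperparameters are invented stand-ins, not the authors' LIBS dataset or their SIMCA/PLS-DA implementations.

      # Nonlinear (RBF SVM) vs. linear classification on synthetic spectra;
      # all data and hyperparameters below are illustrative assumptions.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      n_per_class, n_channels = 60, 200
      base = rng.random(n_channels)
      X, y = [], []
      for label, shift in [(0, 0.0), (1, 0.15)]:
          mean_spectrum = base + shift
          scale = rng.uniform(0.5, 1.5, size=(n_per_class, 1))  # ablation-like intensity variability
          X.append(scale * (mean_spectrum + 0.05 * rng.standard_normal((n_per_class, n_channels))))
          y += [label] * n_per_class
      X, y = np.vstack(X), np.array(y)

      linear = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
      svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
      for name, clf in [("linear baseline", linear), ("RBF SVM", svm)]:
          print(f"{name}: mean CV accuracy = {cross_val_score(clf, X, y, cv=5).mean():.3f}")

    The multiplicative scale factor mimics pulse-to-pulse intensity variation; a nonlinear decision boundary typically tolerates it better than a purely linear one, which is the effect the record reports.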

  4. Sensitivity-Uncertainty Based Nuclear Criticality Safety Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-09-20

    These are slides from a seminar given to the University of New Mexico Nuclear Engineering Department. Whisper is a statistical analysis package developed to support nuclear criticality safety (NCS) validation. It uses the sensitivity profile data for an application as computed by MCNP6, along with covariance files for the nuclear data, to determine a baseline upper subcritical limit for the application. Whisper and its associated benchmark files are developed and maintained as part of MCNP6 and will be distributed with all future releases of MCNP6. Although sensitivity-uncertainty methods for NCS validation have been under development for 20 years, continuous-energy Monte Carlo codes such as MCNP could not determine the required adjoint-weighted tallies for sensitivity profiles. The recent introduction of the iterated fission probability method into MCNP led to the rapid development of sensitivity analysis capabilities for MCNP6 and the development of Whisper. Sensitivity-uncertainty based methods represent the future for NCS validation, making full use of today's computer power to codify past approaches based largely on expert judgment. Validation results are defensible, auditable, and repeatable as needed with different assumptions and process models. The new methods can supplement, support, and extend traditional validation approaches.
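
    A minimal sketch of the sensitivity-uncertainty "sandwich rule" that underlies packages of this kind: the data-induced relative variance of k-eff is S^T C S, where S is the application's sensitivity profile and C is the nuclear-data relative covariance matrix. The numbers below are invented for illustration only; real tools obtain S from adjoint-weighted Monte Carlo tallies and C from evaluated covariance libraries.

      import numpy as np

      # Hypothetical sensitivity profile S (relative dk/k per relative dsigma/sigma
      # for a few nuclide-reaction pairs) and relative covariance matrix C.
      S = np.array([0.30, -0.12, 0.05])
      C = np.array([[4.0e-4, 1.0e-4, 0.0],
                    [1.0e-4, 9.0e-4, 0.0],
                    [0.0,    0.0,    2.5e-4]])

      var_k = S @ C @ S   # sandwich rule: data-induced relative variance of k-eff
      print(f"relative std. dev. of k-eff: {np.sqrt(var_k):.2%}")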

  5. Integrated Droplet-Based Microextraction with ESI-MS for Removal of Matrix Interference in Single-Cell Analysis.

    PubMed

    Zhang, Xiao-Chao; Wei, Zhen-Wei; Gong, Xiao-Yun; Si, Xing-Yu; Zhao, Yao-Yao; Yang, Cheng-Dui; Zhang, Si-Chun; Zhang, Xin-Rong

    2016-04-29

    Integrating droplet-based microfluidics with mass spectrometry is essential to high-throughput and multiple analysis of single cells. Nevertheless, matrix effects such as the interference of culture medium and intracellular components influence the sensitivity and the accuracy of results in single-cell analysis. To resolve this problem, we developed a method that integrated droplet-based microextraction with single-cell mass spectrometry. A specific extraction solvent was used to selectively obtain intracellular components of interest and remove the interference of other components. Using this method, UDP-GlcNAc, GSH, GSSG, AMP, ADP and ATP were successfully detected in single MCF-7 cells. We also applied the method to study the change of unicellular metabolites in the biological process of dysfunctional oxidative phosphorylation. The method not only realizes matrix-free, selective and sensitive detection of metabolites in single cells, but also has the capability for reliable and high-throughput single-cell analysis.

  6. Cavity-Enhanced Absorption Spectroscopy and Photoacoustic Spectroscopy for Human Breath Analysis

    NASA Astrophysics Data System (ADS)

    Wojtas, J.; Tittel, F. K.; Stacewicz, T.; Bielecki, Z.; Lewicki, R.; Mikolajczyk, J.; Nowakowski, M.; Szabra, D.; Stefanski, P.; Tarka, J.

    2014-12-01

    This paper describes two different optoelectronic detection techniques: cavity-enhanced absorption spectroscopy and photoacoustic spectroscopy. These techniques are designed to perform sensitive analysis of trace gas species in exhaled human breath for medical applications. With such systems, the detection of pathogenic changes at the molecular level can be achieved. The presence of certain gases (biomarkers) at increased concentration levels indicates numerous human diseases. Diagnosis of a disease in its early stage would significantly increase the chances for effective therapy. Non-invasive, real-time measurement with high sensitivity and selectivity, and minimal discomfort for patients, are the main advantages of human breath analysis. At present, monitoring of volatile biomarkers in breath is commonly useful for diagnostic screening, treatment for specific conditions, therapy monitoring, control of exogenous gases (such as bacterial and poisonous emissions), as well as for analysis of metabolic gases.

  7. LSENS: A General Chemical Kinetics and Sensitivity Analysis Code for homogeneous gas-phase reactions. Part 3: Illustrative test problems

    NASA Technical Reports Server (NTRS)

    Bittker, David A.; Radhakrishnan, Krishnan

    1994-01-01

    LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 3 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 3 explains the kinetics and kinetics-plus-sensitivity analysis problems supplied with LSENS and presents sample results. These problems illustrate the various capabilities of, and reaction models that can be solved by, the code and may provide a convenient starting point for the user to construct the problem data file required to execute LSENS. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions.
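
    To make the notion of a kinetics sensitivity coefficient concrete, the sketch below estimates d[A]/dk for a toy static-system reaction A -> B by central finite differences around an ODE solve, and checks it against the analytic value. This is only an illustration of the quantity being computed; LSENS itself integrates coupled sensitivity equations, which is more accurate and efficient than differencing repeated runs.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Toy isothermal, constant-volume problem: A -> B with rate constant k.
      # For [A](0) = 1, [A](t) = exp(-k t), so d[A]/dk = -t exp(-k t) exactly.
      def a_conc(k, t_eval, a0=1.0):
          rhs = lambda t, y: [-k * y[0], k * y[0]]
          sol = solve_ivp(rhs, (0.0, t_eval[-1]), [a0, 0.0], t_eval=t_eval, rtol=1e-8)
          return sol.y[0]

      k0, dk = 2.0, 1e-4
      t = np.linspace(0.0, 2.0, 5)
      sens = (a_conc(k0 + dk, t) - a_conc(k0 - dk, t)) / (2 * dk)  # central difference
      for ti, si in zip(t, sens):
          print(f"t = {ti:.2f}  d[A]/dk ~= {si:+.4f}   (exact: {-ti * np.exp(-k0 * ti):+.4f})")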

  8. Critical factors determining the quantification capability of matrix-assisted laser desorption/ionization–time-of-flight mass spectrometry

    PubMed Central

    Wang, Chia-Chen; Lai, Yin-Hung; Ou, Yu-Meng; Chang, Huan-Tsung; Wang, Yi-Sheng

    2016-01-01

    Quantitative analysis with mass spectrometry (MS) is important but challenging. Matrix-assisted laser desorption/ionization (MALDI) coupled with time-of-flight (TOF) MS offers superior sensitivity, resolution and speed, but such techniques have numerous disadvantages that hinder quantitative analyses. This review summarizes essential obstacles to analyte quantification with MALDI-TOF MS, including the complex ionization mechanism of MALDI, sensitive characteristics of the applied electric fields and the mass-dependent detection efficiency of ion detectors. General quantitative ionization and desorption interpretations of ion production are described. Important instrument parameters and available methods of MALDI-TOF MS used for quantitative analysis are also reviewed. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644968

  9. Improving the analysis of slug tests

    USGS Publications Warehouse

    McElwee, C.D.

    2002-01-01

    This paper examines several techniques that have the potential to improve the quality of slug test analysis. These techniques are applicable in the range from low hydraulic conductivities with overdamped responses to high hydraulic conductivities with nonlinear oscillatory responses. Four techniques for improving slug test analysis will be discussed: use of an extended capability nonlinear model, sensitivity analysis, correction for acceleration and velocity effects, and use of multiple slug tests. The four-parameter nonlinear slug test model used in this work is shown to allow accurate analysis of slug tests with widely differing character. The parameter β represents a correction to the water column length caused primarily by radius variations in the wellbore and is most useful in matching the oscillation frequency and amplitude. The water column velocity at slug initiation (V0) is an additional model parameter, which would ideally be zero but may not be due to the initiation mechanism. The remaining two model parameters are A (parameter for nonlinear effects) and K (hydraulic conductivity). Sensitivity analysis shows that in general β and V0 have the lowest sensitivity and K usually has the highest. However, for very high K values the sensitivity to A may surpass the sensitivity to K. Oscillatory slug tests involve higher accelerations and velocities of the water column; thus, the pressure transducer responses are affected by these factors and the model response must be corrected to allow maximum accuracy for the analysis. The performance of multiple slug tests will allow some statistical measure of the experimental accuracy and of the reliability of the resulting aquifer parameters. © 2002 Elsevier Science B.V. All rights reserved.
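
    The fitting workflow behind such an analysis can be sketched with a simplified damped-oscillator stand-in for an oscillatory slug-test response; the four-parameter nonlinear model (β, V0, A, K) discussed above is more involved. All data and parameter values below are synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      # Simplified linear stand-in for an oscillatory head response:
      #   h(t) = h0 * exp(-a t) * cos(w t + phi)
      # This only illustrates the fit; it is not the paper's nonlinear model.
      def model(t, h0, a, w, phi):
          return h0 * np.exp(-a * t) * np.cos(w * t + phi)

      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 20.0, 200)
      h_obs = model(t, 0.5, 0.2, 1.5, 0.0) + 0.01 * rng.standard_normal(t.size)

      popt, pcov = curve_fit(model, t, h_obs, p0=[0.4, 0.1, 1.0, 0.0])
      perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties
      print("fitted [h0, a, w, phi]:", np.round(popt, 3))
      print("std errors:            ", np.round(perr, 3))

    Repeating such fits over multiple slug tests, as the paper recommends, gives a statistical measure of the reliability of the recovered parameters.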

  10. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  11. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis

    DOE PAGES

    Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian

    2017-01-31

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for atomic level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as Degree of Rate Control, has been hampered by the exuberant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. In this study we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models, as we demonstrate using CO oxidation on RuO2(110) as a prototypical reaction. In the first step, we utilize the Fisher Information Matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on linear response theory for calculating the sensitivity measure for non-critical conditions, which covers the majority of cases. Finally, we adapt a method of sampling coupled finite differences for evaluating the sensitivity measure for lattice-based models. This allows efficient evaluation even in critical regions near a second order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nanoscale design of heterogeneous catalysts.

  12. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis.

    PubMed

    Hoffmann, Max J; Engelmann, Felix; Matera, Sebastian

    2017-01-28

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for the atomic level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as degree of rate control, has been hampered by its exuberant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. In this study, we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models as we demonstrate using the CO oxidation on RuO2(110) as a prototypical reaction. In the first step, we utilize the Fisher information matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on the linear response theory for calculating the sensitivity measure for non-critical conditions which covers the majority of cases. Finally, we adapt a method for sampling coupled finite differences for evaluating the sensitivity measure for lattice based models. This allows for an efficient evaluation even in critical regions near a second order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nano-scale design of heterogeneous catalysts.

  13. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for atomic level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as Degree of Rate Control, has been hampered by the exuberant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. In this study we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models, as we demonstrate using CO oxidation on RuO2(110) as a prototypical reaction. In the first step, we utilize the Fisher Information Matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on linear response theory for calculating the sensitivity measure for non-critical conditions, which covers the majority of cases. Finally, we adapt a method of sampling coupled finite differences for evaluating the sensitivity measure for lattice-based models. This allows efficient evaluation even in critical regions near a second order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nanoscale design of heterogeneous catalysts.

  14. A practical approach to the sensitivity analysis for kinetic Monte Carlo simulation of heterogeneous catalysis

    NASA Astrophysics Data System (ADS)

    Hoffmann, Max J.; Engelmann, Felix; Matera, Sebastian

    2017-01-01

    Lattice kinetic Monte Carlo simulations have become a vital tool for predictive quality atomistic understanding of complex surface chemical reaction kinetics over a wide range of reaction conditions. In order to expand their practical value in terms of giving guidelines for the atomic level design of catalytic systems, it is very desirable to readily evaluate a sensitivity analysis for a given model. The result of such a sensitivity analysis quantitatively expresses the dependency of the turnover frequency, being the main output variable, on the rate constants entering the model. In the past, the application of sensitivity analysis, such as degree of rate control, has been hampered by its exuberant computational effort required to accurately sample numerical derivatives of a property that is obtained from a stochastic simulation method. In this study, we present an efficient and robust three-stage approach that is capable of reliably evaluating the sensitivity measures for stiff microkinetic models as we demonstrate using the CO oxidation on RuO2(110) as a prototypical reaction. In the first step, we utilize the Fisher information matrix for filtering out elementary processes which only yield negligible sensitivity. Then we employ an estimator based on the linear response theory for calculating the sensitivity measure for non-critical conditions which covers the majority of cases. Finally, we adapt a method for sampling coupled finite differences for evaluating the sensitivity measure for lattice based models. This allows for an efficient evaluation even in critical regions near a second order phase transition that are hitherto difficult to control. The combined approach leads to significant computational savings over straightforward numerical derivatives and should aid in accelerating the nano-scale design of heterogeneous catalysts.
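
    The sensitivity measure at issue, the degree of rate control X_i = (k_i/TOF)(dTOF/dk_i), can be illustrated on a deterministic toy model where the turnover frequency has a closed form; for a stochastic kMC model each TOF evaluation is noisy, which is precisely why the filtering, linear-response, and coupled-finite-difference stages described above are needed. The two-step example below is hypothetical, not the authors' code.

      import numpy as np

      # Toy steady-state microkinetic model: two sequential steps give
      # TOF = k1*k2/(k1+k2), so analytically X1 = k2/(k1+k2), X2 = k1/(k1+k2).
      def tof(k):
          return k[0] * k[1] / (k[0] + k[1])

      def degree_of_rate_control(tof_fn, k, rel=1e-5):
          # Central finite differences with relative perturbations of each k_i.
          x = np.empty(len(k))
          for i in range(len(k)):
              kp, km = k.copy(), k.copy()
              kp[i] *= (1 + rel)
              km[i] *= (1 - rel)
              x[i] = (tof_fn(kp) - tof_fn(km)) / (2 * rel * tof_fn(k))
          return x

      k = np.array([1.0, 4.0])
      print("numeric  X:", degree_of_rate_control(tof, k))   # ~ [0.8, 0.2]
      print("analytic X:", [k[1] / k.sum(), k[0] / k.sum()])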

  15. Some new features of Direct Analysis in Real Time mass spectrometry utilizing the desorption at an angle option.

    PubMed

    Chernetsova, Elena S; Revelsky, Alexander I; Morlock, Gertrud E

    2011-08-30

    The present study is a first step towards the unexplored capabilities of Direct Analysis in Real Time (DART) mass spectrometry (MS) arising from the possibility of desorption at an angle: scanning analysis of surfaces, including the coupling of thin-layer chromatography (TLC) with DART-MS, and a more sensitive analysis due to the preliminary concentration of analytes dissolved in large volumes of liquids on glass surfaces. In order to select the most favorable conditions for DART-MS analysis, proper positioning of samples is important. Therefore, a simple and cheap technique for the visualization of the impact region of the DART gas stream onto a substrate was developed. A filter paper or TLC plate, previously loaded with the analyte, was immersed in a derivatization solution. On this substrate, owing to the impact of the hot DART gas, reaction of the analyte to a colored product occurred. An improved capability of detection of DART-MS for the analysis of liquids was demonstrated by applying large volumes of model solutions of coumaphos into small glass vessels and drying these solutions prior to DART-MS analysis under ambient conditions. This allowed the introduction of quantities of analyte increased by up to more than two orders of magnitude compared with the conventional DART-MS analysis of liquids. Through this improved detectability, the capabilities of DART-MS in trace analysis could be strengthened. Copyright © 2011 John Wiley & Sons, Ltd.

  16. Optical coherence tomography in the diagnosis of dysplasia and adenocarcinoma in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Gladkova, N. D.; Zagaynova, E. V.; Zuccaro, G.; Kareta, M. V.; Feldchtein, F. I.; Balalaeva, I. V.; Balandina, E. B.

    2007-02-01

    Statistical analysis of endoscopic optical coherence tomography (EOCT) surveillance of 78 patients with Barrett's esophagus (BE) is presented in this study. In retrospective open detection of early malignancy (including high-grade dysplasia and intramucosal adenocarcinoma (IMAC)), the sensitivity of the OCT device was 75%, specificity 82%, diagnostic accuracy 80%, positive predictive value 60%, and negative predictive value 87%. In the open recognition of IMAC, sensitivity was 81% and specificity 85%. Results of a blind recognition on the same material were similar: sensitivity 77%, specificity 85%, diagnostic accuracy 82%, positive predictive value 70%, and negative predictive value 87%. As the endoscopic detection of early malignancy is problematic, OCT holds great promise in enhancing the diagnostic capability of clinical GI endoscopy.
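
    The diagnostic statistics quoted in this record all derive from a 2x2 confusion matrix; a minimal sketch follows, with made-up counts rather than the study's actual data.

      # Diagnostic statistics from a 2x2 confusion matrix; the counts below
      # are invented for illustration, not the study's results.
      def diagnostics(tp, fp, fn, tn):
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "accuracy":    (tp + tn) / (tp + fp + fn + tn),
              "PPV":         tp / (tp + fp),
              "NPV":         tn / (tn + fn),
          }

      for name, value in diagnostics(tp=24, fp=16, fn=8, tn=74).items():
          print(f"{name:>11}: {value:.0%}")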

  17. Behavior sensitivities for control augmented structures

    NASA Technical Reports Server (NTRS)

    Manning, R. A.; Lust, R. V.; Schmit, L. A.

    1987-01-01

    During the past few years it has been recognized that combining passive structural design methods with active control techniques offers the prospect of being able to find substantially improved designs. These developments have stimulated interest in augmenting structural synthesis by adding active control system design variables to those usually considered in structural optimization. An essential step in extending the approximation concepts approach to control augmented structural synthesis is the development of a behavior sensitivity analysis capability for determining rates of change of dynamic response quantities with respect to changes in structural and control system design variables. Behavior sensitivity information is also useful for man-machine interactive design as well as in the context of system identification studies. Behavior sensitivity formulations for both steady state and transient response are presented and the quality of the resulting derivative information is evaluated.

  18. Sensor Applications of Soft Magnetic Materials Based on Magneto-Impedance, Magneto-Elastic Resonance and Magneto-Electricity

    PubMed Central

    García-Arribas, Alfredo; Gutiérrez, Jon; Kurlyandskaya, Galina V.; Barandiarán, José M.; Svalov, Andrey; Fernández, Eduardo; Lasheras, Andoni; de Cos, David; Bravo-Imaz, Iñaki

    2014-01-01

    The outstanding properties of selected soft magnetic materials make them successful candidates for building high performance sensors. In this paper we present our recent work regarding different sensing technologies based on the coupling of the magnetic properties of soft magnetic materials with their electric or elastic properties. First, we report the influence of the thickness of Permalloy films in multilayer-sandwiched structures on the magneto-impedance response. An impedance change of 270% was found in the best conditions upon the application of a magnetic field, with a low-field sensitivity of 140%/Oe. Second, the magneto-elastic resonance of amorphous ribbons is used to demonstrate the possibility of sensitively measuring the viscosity of fluids, with the aim of developing an on-line, real-time sensor capable of assessing the state of degradation of lubricant oils in machinery. A novel analysis method is shown to sensitively reveal the changes of the damping parameter of the magnetoelastic oscillations at the resonance as a function of the oil viscosity. Finally, the properties and performance of magneto-electric laminated composites of amorphous magnetic ribbons and piezoelectric polymer films are investigated, demonstrating magnetic field detection capabilities below 2.7 nT. PMID:24776934

  19. Surface plasmon resonance biosensors for highly sensitive detection in real samples

    NASA Astrophysics Data System (ADS)

    Sepúlveda, B.; Carrascosa, L. G.; Regatos, D.; Otte, M. A.; Fariña, D.; Lechuga, L. M.

    2009-08-01

    In this work we summarize the main results obtained with the portable surface plasmon resonance (SPR) device developed in our group (commercialised by SENSIA, SL, Spain), highlighting its applicability for the real-time detection of extremely low concentrations of toxic pesticides in environmental water samples. In addition, we show applications in clinical diagnosis: on the one hand, the real-time and label-free detection of DNA hybridization and single point mutations at the gene BRCA-1, related to the predisposition in women to develop an inherited breast cancer, and, on the other hand, the analysis of protein biomarkers in biological samples (urine, serum) for early detection of diseases. Despite the large number of applications already proven, the SPR technology has two main drawbacks: (i) insufficient sensitivity for some specific applications (where pM-fM or single-molecule detection is needed) and (ii) low multiplexing capabilities. To address these drawbacks, we are working on several alternative configurations, such as the magneto-optical surface plasmon resonance (MOSPR) sensor, based on a combination of magneto-optical and ferromagnetic materials, to improve the SPR sensitivity, and localized surface plasmon resonance (LSPR), based on nanostructures (nanoparticles, nanoholes, ...), for higher multiplexing capabilities.

  20. Design and characterization of planar capacitive imaging probe based on the measurement sensitivity distribution

    NASA Astrophysics Data System (ADS)

    Yin, X.; Chen, G.; Li, W.; Hutchins, D. A.

    2013-01-01

    Previous work indicated that the capacitive imaging (CI) technique is a useful NDE tool which can be used on a wide range of materials, including metals, glass/carbon fibre composite materials and concrete. The imaging performance of the CI technique for a given application is determined by design parameters and characteristics of the CI probe. In this paper, a rapid method for calculating the whole probe sensitivity distribution based on the finite element model (FEM) is presented to provide a direct view of the imaging capabilities of the planar CI probe. Sensitivity distributions of CI probes with different geometries were obtained. Influencing factors on sensitivity distribution were studied. Comparisons between CI probes with point-to-point triangular electrode pair and back-to-back triangular electrode pair were made based on the analysis of the corresponding sensitivity distributions. The results indicated that the sensitivity distribution could be useful for optimising the probe design parameters and predicting the imaging performance.

  1. Secondary Ion Mass Spectrometers (SIMS) for calcium isotope measurements as an application to biological samples

    NASA Astrophysics Data System (ADS)

    Craven, S. M.; Hoenigman, J. R.; Moddeman, W. E.

    1981-11-01

    The potential use of secondary ion mass spectroscopy (SIMS) to analyze biological samples for calcium isotopes is discussed. Comparison of UTI and Extranuclear based quadrupole systems is made on the basis of the analysis of CaO and calcium metal. The Extranuclear quadrupole based system is superior in resolution and sensitivity to the UTI system and is recommended. For determination of calcium isotopes to within an accuracy of a few percent, a high resolution quadrupole, such as the Extranuclear, and signal averaging capability are required. Charge neutralization will be mandated for calcium oxide, calcium nitrate, or calcium oxalate. SIMS is not capable of the high precision and high accuracy results possible with thermal ionization methods, but where faster analysis is desirable with an accuracy of a few percent, SIMS is a viable alternative.

  2. Orbit/attitude estimation with LANDSAT Landmark data

    NASA Technical Reports Server (NTRS)

    Hall, D. L.; Waligora, S.

    1979-01-01

    The use of LANDSAT landmark data for orbit/attitude and camera bias estimation was studied. The preliminary results of these investigations are presented. The Goddard Trajectory Determination System (GTDS) error analysis capability was used to perform error analysis studies. A number of questions were addressed, including parameter observability and sensitivity, and the effects of data span, density, and distribution and of a priori covariance weighting on the solve-for parameter errors. The use of the GTDS differential correction capability with actual landmark data was examined. The rms line and element observation residuals were studied as a function of the solve-for parameter set, a priori covariance weighting, force model, attitude model, and data characteristics. Sample results are presented. Finally, verification and preliminary system evaluation of the LANDSAT NAVPAK system for sequential (extended Kalman filter) estimation of orbit, attitude, and camera bias parameters is given.

  3. Orbital transfer vehicle concept definition and system analysis study, 1986. Volume 9: Study extension results

    NASA Technical Reports Server (NTRS)

    Kofal, Allen E.

    1987-01-01

    The purpose of this extension to the OTV Concept Definition and Systems Analysis Study was to improve the definition of the OTV Program that will be most beneficial to the nation in the 1995 to 2010 timeframe. The implications of the defined mission and defined launch vehicle are investigated. The key mission requirements identified for the Space Transportation Architecture Study (STAS) were established and reflect a need for early capability and more ambitious capability growth. The key technical objectives and related issues addressed are summarized. The analyses of selected areas including aerobrake design, proximity operations, and the balance of EVA and IVA operations used in the support of the OTV at the space-base were enhanced. Sensitivity studies were conducted to establish how the OTV program should be tailored to meet changing circumstances.

  4. A Sensitivity Study of the Aircraft Vortex Spacing System (AVOSS) Wake Predictor Algorithm to the Resolution of Input Meteorological Profiles

    NASA Technical Reports Server (NTRS)

    Rutishauser, David K.; Butler, Patrick; Riggins, Jamie

    2004-01-01

    The AVOSS project demonstrated the feasibility of applying aircraft wake vortex sensing and prediction technologies to safe aircraft spacing for single runway arrivals. On average, AVOSS provided spacing recommendations that were less than the current FAA prescribed spacing rules, resulting in a potential airport efficiency gain. Subsequent efforts have included quantifying the operational specifications for future Wake Vortex Advisory Systems (WakeVAS). In support of these efforts, each of the candidate subsystems for a WakeVAS must be specified. The specifications represent a consensus between the high-level requirements and the capabilities of the candidate technologies. This report documents the beginnings of an effort to quantify the capabilities of the AVOSS Prediction Algorithm (APA). Specifically, the APA horizontal position and circulation strength output sensitivity to the resolution of its wind and turbulence inputs is examined. The results of this analysis have implications for the requirements of the meteorological sensing and prediction systems comprising a WakeVAS implementation.

  5. Interval analysis of interictal EEG: pathology of the alpha rhythm in focal epilepsy

    NASA Astrophysics Data System (ADS)

    Pyrzowski, Jan; Siemiński, Mariusz; Sarnowska, Anna; Jedrzejczak, Joanna; Nyka, Walenty M.

    2015-11-01

    The contemporary use of interictal scalp electroencephalography (EEG) in the context of focal epilepsy workup relies on the visual identification of interictal epileptiform discharges. The high-specificity performance of this marker comes, however, at a cost of only moderate sensitivity. Zero-crossing interval analysis is an alternative to Fourier analysis for the assessment of the rhythmic component of EEG signals. We applied this method to standard EEG recordings of 78 patients divided into 4 subgroups: temporal lobe epilepsy (TLE), frontal lobe epilepsy (FLE), psychogenic nonepileptic seizures (PNES) and nonepileptic patients with headache. Interval-analysis based markers were capable of effectively discriminating patients with epilepsy from those in control subgroups (AUC~0.8) with diagnostic sensitivity potentially exceeding that of visual analysis. The identified putative epilepsy-specific markers were sensitive to the properties of the alpha rhythm and displayed weak or non-significant dependences on the number of antiepileptic drugs (AEDs) taken by the patients. Significant AED-related effects were concentrated in the theta interval range and an associated marker allowed for identification of patients on AED polytherapy (AUC~0.9). Interval analysis may thus, in perspective, increase the diagnostic yield of interictal scalp EEG. Our findings point to the possible existence of alpha rhythm abnormalities in patients with epilepsy.
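
    A minimal sketch of zero-crossing interval analysis on a synthetic alpha-band-like signal: measure the time between successive zero crossings and histogram the interval lengths, so that ~10 Hz activity concentrates intervals near 50 ms. The sampling rate, interval bins, and signal model are illustrative assumptions, not the study's protocol.

      import numpy as np

      fs = 250.0                                   # sampling rate (Hz), assumed
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(2)
      x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

      # Indices just before a sign change mark zero crossings.
      idx = np.nonzero(x[:-1] * x[1:] < 0)[0]
      intervals_ms = np.diff(idx) / fs * 1e3       # crossing-to-crossing intervals
      counts, edges = np.histogram(intervals_ms, bins=[0, 25, 40, 62.5, 125, 250])
      for lo, hi, c in zip(edges[:-1], edges[1:], counts):
          print(f"{lo:6.1f}-{hi:6.1f} ms: {c} intervals")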

  6. Capacitive chemical sensor

    DOEpatents

    Manginell, Ronald P; Moorman, Matthew W; Wheeler, David R

    2014-05-27

    A microfabricated capacitive chemical sensor can be used as an autonomous chemical sensor or as an analyte-sensitive chemical preconcentrator in a larger microanalytical system. The capacitive chemical sensor detects changes in sensing film dielectric properties, such as the dielectric constant, conductivity, or dimensionality. These changes result from the interaction of a target analyte with the sensing film. This capability provides a low-power, self-heating chemical sensor suitable for remote and unattended sensing applications. The capacitive chemical sensor also enables a smart, analyte-sensitive chemical preconcentrator. After sorption of the sample by the sensing film, the film can be rapidly heated to release the sample for further analysis. Therefore, the capacitive chemical sensor can optimize the sample collection time prior to release to enable the rapid and accurate analysis of analytes by a microanalytical system.

  7. Effectiveness evaluation of STOL transport operations (phase 2). [computer simulation program of commercial short haul aircraft operations

    NASA Technical Reports Server (NTRS)

    Welp, D. W.; Brown, R. A.; Ullman, D. G.; Kuhner, M. B.

    1974-01-01

    A computer simulation program which models a commercial short-haul aircraft operating in the civil air system was developed. The purpose of the program is to evaluate the effect of a given aircraft avionics capability on the ability of the aircraft to perform on-time carrier operations. The program outputs consist primarily of those quantities which can be used to determine direct operating costs. These include: (1) schedule reliability or delays, (2) repairs/replacements, (3) fuel consumption, and (4) cancellations. More comprehensive models of the terminal area environment were added and a simulation of an existing airline operation was conducted to obtain a form of model verification. The capability of the program to provide comparative results (sensitivity analysis) was then demonstrated by modifying the aircraft avionics capability for additional computer simulations.

  8. Using the MCNP Taylor series perturbation feature (efficiently) for shielding problems

    NASA Astrophysics Data System (ADS)

    Favorite, Jeffrey

    2017-09-01

    The Taylor series or differential operator perturbation method, implemented in MCNP and invoked using the PERT card, can be used for efficient parameter studies in shielding problems. This paper shows how only two PERT cards are needed to generate an entire parameter study, including statistical uncertainty estimates (an additional three PERT cards can be used to give exact statistical uncertainties). One realistic example problem involves a detailed helium-3 neutron detector model and its efficiency as a function of the density of its high-density polyethylene moderator. The MCNP differential operator perturbation capability is extremely accurate for this problem. A second problem involves the density of the polyethylene reflector of the BeRP ball and is an example of first-order sensitivity analysis using the PERT capability. A third problem is an analytic verification of the PERT capability.
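
    The appeal of the differential-operator method is that a handful of Taylor coefficients generate an entire parameter study. The sketch below evaluates R(rho0 + drho) ~ R0 (1 + u1*drho + u2*drho^2) over a density range; u1 and u2 stand in for the relative first- and second-order perturbation coefficients one might obtain from the tallies described above, and every number is invented for illustration, not MCNP output.

      import numpy as np

      r0 = 3.2e-4            # nominal detector response (arbitrary units, assumed)
      u1, u2 = 1.8, -0.6     # relative 1st/2nd-order coefficients per unit density change

      drho = np.linspace(-0.2, 0.2, 9)     # density perturbations (g/cm^3)
      response = r0 * (1.0 + u1 * drho + u2 * drho**2)
      for d, r in zip(drho, response):
          print(f"drho = {d:+.2f}  ->  R = {r:.3e}")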

  9. Sensitivity Analysis of the Static Aeroelastic Response of a Wing

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.

    1993-01-01

    A technique to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model is designed and implemented. The formulation is quite general and accepts any aerodynamic and structural analysis capability. A program to combine the discipline-level, or local, sensitivities into global sensitivity derivatives is developed. A variety of representations of the wing pressure field are developed and tested to determine the most accurate and efficient scheme for representing the field outside of the aerodynamic code. Chebyshev polynomials are used to globally fit the pressure field. This approach had some difficulties in representing local variations in the field, so a variety of local interpolation polynomial pressure representations are also implemented. These panel-based representations use a constant pressure value, a bilinearly interpolated value, or a biquadratically interpolated value. The interpolation polynomial approaches do an excellent job of reducing the numerical problems of the global approach for comparable computational effort. Regardless of the pressure representation used, sensitivity and response results with excellent accuracy have been produced for large integrated quantities such as wing tip deflection and trim angle of attack. The sensitivities of quantities such as individual generalized displacements have been found with fair accuracy. In general, accuracy is found to be proportional to the size of the derivative relative to the quantity itself.
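
    The global Chebyshev strategy can be sketched in a few lines with numpy.polynomial.chebyshev; the synthetic "pressure" below includes a sharp local feature of the kind that motivated the panel-based local interpolants described above.

      import numpy as np
      from numpy.polynomial import chebyshev as C

      x = np.linspace(-1.0, 1.0, 101)      # normalized chordwise coordinate
      # Smooth background plus a sharp local peak (both invented for illustration).
      pressure = np.exp(-x) + 0.8 * np.exp(-200 * (x - 0.3) ** 2)

      coeffs = C.chebfit(x, pressure, deg=12)     # global 12th-degree Chebyshev fit
      fit = C.chebval(x, coeffs)
      print(f"max |error| of global fit: {np.max(np.abs(fit - pressure)):.3f}")

    The residual concentrates around the local peak, which is exactly the difficulty the record reports for the global approach.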

  10. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
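
    One of the three metric families named above, derivative-based elementary effects (Morris), is simple enough to sketch directly; the brute-force random sampling below is for illustration only and has none of VARS-TOOL's single-sample efficiency. The toy model and settings are assumptions.

      import numpy as np

      # Ishigami-like toy model with three inputs of very different influence.
      def model(x):
          return np.sin(x[0]) + 7.0 * np.sin(x[1]) ** 2 + 0.1 * x[2] ** 4 * np.sin(x[0])

      rng = np.random.default_rng(3)
      n_samples, delta, dim = 200, 0.1, 3
      effects = [[] for _ in range(dim)]
      for _ in range(n_samples):
          x = rng.uniform(-np.pi, np.pi, dim)
          for i in range(dim):
              xp = x.copy()
              xp[i] += delta
              effects[i].append((model(xp) - model(x)) / delta)  # elementary effect

      for i, ee in enumerate(effects):
          ee = np.asarray(ee)
          # mu* (mean absolute effect) ranks importance; sigma flags interactions.
          print(f"x{i}: mu* = {np.mean(np.abs(ee)):.2f}, sigma = {np.std(ee):.2f}")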

  11. Future prospects for high resolution X-ray spectrometers

    NASA Technical Reports Server (NTRS)

    Canizares, C. R.

    1981-01-01

    Capabilities of the X-ray spectroscopy payloads were compared, and a comparison of the capabilities of AXAF in the context of the science to be achieved is reported. Einstein demonstrated the tremendous scientific power of spectroscopy to deeply probe the astrophysics of all types of celestial X-ray sources. However, it had limitations in sensitivity and resolution. Each of the straw man instruments has a sensitivity that is at least an order of magnitude better than that of the Einstein FPSC. The AXAF promises powerful spectral capability.

  12. Electrochemical Sensors for Clinic Analysis

    PubMed Central

    Wang, You; Xu, Hui; Zhang, Jianming; Li, Guang

    2008-01-01

    Demanded by modern medical diagnosis, advances in microfabrication technology have led to the development of fast, sensitive and selective electrochemical sensors for clinic analysis. This review addresses the principles behind electrochemical sensor design and fabrication, and introduces recent progress in the application of electrochemical sensors to analysis of clinical chemicals such as blood gases, electrolytes, metabolites, DNA and antibodies, including basic and applied research. Miniaturized commercial electrochemical biosensors will form the basis of inexpensive and easy to use devices for acquiring chemical information to bring sophisticated analytical capabilities to the non-specialist and general public alike in the future. PMID:27879810

  13. High-fidelity modeling and impact footprint prediction for vehicle breakup analysis

    NASA Astrophysics Data System (ADS)

    Ling, Lisa

    For decades, vehicle breakup analysis had been performed for space missions that used nuclear heater or power units in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative to assess possible environmental impacts, obtain launch approval, and for launch contingency planning. In order to accurately perform a vehicle breakup analysis, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify the additional dynamics modeling and capabilities for the analysis tool with the objectives to (1) have the capability to predict impact point and footprint, (2) increase the fidelity in the prediction of vehicle breakup, and (3) reduce the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degrees-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of impact footprint. The functions to increase the fidelity in the prediction of vehicle breakup included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrary-shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.

  14. Operationalising the capability approach as an outcome measure in public health: The development of the OCAP-18.

    PubMed

    Lorgelly, Paula K; Lorimer, Karen; Fenwick, Elisabeth A L; Briggs, Andrew H; Anand, Paul

    2015-10-01

    There is growing interest in operationalising the capability approach to measure quality of life. This paper reports the results of a research project undertaken in 2007 that sought to reduce and refine a longer survey in order to provide a summary measure of wellbeing and capability in the realm of public health. The reduction and refinement of the questionnaire took place across a number of stages, using both qualitative (five focus group discussions and 17 in-depth interviews) and quantitative (secondary data analysis, N = 1048, and primary data collection using postal surveys and interviews, N = 45) approaches. The questionnaire was reduced from its original 60+ questions to 24 questions (including demographic questions). Each of Nussbaum's ten Central Human Capabilities is measured using one (or more) of the 18 specific capability items which are included in the questionnaire (referred to as the OCAP-18). Analysis of the questionnaire responses (N = 198) found that respondents differed with respect to the levels of capabilities they reported, and that these capabilities appear to be sensitive to one's gender, age, income and deprivation decile. An index of capability, estimated by assuming equal weight for each capability question, found that the average level of capability amongst respondents was 12.44 (range 3-17.75). This index was found to be highly correlated with measures of health (EQ-5D) and wellbeing (global QoL), although some differences were apparent. This project operationalised the capability approach to produce an instrument to measure the effectiveness (and cost effectiveness) of public health interventions; the resulting OCAP-18 appears to be responsive and to measure something supplementary to health and wellbeing, and thus offers a promising addition to the current suite of outcome measures that are available. Copyright © 2015 Elsevier Ltd. All rights reserved.
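
    The equal-weight index construction can be sketched as follows; the item coding (1-4 responses rescaled onto 0-1) is a hypothetical assumption made for illustration, since the OCAP-18's actual item scales are what produce the 3-17.75 range reported above.

      import numpy as np

      rng = np.random.default_rng(4)
      responses = rng.integers(1, 5, size=(198, 18))   # respondents x items, assumed 1-4 coding
      rescaled = (responses - 1) / 3.0                 # map each item onto [0, 1]
      index = rescaled.sum(axis=1)                     # equal-weight capability index, 0-18
      print(f"mean index = {index.mean():.2f}, range = {index.min():.2f}-{index.max():.2f}")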

  15. Label-free and high-sensitive detection for genetic point mutation based on hyperspectral interferometry

    NASA Astrophysics Data System (ADS)

    Fu, Rongxin; Li, Qi; Zhang, Junqi; Wang, Ruliang; Lin, Xue; Xue, Ning; Su, Ya; Jiang, Kai; Huang, Guoliang

    2016-10-01

    Label-free point mutation detection is particularly important in biomedical research and clinical diagnosis, since gene mutations occur naturally and bring about highly fatal diseases. In this paper, a label-free and highly sensitive approach is proposed for point mutation detection based on hyperspectral interferometry. A hybridization strategy is designed to discriminate a single-base substitution with sequence-specific DNA ligase. Double-strand structures form only if the added oligonucleotides are perfectly paired to the probe sequence. The proposed approach makes full use of the inherent conformation of double-strand DNA molecules on the substrate, and a spectrum analysis method is established to resolve the sub-nanoscale thickness variation, which benefits highly sensitive mutation detection. The limit of detection reaches 4 pg/mm2 according to the experimental results. Detection of a lung cancer gene point mutation was demonstrated, proving the high selectivity and multiplex analysis capability of the proposed biosensor.

  16. Single quantum dot analysis enables multiplexed point mutation detection by gap ligase chain reaction.

    PubMed

    Song, Yunke; Zhang, Yi; Wang, Tza-Huei

    2013-04-08

    Gene point mutations present important biomarkers for genetic diseases. However, existing point mutation detection methods suffer from low sensitivity and specificity and tedious assay processes. In this report, an assay technology is proposed which combines the outstanding specificity of gap ligase chain reaction (Gap-LCR), the high sensitivity of single-molecule coincidence detection, and the superior optical properties of quantum dots (QDs) for multiplexed detection of point mutations in genomic DNA. Mutant-specific ligation products are generated by Gap-LCR and subsequently captured by QDs to form DNA-QD nanocomplexes that are detected by single-molecule spectroscopy (SMS) through multi-color fluorescence burst coincidence analysis, allowing for multiplexed mutation detection in a separation-free format. The proposed assay is capable of detecting zeptomoles of KRAS codon 12 mutation variants with near 100% specificity. Its high sensitivity allows direct detection of KRAS mutation in crude genomic DNA without PCR pre-amplification. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. A Generalized Perturbation Theory Solver In Rattlesnake Based On PETSc With Application To TREAT Steady State Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunert, Sebastian; Wang, Congjian; Wang, Yaqi

    Rattlesnake and MAMMOTH are the designated TREAT analysis tools currently being developed at the Idaho National Laboratory. Concurrent with development of the multi-physics, multi-scale capabilities, sensitivity analysis and uncertainty quantification (SA/UQ) capabilities are required for predictive modeling of the TREAT reactor. For steady-state SA/UQ, which is essential for setting initial conditions for the transients, generalized perturbation theory (GPT) will be used. This work describes the implementation of a PETSc-based solver for the generalized adjoint equations, which constitute an inhomogeneous, rank-deficient problem. The standard approach is to use an outer iteration strategy with repeated removal of the fundamental mode contamination. The described GPT algorithm directly solves the GPT equations, without the need for an outer iteration procedure, by using Krylov subspaces that are orthogonal to the operator's nullspace. Three test problems are solved and provide sufficient verification of Rattlesnake's GPT capability. We conclude with a preliminary example evaluating the impact of the boron distribution in the TREAT reactor using perturbation theory.

  18. Analytical improvements of hybrid LC-MS/MS techniques for the efficient evaluation of emerging contaminants in river waters: a case study of the Henares River (Madrid, Spain).

    PubMed

    Pérez-Parada, Andrés; Gómez-Ramos, María del Mar; Martínez Bueno, María Jesús; Uclés, Samanta; Uclés, Ana; Fernández-Alba, Amadeo R

    2012-02-01

    Instrumental capabilities and software tools of modern hybrid mass spectrometry (MS) instruments such as high-resolution mass spectrometry (HRMS), quadrupole time-of-flight (QTOF), and quadrupole linear ion trap (QLIT) were experimentally investigated for the study of emerging contaminants in Henares River water samples. Automated screening and confirmatory capabilities of QTOF working in full-scan MS and tandem MS (MS/MS) were explored when dealing with real samples. The influence of sensitivity and resolving power on mass accuracy was investigated for the correct assignment of the amoxicillin transformation product 5(R) amoxicillin-diketopiperazine-2',5' as an example of a nontarget compound. On the other hand, a comparison of quantitative and qualitative strategies based on direct injection analysis and off-line solid-phase extraction sample treatment was carried out using two different QLIT instruments for a selected group of emerging contaminants, operating in selected reaction monitoring (SRM) and information-dependent acquisition (IDA) modes. Software-aided screening usually needs a further confirmatory step. The resolving power and MS/MS capabilities of QTOF were shown to confirm or reject most findings in river water, although sensitivity-related limitations are usually found. The superior sensitivity of modern QLIT-MS/MS offered the possibility of direct injection analysis for proper quantitative study of a variety of contaminants, while simultaneously reducing the matrix effect and increasing the reliability of the results. Confirmation of ethylamphetamine, which lacks a second SRM transition, was accomplished by using the IDA feature. Hybrid MS instruments equipped with high resolution and high sensitivity help enlarge the scope of targeted analytes in river waters. However, in the tested instruments, there is a margin for improvement, principally in the required sensitivity and in data treatment software tools devoted to reliable confirmation and improved automated data processing.

  19. Selection of optimal sensors for predicting performance of polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Mao, Lei; Jackson, Lisa

    2016-10-01

    In this paper, sensor selection algorithms based on a sensitivity analysis are investigated, and the capability of the optimal sensors to predict PEM fuel cell performance is studied using test data. A fuel cell model is developed for generating the sensitivity matrix relating sensor measurements to fuel cell health parameters. From the sensitivity matrix, two sensor selection approaches, the largest gap method and an exhaustive brute-force search, are applied to find the optimal sensors providing reliable predictions. Based on the results, a sensor selection approach considering both sensor sensitivity and noise resistance is proposed to find the optimal sensor set of minimum size. Furthermore, the performance of the optimal sensor set in predicting fuel cell performance is studied using test data from a PEM fuel cell system. Results demonstrate that with the optimal sensors, the performance of the PEM fuel cell can be predicted with good quality.
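
    An exhaustive (brute-force) search of the kind mentioned above can be sketched directly on a sensitivity matrix. The selection criterion used here (the subset whose sub-matrix has the largest smallest singular value, i.e. the best-conditioned estimation problem) and the matrix itself are illustrative assumptions; the paper's own criterion also weighs noise resistance.

      import numpy as np
      from itertools import combinations

      rng = np.random.default_rng(5)
      S = rng.standard_normal((8, 3))   # rows: 8 candidate sensors, cols: 3 health parameters

      def best_subset(S, size):
          # Keep the sensor subset whose rows give the best-conditioned sub-matrix.
          best, best_score = None, -np.inf
          for rows in combinations(range(S.shape[0]), size):
              score = np.linalg.svd(S[list(rows)], compute_uv=False)[-1]
              if score > best_score:
                  best, best_score = rows, score
          return best, best_score

      subset, score = best_subset(S, size=4)
      print(f"best 4-sensor subset: {subset}  (smallest singular value {score:.3f})")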

  20. Significance of dual polarized long wavelength radar for terrain analysis

    NASA Technical Reports Server (NTRS)

    Macdonald, H. C.; Waite, W. P.

    1978-01-01

    Long wavelength systems with improved penetration capability have been considered to have the potential for minimizing the vegetation contribution and enhancing the surface return variations. L-band imagery of the Arkansas geologic test site provides confirmatory evidence of this effect. However, the increased wavelength increases the sensitivity to larger scale structure at relatively small incidence angles. The regularity of agricultural and urban scenes provides large components in the low frequency-large scale portion of the roughness spectrum that are highly sensitive to orientation. The addition of a cross polarized channel is shown to enable the interpreter to distinguish vegetation and orientational perturbations in the surface return.

  1. Thermo-elastic optical coherence tomography.

    PubMed

    Wang, Tianshi; Pfeiffer, Tom; Wu, Min; Wieser, Wolfgang; Amenta, Gaetano; Draxinger, Wolfgang; van der Steen, Antonius F W; Huber, Robert; Soest, Gijs van

    2017-09-01

    The absorption of nanosecond laser pulses induces rapid thermo-elastic deformation in tissue. A sub-micrometer scale displacement occurs within a few microseconds after the pulse arrival. In this Letter, we investigate the laser-induced thermo-elastic deformation using a 1.5 MHz phase-sensitive optical coherence tomography (OCT) system. A displacement image can be reconstructed, which enables a new modality of phase-sensitive OCT, called thermo-elastic OCT. An analysis of the results shows that the optical absorption is a dominating factor for the displacement. Thermo-elastic OCT is capable of visualizing inclusions that do not appear on the structural OCT image, providing additional tissue type information.

  2. Influence of the carrier-envelope phase of few-cycle pulses on ponderomotive surface-plasmon electron acceleration.

    PubMed

    Irvine, S E; Dombi, P; Farkas, Gy; Elezzabi, A Y

    2006-10-06

    Control over basic processes through the electric field of a light wave can lead to new knowledge of fundamental light-matter interaction phenomena. We demonstrate, for the first time, that surface-plasmon (SP) electron acceleration can be coherently controlled through the carrier-envelope phase (CEP) of an excitation optical pulse. Analysis indicates that the physical origin of the CEP sensitivity arises from the electron's ponderomotive interaction with the oscillating electromagnetic field of the SP wave. The ponderomotive electron acceleration mechanism provides sensitive (nJ energies), high-contrast, single-shot CEP measurement capability of few-cycle laser pulses.

  3. Detection of chitinase activity by 2-aminobenzoic acid labeling of chito-oligosaccharides.

    PubMed

    Ghauharali-van der Vlugt, Karen; Bussink, Anton P; Groener, Johanna E M; Boot, Rolf G; Aerts, Johannes M F G

    2009-01-01

    Chitinases are hydrolases capable of hydrolyzing the abundant natural polysaccharide chitin. Next to artificial fluorescent substrates, more physiological chito-oligomers are commonly used in chitinase assays. Analysis of chito-oligosaccharide products is generally accomplished by UV detection; however, the relatively poor sensitivity poses a serious limitation. Here we report on a novel, much more sensitive assay for the detection of chito-oligosaccharide reaction products released by chitinases, based on fluorescence detection following chemical labeling with 2-aminobenzoic acid. Comparison with existing UV-based assays shows that the novel assay offers the same advantages yet allows detection of chito-oligosaccharides in the low picomolar range.

  4. Quantifying the sensitivity of feedstock properties and process conditions on hydrochar yield, carbon content, and energy content.

    PubMed

    Li, Liang; Wang, Yiying; Xu, Jiting; Flora, Joseph R V; Hoque, Shamia; Berge, Nicole D

    2018-08-01

    Hydrothermal carbonization (HTC) is a wet, low-temperature thermal conversion process that continues to gain attention for the generation of hydrochar. The influence of specific process conditions and feedstock properties on hydrochar characteristics is not well understood. To evaluate this, linear and non-linear models were developed to describe hydrochar characteristics based on data collected from HTC-related literature. A Sobol analysis was subsequently conducted to identify the parameters that most influence hydrochar characteristics. Results from this analysis indicate that for each investigated hydrochar property, the model fit and predictive capability associated with the random forest models is superior to both the linear and regression tree models. Based on results from the Sobol analysis, the feedstock properties and process conditions most influential on hydrochar yield, carbon content, and energy content were identified. In addition, a variational process parameter sensitivity analysis was conducted to determine how feedstock property importance changes with process conditions.
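
    To make the Sobol step concrete, here is a minimal Saltelli-type estimator of first-order indices in Python, run on a toy stand-in function; the study's random forest models and data are not reproduced.

        # First-order Sobol indices via the Saltelli (2010) estimator,
        # S1_i = mean(f(B) * (f(AB_i) - f(A))) / Var(Y), on a toy model.
        import numpy as np

        def model(x):  # stand-in for a fitted hydrochar-property model
            return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

        rng = np.random.default_rng(1)
        n, d = 100_000, 3
        A = rng.uniform(0, 1, (n, d))
        B = rng.uniform(0, 1, (n, d))
        fA, fB = model(A), model(B)
        var_y = np.var(np.concatenate([fA, fB]))

        for i in range(d):
            AB_i = A.copy()
            AB_i[:, i] = B[:, i]      # resample only input i
            S1 = np.mean(fB * (model(AB_i) - fA)) / var_y
            print(f"S1[x{i}] ~ {S1:.3f}")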

  5. Towards point of care testing for C. difficile infection by volatile profiling, using the combination of a short multi-capillary gas chromatography column with metal oxide sensor detection

    NASA Astrophysics Data System (ADS)

    McGuire, N. D.; Ewen, R. J.; de Lacy Costello, B.; Garner, C. E.; Probert, C. S. J.; Vaughan, K.; Ratcliffe, N. M.

    2014-06-01

    Rapid volatile profiling of stool sample headspace was achieved using a combination of a short multi-capillary chromatography column (SMCC), a highly sensitive heated metal oxide semiconductor sensor and artificial neural network software. For direct analysis of biological samples this prototype offers alternatives to conventional gas chromatography (GC) detectors and electronic nose technology. The performance was compared to an identical instrument incorporating a long single capillary column (LSCC). The ability of the prototypes to separate complex mixtures was assessed using gas standards and homogenized in-house ‘standard’ stool samples, with both capable of detecting more than 24 peaks per sample. The elution time was considerably faster with the SMCC, resulting in a run time of 10 min compared to 30 min for the LSCC. The diagnostic potential of the prototypes was assessed using 50 C. difficile positive and 50 negative samples. The prototypes demonstrated similar capability of discriminating between positive and negative samples, with sensitivity and specificity of 85% and 80%, respectively. C. difficile is an important cause of hospital-acquired diarrhoea, with significant morbidity and mortality around the world. A device capable of rapidly diagnosing the disease at the point of care would reduce cases, deaths and financial burden.
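
    For reference, the quoted diagnostic figures follow directly from a confusion matrix; the counts below are assumed for illustration and are not the study's raw calls.

        # Worked example of sensitivity/specificity from assumed counts
        # (50 positive and 50 negative samples, as in the study design).
        tp, fn = 42, 8    # positives called correctly / missed
        tn, fp = 40, 10   # negatives called correctly / false alarms

        sensitivity = tp / (tp + fn)  # 0.84, close to the reported ~85%
        specificity = tn / (tn + fp)  # 0.80, as reported
        print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")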

  6. Towards point of care testing for C. difficile infection by volatile profiling, using the combination of a short multi-capillary gas chromatography column with metal oxide sensor detection

    PubMed Central

    McGuire, N D; Ewen, R J; de Lacy Costello, B; Garner, C E; Probert, C S J; Vaughan, K.; Ratcliffe, N M

    2016-01-01

    Rapid volatile profiling of stool sample headspace was achieved using a combination of a short multi-capillary chromatography column (SMCC), a highly sensitive heated metal oxide semiconductor (MOS) sensor and artificial neural network (ANN) software. For direct analysis of biological samples this prototype offers alternatives to conventional GC detectors and electronic nose technology. The performance was compared to an identical instrument incorporating a long single capillary column (LSCC). The ability of the prototypes to separate complex mixtures was assessed using gas standards and homogenised in-house ‘standard’ stool samples, with both capable of detecting more than 24 peaks per sample. The elution time was considerably faster with the SMCC, resulting in a run time of 10 minutes compared to 30 minutes for the LSCC. The diagnostic potential of the prototypes was assessed using 50 C. difficile positive and 50 negative samples. The prototypes demonstrated similar capability of discriminating between positive and negative samples, with sensitivity and specificity of 85% and 80%, respectively. C. difficile is an important cause of hospital-acquired diarrhoea, with significant morbidity and mortality around the world. A device capable of rapidly diagnosing the disease at the point of care would reduce cases, deaths and financial burden. PMID:27212803

  7. Dynamic analysis of a photovoltaic power system with battery storage capability

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.; Blaha, R. J.; Pickrell, R. L.

    1979-01-01

    A photovoltaic power system with a battery storage capability is analyzed. A dual battery current control concept is proposed, which enables the battery to either supply or accept power depending upon system environment and load conditions. A simulation of the power system, including the battery current control, is developed and evaluated. The evaluation demonstrates the viability of the battery control concept of switching the battery from a charge to a discharge mode and back as required by load and environmental conditions. Acceptable system operation is demonstrated over the entire insolation range. Additionally, the system sensitivity, bandwidth, and damping characteristics of the battery control are shown to be acceptable for a projected hardware implementation.

  8. Cosmic ray composition investigations using ICE/ISEE-3

    NASA Technical Reports Server (NTRS)

    Wiedenbeck, Mark E.

    1992-01-01

    The analysis of data from the high-energy cosmic ray experiment on ISEE-3 and associated modeling and interpretation activities are discussed. The ISEE-3 payload included two instruments capable of measuring the composition of heavy cosmic rays. The designs of these two instruments incorporated innovations which made it possible, for the first time, to measure the isotopic as well as the chemical composition for a wide range of elements. As a result of these two instruments' demonstrated capability to resolve individual cosmic ray isotopes, a new generation of detectors was developed using very similar designs but with improved reliability and increased sensitive area. The composition measurements obtained from the ISEE-3 experiment are summarized.

  9. Metamaterial split ring resonator as a sensitive mechanical vibration sensor

    NASA Astrophysics Data System (ADS)

    Sikha Simon, K.; Chakyar, Sreedevi P.; Andrews, Jolly; Joseph V., P.

    2017-06-01

    This paper introduces a sensitive vibration sensor based on a microwave metamaterial Split Ring Resonator (SRR) capable of detecting any ground vibration. The experimental setup consists of a single Broad-side Coupled SRR (BCSRR) unit fixed on a cantilever that responds sensitively to vibrations. It is arranged between the transmitting and receiving probes of a microwave measurement system. The variation in absorption level at the resonant frequency due to displacement of the SRR from the reference plane, which is a function of the strength of the external mechanical vibration, is analyzed. This portable and cost-effective sensor, working at a single frequency, is observed to be capable of detecting even very weak vibrations. It may find potential applications in the fields of tamper-proofing, mining, quarrying and earthquake sensing.

  10. Digital Avionics Information System (DAIS): Life Cycle Cost Impact Modeling System Reliability, Maintainability, and Cost Model (RMCM)--Description. Users Guide. Final Report.

    ERIC Educational Resources Information Center

    Goclowski, John C.; And Others

    The Reliability, Maintainability, and Cost Model (RMCM) described in this report is an interactive mathematical model with a built-in sensitivity analysis capability. It is a major component of the Life Cycle Cost Impact Model (LCCIM), which was developed as part of the DAIS advanced development program to be used to assess the potential impacts…

  11. Reflections of a Wave: An Analysis of Photonic Doppler Velocimetry Systems

    DTIC Science & Technology

    2015-03-16

    system employed by the Advanced Initiation Sciences team (Munitions Directorate, AFRL) is capable of explosive sensitivity testing. The errors from... 1961, experiments proved that Semenov Theory aligned well with well-stirred liquids [25, p. 179]. In combat applications, the military usually utilizes... solid explosives instead of liquid ones due to the higher stability of solid-molded explosives, where conduction has a huge influence on initiation

  12. Bessel beam OCM for analysis of global ischemia in mouse brain

    NASA Astrophysics Data System (ADS)

    Rapolu, Mounika; Dolezyczek, Hubert; Tamborski, Szymon; Malinowska, Monika; Wilczynski, Grzegorz; Szkulmowski, Maciej; Wojtkowski, Maciej

    2017-07-01

    We present in-vivo imaging of global mouse brain ischemia using Bessel beam optical coherence microscopy (OCM). This method allows changes in brain structure to be monitored, with additional monitoring of blood flow, during artery occlusion. The results show the capability and sensitivity of the Bessel beam OCM system for analyzing brain plasticity after severe injury over a period of 8 days.

  13. Fractal dimension based damage identification incorporating multi-task sparse Bayesian learning

    NASA Astrophysics Data System (ADS)

    Huang, Yong; Li, Hui; Wu, Stephen; Yang, Yongchao

    2018-07-01

    Sensitivity to damage and robustness to noise are critical requirements for the effectiveness of structural damage detection. In this study, a two-stage damage identification method based on fractal dimension analysis and multi-task Bayesian learning is presented. A damage index based on Higuchi’s fractal dimension (HFD) is first proposed, directly examining the time-frequency characteristics of local free vibration data of structures, motivated by an analysis of HFD's irregularity sensitivity and noise robustness. Katz’s fractal dimension is then used to analyze abrupt irregularity changes of the spatial curve of the displacement mode shape along the structure. At the second stage, the multi-task sparse Bayesian learning technique is employed to infer the final damage localization vector; it borrows strength from the two fractal-dimension-based damage indicators and also incorporates the prior knowledge that structural damage occurs at a limited number of locations in a structure in the absence of collapse. To validate the capability of the proposed method, a steel beam and a bridge, the Yonghe Bridge, are analyzed as illustrative examples. The damage identification results demonstrate that the proposed method is capable of localizing single and multiple instances of damage regardless of severity, and shows superior robustness under heavy noise.
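
    A sketch of Higuchi's fractal dimension, the quantity behind the first-stage damage index, is given below for a generic 1-D signal; kmax and the test signal are illustrative choices, not the paper's settings.

        # Higuchi's fractal dimension (HFD) of a 1-D signal: curve lengths
        # L(k) at coarser samplings scale as k^(-D); D is the fitted slope.
        import numpy as np

        def higuchi_fd(x, kmax=10):
            N = len(x)
            log_invk, log_L = [], []
            for k in range(1, kmax + 1):
                Lk = []
                for m in range(k):                    # subsampling offsets
                    n_max = (N - 1 - m) // k          # number of increments
                    if n_max < 1:
                        continue
                    idx = m + np.arange(n_max + 1) * k
                    length = np.abs(np.diff(x[idx])).sum()
                    # Higuchi's normalization of the curve length
                    Lk.append(length * (N - 1) / (n_max * k * k))
                log_invk.append(np.log(1.0 / k))
                log_L.append(np.log(np.mean(Lk)))
            slope, _ = np.polyfit(log_invk, log_L, 1)  # HFD is the slope
            return slope

        noise = np.random.default_rng(2).normal(size=2048)
        print(f"HFD of white noise ~ {higuchi_fd(noise):.2f}")  # near 2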

  14. Secondary ion mass spectrometers (SIMS) for calcium isotope measurements as an application to biological samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craven, S.M.; Hoenigman, J.R.; Moddeman, W.E.

    1981-11-20

    The potential use of secondary ion mass spectroscopy (SIMS) to analyze biological samples for calcium isotopes is discussed. Comparison of UTI and Extranuclear based quadrupole systems is made on the basis of the analysis of CaO and calcium metal. The Extranuclear quadrupole based system is superior in resolution and sensitivity to the UTI system and is recommended. For determination of calcium isotopes to within an accuracy of a few percent a high resolution quadrupole, such as the Extranuclear, and signal averaging capability are required. Charge neutralization will be mandated for calcium oxide, calcium nitrate, or calcium oxalate. SIMS is not capable of the high precision and high accuracy results possible by thermal ionization methods, but where faster analysis is desirable with an accuracy of a few percent, SIMS is a viable alternative.

  15. FY17 Status Report on NEAMS Neutronics Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, C. H.; Jung, Y. S.; Smith, M. A.

    2017-09-30

    Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.

  16. A direct electron detector for time-resolved MeV electron microscopy

    DOE PAGES

    Vecchione, T.; Denes, P.; Jobe, R. K.; ...

    2017-03-15

    The introduction of direct electron detectors enabled the structural biology revolution of cryogenic electron microscopy. Direct electron detectors are now expected to have a similarly dramatic impact on time-resolved MeV electron microscopy, particularly by enabling both spatial and temporal jitter correction. Here we report on the commissioning of a direct electron detector for time-resolved MeV electron microscopy. The direct electron detector demonstrated MeV single electron sensitivity and is capable of recording megapixel images at 180 Hz. The detector has a 15-bit dynamic range, better than 30-μm spatial resolution and less than 20 analogue-to-digital converter count RMS pixel noise. The unique capabilities of the direct electron detector and the data analysis required to take advantage of these capabilities are presented. The technical challenges associated with generating and processing large amounts of data are also discussed.

  17. A direct electron detector for time-resolved MeV electron microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vecchione, T.; Denes, P.; Jobe, R. K.

    The introduction of direct electron detectors enabled the structural biology revolution of cryogenic electron microscopy. Direct electron detectors are now expected to have a similarly dramatic impact on time-resolved MeV electron microscopy, particularly by enabling both spatial and temporal jitter correction. Here we report on the commissioning of a direct electron detector for time-resolved MeV electron microscopy. The direct electron detector demonstrated MeV single electron sensitivity and is capable of recording megapixel images at 180 Hz. The detector has a 15-bit dynamic range, better than 30-μm spatial resolution and less than 20 analogue-to-digital converter count RMS pixel noise. The unique capabilities of the direct electron detector and the data analysis required to take advantage of these capabilities are presented. The technical challenges associated with generating and processing large amounts of data are also discussed.

  19. Toward Improved Force-Field Accuracy through Sensitivity Analysis of Host-Guest Binding Thermodynamics

    PubMed Central

    Yin, Jian; Fenley, Andrew T.; Henriksen, Niel M.; Gilson, Michael K.

    2015-01-01

    Improving the capability of atomistic computer models to predict the thermodynamics of noncovalent binding is critical for successful structure-based drug design, and the accuracy of such calculations remains limited by non-optimal force field parameters. Ideally, one would incorporate protein-ligand affinity data into force field parametrization, but this would be inefficient and costly. We now demonstrate that sensitivity analysis can be used to efficiently tune Lennard-Jones parameters of aqueous host-guest systems for increasingly accurate calculations of binding enthalpy. These results highlight the promise of a comprehensive use of calorimetric host-guest binding data, along with existing validation data sets, to improve force field parameters for the simulation of noncovalent binding, with the ultimate goal of making protein-ligand modeling more accurate and hence speeding drug discovery. PMID:26181208

  20. Rare cell isolation and analysis in microfluidics

    PubMed Central

    Chen, Yuchao; Li, Peng; Huang, Po-Hsun; Xie, Yuliang; Mai, John D.; Wang, Lin; Nguyen, Nam-Trung; Huang, Tony Jun

    2014-01-01

    Rare cells are low-abundance cells in a much larger population of background cells. Conventional benchtop techniques have limited capabilities to isolate and analyze rare cells because of their generally low selectivity and significant sample loss. Recent rapid advances in microfluidics have been providing robust solutions to the challenges in the isolation and analysis of rare cells. In addition to the apparent performance enhancements resulting in higher efficiencies and sensitivity levels, microfluidics provides other advanced features such as simpler handling of small sample volumes and multiplexing capabilities for high-throughput processing. All of these advantages make microfluidics an excellent platform to deal with the transport, isolation, and analysis of rare cells. Various cellular biomarkers, including physical properties, dielectric properties, as well as immunoaffinities, have been explored for isolating rare cells. In this Focus article, we discuss the design considerations of representative microfluidic devices for rare cell isolation and analysis. Examples from recently published works are discussed to highlight the advantages and limitations of the different techniques. Various applications of these techniques are then introduced. Finally, a perspective on the development trends and promising research directions in this field are proposed. PMID:24406985

  1. Direct Detection of Nucleic Acid with Minimizing Background and Improving Sensitivity Based on a Conformation-Discriminating Indicator.

    PubMed

    Zhu, Lixuan; Qing, Zhihe; Hou, Lina; Yang, Sheng; Zou, Zhen; Cao, Zhong; Yang, Ronghua

    2017-08-25

    As is well known, the nucleic acid indicator-based strategy is one of the major approaches to monitoring nucleic acid hybridization-mediated recognition events in biochemical analysis, displaying obvious advantages including simplicity, low cost, convenience, and generality. However, conventional indicators either exhibit strong self-fluorescence or can be lighted up by both ssDNA and dsDNA, lacking absolute selectivity for a given conformation, and thus suffer from high background interference and low sensitivity in sensing; additional processing (e.g., nanomaterial-mediated background suppression or enzyme-catalyzed signal amplification) is generally required to improve detection performance. In this work, a carbazole derivative, EBCB, has been synthesized and screened as a dsDNA-specific fluorescent indicator. Compared with conventional indicators under the same conditions, EBCB displayed a much higher selectivity coefficient for dsDNA, with little self-fluorescence and negligible effect from ssDNA. Based on its superior capability for discriminating DNA conformations, high sensitivity with minimal background interference was demonstrated for direct detection of nucleic acid and for monitoring nucleic acid-based circuitry with good reversibility, resulting in a low detection limit and high capability for discriminating base mismatching. Thus, we expect that this highly specific DNA conformation-discriminating indicator holds good potential for application in biochemical sensing and molecular logic switching.

  2. Value of high-sensitivity C-reactive protein assays in predicting atrial fibrillation recurrence: a systematic review and meta-analysis.

    PubMed

    Yo, Chia-Hung; Lee, Si-Huei; Chang, Shy-Shin; Lee, Matthew Chien-Hung; Lee, Chien-Chang

    2014-02-20

    We performed a systematic review and meta-analysis of studies on high-sensitivity C-reactive protein (hs-CRP) assays to see whether these tests are predictive of atrial fibrillation (AF) recurrence after cardioversion. Systematic review and meta-analysis. PubMed, EMBASE and Cochrane databases as well as a hand search of the reference lists in the retrieved articles from inception to December 2013. This review selected observational studies in which the measurements of serum CRP were used to predict AF recurrence. An hs-CRP assay was defined as any CRP test capable of measuring serum CRP to below 0.6 mg/dL. We summarised test performance characteristics with the use of forest plots, hierarchical summary receiver operating characteristic curves and bivariate random effects models. Meta-regression analysis was performed to explore the source of heterogeneity. We included nine qualifying studies comprising a total of 347 patients with AF recurrence and 335 controls. A CRP level higher than the optimal cut-off point was an independent predictor of AF recurrence after cardioversion (summary adjusted OR: 3.33; 95% CI 2.10 to 5.28). The estimated pooled sensitivity and specificity for hs-CRP was 71.0% (95% CI 63% to 78%) and 72.0% (61% to 81%), respectively. Most studies used a CRP cut-off point of 1.9 mg/L to predict long-term AF recurrence (77% sensitivity, 65% specificity), and 3 mg/L to predict short-term AF recurrence (73% sensitivity, 71% specificity). hs-CRP assays are moderately accurate in predicting AF recurrence after successful cardioversion.
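
    To make the pooling step concrete, the sketch below applies a simplified fixed-effect pooling of per-study sensitivities on the logit scale to invented counts; the review itself used bivariate random-effects models, which this sketch does not reproduce.

        # Inverse-variance pooling of study sensitivities on the logit scale.
        # Study counts (true positives, false negatives) are invented.
        import math

        studies = [(30, 12), (41, 15), (25, 11), (55, 24)]

        num = den = 0.0
        for tp, fn in studies:
            p = tp / (tp + fn)                # per-study sensitivity
            logit = math.log(p / (1 - p))
            var = 1 / tp + 1 / fn             # variance of logit(p)
            num += logit / var                # inverse-variance weighting
            den += 1 / var

        pooled = num / den
        se = math.sqrt(1 / den)
        inv = lambda L: 1 / (1 + math.exp(-L))
        print(f"pooled sensitivity = {inv(pooled):.2f} "
              f"(95% CI {inv(pooled - 1.96 * se):.2f}-{inv(pooled + 1.96 * se):.2f})")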

  3. CORSSTOL: Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity

    NASA Technical Reports Server (NTRS)

    Finckenor, J.; Bevill, M.

    1995-01-01

    Cylinder Optimization of Rings, Skin, and Stringers with Tolerance sensitivity (CORSSTOL) is a design optimization program incorporating a method to examine the effects of user-provided manufacturing tolerances on weight and failure. CORSSTOL gives designers a tool to determine tolerances based on need. This is a decisive way to choose the best design among several manufacturing methods with differing capabilities and costs. CORSSTOL initially optimizes a stringer-stiffened cylinder for weight without tolerances. The skin and stringer geometry are varied, subject to stress and buckling constraints. Then the same analysis and optimization routines are used to minimize the maximum-material-condition weight subject to the least favorable combination of tolerances. The adjusted optimum dimensions are provided with the weight and constraint sensitivities of each design variable. The designer can immediately identify critical tolerances. The safety of parts made out of tolerance can also be determined. During design and development of weight-critical systems, design/analysis tools that provide product-oriented results are of vital significance. The development of this program and methodology provides designers with an effective cost- and weight-saving design tool. The tolerance sensitivity method can be applied to any system defined by a set of deterministic equations.

  4. Probabilistic structural analysis using a general purpose finite element program

    NASA Astrophysics Data System (ADS)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
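
    As a flavor of the Monte Carlo verification described above, the sketch below estimates a probabilistic response for a cantilever under a random tip load; the geometry, material, load statistics, and displacement limit are assumed for illustration, not taken from the paper.

        # Monte Carlo estimate of P(response > limit) for a cantilever with
        # a random tip load; all numbers are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(5)
        E, I, L = 70e9, 8.0e-5, 2.0               # Pa, m^4, m (deterministic)
        P = rng.normal(10e3, 1.5e3, 1_000_000)    # random load, N

        delta = P * L**3 / (3 * E * I)            # tip deflection, m
        limit = 5.5e-3                            # allowable deflection, m
        pf = np.mean(delta > limit)
        print(f"P(deflection > {limit * 1e3:.1f} mm) ~ {pf:.3f}")  # ~0.15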

  5. Investigating the Water Vapor Component of the Greenhouse Effect from the Atmospheric InfraRed Sounder (AIRS)

    NASA Astrophysics Data System (ADS)

    Gambacorta, A.; Barnet, C.; Sun, F.; Goldberg, M.

    2009-12-01

    We investigate the water vapor component of the greenhouse effect in the tropical region using data from the Atmospheric InfraRed Sounder (AIRS). Unlike previous studies, which have relied on the assumption of a constant lapse rate and performed coarse-layer or total-column sensitivity analyses, we exploit AIRS's high vertical resolution to measure the sensitivity of the greenhouse effect to water vapor along the vertical column. We employ a "partial radiative perturbation" methodology and discriminate between two different dynamic regimes, convective and non-convective. This analysis provides useful insights on the occurrence and strength of the water vapor greenhouse effect and its sensitivity to spatial variations of surface temperature. By comparison with the clear-sky computations conducted in previous works, we attempt to constrain an estimate of the cloud contribution to the greenhouse effect. Our results compare well with the current literature, falling in the upper range of the existing global circulation model estimates. We regard the results of this analysis as a useful reference to help discriminate among model simulations and improve our capability to make predictions about the future of our climate.

  6. Resilience through adaptation

    PubMed Central

    van Voorn, George A. K.; Ligtenberg, Arend; Molenaar, Jaap

    2017-01-01

    Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover’s distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system. PMID:28196372
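
    The core comparison step, the earth-mover's distance between output distributions with and without adaptation, can be sketched as follows; the ABM itself is replaced here by made-up output samples.

        # Earth-mover's (Wasserstein-1) distance between two sets of ABM
        # output samples; the samples below are stand-ins, not model output.
        import numpy as np
        from scipy.stats import wasserstein_distance

        rng = np.random.default_rng(3)
        without_adaptation = rng.normal(loc=40.0, scale=8.0, size=5000)
        with_adaptation = rng.normal(loc=55.0, scale=6.0, size=5000)

        d = wasserstein_distance(without_adaptation, with_adaptation)
        print(f"earth-mover's distance between output distributions: {d:.2f}")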

  7. Resilience through adaptation.

    PubMed

    Ten Broeke, Guus A; van Voorn, George A K; Ligtenberg, Arend; Molenaar, Jaap

    2017-01-01

    Adaptation of agents through learning or evolution is an important component of the resilience of Complex Adaptive Systems (CAS). Without adaptation, the flexibility of such systems to cope with outside pressures would be much lower. To study the capabilities of CAS to adapt, social simulations with agent-based models (ABMs) provide a helpful tool. However, the value of ABMs for studying adaptation depends on the availability of methodologies for sensitivity analysis that can quantify resilience and adaptation in ABMs. In this paper we propose a sensitivity analysis methodology that is based on comparing time-dependent probability density functions of output of ABMs with and without agent adaptation. The differences between the probability density functions are quantified by the so-called earth-mover's distance. We use this sensitivity analysis methodology to quantify the probability of occurrence of critical transitions and other long-term effects of agent adaptation. To test the potential of this new approach, it is used to analyse the resilience of an ABM of adaptive agents competing for a common-pool resource. Adaptation is shown to contribute positively to the resilience of this ABM. If adaptation proceeds sufficiently fast, it may delay or avert the collapse of this system.

  8. Expanding the occupational health methodology: A concatenated artificial neural network approach to model the burnout process in Chinese nurses.

    PubMed

    Ladstätter, Felix; Garrosa, Eva; Moreno-Jiménez, Bernardo; Ponsoda, Vicente; Reales Aviles, José Manuel; Dai, Junming

    2016-01-01

    Artificial neural networks are sophisticated modelling and prediction tools capable of extracting complex, non-linear relationships between predictor (input) and predicted (output) variables. This study explores this capacity by modelling non-linearities in the hardiness-modulated burnout process with a neural network. Specifically, two multi-layer feed-forward artificial neural networks are concatenated in an attempt to model the composite non-linear burnout process. Sensitivity analysis, a Monte Carlo-based global simulation technique, is then utilised to examine the first-order effects of the predictor variables on the burnout sub-dimensions and consequences. Results show that (1) this concatenated artificial neural network approach is feasible for modelling the burnout process, (2) sensitivity analysis is a productive method to study the relative importance of predictor variables and (3) the relationships among variables involved in the development of burnout and its consequences are non-linear to differing degrees. Many relationships among variables (e.g., stressors and strains) are not linear, yet researchers use linear methods such as Pearson correlation or linear regression to analyse these relationships. Artificial neural network analysis is an innovative method to analyse non-linear relationships and, in combination with sensitivity analysis, is superior to linear methods.
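
    A hedged sketch of one simulation-based way to screen first-order effects of a trained feed-forward network, by scrambling one input at a time, is given below; the data, network size, and variable roles are stand-ins, and the study's concatenated-network setup and specific Monte Carlo procedure are not reproduced.

        # One-at-a-time input scrambling on a trained network: the drop in
        # fit when an input is permuted indicates that input's influence.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(4)
        X = rng.uniform(0, 1, (2000, 3))   # stand-in predictor scores
        y = X[:, 0] + np.sin(3 * X[:, 1]) + 0.1 * rng.normal(size=2000)

        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                           random_state=0).fit(X, y)
        base = net.score(X, y)             # R^2 of the fitted network

        for i in range(X.shape[1]):
            Xs = X.copy()
            Xs[:, i] = rng.permutation(Xs[:, i])  # break input-output link
            print(f"input {i}: drop in R^2 = {base - net.score(Xs, y):.3f}")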

  9. Arts, literature and reflective writing as educational strategies to promote narrative reasoning capabilities among physiotherapy students.

    PubMed

    Caeiro, Carmen; Cruz, Eduardo Brazete; Pereira, Carla Mendes

    2014-11-01

    The use of arts, literature and reflective writing has become increasingly popular in health professions education. However, research examining its contribution as an educational strategy to promote narrative reasoning capabilities is limited, particularly from the students' perspective. This study aimed to explore final-year physiotherapy students' perspectives on the contribution of arts, literature and reflective writing to facilitating narrative reasoning capabilities. Three focus group meetings using a semi-structured interview schedule were carried out to collect data. Focus group sessions were audiotaped and transcribed verbatim. Interpretative phenomenological analysis was used to conduct the study and analyze the transcripts. Three themes emerged: (1) developmental understanding of the patients' experiences; (2) developmental understanding about the self; and (3) embedding reflection in clinical practice. Students emphasized an increasing capability to be sensitive to and vicariously experience the patient's experience. Through reflective writing, students reported that they became more capable of thinking critically about their practice and their learning needs for continuous professional development. Finally, students highlighted the contribution of these strategies to making reflection part of their practice. Final-year physiotherapy students reported enhanced skills of narrative reasoning. The findings support the inclusion of these strategies within undergraduate physiotherapy curricula.

  10. Signal Amplification Technologies for the Detection of Nucleic Acids: from Cell-Free Analysis to Live-Cell Imaging.

    PubMed

    Fozooni, Tahereh; Ravan, Hadi; Sasan, Hosseinali

    2017-12-01

    Due to their unique properties, such as programmability, ligand-binding capability, and flexibility, nucleic acids can serve as analytes and/or recognition elements for biosensing. To improve the sensitivity of nucleic acid-based biosensing, and hence the detection of a few copies of a target molecule, different modern amplification methodologies, namely target- and signal-based amplification strategies, have been developed. Recent signal amplification technologies, which are capable of amplifying the signal intensity without changing the target's copy number, have resulted in fast, reliable, and sensitive methods for nucleic acid detection. Working in cell-free settings, researchers have been able to optimize a variety of complex and quantitative methods suitable for deployment in live-cell conditions. In this study, a comprehensive review of signal amplification technologies for the detection of nucleic acids is provided. We classify the signal amplification methodologies into enzymatic and non-enzymatic strategies, with a primary focus on the methods that enable us to shift from in vitro detection to in vivo imaging. Finally, the future challenges and limitations of detection under cellular conditions are discussed.

  11. Breakthrough Capability for UVOIR Space Astronomy: Reaching the Darkest Sky

    NASA Technical Reports Server (NTRS)

    Greenhouse, Matthew A.; Benson, Scott W.; Englander, Jacob; Falck, Robert D.; Fixsen, Dale J.; Gardner, Jonathan P.; Kruk, Jeffrey W.; Oleson, Steven R.; Thronson, Harley A.

    2014-01-01

    We describe how availability of new solar electric propulsion (SEP) technology can substantially increase the science capability of space astronomy missions working within the near-UV to far-infrared (UVOIR) spectrum by making dark sky orbits accessible for the first time. We present a proof of concept case study in which SEP is used to enable a 700 kg Explorer-class observatory payload to reach an orbit beyond where the zodiacal dust limits observatory sensitivity. The resulting scientific performance advantage relative to a Sun-Earth L2 point orbit is presented and discussed. We find that making SEP available to astrophysics Explorers can enable this small payload program to rival the science performance of much larger long development-time systems. We also present flight dynamics analysis which illustrates that this concept can be extended beyond Explorers to substantially improve the sensitivity performance of heavier (7000 kg) flagship-class astrophysics payloads such as the UVOIR successor to the James Webb Space Telescope by using high power SEP that is being developed for the Asteroid Redirect Robotics Mission.

  12. Short stack modeling of degradation in solid oxide fuel cells. Part II. Sensitivity and interaction analysis

    NASA Astrophysics Data System (ADS)

    Gazzarri, J. I.; Kesler, O.

    In the first part of this two-paper series, we presented a numerical model of the impedance behaviour of a solid oxide fuel cell (SOFC), aimed at simulating the change in the impedance spectrum induced by contact degradation at the interconnect-electrode and electrode-electrolyte interfaces. The purpose of that investigation was to develop a non-invasive diagnostic technique to identify degradation modes in situ. In the present paper, we appraise the predictive capabilities of the proposed method in terms of its robustness to uncertainties in the input parameters, many of which are very difficult to measure independently. We applied this technique to the degradation modes simulated in Part I, in addition to anode sulfur poisoning. Electrode delamination showed the highest robustness to input parameter variations, followed by interconnect oxidation and interconnect detachment. The most sensitive degradation mode was sulfur poisoning, due to strong parameter interactions. In addition, we simulate several scenarios with two simultaneous degradation modes, assessing the method's capabilities and limitations for predicting the electrochemical behaviour of SOFCs undergoing multiple simultaneous degradation modes.

  13. Methodologies for Removing/Desorbing and Transporting Particles from Surfaces to Instrumentation

    NASA Astrophysics Data System (ADS)

    Miller, Carla J.; Cespedes, Ernesto R.

    2012-12-01

    Explosive trace detection (ETD) continues to be a key technology supporting the fight against terrorist bombing threats. Very selective and sensitive ETD instruments have been developed to detect explosive threats concealed on personnel, in vehicles, in luggage, and in cargo containers, as well as for forensic analysis (e.g. post blast inspection, bomb-maker identification, etc.) in a broad range of homeland security, law enforcement, and military applications. A number of recent studies have highlighted the fact that significant improvements in ETD systems' capabilities will be achieved, not by increasing the selectivity/sensitivity of the sensors, but by improved techniques for particle/vapor sampling, pre-concentration, and transport to the sensors. This review article represents a compilation of studies focused on characterizing the adhesive properties of explosive particles, the methodologies for removing/desorbing these particles from a range of surfaces, and approaches for transporting them to the instrument. The objectives of this review are to summarize fundamental work in explosive particle characterization, to describe experimental work performed in harvesting and transport of these particles, and to highlight those approaches that indicate high potential for improving ETD capabilities.

  14. Shape optimization using a NURBS-based interface-enriched generalized FEM

    DOE PAGES

    Najafi, Ahmad R.; Safdari, Masoud; Tortorelli, Daniel A.; ...

    2016-11-26

    This study presents a gradient-based shape optimization over a fixed mesh using a non-uniform rational B-spline-based interface-enriched generalized finite element method, applicable to multi-material structures. In the proposed method, non-uniform rational B-splines are used to parameterize the design geometry precisely and compactly by a small number of design variables. An analytical shape sensitivity analysis is developed to compute derivatives of the objective and constraint functions with respect to the design variables. Subtle but important new terms involve the sensitivity of shape functions and their spatial derivatives. Verification and illustrative problems are solved to demonstrate the precision and capability of the method.

  15. A Comparison Framework for Reactor Anti-Neutrino Detectors in Near-Field Nuclear Safeguards Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendenhall, M.; Bowden, N.; Brodsky, J.

    Electron anti-neutrino (ν̄e) detectors can support nuclear safeguards, from reactor monitoring to spent fuel characterization. In recent years, the scientific community has developed multiple detector concepts, many of which have been prototyped or deployed for specific measurements by their respective collaborations. However, the diversity of technical approaches, deployment conditions, and analysis techniques complicates direct performance comparison between designs. We have begun development of a simulation framework to compare and evaluate existing and proposed detector designs for nonproliferation applications in a uniform manner. This report demonstrates the intent and capabilities of the framework by evaluating four detector design concepts, calculating generic reactor antineutrino counting sensitivity and capabilities in a plutonium disposition application example.

  16. Quantitative Velocity Field Measurements in Reduced-Gravity Combustion Science and Fluid Physics Experiments

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Wernet, Mark P.

    1999-01-01

    Systems have been developed and demonstrated for performing quantitative velocity measurements in reduced gravity combustion science and fluid physics investigations. The unique constraints and operational environments inherent to reduced-gravity experimental facilities pose special challenges to the development of hardware and software systems. Both point and planar velocimetric capabilities are described, with particular attention being given to the development of systems to support the International Space Station laboratory. Emphasis has been placed on optical methods, primarily arising from the sensitivity of the phenomena of interest to intrusive probes. Limitations on available power, volume, data storage, and attendant expertise have motivated the use of solid-state sources and detectors, as well as efficient analysis capabilities emphasizing interactive data display and parameter control.

  17. Multimodality hard-x-ray imaging of a chromosome with nanoscale spatial resolution

    DOE PAGES

    Yan, Hanfei; Nazaretski, Evgeny; Lauer, Kenneth R.; ...

    2016-02-05

    Here, we developed a scanning hard x-ray microscope using a new class of x-ray nano-focusing optic called a multilayer Laue lens and imaged a chromosome with nanoscale spatial resolution. The combination of the hard x-ray's superior penetration power, high sensitivity to elemental composition, high spatial-resolution and quantitative analysis creates a unique tool with capabilities that other microscopy techniques cannot provide. Using this microscope, we simultaneously obtained absorption-, phase-, and fluorescence-contrast images of Pt-stained human chromosome samples. The high spatial-resolution of the microscope and its multi-modality imaging capabilities enabled us to observe the internal ultra-structures of a thick chromosome without sectioning it.

  18. Multimodality hard-x-ray imaging of a chromosome with nanoscale spatial resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Hanfei; Nazaretski, Evgeny; Lauer, Kenneth R.

    Here, we developed a scanning hard x-ray microscope using a new class of x-ray nano-focusing optic called a multilayer Laue lens and imaged a chromosome with nanoscale spatial resolution. The combination of the hard x-ray's superior penetration power, high sensitivity to elemental composition, high spatial-resolution and quantitative analysis creates a unique tool with capabilities that other microscopy techniques cannot provide. Using this microscope, we simultaneously obtained absorption-, phase-, and fluorescence-contrast images of Pt-stained human chromosome samples. The high spatial-resolution of the microscope and its multi-modality imaging capabilities enabled us to observe the internal ultra-structures of a thick chromosome without sectioning it.

  19. Sensitivity assessment of freshwater macroinvertebrates to pesticides using biological traits.

    PubMed

    Ippolito, A; Todeschini, R; Vighi, M

    2012-03-01

    Assessing the sensitivity of different species to chemicals is one of the key points in predicting the effects of toxic compounds in the environment. Trait-based prediction methods have proved to be extremely efficient for assessing the sensitivity of macroinvertebrates toward compounds with nonspecific toxicity (narcotics). Nevertheless, predicting the sensitivity of organisms toward compounds with specific toxicity is much more complex, since it depends on the mode of action of the chemical. The aim of this work was to predict the sensitivity of several freshwater macroinvertebrates toward three classes of plant protection products: organophosphates, carbamates and pyrethroids. Two databases were built: one with sensitivity data (retrieved, evaluated and selected from the U.S. Environmental Protection Agency ECOTOX database) and the other with biological traits. Aside from the "traditional" traits usually considered in ecological analysis (e.g., body size, respiration technique, feeding habits), multivariate analysis was used to relate the sensitivity of organisms to other characteristics that may be involved in the process of intoxication. Results confirmed that, besides traditional biological traits related to uptake capability (e.g., body size and body shape), some traits more closely related to particular metabolic characteristics or patterns have good predictive capacity for sensitivity to these kinds of toxic substances. For example, behavioral complexity, taken as an indicator of nervous system complexity, proved to be an important predictor of sensitivity towards these compounds. These results confirm the need for more complex traits to predict the effects of highly specific substances. One key point for achieving a complete mechanistic understanding of the process is the choice of traits, whose role in the discrimination of sensitivity should be clearly interpretable, and not only statistically significant.

  20. Highly sensitive transient absorption imaging of graphene and graphene oxide in living cells and circulating blood.

    PubMed

    Li, Junjie; Zhang, Weixia; Chung, Ting-Fung; Slipchenko, Mikhail N; Chen, Yong P; Cheng, Ji-Xin; Yang, Chen

    2015-07-23

    We report a transient absorption (TA) imaging method for fast visualization and quantitative layer analysis of graphene and graphene oxide (GO). Forward and backward imaging of graphene on various substrates under ambient conditions was performed at a speed of 2 μs per pixel. The TA intensity increased linearly with the layer number of graphene. Real-time TA imaging of GO was demonstrated in vitro, with the capability of quantitative analysis of intracellular concentration, and ex vivo in circulating blood. These results suggest that TA microscopy is a valid tool for the study of graphene-based materials.
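
    Given the reported linear relation between TA intensity and layer number, a toy calibration-and-inversion sketch follows; the intensity values are invented.

        # Fit a linear TA-intensity vs. layer-number calibration, then invert
        # it to estimate the layer count of a new measurement (values invented).
        import numpy as np

        layers = np.array([1, 2, 3, 4])
        intensity = np.array([1.0, 2.1, 2.9, 4.2])  # hypothetical TA signal (a.u.)

        slope, intercept = np.polyfit(layers, intensity, 1)
        measured = 3.5                               # new TA reading (a.u.)
        print(f"estimated layer number: {(measured - intercept) / slope:.1f}")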

  1. Dynamic analysis of gas-core reactor system

    NASA Technical Reports Server (NTRS)

    Turner, K. H., Jr.

    1973-01-01

    A heat transfer analysis was incorporated into the previously developed CODYN model to obtain a model of open-cycle gaseous core reactor dynamics which can predict the heat flux at the cavity wall. The resulting model was used to study its sensitivity to the values of the reactivity coefficients and to determine the system response for twenty specified perturbations. In addition, the model was used to study the effectiveness of several control systems in controlling the reactor. It was concluded that control drums located in the moderator region, capable of inserting reactivity quickly, provided the best control.

  2. JPL-ANTOPT antenna structure optimization program

    NASA Technical Reports Server (NTRS)

    Strain, D. M.

    1994-01-01

    New antenna path-length error and pointing-error structure optimization codes were recently added to the MSC/NASTRAN structural analysis computer program. Path-length and pointing errors are important measures of structure-related antenna performance. The path-length and pointing errors are treated as scalar displacements for static loading cases. These scalar displacements can be subject to constraint during the optimization process. Path-length and pointing-error calculations supplement the other optimization and sensitivity capabilities of NASTRAN. The analysis and design functions were implemented as 'DMAP ALTERs' to the Design Optimization (SOL 200) Solution Sequence of MSC/NASTRAN, Version 67.5.

  3. Trace metal mapping by laser-induced breakdown spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaiser, Jozef; Novotny, Dr. Karel; Hrdlicka, A

    2012-01-01

    Laser-Induced Breakdown Spectroscopy (LIBS) is a sensitive optical technique capable of fast multi-elemental analysis of solid, gaseous and liquid samples. The potential of lasers for spectrochemical analysis was recognized shortly after the laser's invention; however, the rapid development of LIBS is connected with the availability of powerful pulsed laser sources. Since the late 1980s, LIBS has been prominent on the analytical atomic spectroscopy scene, and its applications are continuously being developed. Here we review the utilization of LIBS for trace element mapping in different matrices. The main emphasis is on trace metal mapping in biological samples.

  4. Effective moisture penetration depth model for residential buildings: Sensitivity analysis and guidance on model inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Jason; Winkler, Jon

    Moisture buffering of building materials has a significant impact on the building's indoor humidity, and building energy simulations need to model this buffering to accurately predict the humidity. Researchers requiring a simple moisture-buffering approach typically rely on the effective-capacitance model, which has been shown to be a poor predictor of actual indoor humidity. This paper describes an alternative two-layer effective moisture penetration depth (EMPD) model and its inputs. While this model has been used previously, there is a need to understand the sensitivity of this model to uncertain inputs. In this paper, we use the moisture-adsorbent materials exposed to the interior air: drywall, wood, and carpet. We use a global sensitivity analysis to determine which inputs are most influential and how the model's prediction capability degrades due to uncertainty in these inputs. We then compare the model's humidity prediction with measured data from five houses, which shows that this model, and a set of simple inputs, can give reasonable prediction of the indoor humidity.

  5. Effective moisture penetration depth model for residential buildings: Sensitivity analysis and guidance on model inputs

    DOE PAGES

    Woods, Jason; Winkler, Jon

    2018-01-31

    Moisture buffering of building materials has a significant impact on the building's indoor humidity, and building energy simulations need to model this buffering to accurately predict the humidity. Researchers requiring a simple moisture-buffering approach typically rely on the effective-capacitance model, which has been shown to be a poor predictor of actual indoor humidity. This paper describes an alternative two-layer effective moisture penetration depth (EMPD) model and its inputs. While this model has been used previously, there is a need to understand the sensitivity of this model to uncertain inputs. In this paper, we use the moisture-adsorbent materials exposed to the interior air: drywall, wood, and carpet. We use a global sensitivity analysis to determine which inputs are most influential and how the model's prediction capability degrades due to uncertainty in these inputs. We then compare the model's humidity prediction with measured data from five houses, which shows that this model, and a set of simple inputs, can give reasonable prediction of the indoor humidity.

  6. Methodology for Conducting Analyses of Army Capabilities

    DTIC Science & Technology

    1992-06-01

    Contents include: Determine Sensitivity of Operations to Functions; Generate Capability Issues; Package and Prioritize Issues; Identify and Assess Capability Improvements. ...identify critical issues, and make force modernization recommendations to Headquarters, Department of the Army (HQDA). The work described in this report...

  7. The Physiology, Biochemistry and Genetics of Survival of Bacteria Subjected to Environmental Stress

    DTIC Science & Technology

    1981-11-01

    ...sodium lauryl sulfate, but not to sodium chloride or streptomycin alone. This sensitivity was again transient and capable of repair in the same simple medium. ...polymyxin B, bacitracin, and sodium lauryl sulfate during growth, to ethylenediaminetetraacetic acid and sodium lauryl sulfate in...

  8. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  9. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  10. Neural network representation and learning of mappings and their derivatives

    NASA Technical Reports Server (NTRS)

    White, Halbert; Hornik, Kurt; Stinchcombe, Maxwell; Gallant, A. Ronald

    1991-01-01

    Discussed here are recent theorems proving that artificial neural networks are capable of approximating an arbitrary mapping and its derivatives as accurately as desired. This fact forms the basis for further results establishing the learnability of the desired approximations, using results from non-parametric statistics. These results have potential applications in robotics, chaotic dynamics, control, and sensitivity analysis. An example involving learning the transfer function and its derivatives for a chaotic map is discussed.
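
    As a concrete illustration of these approximation results, the following minimal numpy sketch (our example, not code from the paper; the architecture and hyperparameters are illustrative assumptions) fits a one-hidden-layer tanh network to the logistic map f(x) = 4x(1-x) and then compares the network's analytic derivative with the true derivative f'(x) = 4 - 8x:

      # Sketch: learn the logistic map with a tanh network, then check that the
      # network's exact (chain-rule) derivative tracks the map's derivative.
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(0.0, 1.0, size=(256, 1))
      y = 4.0 * x * (1.0 - x)                    # logistic map samples

      H = 32                                     # hidden units
      W1 = rng.normal(0.0, 1.0, (1, H)); b1 = np.zeros(H)
      W2 = rng.normal(0.0, 0.1, (H, 1)); b2 = np.zeros(1)

      lr = 0.05
      for step in range(20000):                  # plain full-batch gradient descent
          a = np.tanh(x @ W1 + b1)
          err = (a @ W2 + b2) - y                # residual for 0.5*MSE loss
          gW2 = a.T @ err / len(x); gb2 = err.mean(0)
          da = (err @ W2.T) * (1.0 - a**2)       # backprop through tanh
          gW1 = x.T @ da / len(x); gb1 = da.mean(0)
          W2 -= lr * gW2; b2 -= lr * gb2
          W1 -= lr * gW1; b1 -= lr * gb1

      xt = np.linspace(0.05, 0.95, 7).reshape(-1, 1)
      at = np.tanh(xt @ W1 + b1)
      d_net = ((1.0 - at**2) * W1) @ W2          # analytic d(network)/dx
      print(np.c_[4.0 - 8.0 * xt, d_net])        # true vs. learned derivative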

  11. Towards metals analysis using corona discharge ionization ion mobility spectrometry.

    PubMed

    Jafari, Mohammad T; Saraji, Mohammad; Sherafatmand, Hossein

    2016-02-25

    For the first time, the capability of corona discharge ionization ion mobility spectrometry (CD-IMS) in the determination of a metal complex was evaluated. The extreme simplicity of dispersive liquid-liquid microextraction (DLLME) coupled to the high sensitivity of CD-IMS measurement could make this combination very useful for simple, rapid, and sensitive determination of metals in different samples. In this regard, mercury, as a model metal, was complexed with diethyldithiocarbamate (DEDTC) and then extracted into carbon tetrachloride using DLLME. Parameters affecting the extraction efficiency, including the type and volume of the extraction solvent, the type and volume of the disperser solvent, the concentration of the chelating agent, salt addition, and pH, were exhaustively investigated. Under the optimized conditions, the enrichment factor was found to be 142. A linear range of 0.035-10.0 μg mL(-1) with r(2) = 0.997 and a detection limit of 0.010 μg mL(-1) were obtained. The relative standard deviation values were lower than 4% and 8% for intra-day and inter-day measurements, respectively. Finally, the developed method was successfully applied to the extraction and determination of mercury in various real samples. The satisfactory results revealed the capability of the proposed method for trace analysis without tedious derivatization or hydride generation. Copyright © 2016 Elsevier B.V. All rights reserved.
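
    For readers scanning the figures of merit, the quantities quoted above follow the usual definitions (standard analytical-chemistry conventions, not formulas given in this abstract):

      \[ \mathrm{EF} = \frac{C_{\mathrm{extract}}}{C_{0}}, \qquad \mathrm{LOD} = \frac{3\, s_{b}}{m} \]

    where C_extract is the analyte concentration in the sedimented extract phase, C_0 the initial concentration in the sample, s_b the standard deviation of blank measurements, and m the slope of the calibration line.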

  12. An efficient method of reducing glass dispersion tolerance sensitivity

    NASA Astrophysics Data System (ADS)

    Sparrold, Scott W.; Shepard, R. Hamilton

    2014-12-01

    Constraining the Seidel aberrations of optical surfaces is a common technique for relaxing tolerance sensitivities in the optimization process. We offer an observation that a lens's Abbe number tolerance is directly related to the magnitude by which its longitudinal and transverse color are permitted to vary in production. Based on this observation, we propose a computationally efficient and easy-to-use merit function constraint for relaxing dispersion tolerance sensitivity. Using the relationship between an element's chromatic aberration and dispersion sensitivity, we derive a fundamental limit for lens scale and power that is capable of achieving high production yield for a given performance specification, which provides insight on the point at which lens splitting or melt fitting becomes necessary. The theory is validated by comparing its predictions to a formal tolerance analysis of a Cooke Triplet, and then applied to the design of a 1.5x visible linescan lens to illustrate optimization for reduced dispersion sensitivity. A selection of lenses in high volume production is then used to corroborate the proposed method of dispersion tolerance allocation.
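
    The underlying first-order relation is worth stating explicitly (standard thin-lens chromatic theory, summarized here for context rather than quoted from the paper): a thin lens of power phi and Abbe number V contributes longitudinal color phi/V, so

      \[ \frac{\partial}{\partial V}\left(\frac{\phi}{V}\right) = -\frac{\phi}{V^{2}}, \qquad \Delta\phi_{\mathrm{color}} \approx \frac{\phi}{V^{2}}\,\Delta V \]

    which is why high-power, low-Abbe-number elements dominate dispersion tolerance sensitivity and, past a certain lens scale and power, force lens splitting or melt fitting.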

  13. Value of high-sensitivity C-reactive protein assays in predicting atrial fibrillation recurrence: a systematic review and meta-analysis

    PubMed Central

    Yo, Chia-Hung; Lee, Si-Huei; Chang, Shy-Shin; Lee, Matthew Chien-Hung; Lee, Chien-Chang

    2014-01-01

    Objectives We performed a systematic review and meta-analysis of studies on high-sensitivity C-reactive protein (hs-CRP) assays to see whether these tests are predictive of atrial fibrillation (AF) recurrence after cardioversion. Design Systematic review and meta-analysis. Data sources PubMed, EMBASE and Cochrane databases as well as a hand search of the reference lists in the retrieved articles from inception to December 2013. Study eligibility criteria This review selected observational studies in which the measurements of serum CRP were used to predict AF recurrence. An hs-CRP assay was defined as any CRP test capable of measuring serum CRP to below 0.6 mg/dL. Primary and secondary outcome measures We summarised test performance characteristics with the use of forest plots, hierarchical summary receiver operating characteristic curves and bivariate random effects models. Meta-regression analysis was performed to explore the source of heterogeneity. Results We included nine qualifying studies comprising a total of 347 patients with AF recurrence and 335 controls. A CRP level higher than the optimal cut-off point was an independent predictor of AF recurrence after cardioversion (summary adjusted OR: 3.33; 95% CI 2.10 to 5.28). The estimated pooled sensitivity and specificity for hs-CRP was 71.0% (95% CI 63% to 78%) and 72.0% (61% to 81%), respectively. Most studies used a CRP cut-off point of 1.9 mg/L to predict long-term AF recurrence (77% sensitivity, 65% specificity), and 3 mg/L to predict short-term AF recurrence (73% sensitivity, 71% specificity). Conclusions hs-CRP assays are moderately accurate in predicting AF recurrence after successful cardioversion. PMID:24556243
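
    As a quick interpretive check (our arithmetic from the standard definitions, not figures reported by the authors), the pooled estimates correspond to likelihood ratios of

      \[ LR^{+} = \frac{0.71}{1 - 0.72} \approx 2.5, \qquad LR^{-} = \frac{1 - 0.71}{0.72} \approx 0.40 \]

    values in line with the authors' conclusion that the assays are moderately, rather than highly, accurate predictors.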

  14. Impact of rheology on probabilistic forecasts of sea ice trajectories: application for search and rescue operations in the Arctic

    NASA Astrophysics Data System (ADS)

    Rabatel, Matthias; Rampal, Pierre; Carrassi, Alberto; Bertino, Laurent; Jones, Christopher K. R. T.

    2018-03-01

    We present a sensitivity analysis and discuss the probabilistic forecast capabilities of the novel sea ice model neXtSIM used in hindcast mode. The study pertains to the response of the model to the uncertainty on winds using probabilistic forecasts of ice trajectories. neXtSIM is a continuous Lagrangian numerical model that uses an elasto-brittle rheology to simulate the ice response to external forces. The sensitivity analysis is based on a Monte Carlo sampling of 12 members. The response of the model to the uncertainties is evaluated in terms of simulated ice drift distances from their initial positions, and from the mean position of the ensemble, over the mid-term forecast horizon of 10 days. The simulated ice drift is decomposed into advective and diffusive parts that are characterised separately both spatially and temporally and compared to what is obtained with a free-drift model, that is, when the ice rheology does not play any role in the modelled physics of the ice. The seasonal variability of the model sensitivity is presented and shows the role of the ice compactness and rheology in the ice drift response at both local and regional scales in the Arctic. Indeed, the ice drift simulated by neXtSIM in summer is close to the one obtained with the free-drift model, while the more compact and solid ice pack shows a significantly different mechanical and drift behaviour in winter. For the winter period analysed in this study, we also show that, in contrast to the free-drift model, neXtSIM reproduces the sea ice Lagrangian diffusion regimes as found from observed trajectories. The forecast capability of neXtSIM is also evaluated using a large set of real buoy trajectories and compared to the capability of the free-drift model. We found that neXtSIM performs significantly better in simulating sea ice drift, both in terms of forecast error and as a tool to assist search and rescue operations, although the sources of uncertainties assumed for the present experiment are not sufficient for complete coverage of the observed IABP positions.
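
    The advective/diffusive decomposition described above is simple to reproduce; the sketch below (our illustration with synthetic velocities, not neXtSIM output) splits a 12-member ensemble of 2-D trajectories into the ensemble-mean displacement and the spread about it:

      # Decompose ensemble ice trajectories into advective and diffusive parts.
      import numpy as np

      rng = np.random.default_rng(3)
      members, steps, dt = 12, 240, 3600.0        # 10 days of hourly positions
      mean_vel = np.array([0.15, 0.05])           # shared drift velocity, m/s
      vel = mean_vel + rng.normal(0.0, 0.08, (members, steps, 2))  # wind noise
      pos = np.cumsum(vel * dt, axis=1)           # trajectories from x0 = 0

      mean_traj = pos.mean(axis=0)                # advective part: ensemble mean
      anom = pos - mean_traj                      # diffusive part: anomalies
      adv_km = np.linalg.norm(mean_traj[-1]) / 1e3
      dif_km = np.linalg.norm(anom[:, -1, :], axis=1).mean() / 1e3
      print(f"advective drift {adv_km:.0f} km, diffusive spread {dif_km:.0f} km")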

  15. Methods of Analysis by the U.S. Geological Survey National Water Quality Laboratory - Determination of Elements in Whole-Water Digests Using Inductively Coupled Plasma-Optical Emission Spectrometry and Inductively Coupled Plasma-Mass Spectrometry

    USGS Publications Warehouse

    Garbarino, John R.; Struzeski, Tedmund M.

    1998-01-01

    Inductively coupled plasma-optical emission spectrometry (ICP-OES) and inductively coupled plasma-mass spectrometry (ICP-MS) can be used to determine 26 elements in whole-water digests. Both methods have distinct advantages and disadvantages: ICP-OES is capable of analyzing samples with higher elemental concentrations without dilution; however, ICP-MS is more sensitive and capable of determining much lower elemental concentrations. Both techniques gave accurate results for spike recoveries, digested standard reference-water samples, and whole-water digests. Average spike recoveries in whole-water digests were 100 ± 10 percent, although recoveries for digests with high dissolved-solid concentrations were lower for selected elements by ICP-MS. Results for standard reference-water samples were generally within 1 standard deviation of the most probable values. Statistical analysis of the results from 43 whole-water digests indicated that there was no significant difference among ICP-OES, ICP-MS, and former official methods of analysis for 24 of the 26 elements evaluated.

  16. ROMUSE 2.0 User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khuwaileh, Bassam; Turinsky, Paul; Williams, Brian J.

    2016-10-04

    ROMUSE (Reduced Order Modeling Based Uncertainty/Sensitivity Estimator) is an effort within the Consortium for Advanced Simulation of Light water reactors (CASL) to provide an analysis tool to be used in conjunction with reactor core simulators, especially the Virtual Environment for Reactor Applications (VERA). ROMUSE is written in C++ and is currently capable of performing various types of parameter perturbations, uncertainty quantification, surrogate model construction, and subspace analysis. Version 2.0 has the capability to interface with DAKOTA, which gives ROMUSE access to the various algorithms implemented within DAKOTA. ROMUSE is mainly designed to interface with VERA and the Comprehensive Modeling and Simulation Suite for Nuclear Safety Analysis and Design (SCALE) [1,2,3]; however, ROMUSE can interface with any general model (e.g., Python and MATLAB) whose input/output (I/O) format follows the Hierarchical Data Format 5 (HDF5). In this brief user manual, the use of ROMUSE is overviewed and example problems are presented and briefly discussed. The algorithms provided here range from algorithms inspired by those discussed in Ref. [4] to nuclear-specific algorithms discussed in Ref. [3].
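
    Because the coupling contract is just "HDF5 in, HDF5 out," a generic model can be wired up with a few lines of Python; the group and dataset names below are hypothetical, chosen only to illustrate the pattern (the actual layout expected by ROMUSE is defined in the manual):

      # Hypothetical HDF5 handshake between a driver and a generic model.
      import h5py

      with h5py.File("model_io.h5", "w") as f:         # driver writes inputs
          grp = f.create_group("inputs")
          for name, value in {"boron_ppm": 1200.0, "inlet_T_K": 565.0}.items():
              grp.create_dataset(name, data=value)     # one scalar per input

      # ... the external model runs here and appends its responses ...
      with h5py.File("model_io.h5", "a") as f:
          f.create_dataset("outputs/keff", data=1.00213)

      with h5py.File("model_io.h5", "r") as f:         # driver reads outputs
          print({k: f["inputs"][k][()] for k in f["inputs"]})
          print(f["outputs/keff"][()])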

  17. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

    We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of (14)C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of (14)C in submicrogram samples. Based on the specificity of narrow laser resonances coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 10(-15) (14)C/(12)C ratios are obtained. Using a 15-W (14)CO2 laser, a linear calibration with samples from 10(-15) to >1.5 x 10(-12) in (14)C/(12)C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating and real time monitoring of atmospheric radiocarbon. The method can also be applied to detection of other trace entities.

  18. [Nootropic and analgesic effects of Semax following different routes of administration].

    PubMed

    Manchenko, D M; Glazova, N Iu; Levitskaia, N G; Andreeva, L A; Kamenskiĭ, A A; Miasoedov, N F

    2010-10-01

    The heptapeptide Semax (MEHFPGP) is a synthetic analogue of the ACTH(4-10) fragment with prolonged neurotropic activity. The aim of the present work was to study the effects of Semax on learning capability and pain sensitivity in white rats following intraperitoneal and intranasal administration in different doses. The nootropic effects of Semax were studied in a passive avoidance task acquisition test. Pain sensitivity was estimated in the Randall-Selitto paw-withdrawal test. It was shown that Semax exerts nootropic and analgesic activities following intraperitoneal administration. Analysis of the dose dependence of these effects yielded different dose-response curves. Following intranasal administration, Semax was more potent in improving learning than following intraperitoneal administration. The peptide failed to affect pain sensitivity following intranasal administration, as opposed to intraperitoneal administration. The data obtained suggest that different mechanisms and brain structures are involved in the realization of the nootropic and analgesic effects of Semax.

  19. Fundamentals and practice for ultrasensitive laser-induced fluorescence detection in microanalytical systems.

    PubMed

    Johnson, Mitchell E; Landers, James P

    2004-11-01

    Laser-induced fluorescence is an extremely sensitive method for detection in chemical separations. In addition, it is well-suited to detection in small volumes, and as such is widely used for capillary electrophoresis and microchip-based separations. This review explores the detailed instrumental conditions required for sub-zeptomole, sub-picomolar detection limits. The key to achieving the best sensitivity is to use an excitation and emission volume that is matched to the separation system and that, simultaneously, will keep scattering and luminescence background to a minimum. We discuss how this is accomplished with confocal detection, 90 degrees on-capillary detection, and sheath-flow detection. It is shown that each of these methods has its advantages and disadvantages, but that all can be used to produce extremely sensitive detectors for capillary- or microchip-based separations. Analysis of these capabilities allows prediction of the optimal means of achieving ultrasensitive detection on microchips.

  20. Multi-modal approach using Raman spectroscopy and optical coherence tomography for the discrimination of colonic adenocarcinoma from normal colon

    PubMed Central

    Ashok, Praveen C.; Praveen, Bavishna B.; Bellini, Nicola; Riches, Andrew; Dholakia, Kishan; Herrington, C. Simon

    2013-01-01

    We report a multimodal optical approach using both Raman spectroscopy and optical coherence tomography (OCT) in tandem to discriminate between colonic adenocarcinoma and normal colon. Although both of these non-invasive techniques are capable of discriminating between normal and tumour tissues, they are unable individually to provide both the high specificity and high sensitivity required for disease diagnosis. We combine the chemical information derived from Raman spectroscopy with the texture parameters extracted from OCT images. The sensitivity obtained using Raman spectroscopy and OCT individually was 89% and 78% respectively and the specificity was 77% and 74% respectively. Combining the information derived using the two techniques increased both sensitivity and specificity to 94% demonstrating that combining complementary optical information enhances diagnostic accuracy. These data demonstrate that multimodal optical analysis has the potential to achieve accurate non-invasive cancer diagnosis. PMID:24156073

  1. Development of an Extraterrestrial Organic Analyzer (EOA) for Highly Sensitive Organic Detection on an Ice Shell Impact Penetrator (IceShIP)

    NASA Astrophysics Data System (ADS)

    Stockton, A. M.; Duca, Z. A.; Cato, M.; Cantrell, T.; Kim, J.; Putman, P.; Schmidt, B. E.

    2016-12-01

    Kinetic penetrators have the potential to enable low cost in situ measurements of the ice of worlds including Europa and Enceladus [1]. Their small size and mass, critical to limiting their kinetic energy, makes them ideal small landers riding on primarily orbiter missions, while enabling sampling at several m depth due to burial and excavation. In situ microfluidic-based organic analysis systems are a powerful, miniaturized approach for detecting markers of habitability and recent biological activity. Development of microfluidic technology, like that of the Mars Organic Analyzer (MOA) [2,3] and Enceladus Organic Analyzer (EOA), has led to an instrument capable of in situ organic chemical analysis compatible with a kinetic penetrator platform. This technology uses an integrated microfluidic processor to prepare samples for analysis via fluorescent derivatization prior to highly sensitive laser-induced fluorescence (LIF) detection. Selective derivatization in the presence of a chiral selector enables distinction between amino acid enantiomers. Finite element analysis of the core microfluidic processing and analytical device indicated that the device itself is more than capable of surviving the stresses associated with an impact acceleration of >50,000g. However, a number of developments were still required to enable a flight-ready system. Preliminary experiments indicated that moving from a pneumatically-actuated to a hydraulically-actuated microvalve system may provide better impact resistance. A hydraulically-actuated microvalve system was developed and tested. A modification of an established microfabricated LIF detection system would use indium bump bonding to permanently weld optical components using standard microfabrication techniques with perfect alignment. Recent work has also focused on developing and characterizing impact-resistant electronics. This work shows the low-TRL development of EOA's LIF and microfluidic subsystems for future planetary impact penetrator missions. With correct structural decisions and optimizations, EOA can survive a 50,000g impact, making it the only current optical instrument with this capability. References: [1] Gowen et al., Adv. Space Res., 2011, 725. [2] Skelley et al, PNAS USA, 2005, 102, 1041. [3] Kim J., et al, Anal. Chem., 2013, 85, 7682.

  2. The Impact of Place in Building Human Capability

    ERIC Educational Resources Information Center

    Garlick, Steve

    2014-01-01

    While it is accepted that there are "sensitive" and "critical" periods of life during which certain human capabilities are more readily acquired, and where the multiplied returns on our investment in human capability building are more significant, it is also argued that there are place-based contexts (society, nature, culture,…

  3. Sensitive Amino Acid Composition and Chirality Analysis in the Martian Regolith with a Microfabricated in situ Analyzer

    NASA Astrophysics Data System (ADS)

    Skelley, A. M.; Grunthaner, F. J.; Bada, J. L.; Mathies, R. A.

    2003-12-01

    Recent advances in microfabricated "lab-on-a-chip" technologies have dramatically enhanced the capabilities of chemical and biochemical analyzers. The portability and sensitivity of these devices make them ideal instruments for in situ chemical analysis on other planets. We have focused our initial studies on amino acid analysis because amino acids are more chemically resistant to decomposition than other biomolecules, and because amino acid chirality is a well-defined biomarker [1]. Previously, we developed a prototype electrophoresis chip, detection system and analysis method where the amino acids were labeled with fluorescein using FITC and then electrophoretically analyzed using γ-cyclodextrin as the chiral resolution agent [2]. Extracts of the Murchison meteorite were analyzed, and the D/L ratios determined by microchip CE closely matched those from HPLC and GCMS and exhibited greater precision. Our microchip analyzer has now been further improved by establishing the capability of performing amino acid composition and chirality analyses using fluorescamine rather than FITC [3]. Fluorescamine is advantageous because it reacts more rapidly than FITC, and because excess reagent is hydrolyzed to a non-fluorescent product. Furthermore, the use of fluorescamine facilitates interfacing with the Mars Organic Detector (MOD) [4]. Fluorescamine-amino acids are separated under conditions similar to those for the FITC-aa, resulting in similar separation times and identical elution orders. Fluorescamine-aa are chirally resolved in the presence of hydroxypropyl-β-cyclodextrin, and typical limits of detection are ~50 nM. This work establishes the feasibility of combining fluorescamine labeling of amino acids with microfabricated CE devices to develop low-volume, high-sensitivity apparatus for extraterrestrial exploration. The stage is now set for the development of the Mars Organic Analyzer (MOA), a portable analysis system for amino acid extraction and chiral analysis that will combine the capabilities of microchip CE with the previously developed extraction capabilities of MOD [4]. Amino acids are first extracted from soil by sublimation to a cold finger coated with fluorescamine for solid phase labeling. Sample transfer between MOD and the CE device is achieved through a capillary sipper driven by microfabricated valves and pumps [5]. The construction of a portable MOA instrument will facilitate in situ studies of amino acids in Mars analog sites such as the Atacama Desert in Chile. Preliminary chiral analyses of Atacama soil extracts on the microfabricated CE device have shown amino acid detection down to low ppb concentrations. Future field tests in the Atacama Desert will explore the feasibility of the portable CE device for performing in situ amino acid analysis. This work will provide the technology base for the development of the Mars Organic Laboratory (MOL), a portable device that will analyze a broad suite of biomolecules, including nucleobases, sugars, and organic acids and bases [6]. [1] J.L. Bada, G.D. McDonald, Icarus 114 (1995) 139. [2] L.D. Hutt, D.P. Glavin, J.L. Bada, R.A. Mathies, Anal. Chem. 71 (1999) 4000. [3] A.M. Skelley, R.A. Mathies, J. Chromatogr. A (2003) in press. [4] G. Kminek, J.L. Bada, O. Botta, D.P. Glavin, F. Grunthaner, Planet. Space Sci. 48 (2000) 1087. [5] W.H. Grover, A.M. Skelley, C.N. Liu, E.T. Lagally, R.A. Mathies, Sens. Actuators B 89 (2003) 325. [6] A.M. Skelley, F.J. Grunthaner, J.L. Bada, R.A. Mathies, in SPIE: Proceedings of the In-Situ Instrument Technologies Meeting, Pasadena, CA, 2002.

  4. Use of SCALE Continuous-Energy Monte Carlo Tools for Eigenvalue Sensitivity Coefficient Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M; Rearden, Bradley T

    2013-01-01

    The TSUNAMI code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications have motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The CLUTCH and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE KENO framework to give TSUNAMI-3D the capability to perform eigenvalue sensitivity calculations in continuous-energy applications. This work explores the improvements in accuracy that can be gained in eigenvalue and eigenvalue sensitivity calculations through the use of the SCALE CE KENO and CE TSUNAMI continuous-energy Monte Carlo tools as compared to multigroup tools. The CE KENO and CE TSUNAMI tools were used to analyze two difficult models of critical benchmarks, and produced eigenvalue and eigenvalue sensitivity coefficient results that showed a marked improvement in accuracy. The CLUTCH sensitivity method in particular excelled in terms of efficiency and computational memory requirements.
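
    For reference, the eigenvalue sensitivity coefficients that both methods estimate are the standard relative sensitivities used throughout the TSUNAMI literature,

      \[ S_{k,\Sigma} = \frac{\Sigma}{k}\,\frac{\partial k}{\partial \Sigma} \]

    i.e., the fractional change in k-effective per fractional change in a cross section Σ; CLUTCH and IFP differ chiefly in how they estimate the adjoint weighting this quantity requires within a continuous-energy Monte Carlo transport calculation.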

  5. Alternate Waveforms for a Low-Cost Civil Global Positioning System Receiver.

    DTIC Science & Technology

    1980-06-01

    ...implementation. Global Positioning System Navigation Receiver. ...Criteria to be included were ranging performance, data handling capability, time-to-first fix, acquisition and re-acquisition capability, and sensitivity to... This receiver would exhibit less sensitivity to multipath and to signal dropouts because it would continuously track all satellites in view and...

  6. Establishment and Analysis of the 3-dimensional (3D) Spheroids Generated from the Nasopharyngeal Carcinoma Cell Line HK1.

    PubMed

    Muniandy, Kalaivani; Sankar, Prabu Siva; Xiang, Benedict Lian Shi; Soo-Beng, Alan Khoo; Balakrishnan, Venugopal; Mohana-Kumaran, Nethia

    2016-11-01

    Spheroids have been shown to recapitulate the tumour in vivo with properties such as the tumour microenvironment, concentration gradients, and tumour phenotype. As such, they can serve as a platform for determining the growth and invasion behaviour of cancer cells as well as for drug sensitivity assays, capable of yielding results closer to what is observed in vivo than two-dimensional (2D) cell culture assays. This study focused on establishing a three-dimensional (3D) cell culture model using the Nasopharyngeal Carcinoma (NPC) cell line HK1 and analysing its growth and invasion phenotypes. The spheroids also serve as a model to elucidate sensitivity to the chemotherapeutic drug Flavopiridol. The liquid overlay method was employed to generate the spheroids, which were embedded in bovine collagen I matrix for observation of growth and invasion phenotypes. The HK1 cells formed compact spheroids within 72 hours. Our observations from the 3-day experiments revealed that the spheroids gradually grew and invaded into the collagen matrix, showing that the HK1 spheroids are capable of growth and invasion. Building on these experiments, the HK1 spheroids were used in a drug sensitivity assay with the chemotherapeutic drug Flavopiridol. The drug had a dose-dependent inhibitory effect on spheroid growth and invasion.

  7. [Efficiency of 27-plex single nucleotide polymorphism multiplex system for ancestry inference in different populations].

    PubMed

    Feng, Xing-Ling; Sun, Qi-Fan; Liu, Hong; Wei, Yi-Liang; DU, Wei-An; Li, Cai-Xia; Chen, Ling; Liu, Chao

    2016-04-20

    To validate the efficiency of a 27-plex single nucleotide polymorphism (SNP) multiplex system for ancestry inference. The 27-plex SNP system was validated for its sensitivity and species specificity. A total of 533 samples were collected from African, Southern Chinese Han, Chinese ethnic minority (Yi, Hui, Miao, Tibetan, and Uygur), European, Central Asian, Western Asian, Southern Asian, Southeast Asian and South American populations for clustering analysis of the genotypes, citing 3 representative continental ancestral groups [East Asia (CHB), Europe (CEU), and Africa (YRI)] from the HapMap database. The system sensitivity is 0.125 ng. Twenty genotypes were detected in the chimpanzee and six in monkeys. Except for rs10496971, no products were found in other animals. The system was capable of differentiating intercontinental populations but not of distinguishing between East Asian and Southeast Asian populations or between the Southern Chinese Han population and Chinese ethnic populations (Hui, Miao, Yi and Tibetan). The system achieved 100% accuracy in intercontinental population source inference for 46 blind test samples. The 27-plex SNP multiplex system has high sensitivity and species specificity and can correctly differentiate the ancestry origins of individuals from African, European and East Asian populations for criminal case investigation. However, this system is not capable of distinguishing subpopulation groups, and more specific ancestry-informative markers are needed to improve its recognition of Southeast Asian and Chinese ethnic populations.

  8. Cluster secondary ion mass spectrometry microscope mode mass spectrometry imaging.

    PubMed

    Kiss, András; Smith, Donald F; Jungmann, Julia H; Heeren, Ron M A

    2013-12-30

    Microscope mode imaging for secondary ion mass spectrometry is a technique with the promise of simultaneous high spatial resolution and high-speed imaging of biomolecules from complex surfaces. Technological developments such as new position-sensitive detectors, in combination with polyatomic primary ion sources, are required to exploit the full potential of microscope mode mass spectrometry imaging, i.e. to efficiently push the limits of ultra-high spatial resolution, sample throughput and sensitivity. In this work, a C60 primary source was combined with a commercial mass microscope for microscope mode secondary ion mass spectrometry imaging. The detector setup is a pixelated detector from the Medipix/Timepix family with high-voltage post-acceleration capabilities. The system's mass spectral and imaging performance is tested with various benchmark samples and thin tissue sections. The high secondary ion yield (with respect to 'traditional' monatomic primary ion sources) of the C60 primary ion source and the increased sensitivity of the high voltage detector setup improve microscope mode secondary ion mass spectrometry imaging. The analysis time and the signal-to-noise ratio are improved compared with other microscope mode imaging systems, all at high spatial resolution. We have demonstrated the unique capabilities of a C60 ion microscope with a Timepix detector for high spatial resolution microscope mode secondary ion mass spectrometry imaging. Copyright © 2013 John Wiley & Sons, Ltd.

  9. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Intelligent hand-portable proliferation sensing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dieckman, S.L.; Bostrom, G.A.; Waterfield, L.G.

    1997-08-01

    Argonne National Laboratory, with support from DOE's Office of Nonproliferation and National Security, is currently developing an intelligent hand-portable sensor system. This system is designed specifically to support the intelligence community with the task of in-field sensing of nuclear proliferation and related activities. Based upon pulsed laser photo-ionization time-of-flight mass spectrometry technology, this novel sensing system is capable of quickly providing a molecular or atomic analysis of specimens. The system is capable of analyzing virtually any gas phase molecule, or molecule that can be induced into the gas phase by (for example) sample heating. This system has the unique advantages of providing unprecedented portability, excellent sensitivity, tremendous fieldability, and a high performance/cost ratio. The system will be capable of operating in a highly automated manner for on-site inspections, and easily modified for other applications such as perimeter monitoring aboard a plane or drone. The paper describes the sensing system.

  11. Chemical Detection and Identification Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.

    2002-01-01

    Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capability with minimal requirements for volume, weight and consumables. Advances may be achieved by increasing the amount of information acquired by a given technique and by miniaturizing proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of GC detectors that provide sample identification independent of GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).

  12. Wide area restoration following biological contamination

    NASA Astrophysics Data System (ADS)

    Yang, Lynn; Hibbard, Wilthea; Edwards, Donna; Franco, David; Fruetel, Julie; Tucker, Mark; Einfeld, Wayne; Knowlton, Robert; Brown, Gary; Brockmann, John; Greenwalt, Robert; Miles, Robin; Raber, Ellen; Carlsen, Tina; Krauter, Paula; Dillon, Michael; MacQueen, Don; Intrepido, Tony; Hoppes, Bill; Wilson, Wendy; Mancieri, Sav

    2008-04-01

    Current understanding of how to restore a wide area that has been contaminated following a large biological attack is limited. The Department of Homeland Security and Department of Defense are executing a four-year collaborative program named the Interagency Biological Restoration Demonstration (IBRD) program. This program is aimed at developing the technologies, methods, plans and policies necessary to restore a wide area, including military installations and critical infrastructures, in the event of a large outdoor aerosol release of anthrax. The IBRD program's partner pilot city is the Seattle urban area, including Fort Lewis, WA, and McChord Air Force Base. A front-end systems analysis was conducted as part of IBRD to: 1) assess existing technologies and processes for wide area restoration; 2) develop from this an "as-is" decision framework for wide area restoration; and 3) identify and prioritize capability gaps. Qualitative assessments and quantitative analyses, including sensitivity, timeline and case study analyses, were conducted to evaluate existing processes and rank capability gaps. This paper describes the approach and results from this front-end systems analysis.

  13. Aerodynamic parameter studies and sensitivity analysis for rotor blades in axial flight

    NASA Technical Reports Server (NTRS)

    Chiu, Y. Danny; Peters, David A.

    1991-01-01

    An analytical capability is presented for aerodynamic parametric studies and sensitivity analyses of rotary wings in axial flight, using a 3-D undistorted wake model in curved lifting line theory. The governing equations are solved by both the Multhopp interpolation technique and the vortex lattice method. The singularity from the bound vortices is eliminated through Hadamard's finite-part concept. Good numerical agreement is found between both analytical methods and finite difference methods. Parametric studies were made to assess the effects of several shape variables on aerodynamic loads. It is found, e.g., that a rotor blade with out-of-plane and in-plane curvature can theoretically increase lift in the inboard and outboard regions, respectively, without introducing additional induced drag.

  14. ZnO and cobalt phthalocyanine hybridized graphene: efficient photocatalysts for degradation of rhodamine B

    PubMed Central

    Neelgund, Gururaj M.; Oki, Aderemi; Luo, Zhiping

    2014-01-01

    A novel method has been developed to synthesize a graphene-ZnO composite as a highly efficient catalyst by reduction of graphite oxide and in-situ deposition of ZnO nanoparticles via a chemical reduction reaction. The graphene-ZnO catalyst is capable of complete degradation of rhodamine B under exposure to natural sunlight. Further, the catalytic efficiency of the graphene-ZnO catalyst was enhanced by sensitizing it with cobalt phthalocyanine. The formation of the graphene-ZnO photocatalyst and its further sensitization with cobalt phthalocyanine were characterized using UV-vis, ATR-IR and Raman spectroscopy, powder XRD and thermogravimetric analysis. The morphology of both the graphene-ZnO and graphene-ZnO-CoPC catalysts was analyzed using scanning and transmission electron microscopy. PMID:24972296

  15. Measurement of the Muon Production Depths at the Pierre Auger Observatory

    DOE PAGES

    Collica, Laura

    2016-09-08

    The muon content of extensive air showers is an observable sensitive to the primary composition and to the hadronic interaction properties. The Pierre Auger Observatory uses water-Cherenkov detectors to measure particle densities at the ground and therefore is sensitive to the muon content of air showers. We present here a method which allows us to estimate the muon production depths by exploiting the measurement of the muon arrival times at the ground recorded with the Surface Detector of the Pierre Auger Observatory. The analysis is performed in a large range of zenith angles, thanks to the capability of estimating and subtracting the electromagnetic component, and for energies between 10^19.2 and 10^20 eV.

  16. Data acquisition system for the focal plane detector of the mass separator MASHA

    NASA Astrophysics Data System (ADS)

    Novoselov, A. S.; Rodin, A. M.; Motycak, S.; Podshibyakin, A. V.; Krupa, L.; Belozerov, A. V.; Vedeneyev, V. Yu.; Gulyaev, A. V.; Gulyaeva, A. V.; Kliman, J.; Salamatin, V. S.; Stepantsov, S. V.; Chernysheva, E. V.; Yukhimchuk, S. A.; Komarov, A. B.; Kamas, D.

    2016-09-01

    We present the results of the development of, and general information about, the data acquisition system recently created at the MASHA setup (Flerov Laboratory of Nuclear Reactions, Joint Institute for Nuclear Research). The main difference from the previous system is the use of a modern platform, National Instruments PXI, with XIA multichannel high-speed digitizers (250 MHz, 12-bit, 16 channels). At present the system has 448 spectrometric channels. The software and its features for data acquisition and analysis are also described. The new DAQ system expands the precision measurement capabilities for alpha decays and spontaneous fission at the focal-plane position-sensitive silicon strip detector, which in turn increases the capabilities of the setup in fields such as low-yield registration of elements.

  17. HRST architecture modeling and assessments

    NASA Astrophysics Data System (ADS)

    Comstock, Douglas A.

    1997-01-01

    This paper presents work supporting the assessment of advanced concept options for the Highly Reusable Space Transportation (HRST) study. It describes the development of computer models as the basis for creating an integrated capability to evaluate the economic feasibility and sustainability of a variety of system architectures. It summarizes modeling capabilities for use on the HRST study to perform sensitivity analysis of alternative architectures (consisting of different combinations of highly reusable vehicles, launch assist systems, and alternative operations and support concepts) in terms of cost, schedule, performance, and demand. In addition, the identification and preliminary assessment of alternative market segments for HRST applications, such as space manufacturing, space tourism, etc., is described. Finally, the development of an initial prototype model that can begin to be used for modeling alternative HRST concepts at the system level is presented.

  18. Application of the ABC helicopter to the emergency medical service role

    NASA Technical Reports Server (NTRS)

    Levine, L. S.

    1981-01-01

    Attention is called to the use of helicopters in transporting the sick and injured to medical facilities. It is noted that the helicopter's speed of response and delivery increases patient survival rates and may reduce the cost of medical care and its burden on society. Among the vehicle characteristics desired for this use are a cruising speed of 200 knots, a single engine hover capability at 10,000 ft, and an absence of a tail rotor. Three designs for helicopters incorporating such new technologies as digital/optical control systems, all composite air-frames, and third-generation airfoils are presented. A sensitivity analysis is conducted to show the effect of design speed, mission radius, and single engine hover capability on vehicle weight, fuel consumption, operating costs, and productivity.

  19. Near-infrared confocal micro-Raman spectroscopy combined with PCA-LDA multivariate analysis for detection of esophageal cancer

    NASA Astrophysics Data System (ADS)

    Chen, Long; Wang, Yue; Liu, Nenrong; Lin, Duo; Weng, Cuncheng; Zhang, Jixue; Zhu, Lihuan; Chen, Weisheng; Chen, Rong; Feng, Shangyuan

    2013-06-01

    The diagnostic capability of using tissue intrinsic micro-Raman signals to obtain biochemical information from human esophageal tissue is presented in this paper. Near-infrared micro-Raman spectroscopy combined with multivariate analysis was applied for discrimination of esophageal cancer tissue from normal tissue samples. Micro-Raman spectroscopy measurements were performed on 54 esophageal cancer tissues and 55 normal tissues in the 400-1750 cm-1 range. The mean Raman spectra showed significant differences between the two groups. Tentative assignments of the Raman bands in the measured tissue spectra suggested some changes in protein structure, a decrease in the relative amount of lactose, and increases in the percentages of tryptophan, collagen and phenylalanine content in esophageal cancer tissue as compared to those of a normal subject. The diagnostic algorithms based on principal component analysis (PCA) and linear discriminant analysis (LDA) achieved a diagnostic sensitivity of 87.0% and specificity of 70.9% for separating cancer from normal esophageal tissue samples. The result demonstrated that near-infrared micro-Raman spectroscopy combined with PCA-LDA analysis could be an effective and sensitive tool for identification of esophageal cancer.
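
    The classification pipeline described here is straightforward to prototype; the sketch below (our stand-in with random spectra of matching dimensions, not the authors' data or exact settings) chains PCA into LDA and scores cross-validated sensitivity and specificity:

      # PCA-LDA pipeline with cross-validated sensitivity/specificity.
      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(1)
      X = rng.normal(size=(109, 1351))        # 109 spectra on a 400-1750 cm-1 grid
      y = np.r_[np.ones(54), np.zeros(55)]    # 1 = cancer, 0 = normal (as above)

      clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
      pred = cross_val_predict(clf, X, y, cv=10)

      tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
      tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
      print("sensitivity", tp / (tp + fn), "specificity", tn / (tn + fp))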

  20. Sensitivity analysis and uncertainty estimation in ash concentration simulations and tephra deposit daily forecasted at Mt. Etna, in Italy

    NASA Astrophysics Data System (ADS)

    Prestifilippo, Michele; Scollo, Simona; Tarantola, Stefano

    2015-04-01

    The uncertainty in volcanic ash forecasts may depend on our knowledge of the model input parameters and on our capability to represent the dynamics of an incoming eruption. Forecasts help governments to reduce risks associated with volcanic eruptions, and for this reason different kinds of analysis are necessary to understand the effect that each input parameter has on model outputs. We present an iterative approach based on the sequential combination of sensitivity analysis, a parameter estimation procedure and Monte Carlo-based uncertainty analysis, applied to the Lagrangian volcanic ash dispersal model PUFF. We vary the main input parameters, such as the total mass, the total grain-size distribution, the plume thickness, the shape of the eruption column, the sedimentation models and the diffusion coefficient, perform thousands of simulations, and analyze the results. The study is carried out on two different Etna scenarios: the sub-plinian eruption of 22 July 1998, which formed an eruption column rising 12 km above sea level and lasted some minutes, and a lava fountain eruption with features similar to the 2011-2013 events, which produced eruption columns rising several kilometers above sea level and lasted some hours. The sensitivity analyses and uncertainty estimation results help us identify the measurements that volcanologists should perform during volcanic crises to reduce model uncertainty.

  1. Systematic parameter estimation and sensitivity analysis using a multidimensional PEMFC model coupled with DAKOTA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Chao Yang; Luo, Gang; Jiang, Fangming

    2010-05-01

    Current computational models for proton exchange membrane fuel cells (PEMFCs) include a large number of parameters such as boundary conditions, material properties, and numerous parameters used in sub-models for membrane transport, two-phase flow and electrochemistry. In order to successfully use a computational PEMFC model in design and optimization, it is important to identify critical parameters under a wide variety of operating conditions, such as relative humidity, current load, temperature, etc. Moreover, when experimental data is available in the form of polarization curves or local distribution of current and reactant/product species (e.g., O2, H2O concentrations), critical parameters can be estimated in order to enable the model to better fit the data. Sensitivity analysis and parameter estimation are typically performed using manual adjustment of parameters, which is also common in parameter studies. We present work to demonstrate a systematic approach based on using a widely available toolkit developed at Sandia called DAKOTA that supports many kinds of design studies, such as sensitivity analysis as well as optimization and uncertainty quantification. In the present work, we couple a multidimensional PEMFC model (which is being developed, tested and later validated in a joint effort by a team from Penn State Univ. and Sandia National Laboratories) with DAKOTA through the mapping of model parameters to system responses. Using this interface, we demonstrate the efficiency of performing simple parameter studies as well as identifying critical parameters using sensitivity analysis. Finally, we show examples of optimization and parameter estimation using the automated capability in DAKOTA.
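
    The flavor of such a study is easy to convey with a toy stand-in (plain Python rather than DAKOTA input syntax, and a made-up voltage surrogate rather than the coupled PEMFC model): sample the inputs, run the model, and rank inputs by how strongly they move the response:

      # Sampling-based sensitivity screening of a toy cell-voltage surrogate.
      import numpy as np
      from scipy.stats import spearmanr

      def model(rh, temp, i_load):
          # hypothetical surrogate: voltage drops with load, rises weakly with RH
          return 1.0 - 0.25 * i_load + 0.05 * np.log(rh) - 0.001 * (temp - 353.0)

      rng = np.random.default_rng(7)
      n = 2000
      rh     = rng.uniform(0.3, 1.0, n)        # relative humidity (fraction)
      temp   = rng.uniform(323.0, 363.0, n)    # cell temperature (K)
      i_load = rng.uniform(0.1, 1.5, n)        # current density (A/cm^2)

      volts = model(rh, temp, i_load)
      for name, xs in [("RH", rh), ("T", temp), ("i", i_load)]:
          rho = spearmanr(xs, volts).correlation   # rank-correlation screening
          print(f"{name}: rho = {rho:+.3f}")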

  2. Functionalized Nanopipettes: A Sensitive Tool for Pathogen Detection

    NASA Astrophysics Data System (ADS)

    Actis, P.; Jejelowo, O.; Pourmand, N.

    2010-04-01

    Nanopipette technology is capable of detecting and functionally analyzing biomolecules. Preliminary experiments demonstrate the sensitivity and selectivity of the technique with specific proteins targeting environmental toxins.

  3. Study of advanced techniques for determining the long term performance of components

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed. The long-term performance of metal-to-metal valve seats in a liquid propellant fuel system is stressed. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques, and (2) improving the understanding of the physics of degradation.

  4. Development of a quantitative in-shoe measurement system for assessing balance: sixteen-sensor insoles.

    PubMed

    Bamberg, Stacy M; Lastayo, Paul; Dibble, Lee; Musselman, Josh; Raghavendra, Swarna Kiran Dasa

    2006-01-01

    This work presents the first phase in the development of an in-shoe sensor system designed to evaluate balance. Sixteen force-sensitive resistors were strategically mounted to a removable insole, and the bilateral outputs were recorded. The initial results indicate that these sensors are capable of detecting subtle changes in weight distribution, corresponding to the subject's ability to balance. Preliminary analysis of this data found a clear correlation between the ability to balance and the state of health of the subject.

  5. Electrochemical high-temperature gas sensors

    NASA Astrophysics Data System (ADS)

    Saruhan, B.; Stranzenbach, M.; Yüce, A.; Gönüllü, Y.

    2012-06-01

    NOx, a common air pollutant produced by combustion, is associated with greenhouse effects. Its high-temperature detection is essential for the protection of the environment. High-temperature sensors capable of component integration enable the control of combustion products. The requirements are quantitative detection of total NOx and high selectivity at temperatures above 500°C. This study reports various approaches to detect NO and NO2 selectively under lean and humid conditions at temperatures from 300°C to 800°C. All tested electrochemical sensors were fabricated in a planar design to enable component integration. We suggest first an impedance-metric gas sensor for total NOx detection consisting of a NiO- or NiCr2O4-SE and a PYSZ electrolyte. The electrolyte layer is about 200 μm thick and constructed of quasi-single-crystalline columns. The sensing electrode (SE) is a magnetron-sputtered thin layer of NiO or NiCr2O4. Sensor sensitivity for detection of total NOx has been measured by impedance analysis. The cross-sensitivity to other emission gases such as CO, CO2, CH4 and oxygen (5 vol.%) has been determined under 0-1000 ppm NO. The sensor maintains its high sensitivity at temperatures up to 550°C or 600°C, depending on the sensing electrode. NiO-SE yields better selectivity to NO in the presence of oxygen and has shorter response times compared to NiCr2O4-SE. For NO2 sensing at higher temperatures, a resistive DC sensor with Al-doped TiO2 sensing layers has been employed. Sensor sensitivity towards NO2 and cross-sensitivity to CO have been determined in the presence of H2O at 600°C and 800°C. NO2 concentrations from 25 to 100 ppm and CO concentrations from 25 to 75 ppm can be detected. Nano-tubular structuring of the TiO2 increased the NO2 sensitivity of the sensor.

  6. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)-A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes.

    PubMed

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants with large and complex genomes.

  7. Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq)—A Method for High-Throughput Analysis of Differentially Methylated CCGG Sites in Plants with Large Genomes

    PubMed Central

    Chwialkowska, Karolina; Korotko, Urszula; Kosinska, Joanna; Szarejko, Iwona; Kwasniewski, Miroslaw

    2017-01-01

    Epigenetic mechanisms, including histone modifications and DNA methylation, mutually regulate chromatin structure, maintain genome integrity, and affect gene expression and transposon mobility. Variations in DNA methylation within plant populations, as well as methylation in response to internal and external factors, are of increasing interest, especially in the crop research field. Methylation Sensitive Amplification Polymorphism (MSAP) is one of the most commonly used methods for assessing DNA methylation changes in plants. This method involves gel-based visualization of PCR fragments from selectively amplified DNA that are cleaved using methylation-sensitive restriction enzymes. In this study, we developed and validated a new method based on the conventional MSAP approach called Methylation Sensitive Amplification Polymorphism Sequencing (MSAP-Seq). We improved the MSAP-based approach by replacing the conventional separation of amplicons on polyacrylamide gels with direct, high-throughput sequencing using Next Generation Sequencing (NGS) and automated data analysis. MSAP-Seq allows for global sequence-based identification of changes in DNA methylation. This technique was validated in Hordeum vulgare. However, MSAP-Seq can be straightforwardly implemented in different plant species, including crops with large, complex and highly repetitive genomes. The incorporation of high-throughput sequencing into MSAP-Seq enables parallel and direct analysis of DNA methylation in hundreds of thousands of sites across the genome. MSAP-Seq provides direct genomic localization of changes and enables quantitative evaluation. We have shown that the MSAP-Seq method specifically targets gene-containing regions and that a single analysis can cover three-quarters of all genes in large genomes. Moreover, MSAP-Seq's simplicity, cost effectiveness, and high-multiplexing capability make this method highly affordable. Therefore, MSAP-Seq can be used for DNA methylation analysis in crop plants with large and complex genomes. PMID:29250096

  8. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard; Bostelmann, F.

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, and modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on stochastic sampling methods or on derivative-based approaches such as generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities, different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.
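    As a generic illustration of the sampling-based SA/UA approach mentioned above (not the CRP's actual models or data), the sketch below propagates assumed input uncertainties through a placeholder response function and ranks the inputs by their correlation with the output.

```python
import numpy as np

# Minimal sampling-based uncertainty/sensitivity sketch. The "model" and the
# input uncertainties are placeholders, not HTGR data.
rng = np.random.default_rng(42)
n = 1000
# Perturb two nominal inputs (e.g., a cross section and a thermal conductivity).
x1 = rng.normal(1.0, 0.02, n)   # assumed 2% relative uncertainty
x2 = rng.normal(1.0, 0.05, n)   # assumed 5% relative uncertainty

def model(a, b):
    # Placeholder response standing in for a coupled core simulation.
    return 1.0 + 0.8 * (a - 1.0) - 0.3 * (b - 1.0)

y = model(x1, x2)
print("output mean/std:", y.mean(), y.std())
# Rank inputs by the magnitude of their correlation with the output.
for name, x in [("x1", x1), ("x2", x2)]:
    print(name, "corr:", np.corrcoef(x, y)[0, 1])
```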

  9. Asymmetry identification in rigid rotating bodies—Theory and experiment

    NASA Astrophysics Data System (ADS)

    Bucher, Izhak; Shomer, Ofer

    2013-12-01

    Asymmetry and anisotropy are important parameters in rotating devices: they can cause instability or indicate a manufacturing defect or a developing fault. The present paper discusses an identification method capable of detecting minute levels of asymmetry by exploiting the unique dynamics of parametric excitation caused by asymmetry and rotation. The detection relies on rigid body dynamics without resorting to nonlinear vibration analysis, and the natural dynamics of elastically supported systems is exploited in order to increase the sensitivity to asymmetry. It is possible to isolate asymmetry from other rotation-induced phenomena such as unbalance. A laboratory-built asymmetry detection machine demonstrates the method alongside the theoretical analysis.
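    For readers unfamiliar with the underlying mechanism, here is a standard single-degree-of-freedom illustration (a textbook form, not the paper's full rigid-body model) of how rotating stiffness asymmetry produces parametric excitation:

```latex
% Illustrative Mathieu-type equation: a stiffness asymmetry \Delta k rotating
% at speed \Omega modulates the stationary-frame stiffness at 2\Omega,
\[
m\ddot{x}(t) + c\,\dot{x}(t) + \bigl[k + \Delta k\cos(2\Omega t)\bigr]x(t) = 0,
\]
% producing response sidebands at \omega_n \pm 2\Omega whose amplitude grows
% with \Delta k; this is the kind of signature an identification scheme
% can exploit to detect small asymmetry levels.
```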

  10. Advances toward submicron resolution optics for x-ray instrumentation and applications

    NASA Astrophysics Data System (ADS)

    Cordier, Mark; Stripe, Benjamin; Yun, Wenbing; Lau, S. H.; Lyon, Alan; Reynolds, David; Lewis, Sylvia J. Y.; Chen, Sharon; Semenov, Vladimir A.; Spink, Richard I.; Seshadri, Srivatsan

    2017-08-01

    Sigray's axially symmetric x-ray optics enable advanced microanalytical capabilities by focusing x-rays to micron-scale and submicron spot sizes, which can potentially unlock many avenues for laboratory microanalysis. The design of these optics allows submicron spot sizes even at low x-ray energies, enabling research into low atomic number elements, and allows increased sensitivity in grazing-incidence measurements and surface analysis. We will discuss advances made in the fabrication of these double-paraboloidal mirror lenses designed for use in laboratory x-ray applications. We will additionally present results from as-built paraboloids, including surface figure error and the focal spot size achieved to date.

  11. Highly sensitive transient absorption imaging of graphene and graphene oxide in living cells and circulating blood

    PubMed Central

    Li, Junjie; Zhang, Weixia; Chung, Ting-Fung; Slipchenko, Mikhail N.; Chen, Yong P.; Cheng, Ji-Xin; Yang, Chen

    2015-01-01

    We report a transient absorption (TA) imaging method for fast visualization and quantitative layer analysis of graphene and graphene oxide (GO). Graphene on various substrates was imaged in both forward and backward directions under ambient conditions at a speed of 2 μs per pixel. The TA intensity increased linearly with the layer number of graphene. Real-time TA imaging of GO in vitro, with the capability of quantitative analysis of intracellular concentration, and ex vivo in circulating blood was demonstrated. These results suggest that TA microscopy is a valid tool for the study of graphene-based materials. PMID:26202216

  12. Application of non-coherent Doppler data types for deep space navigation

    NASA Technical Reports Server (NTRS)

    Bhaskaran, Shyam

    1995-01-01

    Recent improvements in computational capability and Deep Space Network technology have renewed interest in examining the possibility of using one-way Doppler data alone to navigate interplanetary spacecraft. The one-way data can be formulated as the standard differenced-count Doppler or as phase measurements, and the data can be received at a single station or differenced if obtained simultaneously at two stations. A covariance analysis is performed to assess the accuracy obtainable with combinations of one-way Doppler data, and the results are compared with similar results using standard two-way Doppler and range. The sample interplanetary trajectory used was that of the Mars Pathfinder mission to Mars. It is shown that differenced one-way data are capable of determining the angular position of the spacecraft to fairly high accuracy, but have relatively poor sensitivity to the range. When combined with single-station data, the position dispersions are roughly an order of magnitude larger in range and comparable in angular position as compared to dispersions obtained with standard two-way data types. It was also found that the phase formulation is less sensitive to data weight variations and data coverage than the differenced-count Doppler formulation.

  13. The application of noncoherent Doppler data types for Deep Space Navigation

    NASA Technical Reports Server (NTRS)

    Bhaskaran, S.

    1995-01-01

    Recent improvements in computational capability and DSN technology have renewed interest in examining the possibility of using one-way Doppler data alone to navigate interplanetary spacecraft. The one-way data can be formulated as the standard differenced-count Doppler or as phase measurements, and the data can be received at a single station or differenced if obtained simultaneously at two stations. A covariance analysis, which analyzes the accuracy obtainable by combinations of one-way Doppler data, is performed and compared with similar results using standard two-way Doppler and range. The sample interplanetary trajectory used was that of the Mars Pathfinder mission to Mars. It is shown that differenced one-way data are capable of determining the angular position of the spacecraft to fairly high accuracy, but have relatively poor sensitivity to the range. When combined with single-station data, the position dispersions are roughly an order of magnitude larger in range and comparable in angular position as compared to dispersions obtained with standard two-way data types. It was also found that the phase formulation is less sensitive to data weight variations and data coverage than the differenced-count Doppler formulation.
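    To make the covariance-analysis idea concrete, here is a toy linearized least-squares computation: given assumed measurement partials H and a noise level, the formal parameter covariance is the inverse of the information matrix. The geometry and numbers are invented and bear no relation to the Pathfinder analysis.

```python
import numpy as np

# Toy covariance analysis: how well does a data type constrain parameters,
# given the measurement partials H and measurement noise sigma?
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
# Assumed partials of a Doppler-like observable w.r.t. two parameters
# (e.g., a rate bias and a sinusoidally-signed geometric term).
H = np.column_stack([np.ones_like(t), np.sin(2 * np.pi * t)])
sigma = 0.1                        # measurement noise, arbitrary units
info = H.T @ H / sigma**2          # Fisher information matrix
cov = np.linalg.inv(info)          # formal parameter covariance
print("formal 1-sigma uncertainties:", np.sqrt(np.diag(cov)))
```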

  14. Use of Accelerator Mass Spectrometry in Human Health and Molecular Toxicology.

    PubMed

    Enright, Heather A; Malfatti, Michael A; Zimmermann, Maike; Ognibene, Ted; Henderson, Paul; Turteltaub, Kenneth W

    2016-12-19

    Accelerator mass spectrometry (AMS) has been adopted as a powerful bioanalytical method for human studies in the areas of pharmacology and toxicology. The exquisite sensitivity (10⁻¹⁸ mol) of AMS has facilitated studies of toxins and drugs at environmentally and physiologically relevant concentrations in humans. Such studies include risk assessment of environmental toxicants, drug candidate selection, absolute bioavailability determination, and more recently, assessment of drug-target binding as a biomarker of response to chemotherapy. Combining AMS with complementary capabilities such as high performance liquid chromatography (HPLC) can maximize data within a single experiment and provide additional insight when assessing drugs and toxins, such as metabolic profiling. Recent advances in the AMS technology at Lawrence Livermore National Laboratory have allowed for direct coupling of AMS to complementary capabilities such as HPLC via a liquid sample moving wire interface, offering greater sensitivity compared to that of graphite-based analysis, therefore enabling the use of lower 14C and chemical doses, which are imperative for clinical testing. The aim of this review is to highlight recent efforts in human studies using AMS, including technological advancements and discussion of the continued promise of AMS for innovative clinically based research.

  15. HEROES Observations of a Quiescent Active Region

    NASA Astrophysics Data System (ADS)

    Shih, A. Y.; Christe, S.; Gaskin, J.; Wilson-Hodge, C.

    2014-12-01

    Hard X-ray (HXR) observations of solar flares reveal the signatures of energetic electrons, and HXR images with high dynamic range and high sensitivity can distinguish between where electrons are accelerated and where they stop. Even in the non-flaring corona, high-sensitivity HXR measurements may be able to detect the presence of electron acceleration. The High Energy Replicated Optics to Explore the Sun (HEROES) balloon mission added the capability of solar observations to an existing astrophysics balloon payload, HERO, which used grazing-incidence optics for direct HXR imaging. HEROES measures HXR emission from ~20 to ~75 keV with an angular resolution of 33" HPD. HEROES launched on 2013 September 21 from Fort Sumner, New Mexico, and had a successful one-day flight. We present the detailed analysis of the 7-hour observation of AR 11850, which sets new upper limits on the HXR emission from a quiescent active region, with corresponding constraints on the numbers of tens of keV energetic electrons present. Using the imaging capability of HEROES, HXR upper limits are also obtained for the quiet Sun surrounding the active region. We also discuss what can be achieved with new and improved HXR instrumentation on balloons.

  16. The Use of Accelerator Mass Spectrometry in Human Health and Molecular Toxicology

    PubMed Central

    Enright, Heather A.; Malfatti, Michael A.; Zimmermann, Maike; Ognibene, Ted; Henderson, Paul; Turteltaub, Kenneth W.

    2016-01-01

    Accelerator Mass Spectrometry (AMS) has been adopted as a powerful bio-analytical method for human studies in the areas of pharmacology and toxicology. The exquisite sensitivity (10⁻¹⁸ mol) of AMS has facilitated studies of toxins and drugs at environmentally and physiologically relevant concentrations in humans. Such studies include: risk assessment of environmental toxicants, drug candidate selection, absolute bioavailability determination, and more recently, assessment of drug-target binding as a biomarker of response to chemotherapy. Combining AMS with complementary capabilities such as high performance liquid chromatography (HPLC) can maximize data within a single experiment and provide additional insight when assessing drugs and toxins, such as metabolic profiling. Recent advances in the AMS technology at Lawrence Livermore National Laboratory have allowed for direct coupling of AMS with complementary capabilities such as HPLC via a liquid sample moving wire interface, offering greater sensitivity compared to graphite-based analysis, therefore enabling the use of lower 14C and chemical doses, which are imperative for clinical testing. The aim of this review is to highlight the recent efforts in human studies using AMS, including technological advancements and discussion of the continued promise of AMS for innovative clinical based research. PMID:27726383

  17. Single-photon semiconductor photodiodes for distributed optical fiber sensors: state of the art and perspectives

    NASA Astrophysics Data System (ADS)

    Ripamonti, Giancarlo; Lacaita, Andrea L.

    1993-03-01

    The extreme sensitivity and time resolution of Geiger-mode avalanche photodiodes (GM-APDs) have already been exploited for optical time domain reflectometry (OTDR). Better than 1 cm spatial resolution in Rayleigh scattering detection was demonstrated. Distributed and quasi-distributed optical fiber sensors can take advantage of the capabilities of GM-APDs. Extensive studies have recently disclosed the main characteristics and limitations of silicon devices, both commercially available and developmental. In this paper we report an analysis of the performance of these detectors. The main characteristics of GM-APDs of interest for distributed optical fiber sensors are briefly reviewed. Command electronics (active quenching) is then introduced. The detector timing performance sets the maximum spatial resolution in experiments employing OTDR techniques. We highlight that the achievable time resolution depends on the physics of the avalanche spreading over the device area. On the basis of these results, trade-off between the important parameters (quantum efficiency, time resolution, background noise, and afterpulsing effects) is considered. Finally, we show first results on germanium devices, capable of single photon sensitivity at 1.3 and 1.5 micrometers with sub-nanosecond time resolution.
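    The connection between detector timing and OTDR spatial resolution quoted above follows from the standard two-way relation (a textbook estimate, not a figure from this paper):

```latex
% Two-way OTDR mapping from timing resolution \Delta t to spatial resolution:
\[
\Delta z = \frac{c\,\Delta t}{2 n_g},
\]
% e.g., with fiber group index n_g \approx 1.5, a timing resolution of
% \Delta t = 100\,\mathrm{ps} gives \Delta z \approx 1\,\mathrm{cm}.
```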

  18. Predictive Capability Maturity Model for computational modeling and simulation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  19. Peptide biomarkers as a way to determine meat authenticity.

    PubMed

    Sentandreu, Miguel Angel; Sentandreu, Enrique

    2011-11-01

    Meat fraud involves many illegal procedures affecting the composition of meat and meat products, commonly done with the aim of increasing profit. These practices need to be controlled by legal authorities by means of robust, accurate and sensitive methodologies capable of ensuring that fraudulent or accidental mislabelling does not arise. Common strategies traditionally used to assess meat authenticity have been based on methods such as chemometric analysis of large datasets, immunoassays, or DNA analysis. The identification of peptide biomarkers specific to a particular meat species, tissue or ingredient by proteomic technologies constitutes an interesting and promising alternative to existing methodologies due to its high discriminating power, robustness and sensitivity. The possibility of developing standardized protein extraction protocols, together with the considerably higher resistance of peptide sequences to food processing as compared to DNA sequences, would overcome some of the limitations currently existing for quantitative determinations of highly processed food samples. The use of routine mass spectrometry equipment would make the technology suitable for control laboratories. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Sample analysis at Mars

    NASA Astrophysics Data System (ADS)

    Coll, P.; Cabane, M.; Mahaffy, P. R.; Brinckerhoff, W. B.; Sam Team

    The next landed missions to Mars, such as the planned Mars Science Laboratory and ExoMars, will require sample analysis capabilities refined well beyond what has been flown to date. A key science objective driving this requirement is the determination of the carbon inventory of Mars, and particularly the detection of organic compounds. The Sample Analysis at Mars (SAM) suite consists of a group of tightly-integrated experiments that would analyze samples delivered directly from a coring drill or by a facility sample processing and delivery (SPAD) mechanism. SAM consists of an advanced GC/MS system and a laser desorption mass spectrometer (LDMS). The combined capabilities of these techniques can address Mars science objectives with much improved sensitivity, resolution, and analytical breadth over what has been previously possible in situ. The GC/MS system analyzes the bulk composition (both molecular and isotopic) of solid-phase and atmospheric samples. Solid samples are introduced with a highly flexible chemical derivatization/pyrolysis subsystem (Pyr/GC/MS) that is significantly more capable than the mass spectrometers on Viking. The LDMS analyzes local elemental and molecular composition in solid samples vaporized and ionized with a pulsed laser. We will describe how each of these capabilities has particular strengths that can achieve key measurement objectives at Mars. In addition, the close codevelopment of the GC/MS and LDMS along with a sample manipulation system enables the sharing of resources, the correlation of results, and the utilization of certain approaches that would not be possible with separate instruments. For instance, the same samples could be analyzed with more than one technique, increasing efficiency and providing cross-checks for quantification. There is also the possibility of combining methods, such as by permitting TOF-MS analyses of evolved gas (Pyr/EI-TOF-MS) or GC/MS analyses of laser evaporated gas (LD-GC/MS).

  1. A total internal reflection-fluorescence correlation spectroscopy setup with pulsed diode laser excitation

    NASA Astrophysics Data System (ADS)

    Weger, Lukas; Hoffmann-Jacobsen, Kerstin

    2017-09-01

    Fluorescence correlation spectroscopy (FCS) measures fluctuations in a (sub-)femtoliter volume to analyze the diffusive behavior of fluorescent particles. This highly sensitive method has proven useful for the analysis of dynamic biological systems as well as in chemistry, physics, and materials science. It is routinely performed with commercial fluorescence microscopes, which provide a confined observation volume by the confocal technique. The evanescent wave of total internal reflection (TIR) is used in home-built systems to permit surface-sensitive FCS analysis. We present a combined confocal and TIR-FCS setup that uses economical low-power pulsed diode lasers for excitation. Excitation and detection are coupled to time-correlated photon counting hardware. This allows simultaneous fluorescence lifetime and FCS measurements in a surface-sensitive mode. Moreover, the setup supports fluorescence lifetime correlation spectroscopy at surfaces. The excitation can be easily switched between TIR and epi-illumination to compare surface properties with those in the liquid bulk. The capabilities of the presented setup are demonstrated by measuring the diffusion coefficients of a free dye molecule, a labeled polyethylene glycol, and a fluorescent nanoparticle in confocal as well as in TIR-FCS mode.
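    For context, the standard confocal FCS model (textbook form, not specific to this setup) that such measurements fit in order to extract diffusion coefficients:

```latex
% Autocorrelation of fluorescence fluctuations for free 3-D diffusion:
\[
G(\tau) = \frac{\langle \delta F(t)\,\delta F(t+\tau)\rangle}{\langle F\rangle^{2}}
        = \frac{1}{N}\left(1+\frac{\tau}{\tau_D}\right)^{-1}
          \left(1+\frac{\tau}{\kappa^{2}\tau_D}\right)^{-1/2},
\]
% where N is the mean particle number in the observation volume, \tau_D the
% diffusion time, and \kappa the axial-to-lateral aspect ratio; the diffusion
% coefficient follows from D = w_0^{2}/(4\tau_D) with lateral beam waist w_0.
```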

  2. Evaluating practical vs. theoretical inspection system capability with a new programmed defect test mask designed for 3X and 4X technology nodes

    NASA Astrophysics Data System (ADS)

    Glasser, Joshua; Pratt, Tim

    2008-10-01

    Programmed defect test masks serve the useful purpose of evaluating inspection system sensitivity and capability. It is widely recognized that when evaluating inspection system capability, it is important to understand the actual sensitivity of the inspection system in production; yet unfortunately we have observed that many test masks are a more accurate judge of theoretical sensitivity than of real-world usable capability. Use of ineffective test masks leaves the purchaser of inspection equipment open to the risks of over-estimating the capability of their inspection solution and over-specifying defect sensitivity to their customers. This can result in catastrophic yield loss for device makers. In this paper we examine some of the lithography-related technology advances which place an increasing burden on mask inspection complexity, such as MEEF, defect printability estimation, aggressive OPC, double patterning, and OPC jogs. We evaluate the key inspection system components that contribute to successful mask inspection, including what can "go wrong" with these components. We designed and fabricated a test mask which both (a) more faithfully represents actual production use cases and (b) stresses the key components of the inspection system. The mask's patterns are representative of 32nm, 36nm, and 45nm logic, flash, and DRAM technology, with metal- and poly-like background patterns containing programmed defects. The design takes into consideration the requirements of advanced lithography, such as MEEF, defect printability, assist features, nearly-repetitive patterns, and data preparation. The mask is complex tritone and was designed for annular immersion lithography.

  3. Application of fractal and grey level co-occurrence matrix analysis in evaluation of brain corpus callosum and cingulum architecture.

    PubMed

    Pantic, Igor; Dacic, Sanja; Brkic, Predrag; Lavrnja, Irena; Pantic, Senka; Jovanovic, Tomislav; Pekovic, Sanja

    2014-10-01

    The aim of this study was to assess the discriminatory value of fractal and grey level co-occurrence matrix (GLCM) analysis methods in standard microscopy analysis of two histologically similar brain white matter regions that have different nerve fiber orientations. A total of 160 digital micrographs of thionine-stained rat brain white matter were acquired using a Pro-MicroScan DEM-200 instrument. Eighty micrographs from the anterior corpus callosum and eighty from the anterior cingulum areas of the brain were analyzed. The micrographs were evaluated using the National Institutes of Health ImageJ software and its plugins. For each micrograph, seven parameters were calculated: angular second moment, inverse difference moment, GLCM contrast, GLCM correlation, GLCM variance, fractal dimension, and lacunarity. Using receiver operating characteristic (ROC) analysis, the highest discriminatory value was determined for the inverse difference moment (IDM): the area under the ROC curve was 0.925, and for the criterion IDM≤0.610 the sensitivity and specificity were 82.5% and 87.5%, respectively. Most of the other parameters also showed good sensitivity and specificity. The results indicate that GLCM and fractal analysis methods, when applied together in brain histology analysis, are highly capable of discriminating white matter structures that have different axonal orientations.
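    A minimal sketch of how GLCM parameters of the kind named above can be computed, using scikit-image (version 0.19+ spelling) on a random stand-in image rather than the study's micrographs. Note that ImageJ's "inverse difference moment" corresponds to scikit-image's "homogeneity" property.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Random 8-bit image standing in for a thionine-stained micrograph.
img = (np.random.default_rng(1).random((256, 256)) * 255).astype(np.uint8)

# Co-occurrence matrix for pixel pairs at distance 1, horizontal direction.
glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

# Texture parameters; "homogeneity" is the inverse difference moment (IDM).
for prop in ("ASM", "homogeneity", "contrast", "correlation"):
    print(prop, float(graycoprops(glcm, prop)[0, 0]))
```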

  4. MEMS inertial sensors with integral rotation means.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohler, Stewart M.

    The state-of-the-art of inertial micro-sensors (gyroscopes and accelerometers) has advanced to the point where they are displacing the more traditional sensors in many size, power, and/or cost-sensitive applications. A factor limiting the range of application of inertial micro-sensors has been their relatively poor bias stability. The incorporation of an integral sensitive axis rotation capability would enable bias mitigation through proven techniques such as indexing, and foster the use of inertial micro-sensors in more accuracy-sensitive applications. Fabricating the integral rotation mechanism in MEMS technology would minimize the penalties associated with incorporation of this capability, and preserve the inherent advantages of inertial micro-sensors.
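    The bias-mitigation argument rests on the generic indexing identity (a standard result, not specific to this design): indexing the sensitive axis by 180° separates the bias term from the rate signal:

```latex
% Readings before and after a 180-degree index of the sensitive axis:
\[
m_{0} = \omega + b, \qquad m_{180} = -\omega + b,
\]
% so the rate \omega = (m_{0} - m_{180})/2 is free of the bias b, while
% b = (m_{0} + m_{180})/2 can be tracked and removed over time.
```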

  5. SU-E-I-85: Exploring the 18F-Fluorodeoxyglucose PET Characteristics in Staging of Esophageal Squamous Cell Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, C; Yin, Y

    2014-06-01

    Purpose: The aim of this study was to explore characteristics derived from 18F-fluorodeoxyglucose (18F-FDG) PET images and to assess their capacity for staging esophageal squamous cell carcinoma (ESCC). Methods: 26 patients with newly diagnosed ESCC who underwent 18F-FDG PET scans were included in this study. Different image-derived indices, including the standardized uptake value (SUV), gross tumor length, texture features, and shape features, were considered. Taking the histopathologic examination as the gold standard, the capacities of the extracted indices for staging ESCC were assessed by the Kruskal-Wallis test and the Mann-Whitney test. Specificity and sensitivity for each of the studied parameters were derived using receiver operating characteristic (ROC) curves. Results: 18F-FDG SUVmax and SUVmean showed statistically significant capability for AJCC and TNM staging. Texture features such as ENT and CORR were significant factors for N staging (p=0.040, p=0.029). Both FDG PET longitudinal length and the shape feature eccentricity (EC) (p≤0.010) provided more powerful stratification of primary ESCC AJCC and TNM stages than SUV and texture features. ROC curve analysis showed that tumor textural analysis can identify M stage with higher sensitivity than SUV measurement, but with lower sensitivity for T and N stages. Conclusion: The 18F-FDG image-derived characteristics of SUV, textural features, and shape features allow for good stratification of AJCC and TNM stage in ESCC patients.

  6. Improved Sensitivity for Molecular Detection of Bacterial and Candida Infections in Blood

    PubMed Central

    Bacconi, Andrea; Richmond, Gregory S.; Baroldi, Michelle A.; Laffler, Thomas G.; Blyn, Lawrence B.; Carolan, Heather E.; Frinder, Mark R.; Toleno, Donna M.; Metzgar, David; Gutierrez, Jose R.; Massire, Christian; Rounds, Megan; Kennel, Natalie J.; Rothman, Richard E.; Peterson, Stephen; Carroll, Karen C.; Wakefield, Teresa; Ecker, David J.

    2014-01-01

    The rapid identification of bacteria and fungi directly from the blood of patients with suspected bloodstream infections aids in diagnosis and guides treatment decisions. The development of an automated, rapid, and sensitive molecular technology capable of detecting the diverse agents of such infections at low titers has been challenging, due in part to the high background of genomic DNA in blood. PCR followed by electrospray ionization mass spectrometry (PCR/ESI-MS) allows for the rapid and accurate identification of microorganisms but with a sensitivity of about 50% compared to that of culture when using 1-ml whole-blood specimens. Here, we describe a new integrated specimen preparation technology that substantially improves the sensitivity of PCR/ESI-MS analysis. An efficient lysis method and automated DNA purification system were designed for processing 5 ml of whole blood. In addition, PCR amplification formulations were optimized to tolerate high levels of human DNA. An analysis of 331 specimens collected from patients with suspected bloodstream infections resulted in 35 PCR/ESI-MS-positive specimens (10.6%) compared to 18 positive by culture (5.4%). PCR/ESI-MS was 83% sensitive and 94% specific compared to culture. Replicate PCR/ESI-MS testing from a second aliquot of the PCR/ESI-MS-positive/culture-negative specimens corroborated the initial findings in most cases, resulting in increased sensitivity (91%) and specificity (99%) when confirmed detections were considered true positives. The integrated solution described here has the potential to provide rapid detection and identification of organisms responsible for bloodstream infections. PMID:24951806

  7. Wavenumber-frequency Spectra of Pressure Fluctuations Measured via Fast Response Pressure Sensitive Paint

    NASA Technical Reports Server (NTRS)

    Panda, J.; Roozeboom, N. H.; Ross, J. C.

    2016-01-01

    The recent advancement in fast-response Pressure-Sensitive Paint (PSP) allows time-resolved measurements of unsteady pressure fluctuations from a dense grid of spatial points on a wind tunnel model. This capability allows for direct calculation of the wavenumber-frequency (k-ω) spectrum of pressure fluctuations. Such data, useful for the vibro-acoustic analysis of aerospace vehicles, are difficult to obtain otherwise. For the present work, time histories of pressure fluctuations on a flat plate subjected to vortex shedding from a rectangular bluff body were measured using PSP. The light intensity levels in the photographic images were then converted to instantaneous pressure histories by applying calibration constants, which were calculated from a few dynamic pressure sensors placed at selected points on the plate. Fourier transform of the time histories from a large number of spatial points provided k-ω spectra of the pressure fluctuations. The data provide a first glimpse into the possibility of creating detailed forcing functions for vibro-acoustic analysis of aerospace vehicles, albeit over a limited frequency range.
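    A minimal sketch of the k-ω computation described above: a 2-D FFT of a space-time pressure record. The synthetic convecting wave below stands in for PSP time histories; the grid sizes, convection speed, and frequency are arbitrary assumptions.

```python
import numpy as np

# Synthetic space-time pressure record p(x, t): a wave convecting at speed Uc.
nx, nt, dx, dt = 64, 1024, 0.01, 1e-4            # grid and sampling (arbitrary)
x = np.arange(nx) * dx
t = np.arange(nt) * dt
Uc, f0 = 50.0, 500.0                             # convection speed (m/s), Hz
p = np.cos(2 * np.pi * f0 * (t[None, :] - x[:, None] / Uc))

# 2-D FFT over (x, t) gives the wavenumber-frequency spectrum.
spec = np.abs(np.fft.fftshift(np.fft.fft2(p))) ** 2
k = np.fft.fftshift(np.fft.fftfreq(nx, dx))      # cycles per metre
f = np.fft.fftshift(np.fft.fftfreq(nt, dt))      # Hz

# Energy concentrates on the convective ridge |f| = |k| * Uc; here the
# conjugate peaks sit at |k| = f0/Uc = 10 cyc/m and |f| = 500 Hz.
ki, fi = np.unravel_index(spec.argmax(), spec.shape)
print("peak at |k| =", abs(k[ki]), "cyc/m, |f| =", abs(f[fi]), "Hz")
```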

  8. Evaluation and Improvement of Liquid Propellant Rocket Chugging Analysis Techniques. Part 1: A One-Dimensional Analysis of Low Frequency Combustion Instability in the Fuel Preburner of the Space Shuttle Main Engine. Final Report M.S. Thesis - Aug. 1986

    NASA Technical Reports Server (NTRS)

    Lim, Kair Chuan

    1986-01-01

    Low frequency combustion instability, known as chugging, is consistently experienced during shutdown in the fuel and oxidizer preburners of the Space Shuttle Main Engines. Such problems always occur during the helium purge of the residual oxidizer from the preburner manifolds during the shutdown sequence. Possible causes and triggering mechanisms are analyzed and details of modeling the fuel preburner chug are presented. A linearized chugging model, built on the foundation of previous models and capable of predicting chug occurrence, is discussed, and the predicted results are presented and compared with experimental work performed by NASA. Sensitivity parameters such as chamber pressure, fuel and oxidizer temperatures, and the effective bulk modulus of the liquid oxidizer are considered in analyzing the fuel preburner chug. The computer program CHUGTEST is utilized to generate the stability boundary for each sensitivity study, and the region of stable operation is identified.

  9. A study of the stress wave factor technique for nondestructive evaluation of composite materials

    NASA Technical Reports Server (NTRS)

    Sarrafzadeh-Khoee, A.; Kiernan, M. T.; Duke, J. C., Jr.; Henneke, E. G., II

    1986-01-01

    The acousto-ultrasonic method of nondestructive evaluation is an extremely sensitive means of assessing material response. Efforts continue to complete the understanding of this method. In order to achieve the full sensitivity of the technique, extreme care must be taken in its performance. This report provides an update of the efforts to advance the understanding of this method and to increase its application to the nondestructive evaluation of composite materials. Included are descriptions of a novel optical system that is capable of measuring in-plane and out-of-plane displacements, an IBM PC-based data acquisition system, an extensive data analysis software package, the azimuthal variation of acousto-ultrasonic behavior in graphite/epoxy laminates, and preliminary examination of processing variation in graphite-aluminum tubes.

  10. Aeras: A next generation global atmosphere model

    DOE PAGES

    Spotz, William F.; Smith, Thomas M.; Demeshko, Irina P.; ...

    2015-06-01

    Sandia National Laboratories is developing a new global atmosphere model named Aeras that is performance portable and supports the quantification of uncertainties. These next-generation capabilities are enabled by building Aeras on top of Albany, a code base that supports the rapid development of scientific application codes while leveraging Sandia's foundational mathematics and computer science packages in Trilinos and Dakota. Embedded uncertainty quantification (UQ) is an original design capability of Albany, and performance portability is a recent upgrade. Other required features, such as shell-type elements, spectral elements, efficient explicit and semi-implicit time-stepping, transient sensitivity analysis, and concurrent ensembles, were not components of Albany as the project began, and have been (or are being) added by the Aeras team. We present early UQ and performance portability results for the shallow water equations.

  11. A New High-Speed, High-Cycle, Gear-Tooth Bending Fatigue Test Capability

    NASA Technical Reports Server (NTRS)

    Stringer, David B.; Dykas, Brian D.; LaBerge, Kelsen E.; Zakrajsek, Andrew J.; Handschuh, Robert F.

    2011-01-01

    A new high-speed test capability for determining the high-cycle bending-fatigue characteristics of gear teeth has been developed. Experiments were performed in the test facility using standard spur gear test specimens designed for use in NASA Glenn's drive system test facilities. These tests varied in load condition and cycle rate. The cycle rate varied from 50 to 1000 Hz. The loads varied from high-stress, low-cycle loads to near-infinite-life conditions. Over 100 tests were conducted using AISI 9310 steel spur gear specimens. These results were then compared to previous data in the literature for correlation. Additionally, a cycle-rate sensitivity analysis was conducted by grouping the results according to cycle rate and comparing the data sets. Methods used to study and verify the load path and facility dynamics are also discussed.

  12. SAFARI new and improved: extending the capabilities of SPICA's imaging spectrometer

    NASA Astrophysics Data System (ADS)

    Roelfsema, Peter; Giard, Martin; Najarro, Francisco; Wafelbakker, Kees; Jellema, Willem; Jackson, Brian; Sibthorpe, Bruce; Audard, Marc; Doi, Yasuo; di Giorgio, Anna; Griffin, Matthew; Helmich, Frank; Kamp, Inga; Kerschbaum, Franz; Meyer, Michael; Naylor, David; Onaka, Takashi; Poglitch, Albrecht; Spinoglio, Luigi; van der Tak, Floris; Vandenbussche, Bart

    2014-08-01

    The Japanese SPace Infrared telescope for Cosmology and Astrophysics, SPICA, aims to provide astronomers with a truly new window on the universe. With a large (3-meter-class), cold (~6 K) telescope, the mission provides a unique low-background environment optimally suited for highly sensitive instruments limited only by the cosmic background itself. SAFARI, the SpicA FAR-infrared Instrument, is a Fourier transform imaging spectrometer designed to fully exploit this extremely low far-infrared background environment. The SAFARI consortium, comprised of European and Canadian institutes, has established an instrument reference design based on a Mach-Zehnder interferometer stage with outputs directed to three extremely sensitive Transition Edge Sensor arrays covering the 35 to 210 μm domain. The baseline instrument provides R > 1000 spectroscopic imaging capabilities over a 2' by 2' field of view. A number of modifications to the instrument to extend its capabilities are under investigation. With the reference design, SAFARI's sensitivity for many objects is limited not only by the detector NEP but also by the level of broad-band background radiation: the zodiacal light at the shorter wavelengths and satellite baffle structures at the longer wavelengths. Options to reduce this background are dedicated masks or dispersive elements which can be inserted in the optics as required. The resulting increase in sensitivity can directly enhance the prime science goals of SAFARI; with the expected enhanced sensitivity, astronomers would be in a better position to study thousands of galaxies out to redshift 3 and even many hundreds out to redshifts of 5 or 6. Possibilities to increase the wavelength resolution, at least for the shorter wavelength bands, are being investigated, as this would significantly enhance SAFARI's capabilities to study star and planet formation in our own galaxy.

  13. Digital Analysis and Sorting of Fluorescence Lifetime by Flow Cytometry

    PubMed Central

    Houston, Jessica P.; Naivar, Mark A.; Freyer, James P.

    2010-01-01

    Frequency-domain flow cytometry techniques are combined with modifications to the digital signal processing capabilities of the Open Reconfigurable Cytometric Acquisition System (ORCAS) to analyze fluorescence decay lifetimes and control sorting. Real-time fluorescence lifetime analysis is accomplished by rapidly digitizing correlated, radiofrequency modulated detector signals, implementing Fourier analysis programming with ORCAS’ digital signal processor (DSP) and converting the processed data into standard cytometric list mode data. To systematically test the capabilities of the ORCAS 50 MS/sec analog-to-digital converter (ADC) and our DSP programming, an error analysis was performed using simulated light scatter and fluorescence waveforms (0.5–25 ns simulated lifetime), pulse widths ranging from 2 to 15 µs, and modulation frequencies from 2.5 to 16.667 MHz. The standard deviations of digitally acquired lifetime values ranged from 0.112 to >2 ns, corresponding to errors in actual phase shifts from 0.0142° to 1.6°. The lowest coefficients of variation (<1%) were found for 10-MHz modulated waveforms having pulse widths of 6 µs and simulated lifetimes of 4 ns. Direct comparison of the digital analysis system to a previous analog phase-sensitive flow cytometer demonstrated similar precision and accuracy on measurements of a range of fluorescent microspheres, unstained cells and cells stained with three common fluorophores. Sorting based on fluorescence lifetime was accomplished by adding analog outputs to ORCAS and interfacing with a commercial cell sorter with a radiofrequency modulated solid-state laser. Two populations of fluorescent microspheres with overlapping fluorescence intensities but different lifetimes (2 and 7 ns) were separated to ~98% purity. Overall, the digital signal acquisition and processing methods we introduce present a simple yet robust approach to phase-sensitive measurements in flow cytometry. The ability to simply and inexpensively implement this system on a commercial flow sorter will both allow better dissemination of this technology and better exploit the traditionally underutilized parameter of fluorescence lifetime. PMID:20662090
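    For reference, the standard frequency-domain relation (textbook, not taken from this paper) linking the measured phase shift to the fluorescence lifetime:

```latex
% Single-exponential decay under sinusoidal excitation at frequency f:
\[
\tau = \frac{\tan\phi}{2\pi f},
\]
% e.g., at f = 10\,\mathrm{MHz}, a 4 ns lifetime gives
% \phi = \arctan(2\pi \cdot 10^{7} \cdot 4\times 10^{-9}) \approx 14.1^{\circ},
% consistent with the sub-degree phase precision quoted above being needed
% to resolve sub-nanosecond lifetime differences.
```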

  14. Detection of flat colorectal polyps at screening CT colonography in comparison with conventional polypoid lesions.

    PubMed

    Sakamoto, Takashi; Mitsuzaki, Katsuhiko; Utsunomiya, Daisuke; Matsuda, Katsuhiko; Yamamura, Sadahiro; Urata, Joji; Kawakami, Megumi; Yamashita, Yasuyuki

    2012-09-01

    Although the screening of small, flat polyps is clinically important, the role of CT colonography (CTC) screening in their detection has not been thoroughly investigated. The purpose of this study was to evaluate the detection capability and usefulness of CTC in screening for flat and polypoid lesions, using optical colonoscopy findings as the gold standard. We evaluated the CTC detection capability for flat colorectal polyps (lesions with a flat surface and a height not exceeding 3 mm; n = 42) by comparison with conventional polypoid lesions (n = 418), stratified by polyp diameter. Four types of reconstruction images, including multiplanar reconstruction, volume rendering, virtual gross pathology, and virtual endoscopic images, were used for visual analysis. We compared the abilities of the four reconstructions for polyp visualization. Detection sensitivity for flat polyps was 31.3%, 44.4%, and 87.5% for lesions measuring 2-3 mm, 4-5 mm, and ≥6 mm, respectively; the corresponding sensitivities for polypoid lesions were 47.6%, 79.0%, and 91.7%. The overall sensitivity for flat lesions (47.6%) was significantly lower than that for polypoid lesions (64.1%). Virtual endoscopic imaging showed the best visualization among the four reconstructions. Colon cancers were detected in eight patients by optical colonoscopy, and CTC detected the colon cancers in all eight patients. CTC using 64-row multidetector CT is useful in colon cancer screening for detecting colorectal polyps, although the detection of small, flat lesions remains challenging.

  15. GAT: a graph-theoretical analysis toolbox for analyzing between-group differences in large-scale structural and functional brain networks.

    PubMed

    Hosseini, S M Hadi; Hoeft, Fumiko; Kesler, Shelli R

    2012-01-01

    In recent years, graph theoretical analyses of neuroimaging data have increased our understanding of the organization of large-scale structural and functional brain networks. However, tools for pipeline application of graph theory to the analysis of brain network topology are still lacking. In this report, we describe the development of a graph-analysis toolbox (GAT) that facilitates analysis and comparison of structural and functional brain networks. GAT provides a graphical user interface (GUI) that facilitates construction and analysis of brain networks, comparison of regional and global topological properties between networks, analysis of network hubs and modules, and analysis of the resilience of the networks to random failure and targeted attacks. Area under the curve (AUC) and functional data analyses (FDA), in conjunction with permutation testing, are employed for testing differences in network topologies; these analyses are less sensitive to the thresholding process. We demonstrated the capabilities of GAT by investigating differences in the organization of regional gray-matter correlation networks in survivors of acute lymphoblastic leukemia (ALL) and healthy matched controls (CON). The results revealed an alteration in small-world characteristics of the brain networks in the ALL survivors, an observation that confirms our hypothesis of widespread neurobiological injury in ALL survivors. Along with demonstrating the capabilities of GAT, this is the first report of altered large-scale structural brain networks in ALL survivors.
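    As a small illustration of the global metrics such toolboxes compare between groups, the sketch below computes clustering and characteristic path length with networkx on a random graph standing in for a thresholded correlation network; it is not GAT code.

```python
import networkx as nx

# Random graph standing in for a thresholded gray-matter correlation network
# with 90 "brain regions" and 10% connection density.
G = nx.erdos_renyi_graph(n=90, p=0.1, seed=1)

# Path length is only defined on a connected graph; keep the largest component.
if not nx.is_connected(G):
    G = G.subgraph(max(nx.connected_components(G), key=len)).copy()

C = nx.average_clustering(G)              # mean clustering coefficient
L = nx.average_shortest_path_length(G)    # characteristic path length
print(f"clustering C = {C:.3f}, path length L = {L:.3f}")
# A small-world index compares C and L against a matched random graph;
# values above 1 indicate small-world organization.
```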

  16. Can hip abduction and external rotation discriminate sacroiliac joint pain?

    PubMed

    Adhia, Divya Bharatkumar; Tumilty, Steve; Mani, Ramakrishnan; Milosavljevic, Stephan; Bussey, Melanie D

    2016-02-01

    The primary aim of the study is to determine whether the Hip Abduction and External Rotation (HABER) test is capable of reproducing familiar pain in individuals with low back pain (LBP) of sacroiliac joint (SIJ) origin (SIJ-positive) when compared with LBP of non-SIJ origin (SIJ-negative). If so, the secondary aim is to determine the diagnostic accuracy of the HABER test against the reference standard of pain provocation tests, and to determine which increments of the HABER test have the highest sensitivity and specificity for identifying SIJ-positive individuals. Single-blinded diagnostic accuracy study. Participants (n = 122) between the ages of 18 and 50 years, suffering from chronic non-specific LBP (≥3 months), volunteered for the study. An experienced musculoskeletal physiotherapist evaluated and classified participants as either SIJ-positive (n = 45) or SIJ-negative (n = 77), based on the reference standard of pain provocation tests (≥3 positive tests = SIJ-positive). Another musculoskeletal physiotherapist, blinded to the clinical groups, evaluated participants for reproduction of familiar pain during each increment (10°, 20°, 30°, 40°, and 50°) of the HABER test. The HABER test reproduced familiar pain in SIJ-positive individuals when compared with SIJ-negative individuals (p = 0.001, R² = 0.38, Exp(β) = 5.95-10.32) and demonstrated moderate sensitivity (67%-78%) and specificity (71%-72%) for identifying SIJ-positive individuals. Receiver operating characteristic curve analysis demonstrated that HABER increments of ≥30° have the highest sensitivity (83%-100%) and specificity (52%-64%). The HABER test is capable of reproducing familiar pain in SIJ-positive LBP individuals and has moderate sensitivity and specificity for identifying them. Copyright © 2015 Elsevier Ltd. All rights reserved.
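    For readers who want the arithmetic behind such accuracy figures, here is a minimal computation of sensitivity and specificity from a hypothetical 2x2 table; the counts are invented and are not the study's data.

```python
# Hypothetical 2x2 diagnostic table: test result vs. the reference standard
# (pain provocation tests). All counts are made up for illustration.
tp, fn = 35, 10    # reference-positive participants: detected / missed
tn, fp = 55, 22    # reference-negative participants: cleared / false alarms
sensitivity = tp / (tp + fn)   # proportion of true positives detected
specificity = tn / (tn + fp)   # proportion of true negatives cleared
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```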

  17. Analysis of Microflares from the Second Sounding Rocket Flight of the Focusing Optics X-ray Solar Imager (FOXSI-2)

    NASA Astrophysics Data System (ADS)

    Vievering, J. T.; Glesener, L.; Krucker, S.; Christe, S.; Buitrago-Casas, J. C.; Ishikawa, S. N.; Ramsey, B.; Takahashi, T.; Watanabe, S.

    2016-12-01

    Observations of the sun in hard x-rays can provide insight into many solar phenomena which are not currently well-understood, including the mechanisms behind particle acceleration in flares. Currently, RHESSI is the only solar-dedicated spacecraft observing in the hard x-ray regime. Though RHESSI has greatly added to our knowledge of flare particle acceleration, the method of rotation modulation collimators is limited in sensitivity and dynamic range. By instead using a direct imaging technique, the structure and evolution of even small flares and active regions can be investigated in greater depth. FOXSI (Focusing Optics X-ray Solar Imager), a hard x-ray instrument flown on two sounding rocket campaigns, seeks to achieve these improved capabilities by using focusing optics for solar observations in the 4-20 keV range. During the second of the FOXSI flights, flown on December 11, 2014, two microflares were observed, estimated as GOES class A0.5 and A2.5 (upper limits). Preliminary analysis of these two flares will be presented, including imaging spectroscopy, light curves, and photon spectra. Through this analysis, we investigate the capabilities of FOXSI in enhancing our knowledge of smaller-scale solar events.

  18. Serum biomarkers of Burkholderia mallei infection elucidated by proteomic imaging of skin and lung abscesses.

    PubMed

    Glaros, Trevor G; Blancett, Candace D; Bell, Todd M; Natesan, Mohan; Ulrich, Robert G

    2015-01-01

    The bacterium Burkholderia mallei is the etiological agent of glanders, a highly contagious, often fatal zoonotic infectious disease that is also a biodefense concern. Clinical laboratory assays that analyze blood or other biological fluids are the highest priority because these specimens can be collected with minimal risk to the patient. However, progress in developing sensitive assays for monitoring B. mallei infection is hampered by a shortage of useful biomarkers. Reasoning that there should be a strong correlation between the proteomes of infected tissues and circulating serum, we employed imaging mass spectrometry (IMS) of thin-sectioned tissues from Chlorocebus aethiops (African green) monkeys infected with B. mallei to localize host and pathogen proteins that were associated with abscesses. Using laser-capture microdissection of specific regions identified by IMS and histology within the tissue sections, a more extensive proteomic analysis was performed by a technique that combined the physical separation capabilities of liquid chromatography (LC) with the sensitive mass analysis capabilities of mass spectrometry (LC-MS/MS). By examining standard formalin-fixed, paraffin-embedded tissue sections, this strategy resulted in the identification of several proteins that were associated with lung and skin abscesses, including the host protein calprotectin and the pathogen protein GroEL. Elevated levels of calprotectin detected by ELISA and antibody responses to GroEL, measured by a microarray of the bacterial proteome, were subsequently detected in the sera of C. aethiops, Macaca mulatta, and Macaca fascicularis primates infected with B. mallei. Our results demonstrate that a combination of multidimensional MS analysis of traditional histology specimens with high-content protein microarrays can be used to discover lead pairs of host-pathogen biomarkers of infection that are identifiable in biological fluids.

  19. An Integrated Modeling Framework Forecasting Ecosystem Exposure-- A Systems Approach to the Cumulative Impacts of Multiple Stressors

    NASA Astrophysics Data System (ADS)

    Johnston, J. M.

    2013-12-01

    Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM), which automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES), which manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE), which provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water Quality Analysis Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.

  20. A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images

    PubMed Central

    Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.

    1986-01-01

    The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capabilities to produce the images themselves. This is rather an ironic paradox, since on the one hand the new 3-D and 4-D imaging capabilities promise significant potential for providing greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever possible before, but on the other hand, the momentous advances in computer and associated electronic imaging technology which have made these 3-D imaging capabilities possible have not been concomitantly developed for full exploitation of these capabilities. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigations and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation to which all the information in a large 3-D image data base is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation.

  1. Reprint Of: Enhanced spatially-resolved trace analysis using combined SIMS-single-stage AMS

    NASA Astrophysics Data System (ADS)

    Grabowski, K. S.; Groopman, E. E.; Fahey, A. J.

    2018-01-01

    Secondary ion mass spectrometry (SIMS) provides spatially resolved trace analysis of solid materials, but can be complicated by unresolved abundant molecular isobars. By adding a 300-kV single-stage accelerator mass spectrometer (SSAMS) as a detector for a Cameca ims 4f SIMS, one can measure more abundant positive ions from the SIMS while removing molecular isobars, thus improving very low abundance trace element and isotope analysis. This paper describes important features and capabilities of such an integrated system at the Naval Research Laboratory using charge state +1 ions. Transmission loss is compared to molecule destruction as gas flow to the molecule-destruction cell increases. As most measurements tolerate more modest abundance sensitivities than 14C analysis requires, a lower gas flow is acceptable, so good transmission of 20-50% for ions of interest can be maintained for a broad range of ion masses. This new instrument has measured isotope ratios for uranium, lead, rare earths, and other elements from particulates and localized regions, with molecule destruction enabling measurement at low SIMS mass resolving power and thus high transmission, as examples will show. This unique new instrument provides improved capabilities for applications in nuclear and other forensics, geochemistry, cosmochemistry, and the development of optical, electronic, multifunctional, and structural materials.

  2. Enhanced spatially-resolved trace analysis using combined SIMS-single-stage AMS

    NASA Astrophysics Data System (ADS)

    Grabowski, K. S.; Groopman, E. E.; Fahey, A. J.

    2017-11-01

    Secondary ion mass spectrometry (SIMS) provides spatially resolved trace analysis of solid materials, but can be complicated by unresolved abundant molecular isobars. By adding a 300-kV single-stage accelerator mass spectrometer (SSAMS) as a detector for a Cameca ims 4f SIMS, one can measure more abundant positive ions from the SIMS while removing molecular isobars, thus improving very low abundance trace element and isotope analysis. This paper describes important features and capabilities of such an integrated system at the Naval Research Laboratory using charge state +1 ions. Transmission loss is compared to molecule destruction as gas flow to the molecule-destruction cell increases. As most measurements tolerate more modest abundance sensitivities than 14C analysis requires, a lower gas flow is acceptable, so good transmission of 20-50% for ions of interest can be maintained for a broad range of ion masses. This new instrument has measured isotope ratios for uranium, lead, rare earths, and other elements from particulates and localized regions, with molecule destruction enabling measurement at low SIMS mass resolving power and thus high transmission, as examples will show. This unique new instrument provides improved capabilities for applications in nuclear and other forensics, geochemistry, cosmochemistry, and the development of optical, electronic, multifunctional, and structural materials.

  3. An interferometric imaging biosensor using weighted spectrum analysis to confirm DNA monolayer films with attogram sensitivity.

    PubMed

    Fu, Rongxin; Li, Qi; Wang, Ruliang; Xue, Ning; Lin, Xue; Su, Ya; Jiang, Kai; Jin, Xiangyu; Lin, Rongzan; Gan, Wupeng; Lu, Ying; Huang, Guoliang

    2018-05-01

    Interferometric imaging biosensors are powerful and convenient tools for confirming the existence of DNA monolayer films on silicon microarray platforms. However, their accuracy and sensitivity need further improvement because DNA molecules contribute only an inconspicuous interferometric signal in both thickness and size. Such weaknesses result in poor performance of these biosensors for low DNA content analyses and point mutation tests. In this paper, an interferometric imaging biosensor with weighted spectrum analysis is presented to confirm DNA monolayer films. The interferometric signal of DNA molecules can be extracted, and quantitative detection results for DNA microarrays can then be reconstructed. With the proposed strategy, the relative error of thickness detection was reduced from 88.94% to merely 4.15%. The mass sensitivity per unit area of the proposed biosensor reached 20 attograms (ag). Therefore, the sample consumption per unit area of the target DNA content was only 62.5 zeptomoles (zmol), in a volume of 0.25 picolitres (pL). Compared with fluorescence resonance energy transfer (FRET), the measurement accuracy of the interferometric imaging biosensor with weighted spectrum analysis is insensitive to changes in spotting concentration and DNA length. The detection range was more than 1 µm. Moreover, single-nucleotide mismatches could be identified when combined with specific DNA ligation. A mutation experiment for lung cancer detection proved the high selectivity and accurate analysis capability of the presented biosensor.

  4. Detection and Characterization of Low Temperature Peat Fires during the 2015 Fire Catastrophe in Indonesia Using a New High-Sensitivity Fire Monitoring Satellite Sensor (FireBird)

    PubMed Central

    Atwood, Elizabeth C.; Englhart, Sandra; Lorenz, Eckehard; Halle, Winfried; Wiedemann, Werner; Siegert, Florian

    2016-01-01

    Vast and disastrous fires occurred on Borneo during the 2015 dry season, pushing Indonesia into the top five carbon emitting countries. The region was affected by a very strong El Niño-Southern Oscillation (ENSO) climate phenomenon, on par with the last severe event in 1997/98. Fire dynamics in Central Kalimantan were investigated using an innovative sensor offering higher sensitivity to a wider range of fire intensities at a finer spatial resolution (160 m) than heretofore available. The sensor is onboard the TET-1 satellite, part of the German Aerospace Center (DLR) FireBird mission. TET-1 images (acquired every 2–3 days) from the middle infrared were used to detect fires continuously burning for almost three weeks in the protected peatlands of Sebangau National Park as well as surrounding areas with active logging and oil palm concessions. TET-1 detection capabilities were compared with MODIS active fire detection and Landsat burned area algorithms. Fire dynamics, including fire front propagation speed and area burned, were investigated. We show that TET-1 has improved detection capabilities over MODIS in monitoring low-intensity peatland fire fronts through thick smoke and haze. Analysis of fire dynamics revealed that the largest burned areas resulted from fire front lines started from multiple locations, and the highest propagation speeds were in excess of 500 m/day (all over peat > 2 m deep). Fires were found to occur most often in concessions that contained drainage infrastructure but were not cleared prior to the fire season. Benefits of implementing this sensor system to improve current fire management techniques are discussed. Near real-time fire detection together with enhanced fire behavior monitoring capabilities would not only improve firefighting efforts, but also benefit analysis of fire impact on tropical peatlands, greenhouse gas emission estimations as well as mitigation measures to reduce severe fire events in the future. PMID:27486664

  5. Detection and Characterization of Low Temperature Peat Fires during the 2015 Fire Catastrophe in Indonesia Using a New High-Sensitivity Fire Monitoring Satellite Sensor (FireBird).

    PubMed

    Atwood, Elizabeth C; Englhart, Sandra; Lorenz, Eckehard; Halle, Winfried; Wiedemann, Werner; Siegert, Florian

    2016-01-01

    Vast and disastrous fires occurred on Borneo during the 2015 dry season, pushing Indonesia into the top five carbon emitting countries. The region was affected by a very strong El Niño-Southern Oscillation (ENSO) climate phenomenon, on par with the last severe event in 1997/98. Fire dynamics in Central Kalimantan were investigated using an innovative sensor offering higher sensitivity to a wider range of fire intensities at a finer spatial resolution (160 m) than heretofore available. The sensor is onboard the TET-1 satellite, part of the German Aerospace Center (DLR) FireBird mission. TET-1 images (acquired every 2-3 days) from the middle infrared were used to detect fires continuously burning for almost three weeks in the protected peatlands of Sebangau National Park as well as surrounding areas with active logging and oil palm concessions. TET-1 detection capabilities were compared with MODIS active fire detection and Landsat burned area algorithms. Fire dynamics, including fire front propagation speed and area burned, were investigated. We show that TET-1 has improved detection capabilities over MODIS in monitoring low-intensity peatland fire fronts through thick smoke and haze. Analysis of fire dynamics revealed that the largest burned areas resulted from fire front lines started from multiple locations, and the highest propagation speeds were in excess of 500 m/day (all over peat > 2 m deep). Fires were found to occur most often in concessions that contained drainage infrastructure but were not cleared prior to the fire season. Benefits of implementing this sensor system to improve current fire management techniques are discussed. Near real-time fire detection together with enhanced fire behavior monitoring capabilities would not only improve firefighting efforts, but also benefit analysis of fire impact on tropical peatlands, greenhouse gas emission estimations as well as mitigation measures to reduce severe fire events in the future.

  6. Novel selective TOCSY method enables NMR spectral elucidation of metabolomic mixtures

    NASA Astrophysics Data System (ADS)

    MacKinnon, Neil; While, Peter T.; Korvink, Jan G.

    2016-11-01

    Complex mixture analysis is routinely encountered in NMR-based investigations. With the aim of component identification, spectral complexity may be addressed chromatographically or spectroscopically, the latter being favored to reduce sample handling requirements. An attractive experiment is selective total correlation spectroscopy (sel-TOCSY), which is capable of providing tremendous spectral simplification and thereby enhancing assignment capability. Unfortunately, isolating a well resolved resonance is increasingly difficult as the complexity of the mixture increases, and the assumption of single spin system excitation is no longer robust. We present TOCSY optimized mixture elucidation (TOOMIXED), a technique capable of performing spectral assignment particularly in the case where the assumption of single spin system excitation is relaxed. Key to the technique is the collection of a series of 1D sel-TOCSY experiments as a function of the isotropic mixing time (τm), resulting in a series of resonance intensities indicative of the underlying molecular structure. By comparing these τm-dependent intensity patterns with a library of pre-determined component spectra, one is able to regain assignment capability. After consideration of the technique's robustness, we first tested TOOMIXED on a model mixture. As a benchmark, we were able to assign a molecule with high confidence in the case of selectively exciting an isolated resonance. Assignment confidence was not compromised when performing TOOMIXED on a resonance known to contain multiple overlapping signals, and in the worst case the method suggested a follow-up sel-TOCSY experiment to confirm an ambiguous assignment. TOOMIXED was then demonstrated on two realistic samples (whisky and urine), where under our conditions an approximate limit of detection of 0.6 mM was determined. Taking into account literature reports for the sel-TOCSY limit of detection, the technique should reach on the order of 10 μM sensitivity. We anticipate this technique will be highly attractive to various analytical fields facing mixture analysis, including metabolomics, foodstuff analysis, pharmaceutical analysis, and forensics.
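
    The comparison of measured τm-dependent intensity patterns against a library of pre-determined component patterns can be viewed as a non-negative least-squares fit. The Python sketch below illustrates that step with a synthetic two-component library; the numbers are invented, and this is not the authors' TOOMIXED implementation.

      import numpy as np
      from scipy.optimize import nnls

      tau_m = np.array([0.02, 0.04, 0.06, 0.08, 0.10])  # isotropic mixing times (s)

      # Library: each column is one candidate component's intensity-vs-tau_m pattern.
      library = np.array([
          [0.10, 0.35, 0.55, 0.60, 0.58],   # component A
          [0.40, 0.45, 0.42, 0.35, 0.30],   # component B
      ]).T

      # Measured series from a selective excitation that hit overlapping spins.
      measured = 0.7 * library[:, 0] + 0.3 * library[:, 1]

      weights, residual = nnls(library, measured)
      print(weights, residual)              # -> approx [0.7, 0.3], ~0 residual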

  7. Laser dissection sampling modes for direct mass spectral analysis [using a hybrid optical microscopy/laser ablation liquid vortex capture/electrospray ionization system

    DOE PAGES

    Cahill, John F.; Kertesz, Vilmos; Van Berkel, Gary J.

    2016-02-01

    Here, laser microdissection coupled directly with mass spectrometry provides the capability of on-line analysis of substrates with high spatial resolution, high collection efficiency, and freedom in the shape and size of the sampling area. Establishing the merits and capabilities of the different sampling modes that the system provides is necessary in order to select the best sampling mode for characterizing analytically challenging samples. The capabilities of laser ablation spot sampling, laser ablation raster sampling, and laser 'cut and drop' sampling modes of a hybrid optical microscopy/laser ablation liquid vortex capture electrospray ionization mass spectrometry system were compared for the analysis of single cells and tissue. Single Chlamydomonas reinhardtii cells were monitored for their monogalactosyldiacylglycerol (MGDG) and diacylglyceryltrimethylhomo-Ser (DGTS) lipid content using the laser spot sampling mode, which was capable of ablating individual cells (4-15 µm) even when agglomerated together. Turbid Allium Cepa cells (150 µm) having unique shapes difficult to precisely measure using the other sampling modes could be ablated in their entirety using laser raster sampling. Intact microdissections of specific regions of a cocaine-dosed mouse brain tissue were compared using laser 'cut and drop' sampling. Since in laser 'cut and drop' sampling whole and otherwise unmodified sections are captured into the probe, 100% collection efficiencies were achieved. In conclusion, laser ablation spot sampling has the highest spatial resolution of the sampling modes, useful in this case for the analysis of single cells. Laser ablation raster sampling has the highest sampling area adaptability and was best for sampling regions with unique shapes that are difficult to measure using other sampling modes. Laser 'cut and drop' sampling can be used for cases where the highest sensitivity is needed, for example, monitoring drugs present in trace amounts in tissue.

  8. Laser dissection sampling modes for direct mass spectral analysis [using a hybrid optical microscopy/laser ablation liquid vortex capture/electrospray ionization system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cahill, John F.; Kertesz, Vilmos; Van Berkel, Gary J.

    Here, laser microdissection coupled directly with mass spectrometry provides the capability of on-line analysis of substrates with high spatial resolution, high collection efficiency, and freedom in the shape and size of the sampling area. Establishing the merits and capabilities of the different sampling modes that the system provides is necessary in order to select the best sampling mode for characterizing analytically challenging samples. The capabilities of laser ablation spot sampling, laser ablation raster sampling, and laser 'cut and drop' sampling modes of a hybrid optical microscopy/laser ablation liquid vortex capture electrospray ionization mass spectrometry system were compared for the analysis of single cells and tissue. Single Chlamydomonas reinhardtii cells were monitored for their monogalactosyldiacylglycerol (MGDG) and diacylglyceryltrimethylhomo-Ser (DGTS) lipid content using the laser spot sampling mode, which was capable of ablating individual cells (4-15 µm) even when agglomerated together. Turbid Allium Cepa cells (150 µm) having unique shapes difficult to precisely measure using the other sampling modes could be ablated in their entirety using laser raster sampling. Intact microdissections of specific regions of a cocaine-dosed mouse brain tissue were compared using laser 'cut and drop' sampling. Since in laser 'cut and drop' sampling whole and otherwise unmodified sections are captured into the probe, 100% collection efficiencies were achieved. In conclusion, laser ablation spot sampling has the highest spatial resolution of the sampling modes, useful in this case for the analysis of single cells. Laser ablation raster sampling has the highest sampling area adaptability and was best for sampling regions with unique shapes that are difficult to measure using other sampling modes. Laser 'cut and drop' sampling can be used for cases where the highest sensitivity is needed, for example, monitoring drugs present in trace amounts in tissue.

  9. Highly Stretchable and Transparent Thermistor Based on Self-Healing Double Network Hydrogel.

    PubMed

    Wu, Jin; Han, Songjia; Yang, Tengzhou; Li, Zhong; Wu, Zixuan; Gui, Xuchun; Tao, Kai; Miao, Jianmin; Norford, Leslie K; Liu, Chuan; Huo, Fengwei

    2018-06-06

    An ultrastretchable thermistor that combines intrinsic stretchability, thermal sensitivity, transparency, and self-healing capability is fabricated. It is found that the polyacrylamide/carrageenan double network (DN) hydrogel is highly sensitive to temperature and can therefore be exploited as a novel channel material for a thermistor. This thermistor can be stretched from 0 to 330% strain, with a sensitivity as high as 2.6%/°C even at an extreme 200% strain. Noticeably, the mechanical, electrical, and thermal sensing properties of the DN hydrogel can be self-healed, analogous to the self-healing capability of human skin. Large mechanical deformations, such as flexion and twisting at large angles, do not affect the thermal sensitivity. Good flexibility enables the thermistor to be attached to nonplanar curvilinear surfaces for practical temperature detection. Remarkably, the thermal sensitivity can be improved by introducing mechanical strain, making the sensitivity programmable. This thermistor with tunable sensitivity is advantageous over traditional rigid thermistors that lack flexibility in adjusting their sensitivity. In addition to superior sensitivity and stretchability compared with traditional thermistors, this DN hydrogel-based thermistor provides the additional advantages of good transparency and self-healing ability, enabling it to be potentially integrated in soft robots to grasp real-world information for guiding their actions.

  10. Diagnosis of aphasia in stroke populations: A systematic review of language tests

    PubMed Central

    2018-01-01

    Background and purpose: Accurate aphasia diagnosis is important in stroke care. A wide range of language tests are available and include informal assessments, tests developed by healthcare institutions and commercially published tests available for purchase in pre-packaged kits. The psychometrics of these tests are often reported online or within the purchased test manuals, not the peer-reviewed literature; therefore, the diagnostic capabilities of these measures have not been systematically evaluated. This review aimed to identify both commercial and non-commercial language tests and tests used in stroke care and to examine the diagnostic capabilities of all identified measures in diagnosing aphasia in stroke populations. Methods: Language tests were identified through a systematic search of 161 publisher databases, professional and resource websites and language tests reported to be used in stroke care. Two independent reviewers evaluated test manuals or associated resources for cohort or cross-sectional studies reporting the tests’ diagnostic capabilities (sensitivity, specificity, likelihood ratios or diagnostic odds ratios) in differentiating aphasic and non-aphasic stroke populations. Results: Fifty-six tests met the study eligibility criteria. Six “non-specialist” brief screening tests reported sensitivity and specificity information; however, none of these measures was reported to meet the specific diagnostic needs of speech pathologists. The 50 remaining measures either did not report validity data (n = 7); did not compare patient test performance with a comparison group (n = 17); included non-stroke participants within their samples (n = 23); or did not compare stroke patient performance against a language reference standard (n = 3). Diagnostic sensitivity analysis was completed for six speech pathology measures (WAB, PICA, CADL-2, ASHA-FACS, Adult FAVRES and EFA-4); however, all studies compared aphasic performance with that of non-stroke healthy controls and were consequently excluded from the review. Conclusions: No speech pathology test was found which reported diagnostic data for identifying aphasia in stroke populations. A diagnostically validated post-stroke aphasia test is needed. PMID:29566043

  11. A Comparative Analysis of Coprologic Diagnostic Methods for Detection of Toxoplasma gondii in Cats

    PubMed Central

    Salant, Harold; Spira, Dan T.; Hamburger, Joseph

    2010-01-01

    The relative role of transmission of Toxoplasma gondii infection from cats to humans appears to have recently increased in certain areas. Large-scale screening of oocyst shedding in cats cannot rely on microscopy, because oocyst identification lacks sensitivity and specificity, or on bioassays, which require test animals and weeks before examination. We compared a sensitive and species-specific coprologic polymerase chain reaction (copro-PCR) for detection of T. gondii-infected cats with microscopy and a bioassay. In experimentally infected cats followed over time, microscopy was positive only occasionally, whereas positive copro-PCR and bioassay results were obtained continuously from days 2 to 24 post-infection. The copro-PCR is at least as sensitive and specific as the bioassay and is capable of detecting infective oocysts during cat infection. Therefore, this procedure can be used as the new gold standard for determining potential cat infectivity. Its technological advantages over the bioassay make it superior for large-scale screening of cats. PMID:20439968

  12. Some Advanced Concepts in Discrete Aerodynamic Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Green, Lawrence L.; Newman, Perry A.; Putko, Michele M.

    2003-01-01

    An efficient incremental iterative approach for differentiating advanced flow codes is successfully demonstrated on a two-dimensional inviscid model problem. The method employs the reverse-mode capability of the automatic differentiation software tool ADIFOR 3.0 and is proven to yield accurate first-order aerodynamic sensitivity derivatives. A substantial reduction in CPU time and computer memory is demonstrated in comparison with results from a straightforward, black-box reverse-mode application of ADIFOR 3.0 to the same flow code. An ADIFOR-assisted procedure for accurate second-order aerodynamic sensitivity derivatives is successfully verified on an inviscid transonic lifting airfoil example problem. The method requires that first-order derivatives be calculated first using both the forward (direct) and reverse (adjoint) procedures; then, a very efficient noniterative calculation of all second-order derivatives can be accomplished. Accurate second derivatives (i.e., the complete Hessian matrices) of lift, wave drag, and pitching-moment coefficients are calculated with respect to geometric shape, angle of attack, and freestream Mach number.

  13. Some Advanced Concepts in Discrete Aerodynamic Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Green, Lawrence L.; Newman, Perry A.; Putko, Michele M.

    2001-01-01

    An efficient incremental-iterative approach for differentiating advanced flow codes is successfully demonstrated on a 2D inviscid model problem. The method employs the reverse-mode capability of the automatic-differentiation software tool ADIFOR 3.0, and is proven to yield accurate first-order aerodynamic sensitivity derivatives. A substantial reduction in CPU time and computer memory is demonstrated in comparison with results from a straightforward, black-box reverse-mode application of ADIFOR 3.0 to the same flow code. An ADIFOR-assisted procedure for accurate second-order aerodynamic sensitivity derivatives is successfully verified on an inviscid transonic lifting airfoil example problem. The method requires that first-order derivatives be calculated first using both the forward (direct) and reverse (adjoint) procedures; then, a very efficient non-iterative calculation of all second-order derivatives can be accomplished. Accurate second derivatives (i.e., the complete Hessian matrices) of lift, wave-drag, and pitching-moment coefficients are calculated with respect to geometric shape, angle of attack, and freestream Mach number.
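
    The combined forward/reverse strategy in these two records has a compact modern analogue: forward-over-reverse automatic differentiation, which yields the complete Hessian once the reverse-mode gradient is available. The Python/JAX sketch below shows this on a toy function; the function and design variables are invented stand-ins, not the actual flow code.

      import jax
      import jax.numpy as jnp

      def lift_coefficient(x):
          # toy smooth function of (shape parameter, angle of attack, Mach)
          shape, alpha, mach = x
          return jnp.sin(alpha) * (1.0 + 0.5 * shape) / jnp.sqrt(1.0 - mach**2)

      x0 = jnp.array([0.1, 0.05, 0.6])
      grad = jax.grad(lift_coefficient)(x0)                 # reverse mode: all first derivatives
      hessian = jax.jacfwd(jax.grad(lift_coefficient))(x0)  # forward-over-reverse: full Hessian
      print(grad)
      print(hessian)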

  14. Multidisciplinary optimization of controlled space structures with global sensitivity equations

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; James, Benjamin B.; Graves, Philip C.; Woodard, Stanley E.

    1991-01-01

    A new method for the preliminary design of controlled space structures is presented. The method coordinates standard finite element structural analysis, multivariable controls, and nonlinear programming codes and allows simultaneous optimization of the structures and control systems of a spacecraft. Global sensitivity equations are a key feature of this method. The preliminary design of a generic geostationary platform is used to demonstrate the multidisciplinary optimization method. Fifteen design variables are used to optimize truss member sizes and feedback gain values. The goal is to reduce the total mass of the structure and the vibration control system while satisfying constraints on vibration decay rate. Incorporating the nonnegligible mass of actuators causes an essential coupling between structural design variables and control design variables. The solution of the demonstration problem is an important step toward a comprehensive preliminary design capability for structures and control systems. Use of global sensitivity equations helps solve optimization problems that have a large number of design variables and a high degree of coupling between disciplines.
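
    The global sensitivity equations amount to a small linear system that converts each discipline's local partial derivatives into coupled total derivatives. The Python sketch below shows the standard form for two coupled scalar disciplines; the partial-derivative values are assumed for illustration.

      import numpy as np

      # Local partials at the current design point (assumed values).
      df1_dx, df1_dy2 = 0.8, 0.3    # structures discipline sensitivities
      df2_dx, df2_dy1 = 0.1, 0.5    # controls discipline sensitivities

      # GSE: [[1, -df1_dy2], [-df2_dy1, 1]] @ [dy1/dx, dy2/dx]^T = [df1_dx, df2_dx]^T
      A = np.array([[1.0, -df1_dy2],
                    [-df2_dy1, 1.0]])
      b = np.array([df1_dx, df2_dx])

      total_derivatives = np.linalg.solve(A, b)
      print(total_derivatives)      # coupled d(y1)/dx and d(y2)/dx for the optimizer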

  15. Gravitropism: interaction of sensitivity modulation and effector redistribution

    NASA Technical Reports Server (NTRS)

    Evans, M. L.

    1991-01-01

    Our increasing capabilities for quantitative hormone analysis and automated high resolution growth studies have allowed a reassessment of the classical Cholodny-Went hypothesis of gravitropism. According to this hypothesis, gravity induces redistribution of auxin toward the lower side of the organ and this causes the growth asymmetry that leads to reorientation. Arguments against the Cholodny-Went hypothesis that were based primarily on concerns over the timing and magnitude of the development of hormone asymmetry are countered by recent evidence that such asymmetry develops early and is sufficiently large to account for curvature. Thus, it appears that the Cholodny-Went hypothesis is fundamentally valid. However, recent comparative studies of the kinetics of curvature and the timing of the development of hormone asymmetry indicate that this hypothesis alone cannot account for the intricacies of the gravitropic response. It appears that time-dependent gravity-induced changes in hormone sensitivity as well as changes in sensitivity of the gravity receptor play important roles in the response.

  16. Gravitropism: Interaction of Sensitivity Modulation and Effector Redistribution 1

    PubMed Central

    Evans, Michael L.

    1991-01-01

    Our increasing capabilities for quantitative hormone analysis and automated high resolution growth studies have allowed a reassessment of the classical Cholodny-Went hypothesis of gravitropism. According to this hypothesis, gravity induces redistribution of auxin toward the lower side of the organ and this causes the growth asymmetry that leads to reorientation. Arguments against the Cholodny-Went hypothesis that were based primarily on concerns over the timing and magnitude of the development of hormone asymmetry are countered by recent evidence that such asymmetry develops early and is sufficiently large to account for curvature. Thus, it appears that the Cholodny-Went hypothesis is fundamentally valid. However, recent comparative studies of the kinetics of curvature and the timing of the development of hormone asymmetry indicate that this hypothesis alone cannot account for the intricacies of the gravitropic response. It appears that time-dependent gravity-induced changes in hormone sensitivity as well as changes in sensitivity of the gravity receptor play important roles in the response. PMID:11537485

  17. Portable evanescent wave fiber biosensor for highly sensitive detection of Shigella

    NASA Astrophysics Data System (ADS)

    Xiao, Rui; Rong, Zhen; Long, Feng; Liu, Qiqi

    2014-11-01

    A portable evanescent wave fiber biosensor was developed to achieve the rapid and highly sensitive detection of Shigella. In this study, a DNA probe was covalently immobilized onto fiber-optic biosensors that can hybridize with a fluorescently labeled complementary DNA. The sensitivity of detection for synthesized oligonucleotides can reach 10^-10 M. The surface of the sensor can be regenerated with 0.5% sodium dodecyl sulfate solution (pH 1.9) more than 30 times without significant deterioration of performance. The total analysis time for a single sample, including the time for measurement and surface regeneration, was less than 6 min. We employed real-time polymerase chain reaction (PCR) and compared the results of both methods to investigate the actual Shigella DNA detection capability of the fiber-optic biosensor. The fiber-optic biosensor could detect as few as 10^2 colony-forming units/mL Shigella. This finding was comparable with that by real-time PCR, which suggests that this method is a potential alternative to existing detection methods.

  18. Coupled reactors analysis: New needs and advances using Monte Carlo methodology

    DOE PAGES

    Aufiero, M.; Palmiotti, G.; Salvatores, M.; ...

    2016-08-20

    Coupled reactors and the coupling features of large or heterogeneous core reactors can be investigated with the Avery theory, which allows a physics understanding of the main features of these systems. However, the complex geometries that are often encountered in association with coupled reactors require a detailed geometry description that can be easily provided by modern Monte Carlo (MC) codes. This implies a MC calculation of the coupling parameters defined by Avery and of the sensitivity coefficients that allow further detailed physics analysis. The results presented in this paper show that the MC code SERPENT has been successfully modified to meet the required capabilities.
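
    In the Avery picture, the coupling parameters form a fission matrix whose dominant eigenvalue is the system multiplication factor; Monte Carlo tallies supply the matrix entries. The Python sketch below shows the eigenvalue step on an invented two-region matrix; it is not SERPENT output.

      import numpy as np

      # k[i][j]: fission neutrons born in region i per fission neutron born in region j
      k = np.array([[0.70, 0.25],    # core 1 self- and cross-coupling (invented)
                    [0.20, 0.65]])   # core 2

      eigvals, eigvecs = np.linalg.eig(k)
      i = np.argmax(eigvals.real)
      print("k-effective:", eigvals.real[i])
      print("region fission source shape:",
            eigvecs[:, i].real / eigvecs[:, i].real.sum())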

  19. Manned Mars Mission program concepts

    NASA Technical Reports Server (NTRS)

    Hamilton, E. C.; Johnson, P.; Pearson, J.; Tucker, W.

    1988-01-01

    This paper describes the SRS Manned Mars Mission and Program Analysis study designed to support a manned expedition to Mars contemplated by NASA for the purposes of initiating human exploration and eventual habitation of this planet. The capabilities of the interactive software package presently being developed by SRS for mission/program analysis are described, and it is shown that the interactive package can be used to investigate the impact of various mission concepts on the sensitivity of mass required in LEO, schedules, relative costs, and risk. The results to date indicate the need for an earth-to-orbit transportation system much larger than the present STS, reliable long-life support systems, and either advanced propulsion or aerobraking technology.

  20. Tunable X-ray speckle-based phase-contrast and dark-field imaging using the unified modulated pattern analysis approach

    NASA Astrophysics Data System (ADS)

    Zdora, M.-C.; Thibault, P.; Deyhle, H.; Vila-Comamala, J.; Rau, C.; Zanette, I.

    2018-05-01

    X-ray phase-contrast and dark-field imaging provides valuable, complementary information about the specimen under study. Among the multimodal X-ray imaging methods, X-ray grating interferometry and speckle-based imaging have drawn particular attention, which, however, in their common implementations incur certain limitations that can restrict their range of applications. Recently, the unified modulated pattern analysis (UMPA) approach was proposed to overcome these limitations and combine grating- and speckle-based imaging in a single approach. Here, we demonstrate the multimodal imaging capabilities of UMPA and highlight its tunable character regarding spatial resolution, signal sensitivity and scan time by using different reconstruction parameters.

  1. Mission Analysis for High Specific Impulse Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Adams, Robert B.; Polsgrove, Tara; Brady, Hugh J. (Technical Monitor)

    2002-01-01

    This paper describes trajectory calculations for high specific impulse engines. Specific impulses on the order of 10,000 to 100,000 sec are predicted in a variety of fusion powered propulsion systems. This paper and its companion paper seek to build on analyses in the literature to yield an analytical routine for determining time of flight and payload fraction to a predetermined destination. The companion paper will compare the results of this analysis to the trajectories determined by several trajectory codes. The major parameters that affect time of flight and payload fraction will be identified and their sensitivities quantified. A review of existing fusion propulsion concepts and their capabilities will also be tabulated.
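
    A quick worked example shows why such specific impulses matter for payload fraction: the ideal rocket equation gives the propellant fraction needed for a fixed mission delta-v. The 100 km/s delta-v below is an assumed fast-transit value, not a figure from the paper.

      import math

      g0 = 9.80665            # m/s^2
      delta_v = 100.0e3       # m/s, assumed aggressive deep-space transit

      for isp in (450.0, 10_000.0, 100_000.0):   # chemical vs fusion-class engines
          mass_ratio = math.exp(delta_v / (g0 * isp))
          propellant_fraction = 1.0 - 1.0 / mass_ratio
          print(f"Isp {isp:>9.0f} s: mass ratio {mass_ratio:.3e}, "
                f"propellant fraction {propellant_fraction:.3f}")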

  2. Application of Partial Least Square (PLS) Analysis on Fluorescence Data of 8-Anilinonaphthalene-1-Sulfonic Acid, a Polarity Dye, for Monitoring Water Adulteration in Ethanol Fuel.

    PubMed

    Kumar, Keshav; Mishra, Ashok Kumar

    2015-07-01

    The fluorescence characteristic of 8-anilinonaphthalene-1-sulfonic acid (ANS) in ethanol-water mixtures, in combination with partial least square (PLS) analysis, was used to propose a simple and sensitive analytical procedure for monitoring the adulteration of ethanol by water. The proposed analytical procedure was found to be capable of detecting even small levels of adulteration of ethanol by water. The robustness of the procedure is evident from statistical parameters such as the square of the correlation coefficient (R^2), root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP), which were found to be well within acceptable limits.
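
    A calibration of this kind can be sketched with scikit-learn's PLSRegression on synthetic ANS emission spectra. The spectral model, shift, and noise level below are invented for illustration and are not the authors' data.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      wavelengths = np.linspace(400, 600, 120)
      water_fraction = np.linspace(0.0, 0.10, 25)        # 0-10 % water in ethanol

      # Synthetic spectra: ANS emission shifts/quenches as polarity rises (crude model).
      def spectrum(w):
          peak = 470 + 150 * w
          return (1.0 - 2.0 * w) * np.exp(-((wavelengths - peak) / 30.0) ** 2)

      X = np.array([spectrum(w) + 0.01 * rng.standard_normal(wavelengths.size)
                    for w in water_fraction])

      pls = PLSRegression(n_components=3)
      pls.fit(X, water_fraction)
      print(pls.score(X, water_fraction))                # R^2 of the calibration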

  3. Approximate simulation model for analysis and optimization in engineering system design

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Computational support of the engineering design process routinely requires mathematical models of behavior to inform designers of the system response to external stimuli. However, designers also need to know the effect of the changes in design variable values on the system behavior. For large engineering systems, the conventional way of evaluating these effects by repetitive simulation of behavior for perturbed variables is impractical because of excessive cost and inadequate accuracy. An alternative is described based on recently developed system sensitivity analysis that is combined with extrapolation to form a model of design. This design model is complementary to the model of behavior and capable of direct simulation of the effects of design variable changes.
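
    The design model described here is essentially a first-order Taylor extrapolation built from one behavior simulation plus sensitivity derivatives. A minimal Python sketch follows, with a cheap quadratic function standing in for the expensive analysis; in practice the Jacobian would come from a system sensitivity analysis.

      import numpy as np

      def behavior(x):                       # expensive simulation stand-in
          return np.array([x[0]**2 + x[1], 3.0 * x[0] * x[1]])

      def jacobian(x):                       # sensitivity derivatives at x
          return np.array([[2.0 * x[0], 1.0],
                           [3.0 * x[1], 3.0 * x[0]]])

      x0 = np.array([1.0, 2.0])
      y0, J = behavior(x0), jacobian(x0)

      dx = np.array([0.05, -0.02])           # proposed design change
      y_extrapolated = y0 + J @ dx           # design model: no new simulation
      print(y_extrapolated, behavior(x0 + dx))  # compare with the true response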

  4. Application of nuclear analytical techniques using long-life sealed-tube neutron generators.

    PubMed

    Bach, P; Cluzeau, S; Lambermont, C

    1994-01-01

    The new range of sealed-tube neutron generators developed by SODERN appears to be appropriate for the industrial environment. The main characteristics are the high emission stability during the very long lifetime of the tube, flexible pulsed mode capability, safety in operation with no radiation in "off" state, and the easy transportation of equipment. Some applications of the neutron generators, called GENIE, are considered: high-sensitivity measurement of transuranic elements in nuclear waste drums, bulk material analysis for process control, and determination of the airborne pollutants for environmental monitoring.

  5. INCA- INTERACTIVE CONTROLS ANALYSIS

    NASA Technical Reports Server (NTRS)

    Bauer, F. H.

    1994-01-01

    The Interactive Controls Analysis (INCA) program was developed to provide a user friendly environment for the design and analysis of linear control systems, primarily feedback control systems. INCA is designed for use with both small and large order systems. Using the interactive graphics capability, the INCA user can quickly plot a root locus, frequency response, or time response of either a continuous time system or a sampled data system. The system configuration and parameters can be easily changed, allowing the INCA user to design compensation networks and perform sensitivity analysis in a very convenient manner. A journal file capability is included. This stores an entire sequence of commands, generated during an INCA session into a file which can be accessed later. Also included in INCA are a context-sensitive help library, a screen editor, and plot windows. INCA is robust to VAX-specific overflow problems. The transfer function is the basic unit of INCA. Transfer functions are automatically saved and are available to the INCA user at any time. A powerful, user friendly transfer function manipulation and editing capability is built into the INCA program. The user can do all transfer function manipulations and plotting without leaving INCA, although provisions are made to input transfer functions from data files. By using a small set of commands, the user may compute and edit transfer functions, and then examine these functions by using the ROOT_LOCUS, FREQUENCY_RESPONSE, and TIME_RESPONSE capabilities. Basic input data, including gains, are handled as single-input single-output transfer functions. These functions can be developed using the function editor or by using FORTRAN- like arithmetic expressions. In addition to the arithmetic functions, special functions are available to 1) compute step, ramp, and sinusoid functions, 2) compute closed loop transfer functions, 3) convert from S plane to Z plane with optional advanced Z transform, and 4) convert from Z plane to W plane and back. These capabilities allow the INCA user to perform block diagram algebraic manipulations quickly for functions in the S, Z, and W domains. Additionally, a versatile digital control capability has been included in INCA. Special plane transformations allow the user to easily convert functions from one domain to another. Other digital control capabilities include: 1) totally independent open loop frequency response analyses on a continuous plant, discrete control system with a delay, 2) advanced Z-transform capability for systems with delays, and 3) multirate sampling analyses. The current version of INCA includes Dynamic Functions (which change when a parameter changes), standard filter generation, PD and PID controller generation, incorporation of the QZ-algorithm (function addition, inverse Laplace), and describing functions that allow the user to calculate the gain and phase characteristics of a nonlinear device. The INCA graphic modes provide the user with a convenient means to document and study frequency response, time response, and root locus analyses. General graphics features include: 1) zooming and dezooming, 2) plot documentation, 3) a table of analytic computation results, 4) multiple curves on the same plot, and 5) displaying frequency and gain information for a specific point on a curve. 
Additional capabilities in the frequency response mode include: 1) a full complement of graphical methods Bode magnitude, Bode phase, Bode combined magnitude and phase, Bode strip plots, root contour plots, Nyquist, Nichols, and Popov plots; 2) user selected plot scaling; and 3) gain and phase margin calculation and display. In the time response mode, additional capabilities include: 1) support for inverse Laplace and inverse Z transforms, 2) support for various input functions, 3) closed loop response evaluation, 4) loop gain sensitivity analyses, 5) intersample time response for discrete systems using the advanced Z transform, and 6) closed loop time response using mixed plane (S, Z, W) operations with delay. A Graphics mode command was added to the current version of INCA, version 3.13, to produce Metafiles (graphic files) of the currently displayed plot. The metafile can be displayed and edited using the QPLOT Graphics Editor and Replotter for Metafiles (GERM) program included with the INCA package. The INCA program is written in Pascal and FORTRAN for interactive or batch execution and has been implemented on a DEC VAX series computer under VMS. Both source code and executable code are supplied for INCA. Full INCA graphics capabilities are supported for various Tektronix 40xx and 41xx terminals; DEC VT graphics terminals; many PC and Macintosh terminal emulators; TEK014 hardcopy devices such as the LN03 Laserprinter; and bit map graphics external hardcopy devices. Also included for the TEK4510 rasterizer users are a multiple copy feature, a wide line feature, and additional graphics fonts. The INCA program was developed in 1985, Version 2.04 was released in 1986, Version 3.00 was released in 1988, and Version 3.13 was released in 1989. An INCA version 2.0X conversion program is included.
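
    INCA itself is a VAX-era Pascal/FORTRAN package, but the core workflow its transfer-function and FREQUENCY_RESPONSE capabilities support (compose transfer functions, then examine the loop frequency response) can be sketched with scipy.signal. The plant and compensator below are invented examples, not INCA syntax.

      import numpy as np
      from scipy import signal

      # Plant 1/(s^2 + 2s) in series with compensator 10(s + 1)/(s + 10).
      plant = signal.TransferFunction([1.0], [1.0, 2.0, 0.0])
      comp = signal.TransferFunction([10.0, 10.0], [1.0, 10.0])

      # Open-loop transfer function via polynomial multiplication.
      num = np.polymul(plant.num, comp.num)
      den = np.polymul(plant.den, comp.den)
      open_loop = signal.TransferFunction(num, den)

      w, mag_db, phase_deg = signal.bode(open_loop)
      print(mag_db[0], phase_deg[0])   # magnitude/phase at the lowest frequency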

  6. Ultra-Sensitive Photoreceiver Boosts Data Transmission

    NASA Technical Reports Server (NTRS)

    2007-01-01

    NASA depends on advanced, ultra-sensitive photoreceivers and photodetectors to provide high-data-rate communications and pinpoint image-detection and -recognition capabilities from great distances. In 2003, Epitaxial Technologies LLC was awarded a Small Business Innovation Research (SBIR) contract from Goddard Space Flight Center to address needs for advanced sensor components. Epitaxial developed a photoreceiver capable of single-photon sensitivity that is also smaller, lighter, and requires less power than its predecessor. This receiver operates in several wavelength ranges; will allow data rate transmissions in the terabit range; and will enhance Earth-based missions for remote sensing of crops and other natural resources, including applications for fluorescence and phosphorescence detection. Widespread military and civilian applications are anticipated, especially through enhancing fiber optic communications, laser imaging, and laser communications.

  7. Self-contained microfluidic systems: a review.

    PubMed

    Boyd-Moss, Mitchell; Baratchi, Sara; Di Venere, Martina; Khoshmanesh, Khashayar

    2016-08-16

    Microfluidic systems enable rapid diagnosis, screening and monitoring of diseases and health conditions using small amounts of biological samples and reagents. Despite these remarkable features, conventional microfluidic systems rely on bulky expensive external equipment, which hinders their utility as powerful analysis tools outside of research laboratories. 'Self-contained' microfluidic systems, which contain all necessary components to facilitate a complete assay, have been developed to address this limitation. In this review, we provide an in-depth overview of self-contained microfluidic systems. We categorise these systems based on their operating mechanisms into three major groups: passive, hand-powered and active. Several examples are provided to discuss the structure, capabilities and shortcomings of each group. In particular, we discuss the self-contained microfluidic systems enabled by active mechanisms, due to their unique capability for running multi-step and highly controllable diagnostic assays. Integration of self-contained microfluidic systems with the image acquisition and processing capabilities of smartphones, especially those equipped with accessory optical components, enables highly sensitive and quantitative assays, which are discussed. Finally, the future trends and possible solutions to expand the versatility of self-contained, stand-alone microfluidic platforms are outlined.

  8. Status of LUMINEU program to search for neutrinoless double beta decay of 100Mo with cryogenic ZnMoO4 scintillating bolometers

    NASA Astrophysics Data System (ADS)

    Danevich, F. A.; Bergé, L.; Boiko, R. S.; Chapellier, M.; Chernyak, D. M.; Coron, N.; Devoyon, L.; Drillien, A.-A.; Dumoulin, L.; Enss, C.; Fleischmann, A.; Gastaldo, L.; Giuliani, A.; Gray, D.; Gros, M.; Hervé, S.; Humbert, V.; Ivanov, I. M.; Juillard, A.; Kobychev, V. V.; Koskas, F.; Loidl, M.; Magnier, P.; Makarov, E. P.; Mancuso, M.; de Marcillac, P.; Marnieros, S.; Marrache-Kikuchi, C.; Navick, X.-F.; Nones, C.; Olivieri, E.; Paul, B.; Penichot, Y.; Pessina, G.; Plantevin, O.; Poda, D. V.; Redon, T.; Rodrigues, M.; Shlegel, V. N.; Strazzer, O.; Tenconi, M.; Torres, L.; Tretyak, V. I.; Vasiliev, Ya. V.; Velazquez, M.; Viraphong, O.

    2015-10-01

    The LUMINEU program aims at performing a pilot experiment on 0ν2β decay of 100Mo using radiopure ZnMoO4 crystals enriched in 100Mo operated as cryogenic scintillating bolometers. Large volume ZnMoO4 crystal scintillators (˜ 0.3 kg) were developed and tested, showing high performance in terms of radiopurity, energy resolution and α/β particle discrimination capability. Zinc molybdate crystal scintillators enriched in 100Mo were grown for the first time by the low-thermal-gradient Czochralski technique with a high crystal yield and an acceptable level of irrecoverable losses of the enriched molybdenum. A background level of ˜ 0.5 counts/(yr keV ton) in the region of interest can be reached in a large detector array thanks to the excellent radiopurity and particle discrimination capability of the detectors, suppression of randomly coinciding events by pulse-shape analysis, and an anticoincidence cut. These results pave the way to future sensitive searches based on the LUMINEU technology, capable of approaching and exploring the inverted hierarchy region of the neutrino mass pattern.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danevich, F. A., E-mail: danevich@kinr.kiev.ua; Boiko, R. S.; Chernyak, D. M.

    The LUMINEU program aims at performing a pilot experiment on 0ν2β decay of 100Mo using radiopure ZnMoO4 crystals enriched in 100Mo operated as cryogenic scintillating bolometers. Large volume ZnMoO4 crystal scintillators (∼ 0.3 kg) were developed and tested, showing high performance in terms of radiopurity, energy resolution and α/β particle discrimination capability. Zinc molybdate crystal scintillators enriched in 100Mo were grown for the first time by the low-thermal-gradient Czochralski technique with a high crystal yield and an acceptable level of irrecoverable losses of the enriched molybdenum. A background level of ∼ 0.5 counts/(yr keV ton) in the region of interest can be reached in a large detector array thanks to the excellent radiopurity and particle discrimination capability of the detectors, suppression of randomly coinciding events by pulse-shape analysis, and an anticoincidence cut. These results pave the way to future sensitive searches based on the LUMINEU technology, capable of approaching and exploring the inverted hierarchy region of the neutrino mass pattern.

  10. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Technical Reports Server (NTRS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.

  11. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    NASA Astrophysics Data System (ADS)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability provides a needed input for estimating the success rate of any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper briefly describes the capabilities of NESTEM, QRAS, and the interface, and illustrates the stepwise process the interface uses with an example.
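
    The probability-of-failure idea behind this kind of tool can be sketched with a simple Monte Carlo stress-strength model. The distributions below are assumed for illustration; this is not the NESTEM or QRAS interface.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      load = rng.normal(80.0, 10.0, n)       # applied stress (assumed distribution)
      strength = rng.normal(120.0, 15.0, n)  # component capability (assumed)

      margin = strength - load
      p_fail = np.mean(margin < 0.0)
      print(f"estimated probability of failure: {p_fail:.4f}")

      # Empirical CDF of the safety margin, the kind of curve passed to a risk tool.
      x = np.sort(margin)
      cdf = np.arange(1, n + 1) / n
      print(x[cdf.searchsorted(0.5)])        # median margin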

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ying

    My graduate research has focused on separation science and bioanalytical analysis, with an emphasis on method development. It includes three major areas: enantiomeric separations using high performance liquid chromatography (HPLC), super/subcritical fluid chromatography (SFC), and capillary electrophoresis (CE); drug-protein binding behavior studies using CE; and carbohydrate analysis using liquid chromatography-electrospray ionization mass spectrometry (LC-ESI-MS). Enantiomeric separations continue to be extremely important in the pharmaceutical industry. An in-depth evaluation of the enantiomeric separation capabilities of macrocyclic glycopeptide CSPs with SFC mobile phases was conducted using a set of over 100 chiral compounds. It was found that the macrocyclic-based CSPs were able to separate enantiomers of various compounds with different polarities and functionalities. Seventy percent of all separations were achieved in less than 4 min due to the high flow rate (4.0 ml/min) that can be used in SFC. Drug-protein binding is an important process in determining the activity and fate of a drug once it enters the body. Two drug/protein systems have been studied using the frontal analysis CE method. More sensitive fluorescence detection was introduced in this assay, which overcame the problem of low sensitivity that is common when using UV detection for drug-protein studies. In addition, the first use of an argon ion laser with a 257 nm beam coupled with a CCD camera as a frontal analysis detection method enabled the simultaneous observation of drug fluorescence as well as protein fluorescence. LC-ESI-MS was used for the separation and characterization of underivatized oligosaccharide mixtures. With limits of detection as low as 50 picograms, all individual components of oligosaccharide mixtures (up to 11 glucose units long) were baseline resolved on a Cyclobond I 2000 column and detected using ESI-MS. This system is characterized by high chromatographic resolution, high column stability, and high sensitivity. In addition, this method showed potential usefulness for the sensitive and quick analysis of hydrolysis products of polysaccharides, and for trace level analysis of individual oligosaccharides or oligosaccharide isomers from biological systems.

  13. Electrochemical Quartz Crystal Nanobalance (EQCN) Based Biosensor for Sensitive Detection of Antibiotic Residues in Milk.

    PubMed

    Bhand, Sunil; Mishra, Geetesh K

    2017-01-01

    An electrochemical quartz crystal nanobalance (EQCN), which provides real-time analysis of dynamic surface events, is a valuable tool for analyzing biomolecular interactions. EQCN biosensors are based on mass-sensitive measurements that can detect small mass changes caused by chemical binding to small piezoelectric crystals. Among the various biosensors, the piezoelectric biosensor is considered one of the most sensitive analytical techniques, capable of detecting antigens at picogram levels. EQCN is an effective monitoring technique for regulation of antibiotics below the maximum residue limit (MRL). The analysis of antibiotic residues requires high sensitivity, rapidity, reliability and cost effectiveness. For analytical purposes the general approach is to take advantage of the piezoelectric effect by immobilizing a biosensing layer on top of the piezoelectric crystal. The sensing layer usually comprises a biological material such as antibodies, enzymes, or aptamers having high specificity and selectivity for the target molecule to be detected. The biosensing layer is usually functionalized using surface chemistry modifications. When these bio-functionalized quartz crystals are exposed to a particular substance of interest (e.g., a substrate, inhibitor, antigen or protein), a binding interaction occurs. This causes a frequency or mass change that can be used to determine the amount of material that interacted or bound. EQCN biosensors can easily be automated by using a flow injection analysis (FIA) setup coupled through automated pumps and injection valves. Such FIA-EQCN biosensors have great potential for the detection of different analytes, such as antibiotic residues, in various matrices such as water, waste water, and milk.
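
    The mass-sensitive measurement rests on the standard quartz-crystal relation between frequency shift and added mass, commonly given by the Sauerbrey equation (not quoted in the record itself). A small Python example follows; the crystal frequency and electrode area are assumed values, and the quartz constants are the usual AT-cut figures.

      f0 = 10.0e6            # fundamental frequency (Hz), assumed crystal
      area = 0.2             # electrode area (cm^2), assumed
      rho_q = 2.648          # quartz density (g/cm^3)
      mu_q = 2.947e11        # quartz shear modulus (g/(cm*s^2))

      def sauerbrey_mass(delta_f_hz):
          """Added mass (g) for a measured frequency decrease delta_f (Hz)."""
          sensitivity = 2.0 * f0**2 / (area * (rho_q * mu_q) ** 0.5)  # Hz per g
          return delta_f_hz / sensitivity

      print(sauerbrey_mass(5.0) * 1e9, "ng for a 5 Hz shift")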

  14. Highly linear, sensitive analog-to-digital converter

    NASA Technical Reports Server (NTRS)

    Cox, J.; Finley, W. R.

    1969-01-01

    Analog-to-digital converter converts a 10 volt full-scale input signal into a 13 bit digital output. Advantages include high sensitivity, linearity, low quantizing error, high resistance to mechanical shock and vibration loads, and temporary data storage capabilities.
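
    The quoted figures imply the resolution directly: with a 10 volt full scale and 13 bits, one least-significant bit is about 1.2 mV, which bounds the quantizing error. A two-line Python check:

      full_scale_v = 10.0
      bits = 13
      lsb_v = full_scale_v / (2 ** bits)
      print(f"LSB = {lsb_v * 1e3:.3f} mV, "
            f"max quantizing error = ±{lsb_v / 2 * 1e3:.3f} mV")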

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Stacy; English, Shawn; Briggs, Timothy

    Fiber-reinforced composite materials offer light-weight solutions to many structural challenges. In the development of high-performance composite structures, a thorough understanding is required of the composite materials themselves as well as of methods for the analysis and failure prediction of the relevant composite structures. However, the mechanical properties required for the complete constitutive definition of a composite material can be difficult to determine through experimentation. Therefore, efficient methods are necessary that can be used to determine which properties are relevant to the analysis of a specific structure and to establish a structure's response to a material parameter that can only be defined through estimation. The objectives of this paper are to demonstrate the potential value of sensitivity and uncertainty quantification techniques during the failure analysis of loaded composite structures; the proposed methods are applied to the simulation of the four-point flexural characterization of a carbon fiber composite material. Utilizing a recently implemented, phenomenological orthotropic material model that is capable of predicting progressive composite damage and failure, a sensitivity analysis is completed to establish which material parameters are truly relevant to a simulation's outcome. Then, a parameter study is completed to determine the effect of the relevant material properties' expected variations on the simulated four-point flexural behavior, as well as to determine the value of an unknown material property. This process demonstrates the ability to formulate accurate predictions in the absence of a rigorous material characterization effort. Finally, the presented results indicate that a sensitivity analysis and parameter study can be used to streamline the material definition process, as the described flexural characterization was used for model validation.
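
    The screening step described here can be sketched as a one-at-a-time parameter study. The stand-in response function and baseline properties below are invented and do not represent the orthotropic damage model itself.

      import numpy as np

      def flexural_strength(p):
          # toy response: dominated by fiber strength, weakly affected by the rest
          return (0.9 * p["fiber_strength"] + 0.1 * p["matrix_modulus"]
                  + 0.01 * p["shear_modulus"])

      baseline = {"fiber_strength": 2000.0, "matrix_modulus": 3.5, "shear_modulus": 5.0}
      y0 = flexural_strength(baseline)

      for name, value in baseline.items():
          perturbed = dict(baseline)
          perturbed[name] = value * 1.10                 # +10 % perturbation
          dy = flexural_strength(perturbed) - y0
          print(f"{name:16s} +10% -> response change {100 * dy / y0:+.2f}%")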

  16. Simulation and analysis of differential global positioning system for civil helicopter operations

    NASA Technical Reports Server (NTRS)

    Denaro, R. P.; Cabak, A. R.

    1983-01-01

    A Differential Global Positioning System (DGPS) computer simulation was developed to provide a versatile tool for assessing DGPS-referenced civil helicopter navigation. The civil helicopter community will probably be an early user of the GPS capability because of unique mission requirements that include offshore exploration and low altitude transport into remote areas not currently served by ground-based navaids. The Monte Carlo simulation provided a sufficiently high fidelity dynamic motion and propagation environment to enable accurate comparisons of alternative differential GPS implementations and navigation filter tradeoffs. The analyst is provided with the capability to adjust most aspects of the system, including the helicopter flight profile, the receiver Kalman filter, and the signal propagation environment, to assess differential GPS performance and parameter sensitivities. Preliminary analysis was conducted to evaluate alternative implementations of the differential navigation algorithm in both the position and measurement domain. Results are presented to show that significant performance gains are achieved when compared with conventional GPS but that differences due to DGPS implementation techniques were small. System performance was relatively insensitive to the update rates of the error correction information.

  17. Solving iTOUGH2 simulation and optimization problems using the PEST protocol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, S.A.; Zhang, Y.

    2011-02-01

    The PEST protocol has been implemented into the iTOUGH2 code, allowing the user to link any simulation program (with ASCII-based inputs and outputs) to iTOUGH2's sensitivity analysis, inverse modeling, and uncertainty quantification capabilities. These application models can be pre- or post-processors of the TOUGH2 non-isothermal multiphase flow and transport simulator, or programs that are unrelated to the TOUGH suite of codes. PEST-style template and instruction files are used, respectively, to pass input parameters updated by the iTOUGH2 optimization routines to the model, and to retrieve the model-calculated values that correspond to observable variables. We summarize the iTOUGH2 capabilities and demonstrate the flexibility added by the PEST protocol for the solution of a variety of simulation-optimization problems. In particular, the combination of loosely coupled and tightly integrated simulation and optimization routines provides both the flexibility and control needed to solve challenging inversion problems for the analysis of multiphase subsurface flow and transport systems.
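
    The PEST-style template and instruction files mentioned above mark where parameters are written into model inputs and where outputs are read back. A minimal sketch of generating such files; the file, parameter, and observation names are hypothetical, and the exact delimiter and directive rules are defined by the PEST documentation:

      # Hypothetical file and parameter/observation names, for illustration
      # only; see the PEST documentation for the authoritative syntax.
      template = (
          "ptf #\n"                        # '#' declared as parameter delimiter
          "permeability  #kx        #\n"   # optimizer writes current kx here
          "porosity      #phi       #\n"
      )
      instruction = (
          "pif @\n"                          # '@' declared as marker delimiter
          "@Pressure at obs-1:@ !p_obs1!\n"  # search for text, then read value
          "l1 !p_obs2!\n"                    # advance one line, read next value
      )
      with open("model.tpl", "w") as f:
          f.write(template)
      with open("model.ins", "w") as f:
          f.write(instruction)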

  18. Logical analysis of diffuse large B-cell lymphomas.

    PubMed

    Alexe, G; Alexe, S; Axelrod, D E; Hammer, P L; Weissmann, D

    2005-07-01

    The goal of this study is to re-examine the oligonucleotide microarray dataset of Shipp et al., which contains the intensity levels of 6817 genes of 58 patients with diffuse large B-cell lymphoma (DLBCL) and 19 with follicular lymphoma (FL), by means of the combinatorics, optimisation, and logic-based methodology of logical analysis of data (LAD). The motivations for this new analysis included the previously demonstrated capabilities of LAD and its expected potential (1) to identify different informative genes than those discovered by conventional statistical methods, (2) to identify combinations of gene expression levels capable of characterizing different types of lymphoma, and (3) to assemble collections of such combinations that if considered jointly are capable of accurately distinguishing different types of lymphoma. The central concept of LAD is a pattern or combinatorial biomarker, a concept that resembles a rule as used in decision tree methods. LAD is able to exhaustively generate the collection of all those patterns which satisfy certain quality constraints, through a systematic combinatorial process guided by clear optimization criteria. Then, based on a set covering approach, LAD aggregates the collection of patterns into classification models. In addition, LAD is able to use the information provided by large collections of patterns in order to extract subsets of variables, which collectively are able to distinguish between different types of disease. For the differential diagnosis of DLBCL versus FL, a model based on eight significant genes is constructed and shown to have a sensitivity of 94.7% and a specificity of 100% on the test set. For the prognosis of good versus poor outcome among the DLBCL patients, a model is constructed on another set consisting also of eight significant genes, and shown to have a sensitivity of 87.5% and a specificity of 90% on the test set. The genes selected by LAD also work well as a basis for other kinds of statistical analysis, indicating their robustness. These two models exhibit accuracies that compare favorably to those in the original study. In addition, the current study also provides a ranking by importance of the genes in the selected significant subsets as well as a library of dozens of combinatorial biomarkers (i.e. pairs or triplets of genes) that can serve as a source of mathematically generated, statistically significant research hypotheses in need of biological explanation.
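
    A pattern in the LAD sense is a conjunction of cutoff conditions on a few variables, and a classification model aggregates many such patterns. A minimal sketch of that structure; the gene names and cutoffs are hypothetical, not the genes selected in the study:

      # Gene names and cutoffs are hypothetical, chosen only to illustrate
      # the structure of LAD patterns and their aggregation.
      patterns_dlbcl = [                    # each pattern: a conjunction of cutoffs
          lambda s: s["GENE_A"] > 120 and s["GENE_B"] < 40,
          lambda s: s["GENE_C"] > 75,
      ]
      patterns_fl = [
          lambda s: s["GENE_A"] <= 120 and s["GENE_D"] > 10,
      ]

      def classify(sample):
          # Aggregate patterns into a model: the class whose patterns
          # fire more often on this sample wins the vote.
          d = sum(p(sample) for p in patterns_dlbcl)
          f = sum(p(sample) for p in patterns_fl)
          return "DLBCL" if d >= f else "FL"

      print(classify({"GENE_A": 130, "GENE_B": 30, "GENE_C": 10, "GENE_D": 5}))  # DLBCL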

  19. Coupled rotor/airframe vibration analysis

    NASA Technical Reports Server (NTRS)

    Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.

    1982-01-01

    A coupled rotor/airframe vibration analysis developed as a design tool for predicting helicopter vibrations and a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels is described. The analysis consists of a base program utilizing an impedance matching technique to represent the coupled rotor/airframe dynamics of the system, supported by inputs from several external programs supplying sophisticated rotor and airframe aerodynamic and structural dynamic representations. The theoretical background, computer program capabilities and limited correlation results are presented in this report. Correlation with scale-model wind tunnel results shows that the analysis can adequately predict trends of vibration variations with airspeed and higher harmonic control effects. Predictions of absolute values of vibration levels were found to be very sensitive to modal characteristics, and results were not representative of measured values.

  20. In vivo sensitivity estimation and imaging acceleration with rotating RF coil arrays at 7 Tesla.

    PubMed

    Li, Mingyan; Jin, Jin; Zuo, Zhentao; Liu, Feng; Trakic, Adnan; Weber, Ewald; Zhuo, Yan; Xue, Rong; Crozier, Stuart

    2015-03-01

    Using a new rotating SENSitivity Encoding (rotating-SENSE) algorithm, we have successfully demonstrated that the rotating radiofrequency coil array (RRFCA) was capable of achieving a significant reduction in scan time and a uniform image reconstruction for a homogeneous phantom at 7 Tesla. However, at 7 Tesla the in vivo sensitivity profiles (B1(-)) become distinct at various angular positions. Therefore, sensitivity maps at other angular positions cannot be obtained by numerically rotating the acquired ones. In this work, a novel sensitivity estimation method for the RRFCA was developed and validated with human brain imaging. This method employed a library database and registration techniques to estimate coil sensitivity at an arbitrary angular position. The estimated sensitivity maps were then compared to the acquired sensitivity maps. The results indicate that the proposed method is capable of accurately estimating both the magnitude and phase of sensitivity at an arbitrary angular position, which enables us to employ the rotating-SENSE algorithm for accelerated acquisition and image reconstruction. Compared to a stationary coil array with the same number of coil elements, the RRFCA was able to reconstruct images with better quality at a high reduction factor. It is hoped that the proposed rotation-dependent sensitivity estimation algorithm and the acceleration ability of the RRFCA will be particularly useful for ultra-high-field MRI. Copyright © 2014 Elsevier Inc. All rights reserved.
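
    The rotating-SENSE reconstruction builds on the standard SENSE unfolding step, in which the coil sensitivity matrix is inverted to separate superimposed pixels. A minimal sketch of that underlying step, assuming the textbook SENSE formulation rather than the authors' rotating variant:

      import numpy as np

      def sense_unfold(aliased, sens, psi):
          # SENSE unfolding of one aliased pixel:
          #   x = (S^H Psi^-1 S)^-1 S^H Psi^-1 a
          # aliased: (n_coils,) coil values; sens: (n_coils, R) sensitivities
          # at the R superimposed locations; psi: coil noise covariance.
          psi_inv = np.linalg.inv(psi)
          lhs = sens.conj().T @ psi_inv @ sens
          rhs = sens.conj().T @ psi_inv @ aliased
          return np.linalg.solve(lhs, rhs)

      # Toy check: 4 coils, reduction factor R = 2
      rng = np.random.default_rng(0)
      S = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
      x_true = np.array([1.0 + 0.5j, -0.3 + 2.0j])
      print(np.allclose(sense_unfold(S @ x_true, S, np.eye(4)), x_true))  # True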

  1. In vivo sensitivity estimation and imaging acceleration with rotating RF coil arrays at 7 Tesla

    NASA Astrophysics Data System (ADS)

    Li, Mingyan; Jin, Jin; Zuo, Zhentao; Liu, Feng; Trakic, Adnan; Weber, Ewald; Zhuo, Yan; Xue, Rong; Crozier, Stuart

    2015-03-01

    Using a new rotating SENSitivity Encoding (rotating-SENSE) algorithm, we have successfully demonstrated that the rotating radiofrequency coil array (RRFCA) was capable of achieving a significant reduction in scan time and a uniform image reconstruction for a homogeneous phantom at 7 Tesla. However, at 7 Tesla the in vivo sensitivity profiles (B1-) become distinct at various angular positions. Therefore, sensitivity maps at other angular positions cannot be obtained by numerically rotating the acquired ones. In this work, a novel sensitivity estimation method for the RRFCA was developed and validated with human brain imaging. This method employed a library database and registration techniques to estimate coil sensitivity at an arbitrary angular position. The estimated sensitivity maps were then compared to the acquired sensitivity maps. The results indicate that the proposed method is capable of accurately estimating both the magnitude and phase of sensitivity at an arbitrary angular position, which enables us to employ the rotating-SENSE algorithm for accelerated acquisition and image reconstruction. Compared to a stationary coil array with the same number of coil elements, the RRFCA was able to reconstruct images with better quality at a high reduction factor. It is hoped that the proposed rotation-dependent sensitivity estimation algorithm and the acceleration ability of the RRFCA will be particularly useful for ultra-high-field MRI.

  2. Process analysis of recycled thermoplasts from consumer electronics by laser-induced plasma spectroscopy.

    PubMed

    Fink, Herbert; Panne, Ulrich; Niessner, Reinhard

    2002-09-01

    An experimental setup for direct elemental analysis of recycled thermoplasts from consumer electronics by laser-induced plasma spectroscopy (LIPS, or laser-induced breakdown spectroscopy, LIBS) was realized. The combination of an echelle spectrograph, featuring high resolution with broad spectral coverage, with multivariate methods, such as PLS, PCR, and variable subset selection via a genetic algorithm, resulted in considerable improvements in selectivity and sensitivity for this complex matrix. With normalization to carbon as internal standard, the limits of detection were in the ppm range. A preliminary pattern recognition study points to the possibility of polymer recognition via the line-rich echelle spectra. Several experiments at an extruder within a recycling plant successfully demonstrated the capability of LIPS for different kinds of routine on-line process analysis.

  3. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM) which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results to a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite elements, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible and therefore volume modeling is generally required.

  4. Three dimensional, numerical analysis of an elasto hydrodynamic lubrication using fluid structure interaction (FSI) approach

    NASA Astrophysics Data System (ADS)

    Hanoca, P.; Ramakrishna, H. V.

    2018-03-01

    This work develops a methodology to model and simulate TEHD using the sequential application of CFD and CSD. The FSI analyses are carried out using ANSYS Workbench. In this analysis the steady-state, 3D Navier-Stokes equations are solved along with the energy equation. Liquid properties are introduced in which the viscosity and density are functions of pressure and temperature. The cavitation phenomenon is included in the analysis. Numerical analyses have been carried out at different speeds and surface temperatures. The analyses show that as speed increases, the hydrodynamic pressure also increases. The pressure profile obtained from the Roelands equation is more sensitive to temperature than that from the Barus equation. The stress distributions identify the significant locations in the bearing structure. The developed method is capable of giving new insight into the physics of elastohydrodynamic lubrication.
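
    For reference, the two pressure-viscosity laws compared in the record have the following commonly used isothermal forms (the thermal versions used for TEHD add temperature terms); the lubricant constants below are illustrative assumptions:

      import math

      def eta_barus(p, eta0, alpha):
          # Barus: exponential growth of viscosity with pressure
          return eta0 * math.exp(alpha * p)

      def eta_roelands(p, eta0, alpha, p_r=1.96e8):
          # Isothermal Roelands law; Z is tied to the Barus coefficient alpha
          z = alpha * p_r / (math.log(eta0) + 9.67)
          return eta0 * math.exp((math.log(eta0) + 9.67) *
                                 ((1.0 + p / p_r) ** z - 1.0))

      eta0, alpha = 0.04, 2.2e-8       # illustrative oil: 0.04 Pa*s, 22 1/GPa
      for p in (0.2e9, 0.5e9, 1.0e9):  # typical EHL contact pressures [Pa]
          print(f"p = {p/1e9:.1f} GPa: Barus {eta_barus(p, eta0, alpha):.3g}, "
                f"Roelands {eta_roelands(p, eta0, alpha):.3g} Pa*s")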

  5. Modeling, design, packaging and experimental analysis of liquid-phase shear-horizontal surface acoustic wave sensors

    NASA Astrophysics Data System (ADS)

    Pollard, Thomas B

    Recent advances in microbiology, computational capabilities, and microelectromechanical-system fabrication techniques permit modeling, design, and fabrication of low-cost, miniature, sensitive and selective liquid-phase sensors and lab-on-a-chip systems. Such devices are expected to replace expensive, time-consuming, and bulky laboratory-based testing equipment. Potential applications for devices include: fluid characterization for material science and industry; chemical analysis in medicine and pharmacology; study of biological processes; food analysis; chemical kinetics analysis; and environmental monitoring. When combined with liquid-phase packaging, sensors based on surface-acoustic-wave (SAW) technology are considered strong candidates. For this reason such devices are focused on in this work; emphasis placed on device modeling and packaging for liquid-phase operation. Regarding modeling, topics considered include mode excitation efficiency of transducers; mode sensitivity based on guiding structure materials/geometries; and use of new piezoelectric materials. On packaging, topics considered include package interfacing with SAW devices, and minimization of packaging effects on device performance. In this work novel numerical models are theoretically developed and implemented to study propagation and transduction characteristics of sensor designs using wave/constitutive equations, Green's functions, and boundary/finite element methods. Using developed simulation tools that consider finite-thickness of all device electrodes, transduction efficiency for SAW transducers with neighboring uniform or periodic guiding electrodes is reported for the first time. Results indicate finite electrode thickness strongly affects efficiency. Using dense electrodes, efficiency is shown to approach 92% and 100% for uniform and periodic electrode guiding, respectively; yielding improved sensor detection limits. A numerical sensitivity analysis is presented targeting viscosity using uniform-electrode and shear-horizontal mode configurations on potassium-niobate, langasite, and quartz substrates. Optimum configurations are determined yielding maximum sensitivity. Results show mode propagation-loss and sensitivity to viscosity are correlated by a factor independent of substrate material. The analysis is useful for designing devices meeting sensitivity and signal level requirements. A novel, rapid and precise microfluidic chamber alignment/bonding method was developed for SAW platforms. The package is shown to have little effect on device performance and permits simple macrofluidic interfacing. Lastly, prototypes were designed, fabricated, and tested for viscosity and biosensor applications; results show ability to detect as low as 1% glycerol in water and surface-bound DNA crosslinking.

  6. Capillary Array Waveguide Amplified Fluorescence Detector for mHealth

    PubMed Central

    Balsam, Joshua; Bruck, Hugh Alan; Rasooly, Avraham

    2013-01-01

    Mobile Health (mHealth) analytical technologies are potentially useful for carrying out modern medical diagnostics in resource-poor settings. Effective mHealth devices for underserved populations need to be simple, low cost, and portable. Although cell phone cameras have been used for biodetection, their sensitivity is a limiting factor because currently it is too low to be effective for many mHealth applications, which depend on detection of weak fluorescent signals. To improve the sensitivity of portable phones, a capillary tube array was developed to amplify fluorescence signals using the tubes' waveguide properties. An array configured with 36 capillary tubes was demonstrated to have a ~100X increase in sensitivity, lowering the limit of detection (LOD) of mobile phones from 1000 nM to 10 nM for fluorescein. To confirm that the amplification was due to waveguide behavior, we coated the external surfaces of the capillaries with silver. The silver coating interfered with the waveguide behavior and diminished the fluorescence signal, thereby proving that the waveguide behavior was the main mechanism for enhancing optical sensitivity. The optical configuration described here is novel in several ways. First, the use of capillaries' waveguide properties to improve detection of weak fluorescence signals is new. Second, we describe a three-dimensional illumination system; conventional angular laser waveguide illumination is a spot (or line), which is functionally one-dimensional, can illuminate only a single capillary or a single column of capillaries (when a line generator is used), and thus inherently limits the multiplexing capability of detection. The planar illumination demonstrated in this work enables illumination of a two-dimensional capillary array (e.g., x columns and y rows of capillaries). In addition, the waveguide light propagation via the capillary wall provides a third dimension for illumination along the axis of the capillaries. Such an array can potentially be used for sensitive analysis of multiple fluorescent detection assays simultaneously. The simple phone-based capillary array approach presented in this paper is capable of amplifying weak fluorescent signals, thereby improving the sensitivity of optical detectors based on mobile phones. This may allow sensitive biological assays to be measured with low-sensitivity detectors and may make mHealth practical for many diagnostics applications, especially in resource-poor and global health settings. PMID:24039345

  7. Optofluidic cellular immunofunctional analysis by localized surface plasmon resonance

    NASA Astrophysics Data System (ADS)

    Kurabayashi, Katsuo; Oh, Bo-Ram

    2014-08-01

    Cytokine secretion assays provide the means to quantify intercellular-signaling proteins secreted by blood immune cells. These assays allow researchers and clinicians to obtain valuable information on the immune status of the donor. Previous studies have demonstrated that localized surface plasmon resonance (LSPR) effects enable label-free, real-time biosensing on a nanostructured metallic surface with simple optics and sensing tunability. However, limited sensitivity coupled with a lack of sample handling capability makes it challenging to implement LSPR biosensing in cellular functional immunoanalysis based on cytokine secretion assay. This paper describes our recent progress towards full development of a label-free LSPR biosensing technique to detect cell-secreted tumor necrosis factor (TNF)-α cytokines in clinical blood samples. We integrate LSPR bionanosensors in an optofluidic platform capable of handling target immune cells in a microfluidic chamber while readily permitting optical access for cytokine detection.

  8. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two different variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a modular system optimization into several subtask optimizations, which may be executed concurrently, and a system-level optimization that coordinates the subtasks. The BLISS method and its variants are well suited for exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are all ideally suited for concurrent processing. Algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
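
    A minimal structural sketch of this decomposition, with toy shared variables and placeholder optimizers; it shows only the concurrent-subtask/coordinating-system shape, not the BLISS formulation itself, which coordinates subtasks through sensitivity information:

      from concurrent.futures import ThreadPoolExecutor

      def subsystem_opt(name, shared):
          # Placeholder for one discipline's local optimization with the
          # shared design variables frozen; returns the subsystem response.
          return name, sum(shared.values())

      def system_opt(shared, responses):
          # Placeholder system-level step: update shared design variables
          # from the subsystem responses (BLISS uses sensitivity data here).
          return {k: v - 0.1 for k, v in shared.items()}

      shared = {"span": 30.0, "sweep": 25.0}    # shared design variables
      for cycle in range(10):                   # coordination cycles
          with ThreadPoolExecutor() as pool:    # subtasks run concurrently
              futures = [pool.submit(subsystem_opt, n, shared)
                         for n in ("aero", "struct")]
              responses = dict(f.result() for f in futures)
          shared = system_opt(shared, responses)
      print(shared)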

  9. Evaluation and study of advanced optical contamination, deposition, measurement, and removal techniques. [including computer programs and ultraviolet reflection analysis

    NASA Technical Reports Server (NTRS)

    Linford, R. M. F.; Allen, T. H.; Dillow, C. F.

    1975-01-01

    A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.

  10. Performance Analysis of Direct-Sequence Code-Division Multiple-Access Communications with Asymmetric Quadrature Phase-Shift-Keying Modulation

    NASA Technical Reports Server (NTRS)

    Wang, C.-W.; Stark, W.

    2005-01-01

    This article considers a quaternary direct-sequence code-division multiple-access (DS-CDMA) communication system with asymmetric quadrature phase-shift-keying (AQPSK) modulation for unequal error protection (UEP) capability. Both time synchronous and asynchronous cases are investigated. An expression for the probability distribution of the multiple-access interference is derived. The exact bit-error performance and the approximate performance using a Gaussian approximation and random signature sequences are evaluated by extending the techniques used for uniform quadrature phase-shift-keying (QPSK) and binary phase-shift-keying (BPSK) DS-CDMA systems. Finally, a general system model with unequal user power and the near-far problem is considered and analyzed. The results show that, for a system with UEP capability, the less protected data bits are more sensitive to the near-far effect that occurs in a multiple-access environment than are the more protected bits.
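
    The classical Gaussian approximation for asynchronous BPSK DS-CDMA with random signature sequences illustrates the kind of evaluation extended here to AQPSK; a minimal sketch, with illustrative user count, processing gain, and Eb/N0:

      import math

      def qfunc(x):
          return 0.5 * math.erfc(x / math.sqrt(2.0))

      def ber_ds_cdma_bpsk(k_users, n_chips, ebno_db):
          # Classical Gaussian approximation for asynchronous BPSK DS-CDMA
          # with random signatures: multiple-access interference is treated
          # as Gaussian noise, Pb = Q([ (K-1)/(3N) + N0/(2Eb) ]^(-1/2)).
          ebno = 10.0 ** (ebno_db / 10.0)
          sinr = 1.0 / ((k_users - 1) / (3.0 * n_chips) + 1.0 / (2.0 * ebno))
          return qfunc(math.sqrt(sinr))

      print(ber_ds_cdma_bpsk(k_users=10, n_chips=127, ebno_db=10.0))  # ~1e-4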

  11. HRST architecture modeling and assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Comstock, D.A.

    1997-01-01

    This paper presents work supporting the assessment of advanced concept options for the Highly Reusable Space Transportation (HRST) study. It describes the development of computer models as the basis for creating an integrated capability to evaluate the economic feasibility and sustainability of a variety of system architectures. It summarizes modeling capabilities for use on the HRST study to perform sensitivity analysis of alternative architectures (consisting of different combinations of highly reusable vehicles, launch assist systems, and alternative operations and support concepts) in terms of cost, schedule, performance, and demand. In addition, the identification and preliminary assessment of alternative market segments for HRST applications, such as space manufacturing, space tourism, etc., is described. Finally, the development of an initial prototype model that can begin to be used for modeling alternative HRST concepts at the system level is presented. © 1997 American Institute of Physics.

  12. A Fly-Inspired Mushroom Bodies Model for Sensory-Motor Control Through Sequence and Subsequence Learning.

    PubMed

    Arena, Paolo; Calí, Marco; Patané, Luca; Portera, Agnese; Strauss, Roland

    2016-09-01

    Classification and sequence learning are relevant capabilities used by living beings to extract complex information from the environment for behavioral control. The insect world is full of examples where the presentation time of specific stimuli shapes the behavioral response. On the basis of previously developed neural models, inspired by Drosophila melanogaster, a new architecture for classification and sequence learning is here presented under the perspective of the Neural Reuse theory. Classification of relevant input stimuli is performed through resonant neurons, activated by the complex dynamics generated in a lattice of recurrent spiking neurons modeling the insect Mushroom Bodies neuropile. The network devoted to context formation is able to reconstruct the learned sequence and also to trace the subsequences present in the provided input. A sensitivity analysis to parameter variation and noise is reported. Experiments on a roving robot are reported to show the capabilities of the architecture used as a neural controller.

  13. Composite laminate failure parameter optimization through four-point flexure experimentation and analysis

    DOE PAGES

    Nelson, Stacy; English, Shawn; Briggs, Timothy

    2016-05-06

    Fiber-reinforced composite materials offer light-weight solutions to many structural challenges. In the development of high-performance composite structures, a thorough understanding is required of the composite materials themselves as well as methods for the analysis and failure prediction of the relevant composite structures. However, the mechanical properties required for the complete constitutive definition of a composite material can be difficult to determine through experimentation. Therefore, efficient methods are necessary that can be used to determine which properties are relevant to the analysis of a specific structure and to establish a structure's response to a material parameter that can only be defined through estimation. The objectives of this paper deal with demonstrating the potential value of sensitivity and uncertainty quantification techniques during the failure analysis of loaded composite structures; and the proposed methods are applied to the simulation of the four-point flexural characterization of a carbon fiber composite material. Utilizing a recently implemented, phenomenological orthotropic material model that is capable of predicting progressive composite damage and failure, a sensitivity analysis is completed to establish which material parameters are truly relevant to a simulation's outcome. Then, a parameter study is completed to determine the effect of the relevant material properties' expected variations on the simulated four-point flexural behavior as well as to determine the value of an unknown material property. This process demonstrates the ability to formulate accurate predictions in the absence of a rigorous material characterization effort. Finally, the presented results indicate that a sensitivity analysis and parameter study can be used to streamline the material definition process as the described flexural characterization was used for model validation.

  14. Artificial neural network model for ozone concentration estimation and Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Gao, Meng; Yin, Liting; Ning, Jicai

    2018-07-01

    Air pollution in the urban atmosphere directly affects public health; therefore, it is essential to predict air pollutant concentrations. Air quality is a complex function of emissions, meteorology and topography, and artificial neural networks (ANNs) provide a sound framework for relating these variables. In this study, we investigated the feasibility of using an ANN model with meteorological parameters as input variables to predict ozone concentration in the urban area of Jinan, a metropolis in Northern China. We first found that the architecture of the network of neurons had little effect on the predicting capability of the ANN model. A parsimonious ANN model with 6 routinely monitored meteorological parameters and one temporal covariate (the category of day, i.e. working day, legal holiday and regular weekend) as input variables was identified, where the 7 input variables were selected following a forward selection procedure. Compared with the benchmarking ANN model with 9 meteorological and photochemical parameters as input variables, the predicting capability of the parsimonious ANN model was acceptable. Its predicting capability was also verified in terms of warning success ratio during the pollution episodes. Finally, uncertainty and sensitivity analyses were also performed based on Monte Carlo simulations (MCS). It was concluded that the ANN could properly predict the ambient ozone level. Maximum temperature, atmospheric pressure, sunshine duration and maximum wind speed were identified as the predominant input variables significantly influencing the prediction of ambient ozone concentrations.
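
    A minimal sketch of the forward selection procedure described above, using a small ANN and synthetic stand-in data; the variable names, scoring setup, and network size are assumptions, not the study's configuration:

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPRegressor

      def forward_select(X, y, names, n_keep):
          # Greedy forward selection: at each step, add the candidate input
          # variable that most improves cross-validated R^2 of a small ANN.
          def score(cols):
              model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                                   random_state=0)
              return cross_val_score(model, X[:, cols], y, cv=3, scoring="r2").mean()
          chosen, remaining = [], list(range(X.shape[1]))
          while remaining and len(chosen) < n_keep:
              best = max(remaining, key=lambda j: score(chosen + [j]))
              chosen.append(best)
              remaining.remove(best)
          return [names[j] for j in chosen]

      # Synthetic stand-in data; the study's 6 meteorological inputs plus a
      # day-category covariate are not reproduced here.
      rng = np.random.default_rng(1)
      X = rng.standard_normal((200, 5))
      y = 2.0 * X[:, 0] - X[:, 3] + 0.1 * rng.standard_normal(200)
      print(forward_select(X, y, ["tmax", "press", "sun", "wmax", "rh"], n_keep=2))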

  15. Detection of Subsurface Defects in Levees in Correlation to Weather Conditions Utilizing Ground Penetrating Radar

    NASA Astrophysics Data System (ADS)

    Martinez, I. A.; Eisenmann, D.

    2012-12-01

    Ground Penetrating Radar (GPR) has been used for many years in successful subsurface detection of conductive and non-conductive objects in all types of material, including different soils and concrete. Typical defect detection is based on subjective examination of processed scans, using data collection and analysis software to acquire and analyze the data, and often requires developed expertise or an awareness of how GPR works while collecting data. Processing programs, such as GSSI's RADAN analysis software, are then used to validate the collected information. Iowa State University's Center for Nondestructive Evaluation (CNDE) has built a test site, resembling a typical levee used near rivers, which contains known sub-surface targets of varying size, depth, and conductivity. Scientists at CNDE have developed software with enhanced capabilities to decipher a hyperbola's magnitude and amplitude for GPR signal processing. With this enhanced capability, the signal processing and defect detection capabilities of GPR have the potential to be greatly improved. This study will examine the effects of test parameters, antenna frequency (400 MHz), data manipulation methods (which include data filters and restricting the depth range that the chosen antenna's signal can reach), and real-world conditions at this test site (such as varying weather conditions), with the goal of improving GPR test sensitivity for differing soil conditions.

  16. Use of Accelerator Mass Spectrometry in Human Health and Molecular Toxicology

    DOE PAGES

    Enright, Heather A.; Malfatti, Michael A.; Zimmermann, Maike; ...

    2016-10-11

    Accelerator mass spectrometry (AMS) has been adopted as a powerful bioanalytical method for human studies in the areas of pharmacology and toxicology. The exquisite sensitivity (10⁻¹⁸ mol) of AMS has facilitated studies of toxins and drugs at environmentally and physiologically relevant concentrations in humans. Such studies include risk assessment of environmental toxicants, drug candidate selection, absolute bioavailability determination, and more recently, assessment of drug-target binding as a biomarker of response to chemotherapy. Combining AMS with complementary capabilities such as high performance liquid chromatography (HPLC) can maximize data within a single experiment and provide additional insight when assessing drugs and toxins, such as metabolic profiling. Recent advances in the AMS technology at Lawrence Livermore National Laboratory have allowed for direct coupling of AMS with complementary capabilities such as HPLC via a liquid sample moving wire interface, offering greater sensitivity compared to that of graphite-based analysis, therefore enabling the use of lower 14C and chemical doses, which are imperative for clinical testing. In conclusion, the aim of this review is to highlight the recent efforts in human studies using AMS, including technological advancements and discussion of the continued promise of AMS for innovative clinical based research.

  17. Gas chromatography/mass spectrometry comprehensive analysis of organophosphorus, brominated flame retardants, by-products and formulation intermediates in water.

    PubMed

    Cristale, Joyce; Quintana, Jordi; Chaler, Roser; Ventura, Francesc; Lacorte, Silvia

    2012-06-08

    A multiresidue method based on gas chromatography coupled to quadrupole mass spectrometry was developed to determine organophosphorus flame retardants, polybromodiphenyl ethers (BDEs 28, 47, 99, 100, 153, 154, 183 and 209), new brominated flame retardants, bromophenols, bromoanilines, bromotoluenes and bromoanisoles in water. Two ionization techniques (electron ionization--EI, and electron capture negative ionization--ECNI) and two acquisition modes (selected ion monitoring--SIM, and selected reaction monitoring--SRM) were compared as regards mass spectral characterization, sensitivity and quantification capabilities. The highest sensitivity, at the expense of identification capacity, was obtained by GC-ECNI-MS/SIM for most of the compounds analyzed, mainly for PBDEs and decabromodiphenyl ethane, while GC-EI-MS/MS in SRM was the most selective technique and permitted the identification of target compounds at the pg level; identification capabilities increased when real samples were analyzed. This method was further used to evaluate the presence and behavior of flame retardants within a drinking water treatment facility. Organophosphorus flame retardants were the only compounds detected in influent waters, at levels of 0.32-0.03 μg L⁻¹, and their elimination throughout the different treatment stages was evaluated. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Use of Accelerator Mass Spectrometry in Human Health and Molecular Toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enright, Heather A.; Malfatti, Michael A.; Zimmermann, Maike

    Accelerator mass spectrometry (AMS) has been adopted as a powerful bioanalytical method for human studies in the areas of pharmacology and toxicology. The exquisite sensitivity (10⁻¹⁸ mol) of AMS has facilitated studies of toxins and drugs at environmentally and physiologically relevant concentrations in humans. Such studies include risk assessment of environmental toxicants, drug candidate selection, absolute bioavailability determination, and more recently, assessment of drug-target binding as a biomarker of response to chemotherapy. Combining AMS with complementary capabilities such as high performance liquid chromatography (HPLC) can maximize data within a single experiment and provide additional insight when assessing drugs and toxins, such as metabolic profiling. Recent advances in the AMS technology at Lawrence Livermore National Laboratory have allowed for direct coupling of AMS with complementary capabilities such as HPLC via a liquid sample moving wire interface, offering greater sensitivity compared to that of graphite-based analysis, therefore enabling the use of lower 14C and chemical doses, which are imperative for clinical testing. In conclusion, the aim of this review is to highlight the recent efforts in human studies using AMS, including technological advancements and discussion of the continued promise of AMS for innovative clinical based research.

  19. Instrument for fluorescence sensing of circulating cells with diffuse light in mice in vivo.

    PubMed

    Zettergren, Eric; Vickers, Dwayne; Runnels, Judith; Murthy, Shashi K; Lin, Charles P; Niedre, Mark

    2012-03-01

    Accurate quantification of circulating cell populations in mice is important in many areas of preclinical biomedical research. Normally, this is done either by extraction and analysis of small blood samples or, more recently, by using microscopy-based in vivo fluorescence flow cytometry. We describe a new technological approach to this problem using detection of diffuse fluorescent light from relatively large blood vessels in vivo. The diffuse fluorescence flow cytometer (DFFC) uses a laser to illuminate a mouse limb and an array of optical fibers coupled to a high-sensitivity photomultiplier tube array operating in photon counting mode to detect weak fluorescence signals from cells. We first demonstrate that the DFFC instrument is capable of detecting fluorescent microspheres and Vybrant-DiD-labeled cells in a custom-made optical flow phantom with similar size, optical properties, linear flow rates, and autofluorescence as a mouse limb. We also present preliminary data demonstrating that the DFFC is capable of detecting circulating cells in nude mice in vivo. In principle, this device would allow interrogation of the whole blood volume of a mouse in minutes, with sensitivity improvement by several orders of magnitude compared to current approaches. © 2012 Society of Photo-Optical Instrumentation Engineers (SPIE).

  20. Massively parallel haplotyping on microscopic beads for the high-throughput phase analysis of single molecules.

    PubMed

    Boulanger, Jérôme; Muresan, Leila; Tiemann-Boege, Irene

    2012-01-01

    In spite of the many advances in haplotyping methods, it is still very difficult to characterize rare haplotypes in tissues and different environmental samples or to accurately assess the haplotype diversity in large mixtures. This would require a haplotyping method capable of analyzing the phase of single molecules with an unprecedented throughput. Here we describe such a haplotyping method, capable of analyzing hundreds of thousands of single molecules in parallel in one experiment. In this method, multiple PCR reactions amplify different polymorphic regions of a single DNA molecule on a magnetic bead compartmentalized in an emulsion drop. The allelic states of the amplified polymorphisms are identified with fluorescently labeled probes that are then decoded from images taken of the arrayed beads by a microscope. This method can evaluate the phase of up to 3 polymorphisms separated by up to 5 kilobases in hundreds of thousands of single molecules. We tested the sensitivity of the method by measuring the number of mutant haplotypes synthesized by four different commercially available enzymes: Phusion, Platinum Taq, Titanium Taq, and Phire. The digital nature of the method makes it highly sensitive, capable of detecting haplotype ratios of less than 1:10,000. We also accurately quantified chimera formation during the exponential phase of PCR by different DNA polymerases.

  1. A New High-sensitivity solar X-ray Spectrophotometer SphinX:early operations and databases

    NASA Astrophysics Data System (ADS)

    Gburek, Szymon; Sylwester, Janusz; Kowalinski, Miroslaw; Siarkowski, Marek; Bakala, Jaroslaw; Podgorski, Piotr; Trzebinski, Witold; Plocieniak, Stefan; Kordylewski, Zbigniew; Kuzin, Sergey; Farnik, Frantisek; Reale, Fabio

    The Solar Photometer in X-rays (SphinX) is an instrument operating aboard the Russian CORONAS-Photon satellite. A short description of this unique instrument will be presented and its capabilities discussed. SphinX is presently the most sensitive solar X-ray spectrophotometer measuring solar spectra in the energy range above 1 keV. A large archive of SphinX measurements has already been collected, and general access to these measurements is possible. The SphinX data repositories contain lightcurves, spectra, and photon arrival time measurements. The SphinX data cover nearly continuously the period from the satellite launch on January 30, 2009, up to the end of November 2009. Present instrument status, data formats and data access methods will be shown. An overview of possible new science coming from SphinX data analysis will be discussed.

  2. JPL Contamination Control Engineering

    NASA Technical Reports Server (NTRS)

    Blakkolb, Brian

    2013-01-01

    JPL has extensive expertise fielding contamination-sensitive missions, in house and with NASA/industry/academic partners: development and implementation of performance-driven cleanliness requirements for a wide range of missions and payloads (UV-Vis-IR: GALEX, Dawn, Juno, WFPC-II, AIRS, TES, et al.; propulsion, thermal control, and robotic sample acquisition systems). Contamination control engineering spans the mission life cycle: system and payload requirements derivation, analysis, and contamination control implementation plans; hardware design, risk trades, and requirements verification and validation; assembly, integration and test planning and implementation; launch site operations and launch vehicle/payload integration; and flight operations. Personnel on staff have expertise with space materials development and flight experiments. JPL has the capabilities and expertise to successfully address contamination issues presented by space and habitable environments, extensive experience fielding and managing contamination-sensitive missions, and an excellent working relationship with the aerospace contamination control engineering community.

  3. Validation of high throughput screening of human sera for detection of anti-PA IgG by Enzyme-Linked Immunosorbent Assay (ELISA) as an emergency response to an anthrax incident

    PubMed Central

    Semenova, Vera A.; Steward-Clark, Evelene; Maniatis, Panagiotis; Epperson, Monica; Sabnis, Amit; Schiffer, Jarad

    2017-01-01

    To improve surge testing capability for a response to a release of Bacillus anthracis, the CDC anti-Protective Antigen (PA) IgG Enzyme-Linked Immunosorbent Assay (ELISA) was re-designed into a high throughput screening format. The following assay performance parameters were evaluated: goodness of fit (measured as the mean reference standard r2), accuracy (measured as percent error), precision (measured as coefficient of variance (CV)), lower limit of detection (LLOD), lower limit of quantification (LLOQ), dilutional linearity, diagnostic sensitivity (DSN) and diagnostic specificity (DSP). The paired sets of data for each sample were evaluated by Concordance Correlation Coefficient (CCC) analysis. The goodness of fit was 0.999; percent error between the expected and observed concentration for each sample ranged from −4.6% to 14.4%. The coefficient of variance ranged from 9.0% to 21.2%. The assay LLOQ was 2.6 μg/mL. The regression analysis results for dilutional linearity data were r2 = 0.952, slope = 1.02 and intercept = −0.03. CCC between assays was 0.974 for the median concentration of serum samples. The accuracy and precision components of CCC were 0.997 and 0.977, respectively. This high throughput screening assay is precise, accurate, sensitive and specific. Anti-PA IgG concentrations determined using two different assays proved high levels of agreement. The method will improve surge testing capability 18-fold from 4 to 72 sera per assay plate. PMID:27814939
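
    The Concordance Correlation Coefficient used for the paired-assay comparison can be computed directly from its definition; a minimal sketch with hypothetical paired concentrations, not the study's data:

      import numpy as np

      def concordance_ccc(x, y):
          # Lin's concordance correlation coefficient:
          #   CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)
          # It penalizes both scatter (precision) and bias (accuracy).
          x, y = np.asarray(x, float), np.asarray(y, float)
          sxy = np.mean((x - x.mean()) * (y - y.mean()))
          return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

      # Hypothetical paired anti-PA IgG concentrations (ug/mL), two assay runs
      a = [3.1, 7.8, 15.2, 33.0, 61.5]
      b = [3.3, 7.4, 16.0, 31.8, 63.1]
      print(f"CCC = {concordance_ccc(a, b):.3f}")  # near 1 => strong agreement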

  4. Validation of high throughput screening of human sera for detection of anti-PA IgG by Enzyme-Linked Immunosorbent Assay (ELISA) as an emergency response to an anthrax incident.

    PubMed

    Semenova, Vera A; Steward-Clark, Evelene; Maniatis, Panagiotis; Epperson, Monica; Sabnis, Amit; Schiffer, Jarad

    2017-01-01

    To improve surge testing capability for a response to a release of Bacillus anthracis, the CDC anti-Protective Antigen (PA) IgG Enzyme-Linked Immunosorbent Assay (ELISA) was re-designed into a high throughput screening format. The following assay performance parameters were evaluated: goodness of fit (measured as the mean reference standard r2), accuracy (measured as percent error), precision (measured as coefficient of variance (CV)), lower limit of detection (LLOD), lower limit of quantification (LLOQ), dilutional linearity, diagnostic sensitivity (DSN) and diagnostic specificity (DSP). The paired sets of data for each sample were evaluated by Concordance Correlation Coefficient (CCC) analysis. The goodness of fit was 0.999; percent error between the expected and observed concentration for each sample ranged from -4.6% to 14.4%. The coefficient of variance ranged from 9.0% to 21.2%. The assay LLOQ was 2.6 μg/mL. The regression analysis results for dilutional linearity data were r2 = 0.952, slope = 1.02 and intercept = -0.03. CCC between assays was 0.974 for the median concentration of serum samples. The accuracy and precision components of CCC were 0.997 and 0.977, respectively. This high throughput screening assay is precise, accurate, sensitive and specific. Anti-PA IgG concentrations determined using two different assays proved high levels of agreement. The method will improve surge testing capability 18-fold from 4 to 72 sera per assay plate. Published by Elsevier Ltd.

  5. Factors dominating 3-dimensional ozone distribution during high tropospheric ozone period.

    PubMed

    Chen, Xiaoyang; Liu, Yiming; Lai, Anqi; Han, Shuangshuang; Fan, Qi; Wang, Xuemei; Ling, Zhenhao; Huang, Fuxiang; Fan, Shaojia

    2018-01-01

    Data from an in situ monitoring network and five ozonesondes are analysed during August of 2012, and a high tropospheric ozone episode is observed around August 8. The Community Multi-scale Air Quality (CMAQ) model and its process analysis tool were used to study factors and mechanisms for high ozone mixing ratio at different levels of ozone vertical profiles. A sensitivity scenario without chemical initial and boundary conditions (ICBCs) from MOZART4-GEOS5 was applied to study the impact of stratosphere-troposphere exchange (STE) on vertical ozone. The simulation results indicated that the first high ozone peak near the tropopause was dominated by STE. Results from process analysis showed that: in the urban area, the second peak at approximately 2 km above ground height was mainly caused by local photochemical production. The third peak (near surface) was mainly caused by the upwind transportation from the suburban/rural areas; in the suburban/rural areas, local photochemical production of ozone dominated the high ozone mixing ratio from the surface to approximately 3 km height. Furthermore, the capability of indicators to distinguish O3-precursor sensitivity along the vertical O3 profiles was investigated. Two sensitivity scenarios, which had cut 30% of anthropogenic NOx or VOC emissions, showed that O3-precursor indicators, specifically the ratios of O3/NOy, H2O2/HNO3 or H2O2/NOz, could partly distinguish the O3-precursor sensitivity between VOCs-sensitive and NOx-sensitive along the vertical profiles. In the urban area, the O3-precursor relationship transferred from VOCs-sensitive within the boundary layer to NOx-sensitive at approximately 1-3 km above ground height, further confirming the dominant roles of transportation and photochemical production in high O3 peaks at the near-ground layer and 2 km above ground height, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Phosphorus component in AnnAGNPS

    USGS Publications Warehouse

    Yuan, Y.; Bingner, R.L.; Theurer, F.D.; Rebich, R.A.; Moore, P.A.

    2005-01-01

    The USDA Annualized Agricultural Non-Point Source Pollution model (AnnAGNPS) has been developed to aid in evaluation of watershed response to agricultural management practices. Previous studies have demonstrated the capability of the model to simulate runoff and sediment, but not phosphorus (P). The main purpose of this article is to evaluate the performance of AnnAGNPS on P simulation using comparisons with measurements from the Deep Hollow watershed of the Mississippi Delta Management Systems Evaluation Area (MDMSEA) project. A sensitivity analysis was performed to identify input parameters whose impact is the greatest on P yields. Sensitivity analysis results indicate that the most sensitive variables of those selected are initial soil P contents, P application rate, and plant P uptake. AnnAGNPS simulations of dissolved P yield do not agree well with observed dissolved P yield (Nash-Sutcliffe coefficient of efficiency of 0.34, R2 of 0.51, and slope of 0.24); however, AnnAGNPS simulations of total P yield agree well with observed total P yield (Nash-Sutcliffe coefficient of efficiency of 0.85, R2 of 0.88, and slope of 0.83). The difference in dissolved P yield may be attributed to limitations in model simulation of P processes. Uncertainties in input parameter selections also affect the model's performance.
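
    The Nash-Sutcliffe coefficient of efficiency quoted above is one minus the ratio of residual variance to observed variance; a minimal sketch with hypothetical yields, not the MDMSEA watershed data:

      import numpy as np

      def nash_sutcliffe(obs, sim):
          # Nash-Sutcliffe coefficient of efficiency:
          #   NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
          # 1 is a perfect fit; near 0 means no better than the observed mean.
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      # Hypothetical total-P yields, for illustration only
      observed = [0.8, 1.4, 2.9, 1.1, 0.5]
      simulated = [0.9, 1.2, 2.6, 1.3, 0.6]
      print(f"NSE = {nash_sutcliffe(observed, simulated):.2f}")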

  7. Marine electrical resistivity imaging of submarine groundwater discharge: Sensitivity analysis and application in Waquoit Bay, Massachusetts, USA

    USGS Publications Warehouse

    Henderson, Rory; Day-Lewis, Frederick D.; Abarca, Elena; Harvey, Charles F.; Karam, Hanan N.; Liu, Lanbo; Lane, John W.

    2010-01-01

    Electrical resistivity imaging has been used in coastal settings to characterize fresh submarine groundwater discharge and the position of the freshwater/salt-water interface because of the relation of bulk electrical conductivity to pore-fluid conductivity, which in turn is a function of salinity. Interpretation of tomograms for hydrologic processes is complicated by inversion artifacts, uncertainty associated with survey geometry limitations, measurement errors, and choice of regularization method. Variation of seawater over tidal cycles poses unique challenges for inversion. The capabilities and limitations of resistivity imaging are presented for characterizing the distribution of freshwater and saltwater beneath a beach. The experimental results provide new insight into fresh submarine groundwater discharge at Waquoit Bay National Estuarine Research Reserve, East Falmouth, Massachusetts (USA). Tomograms from the experimental data indicate that fresh submarine groundwater discharge may shut down at high tide, whereas temperature data indicate that the discharge continues throughout the tidal cycle. Sensitivity analysis and synthetic modeling provide insight into resolving power in the presence of a time-varying saline water layer. In general, vertical electrodes and cross-hole measurements improve the inversion results regardless of the tidal level, whereas the resolution of surface arrays is more sensitive to time-varying saline water layer.

  8. Multi-Component Profiling of Trace Volatiles in Blood by Gas Chromatography/Mass Spectrometry with Dynamic Headspace Extraction

    PubMed Central

    Kakuta, Shoji; Yamashita, Toshiyuki; Nishiumi, Shin; Yoshida, Masaru; Fukusaki, Eiichiro; Bamba, Takeshi

    2015-01-01

    A dynamic headspace extraction method (DHS) with high-pressure injection is described. This dynamic extraction method has superior sensitivity to solid-phase microextraction (SPME) and is capable of extracting the entire gas phase by purging the headspace of a vial. Optimization of the DHS parameters resulted in a highly sensitive volatile profiling system with the ability to detect various volatile components, including alcohols, at nanogram levels. The average LOD for a standard volatile mixture was 0.50 ng mL−1, and the average LOD for alcohols was 0.66 ng mL−1. This method was used for the analysis of volatile components from biological samples and compared between acute and chronic inflammation models. The method permitted the identification of volatiles with the same profile pattern as in vitro oxidized lipid-derived volatiles. In addition, the concentrations of alcohols and aldehydes from the acute inflammation model samples were significantly higher than those from the chronic inflammation model samples. The different profiles of these samples could also be identified by this method. Finally, it was possible to analyze, with high sensitivity, alcohols and low-molecular-weight volatiles that are difficult to analyze by SPME, and to demonstrate volatile profiling based on simultaneous multi-volatile analysis. PMID:26819905

  9. Multi-Component Profiling of Trace Volatiles in Blood by Gas Chromatography/Mass Spectrometry with Dynamic Headspace Extraction.

    PubMed

    Kakuta, Shoji; Yamashita, Toshiyuki; Nishiumi, Shin; Yoshida, Masaru; Fukusaki, Eiichiro; Bamba, Takeshi

    2015-01-01

    A dynamic headspace extraction method (DHS) with high-pressure injection is described. This dynamic extraction method has superior sensitivity to solid-phase microextraction (SPME) and is capable of extracting the entire gas phase by purging the headspace of a vial. Optimization of the DHS parameters resulted in a highly sensitive volatile profiling system with the ability to detect various volatile components, including alcohols, at nanogram levels. The average LOD for a standard volatile mixture was 0.50 ng mL−1, and the average LOD for alcohols was 0.66 ng mL−1. This method was used for the analysis of volatile components from biological samples and compared between acute and chronic inflammation models. The method permitted the identification of volatiles with the same profile pattern as in vitro oxidized lipid-derived volatiles. In addition, the concentrations of alcohols and aldehydes from the acute inflammation model samples were significantly higher than those from the chronic inflammation model samples. The different profiles of these samples could also be identified by this method. Finally, it was possible to analyze, with high sensitivity, alcohols and low-molecular-weight volatiles that are difficult to analyze by SPME, and to demonstrate volatile profiling based on simultaneous multi-volatile analysis.

  10. A review of optimization and quantification techniques for chemical exchange saturation transfer (CEST) MRI toward sensitive in vivo imaging

    PubMed Central

    Guo, Yingkun; Zheng, Hairong; Sun, Phillip Zhe

    2015-01-01

    Chemical exchange saturation transfer (CEST) MRI is a versatile imaging method that probes the chemical exchange between bulk water and exchangeable protons. CEST imaging indirectly detects dilute labile protons via bulk water signal changes following selective saturation of exchangeable protons, which offers substantial sensitivity enhancement and has sparked numerous biomedical applications. Over the past decade, CEST imaging techniques have rapidly evolved due to contributions from multiple domains, including the development of CEST mathematical models, innovative contrast agent designs, sensitive data acquisition schemes, efficient field inhomogeneity correction algorithms, and quantitative CEST (qCEST) analysis. The CEST system that underlies the apparent CEST-weighted effect, however, is complex. The experimentally measurable CEST effect depends not only on parameters such as CEST agent concentration, pH and temperature, but also on relaxation rate, magnetic field strength and more importantly, experimental parameters including repetition time, RF irradiation amplitude and scheme, and image readout. Thorough understanding of the underlying CEST system using qCEST analysis may augment the diagnostic capability of conventional imaging. In this review, we provide a concise explanation of CEST acquisition methods and processing algorithms, including their advantages and limitations, for optimization and quantification of CEST MRI experiments. PMID:25641791
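
    A common entry point to the quantification methods reviewed here is the magnetization transfer ratio asymmetry; a minimal sketch, with hypothetical offsets and signal values:

      def mtr_asym(s_neg, s_pos, s0):
          # Magnetization transfer ratio asymmetry, a common CEST metric:
          #   MTRasym(dw) = (S(-dw) - S(+dw)) / S0,
          # with S(+/-dw) the water signal after saturation at +/- the
          # labile-proton offset and S0 the unsaturated signal.
          return (s_neg - s_pos) / s0

      # Hypothetical signals at +/-3.5 ppm (amide offset), S0 = 1000
      print(f"MTRasym = {mtr_asym(820.0, 760.0, 1000.0):.3f}")  # 0.060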

  11. Some aspects of analytical chemistry as applied to water quality assurance techniques for reclaimed water: The potential use of X-ray fluorescence spectrometry for automated on-line fast real-time simultaneous multi-component analysis of inorganic pollutants in reclaimed water

    NASA Technical Reports Server (NTRS)

    Ling, A. C.; Macpherson, L. H.; Rey, M.

    1981-01-01

    The potential use of isotopically excited energy dispersive X-ray fluorescence (XRF) spectrometry for automated on-line fast real-time (5 to 15 minutes) simultaneous multicomponent (up to 20) trace (1 to 10 parts per billion) analysis of inorganic pollutants in reclaimed water was examined. Three anionic elements (chromium(VI), arsenic and selenium) were studied. The inherent lack of sensitivity of XRF spectrometry for these elements mandates the use of a preconcentration technique, and various methods were examined, including several direct and indirect evaporation methods, ion exchange membranes, selective and nonselective precipitation, and complexation processes. It is shown that XRF spectrometry itself is well suited for automated on-line quality assurance, and can provide a nondestructive (thus allowing sample storage and repeat analysis) and particularly convenient analytical method. Further, the use of an isotopically excited energy dispersive unit (50 mCi Cd-109 source) coupled with a suitable preconcentration process can provide sufficient sensitivity to achieve the currently mandated minimum levels of detection without the need for high power X-ray generating tubes.

  12. Understanding Organics in Meteorites and the Pre-Biotic Environment

    NASA Technical Reports Server (NTRS)

    Zare, Richard N.

    2003-01-01

    (1) Refinement of the analytic capabilities of our experiment via characterization of molecule-specific response and the effects upon analysis of the type of sample under investigation; (2) Measurement of polycyclic aromatic hydrocarbons (PAHs) with high sensitivity and spatial resolution within extraterrestrial samples; (3) Investigation of the interstellar reactions of PAHs via the analysis of species formed in systems modeling dust grains and ices; (4) Investigations into the potential role of PAHs in prebiotic and early biotic chemistry via photoreactions of PAHs under simulated prebiotic Earth conditions. To meet these objectives, we use microprobe laser-desorption, laser-ionization mass spectrometry (μL²MS), which is a sensitive, selective, and spatially resolved technique for detection of aromatic compounds. Appendix A presents a description of the μL²MS technique. The initial grant proposal was for a three-year funding period, while the award was given for a one-year interim period. Because of this change in time period, emphasis was shifted from the first research goal, which was more development-oriented, in order to focus more on the other analysis-oriented goals. The progress made on each of the four research areas is given below.

  13. PIXE analysis of caries related trace elements in tooth enamel

    NASA Astrophysics Data System (ADS)

    Annegarn, H. J.; Jodaikin, A.; Cleaton-Jones, P. E.; Sellschop, J. P. F.; Madiba, C. C. P.; Bibby, D.

    1981-03-01

    PIXE analysis has been applied to a set of twenty human teeth to determine trace element concentrations in enamel from areas susceptible to dental caries (mesial and distal contact points) and in areas less susceptible to the disease (buccal surfaces), with the aim of determining the possible roles of trace elements in the carious process. The samples were caries-free anterior incisors extracted for periodontal reasons from subjects 10-30 years of age. Prior to extraction of the sample teeth, a detailed dental history and examination was carried out for each individual. PIXE analysis, using a 3 MeV proton beam of 1 mm diameter, allowed the determination of Ca, Mn, Fe, Cu, Zn, Sr and Pb above detection limits. As demonstrated in this work, the enhanced sensitivity of PIXE analysis over electron microprobe analysis, and the capability of localised surface analysis compared with the pooled samples required for neutron activation analysis, make it a powerful and useful technique in dental analysis.

  14. High-temperature explosive development for geothermal well stimulation. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, E.W.; Mars, J.E.; Wang, C.

    1978-03-31

    A two-component, temperature-resistant liquid explosive called HITEX has been developed which is capable of withstanding 561 K (550 °F) for 24 hours in a geothermal environment. The explosive is intended for the stimulation of nonproducing or marginally producing geothermal (hot dry rock, vapor-dominated or hydrothermal) reservoirs by fracturing the strata in the vicinity of a borehole. The explosive is inherently safe because it is mixed downhole from two nondetonable liquid components. Development and safety tests included differential scanning calorimetry, thermal stability, minerals compatibility, drop-weight sensitivity, adiabatic compression, electrostatic discharge sensitivity, friction sensitivity, detonation arrest capability, cook-off tests, detonability at ambient and elevated pressure, detonation velocity and thin film propagation in a wedge.

  15. Successful Completion of FY18/Q1 ASC L2 Milestone 6355: Electrical Analysis Calibration Workflow Capability Demonstration.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copps, Kevin D.

    The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts' use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today's SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to the electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.

  16. Chemically Designed Metallic/Insulating Hybrid Nanostructures with Silver Nanocrystals for Highly Sensitive Wearable Pressure Sensors.

    PubMed

    Kim, Haneun; Lee, Seung-Wook; Joh, Hyungmok; Seong, Mingi; Lee, Woo Seok; Kang, Min Su; Pyo, Jun Beom; Oh, Soong Ju

    2018-01-10

    With the increase in interest in wearable tactile pressure sensors for e-skin, research into nanostructures that achieve high sensitivity has been actively conducted. However, limitations such as complex fabrication processes using expensive equipment still exist. Herein, simple lithography-free techniques to develop pyramid-like metal/insulator hybrid nanostructures utilizing nanocrystals (NCs) are demonstrated. Ligand-exchanged and unexchanged silver NC thin films are used as metallic and insulating components, respectively. The interfaces of each NC layer are chemically engineered to create discontinuous insulating layers, i.e., spacers for improved sensitivity, and eventually to realize fully solution-processed pressure sensors. Device performance analysis with structural, chemical, and electronic characterization and a conductive atomic force microscopy study reveals that the hybrid nanostructure based pressure sensor shows an enhanced sensitivity of higher than 500 kPa⁻¹, reliability, and low power consumption over a wide pressure sensing range. Nano-/micro-hierarchical structures are also designed by combining hybrid nanostructures with conventional microstructures, exhibiting a further enhanced sensing range and achieving a record sensitivity of 2.72 × 10⁴ kPa⁻¹. Finally, all-solution-processed pressure sensor arrays with high pixel density, capable of detecting delicate signals with high spatial selectivity much better than the human tactile threshold, are introduced.
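
    A sensitivity such as "higher than 500 kPa⁻¹" is conventionally the local slope of the relative response versus pressure, S = d(ΔI/I₀)/dP. A sketch of that computation on hypothetical current-pressure data (all values are assumptions):

```python
import numpy as np

# Hypothetical response curve: applied pressure (kPa) vs. device current (uA)
p = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])
i = np.array([0.01, 0.8, 2.0, 5.5, 18.0, 42.0])

i0 = i[0]                                 # baseline current at zero load
rel_change = (i - i0) / i0                # delta-I / I0
sensitivity = np.gradient(rel_change, p)  # S = d(delta-I/I0)/dP, in kPa^-1

for pk, s in zip(p, sensitivity):
    print(f"P = {pk:5.1f} kPa  ->  S = {s:9.1f} kPa^-1")
```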

  17. a Facile Synthesis of Fully Porous Tazo Composite and its Remarkable Gas Sensitive Performance

    NASA Astrophysics Data System (ADS)

    Liang, Dongdong; Liu, Shimin; Wang, Zhinuo; Guo, Yu; Jiang, Weiwei; Liu, Chaoqian; Ding, Wanyu; Wang, Hualin; Wang, Nan; Zhang, Zhihua

    A composite of a nanocrystalline SnO2 thick film deposited on an Al-doped ZnO ceramic substrate is proposed for the first time. This study also provides a simple, fast and cost-effective method to prepare the SnO2 thick film and the Al-doped ZnO ceramic, as well as the final composite. The crystal structure, morphology, composition, pore size distribution and gas sensitivity of the composite were investigated by means of X-ray diffraction, scanning electron microscopy, transmission electron microscopy, energy dispersive spectroscopy, Barrett-Joyner-Halenda analysis and a gas-sensing measurement system. Results indicated that the composite was fully porous, consisting of SnO2, ZnO and ZnAl2O4 crystal phases. The macro-sized pores generated in the composite effectively enhance gas infiltration into the sensing layers. In this way, combining a high gas-transporting capability with a nanocrystalline SnO2 thick film, the composite showed very impressive performance. The gas sensitivity of the composite was high for ethanol vapor over a range of concentrations, comparable to other reported SnO2 gas sensors, and the response followed two straight lines with a turning point at 1000 ppm. Finally, a gas sensing mechanism is proposed based on the microstructure and composition of the composite.

  18. Design and numerical analysis of highly sensitive Au-MoS2-graphene based hybrid surface plasmon resonance biosensor

    NASA Astrophysics Data System (ADS)

    Rahman, M. Saifur; Anower, Md. Shamim; Hasan, Md. Rabiul; Hossain, Md. Biplob; Haque, Md. Ismail

    2017-08-01

    We demonstrate a highly sensitive Au-MoS2-graphene based hybrid surface plasmon resonance (SPR) biosensor for the detection of DNA hybridization. The performance parameters of the proposed sensor are investigated in terms of sensitivity, detection accuracy and quality factor at an operating wavelength of 633 nm. We observed in the numerical study that sensitivity can be greatly increased by adding a MoS2 layer in the middle of a graphene-on-Au stack. It is shown that, using a single layer of MoS2 between the gold and graphene layers, the proposed biosensor simultaneously exhibits a high sensitivity of 87.8 deg/RIU, a high detection accuracy of 1.28 and a quality factor of 17.56 with a gold layer thickness of 50 nm. This increased performance is due to the biomolecule absorption ability and optical characteristics of graphene and the high fluorescence quenching ability of MoS2. On the basis of changes in the SPR angle and minimum reflectance, the proposed sensor can sense the nucleotide bonding that occurs between double-stranded DNA (dsDNA) helix structures. Therefore, this sensor can successfully detect the hybridization of target DNAs to the probe DNAs pre-immobilized on the Au-MoS2-graphene hybrid, with the capability of distinguishing a single-base mismatch.
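
    Figures of merit like those above are commonly computed from the angular SPR reflectance curve as S = Δθres/Δn and QF = S/FWHM, with detection accuracy tied to the sharpness of the resonance dip; exact definitions vary between papers. A sketch under those common conventions, with assumed curve readings rather than this paper's data:

```python
# Hypothetical SPR curve readings before/after a refractive-index change
d_theta = 0.439   # resonance angle shift (degrees)
d_n = 0.005       # refractive-index change of the sensing medium (RIU)
fwhm = 5.0        # full width at half maximum of the reflectance dip (degrees)

sensitivity = d_theta / d_n            # deg/RIU
detection_accuracy = d_theta / fwhm    # one common (dimensionless) definition
quality_factor = sensitivity / fwhm    # RIU^-1

print(f"S  = {sensitivity:.1f} deg/RIU")
print(f"DA = {detection_accuracy:.3f}")
print(f"QF = {quality_factor:.2f} RIU^-1")
```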

  19. Integration of Quartz Crystal Microbalance-Dissipation and Reflection-Mode Localized Surface Plasmon Resonance Sensors for Biomacromolecular Interaction Analysis.

    PubMed

    Ferhan, Abdul Rahim; Jackman, Joshua A; Cho, Nam-Joon

    2016-12-20

    The combination of label-free, surface-sensitive measurement techniques based on different physical principles enables detailed characterization of biomacromolecular interactions at solid-liquid interfaces. To date, most combined measurement systems have involved experimental techniques with similar probing volumes, whereas the potential of utilizing techniques with different surface sensitivities remains largely unexplored, especially for data interpretation. Herein, we report a combined measurement approach that integrates a conventional quartz crystal microbalance-dissipation (QCM-D) setup with a reflection-mode localized surface plasmon (LSPR) sensor. Using this platform, we investigate vesicle adsorption on a titanium oxide-coated sensing substrate along with the amphipathic, α-helical (AH) peptide-induced structural transformation of surface-adsorbed lipid vesicles into a supported lipid bilayer (SLB) as a model biomacromolecular interaction. While the QCM-D and LSPR signals both detected mass uptake arising from vesicle adsorption, tracking the AH peptide-induced structural transformation revealed more complex measurement responses based on the different surface sensitivities of the two techniques. In particular, the LSPR signal recorded an increase in optical mass near the sensor surface which indicated SLB formation, whereas the QCM-D signals detected a significant loss in net acoustic mass due to excess lipid and coupled solvent leaving the probing volume. Importantly, these measurement capabilities allowed us to temporally distinguish the process of SLB formation at the sensor surface from the overall structural transformation process. Looking forward, these label-free measurement capabilities to simultaneously probe adsorbates at multiple length scales will provide new insights into complex biomacromolecular interactions.
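
    For context, the acoustic mass reported by QCM-D is usually estimated from the frequency shift via the Sauerbrey relation Δm = −C·Δf/n, which holds only for thin, rigid films (low dissipation); the frequency shift below is an assumed value:

```python
def sauerbrey_mass(delta_f_hz, overtone=3, c_f=17.7):
    """Areal mass from a QCM-D frequency shift via the Sauerbrey relation.

    c_f: mass sensitivity constant for a 5 MHz crystal (ng cm^-2 Hz^-1).
    Valid only for thin, rigid films; an intact-vesicle layer with coupled
    solvent violates this assumption, which is why QCM-D and LSPR diverge.
    """
    return -c_f * delta_f_hz / overtone   # ng/cm^2

# Hypothetical shift after adsorption, measured at the 3rd overtone
print(f"{sauerbrey_mass(-26.0):.0f} ng/cm^2")
```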

  20. Dual modal endoscopic cancer detection based on optical pH sensing and Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Kim, Soogeun; Kim, ByungHyun; Sohn, Won Bum; Byun, Kyung Min; Lee, Soo Yeol

    2017-02-01

    To discriminate between normal and cancerous tissue, a dual modal approach using Raman spectroscopy and a pH sensor was designed and applied. Raman spectroscopy has demonstrated potential as a diagnostic method for the early in vivo detection of precancerous and cancerous lesions, and it can also be used to identify markers associated with malignant change. However, Raman spectroscopy lacks sufficient sensitivity due to the very weak Raman scattering signal and less distinctive spectral patterns. A dual modal approach is one solution to this issue. The level of extracellular pH in cancer tissue is lower than that in normal tissue due to increased lactic acid production, decreased interstitial fluid buffering and decreased perfusion. The high sensitivity and specificity required for accurate cancer diagnosis could therefore be achieved by combining the chemical information from the Raman spectrum with metabolic information from the pH level. Raman spectra were acquired using a fiber optic Raman probe, a cooled CCD camera connected to a spectrograph and a 785 nm laser source. Transmission spectra that vary with tissue pH were measured by a fiber-optic lossy-mode resonance sensor. The discriminative capability of the pH-Raman dual modal method was evaluated using principal component analysis (PCA). The results showed that the pH-Raman dual modal approach can improve discrimination between normal and cancerous tissue, leading to very high sensitivity and specificity. The proposed method is expected to find use in future endoscopic cancer diagnosis.
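
    The PCA evaluation step described above can be reproduced generically: stack the combined Raman + pH feature vectors, standardize each feature, and project onto the first components. A sketch on synthetic stand-in data (class means, feature count, and the appended pH feature are all assumptions):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical feature matrix: rows = tissue sites, columns = Raman band
# intensities at selected wavenumbers plus one pH reading as the last feature.
normal = rng.normal(loc=1.0, scale=0.1, size=(20, 51))
cancer = rng.normal(loc=1.2, scale=0.1, size=(20, 51))
X = np.vstack([normal, cancer])

# Standardize features, then project onto the first two principal components
X = (X - X.mean(axis=0)) / X.std(axis=0)
scores = PCA(n_components=2).fit_transform(X)

print("PC-score matrix shape:", scores.shape)   # (40, 2): one point per site
```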

  1. Methodology for determining major constituents of ayahuasca and their metabolites in blood.

    PubMed

    McIlhenny, Ethan H; Riba, Jordi; Barbanoj, Manel J; Strassman, Rick; Barker, Steven A

    2012-03-01

    There is an increasing interest in potential medical applications of ayahuasca, a South American psychotropic plant tea with a long cultural history of indigenous medical and religious use. Clinical research into ayahuasca will require specific, sensitive and comprehensive methods for the characterization and quantitation of these compounds and their metabolites in blood. A combination of two analytical techniques (high-performance liquid chromatography with ultraviolet and/or fluorescence detection and gas chromatography with nitrogen-phosphorus detection) has been used for the analysis of some of the constituents of ayahuasca in blood following its oral consumption. We report here a single methodology for the direct analysis of 14 of the major alkaloid components of ayahuasca, including several known and potential metabolites of N,N-dimethyltryptamine and the harmala alkaloids in blood. The method uses 96-well plate/protein precipitation/filtration for plasma samples, and analysis by HPLC-ion trap-ion trap-mass spectrometry using heated electrospray ionization to reduce matrix effects. The method expands the list of compounds capable of being monitored in blood following ayahuasca administration while providing a simplified approach to their analysis. The method has adequate sensitivity, specificity and reproducibility to make it useful for clinical research with ayahuasca. Copyright © 2011 John Wiley & Sons, Ltd.

  2. Triblock copolymer matrix-based capillary electrophoretic microdevice for high-resolution multiplex pathogen detection.

    PubMed

    Kim, Se Jin; Shin, Gi Won; Choi, Seok Jin; Hwang, Hee Sung; Jung, Gyoo Yeol; Seo, Tae Seok

    2010-03-01

    Rapid and simple analysis of multiple target pathogens is critical for patient management. CE-SSCP analysis on a microchip provides a high-speed, high-sensitivity, and portable genetic analysis platform for molecular diagnostics. The capability of separating ssDNA molecules in a capillary electrophoretic microchannel with high resolution is critical for precise interpretation of the electropherogram. In this study, we explored the potential of poly(ethyleneoxide)-poly(propyleneoxide)-poly(ethyleneoxide) (PEO-PPO-PEO) triblock copolymer as a sieving matrix for CE-SSCP analysis on a microdevice. To demonstrate the superior resolving power of PEO-PPO-PEO copolymers, 255-bp PCR amplicons obtained from 16S ribosomal RNA genes of four bacterial species, namely Proteus mirabilis, Haemophilus ducreyi, Pseudomonas aeruginosa, and Neisseria meningitidis, were analyzed in the PEO-PPO-PEO matrix in comparison with 5% linear polyacrylamide and commercial GeneScan gel. Due to enhanced dynamic coating and sieving ability, the PEO-PPO-PEO copolymer displayed a fourfold enhancement of resolving power in CE-SSCP when separating same-sized DNA molecules. A fivefold input of genomic DNA of P. aeruginosa and/or N. meningitidis produced proportionally increased amplicon peaks, enabling correct quantitative analysis in pathogen detection. Besides its high-resolution sieving capability, the facile loading and replenishment of the gel in the microchannel, due to its thermally reversible gelation, make the PEO-PPO-PEO triblock copolymer an excellent matrix for CE-SSCP analysis on a microdevice.

  3. X ray sensitive area detection device

    NASA Technical Reports Server (NTRS)

    Carter, Daniel C. (Inventor); Witherow, William K. (Inventor); Pusey, Marc L. (Inventor); Yost, Vaughn H. (Inventor)

    1990-01-01

    A radiation sensitive area detection device is disclosed which comprises a phosphor-containing film capable of receiving and storing an image formed by a pattern of incoming x rays, UV, or other radiation falling on the film. The device is capable of fluorescing in response to stimulation by a light source in a manner directly proportional to the stored radiation pattern. The device includes: (1) a light source capable of projecting light or other appropriate electromagnetic wave on the film so as to cause it to fluoresce; (2) a means to focus the fluoresced light coming from the phosphor-containing film after light stimulation; and (3) at least one charge-coupled detector or other detecting element capable of receiving and digitizing the pattern of fluoresced light coming from the phosphor-containing film. The device will be able to generate superior x ray images of high resolution from a crystal or other sample and will be particularly advantageous in that instantaneous near-real-time images of rapidly deteriorating samples can be obtained. Furthermore, the device can be made compact and sturdy, thus capable of carrying out x ray or other radiation imaging under a variety of conditions, including those experienced in space.

  4. Ruthenium and osmium complexes that bear functional azolate chelates for dye-sensitized solar cells.

    PubMed

    Chi, Yun; Wu, Kuan-Lin; Wei, Tzu-Chien

    2015-05-01

    The preparation of sensitizers for dye-sensitized solar cells (DSSCs) represents an active area of research for both sustainability and renewable energy. Both Ru(II) and Os(II) metal sensitizers offer unique photophysical and electrochemical properties that arise from the intrinsic electronic properties, that is, the higher propensity to form the lower-energy metal-to-ligand charge-transfer (MLCT) transition, and their capability to support chelates with multiple carboxy groups, which serve as a bridge to the metal oxide and enable efficient injection of the photoelectron. Here we present an overview of the synthesis and testing of these metal sensitizers that bear functional azolate chelates (both pyrazolate and triazolate), which are capable of modifying the metal sensitizers in a systematic and beneficial manner. Basic principles of the molecular designs, the structural relationship to the photophysical and electrochemical properties, and performances of the as-fabricated DSSCs are highlighted. The success in the breakthrough of the synthetic protocols and potential applications might provide strong stimulus for the future development of technologies such as DSSCs, organic light-emitting diodes, solar water splitting, and so forth. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Ultra-sensitive chemical and biological analysis via specialty fibers with built-in microstructured optofluidic channels.

    PubMed

    Zhang, Nan; Li, Kaiwei; Cui, Ying; Wu, Zhifang; Shum, Perry Ping; Auguste, Jean-Louis; Dinh, Xuan Quyen; Humbert, Georges; Wei, Lei

    2018-02-13

    All-in-fiber optofluidics is an analytical tool that provides enhanced sensing performance with simplified analyzing system design. Currently, its advance is limited either by complicated liquid manipulation and light injection configuration or by low sensitivity resulting from inadequate light-matter interaction. In this work, we design and fabricate a side-channel photonic crystal fiber (SC-PCF) and exploit its versatile sensing capabilities in in-line optofluidic configurations. The built-in microfluidic channel of the SC-PCF enables strong light-matter interaction and easy lateral access of liquid samples in these analytical systems. In addition, the sensing performance of the SC-PCF is demonstrated with methylene blue for absorptive molecular detection and with human cardiac troponin T protein by utilizing a Sagnac interferometry configuration for ultra-sensitive and specific biomolecular specimen detection. Owing to the features of great flexibility and compactness, high-sensitivity to the analyte variation, and efficient liquid manipulation/replacement, the demonstrated SC-PCF offers a generic solution to be adapted to various fiber-waveguide sensors to detect a wide range of analytes in real time, especially for applications from environmental monitoring to biological diagnosis.

  6. A High Sensitivity IDC-Electronic Tongue Using Dielectric/Sensing Membranes with Solvatochromic Dyes

    PubMed Central

    Khan, Md. Rajibur Rahaman; Khalilian, Alireza; Kang, Shin-Won

    2016-01-01

    In this paper, an electronic tongue/taste sensor array containing different interdigitated capacitor (IDC) sensing elements to detect different types of tastes, such as sweetness (glucose), saltiness (NaCl), sourness (HCl), bitterness (quinine-HCl), and umami (monosodium glutamate) is proposed. We present for the first time an IDC electronic tongue using sensing membranes containing solvatochromic dyes. The proposed highly sensitive (30.64 mV/decade sensitivity) IDC electronic tongue has fast response and recovery times of about 6 s and 5 s, respectively, with extremely stable responses, and is capable of linear sensing performance (R² ≈ 0.985 correlation coefficient) over the wide dynamic range of 1 µM to 1 M. The designed IDC electronic tongue offers excellent reproducibility, with a relative standard deviation (RSD) of about 0.029. The proposed device was found to have better sensing performance than potentiometric-, cascoded compatible lateral bipolar transistor (C-CLBT)-, Electronic Tongue (SA402)-, and fiber-optic-based taste sensing systems in terms of dynamic range width, response time, sensitivity, and linearity. Finally, we applied principal component analysis (PCA) to distinguish between various kinds of taste in mixed taste compounds. PMID:27171095
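
    A sensitivity quoted in mV/decade together with an R² linearity figure implies a straight-line fit of output voltage against log10 of concentration. A sketch of that fit on hypothetical voltages spanning the 1 µM-1 M range (the voltages are assumptions chosen to give a slope near the reported value):

```python
import numpy as np

# Hypothetical IDC output voltages (mV) across a 1 uM - 1 M taste series
conc_M = np.array([1e-6, 1e-5, 1e-4, 1e-3, 1e-2, 1e-1, 1.0])
v_mV   = np.array([102.0, 133.5, 163.0, 194.8, 223.9, 255.1, 286.0])

log_c = np.log10(conc_M)
slope, intercept = np.polyfit(log_c, v_mV, 1)   # slope = sensitivity, mV/decade

pred = slope * log_c + intercept
r2 = 1 - np.sum((v_mV - pred) ** 2) / np.sum((v_mV - v_mV.mean()) ** 2)
print(f"sensitivity = {slope:.2f} mV/decade, R^2 = {r2:.3f}")
```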

  7. An Evaluation Study of Enzyme-Linked Immunosorbent Assay (ELISA) Using Recombinant Protein Pap31 for Detection of Antibody against Bartonella bacilliformis Infection among the Peruvian Population

    PubMed Central

    Angkasekwinai, Nasikarn; Atkins, Erin H.; Romero, Sofia; Grieco, John; Chao, Chien Chung; Ching, Wei Mei

    2014-01-01

    Reliable laboratory testing is of great importance to detect Bartonella bacilliformis infection. We evaluated the sensitivity and specificity of an enzyme-linked immunosorbent assay (ELISA) using recombinant protein Pap31 (rPap31) for the detection of antibodies against B. bacilliformis, as compared with an immunofluorescent assay (IFA). Of the 302 sera collected between 1997 and 2000 from an at-risk Peruvian population, 103 and 34 samples tested positive for IFA-immunoglobulin G (IgG) and IFA-IgM, respectively. Using Youden's index, the ELISA-IgG cutoff value of 0.915 gave a sensitivity of 84.5% and a specificity of 94%, and the ELISA-IgM cutoff value of 0.634 gave a sensitivity of 88.2% and a specificity of 85.1%. Using latent class analysis, estimates of the sensitivity and specificity of almost all the assays were slightly higher than those from the conventional method of calculation. The test proved beneficial for discriminating between infected and non-infected individuals, with the advantages of low cost and high-throughput capability. PMID:24515944
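
    Youden's index, as used above, selects the assay cutoff that maximizes J = sensitivity + specificity − 1 against the reference test. A sketch on simulated optical densities (the score distributions and sample sizes are assumptions, not the study data):

```python
import numpy as np

def youden_cutoff(scores, labels):
    """Return the cutoff maximizing J = sensitivity + specificity - 1."""
    best_cut, best_j = None, -1.0
    for c in np.unique(scores):
        pred = scores >= c
        tp = np.sum(pred & (labels == 1)); fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0)); fp = np.sum(pred & (labels == 0))
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_cut, best_j = c, j
    return best_cut, best_j

# Hypothetical ELISA optical densities; label 1 = reference(IFA)-positive
rng = np.random.default_rng(1)
od = np.concatenate([rng.normal(0.5, 0.2, 100), rng.normal(1.2, 0.3, 40)])
lab = np.concatenate([np.zeros(100, int), np.ones(40, int)])

cutoff, j = youden_cutoff(od, lab)
print(f"optimal cutoff = {cutoff:.3f}, Youden J = {j:.2f}")
```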

  8. A High Sensitivity IDC-Electronic Tongue Using Dielectric/Sensing Membranes with Solvatochromic Dyes.

    PubMed

    Khan, Md Rajibur Rahaman; Khalilian, Alireza; Kang, Shin-Won

    2016-05-10

    In this paper, an electronic tongue/taste sensor array containing different interdigitated capacitor (IDC) sensing elements to detect different types of tastes, such as sweetness (glucose), saltiness (NaCl), sourness (HCl), bitterness (quinine-HCl), and umami (monosodium glutamate) is proposed. We present for the first time an IDC electronic tongue using sensing membranes containing solvatochromic dyes. The proposed highly sensitive (30.64 mV/decade sensitivity) IDC electronic tongue has fast response and recovery times of about 6 s and 5 s, respectively, with extremely stable responses, and is capable of linear sensing performance (R² ≈ 0.985 correlation coefficient) over the wide dynamic range of 1 µM to 1 M. The designed IDC electronic tongue offers excellent reproducibility, with a relative standard deviation (RSD) of about 0.029. The proposed device was found to have better sensing performance than potentiometric-, cascoded compatible lateral bipolar transistor (C-CLBT)-, Electronic Tongue (SA402)-, and fiber-optic-based taste sensing systems in terms of dynamic range width, response time, sensitivity, and linearity. Finally, we applied principal component analysis (PCA) to distinguish between various kinds of taste in mixed taste compounds.

  9. Bi-harmonic cantilever design for improved measurement sensitivity in tapping-mode atomic force microscopy.

    PubMed

    Loganathan, Muthukumaran; Bristow, Douglas A

    2014-04-01

    This paper presents a method and cantilever design for improving the mechanical measurement sensitivity in the atomic force microscopy (AFM) tapping mode. The method uses two harmonics in the drive signal to generate a bi-harmonic tapping trajectory. Mathematical analysis demonstrates that the wide-valley bi-harmonic tapping trajectory is as much as 70% more sensitive to changes in the sample topography than the standard single-harmonic trajectory typically used. Although standard AFM cantilevers can be driven in the bi-harmonic tapping trajectory, they require large forcing at the second harmonic. A design is presented for a bi-harmonic cantilever that has a second resonant mode at twice its first resonant mode, thereby capable of generating bi-harmonic trajectories with small forcing signals. Bi-harmonic cantilevers are fabricated by milling a small cantilever on the interior of a standard cantilever probe using a focused ion beam. Bi-harmonic drive signals are derived for standard cantilevers and bi-harmonic cantilevers. Experimental results demonstrate better than 30% improvement in measurement sensitivity using the bi-harmonic cantilever. Images obtained through bi-harmonic tapping exhibit improved sharpness and surface tracking, especially at high scan speeds and low force fields.
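
    The bi-harmonic drive described here is simply a fundamental plus a component at twice that frequency, with the second harmonic's amplitude and phase shaping the wide-valley tip trajectory. A sketch of such a waveform; the amplitudes, phase, and frequencies are illustrative choices, not the paper's optimized values:

```python
import numpy as np

def biharmonic_drive(t, f0, a1=1.0, a2=0.35, phase2=np.pi):
    """Two-harmonic tapping drive: fundamental at f0 plus a term at 2*f0.

    a2 and phase2 control the valley shape of the trajectory; the values
    here are assumptions for illustration only.
    """
    return a1 * np.sin(2 * np.pi * f0 * t) + \
           a2 * np.sin(2 * np.pi * 2 * f0 * t + phase2)

fs, f0 = 5.0e6, 50.0e3          # sample rate (Hz) and fundamental (Hz)
t = np.arange(0, 5 / f0, 1 / fs)  # five drive periods
y = biharmonic_drive(t, f0)
print(f"waveform extrema: {y.min():.3f} to {y.max():.3f}")
```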

  10. A new era of semiconductor genetics using ion-sensitive field-effect transistors: the gene-sensitive integrated cell.

    PubMed

    Toumazou, Christofer; Thay, Tan Sri Lim Kok; Georgiou, Pantelis

    2014-03-28

    Semiconductor genetics is now disrupting the field of healthcare owing to the rapid parallelization and scaling of DNA sensing using ion-sensitive field-effect transistors (ISFETs) fabricated using commercial complementary metal-oxide semiconductor technology. The enabling concept of DNA reaction monitoring introduced by Toumazou has made this a reality and we are now seeing relentless scaling with Moore's law ultimately achieving the $100 genome. In this paper, we present the next evolution of this technology through the creation of the gene-sensitive integrated cell (GSIC) for label-free real-time analysis based on ISFETs. This device is derived from the traditional metal-oxide semiconductor field-effect transistor (MOSFET) and has electrical performance identical to that of a MOSFET in a standard semiconductor process, yet is capable of incorporating DNA reaction chemistries for applications in single nucleotide polymorphism microarrays and DNA sequencing. Just as application-specific integrated circuits, which are developed in much the same way, have shaped our consumer electronics industry and modern communications and memory technology, so, too, do GSICs based on a single underlying technology principle have the capacity to transform the life science and healthcare industries.

  11. Statistical sensitivity on right-handed currents in presence of eV scale sterile neutrinos with KATRIN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steinbrink, Nicholas M.N.; Weinheimer, Christian; Glück, Ferenc

    The KATRIN experiment aims to determine the absolute neutrino mass by measuring the endpoint region of the tritium β-spectrum. As a large-scale experiment with a sharp energy resolution, high source luminosity and low background it may also be capable of testing certain theories of neutrino interactions beyond the standard model (SM). An example of a non-SM interaction is right-handed currents mediated by right-handed W bosons in the left-right symmetric model (LRSM). In this extension of the SM, an additional SU(2)_R symmetry in the high-energy limit is introduced, which naturally includes sterile neutrinos and predicts the seesaw mechanism. In tritium β decay, this leads to an additional term from interference between left- and right-handed interactions, which enhances or suppresses certain regions near the endpoint of the beta spectrum. In this work, the sensitivity of KATRIN to right-handed currents is estimated for the scenario of a light sterile neutrino with a mass of some eV. This analysis has been performed with a Bayesian analysis using Markov Chain Monte Carlo (MCMC). The simulations show that, in principle, KATRIN will be able to set sterile neutrino mass-dependent limits on the interference strength. The sensitivity is significantly increased if the Q value of the β decay can be sufficiently constrained. However, the sensitivity is not high enough to improve current upper limits from right-handed W boson searches at the LHC.

  12. Statistical sensitivity on right-handed currents in presence of eV scale sterile neutrinos with KATRIN

    NASA Astrophysics Data System (ADS)

    Steinbrink, Nicholas M. N.; Glück, Ferenc; Heizmann, Florian; Kleesiek, Marco; Valerius, Kathrin; Weinheimer, Christian; Hannestad, Steen

    2017-06-01

    The KATRIN experiment aims to determine the absolute neutrino mass by measuring the endpoint region of the tritium β-spectrum. As a large-scale experiment with a sharp energy resolution, high source luminosity and low background it may also be capable of testing certain theories of neutrino interactions beyond the standard model (SM). An example of a non-SM interaction is right-handed currents mediated by right-handed W bosons in the left-right symmetric model (LRSM). In this extension of the SM, an additional SU(2)_R symmetry in the high-energy limit is introduced, which naturally includes sterile neutrinos and predicts the seesaw mechanism. In tritium β decay, this leads to an additional term from interference between left- and right-handed interactions, which enhances or suppresses certain regions near the endpoint of the beta spectrum. In this work, the sensitivity of KATRIN to right-handed currents is estimated for the scenario of a light sterile neutrino with a mass of some eV. This analysis has been performed with a Bayesian analysis using Markov Chain Monte Carlo (MCMC). The simulations show that, in principle, KATRIN will be able to set sterile neutrino mass-dependent limits on the interference strength. The sensitivity is significantly increased if the Q value of the β decay can be sufficiently constrained. However, the sensitivity is not high enough to improve current upper limits from right-handed W boson searches at the LHC.
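
    The Bayesian limit-setting described in both records rests on Markov Chain Monte Carlo sampling of a posterior distribution. A minimal random-walk Metropolis sketch on a toy one-parameter posterior; the likelihood and data are stand-ins, not the KATRIN spectrum model:

```python
import numpy as np

rng = np.random.default_rng(42)

def log_posterior(theta, data):
    """Toy log-posterior: Gaussian likelihood, flat prior on [0, 10]."""
    if not 0.0 <= theta <= 10.0:
        return -np.inf
    return -0.5 * np.sum((data - theta) ** 2)

data = rng.normal(3.0, 1.0, size=50)          # hypothetical observations
chain, theta = [], 5.0
for _ in range(20000):
    prop = theta + rng.normal(0, 0.3)         # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(prop, data) - log_posterior(theta, data):
        theta = prop                          # Metropolis acceptance step
    chain.append(theta)

burned = np.array(chain[5000:])               # discard burn-in
print(f"posterior mean = {burned.mean():.3f}, 95% interval = "
      f"[{np.percentile(burned, 2.5):.3f}, {np.percentile(burned, 97.5):.3f}]")
```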

  13. HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauerdick, Lothar

    At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.

  14. Injection Locking Techniques for Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Gathma, Timothy D.; Buckwalter, James F.

    2011-04-01

    Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks high-Q passives and wideband resonator tunability that is necessary for heterodyne implementations of spectrum analyzers. As an alternative to the heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.

  15. Pyrotechnic shock measurement and data analysis requirements

    NASA Technical Reports Server (NTRS)

    Albers, L.

    1975-01-01

    A study of laboratory measurement and analysis of pyrotechnic shock prompted by a discrepancy in preliminary Mariner Jupiter/Saturn shock test data is reported. It is shown that before generating shock response plots from any recorded pyrotechnic event, a complete review of each instrumentation and analysis system must be made. In addition, the frequency response capability of the tape recorder used should be as high as possible; the discrepancies in the above data were due to inadequate frequency response in the FM tape recorders. The slew rate of all conditioning amplifiers and input converters must be high enough to prevent signal distortion at maximum input voltage; amplifier ranges should be selected so that the input pulse is approximately 50% of full scale; the Bessel response type should be chosen for digital shock analysis if antialiasing filters are employed; and transducer selection must consider maximum acceleration limit, mounted resonance frequency, flat clean mounting surfaces, base bending sensitivity, and proper torque.
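
    The recommendation above to use a Bessel response for anti-aliasing reflects its nearly constant group delay, which preserves the shape of shock transients. A sketch of such a filter applied to a synthetic decaying pulse using SciPy; the sample rate and corner frequency are assumptions:

```python
import numpy as np
from scipy import signal

fs = 100_000.0     # sample rate (Hz), an assumption
cutoff = 10_000.0  # anti-aliasing corner frequency (Hz), an assumption

# 4th-order low-pass Bessel filter; norm="phase" preserves the maximally
# flat group delay that makes this response type suitable for shock data.
b, a = signal.bessel(4, cutoff, btype="low", norm="phase", fs=fs)

t = np.arange(0, 0.01, 1 / fs)
shock = np.exp(-t / 1e-3) * np.sin(2 * np.pi * 3000 * t)  # synthetic pulse
filtered = signal.lfilter(b, a, shock)
print(filtered[:5])
```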

  16. Lempel-Ziv complexity analysis of one dimensional cellular automata.

    PubMed

    Estevez-Rams, E; Lora-Serrano, R; Nunes, C A J; Aragón-Fernández, B

    2015-12-01

    Lempel-Ziv complexity measure has been used to estimate the entropy density of a string. It is defined as the number of factors in a production factorization of a string. In this contribution, we show that its use can be extended, by using the normalized information distance, to study the spatiotemporal evolution of random initial configurations under cellular automata rules. In particular, the transfer of information between time-consecutive configurations is studied, as well as the sensitivity to perturbed initial conditions. The behavior of the cellular automata rules can be grouped in different classes, but no single grouping captures the whole nature of the involved rules. The analysis carried out is particularly appropriate for studying the computational processing capabilities of cellular automata rules.
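
    The factor count that defines this complexity measure can be computed directly from the production factorization: each factor is the shortest prefix of the remaining string not contained in the text seen so far. A compact (quadratic-time) sketch:

```python
def lz76_factors(s: str) -> int:
    """Count factors in the Lempel-Ziv (1976) production factorization of s.

    The count grows like n/log(n) for incompressible strings, which is the
    basis of the entropy-density estimate referenced above.
    """
    i, count, n = 0, 0, len(s)
    while i < n:
        length = 1
        # extend the factor while it can be copied from the preceding text
        while i + length <= n and s[i:i + length] in s[:i + length - 1]:
            length += 1
        count += 1
        i += length
    return count

print(lz76_factors("0001101001000101"))   # e.g. one row of a CA evolution
```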

  17. Lempel-Ziv complexity analysis of one dimensional cellular automata

    NASA Astrophysics Data System (ADS)

    Estevez-Rams, E.; Lora-Serrano, R.; Nunes, C. A. J.; Aragón-Fernández, B.

    2015-12-01

    Lempel-Ziv complexity measure has been used to estimate the entropy density of a string. It is defined as the number of factors in a production factorization of a string. In this contribution, we show that its use can be extended, by using the normalized information distance, to study the spatiotemporal evolution of random initial configurations under cellular automata rules. In particular, the transfer of information between time-consecutive configurations is studied, as well as the sensitivity to perturbed initial conditions. The behavior of the cellular automata rules can be grouped in different classes, but no single grouping captures the whole nature of the involved rules. The analysis carried out is particularly appropriate for studying the computational processing capabilities of cellular automata rules.

  18. Analysis of airfoil transitional separation bubbles

    NASA Technical Reports Server (NTRS)

    Davis, R. L.; Carter, J. E.

    1984-01-01

    A previously developed local inviscid-viscous interaction technique for the analysis of airfoil transitional separation bubbles, ALESEP (Airfoil Leading Edge Separation) has been modified to utilize a more accurate windward finite difference procedure in the reversed flow region, and a natural transition/turbulence model has been incorporated for the prediction of transition within the separation bubble. Numerous calculations and experimental comparisons are presented to demonstrate the effects of the windward differencing scheme and the natural transition/turbulence model. Grid sensitivity and convergence capabilities of this inviscid-viscous interaction technique are briefly addressed. A major contribution of this report is that with the use of windward differencing, a second, counter-rotating eddy has been found to exist in the wall layer of the primary separation bubble.
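
    Windward (upwind) differencing means choosing the difference stencil according to the local flow direction, so streamwise derivatives inside the reversed-flow bubble are still taken from upstream of the local flow. A generic first-order sketch of the idea; this is not the ALESEP discretization itself:

```python
import numpy as np

def upwind_dudx(u, dx):
    """First-order windward difference for du/dx on a uniform grid.

    The stencil follows the sign of the local velocity, which keeps the
    scheme stable where the flow reverses (as in a separation bubble).
    """
    dudx = np.empty_like(u)
    for i in range(1, len(u) - 1):
        if u[i] >= 0.0:                       # forward flow: backward difference
            dudx[i] = (u[i] - u[i - 1]) / dx
        else:                                 # reversed flow: forward difference
            dudx[i] = (u[i + 1] - u[i]) / dx
    dudx[0] = (u[1] - u[0]) / dx              # one-sided at the boundaries
    dudx[-1] = (u[-1] - u[-2]) / dx
    return dudx

u = np.array([1.0, 0.4, -0.2, -0.1, 0.3, 0.9])   # velocity through a bubble
print(upwind_dudx(u, dx=0.1))
```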

  19. [Express diagnostics of bovine leucosis by immune sensor based on surface plasmon resonance].

    PubMed

    Pyrohova, L V; Starodub, M F; Artiukh, V P; Nahaieva, L I; Dobrosol, H I

    2002-01-01

    An immune sensor based on surface plasmon resonance (SPR) was developed for the express diagnostics of bovine leucosis. The sensor was used to detect the level of antibodies against bovine leukaemia virus (BLV) in blood serum. The industrially manufactured BLV antigen used for screening by the agar gel immunodiffusion (AGID) test required additional purification before it could be used in immune sensor analysis. Immune sensor analysis was shown to be more sensitive, rapid and simple than the traditional AGID test. The developed immune sensor can be used to perform bovine leucosis screening on farms; the minimal dilution of the serum should be 1:500.

  20. Chemical speciation using high energy resolution PIXE spectroscopy in the tender X-ray range

    NASA Astrophysics Data System (ADS)

    Kavčič, Matjaž; Petric, Marko; Vogel-Mikuš, Katarina

    2018-02-01

    High energy resolution X-ray emission spectroscopy employing wavelength dispersive (WDS) crystal spectrometers can provide energy resolution on the level of the core-hole lifetime broadening of the characteristic emission lines. While crystal spectrometers have traditionally been used in combination with electron excitation for major and minor element analysis, they have rarely been considered for proton induced X-ray emission (PIXE) trace element analysis, mainly due to low detection efficiency. Compared to the simplest flat-crystal WDS spectrometer, the efficiency can be improved by employing cylindrically or even spherically curved crystals in combination with position sensitive X-ray detectors. When such a spectrometer is coupled to MeV proton excitation, chemical bonding effects are revealed in the high energy resolution spectra, offering the opportunity to extend the analytical capabilities of the PIXE technique towards chemical state analysis. In this contribution we focus on high energy resolution PIXE (HR-PIXE) spectroscopy in the tender X-ray range performed in our laboratory with our home-built tender X-ray emission spectrometer. Some general properties of high energy resolution PIXE spectroscopy in the tender X-ray range are presented, followed by an example of sulfur speciation in biological tissue illustrating the capabilities as well as the limitations of the HR-PIXE method for chemical speciation in the tender X-ray range.

  1. Cruise Speed Sensitivity Study for Transonic Truss Braced Wing

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.

    2017-01-01

    NASA's investment and research in aviation has led to new technologies and concepts that make aircraft more efficient and environmentally friendly. One aircraft design operational concept is the reduction of cruise speed to reduce fuel burned during a mission. Although this is not a new idea, it was used by all of the contractors involved in a 2008 NASA sponsored study that solicited concept and technology ideas to reduce environmental impacts for future subsonic passenger transports. NASA is currently improving and building new analysis capabilities to analyze advanced concepts. To test some of these new capabilities, a transonic truss braced wing configuration was used as a test case. This paper examines the effects of changes in the design cruise speed and other tradeoffs in the design space. The analysis was baselined to the Boeing SUGAR High truss braced wing concept. An optimization was run at five different design cruise Mach numbers. These designs are compared to provide an initial assessment of the design space and of the parameters that should be considered when selecting a design cruise speed. A discussion of the design drivers is also included. The results show that the wing weight in the current analysis has more influence on the takeoff gross weight than expected. This effect produced lower-than-expected wing sweep angles for the higher cruise speed designs.

  2. Field-Effect Biosensors for On-Site Detection: Recent Advances and Promising Targets.

    PubMed

    Choi, Jaebin; Seong, Tae Wha; Jeun, Minhong; Lee, Kwan Hyi

    2017-10-01

    There is an explosive interest in the immediate and cost-effective analysis of field-collected biological samples, as many advanced biodetection tools are highly sensitive, yet immobile. On-site biosensors are portable and convenient sensors that provide detection results at the point of care. They are designed to secure precision in highly ionic and heterogeneous solutions with minimal hardware. Among various methods that are capable of such analysis, field-effect biosensors are promising candidates due to their unique sensitivity, manufacturing scalability, and integrability with computational circuitry. Recent developments in nanotechnological surface modification show promising results in sensing from blood, serum, and urine. This report gives a particular emphasis on the on-site efficacy of recently published field-effect biosensors, specifically, detection limits in physiological solutions, response times, and scalability. The survey of the properties and existing detection methods of four promising biotargets, exosomes, bacteria, viruses, and metabolites, aims at providing a roadmap for future field-effect and other on-site biosensors. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. A new hyperchaotic map and its application for image encryption

    NASA Astrophysics Data System (ADS)

    Natiq, Hayder; Al-Saidi, N. M. G.; Said, M. R. M.; Kilicman, Adem

    2018-01-01

    Based on the one-dimensional Sine map and the two-dimensional Hénon map, a new two-dimensional Sine-Hénon alteration model (2D-SHAM) is hereby proposed. Basic dynamic characteristics of 2D-SHAM are studied through the following aspects: equilibria, Jacobian eigenvalues, trajectory, bifurcation diagram, Lyapunov exponents and a sensitivity-dependence test. The complexity of 2D-SHAM is investigated using the Sample Entropy algorithm. Simulation results show that 2D-SHAM is hyperchaotic overall, with high complexity and high sensitivity to its initial values and control parameters. To investigate its performance in terms of security, a new 2D-SHAM-based image encryption algorithm (SHAM-IEA) is also proposed. In this algorithm, the essential requirements of confusion and diffusion are accomplished, and the stochastic 2D-SHAM is used to enhance the security of the encrypted image. The stochastic 2D-SHAM generates random values, hence SHAM-IEA can produce different encrypted images even with the same secret key. Experimental results and security analysis show that SHAM-IEA has a strong capability to withstand statistical analysis, differential attacks, and chosen-plaintext and chosen-ciphertext attacks.
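
    The abstract does not give the 2D-SHAM equations, so the sketch below iterates one plausible Sine-Hénon alteration, replacing the Hénon quadratic term with a sine nonlinearity, purely to illustrate how such a map is iterated and sampled; it is not the authors' exact model:

```python
import numpy as np

def sham_step(x, y, a=1.4, b=0.3):
    """One iterate of a hypothetical Sine-Henon alteration.

    The Henon term 1 - a*x**2 + y is replaced by 1 - a*sin(x) + y.
    An illustrative coupling, not the exact 2D-SHAM equations.
    """
    return 1.0 - a * np.sin(x) + y, b * x

x, y = 0.1, 0.2
for _ in range(1000):          # discard the transient
    x, y = sham_step(x, y)

traj = []
for _ in range(5000):          # sample the attractor
    x, y = sham_step(x, y)
    traj.append((x, y))
print(traj[:3])
```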

  4. Time-Resolved Photoluminescence Microscopy for the Analysis of Semiconductor-Based Paint Layers

    PubMed Central

    Mosca, Sara; Gonzalez, Victor; Eveno, Myriam

    2017-01-01

    In conservation science, semiconductors occur as the constituent matter of so-called semiconductor pigments, produced following the Industrial Revolution and extensively used by modern painters. With recent research highlighting the occurrence of various degradation phenomena in semiconductor paints, it is clear that their detection by conventional optical fluorescence imaging and microscopy is limited by the complexity of historical painting materials. Here, we illustrate and prove the capabilities of time-resolved photoluminescence (TRPL) microscopy, equipped with both spectral and lifetime sensitivity at timescales ranging from nanoseconds to hundreds of microseconds, for the analysis of cross-sections of paint layers made of luminescent semiconductor pigments. The method is sensitive to heterogeneities within micro-samples and provides valuable information for interpreting the nature of the emissions in samples. A case study on micro-samples from a painting by Henri Matisse demonstrates how TRPL can be used to identify the semiconductor pigments zinc white and cadmium yellow, and to inform future investigations of the degradation of a cadmium yellow paint. PMID:29160862

  5. In vivo optical microscopy of peripheral nerve myelination with polarization sensitive-optical coherence tomography

    PubMed Central

    Henry, Francis P.; Wang, Yan; Rodriguez, Carissa L. R.; Randolph, Mark A.; Rust, Esther A. Z.; Winograd, Jonathan M.; de Boer, Johannes F.; Park, B. Hyle

    2015-01-01

    Assessing nerve integrity and myelination after injury is necessary to provide insight for treatment strategies aimed at restoring neuromuscular function. Currently, this is largely done with electrical analysis, which lacks direct quantitative information. In vivo optical imaging with sufficient imaging depth and resolution could be used to assess the nerve microarchitecture. In this study, we examine the use of polarization sensitive-optical coherence tomography (PS-OCT) to quantitatively assess the sciatic nerve microenvironment through measurements of birefringence after applying a nerve crush injury in a rat model. Initial loss of function and subsequent recovery were demonstrated by calculating the sciatic function index (SFI). We found that the PS-OCT phase retardation slope, which is proportional to birefringence, increased monotonically with the SFI. Additionally, histomorphometric analysis of the myelin thickness and g-ratio shows that the PS-OCT slope is a good indicator of myelin health and recovery after injury. These results demonstrate that PS-OCT is capable of providing nondestructive and quantitative assessment of nerve health after injury and shows promise for continued use both clinically and experimentally in neuroscience. PMID:25858593

  6. In vivo optical microscopy of peripheral nerve myelination with polarization sensitive-optical coherence tomography.

    PubMed

    Henry, Francis P; Wang, Yan; Rodriguez, Carissa L R; Randolph, Mark A; Rust, Esther A Z; Winograd, Jonathan M; de Boer, Johannes F; Park, B Hyle

    2015-04-01

    Assessing nerve integrity and myelination after injury is necessary to provide insight for treatment strategies aimed at restoring neuromuscular function. Currently, this is largely done with electrical analysis, which lacks direct quantitative information. In vivo optical imaging with sufficient imaging depth and resolution could be used to assess the nerve microarchitecture. In this study, we examine the use of polarization sensitive-optical coherence tomography (PS-OCT) to quantitatively assess the sciatic nerve microenvironment through measurements of birefringence after applying a nerve crush injury in a rat model. Initial loss of function and subsequent recovery were demonstrated by calculating the sciatic function index (SFI). We found that the PS-OCT phase retardation slope, which is proportional to birefringence, increased monotonically with the SFI. Additionally, histomorphometric analysis of the myelin thickness and g-ratio shows that the PS-OCT slope is a good indicator of myelin health and recovery after injury. These results demonstrate that PS-OCT is capable of providing nondestructive and quantitative assessment of nerve health after injury and shows promise for continued use both clinically and experimentally in neuroscience.

  7. Department of Defense Energy and Logistics: Implications of Historic and Future Cost, Risk, and Capability Analysis

    NASA Astrophysics Data System (ADS)

    Tisa, Paul C.

    Every year the DoD spends billions satisfying its large petroleum demand. This spending is highly sensitive to uncontrollable and poorly understood market forces. Additionally, while some stakeholders may not prioritize its monetary cost and risk, energy is fundamentally coupled to other critical factors. Energy, operational capability, and logistics are heavily intertwined and dependent on uncertain security-environment and technology futures. These components and their relationships are less well understood. Without better characterization, future capabilities may be significantly limited by present-day acquisition decisions. One attempt to demonstrate these costs and risks to decision makers has been a metric known as the Fully Burdened Cost of Energy (FBCE), defined as the commodity price of fuel plus many of these hidden costs. The metric encouraged a valuable conversation and is still required by law. However, most FBCE development stopped before the lessons from that conversation were incorporated; the current implementation is easy to employ but creates little value. Properly characterizing the costs and risks of energy and putting them in a useful tradespace requires a new framework. This research aims to highlight energy's complex role in many aspects of military operations, the critical need to incorporate it in decisions, and a novel framework to do so. It is broken into five parts. The first describes the motivation behind FBCE, the limits of its current implementation, and a new framework that aids decisions. The second, third, and fourth, respectively, present a historical analysis of the connections between military capabilities and energy, analyze the recent evolution of this conversation within the DoD, and pull the historical analysis into a revised framework. The final part quantifies the potential impacts of deeply uncertain futures and technological development and introduces an expanded framework that brings capability, energy, and their uncertainty into the same tradespace. The work presented is intended to inform better policies and investment decisions for military acquisitions. The discussion highlights areas where the DoD's understanding of energy could improve or where development has faltered. The new metric discussed allows the DoD to better manage and plan for long-term energy-related costs and risk.

  8. Automatic detection of DNA double strand breaks after irradiation using an γH2AX assay.

    PubMed

    Hohmann, Tim; Kessler, Jacqueline; Grabiec, Urszula; Bache, Matthias; Vordermark, Dyrk; Dehghani, Faramarz

    2018-05-01

    Radiation therapy is among the most common approaches to cancer therapy and leads, among other effects, to DNA damage such as double strand breaks (DSB). DSB can be used as a marker for the effect of radiation on cells. For visualizing and assessing the extent of DNA damage, the γH2AX foci assay is frequently used. Analysis of the γH2AX foci assay remains complicated, as the number of γH2AX foci has to be counted. The quantification is mostly done manually, which is time consuming and leads to person-dependent variations. We therefore present a method to automatically count the number of foci inside nuclei, facilitating and quickening the analysis of DSBs in fluorescent images with high reliability. First, nuclei were detected in fluorescent images. Afterwards, the nuclei were analyzed independently from each other with a local thresholding algorithm. This approach allowed accounting for different levels of noise; the foci inside each nucleus were then detected using a Hough transformation to search for circles. The presented algorithm was able to correctly classify most foci in cases of "high" and "average" image quality (sensitivity > 0.8) with a low rate of false positive detections (positive predictive value (PPV) > 0.98). In cases of "low" image quality the approach had a decreased sensitivity (0.7-0.9, depending on the manual control counter), while the PPV remained high (PPV > 0.91). Compared to other automatic approaches, the presented algorithm had a higher sensitivity and PPV. The automatic foci detection algorithm was thus capable of detecting foci with high sensitivity and PPV, and it can be used for the automatic analysis of images of varying quality.
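
    The pipeline described, per-nucleus local thresholding followed by a circular Hough transform, maps naturally onto OpenCV primitives. A sketch with assumed tuning values (block size, radii, and accumulator threshold all need adjusting to the actual image scale); the file name is hypothetical:

```python
import cv2

def count_foci(image_path):
    """Count bright, roughly circular foci in a fluorescence image.

    A sketch of the described pipeline: adaptive (local) thresholding to
    cope with varying noise levels, then a circular Hough transform.
    """
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

    # Local threshold: each pixel is compared against its neighborhood mean
    binary = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY, 31, -2)

    blurred = cv2.GaussianBlur(binary, (3, 3), 0)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=4,
                               param1=50, param2=8, minRadius=1, maxRadius=6)
    return 0 if circles is None else circles.shape[1]

# print(count_foci("irradiated_nuclei.tif"))  # hypothetical input image
```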

  9. Global Sensitivity Analysis and Parameter Calibration for an Ecosystem Carbon Model

    NASA Astrophysics Data System (ADS)

    Safta, C.; Ricciuto, D. M.; Sargsyan, K.; Najm, H. N.; Debusschere, B.; Thornton, P. E.

    2013-12-01

    We present uncertainty quantification results for a process-based ecosystem carbon model. The model employs 18 parameters and is driven by meteorological data corresponding to years 1992-2006 at the Harvard Forest site. Daily Net Ecosystem Exchange (NEE) observations were available to calibrate the model parameters and test the performance of the model. A global sensitivity analysis was first performed to determine the important model parameters based on their contribution to the variance of NEE. We then proceeded to calibrate the model parameters in a Bayesian framework. The daily discrepancies between measured and predicted NEE values were modeled as independent and identically distributed Gaussians with prescribed daily variance according to the recorded instrument error. All model parameters were assumed to have uninformative priors with bounds set according to expert opinion. The global sensitivity results show that the rate of leaf fall (LEAFALL) is responsible for approximately 25% of the total variance in the average NEE for 1992-2005. A set of four other parameters, nitrogen use efficiency (NUE), the base rate for maintenance respiration (BR_MR), the growth respiration fraction (RG_FRAC), and allocation to the plant stem pool (ASTEM), contribute between 5% and 12% to the variance in average NEE, while the remaining parameters have smaller contributions. The posterior distributions, sampled with a Markov chain Monte Carlo algorithm, show good predictive capabilities for the calibrated model and exhibit significant correlations between model parameters. However, LEAFALL, the most important parameter for the average NEE, is not informed by the observational data, while less important parameters show significant updates between their prior and posterior densities. The Fisher information matrix values, indicating which parameters are most informed by the experimental observations, are examined to augment the comparison between the calibration and global sensitivity analysis results.
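    As a concrete, minimal illustration of variance-based global sensitivity analysis of this kind, the sketch below computes first-order Sobol indices with the SALib package for a three-parameter stand-in model. The parameter names are borrowed from the abstract, while the bounds and the model function are invented placeholders rather than the actual process model.

    ```python
    # Variance-based (Sobol) sensitivity sketch with SALib; model_nee() is
    # a stand-in for the 18-parameter ecosystem carbon model.
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,                       # 18 in the actual study
        "names": ["LEAFALL", "NUE", "RG_FRAC"],
        "bounds": [[0.0, 1.0], [0.0, 60.0], [0.0, 0.5]],  # assumed bounds
    }

    def model_nee(x):
        # Placeholder returning an "average NEE"-like scalar.
        return 0.25 * x[0] + 0.05 * x[1] - 0.1 * x[2] + 0.01 * x[0] * x[1]

    X = saltelli.sample(problem, 1024)       # Saltelli sampling design
    Y = np.apply_along_axis(model_nee, 1, X)
    Si = sobol.analyze(problem, Y)           # first-order and total indices
    print(dict(zip(problem["names"], Si["S1"])))
    ```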

  10. ASPASIA: A toolkit for evaluating the effects of biological interventions on SBML model behaviour.

    PubMed

    Evans, Stephanie; Alden, Kieran; Cucurull-Sanchez, Lourdes; Larminie, Christopher; Coles, Mark C; Kullberg, Marika C; Timmis, Jon

    2017-02-01

    A calibrated computational model reflects behaviours that are expected or observed in a complex system, providing a baseline upon which sensitivity analysis techniques can be used to analyse pathways that may impact model responses. However, calibration of a model where a behaviour depends on an intervention introduced after a defined time point is difficult, as model responses may be dependent on the conditions at the time the intervention is applied. We present ASPASIA (Automated Simulation Parameter Alteration and SensItivity Analysis), a cross-platform, open-source Java toolkit that addresses a key deficiency in software tools for understanding the impact an intervention has on system behaviour for models specified in Systems Biology Markup Language (SBML). ASPASIA can generate and modify models using SBML solver output as an initial parameter set, allowing interventions to be applied once a steady state has been reached. Additionally, multiple SBML models can be generated where a subset of parameter values are perturbed using local and global sensitivity analysis techniques, revealing the model's sensitivity to the intervention. To illustrate the capabilities of ASPASIA, we demonstrate how this tool has generated novel hypotheses regarding the mechanisms by which Th17-cell plasticity may be controlled in vivo. By using ASPASIA in conjunction with an SBML model of Th17-cell polarisation, we predict that promotion of the Th1-associated transcription factor T-bet, rather than inhibition of the Th17-associated transcription factor RORγt, is sufficient to drive switching of Th17 cells towards an IFN-γ-producing phenotype. Our approach can be applied to all SBML-encoded models to predict the effect that intervention strategies have on system behaviour. ASPASIA, released under the Artistic License (2.0), can be downloaded from http://www.york.ac.uk/ycil/software.
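    A minimal sketch of the parameter-perturbation step, assuming the python-libsbml bindings; the model file and parameter id are hypothetical, and this illustrates the general approach rather than ASPASIA's own code.

    ```python
    # Load an SBML model, perturb one parameter around its baseline, and
    # write one model variant per perturbation (a simple local analysis).
    import libsbml

    doc = libsbml.readSBML("th17_polarisation.xml")   # hypothetical model file
    model = doc.getModel()

    for scale in (0.5, 1.0, 2.0):            # one-at-a-time perturbations
        p = model.getParameter("k_tbet")     # hypothetical parameter id
        baseline = p.getValue()
        p.setValue(baseline * scale)
        libsbml.writeSBMLToFile(doc, f"th17_k_tbet_x{scale}.xml")
        p.setValue(baseline)                 # restore before the next variant
    ```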

  11. Analysis of Inorganic Nanoparticles by Single-particle Inductively Coupled Plasma Time-of-Flight Mass Spectrometry.

    PubMed

    Hendriks, Lyndsey; Gundlach-Graham, Alexander; Günther, Detlef

    2018-04-25

    Due to the rapid development of nanotechnologies, engineered nanomaterials (ENMs) and nanoparticles (ENPs) are becoming a part of everyday life: nanotechnologies are quickly migrating from laboratory benches to store shelves and industrial processes. As the use of ENPs continues to expand, their release into the environment is unavoidable; however, understanding the mechanisms and degree of ENP release is only possible through direct detection of these nanospecies in relevant matrices and at realistic concentrations. Key analytical requirements for quantitative detection of ENPs include high sensitivity to detect small particles at low total mass concentrations and the need to separate signals of ENPs from a background of dissolved elemental species and natural nanoparticles (NNPs). To this end, an emerging method called single-particle inductively coupled plasma mass spectrometry (sp-ICPMS) has demonstrated great potential for the characterization of inorganic nanoparticles (NPs) at environmentally relevant concentrations. Here, we comment on the capabilities of modern sp-ICPMS analysis with particular focus on the measurement possibilities offered by ICP-time-of-flight mass spectrometry (ICP-TOFMS). ICP-TOFMS delivers complete elemental mass spectra for individual NPs, which allows for high-throughput, untargeted quantitative analysis of dispersed NPs in natural matrices. Moreover, the multi-element detection capabilities of ICP-TOFMS enable new NP-analysis strategies, including online calibration via microdroplets for accurate NP mass quantification and matrix compensation.

  12. Choline kinase-alpha by regulating cell aggressiveness and drug sensitivity is a potential druggable target for ovarian cancer.

    PubMed

    Granata, A; Nicoletti, R; Tinaglia, V; De Cecco, L; Pisanu, M E; Ricci, A; Podo, F; Canevari, S; Iorio, E; Bagnoli, M; Mezzanzanica, D

    2014-01-21

    Aberrant choline metabolism has been proposed as a novel cancer hallmark. We recently showed that epithelial ovarian cancer (EOC) possesses an altered MRS-choline profile, characterised by increased phosphocholine (PCho) content, to which over-expression and activation of choline kinase-alpha (ChoK-alpha) mainly contribute. To assess its biological relevance, ChoK-alpha expression was downmodulated by transient RNA interference in EOC in vitro models. Gene expression profiling by microarray analysis and functional analysis were performed to identify the pathways/functions perturbed in ChoK-alpha-silenced cells, which were then validated by in vitro experiments. In silenced cells, compared with controls, we observed: (I) a significant reduction of both CHKA transcript and ChoK-alpha protein expression; (II) a dramatic, proportional drop in PCho content, ranging from 60 to 71%, as revealed by 1H-magnetic resonance spectroscopy analysis; (III) 35-36% cell growth inhibition, with no evidence of apoptosis or modification of the main cellular survival signalling pathways; (IV) 476 differentially expressed genes, including genes related to lipid metabolism. Ingenuity pathway analysis identified cellular functions related to cell death and cellular proliferation and movement as the most perturbed. Accordingly, CHKA-silenced cells displayed a significant delay in wound repair; reduced migration and invasion capabilities were also observed. Furthermore, although CHKA silencing did not directly induce cell death, a significant increase in sensitivity to platinum, paclitaxel and doxorubicin was observed even in a drug-resistant context. We showed for the first time in EOC that CHKA downregulation significantly decreased aggressive EOC cell behaviour while also affecting the cells' sensitivity to drug treatment. These observations open the way to further analysis for the validation of ChoK-alpha as a new EOC therapeutic target, to be used alone or in combination with conventional drugs.

  13. Resolution, sensitivity, and in vivo application of high-resolution computed tomography for titanium-coated polymethyl methacrylate (PMMA) dental implants.

    PubMed

    Cuijpers, Vincent M J I; Jaroszewicz, Jacub; Anil, Sukumaran; Al Farraj Aldosari, Abdullah; Walboomers, X Frank; Jansen, John A

    2014-03-01

    The aims of this study were (i) to determine the spatial resolution and sensitivity of micro- versus nano-computed tomography (CT) techniques and (ii) to validate micro- versus nano-CT in a dog dental implant model, compared with histological analysis. To determine spatial resolution and sensitivity, reference samples containing standardized nano- and microspheres were prepared in polymer and ceramic matrices. Thereafter, 10 titanium-coated polymer dental implants (3.2 mm in Ø by 4 mm in length) were placed in the mandibles of Beagle dogs. Both micro- and nano-CT, as well as histological analyses, were performed. The reference samples confirmed the high resolution of the nano-CT system, which was capable of revealing sub-micron structures embedded in radiodense matrices. The dog implantation study and subsequent statistical analysis showed equal values for bone area and bone-implant contact measurements between micro-CT and histology. However, because of the limited sample size and field of view, nano-CT did not render reliable data representative of the entire bone-implant specimen. Micro-CT analysis is an efficient tool for quantitating bone healing parameters at the bone-implant interface, especially when using titanium-coated PMMA implants. Nano-CT is not suitable for such quantification, but reveals complementary morphological information rivaling histology, with the advantage of 3D visualization.

  14. The receiver operational characteristic for binary classification with multiple indices and its application to the neuroimaging study of Alzheimer's disease.

    PubMed

    Wu, Xia; Li, Juan; Ayutyanont, Napatkamon; Protas, Hillary; Jagust, William; Fleisher, Adam; Reiman, Eric; Yao, Li; Chen, Kewei

    2013-01-01

    Given a single index, receiver operational characteristic (ROC) curve analysis is routinely utilized for characterizing performance in distinguishing two conditions/groups in terms of sensitivity and specificity. Given the availability of multiple data sources (referred to as multi-indices), such as multimodal neuroimaging data sets, cognitive tests, clinical ratings and genomic data in Alzheimer’s disease (AD) studies, the single-index-based ROC underutilizes the available information. A number of algorithmic/analytic approaches combining multiple indices have long been widely used to incorporate multiple sources simultaneously. In this study, we propose an alternative for combining multiple indices using logical operations, such as “AND,” “OR,” and “at least n” (where n is an integer), to construct a multivariate ROC (multiV-ROC) and characterize the sensitivity and specificity statistically associated with the use of multiple indices. With and without “leave-one-out” cross-validation, we used two data sets from AD studies to showcase the potentially increased sensitivity/specificity of the multiV-ROC in comparison to the single-index ROC and linear discriminant analysis (an analytic way of combining multi-indices). We conclude that, for the data sets we investigated, the proposed multiV-ROC approach is capable of providing a natural and practical alternative with improved classification accuracy as compared to univariate ROC and linear discriminant analysis.
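    To make the logical-combination idea concrete, the toy sketch below thresholds two synthetic indices and reports sensitivity and specificity for the "AND" and "OR" rules; the data and cutoffs are invented, and this illustrates the concept rather than the authors' multiV-ROC procedure.

    ```python
    # Combine two thresholded indices with logical AND / OR and report
    # sensitivity and specificity for each rule (synthetic data).
    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 200)                # 1 = patient, 0 = control
    idx1 = y + rng.normal(0, 0.8, 200)         # e.g., an imaging index
    idx2 = y + rng.normal(0, 1.0, 200)         # e.g., a cognitive score

    def sens_spec(pred, truth):
        sens = pred[truth == 1].mean()         # true-positive rate
        spec = (1 - pred[truth == 0]).mean()   # true-negative rate
        return sens, spec

    p1, p2 = idx1 > 0.5, idx2 > 0.5            # single-index decisions
    for name, pred in [("AND", p1 & p2), ("OR", p1 | p2)]:
        print(name, sens_spec(pred.astype(int), y))
    # An "at least n" rule generalizes this: (p1.astype(int) + p2.astype(int)) >= n
    ```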

  16. Molecular and Proteomic Analysis of Levofloxacin and Metronidazole Resistant Helicobacter pylori.

    PubMed

    Hanafi, Aimi; Lee, Woon Ching; Loke, Mun Fai; Teh, Xinsheng; Shaari, Ain; Dinarvand, Mojdeh; Lehours, Philippe; Mégraud, Francis; Leow, Alex Hwong Ruey; Vadivelu, Jamuna; Goh, Khean Lee

    2016-01-01

    Antibiotic resistance in bacteria incurs a fitness cost, but compensatory mechanisms may ameliorate the cost and sustain the resistance even under antibiotic-free conditions. The aim of this study was to determine compensatory mechanisms of antibiotic resistance in H. pylori. Five strains of levofloxacin-sensitive H. pylori were induced in vitro to develop resistance. In addition, four pairs of metronidazole-sensitive and -resistant H. pylori strains were isolated from patients carrying dual H. pylori populations consisting of both sensitive and resistant phenotypes. Growth rate, virulence and biofilm-forming ability of the sensitive and resistant strains were compared to determine the effects of the compensatory response. Proteome profiles of paired sensitive and resistant strains were analyzed by liquid chromatography/mass spectrometry (LC/MS). Although there were no significant differences in growth rate between sensitive and resistant pairs, bacterial virulence (in terms of the abilities to induce apoptosis and form biofilm) differed from pair to pair. These findings demonstrate the complex and strain-specific phenotypic changes in compensation for antibiotic resistance. Compensation for in vitro induced levofloxacin resistance involving mutations of gyrA and gyrB was functionally random. Furthermore, higher protein translation and non-functional protein degradation capabilities in naturally-occurring dual-population metronidazole sensitive-resistant strains may be a possible alternative mechanism underlying resistance to metronidazole without mutations in rdxA and frxA. This may explain the lack of mutations in target genes in ~10% of metronidazole-resistant strains.

  18. Biosensors in Health Care: The Milestones Achieved in Their Development towards Lab-on-Chip-Analysis

    PubMed Central

    Patel, Suprava; Nanda, Rachita; Sahoo, Sibasish; Mohapatra, Eli

    2016-01-01

    The immense potential of biosensors in medical diagnostics has driven scientists to evolve biosensor technologies and innovate newer tools over time. Biosensors owe their popularity for sensing a wide range of biomolecules in medical diagnostics to their simplicity of operation, high sensitivity, ability to perform multiplex analysis, and capability to integrate different functions on the same chip. There remains a huge challenge in meeting the demands of performance while retaining simplicity and affordability. The ultimate goal is to provide point-of-care testing facilities to remote areas worldwide, particularly in developing countries. This entails continuous development in technology towards the multiplexing ability, fabrication, and miniaturization of biosensor devices so that they can provide lab-on-chip-analysis systems to the community. PMID:27042353

  19. Development and validation of a numerical acoustic analysis program for aircraft interior noise prediction

    NASA Astrophysics Data System (ADS)

    Garcea, Ralph; Leigh, Barry; Wong, R. L. M.

    Reduction of interior noise in propeller-driven aircraft, to levels comparable with those obtained in jet transports, has become a leading factor in the early design stages of the new generation of turboprops, and may be essential if these new designs are to succeed. The need for an analytical capability to predict interior noise is accepted throughout the turboprop aircraft industry. To this end, an analytical noise prediction program, which incorporates the SYSNOISE numerical acoustic analysis software, is under development at de Havilland. The discussion contained herein looks at the development program and how it was used in a design sensitivity analysis to optimize the structural design of the aircraft cabin for the purpose of reducing interior noise levels. This report also summarizes the validation of the SYSNOISE package using numerous classical cases from the literature.

  20. Innovative Tools and Technology for Analysis of Single Cells and Cell-Cell Interaction.

    PubMed

    Konry, Tania; Sarkar, Saheli; Sabhachandani, Pooja; Cohen, Noa

    2016-07-11

    Heterogeneity in single-cell responses and intercellular interactions results from complex regulation of cell-intrinsic and environmental factors. Single-cell analysis allows not only detection of individual cellular characteristics but also correlation of genetic content with phenotypic traits in the same cell. Technological advances in micro- and nanofabrication have benefited single-cell analysis by allowing precise control of the localized microenvironment, cell manipulation, and sensitive detection capabilities. Additionally, microscale techniques permit rapid, high-throughput, multiparametric screening that has become essential for -omics research. This review highlights innovative applications of microscale platforms in genetic, proteomic, and metabolic detection in single cells; cell sorting strategies; and heterotypic cell-cell interaction. We discuss key design aspects of single-cell localization and isolation in microfluidic systems, dynamic and endpoint analyses, and approaches that integrate highly multiplexed detection of various intracellular species.

  1. Collagen morphology and texture analysis: from statistics to classification

    PubMed Central

    Mostaço-Guidolin, Leila B.; Ko, Alex C.-T.; Wang, Fei; Xiang, Bo; Hewko, Mark; Tian, Ganghong; Major, Arkady; Shiomi, Masashi; Sowa, Michael G.

    2013-01-01

    In this study we present an image analysis methodology capable of quantifying morphological changes in tissue collagen fibril organization caused by pathological conditions. Texture analysis based on first-order statistics (FOS) and second-order statistics such as the gray level co-occurrence matrix (GLCM) was explored to extract second-harmonic generation (SHG) image features that are associated with the structural and biochemical changes of tissue collagen networks. Based on these extracted quantitative parameters, multi-group classification of SHG images was performed. With combined FOS and GLCM texture values, we achieved reliable classification of SHG collagen images acquired from atherosclerotic arteries with >90% accuracy, sensitivity and specificity. The proposed methodology can be applied to a wide range of conditions involving collagen remodeling, such as skin disorders, different types of fibrosis and musculoskeletal diseases affecting ligaments and cartilage. PMID:23846580
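    A minimal sketch of this style of FOS/GLCM feature extraction using scikit-image (recent versions spell the relevant functions graycomatrix/graycoprops); the image path, distances, and angles are assumptions.

    ```python
    # Extract first-order and GLCM texture features from a grayscale image
    # as a feature vector for a downstream classifier.
    import numpy as np
    from skimage import io, img_as_ubyte
    from skimage.feature import graycomatrix, graycoprops

    img = img_as_ubyte(io.imread("shg_collagen.tif", as_gray=True))  # assumed file

    # First-order statistics straight from the grey-level distribution.
    features = {"mean": img.mean(), "std": img.std()}

    # Second-order statistics from a grey-level co-occurrence matrix.
    glcm = graycomatrix(img, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        features[prop] = graycoprops(glcm, prop).mean()

    print(features)
    ```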

  2. Fibre optic technique for simultaneous measurement of strain and temperature variations in composite materials

    NASA Astrophysics Data System (ADS)

    Michie, W. C.; Culshaw, Brian; Roberts, Scott S. J.; Davidson, Roger

    1991-12-01

    A technique based upon the differential sensitivities of dual-mode and polarimetric sensing schemes is shown to be capable of simultaneously resolving strain and temperature variations to within 20 micro-epsilon and 1 K over a strain and temperature excursion of 2 micro-epsilon and 45 K. The technique is evaluated experimentally over an 80 cm sensing length of unembedded optical fiber and in an 8-ply unidirectional carbon/epoxide laminate subject to temperature and strain cycling. A comparative analysis of the performance of the embedded and unembedded fiber sensors is presented.

  3. A computer program for detailed analysis of the takeoff and approach performance capabilities of transport category aircraft

    NASA Technical Reports Server (NTRS)

    Foss, W. E., Jr.

    1979-01-01

    The takeoff and approach performance of an aircraft is calculated in accordance with the airworthiness standards of the Federal Aviation Regulations. The aircraft and flight constraints are represented in sufficient detail to permit realistic sensitivity studies in terms of either configuration modifications or changes in operational procedures. The program may be used to investigate advanced operational procedures for noise alleviation such as programmed throttle and flap controls. Extensive profile time history data are generated and are placed on an interface file which can be input directly to the NASA aircraft noise prediction program (ANOPP).

  4. Hyperbolic Rendezvous at Mars: Risk Assessments and Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    Jedrey, Ricky; Landau, Damon; Whitley, Ryan

    2015-01-01

    Given the current interest in the use of flyby trajectories for human Mars exploration, a key requirement is the capability to execute hyperbolic rendezvous. Hyperbolic rendezvous is used to transport crew from a Mars-centered orbit to a transiting Earth-bound habitat performing a flyby. Representative cases are taken from potential future missions of this type, and a thorough sensitivity analysis of the hyperbolic rendezvous phase is performed, including early engine cutoff, missed burn times, and burn misalignment. A finite-burn engine model is applied that assumes the hyperbolic rendezvous phase is accomplished with at least two burns.

  5. Estimation of Plutonium-240 Mass in Waste Tanks Using Ultra-Sensitive Detection of Radioactive Xenon Isotopes from Spontaneous Fission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowyer, Theodore W.; Gesh, Christopher J.; Haas, Daniel A.

    This report details efforts to develop a technique able to detect and quantify the mass of 240Pu in waste storage tanks and other enclosed spaces. If the isotopic ratios of the plutonium contained in the enclosed space are also known, then this technique is capable of estimating the total mass of the plutonium without physical sample retrieval and radiochemical analysis of hazardous material. Results utilizing this technique are reported for a Hanford Site waste tank (TX-118) and a well-characterized plutonium sample in a laboratory environment.

  6. Rich sensitivities: an analysis of conflict among women in feminist memoir.

    PubMed

    Taylor, Judith

    2009-05-01

    While the North American women's movement is best known for its efforts to transform social relations between women and men, its adherents have also focused on remaking relations among women. Using an innovative data source, the social movement memoir, this paper indicates the depth of disappointment feminists cause one another. Memoirists dispute notions, found both in the movement and in the mainstream, that women are socially capable. The paper offers the concept of "relational ideation" to describe the way feminist memoirists critically examine taken-for-granted understandings of women's sociality and amplify their desires for a new social ethic among them.

  7. CMOS Time-Resolved, Contact, and Multispectral Fluorescence Imaging for DNA Molecular Diagnostics

    PubMed Central

    Guo, Nan; Cheung, Ka Wai; Wong, Hiu Tung; Ho, Derek

    2014-01-01

    Instrumental limitations such as bulkiness and high cost prevent the fluorescence technique from becoming ubiquitous for point-of-care deoxyribonucleic acid (DNA) detection and other in-field molecular diagnostics applications. The complementary metal-oxide-semiconductor (CMOS) technology, benefiting from process scaling, provides several advanced capabilities such as high integration density, high-resolution signal processing, and low power consumption, enabling sensitive, integrated, and low-cost fluorescence analytical platforms. In this paper, CMOS time-resolved, contact, and multispectral imaging are reviewed. Recently reported CMOS fluorescence analysis microsystem prototypes are surveyed to highlight the present state of the art. PMID:25365460

  8. Array Detector Modules for Spent Fuel Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolotnikov, Aleksey

    Brookhaven National Laboratory (BNL) proposes to evaluate arrays of position-sensitive virtual Frisch-grid (VFG) detectors for passive gamma-ray emission tomography (ET) to verify spent fuel in storage casks before they are stored in geo-repositories. Our primary objective is to conduct a preliminary analysis of the arrays' capabilities and to perform field measurements to validate the effectiveness of the proposed array modules. The outcome of this proposal will consist of baseline designs for the future ET system, which can ultimately be used together with neutron detectors. This will demonstrate the usage of this technology for spent fuel storage casks.

  9. Low frequency vibration isolation technology for microgravity space experiments

    NASA Technical Reports Server (NTRS)

    Grodsinsky, Carlos M.; Brown, Gerald V.

    1989-01-01

    The dynamic acceleration environment observed on Space Shuttle flights to date and predicted for the Space Station has complicated the analysis of prior microgravity experiments and prompted concern for the viability of proposed space experiments requiring long-term, low-g environments. Isolation systems capable of providing significant improvements in this environment exist, but have not been demonstrated in flight configurations. This paper presents a summary of the theoretical evaluation for two one degree-of-freedom (DOF) active magnetic isolators and their predicted response to both direct and base excitations, that can be used to isolate acceleration sensitive microgravity space experiments.

  10. Main propulsion system design recommendations for an advanced Orbit Transfer Vehicle

    NASA Technical Reports Server (NTRS)

    Redd, L.

    1985-01-01

    Various main propulsion system configurations of an advanced OTV are evaluated with respect to the probability of nonindependent failures, i.e., engine failures that disable the entire main propulsion system. Analysis of the life-cycle cost (LCC) indicates that LCC is sensitive to main propulsion system reliability, vehicle dry weight, and propellant cost; it is relatively insensitive to the number of missions between overhauls, failures per mission, and EVA and IVA costs. In conclusion, two or three engines are recommended in view of their high reliability, minimum life-cycle cost, and fail-operational/fail-safe capability.

  11. Reversible Aptamer-Au Plasmon Rulers for Secreted Single Molecules

    DOE PAGES

    Lee, Somin Eunice; Chen, Qian; Bhat, Ramray; ...

    2015-06-03

    Plasmon rulers, consisting of pairs of gold nanoparticles, allow single-molecule analysis without photobleaching or blinking; however, current plasmon rulers are irreversible, restricting detection to only single events. Here, we present a reversible plasmon ruler, comprised of coupled gold nanoparticles linked by a single aptamer, capable of binding individual secreted molecules with high specificity. We show that the binding of target secreted molecules to the reversible plasmon ruler is characterized by single-molecule sensitivity, high specificity, and reversibility. Lastly, such reversible plasmon rulers should enable dynamic and adaptive live-cell measurement of secreted single molecules in their local microenvironment.

  12. iTOUGH2 V6.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finsterle, Stefan A.

    2010-11-01

    iTOUGH2 (inverse TOUGH2) provides inverse modeling capabilities for TOUGH2, a simulator for multi-dimensional, multi-phase, multi-component, non-isothermal flow and transport in fractured porous media. It performs sensitivity analysis, parameter estimation, and uncertainty propagation analysis in geosciences, reservoir engineering, and other application areas. It supports a number of different combinations of fluids and components [equation-of-state (EOS) modules]. In addition, the optimization routines implemented in iTOUGH2 can also be used for sensitivity analysis, automatic model calibration, and uncertainty quantification of any external code that uses text-based input and output files; this link is achieved by means of the PEST application programming interface. iTOUGH2 solves the inverse problem by minimizing a non-linear objective function of the weighted differences between model output and the corresponding observations. Multiple minimization algorithms (derivative-free, gradient-based, and second-order; local and global) are available. iTOUGH2 also performs Latin Hypercube Monte Carlo simulation for uncertainty propagation analysis. A detailed residual and error analysis is provided. This upgrade includes new EOS modules (specifically EOS7c, ECO2N, and TMVOC), hysteretic relative permeability and capillary pressure functions, and the PEST API. More details can be found at http://esd.lbl.gov/iTOUGH2 and the publications cited there. Hardware Req.: Multi-platform. Related/auxiliary software: PVM (if running in parallel).
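    As a generic illustration of the weighted-residual objective described above (not iTOUGH2 code), the sketch below calibrates a two-parameter toy model with a gradient-based least-squares minimizer from SciPy.

    ```python
    # Minimize the weighted differences between model output and synthetic
    # observations, in the spirit of the objective function described above.
    import numpy as np
    from scipy.optimize import least_squares

    t_obs = np.linspace(0.0, 10.0, 25)
    y_obs = 2.0 * np.exp(-0.3 * t_obs) + np.random.default_rng(1).normal(0, 0.05, 25)
    sigma = 0.05                                  # observation standard error

    def model(theta, t):
        a, k = theta                              # toy two-parameter model
        return a * np.exp(-k * t)

    def residuals(theta):
        return (model(theta, t_obs) - y_obs) / sigma   # weighted differences

    fit = least_squares(residuals, x0=[1.0, 0.1])      # gradient-based minimizer
    print(fit.x)                                       # recovered (a, k)
    ```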

  13. New technologies for advanced three-dimensional optimum shape design in aeronautics

    NASA Astrophysics Data System (ADS)

    Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno

    1999-05-01

    The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes into a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are gradient-based, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are helping to make such an ambitious project, of including a state-of-the-art flow analysis code in an optimization loop, feasible. Among those technologies, there are three important issues that this paper addresses: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly large geometries.

  14. Surface flaw reliability analysis of ceramic components with the SCARE finite element postprocessor program

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, John P.; Nemeth, Noel N.

    1987-01-01

    The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program for statistical fast fracture reliability analysis with quadratic elements for volume-distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface-flaw-induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general-purpose finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface- and volume-flaw-based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for polyaxial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface- and volume-flaw-induced fracture can also be calculated from modulus-of-rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface-flaw-induced failure, along with selected input and output formats and options, are summarized. An example problem demonstrating various features of the program is included.
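    In its simplest uniaxial, unit-volume form, the two-parameter Weibull model underlying such analyses reduces to a closed-form cumulative failure probability, sketched below with illustrative numbers that are not drawn from the report.

    ```python
    # Two-parameter Weibull cumulative failure probability of the kind used
    # in fast-fracture reliability codes (simplest uniaxial form).
    import numpy as np

    def weibull_pf(sigma, sigma_0, m):
        """Failure probability for applied stress sigma, characteristic
        strength sigma_0, and Weibull modulus m."""
        return 1.0 - np.exp(-((sigma / sigma_0) ** m))

    # Illustrative numbers only: ~15% failure probability.
    print(weibull_pf(sigma=250.0, sigma_0=300.0, m=10.0))
    ```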

  16. Supercontinuum Fourier transform spectrometry with balanced detection on a single photodiode

    DOE PAGES

    Goncharov, Vasily; Hall, Gregory

    2016-08-25

    Here, we have developed phase-sensitive signal detection and processing algorithms for Fourier transform spectrometers fitted with supercontinuum sources for applications requiring ultimate sensitivity. Similar to the well-established approach of source-noise cancellation through balanced detection of monochromatic light, our method is capable of reducing the relative intensity noise of polychromatic light by 40 dB. Unlike conventional balanced detection, which relies on differential absorption measured with a well-matched pair of photo-detectors, our algorithm utilizes phase-sensitive differential detection on a single photodiode and is capable of real-time correction for instabilities in the supercontinuum spectral structure over a broad range of wavelengths. The resulting method is universal in terms of applicable wavelengths and compatible with commercial spectrometers. We present a proof-of-principle experimental demonstration.

  17. Adaptive Planning: Understanding Organizational Workload to Capability/ Capacity through Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Hase, Chris

    2010-01-01

    In August 2003, the Secretary of Defense (SECDEF) established the Adaptive Planning (AP) initiative [1] with the objective of reducing the time necessary to develop and revise Combatant Commander (COCOM) contingency plans and increasing SECDEF plan visibility. In addition to reducing the traditional plan development timeline from twenty-four months to less than twelve months (with a goal of six months) [2], AP increased plan visibility to Department of Defense (DoD) leadership through In-Progress Reviews (IPRs). The IPR process, as well as the increased number of campaign and contingency plans COCOMs had to develop, increased the workload while the number of planners remained fixed. Several efforts, from collaborative planning tools to streamlined processes, were initiated to compensate for the increased workload, enabling COCOMs to better meet shorter planning timelines. This paper examines the Joint Strategic Capabilities Plan (JSCP) directed contingency planning and staffing requirements assigned to a combatant commander staff through the lens of modeling and simulation. The dynamics of developing a COCOM plan are captured with an ExtendSim [3] simulation. The resulting analysis provides a quantifiable means by which to measure a combatant commander staff's workload associated with developing and staffing JSCP [4] directed contingency plans against COCOM capability/capacity. Modeling and simulation bring significant opportunities for measuring the sensitivity of key variables in the workload-to-capability/capacity assessment. Gaining an understanding of the relationship between plan complexity, the number of plans, planning processes, and the number of planners on the one hand, and the time required for plan development on the other, provides valuable information to DoD leadership. Through modeling and simulation, AP leadership can gain greater insight for key decisions on where best to allocate scarce resources in an effort to meet DoD planning objectives.

  18. Balloon-borne three-meter telescope for far-infrared and submillimeter astronomy

    NASA Technical Reports Server (NTRS)

    Fazio, G. G.

    1985-01-01

    Presented are scientific objectives, engineering analysis and design, and results of technology development for a Three-Meter Balloon-Borne Far-Infrared and Submillimeter Telescope. The scientific rationale is based on two crucial instrumental capabilities: high angular resolution which approaches eight arcseconds at one hundred micron wavelength, and high resolving power spectroscopy with good sensitivity throughout the telescope's 30-micron to 1-mm wavelength range. The high angular resolution will allow us to resolve and study in detail such objects as collapsing protostellar condensations in our own galaxy, clusters of protostars in the Magellanic clouds, giant molecular clouds in nearby galaxies, and spiral arms in distant galaxies. The large aperture of the telescope will permit sensitive spectral line measurements of molecules, atoms, and ions, which can be used to probe the physical, chemical, and dynamical conditions in a wide variety of objects.

  19. A solar infrared photometer for space flight application

    NASA Technical Reports Server (NTRS)

    Kostiuk, Theodor; Deming, Drake

    1991-01-01

    A photometer concept which is capable of nearly simultaneous measurements of solar radiation from 1.6 to 200 microns in seven wavelength bands is described. This range of wavelengths can probe the solar photosphere from below the level of unit optical depth in the visible to the temperature minimum, about 500 km above it. An instrument package including a 20-cm Gregorian telescope and a filter wheel photometer utilizing noncryogenic pyroelectric infrared detectors is described. Approaches to the rejection of the visible solar spectrum in the instrument, the availability of optical and mechanical components, and the expected instrumental sensitivity are discussed. For wavelengths below 35 microns, the projected instrumental sensitivity is found to be adequate to detect the intensity signature of solar p-mode oscillations during 5 min of integration. For longer wavelengths, clear detection is expected through Fourier analysis of modest data sets.

  20. CO2 lidar for measurements of trace gases and wind velocities

    NASA Technical Reports Server (NTRS)

    Hess, R. V.

    1982-01-01

    CO2 lidar systems technology and signal processing requirements relevant to measurement needs and sensitivity are discussed. Doppler processing is similar to microwave radar, with signal reception controlled by a computer capable of both direct and heterodyne operations. Trace gas concentrations have been obtained with the NASA DIAL system, and trace gas transport has been determined with Doppler lidar measurements for wind velocity and turbulence. High vertical resolution measurement of trace gases, wind velocity, and turbulence are most important in the planetary boundary layer and in regions between the PBL and the lower stratosphere. Shear measurements are critical for airport operational safety. A sensitivity analysis for heterodyne detection with the DIAL system and for short pulses using a Doppler lidar system is presented. The development of transient injection locking techniques, as well as frequency stability by reducing chirp and catalytic control of closed cycle CO2 laser chemistry, is described.

  1. Communications network design and costing model technical manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    This computer model provides the capability for analyzing long-haul trunking networks comprising a set of user-defined cities, traffic conditions, and tariff rates. Networks may consist of all terrestrial connectivity, all satellite connectivity, or a combination of terrestrial and satellite connectivity. Network solutions provide the least-cost routes between all cities, the least-cost network routing configuration, and terrestrial and satellite service cost totals. The CNDC model allows analyses involving three specific FCC-approved tariffs, which are uniquely structured and representative of most existing service connectivity and pricing philosophies. User-defined tariffs that can be variations of these three tariffs are accepted as input to the model and allow considerable flexibility in network problem specification. The resulting model extends the domain of network analysis from traditional fixed link cost (distance-sensitive) problems to more complex problems involving combinations of distance and traffic-sensitive tariffs.
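    As a toy illustration of least-cost routing under a distance-sensitive tariff, the sketch below uses networkx with invented cities, link distances, and a flat per-kilometer rate; the actual CNDC model handles much richer tariff structures, including traffic-sensitive ones.

    ```python
    # Least-cost routing between user-defined cities with a simple
    # distance-sensitive tariff as the edge cost (all data invented).
    import networkx as nx

    RATE_PER_KM = 0.12                    # illustrative tariff ($/km)
    links = [("NYC", "CHI", 1145), ("CHI", "DEN", 1480),
             ("DEN", "LAX", 1340), ("NYC", "ATL", 1200), ("ATL", "LAX", 3100)]

    G = nx.Graph()
    for a, b, km in links:
        G.add_edge(a, b, cost=km * RATE_PER_KM)

    route = nx.shortest_path(G, "NYC", "LAX", weight="cost")
    cost = nx.shortest_path_length(G, "NYC", "LAX", weight="cost")
    print(route, round(cost, 2))          # cheapest route and its cost
    ```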

  2. Polinsar Experiments of Multi-Mode X-Band Data Over South Area of China

    NASA Astrophysics Data System (ADS)

    Lu, L.; Yan, Q.; Duan, M.; Zhang, Y.

    2012-08-01

    This paper presents polarimetric and polarimetric interferometric synthetic aperture radar (PolInSAR) experiments with high-resolution X-band data acquired by the Multi-mode airborne SAR system over an area around Linshui in southern China containing tropical vegetation and urban areas. A polarimetric analysis of typical tropical vegetation and man-made objects is presented, and polarimetric descriptors sensitive to vegetation and man-made objects are selected. The PolInSAR information contained in the data is then investigated; considering the characteristics of the Multi-mode-XSAR dataset, a dual-baseline polarimetric interferometry method is proposed. The method both guarantees high coherence on fully polarimetric data and combines the benefits of short and long baselines, which aids phase unwrapping and improves height sensitivity. The PolInSAR experimental results demonstrate that Multi-mode-XSAR datasets are well suited to a range of applications, including land classification, object detection, and DSM mapping.

  3. Multidirectional flexible force sensors based on confined, self-adjusting carbon nanotube arrays

    NASA Astrophysics Data System (ADS)

    Lee, J.-I.; Pyo, Soonjae; Kim, Min-Ook; Kim, Jongbaeg

    2018-02-01

    We demonstrate a highly sensitive force sensor based on self-adjusting carbon nanotube (CNT) arrays. Aligned CNT arrays are directly synthesized on silicon microstructures by a space-confined growth technique, which enables a facile self-adjusting contact. To afford flexibility and softness, the patterned microstructures with the integrated CNTs are embedded in polydimethylsiloxane structures. The sensing mechanism is based on variations in the contact resistance between the facing CNT arrays under the applied force. By finite element analysis, proper dimensions and positions for each component are determined. Further, high sensitivities of the proposed sensors, up to 15.05%/mN, were confirmed experimentally. Multidirectional sensing capability could also be achieved by designing multiple sets of sensing elements in a single sensor. The sensors show long-term operational stability, owing to the unique properties of the constituent CNTs, such as outstanding mechanical durability and elasticity.

  4. Application of gamma imaging techniques for the characterisation of position sensitive gamma detectors

    NASA Astrophysics Data System (ADS)

    Habermann, T.; Didierjean, F.; Duchêne, G.; Filliger, M.; Gerl, J.; Kojouharov, I.; Li, G.; Pietralla, N.; Schaffner, H.; Sigward, M.-H.

    2017-11-01

    A device to characterize position-sensitive germanium detectors has been implemented at GSI. The main component of this so-called scanning table is a gamma camera that is capable of producing online 2D images of the scanned detector by means of a PET technique. To calibrate the gamma camera, Compton imaging is employed. The 2D data can be processed further offline to obtain depth information. Of main interest is the response of the scanned detector in terms of the digitized pulse shapes from the preamplifier. This is an important input for pulse-shape analysis algorithms as they are in use for gamma-tracking arrays in gamma spectroscopy. To validate the scanning table, a comparison of its results with those of a second scanning table implemented at the IPHC Strasbourg is envisaged. For this purpose, a pixelated germanium detector has been scanned.

  5. A Portable Immunoassay Platform for Multiplexed Detection of Biotoxins in Clinical and Environmental Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, Chung-Yan; Piccini, Matthew Ernest; Schaff, Ulrich Y.

    Multiple cases of attempted bioterrorism events using biotoxins have highlighted the urgent need for tools capable of rapid screening of suspect samples in the field (e.g., mailrooms and public events). We present a portable microfluidic device capable of analyzing environmental (e.g., white powder), food (e.g., milk) and clinical (e.g., blood) samples for multiplexed detection of biotoxins. The device is rapid (<15-30 min sample-to-answer), sensitive (<0.08 pg/mL detection limit for botulinum toxin), multiplexed (up to 64 parallel assays) and capable of analyzing small-volume samples (<20 μL total sample input). The immunoassay approach (SpinDx) is based on binding of toxins in a sample to antibody-laden capture particles, followed by sedimentation of the particles through a density medium in a microfluidic disk and quantification using a laser-induced fluorescence detector. A direct, blinded comparison with a gold-standard ELISA revealed a 5-fold more sensitive detection limit for botulinum toxin while requiring 250-fold less sample volume and a 30-minute assay time, with a near-unity correlation. A key advantage of the technique is its compatibility with a variety of sample matrices with no additional sample preparation required. Ultrasensitive quantification has been demonstrated by direct analysis of multiple clinical, environmental and food samples, including white powder, whole blood, saliva, salad dressing, whole milk, peanut butter, half and half, honey, and canned meat. We believe that this device can meet an urgent need in screening both potentially exposed people and suspicious samples in mailrooms, airports, public sporting venues and emergency rooms. The general-purpose immunodiagnostics device can also find applications in the screening of infectious and systemic diseases or serve as a lab device for conducting rapid immunoassays.

  6. The effect of free radical inhibitor on the sensitized radiation crosslinking and thermal processing stabilization of polyurethane shape memory polymers.

    PubMed

    Hearon, Keith; Smith, Sarah E; Maher, Cameron A; Wilson, Thomas S; Maitland, Duncan J

    2013-02-01

    The effects of free radical inhibitor on the electron beam crosslinking and thermal processing stabilization of novel radiation-crosslinkable polyurethane shape memory polymers (SMPs) blended with acrylic radiation sensitizers have been determined. The SMPs in this study possess novel processing capabilities, that is, the ability to be melt-processed into complex geometries as thermoplastics and then crosslinked in a secondary step using electron beam irradiation. To increase susceptibility to radiation crosslinking, the radiation sensitizer pentaerythritol triacrylate (PETA) was solution blended with thermoplastic polyurethane SMPs made from 2-butene-1,4-diol and trimethylhexamethylene diisocyanate (TMHDI). Because thermoplastic melt processing methods such as injection molding are often carried out at elevated temperatures, sensitizer thermal instability is a major processing concern. Free radical inhibitor can be added to provide thermal stabilization; however, inhibitor can also undesirably inhibit radiation crosslinking. In this study, we quantified both the thermal stabilization and the radiation crosslinking inhibition effects of the inhibitor 1,4-benzoquinone (BQ) on polyurethane SMPs blended with PETA. Sol/gel analysis of irradiated samples showed that the inhibitor had little to no adverse effect on gel fraction at concentrations of 0-10,000 ppm, and dynamic mechanical analysis showed only a slight negative correlation between BQ composition and rubbery modulus. The 1,4-benzoquinone was also highly effective in thermally stabilizing the acrylic sensitizers. The polymer blends could be heated to 150°C for up to 5 hours or to 125°C for up to 24 hours if stabilized with 10,000 ppm BQ, and could also be heated to 125°C for up to 5 hours if stabilized with 1000 ppm BQ, without sensitizer reaction occurring. We believe this study provides significant insight into methods for manipulating the competing mechanisms of radiation crosslinking and thermal stabilization of radiation sensitizers, thereby facilitating further development of radiation-crosslinkable thermoplastic SMPs.

  8. High Energy Astronomy Observatory (HEAO)

    NASA Image and Video Library

    1975-01-01

    The family of High Energy Astronomy Observatory (HEAO) instruments consisted of three unmanned scientific observatories capable of detecting the x-rays emitted by celestial bodies with high sensitivity and high resolution. Celestial gamma-ray and cosmic-ray fluxes were also collected and studied to learn more about the mysteries of the universe. High-energy rays cannot be studied by Earth-based observatories because the obscuring effects of the atmosphere prevent the rays from reaching the Earth's surface. They had initially been observed by sounding rockets, balloons, and small satellites that did not possess the instrumentation capabilities required for high data resolution and sensitivity. The HEAO carried the instrumentation necessary for this capability. In this photograph, an artist's concept of the three HEAO spacecraft is shown: HEAO-1, launched on August 12, 1977; HEAO-2, launched on November 13, 1978; and HEAO-3, launched on September 20, 1979.

  9. Communication Studies of DMP and SMP Machines

    NASA Technical Reports Server (NTRS)

    Sohn, Andrew; Biswas, Rupak; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    Understanding the interplay between machines and problems is key to obtaining high performance on parallel machines. This paper investigates the interplay between programming paradigms and the communication capabilities of parallel machines. In particular, we explicate the communication capabilities of the IBM SP-2 distributed-memory multiprocessor and the SGI PowerCHALLENGEarray symmetric multiprocessor. Two benchmark problems, bitonic sorting and the Fast Fourier Transform, are selected for experiments. Communication-efficient algorithms are developed to exploit the overlapping capabilities of the machines. Programs are written in the Message-Passing Interface for portability, and identical codes are used for both machines. Various data sizes and message sizes are used to test the machines' communication capabilities. Experimental results indicate that the communication performance of the multiprocessors is consistent with the size of the messages. The SP-2 is sensitive to message size but yields much higher communication overlap because of its communication co-processor. The PowerCHALLENGEarray is not highly sensitive to message size and yields little communication overlap. Bitonic sorting yields lower performance than FFT due to a smaller computation-to-communication ratio.
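    The overlapping referred to above can be sketched with non-blocking MPI calls: post the communication, do useful computation, then complete the exchange. The mpi4py example below assumes exactly two ranks and an arbitrary buffer size.

    ```python
    # Communication/computation overlap with non-blocking MPI (mpi4py).
    # Run with e.g. `mpiexec -n 2 python overlap.py` (two ranks assumed).
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    peer = 1 - rank                       # partner rank in a 2-rank job

    send = np.full(1_000_000, rank, dtype="d")
    recv = np.empty_like(send)

    # Post the exchange, compute while the transfer is in flight,
    # then wait for both requests to complete.
    reqs = [comm.Isend(send, dest=peer), comm.Irecv(recv, source=peer)]
    local = np.square(send).sum()         # useful work during the transfer
    MPI.Request.Waitall(reqs)
    print(rank, local, recv[0])
    ```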

  10. Analysis of painted arts by energy sensitive radiographic techniques with the Pixel Detector Timepix

    NASA Astrophysics Data System (ADS)

    Zemlicka, J.; Jakubek, J.; Kroupa, M.; Hradil, D.; Hradilova, J.; Mislerova, H.

    2011-01-01

    Non-invasive techniques utilizing X-ray radiation offer a significant advantage in scientific investigations of painted artworks and other cultural artefacts such as statues. In addition, there is great demand for a mobile analytical and real-time imaging device, given that many works of fine art cannot be transported. The highly sensitive hybrid semiconductor pixel detector, Timepix, is capable of detecting and resolving subtle and low-contrast differences in the inner composition of a wide variety of objects. Moreover, it is able to map the surface distribution of the contained elements. Several transmission and emission techniques that have been proposed and tested for the analysis of painted artworks are presented. This study focuses on the novel techniques of X-ray transmission radiography (conventional and energy sensitive) and X-ray induced fluorescence imaging (XRF), which can be realised at the table-top scale with the state-of-the-art pixel detector Timepix. Transmission radiography analyses the changes in X-ray beam intensity caused by the specific attenuation of different components in the sample. The conventional approach uses all energies from the source spectrum to create the image, while the energy-sensitive alternative creates images in given energy intervals, which enables identification and separation of materials. The XRF setup is based on the detection of characteristic radiation induced by X-ray photons through a pinhole collimator. The XRF method is extremely sensitive to material composition, but it creates only surface maps of the elemental distribution. For the purpose of the analysis, several sets of painted layers were prepared in a restoration laboratory. The composition of these layers corresponds to that of real historical paintings from the 19th century. An overview of the current status of our methods is given with respect to the instrumentation and its application in the field of cultural heritage.

  11. Quantitative topographic differentiation of the neonatal EEG.

    PubMed

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. 21 healthy, full-term newborns were examined polygraphically during sleep (EEG with 8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged, and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both spectral features and features describing the shape and variability of the signal account largely for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis is capable of assessing the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method holds promise for its application in clinical practice.
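
    One family of the 13 variables, the spectral features, can be sketched directly; the Python fragment below computes per-channel band powers with Welch's method on synthetic signals, with the channel names and sampling rate assumed rather than taken from the study.

```python
# Illustrative sketch only (the study's 13 features are not reproduced):
# per-channel spectral band powers, one family of features used to
# differentiate EEG topography. Channel names and rate are hypothetical.
import numpy as np
from scipy.signal import welch

fs = 128  # Hz, assumed sampling rate
rng = np.random.default_rng(0)
eeg = {ch: rng.standard_normal(fs * 300) for ch in ("Fp1", "Fp2", "C3", "C4")}

bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13)}

for ch, x in eeg.items():
    f, pxx = welch(x, fs=fs, nperseg=fs * 4)
    powers = {name: np.trapz(pxx[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
              for name, (lo, hi) in bands.items()}
    print(ch, {k: round(v, 3) for k, v in powers.items()})
```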

  12. Cost Model Comparison: A Study of Internally and Commercially Developed Cost Models in Use by NASA

    NASA Technical Reports Server (NTRS)

    Gupta, Garima

    2011-01-01

    NASA makes use of numerous cost models to accurately estimate the cost of various components of a mission - hardware, software, mission/ground operations - during the different stages of a mission's lifecycle. The purpose of this project was to survey these models and determine in which respects they are similar and in which they are different. The initial survey included a study of the cost drivers for each model, the form of each model (linear/exponential/other CER, range/point output, capable of risk/sensitivity analysis), and for what types of missions and for what phases of a mission lifecycle each model is capable of estimating cost. The models taken into consideration consisted of both those that were developed by NASA and those that were commercially developed: GSECT, NAFCOM, SCAT, QuickCost, PRICE, and SEER. Once the initial survey was completed, the next step in the project was to compare the cost models' capabilities in terms of Work Breakdown Structure (WBS) elements. This final comparison was then portrayed in a visual manner with Venn diagrams. All of the materials produced in the process of this study were then posted on the Ground Segment Team (GST) Wiki.

  13. Hydrographic charting from Landsat satellite - A comparison with aircraft imagery

    NASA Technical Reports Server (NTRS)

    Middleton, E. M.; Barker, J. L.

    1976-01-01

    The relative capabilities of two remote-sensing systems in measuring depth and, consequently, bottom contours in sandy-bottomed and sediment-laden coastal waters were determined quantitatively. The Multispectral Scanner (MSS), orbited on the Landsat-2 satellite, and the Ocean Color Scanner (OCS), flown on U-2 aircraft, were used for this evaluation. Analysis of imagery taken simultaneously indicates a potential for hydrographic charting of marine coastal and shallow shelf areas, even when water turbidity is a factor. Several of the eight optical channels examined on the OCS were found to be sensitive to depth or depth-related information. The greatest sensitivity was in OCS-4 (0.544 ± 0.012 μm), from which contours corresponding to depths up to 12 m were determined. The sharpness of these contours and their spatial stability through time suggest that upwelling radiance is a measure of bottom reflectance and not of water turbidity. The two visible channels on Landsat's MSS were less sensitive in the discrimination of contours, with depths up to 8 m in the high-gain mode (3×) determined in MSS-4 (0.5 to 0.6 μm).

  14. Hydrographic charting from LANDSAT Satellite: A comparison with aircraft imagery

    NASA Technical Reports Server (NTRS)

    Middleton, E. M.; Barker, J. L.

    1976-01-01

    The relative capabilities of two remote-sensing systems in measuring depth and, consequently, bottom contours in sandy-bottomed and sediment-laden coastal waters were determined quantitatively. The multispectral scanner (MSS), orbited on the LANDSAT-2 Satellite, and the ocean color scanner (OCS), flown on U-2 aircraft, were used. Analysis of imagery taken simultaneously indicates a potential for hydrographic charting of marine coastal and shallow shelf areas, even when water turbidity is a factor. Several of the eight optical channels examined on the OCS were found to be sensitive to depth or depth-related information. The greatest sensitivity was in OCS-4 (0.544 ± 0.012 μm), from which contours corresponding to depths up to 12 m were determined. The sharpness of these contours and their spatial stability through time suggest that upwelling radiance is a measure of bottom reflectance and not of water turbidity. The two visible channels on LANDSAT's MSS were less sensitive in the discrimination of contours, with depths up to 8 m in the high-gain mode (3×) determined in MSS-4 (0.5 to 0.6 μm).

  15. Fast cholesterol detection using flow injection microfluidic device with functionalized carbon nanotubes based electrochemical sensor.

    PubMed

    Wisitsoraat, A; Sritongkham, P; Karuwan, C; Phokharatkul, D; Maturos, T; Tuantranont, A

    2010-12-15

    This work reports a new cholesterol detection scheme using a functionalized carbon nanotube (CNT) electrode in a polydimethylsiloxane/glass flow injection microfluidic chip. The CNT working, silver reference, and platinum counter electrode layers were fabricated on the chip by sputtering and low temperature chemical vapor deposition methods. Cholesterol oxidase prepared in polyvinyl alcohol solution was immobilized on the CNTs by an in-channel flow technique. Cholesterol analysis based on flow injection chronoamperometric measurement was performed in 150-μm-wide and 150-μm-deep microchannels. Fast and sensitive real-time detection was achieved with a high throughput of more than 60 samples per hour and a small sample volume of 15 μl. The cholesterol sensor had a linear detection range between 50 and 400 mg/dl. In addition, low cross-sensitivities toward glucose, ascorbic acid, acetaminophen and uric acid were confirmed. The proposed system is promising for clinical diagnostics of cholesterol with high-speed real-time detection capability, very low sample consumption, high sensitivity, low interference and good stability.
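
    The reported linear detection range implies a straight-line calibration; a hedged sketch of such a fit follows, with invented current readings rather than the paper's data.

```python
# Hedged sketch of fitting a linear amperometric calibration over a
# 50-400 mg/dl range; the current values below are invented for
# illustration, not data from the paper.
import numpy as np

conc = np.array([50, 100, 200, 300, 400], dtype=float)   # mg/dl
current = np.array([0.21, 0.43, 0.82, 1.24, 1.61])       # uA, hypothetical

slope, intercept = np.polyfit(conc, current, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((current - pred) ** 2) / np.sum((current - current.mean()) ** 2)
print(f"sensitivity ~ {slope * 1e3:.2f} nA per mg/dl, R^2 = {r2:.4f}")
```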

  16. Trends in mass spectrometry instrumentation for proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Richard D.

    2002-12-01

    Mass spectrometry has become a primary tool for proteomics due to its capabilities for rapid and sensitive protein identification and quantitation. It is now possible to identify thousands of proteins from microgram sample quantities in a single day and to quantify relative protein abundances. However, the needs for increased capabilities for proteome measurements are immense and are now driving both new strategies and instrument advances. These developments include those based on integration with multi-dimensional liquid separations and high accuracy mass measurements, and promise more than order-of-magnitude improvements in sensitivity, dynamic range, and throughput for proteomic analyses in the near future.

  17. Portable SERS sensor for malachite green and other small dye molecules

    NASA Astrophysics Data System (ADS)

    Qiu, Suyan; Zhao, Fusheng; Li, Jingting; Shih, Wei-Chuan

    2017-02-01

    Sensitive on-site detection of specific chemicals can be extremely powerful in many fields. Owing to its molecular fingerprinting capability, surface-enhanced Raman scattering has been one of the technological contenders. In this paper, we describe the novel use of a DNA topological nanostructure on a nanoporous gold nanoparticle (NPG-NP) array chip for chemical sensing. NPG-NPs feature a large surface area and high-density plasmonic field enhancement known as "hotspots". Hence, NPG-NP array chips have found many applications in nanoplasmonic sensor development. This technique provides novel label-free molecular sensing capability and enables high-sensitivity, high-specificity detection using a portable Raman spectrometer.

  18. Thermal hydraulic simulations, error estimation and parameter sensitivity studies in Drekar::CFD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Thomas Michael; Shadid, John N.; Pawlowski, Roger P.

    2014-01-01

    This report describes work directed towards completion of the Thermal Hydraulics Methods (THM) CFD Level 3 Milestone THM.CFD.P7.05 for the Consortium for Advanced Simulation of Light Water Reactors (CASL) Nuclear Hub effort. The focus of this milestone was to demonstrate the thermal hydraulics and adjoint-based error estimation and parameter sensitivity capabilities in the CFD code Drekar::CFD. This milestone builds upon the capabilities demonstrated in three earlier milestones: THM.CFD.P4.02 [12], completed March 31, 2012; THM.CFD.P5.01 [15], completed June 30, 2012; and THM.CFD.P5.01 [11], completed October 31, 2012.

  19. A nano-patterned self assembled monolayer (SAM) rutile titania cancer chip for rapid, low cost, highly sensitive, direct cancer analysis in MALDI-MS.

    PubMed

    Manikandan, M; Gopal, Judy; Hasan, Nazim; Wu, Hui-Fen

    2014-12-01

    We developed a cancer chip by nano-patterning a highly sensitive SAM titanium surface capable of capturing and sensing concentrations as low as 10 cancer cells/mL from the environment by Matrix Assisted Laser Desorption and Ionization Time of Flight Mass Spectrometry (MALDI-TOF MS). The current approach evades any form of pretreatment and sample preparation; it is time saving and does not require the expensive conventional MALDI target plate. The home-made aluminium (Al) target holder, on which we loaded the cancer chips for MALDI-TOF MS analysis, costs about 60 USD, while a conventional stainless steel MALDI target plate costs more than 700 USD. The SAM surface was an effective platform enabling on-chip direct MALDI-MS detection of cancer cells. We compared the functionality of this chip with unmodified titanium surfaces and thermally oxidized (TO) titanium surfaces. The lowest detectable concentration of the TO chip was 10³ cells/mL, while that of the control (unmodified) titanium chips was 10⁶ cells/mL. Compared to the control surface, the SAM cancer chip showed a 100,000-fold sensitivity enhancement, and compared with the TO chip, a 1000-fold enhancement. The high sensitivity of the SAM surfaces is attributed to the presence of the rutile SAM, surface roughness and surface wettability, as confirmed by AFM, XRD, contact angle microscopy and FE-SEM. This study opens a new avenue for the potent application of the SAM cancer chip for direct cancer diagnosis by MALDI-TOF MS in the near future.

  20. Handheld confocal Raman microspectrometer for in-vivo skin cancer measurement

    NASA Astrophysics Data System (ADS)

    Lieber, Chad A.; Ellis, Darrel L.; Billheimer, D. D.; Mahadevan-Jansen, Anita

    2004-07-01

    Several studies have demonstrated Raman spectroscopy to be capable of tissue diagnosis with accuracy rivaling that of histopathologic analysis. This technique obtains biochemical-specific information noninvasively, and can eliminate the pain, time, and cost associated with biopsy and pathological analysis. Furthermore, when used in a confocal arrangement, Raman spectra can be obtained from localized regions of the tissue. Skin cancers are an ideal candidate for this emerging technology, due to their obvious accessibility and presentation at specific depths. However, most commercially available confocal Raman microspectrometers are large, rigid systems ill-suited for clinical application. We developed a bench-top confocal Raman microspectrometer using a portable external-cavity diode laser excitation source. This system was used to study several skin lesions in vitro. Results show that depth-resolved Raman spectra can diagnose in vitro skin lesions with 96% sensitivity, 88% specificity, and 86% pathological classification accuracy. Based on the success of this study, a portable Raman system with a handheld confocal microscope was developed for clinical application. Preliminary in vivo data show several distinct spectral differences between skin pathologies. Diagnostic algorithms are planned for this continuing study to assess the capability of Raman spectroscopy for clinical skin cancer diagnosis.
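
    The reported figures are simple confusion-matrix ratios; as a worked example (with hypothetical counts chosen only to land near the quoted 96% and 88%):

```python
# Worked example of the reported metrics: sensitivity and specificity are
# ratios over a confusion matrix. Counts below are hypothetical, not the
# study's lesion tallies.
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec(tp=24, fn=1, tn=22, fp=3)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 96%, 88%
```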

  1. Aviation System Analysis Capability Air Carrier Investment Model-Cargo

    NASA Technical Reports Server (NTRS)

    Johnson, Jesse; Santmire, Tara

    1999-01-01

    The purpose of the Aviation System Analysis Capability (ASAC) Air Carrier Investment Model-Cargo (ACIMC) is to examine the economic effects of technology investment on the air cargo market, particularly the market for new cargo aircraft. To do so, we have built an econometrically based model designed to operate like the ACIM. Two main drivers account for virtually all of the demand: the growth rate of the Gross Domestic Product (GDP) and changes in the fare yield (a proxy for the price charged, or fare). Differences from the passenger aircraft market arise from a combination of the nature of air cargo demand and the peculiarities of the air cargo market. The net effect of these two factors is that sales of new cargo aircraft are much less sensitive to either increases in GDP or changes in the costs of labor, capital, fuel, materials, and energy associated with the production of new cargo aircraft than are sales of new passenger aircraft. This, in conjunction with the relatively small size of the cargo aircraft market, means that technology improvements to cargo aircraft will do relatively little to spur increased sales of new cargo aircraft.
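
    An econometric demand model with GDP and fare yield as drivers is commonly fit as a log-log regression whose coefficients are elasticities; below is a hedged sketch on synthetic data, not the ACIMC's actual specification.

```python
# Hedged sketch of a log-log demand regression with GDP and fare yield as
# drivers; the fitted coefficients b1, b2 are elasticities. All numbers are
# synthetic, not ACIMC inputs.
import numpy as np

rng = np.random.default_rng(1)
gdp = np.linspace(8.0, 12.0, 40)            # GDP index, synthetic
yield_ = 1.5 - 0.02 * np.arange(40)         # fare yield proxy, synthetic
demand = np.exp(0.5 + 1.8 * np.log(gdp) - 0.9 * np.log(yield_)
                + rng.normal(0, 0.02, 40))  # cargo traffic, synthetic

X = np.column_stack([np.ones(40), np.log(gdp), np.log(yield_)])
beta, *_ = np.linalg.lstsq(X, np.log(demand), rcond=None)
print(f"GDP elasticity ~ {beta[1]:.2f}, yield elasticity ~ {beta[2]:.2f}")
```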

  2. Concurrent topology optimization for minimization of total mass considering load-carrying capabilities and thermal insulation simultaneously

    NASA Astrophysics Data System (ADS)

    Long, Kai; Wang, Xuan; Gu, Xianguang

    2017-09-01

    The present work introduces a novel concurrent optimization formulation to meet the requirements of lightweight design and various constraints simultaneously. Nodal displacement of the macrostructure and effective thermal conductivity of the microstructure are regarded as the constraint functions, which means taking into account both the load-carrying capabilities and the thermal insulation properties. The effective properties of the porous material derived from numerical homogenization are used for macrostructural analysis. Meanwhile, displacement vectors of the macrostructure from the original and adjoint load cases are used for sensitivity analysis of the microstructure. Design variables in the form of reciprocal functions of relative densities are introduced and used for linearization of the constraint function. The objective function of total mass is approximately expressed by a second-order Taylor series expansion. The proposed concurrent optimization problem is then solved using a sequential quadratic programming algorithm, by splitting it into a series of sub-problems in the form of quadratic programs. Finally, several numerical examples are presented to validate the effectiveness of the proposed optimization method. The effects of initial designs, prescribed limits of nodal displacement, and effective thermal conductivity on optimized designs are also investigated. A number of optimized macrostructures and their corresponding microstructures are obtained.
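
    The solution pattern, a mass objective minimized under a displacement constraint via sequential quadratic programming, can be illustrated on a toy sizing problem; the sketch below uses SciPy's SLSQP solver and is not the paper's implementation.

```python
# Toy of the same pattern (not the paper's concurrent formulation):
# minimize mass subject to a nodal-displacement limit, solved with SLSQP.
import numpy as np
from scipy.optimize import minimize

L, E, F, u_max = 1.0, 70e9, 1e4, 1e-3   # bar length, modulus, load, limit

def mass(a):
    # Total mass is proportional to the sum of cross-sectional areas here.
    return a.sum()

def disp_constraint(a):
    # Axial displacement u = F L / (E A) per bar; require u <= u_max.
    u = F * L / (E * a)
    return u_max - u                     # each entry must be >= 0

res = minimize(mass, x0=np.array([1e-4, 1e-4]), method="SLSQP",
               bounds=[(1e-6, 1e-2)] * 2,
               constraints={"type": "ineq", "fun": disp_constraint})
print(res.x, mass(res.x))  # areas shrink until the displacement limit binds
```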

  3. Using long-term ground-based HSRL and geostationary observations in combination with model re-analysis to help disentangle local and long-range transported aerosols in Seoul, South Korea

    NASA Astrophysics Data System (ADS)

    Phillips, C.; Holz, R.; Eloranta, E. W.; Reid, J. S.; Kim, S. W.; Kuehn, R.; Marais, W.

    2017-12-01

    The University of Wisconsin High Spectral Resolution Lidar (HSRL) has been continuously operating at Seoul National University as part of the Korea-United States Air Quality Study (KORUS-AQ). The instrument was installed in March of 2016 and continues to operate as of August 2017, providing a truly unique data set for monitoring aerosol and cloud properties. With its capability to separate molecular and particulate scattering, the HSRL is able to detect extremely thin aerosol layers with sub-molecular scattering sensitivity. The system deployed in Seoul has depolarization measurements at 532 nm as well as a near IR channel at 1064 nm, providing discrimination between dust, smoke, pollution, water clouds, and ice clouds. As will be presented, these capabilities can be used to produce three-channel combined RGB images that provide visualization of small changes in the aerosol properties. A primary motivation of KORUS-AQ was to determine the relative effects of transported pollution and local pollution on air quality in Seoul. We hypothesize that HSRL-based image analysis algorithms combined with satellite and model re-analysis have the potential to identify cases when remote sources of aerosols and pollution are advected into the boundary layer with impacts to the surface air quality. To facilitate this research we have developed the capability to combine ten-minute geostationary imagery from Himawari-8, nearby radiosondes, model output, surface PM measurements, and AERONET data over the HSRL site. On a case-by-case basis, it is possible to separate layers of aerosols with different scattering properties using these tools. Additionally, a preliminary year-long aerosol climatology with integrated geostationary retrievals and modeling data will be presented. The focus is on investigating correlations between the HSRL aerosol measurements (depolarization, color ratio, extinction, and lidar ratio) and the model output and aerosol sources. This analysis will use recently developed algorithms that automate the HSRL cloud and aerosol masking, providing the capability to characterize the seasonal changes in aerosol radiative properties and supplement the month-long field campaign with almost two years of continuous HSRL observations.

  4. Electric Propulsion Upper-Stage for Launch Vehicle Capability Enhancement

    NASA Technical Reports Server (NTRS)

    Kemp, Gregory E.; Dankanich, John W.; Woodcock, Gordon R.; Wingo, Dennis R.

    2007-01-01

    The NASA In-Space Propulsion Technology Project Office initiated a preliminary study to evaluate the performance benefits of a solar electric propulsion (SEP) upper-stage with existing and near-term small launch vehicles. The analysis included circular and elliptical Low Earth Orbit (LEO) to Geosynchronous Earth Orbit (GEO) transfers, and LEO to Low Lunar Orbit (LLO) applications. SEP subsystem options included state-of-the-art and near-term solar arrays and electric thrusters. In-depth evaluations of the Aerojet BPT-4000 Hall thruster and NEXT gridded ion engine were conducted to compare performance, cost and revenue potential. Preliminary results indicate that Hall thruster technology is favored for low-cost, low power SEP stages, while gridded-ion engines are favored for higher power SEP systems unfettered by transfer time constraints. A low-cost point design is presented that details one possible stage configuration and outlines system limitations, in particular fairing volume constraints. The results demonstrate mission enhancements to large and medium class launch vehicles, and mission enabling performance when SEP system upper stages are mounted to low-cost launchers such as the Minotaur and Falcon 1. Study results indicate the potential use of SEP upper stages to double GEO payload mass capability and to possibly enable launch on demand capability for GEO assets. Transition from government to commercial applications, with associated cost/benefit analysis, has also been assessed. The sensitivity of system performance to specific impulse, array power, thruster size, and component costs are also discussed.

  5. Development of GENOA Progressive Failure Parallel Processing Software Systems

    NASA Technical Reports Server (NTRS)

    Abdi, Frank; Minnetyan, Levon

    1999-01-01

    A capability consisting of software development and experimental techniques has been developed and is described. The capability is integrated into GENOA-PFA to model polymer matrix composite (PMC) structures. The capability considers the physics and mechanics of composite materials and structures by integrating hierarchical multilevel macro-scale (lamina, laminate, and structure) and micro-scale (fiber, matrix, and interface) simulation analyses. The modeling involves (1) ply layering methodology utilizing FEM elements with through-the-thickness representation, (2) simulation of the effects of material defects and conditions (e.g., voids, fiber waviness, and residual stress) on global static and cyclic fatigue strengths, (3) inclusion of material nonlinearities (by updating properties periodically) and geometrical nonlinearities (by Lagrangian updating), (4) simulation of crack initiation and growth to failure under static, cyclic, creep, and impact loads, (5) progressive fracture analysis to determine durability and damage tolerance, (6) identification of the percent contribution of various possible composite failure modes involved in critical damage events, and (7) determination of the sensitivities of failure modes to design parameters (e.g., fiber volume fraction, ply thickness, fiber orientation, and adhesive-bond thickness). GENOA-PFA progressive failure analysis is now ready for use to investigate the effects on structural response of PMC material degradation from damage induced by static, cyclic (fatigue), creep, and impact loading in 2D/3D PMC structures subjected to hygrothermal environments. Its use will significantly facilitate targeting design parameter changes that will be most effective in reducing the probability of a given failure mode occurring.

  6. Enabling inspection solutions for future mask technologies through the development of massively parallel E-Beam inspection

    NASA Astrophysics Data System (ADS)

    Malloy, Matt; Thiel, Brad; Bunday, Benjamin D.; Wurm, Stefan; Jindal, Vibhu; Mukhtar, Maseeh; Quoi, Kathy; Kemen, Thomas; Zeidler, Dirk; Eberle, Anna Lena; Garbowski, Tomasz; Dellemann, Gregor; Peters, Jan Hendrik

    2015-09-01

    The new device architectures and materials being introduced for sub-10nm manufacturing, combined with the complexity of multiple patterning and the need for improved hotspot detection strategies, have pushed current wafer inspection technologies to their limits. In parallel, gaps in mask inspection capability are growing as new generations of mask technologies are developed to support these sub-10nm wafer manufacturing requirements. In particular, the challenges associated with nanoimprint and extreme ultraviolet (EUV) mask inspection require new strategies that enable fast inspection at high sensitivity. The tradeoffs between sensitivity and throughput for optical and e-beam inspection are well understood. Optical inspection offers the highest throughput and is the current workhorse of the industry for both wafer and mask inspection. E-beam inspection offers the highest sensitivity but has historically lacked the throughput required for widespread adoption in the manufacturing environment. It is unlikely that continued incremental improvements to either technology will meet tomorrow's requirements, and therefore a new inspection technology approach is required: one that combines the high-throughput performance of optical with the high-sensitivity capabilities of e-beam inspection. To support the industry in meeting these challenges, SUNY Poly SEMATECH has evaluated disruptive technologies that can meet the requirements for high volume manufacturing (HVM), for both the wafer fab [1] and the mask shop. High-speed massively parallel e-beam defect inspection has been identified as the leading candidate for addressing the key gaps limiting today's patterned defect inspection techniques. As of late 2014, SUNY Poly SEMATECH had completed a review, system analysis, and proof-of-concept evaluation of multiple e-beam technologies for defect inspection. A champion approach has been identified based on a multibeam technology from Carl Zeiss. This paper includes a discussion of the need for high-speed e-beam inspection and then provides initial imaging results from EUV masks and wafers from 61- and 91-beam demonstration systems. Progress towards high resolution and consistent intentional defect arrays (IDA) is also shown.

  7. 47 CFR 80.913 - Radiotelephone receivers.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... pursuant to § 80.909 of this part. (c) If a very high frequency radiotelephone installation is provided... radiotelephone installation must have a sensitivity of at least 50 microvolts in the case of MF equipment, and 1... be capable of efficient operation when energized by the reserve source of energy. (g) The sensitivity...

  8. 47 CFR 80.913 - Radiotelephone receivers.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... pursuant to § 80.909 of this part. (c) If a very high frequency radiotelephone installation is provided... radiotelephone installation must have a sensitivity of at least 50 microvolts in the case of MF equipment, and 1... be capable of efficient operation when energized by the reserve source of energy. (g) The sensitivity...

  9. 47 CFR 80.913 - Radiotelephone receivers.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... pursuant to § 80.909 of this part. (c) If a very high frequency radiotelephone installation is provided... radiotelephone installation must have a sensitivity of at least 50 microvolts in the case of MF equipment, and 1... be capable of efficient operation when energized by the reserve source of energy. (g) The sensitivity...

  10. 47 CFR 80.913 - Radiotelephone receivers.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... pursuant to § 80.909 of this part. (c) If a very high frequency radiotelephone installation is provided... radiotelephone installation must have a sensitivity of at least 50 microvolts in the case of MF equipment, and 1... be capable of efficient operation when energized by the reserve source of energy. (g) The sensitivity...

  11. 47 CFR 80.913 - Radiotelephone receivers.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... pursuant to § 80.909 of this part. (c) If a very high frequency radiotelephone installation is provided... radiotelephone installation must have a sensitivity of at least 50 microvolts in the case of MF equipment, and 1... be capable of efficient operation when energized by the reserve source of energy. (g) The sensitivity...

  12. Sensitivity and fragmentation calibration of the time-of-flight mass spectrometer RTOF on board ESA's Rosetta mission

    NASA Astrophysics Data System (ADS)

    Gasc, Sébastien; Altwegg, Kathrin; Jäckel, Annette; Le Roy, Léna; Rubin, Martin; Fiethe, Björn; Mall, Urs; Rème, Henri

    2014-05-01

    The European Space Agency's Rosetta mission will rendezvous with comet 67P/Churyumov-Gerasimenko (67P) in September 2014. The Rosetta spacecraft, with the Rosetta Orbiter Spectrometer for Ion and Neutral Analysis (ROSINA) onboard, will follow and survey 67P for more than a year as the comet reaches its perihelion and beyond. ROSINA will provide new information on the global molecular, elemental, and isotopic composition of the coma [1]. ROSINA consists of a pressure sensor (COPS) and two mass spectrometers, the Double Focusing Mass Spectrometer (DFMS) and the Reflectron Time Of Flight mass spectrometer (RTOF). RTOF has a wide mass range, from 1 amu/e to >300 amu/e, and contains two ion sources, a reflectron, and two detectors. The two ion sources, the orthogonal source and the storage source, are capable of measuring cometary ions, while the latter also allows measurement of cometary neutral gas. In neutral gas mode, ionization is performed through electron impact. A built-in Gas Calibration Unit (GCU) contains a known gas mixture composed of He, CO2, and Kr that can be used for in-flight calibration of the instrument. Among other ROSINA-specific scientific goals, RTOF's task will be to determine the molecular composition of volatiles by measuring and separating heavy hydrocarbons; it has been designed to study the development of cometary activity as well as coma chemistry between 3.5 AU and perihelion. From spectroscopic studies and in-situ observations of other comets, we expect to find molecules such as H2O, CO, CO2, hydrocarbons, alcohols, formaldehyde, and other organic compounds in the coma of 67P/Churyumov-Gerasimenko [2]. To demonstrate and quantify the sensitivity and functionality of RTOF, calibration measurements have been carried out with more than 20 species, among them the most abundant molecules quoted above as well as other species such as PAHs. We describe the methods used to carry out this calibration and discuss our preliminary results, i.e., RTOF's capabilities in terms of sensitivity, isotopic ratios, and fragmentation patterns. We demonstrate that RTOF is well able to meet the requirements to address the scientific questions discussed above. [1] Balsiger, H. et al.: ROSINA-Rosetta Orbiter Spectrometer for Ion and Neutral Analysis, Space Science Reviews, Vol. 128, 745-801, 2007. [2] Bockelée-Morvan, D., Crovisier, J., Mumma, M. J., and Weaver, H. A.: The Composition of Cometary Volatiles, in Comets II (M. C. Festou et al., eds.), Univ. Arizona Press, Tucson, 2004.
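
    One calibration task the built-in GCU enables is fitting the usual time-of-flight relation t = a·sqrt(m/z) + b from the known He/CO2/Kr mixture; below is a hedged sketch with invented flight times, not RTOF telemetry.

```python
# Hedged illustration of a TOF mass calibration from a known gas mixture:
# fit t = a*sqrt(m/z) + b for He, CO2, Kr. Flight times are invented.
import numpy as np

mz = np.array([4.0, 44.0, 84.0])            # He, CO2, Kr (amu/e)
t_us = np.array([7.1, 21.3, 29.2])          # flight times in us, invented

a, b = np.polyfit(np.sqrt(mz), t_us, 1)     # linear in sqrt(m/z)
mass_of = lambda t: ((t - b) / a) ** 2      # invert: flight time -> m/z
print(f"a = {a:.3f} us/sqrt(amu/e), b = {b:.3f} us; "
      f"t = 15 us -> m/z ~ {mass_of(15):.1f}")
```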

  13. Ultra-sensitive fluorescent imaging-biosensing using biological photonic crystals

    NASA Astrophysics Data System (ADS)

    Squire, Kenny; Kong, Xianming; Wu, Bo; Rorrer, Gregory; Wang, Alan X.

    2018-02-01

    Optical biosensing is a growing area of research known for its low limits of detection. Among optical sensing techniques, fluorescence detection is among the most established and prevalent. Fluorescence imaging is an optical biosensing modality that exploits the sensitivity of fluorescence in an easy-to-use process. Fluorescence imaging allows a user to place a sample on a sensor and use an imager, such as a camera, to collect the results. The image can then be processed to determine the presence of the analyte. Fluorescence imaging is appealing because it can be performed with as little as a light source, a camera, and a data processor, making it ideal for untrained personnel without any expensive equipment. Fluorescence imaging sensors generally employ an immunoassay procedure to selectively trap analytes such as antigens or antibodies. When the analyte is present, the sensor fluoresces, thus transducing the chemical reaction into an optical signal capable of being imaged. Enhancement of this fluorescence leads to an enhancement in the detection capabilities of the sensor. Diatoms are unicellular algae with a biosilica shell called a frustule. The frustule is porous, with periodic nanopores, making diatoms biological photonic crystals. Additionally, the porous nature of the frustule provides a large surface area with many analyte binding sites. In this paper, we fabricate a diatom-based ultra-sensitive fluorescence imaging biosensor capable of detecting the antibody mouse immunoglobulin down to a concentration of 1 nM. The measured signal shows a 6× enhancement compared to sensors fabricated without diatoms.

  14. Malignancy Detection on Mammography Using Dual Deep Convolutional Neural Networks and Genetically Discovered False Color Input Enhancement.

    PubMed

    Teare, Philip; Fishman, Michael; Benzaquen, Oshra; Toledano, Eyal; Elnekave, Eldad

    2017-08-01

    Breast cancer is the most prevalent malignancy in the US and the third highest cause of cancer-related mortality worldwide. Regular mammography screening has been credited with doubling the rate of early cancer detection over the past three decades, yet estimates of mammographic accuracy in the hands of experienced radiologists remain suboptimal, with sensitivity ranging from 62 to 87% and specificity from 75 to 91%. Advances in machine learning (ML) in recent years have demonstrated capabilities of image analysis which often surpass those of human observers. Here we present two novel techniques to address inherent challenges in the application of ML to the domain of mammography. We describe the use of genetic search over image enhancement methods, leading us to a novel form of false color enhancement through contrast limited adaptive histogram equalization (CLAHE), as a method to optimize mammographic feature representation. We also utilize dual deep convolutional neural networks at different scales, for classification of full mammogram images and derivative patches, combined with a random forest gating network, as a novel architectural solution capable of discerning malignancy with a specificity of 0.91 and a sensitivity of 0.80. To our knowledge, this represents the first automatic stand-alone mammography malignancy detection algorithm with sensitivity and specificity performance similar to that of expert radiologists.
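
    The false-color idea can be approximated with standard tools: apply CLAHE at several settings and stack the results as color channels. The sketch below uses OpenCV with illustrative clip limits, not the genetically discovered settings from the paper, and a hypothetical input file name.

```python
# Hedged reconstruction of the false-color idea (not the authors' exact
# settings): apply CLAHE at several clip limits and stack the results as
# RGB channels, so one grayscale mammogram becomes a three-channel input.
import cv2
import numpy as np

gray = cv2.imread("mammogram.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file

channels = []
for clip in (1.0, 2.0, 4.0):  # illustrative clip limits, not the paper's
    clahe = cv2.createCLAHE(clipLimit=clip, tileGridSize=(8, 8))
    channels.append(clahe.apply(gray))

false_color = np.stack(channels, axis=-1)  # HxWx3 uint8 "false color" image
cv2.imwrite("mammogram_false_color.png", false_color)
```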

  15. Spatial Heterodyne Observations of Water (SHOW) vapour in the upper troposphere and lower stratosphere from a high altitude aircraft: Modelling and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Langille, J. A.; Letros, D.; Zawada, D.; Bourassa, A.; Degenstein, D.; Solheim, B.

    2018-04-01

    A spatial heterodyne spectrometer (SHS) has been developed to measure the vertical distribution of water vapour in the upper troposphere and the lower stratosphere with a high vertical resolution (∼500 m). The Spatial Heterodyne Observations of Water (SHOW) instrument combines an imaging system with a monolithic field-widened SHS to observe limb scattered sunlight in a vibrational band of water (1363 nm-1366 nm). The instrument has been optimized for observations from NASA's ER-2 aircraft as a proof-of-concept for a future low earth orbit satellite deployment. A robust model has been developed to simulate SHOW ER-2 limb measurements and retrievals. This paper presents the simulation of the SHOW ER-2 limb measurements along a hypothetical flight track and examines the sensitivity of the measurement and retrieval approach. Water vapour fields from an Environment and Climate Change Canada forecast model are used to represent realistic spatial variability along the flight path. High spectral resolution limb scattered radiances are simulated using the SASKTRAN radiative transfer model. It is shown that the SHOW instrument onboard the ER-2 is capable of resolving the water vapour variability in the UTLS from approximately 12 km - 18 km with ±1 ppm accuracy. Vertical resolutions between 500 m and 1 km are feasible. The along track sampling capability of the instrument is also discussed.

  16. Calibration of an Ultra-Low-Background Proportional Counter for Measuring 37Ar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seifert, Allen; Aalseth, Craig E.; Bonicalzi, Ricco

    An ultra-low-background proportional counter (ULBPC) design has been developed at Pacific Northwest National Laboratory (PNNL) using clean materials, primarily electrochemically-purified copper. This detector, along with an ultra-low-background counting system (ULBCS), was developed to complement a new shallow underground laboratory (30 meters water-equivalent) constructed at PNNL. The ULBCS design includes passive neutron and gamma shielding, along with an active cosmic-veto system. This system provides a capability for making ultra-sensitive measurements to support applications like age-dating soil hydrocarbons with 14C/3H, age-dating of groundwater with 39Ar, and soil-gas assay for 37Ar to support On-Site Inspection (OSI). On-Site Inspection is a key component of the verification regime for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Measurements of radionuclides created by an underground nuclear explosion are valuable signatures of a Treaty violation. For OSI, the 35-day half-life of 37Ar, produced from neutron interactions with calcium in soil, provides both high specific activity and sufficient time for inspection before decay limits sensitivity. This work describes the calibration techniques and analysis methods developed to enable quantitative measurements of 37Ar samples over a broad range of pressures. These efforts, along with parallel work in progress on gas chemistry separation, are expected to provide a significant new capability for 37Ar soil gas background studies.

  17. Sensitivity Analysis of Genetic Algorithm Parameters for Optimal Groundwater Monitoring Network Design

    NASA Astrophysics Data System (ADS)

    Abdeh-Kolahchi, A.; Satish, M.; Datta, B.

    2004-05-01

    A state-of-the-art groundwater monitoring network design method is introduced. The method combines groundwater flow and transport simulation results with Genetic Algorithm (GA) optimization to identify optimal monitoring well locations. Optimization theory uses different techniques to find a set of parameter values that minimize or maximize objective functions. The suggested optimal groundwater monitoring network design is based on the objective of maximizing the probability of tracking a transient contamination plume by determining sequential monitoring locations. The MODFLOW and MT3DMS models, included as separate modules within the Groundwater Modeling System (GMS), are used to develop a three-dimensional groundwater flow and contamination transport simulation. The flow and contamination simulation results are then used as input to the optimization model, which uses a GA to identify the optimal monitoring network design from several candidate monitoring locations. The design model uses a GA with binary variables representing potential monitoring locations. As the number of decision variables and constraints increases, the non-linearity of the objective function also increases, which makes optimal solutions difficult to obtain. The genetic algorithm is an evolutionary global optimization technique capable of finding the optimal solution for many complex problems. In this study, a GA approach capable of finding the global optimal solution to a groundwater monitoring network design problem involving 18.4 × 10¹⁸ feasible solutions is discussed. However, to ensure the efficiency of the solution process and the global optimality of the solution obtained using the GA, appropriate GA parameter values must be specified. A sensitivity analysis of genetic algorithm parameters such as the random seed, crossover probability, mutation probability, and elitism is presented for the solution of the monitoring network design problem.
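
    A minimal binary-chromosome GA makes the tuned parameters concrete; the sketch below exposes the crossover probability, mutation probability, elitism count, and random seed examined in the study, with a placeholder fitness standing in for the plume-tracking objective.

```python
# Minimal binary GA sketch exposing the parameters whose sensitivity the
# study examines; the fitness below is a stand-in, not the plume-tracking
# objective computed from MODFLOW/MT3DMS output.
import random

N_WELLS, POP, GENS = 30, 40, 100
P_CROSS, P_MUT, ELITE = 0.8, 0.02, 2          # GA parameters under study
random.seed(42)                               # the "random seed" parameter

def fitness(bits):                            # placeholder objective
    return sum(b * (i % 7) for i, b in enumerate(bits))

pop = [[random.randint(0, 1) for _ in range(N_WELLS)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    nxt = [p[:] for p in pop[:ELITE]]         # elitism: carry best forward
    while len(nxt) < POP:
        a, b = random.sample(pop[:POP // 2], 2)   # truncation selection
        cut = random.randrange(1, N_WELLS) if random.random() < P_CROSS else 0
        child = a[:cut] + b[cut:]             # one-point crossover
        child = [1 - g if random.random() < P_MUT else g for g in child]
        nxt.append(child)
    pop = nxt
print("best fitness:", fitness(max(pop, key=fitness)))
```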

  18. Thermodynamic Modeling and Dispatch of Distributed Energy Technologies including Fuel Cell -- Gas Turbine Hybrids

    NASA Astrophysics Data System (ADS)

    McLarty, Dustin Fogle

    Distributed energy systems are a promising means by which to reduce both emissions and costs. Continuous generators must be responsive and highly efficient to support building dynamics and intermittent on-site renewable power. Fuel cell/gas turbine (FC/GT) hybrids are fuel-flexible generators capable of ultra-high efficiency, ultra-low emissions, and rapid power response. This work undertakes a detailed study of the electrochemistry, chemistry and mechanical dynamics governing the complex interaction between the individual systems in such a highly coupled hybrid arrangement. The mechanisms leading to compressor stall/surge phenomena are studied given the increased risk they pose to particular hybrid configurations. A novel fuel cell modeling method is introduced that captures various spatial resolutions, flow geometries, stack configurations and novel heat transfer pathways. Several promising hybrid configurations are analyzed throughout the work, and a sensitivity analysis of seven design parameters is conducted. A simple method is introduced for estimating the combined system efficiency of a fuel cell and a turbine from component performance specifications. Existing solid oxide fuel cell technology is capable of hybrid efficiencies greater than 75% (LHV) operating on natural gas, and existing molten carbonate systems greater than 70% (LHV). A dynamic model is calibrated to accurately capture the physical coupling of a FC/GT demonstrator tested at UC Irvine. The 2900-hour experiment highlighted the sensitivity to small perturbations and a need for additional control development. Further sensitivity studies outlined the responsiveness and limits of different control approaches. The capability for substantial turn-down and load following through speed control and flow bypass, with minimal impact on internal fuel cell thermal distribution, is particularly promising for meeting local demands or providing dispatchable support for renewable power. Advanced control and dispatch heuristics are discussed using a case study of the UCI central plant. Thermal energy storage introduces a time horizon into the dispatch optimization which requires novel solution strategies. Highly efficient and responsive generators are required to meet the increasingly dynamic loads of today's efficient buildings and intermittent local renewable wind and solar power. Fuel cell gas turbine hybrids will play an integral role in the complex and ever-changing solution to local electricity production.

  19. Next generation laser-based standoff spectroscopy techniques for Mars exploration.

    PubMed

    Gasda, Patrick J; Acosta-Maeda, Tayro E; Lucey, Paul G; Misra, Anupam K; Sharma, Shiv K; Taylor, G Jeffrey

    2015-01-01

    In the recent Mars 2020 Rover Science Definition Team Report, the National Aeronautics and Space Administration (NASA) has sought the capability to detect and identify elements, minerals, and most importantly, biosignatures, at fine scales for the preparation of a retrievable cache of samples. The current Mars rover, the Mars Science Laboratory Curiosity, has a remote laser-induced breakdown spectroscopy (LIBS) instrument, a type of quantitative elemental analysis, called the Chemistry Camera (ChemCam) that has shown that laser-induced spectroscopy instruments are not only feasible for space exploration, but are reliable and complementary to traditional elemental analysis instruments such as the Alpha Particle X-Ray Spectrometer. The superb track record of ChemCam has paved the way for other laser-induced spectroscopy instruments, such as Raman and fluorescence spectroscopy. We have developed a prototype remote LIBS-Raman-fluorescence instrument, Q-switched laser-induced time-resolved spectroscopy (QuaLITy), which is approximately 70 000 times more efficient at recording signals than a commercially available LIBS instrument. The improved detection limits and sensitivity are due to our development of a directly coupled system, the use of an intensified charge-coupled device image detector, and a pulsed laser that allows for time-resolved measurements. We compare the LIBS capabilities of our system with an Ocean Optics spectrometer instrument at 7 m and 5 m distance. An increase in signal-to-noise ratio of at least an order of magnitude allows for more quantitative analysis of the elements in a LIBS spectrum with 200-300 μm spatial resolution at 7 m, a Raman instrument capable of 1 mm spatial resolution at 3 m, and bioorganic fluorescence detection at longer distances. Thus, the new QuaLITy instrument fulfills all of the NASA expectations for proposed instruments.

  20. Satellite laser ranging as a tool for the recovery of tropospheric gradients

    NASA Astrophysics Data System (ADS)

    Drożdżewski, M.; Sośnica, K.

    2018-11-01

    Space geodetic techniques, such as Global Navigation Satellite Systems (GNSS) and Very Long Baseline Interferometry (VLBI), have been extensively used for the recovery of tropospheric parameters. Both techniques employ microwave observations, for which the troposphere is a non-dispersive medium and which are very sensitive to the water vapor content. Satellite laser ranging (SLR) is the only space geodetic technique used for the definition of terrestrial reference frames that employs optical (laser) observations. The SLR sensitivity to the hydrostatic part of the troposphere delay is similar to that of microwave observations, whereas the sensitivity of laser observations to the non-hydrostatic part of the delay is about two orders of magnitude smaller than in the case of microwave observations. The troposphere is a dispersive medium at optical wavelengths, which means that the SLR tropospheric delay depends on the laser wavelength. This paper presents the sensitivity and capability of SLR observations for the recovery of azimuthal asymmetry over the SLR stations, which can be described as horizontal gradients of the troposphere delay. For the first time, the horizontal gradients are estimated together with the other parameters typically estimated from SLR observations to spherical LAGEOS satellites, i.e., station coordinates, earth rotation parameters, and satellite orbits. Most SLR stations are co-located with GNSS receivers; thus, a direct comparison between the two techniques is possible. We compare our SLR horizontal gradients to GNSS results and to the horizontal gradients derived from numerical weather models (NWM). Due to the small number of SLR observations, SLR is not capable of reconstructing short-period phenomena occurring in the atmosphere. However, long-term analysis allows for the recovery of atmospheric asymmetry using SLR. As a result, the mean offsets of the SLR-derived horizontal gradients agree to the level of 47%, 74%, and 54% with GNSS, the hydrostatic delay, and the total delay from NWM, respectively. SLR can thus be employed as a tool for the recovery of atmospheric parameters, with a major sensitivity to the hydrostatic part of the delay.
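
    The horizontal-gradient term such analyses estimate is commonly modeled in the Chen and Herring form; below is a sketch of that term with invented gradient values, not results from this study.

```python
# Sketch of the standard horizontal-gradient delay term (Chen & Herring
# form) estimated in such analyses; gradient values are invented, and the
# constant c is the commonly quoted hydrostatic value.
import numpy as np

def gradient_delay(elev_deg, az_deg, g_north_mm, g_east_mm, c=0.0032):
    """Azimuth-dependent slant-delay contribution from horizontal gradients."""
    e, a = np.radians(elev_deg), np.radians(az_deg)
    m_g = 1.0 / (np.sin(e) * np.tan(e) + c)   # gradient mapping function
    return m_g * (g_north_mm * np.cos(a) + g_east_mm * np.sin(a))

# e.g. a 0.5 mm north gradient observed at 15 deg elevation toward north
print(f"{gradient_delay(15, 0, 0.5, 0.0):.2f} mm extra slant delay")
```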

  1. Optimal cutoff points for HOMA-IR and QUICKI in the diagnosis of metabolic syndrome and non-alcoholic fatty liver disease: A population based study.

    PubMed

    Motamed, Nima; Miresmail, Seyed Javad Haji; Rabiee, Behnam; Keyvani, Hossein; Farahani, Behzad; Maadi, Mansooreh; Zamani, Farhad

    2016-03-01

    The present study was carried out to determine the optimal cutoff points for the homeostatic model assessment of insulin resistance (HOMA-IR) and the quantitative insulin sensitivity check index (QUICKI) in the diagnosis of metabolic syndrome (MetS) and non-alcoholic fatty liver disease (NAFLD). The baseline data of 5511 subjects aged ≥18 years from a cohort study in northern Iran were analyzed. Receiver operating characteristic (ROC) analysis was conducted to determine the discriminatory capability of HOMA-IR and QUICKI in the diagnosis of MetS and NAFLD, and the Youden index was used to determine the optimal cutoff points. The optimal cutoff points for HOMA-IR in the diagnosis of MetS and NAFLD were 2.0 [sensitivity = 64.4%, specificity = 66.8%] and 1.79 [sensitivity = 66.2%, specificity = 62.2%] in men, and 2.5 [sensitivity = 57.6%, specificity = 67.9%] and 1.95 [sensitivity = 65.1%, specificity = 54.7%] in women, respectively. Furthermore, the optimal cutoff points for QUICKI in the diagnosis of MetS and NAFLD were 0.343 [sensitivity = 63.7%, specificity = 67.8%] and 0.347 [sensitivity = 62.9%, specificity = 65.0%] in men, and 0.331 [sensitivity = 55.7%, specificity = 70.7%] and 0.333 [sensitivity = 53.2%, specificity = 67.7%] in women, respectively. Not only were the optimal cutoff points of HOMA-IR and QUICKI different for MetS and NAFLD, but different cutoff points were also obtained for men and women for each of these two conditions.
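
    The Youden-index cutoff selection described here is easy to make concrete; the sketch below simulates scores, builds a ROC curve with scikit-learn, and picks the threshold maximizing J = sensitivity + specificity - 1 (simulated data, not the cohort's).

```python
# Sketch of Youden-index cutoff selection on a ROC curve. The HOMA-IR
# values are simulated, not the study's cohort data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(7)
# Simulated scores: cases shifted upward relative to controls.
homa = np.concatenate([rng.lognormal(0.4, 0.5, 500),    # controls
                       rng.lognormal(0.9, 0.5, 300)])   # MetS cases
y = np.concatenate([np.zeros(500), np.ones(300)])

fpr, tpr, thr = roc_curve(y, homa)
j = tpr - fpr                       # Youden index J at each threshold
best = j.argmax()
print(f"optimal cutoff ~ {thr[best]:.2f} "
      f"(sens = {tpr[best]:.1%}, spec = {1 - fpr[best]:.1%})")
```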

  2. Programmable Bio-Nano-Chip Systems for Serum CA125 Quantification: Towards Ovarian Cancer Diagnostics at the Point-of-Care

    PubMed Central

    Raamanathan, Archana; Simmons, Glennon W.; Christodoulides, Nicolaos; Floriano, Pierre N.; Furmaga, Wieslaw B.; Redding, Spencer W.; Lu, Karen H.; Bast, Robert C.; McDevitt, John T.

    2013-01-01

    Point-of-care (POC) implementation of early detection and screening methodologies for ovarian cancer may enable improved survival rates through early intervention. Current laboratory-confined immunoanalyzers have long turnaround times and are often incompatible with multiplexing and POC implementation. Rapid, sensitive and multiplexable POC diagnostic platforms compatible with promising early detection approaches for ovarian cancer are needed. To this end, we report the adaptation of the programmable bio-nano-chip (p-BNC), an integrated, microfluidic, modular (Programmable) platform for CA125 serum quantitation, a biomarker prominently implicated in multi-modal and multi-marker screening approaches. In the p-BNC, CA125 from diseased sera (Bio) is sequestered and assessed with a fluorescence-based sandwich immunoassay, completed in the nano-nets (Nano) of sensitized agarose microbeads localized in individually addressable wells (Chip), housed in a microfluidic module, capable of integrating multiple sample, reagent and biowaste processing and handling steps. Antibody pairs that bind to distinct epitopes on CA125 were screened. To permit efficient biomarker sequestration in a 3-D microfluidic environment, the p-BNC operating variables (incubation times, flow rates and reagent concentrations) were tuned to deliver optimal analytical performance under 45 minutes. With short analysis times, competitive analytical performance (Inter- and intra-assay precision of 1.2% and 1.9% and LODs of 1.0 U/mL) was achieved on this mini-sensor ensemble. Further validation with sera of ovarian cancer patients (n=20) demonstrated excellent correlation (R2 = 0.97) with gold-standard ELISA. Building on the integration capabilities of novel microfluidic systems programmed for ovarian cancer, the rapid, precise and sensitive miniaturized p-BNC system shows strong promise for ovarian cancer diagnostics. PMID:22490510

  3. Programmable bio-nano-chip systems for serum CA125 quantification: toward ovarian cancer diagnostics at the point-of-care.

    PubMed

    Raamanathan, Archana; Simmons, Glennon W; Christodoulides, Nicolaos; Floriano, Pierre N; Furmaga, Wieslaw B; Redding, Spencer W; Lu, Karen H; Bast, Robert C; McDevitt, John T

    2012-05-01

    Point-of-care (POC) implementation of early detection and screening methodologies for ovarian cancer may enable improved survival rates through early intervention. Current laboratory-confined immunoanalyzers have long turnaround times and are often incompatible with multiplexing and POC implementation. Rapid, sensitive, and multiplexable POC diagnostic platforms compatible with promising early detection approaches for ovarian cancer are needed. To this end, we report the adaptation of the programmable bio-nano-chip (p-BNC), an integrated, microfluidic, and modular (programmable) platform for CA125 serum quantitation, a biomarker prominently implicated in multimodal and multimarker screening approaches. In the p-BNCs, CA125 from diseased sera (Bio) is sequestered and assessed with a fluorescence-based sandwich immunoassay, completed in the nano-nets (Nano) of sensitized agarose microbeads localized in individually addressable wells (Chip), housed in a microfluidic module, capable of integrating multiple sample, reagent and biowaste processing, and handling steps. Antibody pairs that bind to distinct epitopes on CA125 were screened. To permit efficient biomarker sequestration in a three-dimensional microfluidic environment, the p-BNC operating variables (incubation times, flow rates, and reagent concentrations) were tuned to deliver optimal analytical performance under 45 minutes. With short analysis times, competitive analytical performance (inter- and intra-assay precision of 1.2% and 1.9% and limit of detection of 1.0 U/mL) was achieved on this minisensor ensemble. Furthermore, validation with sera of patients with ovarian cancer (n = 20) showed excellent correlation (R(2) = 0.97) with gold-standard ELISA. Building on the integration capabilities of novel microfluidic systems programmed for ovarian cancer, the rapid, precise, and sensitive miniaturized p-BNC system shows strong promise for ovarian cancer diagnostics.

  4. Noninvasive detection of nasopharyngeal carcinoma based on saliva proteins using surface-enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Lin, Xueliang; Lin, Duo; Ge, Xiaosong; Qiu, Sufang; Feng, Shangyuan; Chen, Rong

    2017-10-01

    The present study evaluated the capability of saliva analysis combining membrane protein purification with surface-enhanced Raman spectroscopy (SERS) for noninvasive detection of nasopharyngeal carcinoma (NPC). A rapid and convenient protein purification method based on cellulose acetate membrane was developed. A total of 659 high-quality SERS spectra were acquired from purified proteins extracted from the saliva samples of 170 patients with pathologically confirmed NPC and 71 healthy volunteers. Spectral analysis of those saliva protein SERS spectra revealed specific changes in some biochemical compositions, which were possibly associated with NPC transformation. Furthermore, principal component analysis combined with linear discriminant analysis (PCA-LDA) was utilized to analyze and classify the saliva protein SERS spectra from NPC and healthy subjects. Diagnostic sensitivity of 70.7%, specificity of 70.3%, and diagnostic accuracy of 70.5% could be achieved by PCA-LDA for NPC identification. These results show that this assay based on saliva protein SERS analysis holds promising potential for developing a rapid, noninvasive, and convenient clinical tool for NPC screening.
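
    The PCA-LDA step described above is standard enough to sketch; the snippet below uses scikit-learn on synthetic stand-in spectra (the real 659 spectra and the reported ~70% figures are not reproduced here).

```python
# Schematic of the PCA-LDA classification step; synthetic spectra stand in
# for the study's SERS data, and the component count is an assumption.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n_bins = 600                                  # spectral intensity bins
X_npc = rng.normal(1.0, 0.3, (170, n_bins))   # "NPC" spectra, synthetic
X_ctl = rng.normal(0.9, 0.3, (71, n_bins))    # "healthy" spectra, synthetic
X = np.vstack([X_npc, X_ctl])
y = np.array([1] * 170 + [0] * 71)

model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(model, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.1%}")
```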

  5. Determination of trace amino acids in human serum by a selective and sensitive pre-column derivatization method using HPLC-FLD-MS/MS and derivatization optimization by response surface methodology.

    PubMed

    Li, Guoliang; Cui, Yanyan; You, Jinmao; Zhao, Xianen; Sun, Zhiwei; Xia, Lian; Suo, Yourui; Wang, Xiao

    2011-04-01

    Analysis of trace amino acids (AA) in physiological fluids has received increasing attention, because these compounds provide fundamental and important information for medical, biological, and clinical research. A more accurate method for the determination of these compounds is therefore highly desirable. In the present study, we developed a selective and sensitive method for trace AA determination in biological samples using 2-[2-(7H-dibenzo [a,g]carbazol-7-yl)-ethoxy] ethyl chloroformate (DBCEC) as the labeling reagent with HPLC-FLD-MS/MS. Response surface methodology (RSM) was first employed to optimize the derivatization reaction between DBCEC and AA. Compared with a traditional single-factor design, RSM reduced labor, time, and reagent consumption. Complete derivatization was achieved within 6.3 min at room temperature. In conjunction with gradient elution, baseline resolution of 20 AA comprising acidic, neutral, and basic AA was achieved on a reversed-phase Hypersil BDS C(18) column. The method showed excellent reproducibility and correlation coefficients, and offered detection limits of 0.19-1.17 fmol/μL. The developed method was successfully applied to determine AA in human serum, and the potential of serum AA as a sensitive prognostic index for liver diseases is also discussed.
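
    The response-surface step can be sketched generically: fit a second-order polynomial to responses measured over a small coded design, then solve for the stationary point. The factor levels and response values below are invented for illustration; the paper's actual design and factors are not reproduced.

    ```python
    # Hedged sketch of RSM: quadratic fit over a coded two-factor design.
    import numpy as np

    rng = np.random.default_rng(1)
    X1, X2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
    x1, x2 = X1.ravel(), X2.ravel()
    # synthetic response with a known optimum at (0.3, -0.2)
    y = 10 - 2*(x1 - 0.3)**2 - 3*(x2 + 0.2)**2 + 0.05*rng.normal(size=x1.size)

    # model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)

    # stationary point: [[2*b11, b12], [b12, 2*b22]] @ xs = -[b1, b2]
    H = np.array([[2*b[3], b[5]], [b[5], 2*b[4]]])
    xs = np.linalg.solve(H, -b[1:3])
    print("optimum in coded units:", xs.round(2))   # ~ [0.3, -0.2]
    ```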

  6. Epileptic Seizure Prediction Using Diffusion Distance and Bayesian Linear Discriminate Analysis on Intracranial EEG.

    PubMed

    Yuan, Shasha; Zhou, Weidong; Chen, Liyan

    2018-02-01

    Epilepsy is a chronic neurological disorder characterized by sudden and apparently unpredictable seizures. A system capable of forecasting the occurrence of seizures is crucial and could open new therapeutic possibilities for human health. This paper presents an algorithm for seizure prediction using a novel feature, the diffusion distance (DD), in intracranial electroencephalograph (iEEG) recordings. Wavelet decomposition is conducted on segmented EEG epochs, and subband signals at scales 3, 4 and 5 are used to extract the diffusion distance. The features of all channels, composing a feature vector, are then fed into a Bayesian Linear Discriminant Analysis (BLDA) classifier. Finally, a postprocessing procedure is applied to reduce false prediction alarms. The prediction method is evaluated on the public intracranial EEG dataset, which consists of 577.67 h of intracranial EEG recordings from 21 patients with 87 seizures. We achieved a sensitivity of 85.11% for a seizure occurrence period of 30 min and a sensitivity of 93.62% for a seizure occurrence period of 50 min, both with a seizure prediction horizon of 10 s. The false prediction rate was 0.08/h. The proposed method yields a high sensitivity as well as a low false prediction rate, which demonstrates its potential for real-time seizure prediction.
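
    A minimal sketch of the subband-extraction step is given below, assuming a 'db4' wavelet, a 256 Hz sampling rate, and a 4 s epoch (all assumptions, not the paper's settings). pywt's wavedec supplies the scale-3 to scale-5 detail coefficients from which the diffusion-distance feature would then be computed.

    ```python
    # Hedged sketch: wavelet subband features from one iEEG epoch.
    import numpy as np
    import pywt

    fs = 256                                  # assumed sampling rate (Hz)
    epoch = np.random.randn(4 * fs)           # one 4 s epoch (synthetic)

    coeffs = pywt.wavedec(epoch, 'db4', level=5)   # [cA5, cD5, cD4, cD3, cD2, cD1]
    cD5, cD4, cD3 = coeffs[1], coeffs[2], coeffs[3]

    # simple per-subband energy; the paper instead computes a diffusion
    # distance between distributions derived from these subband signals
    features = [np.sum(c**2) / len(c) for c in (cD3, cD4, cD5)]
    print("subband energies (scales 3-5):", np.round(features, 3))
    ```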

  7. Automated Pilot Performance Assessment in the T-37: A Feasibility Study. Final Report (May 1968-April 1971).

    ERIC Educational Resources Information Center

    Knoop, Patricia A.; Welde, William L.

    Air Force investigators conducted a three year program to develop a capability for automated quantification and assessment of in-flight pilot performance. Such a capability enhances pilot training by making ratings more objective, valid, reliable and sensitive, and by freeing instructors from rating responsibilities, allowing them to concentrate…

  8. Phase sensitive spectral domain interferometry for label free biomolecular interaction analysis and biosensing applications

    NASA Astrophysics Data System (ADS)

    Chirvi, Sajal

    Biomolecular interaction analysis (BIA) plays a vital role in a wide variety of fields, including biomedical research, the pharmaceutical industry, medical diagnostics, and biotechnology. The study and quantification of interactions between natural biomolecules (proteins, enzymes, DNA) and artificially synthesized molecules (drugs) is routinely done using various labeled and label-free BIA techniques. Labeled BIA techniques (chemiluminescence, fluorescence, radioactive) suffer from steric hindrance of labels at the interaction site, difficulty of attaching labels to molecules, and higher cost and time of assay development. Label-free techniques with real-time detection capabilities have demonstrated advantages over traditional labeled techniques. The gold standard for label-free BIA is surface plasmon resonance (SPR), which detects and quantifies changes in the refractive index of the ligand-analyte complex with high sensitivity. Although SPR is a highly sensitive BIA technique, it requires custom-made sensor chips and is not well suited to the highly multiplexed BIA required in high-throughput applications; moreover, its implementation on various biosensing platforms is limited. In this work, spectral-domain phase-sensitive interferometry (SD-PSI) has been developed for label-free BIA and biosensing applications to address the limitations of SPR and other label-free techniques. One distinct advantage of SD-PSI over other label-free techniques is that it does not require custom-fabricated biosensor substrates; laboratory-grade, off-the-shelf glass or plastic substrates of suitable thickness, with proper surface functionalization, serve as biosensor chips. SD-PSI is tested on four separate BIA and biosensing platforms: a multi-well plate, a flow cell, a fiber probe with integrated optics, and a fiber-tip biosensor. A sensitivity of 33 ng/ml for anti-IgG is achieved using the multi-well platform. The principle of coherence multiplexing for multi-channel label-free biosensing is introduced: multiple biosensors can be interrogated simultaneously with a single spectral-domain phase-sensitive interferometer by coding the individual sensograms into coherence-multiplexed channels. Experimental results demonstrating multiplexed quantitative biomolecular interaction analysis of antibodies binding to antigen-coated, functionalized biosensor chip surfaces on different platforms are presented.

  9. Systems analysis of decontamination options for civilian vehicles.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foltz, Greg W.; Hoette, Trisha Marie

    2010-11-01

    The objective of this project, which was supported by the Department of Homeland Security (DHS) Science and Technology Directorate (S&T) Chemical and Biological Division (CBD), was to investigate options for the decontamination of the exteriors and interiors of vehicles in the civilian setting in order to restore those vehicles to normal use following the release of a highly toxic chemical. The decontamination of vehicles is especially challenging because they often contain sensitive electronic equipment and multiple materials, some of which strongly adsorb chemical agents, and, in the case of aircraft, have very rigid material compatibility requirements (i.e., they cannot be exposed to reagents that may cause even minor corrosion). A systems analysis approach was taken to examine existing and future civilian vehicle decontamination capabilities.

  10. Performance Analysis of Garbage Collection and Dynamic Reordering in a Lisp System. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Llames, Rene Lim

    1991-01-01

    Generation-based garbage collection and dynamic reordering of objects are two techniques for improving the efficiency of memory management in Lisp and similar dynamic language systems. An analysis of the effect of generation configuration is presented, focusing on the effects of the number of generations and their capacities. Analytic timing and survival models are used to represent garbage collection runtime and to derive structural results on its behavior. The survival model provides bounds on the age of objects surviving a garbage collection at a particular level. Empirical results show that execution time is most sensitive to the capacity of the youngest generation. A technique called scanning for transport statistics, for evaluating the effectiveness of reordering independent of main memory size, is presented.

  11. The role of atomic fluorescence spectrometry in the automatic environmental monitoring of trace element analysis

    PubMed Central

    Stockwell, P. B.; Corns, W. T.

    1993-01-01

    Considerable attention has been drawn to the environmental levels of mercury, arsenic, selenium and antimony in the last decade. Legislative and environmental pressure has forced levels to be lowered and this has created an additional burden for analytical chemists. Not only does an analysis have to reach lower detection levels, but it also has to be seen to be correct. Atomic fluorescence detection, especially when coupled to vapour generation techniques, offers both sensitivity and specificity. Developments in the design of specified atomic fluorescence detectors for mercury, for the hydride-forming elements and also for cadmium, are described in this paper. Each of these systems is capable of analysing samples in the part per trillion (ppt) range reliably and economically. Several analytical applications are described. PMID:18924964

  12. Application of epithermal neutron activation in multielement analysis of silicate rocks employing both coaxial Ge(Li) and low energy photon detector systems

    USGS Publications Warehouse

    Baedecker, P.A.; Rowe, J.J.; Steinnes, E.

    1977-01-01

    The instrumental activation analysis of silicate rocks using epithermal neutrons has been studied using both high resolution coaxial Ge(Li) detectors and low energy photon detectors, and applied to the determination of 23 elements in eight new U.S.G.S. standard rocks. The analytical use of X-ray peaks associated with electron capture or internal conversion processes has been evaluated. Of 28 elements which can be considered determinable by instrumental means, the epithermal activation approach is capable of giving improved sensitivity and precision in 16 cases over the normal INAA procedure. In eleven cases the use of the low energy photon detector is thought to show advantages over conventional coaxial Ge(Li) spectroscopy. © 1977 Akadémiai Kiadó.

  13. Satellite on-board processing for earth resources data

    NASA Technical Reports Server (NTRS)

    Bodenheimer, R. E.; Gonzalez, R. C.; Gupta, J. N.; Hwang, K.; Rochelle, R. W.; Wilson, J. B.; Wintz, P. A.

    1975-01-01

    Results of a survey of earth resources user applications and their data requirements, earth resources multispectral scanner sensor technology, and preprocessing algorithms for correcting the sensor outputs and for data bulk reduction are presented along with a candidate data format. The computational requirements for implementing the data analysis algorithms are included, along with a review of computer architectures and organizations. Computer architectures capable of handling these computational requirements are suggested, and the environmental effects of an on-board processor are discussed. By relating performance parameters to the system requirements of each user application, the feasibility of on-board processing is determined for each user. A tradeoff analysis is performed to determine the sensitivity of results to each of the system parameters. Significant results and conclusions are discussed, and recommendations are presented.

  14. Analytical description of the modern steam automobile

    NASA Technical Reports Server (NTRS)

    Peoples, J. A.

    1974-01-01

    The sensitivity of the performance of the modern steam automobile to operating conditions is discussed. The word modern is used in the title to indicate that the emphasis is on miles per gallon rather than theoretical thermal efficiency. This is accomplished by combining classical power analysis with the ideal pressure-volume diagram. Several parameters are derived which characterize the performance capability of the modern steam car. The report illustrates that performance is dictated by the characteristics of the working medium and the supply temperature, and is nearly independent of pressures above 800 psia. Analysis techniques were developed specifically for reciprocating steam engines suitable for automotive application. Specific performance charts have been constructed on the basis of water as the working medium; the conclusions and data interpretation are therefore limited to this scope.

  15. Redundancy Analysis of Capacitance Data of a Coplanar Electrode Array for Fast and Stable Imaging Processing

    PubMed Central

    Wen, Yintang; Zhang, Zhenda; Zhang, Yuyan; Sun, Dongtao

    2017-01-01

    A coplanar electrode array sensor is established for imaging adhesive-layer defects in composite materials. The sensor is based on the capacitive edge effect, which makes the capacitance data considerably weak and susceptible to environmental noise. The inverse problem of coplanar array electrical capacitance tomography (C-ECT) is ill-conditioned, so a small error in the capacitance data can seriously affect the quality of reconstructed images. In order to achieve a stable image reconstruction process, a redundancy analysis method for capacitance data is proposed. The proposed method is based on contribution rate and anti-interference capability. According to the redundancy analysis, the capacitance data are divided into valid and invalid data. When the image is reconstructed from the valid data only, the sensitivity matrix needs to be changed accordingly. In order to evaluate the effectiveness of the resulting sensitivity map, singular value decomposition (SVD) is used. Finally, the two-dimensional (2D) and three-dimensional (3D) images are reconstructed by the Tikhonov regularization method. Compared with images reconstructed from the raw capacitance data, the stability of the image reconstruction process is improved while the quality of the reconstructed images is not degraded. As a result, much of the invalid data need not be collected, and the data acquisition time can be reduced. PMID:29295537
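
    The reconstruction step admits a compact sketch: with a sensitivity matrix S assembled from the retained (valid) channels and a capacitance vector c, Tikhonov regularization solves (S^T S + λI) g = S^T c for the image g, and the SVD of S gauges how well-conditioned the reduced problem is. The dimensions, noise level, and λ below are illustrative only.

    ```python
    # Hedged sketch of Tikhonov-regularized ECT image reconstruction.
    import numpy as np

    rng = np.random.default_rng(3)
    S = rng.normal(size=(66, 32 * 32))                  # valid-channel sensitivity matrix
    g_true = np.zeros(32 * 32); g_true[300:340] = 1.0   # a small "defect"
    c = S @ g_true + 0.01 * rng.normal(size=66)         # noisy capacitance data

    lam = 1e-2                                          # regularization parameter
    g = np.linalg.solve(S.T @ S + lam * np.eye(S.shape[1]), S.T @ c)

    print("condition number of S:", f"{np.linalg.cond(S):.1f}")
    print("relative error:", np.linalg.norm(g - g_true) / np.linalg.norm(g_true))
    ```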

  16. Trace analysis of high-purity graphite by LA-ICP-MS.

    PubMed

    Pickhardt, C; Becker, J S

    2001-07-01

    Laser-ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been established as a very efficient and sensitive technique for the direct analysis of solids. In this work the capability of LA-ICP-MS was investigated for the determination of trace elements in high-purity graphite. Synthetic laboratory standards with a graphite matrix were prepared for the purpose of quantifying the analytical results. Doped trace elements, at a concentration of 0.5 microg g(-1) in a laboratory standard, were determined with an accuracy of 1% to +/- 7% and a relative standard deviation (RSD) of 2-13%. Solution-based calibration was also used for quantitative analysis of high-purity graphite; such calibration led to analytical results for trace-element determination in graphite with accuracy similar to that obtained with the synthetic laboratory standards. Results from quantitative determination of trace impurities in a real reactor-graphite sample, using both quantification approaches, were in good agreement. Detection limits for all elements of interest were in the low ng g(-1) concentration range. A tenfold improvement in detection limits was achieved for analyses of high-purity graphite with LA-ICP-MS under wet plasma conditions, because of the lower background signal and increased element sensitivity.

  17. Reducing the overlay metrology sensitivity to perturbations of the measurement stack

    NASA Astrophysics Data System (ADS)

    Zhou, Yue; Park, DeNeil; Gutjahr, Karsten; Gottipati, Abhishek; Vuong, Tam; Bae, Sung Yong; Stokes, Nicholas; Jiang, Aiqin; Hsu, Po Ya; O'Mahony, Mark; Donini, Andrea; Visser, Bart; de Ruiter, Chris; Grzela, Grzegorz; van der Laan, Hans; Jak, Martin; Izikson, Pavel; Morgan, Stephen

    2017-03-01

    Overlay metrology setup today faces a continuously changing landscape of process steps. During Diffraction Based Overlay (DBO) metrology setup, many different metrology target designs are evaluated in order to cover the full process window. The standard method for overlay metrology setup consists of single-wafer optimization, in which the performance of all available metrology targets is evaluated. Without the availability of external reference data or multi-wafer measurements, it is hard to predict the metrology accuracy and robustness against process variations which naturally occur from wafer to wafer and lot to lot. In this paper, the capabilities of the Holistic Metrology Qualification (HMQ) setup flow are outlined, in particular with respect to overlay metrology accuracy and process robustness. The significance of robustness and its impact on overlay measurements is discussed using multiple examples. Measurement differences caused by slight stack variations across the target area, called grating imbalance, are shown to cause significant errors in the overlay calculation when the recipe and target have not been selected properly. To this end, an overlay sensitivity check on perturbations of the measurement stack is presented for improvement of the overlay metrology setup flow. An extensive analysis of Key Performance Indicators (KPIs) from HMQ recipe optimization is performed on µDBO measurements of product wafers. The key parameters describing the sensitivity to perturbations of the measurement stack are based on an intra-target analysis. Using advanced image analysis, which is only possible for image-plane detection of µDBO rather than pupil-plane detection of DBO, the process robustness performance of a recipe can be determined. Intra-target analysis can be applied to a wide range of applications, independent of layers and devices.

  18. DIELECTROPHORESIS-BASED MICROFLUIDIC SEPARATION AND DETECTION SYSTEMS

    PubMed Central

    Yang, Jun; Vykoukal, Jody; Noshari, Jamileh; Becker, Frederick; Gascoyne, Peter; Krulevitch, Peter; Fuller, Chris; Ackler, Harold; Hamilton, Julie; Boser, Bernhard; Eldredge, Adam; Hitchens, Duncan; Andrews, Craig

    2009-01-01

    Diagnosis and treatment of human diseases frequently requires isolation and detection of certain cell types from a complex mixture. Compared with traditional separation and detection techniques, microfluidic approaches promise to yield easy-to-use diagnostic instruments tolerant of a wide range of operating environments and capable of accomplishing automated analyses. These approaches will enable diagnostic advances to be disseminated from sophisticated clinical laboratories to the point-of-care. Applications will include the separation and differential analysis of blood cell subpopulations for host-based detection of blood cell changes caused by disease, infection, or exposure to toxins, and the separation and analysis of surface-sensitized, custom dielectric beads for chemical, biological, and biomolecular targets. Here we report a new particle separation and analysis microsystem that uses dielectrophoretic field-flow fractionation (DEP-FFF). The system consists of a microfluidic chip with integrated sample injector, a DEP-FFF separator, and an AC impedance sensor. We show the design of a miniaturized impedance sensor integrated circuit (IC) with improved sensitivity, a new packaging approach for micro-flumes that features a slide-together compression package and novel microfluidic interconnects, and the design, control, integration and packaging of a fieldable prototype. Illustrative applications will be shown, including the separation of different sized beads and different cell types, blood cell differential analysis, and impedance sensing results for beads, spores and cells. PMID:22025905

  19. Molecularly Imprinted Nanomaterials for Sensor Applications

    PubMed Central

    Irshad, Muhammad; Iqbal, Naseer; Mujahid, Adnan; Afzal, Adeel; Hussain, Tajamal; Sharif, Ahsan; Ahmad, Ejaz; Athar, Muhammad Makshoof

    2013-01-01

    Molecular imprinting is a well-established technology for mimicking antibody-antigen interactions in a synthetic platform. Molecularly imprinted polymers and nanomaterials usually possess outstanding recognition capabilities. Imprinted nanostructured materials are characterized by their small sizes, large reactive surface areas and, most importantly, rapid and specific analysis of analytes due to the formation of template-driven recognition cavities within the matrix. The excellent recognition and selectivity offered by this class of materials towards a target analyte have found applications in many areas, such as separation science, analysis of organic pollutants in water, environmental analysis of trace gases, chemical or biological sensors, biochemical assays, artificial receptors, and nanotechnology. We present here a concise overview of recent developments in nanostructured imprinted materials with respect to various sensor systems, e.g., electrochemical, optical, and mass-sensitive. Finally, in light of recent studies, we conclude the article with future perspectives and foreseen applications of imprinted nanomaterials in chemical sensors. PMID:28348356

  20. Combined spectral-domain optical coherence tomography and hyperspectral imaging applied for tissue analysis: Preliminary results

    NASA Astrophysics Data System (ADS)

    Dontu, S.; Miclos, S.; Savastru, D.; Tautan, M.

    2017-09-01

    In recent years, many optoelectronic techniques have been developed to improve devices for tissue analysis. Spectral-Domain Optical Coherence Tomography (SD-OCT) is a medical interferometric imaging modality that provides depth-resolved tissue structure information with resolution in the μm range. However, SD-OCT has its own limitations and cannot provide biochemical information about the tissue. These data can be obtained with hyperspectral imaging (HSI), a non-invasive, sensitive, and real-time technique. In the present study we have combined SD-OCT with HSI for tissue analysis; both methods have demonstrated significant potential in this context. Preliminary results on different tissues highlight the capabilities of this combined technique.

  1. Quantum-Dot-Based Electrochemical Immunoassay for High-Throughput Screening of the Prostate-Specific Antigen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Jun; Liu, Guodong; Wu, Hong

    2008-01-01

    In this paper, we demonstrate an electrochemical high-throughput sensing platform for simple, sensitive detection of PSA based on QD labels. This sensing platform uses a microplate for immunoreactions and disposable screen-printed electrodes (SPE) for electrochemical stripping analysis of metal ions released from QD labels. With the 96-well microplate, capture antibodies are conveniently immobilized on the well surface, and the immunoreaction process is easily controlled. The sandwich complexes formed on the well surface are also easily isolated from reaction solutions. In particular, a microplate-based electrochemical assay makes it feasible to conduct parallel analysis of several samples or multiple protein markers. This assay offers a number of advantages, including (1) simplicity and cost-effectiveness, (2) high sensitivity, (3) the capability to sense multiple samples or targets in parallel, and (4) a potentially portable device with an SPE array implanted in the microplate. The PSA assay is sensitive because it uses two amplification processes: (1) QDs as labels for enhancing the electrical signal, since the secondary antibodies are linked to QDs that contain a large number of metal atoms, and (2) the inherent signal amplification of electrochemical stripping analysis, in which metal ions are preconcentrated onto the electrode surface. Therefore, the high sensitivity of this method, stemming from dual signal amplification via QD labels and preconcentration, allows low concentration levels to be detected while using small sample volumes. Thus, this QD-based electrochemical detection approach offers a simple, rapid, cost-effective, and high-throughput assay of PSA.

  2. The multi-disciplinary design study: A life cycle cost algorithm

    NASA Technical Reports Server (NTRS)

    Harding, R. R.; Pichi, F. J.

    1988-01-01

    The approach and results of a Life Cycle Cost (LCC) analysis of the Space Station Solar Dynamic Power Subsystem (SDPS) including gimbal pointing and power output performance are documented. The Multi-Discipline Design Tool (MDDT) computer program developed during the 1986 study has been modified to include the design, performance, and cost algorithms for the SDPS as described. As with the Space Station structural and control subsystems, the LCC of the SDPS can be computed within the MDDT program as a function of the engineering design variables. Two simple examples of MDDT's capability to evaluate cost sensitivity and design based on LCC are included. MDDT was designed to accept NASA's IMAT computer program data as input so that IMAT's detailed structural and controls design capability can be assessed with expected system LCC as computed by MDDT. No changes to IMAT were required. Detailed knowledge of IMAT is not required to perform the LCC analyses as the interface with IMAT is noninteractive.

  3. Sensor readout detector circuit

    DOEpatents

    Chu, Dahlon D.; Thelen, Jr., Donald C.

    1998-01-01

    A sensor readout detector circuit is disclosed that is capable of detecting sensor signals down to a few nanoamperes or less in a high (microampere) background noise level. The circuit operates at a very low standby power level and is triggerable by a sensor event signal that is above a predetermined threshold level. A plurality of sensor readout detector circuits can be formed on a substrate as an integrated circuit (IC). These circuits can operate to process data from an array of sensors in parallel, with only data from active sensors being processed for digitization and analysis. This allows the IC to operate at a low power level with a high data throughput for the active sensors. The circuit may be used with many different types of sensors, including photodetectors, capacitance sensors, chemically-sensitive sensors or combinations thereof to provide a capability for recording transient events or for recording data for a predetermined period of time following an event trigger. The sensor readout detector circuit has applications for portable or satellite-based sensor systems.

  4. Sensor readout detector circuit

    DOEpatents

    Chu, D.D.; Thelen, D.C. Jr.

    1998-08-11

    A sensor readout detector circuit is disclosed that is capable of detecting sensor signals down to a few nanoamperes or less in a high (microampere) background noise level. The circuit operates at a very low standby power level and is triggerable by a sensor event signal that is above a predetermined threshold level. A plurality of sensor readout detector circuits can be formed on a substrate as an integrated circuit (IC). These circuits can operate to process data from an array of sensors in parallel, with only data from active sensors being processed for digitization and analysis. This allows the IC to operate at a low power level with a high data throughput for the active sensors. The circuit may be used with many different types of sensors, including photodetectors, capacitance sensors, chemically-sensitive sensors or combinations thereof to provide a capability for recording transient events or for recording data for a predetermined period of time following an event trigger. The sensor readout detector circuit has applications for portable or satellite-based sensor systems. 6 figs.

  5. Ultrasensitive Wearable Soft Strain Sensors of Conductive, Self-healing, and Elastic Hydrogels with Synergistic "Soft and Hard" Hybrid Networks.

    PubMed

    Liu, Yan-Jun; Cao, Wen-Tao; Ma, Ming-Guo; Wan, Pengbo

    2017-08-02

    Robust, stretchable, and strain-sensitive hydrogels have recently attracted immense research interest because of their potential application in wearable strain sensors. Integrating decent mechanical properties, reliable self-healing capability, and high sensing sensitivity in a single conductive, elastic, self-healing, and strain-sensitive hydrogel is still a great challenge. Inspired by mechanically excellent, self-healing biological soft tissues with hierarchical network structures, functional network hydrogels are fabricated here by interconnecting a "soft" homogeneous polymer network with a "hard" dynamic ferric (Fe3+) cross-linked cellulose nanocrystal (CNCs-Fe3+) network. Under stress, the dynamic CNCs-Fe3+ coordination bonds act as sacrificial bonds to efficiently dissipate energy, while the homogeneous polymer network provides smooth stress transfer, enabling the hydrogels to achieve unusual mechanical properties such as excellent mechanical strength, robust toughness, and stretchability, as well as good self-recovery. The hydrogels demonstrate autonomous self-healing capability in only 5 min, without the need for any stimuli or healing agents, ascribed to the reorganization of CNCs and Fe3+ via ionic coordination. Furthermore, the resulting hydrogels display tunable electromechanical behavior with sensitive, stable, and repeatable variations in resistance upon mechanical deformation. Based on this tunable electromechanical behavior, the hydrogels can act as a wearable strain sensor to monitor finger joint motions, breathing, and even the slight blood pulse. This strategy of building synergistic "soft and hard" structures succeeds in integrating decent mechanical properties, reliable self-healing capability, and high sensing sensitivity for assembling a high-performance, flexible, and wearable strain sensor.

  6. A Scanning Quantum Cryogenic Atom Microscope

    NASA Astrophysics Data System (ADS)

    Lev, Benjamin

    Microscopic imaging of local magnetic fields provides a window into the organizing principles of complex and technologically relevant condensed matter materials. However, a wide variety of intriguing strongly correlated and topologically nontrivial materials exhibit poorly understood phenomena outside the detection capability of state-of-the-art high-sensitivity, high-resolution scanning probe magnetometers. We introduce a quantum-noise-limited scanning probe magnetometer that can operate from room-to-cryogenic temperatures with unprecedented DC-field sensitivity and micron-scale resolution. The Scanning Quantum Cryogenic Atom Microscope (SQCRAMscope) employs a magnetically levitated atomic Bose-Einstein condensate (BEC), thereby providing immunity to conductive and blackbody radiative heating. The SQCRAMscope has a field sensitivity of 1.4 nT per resolution-limited point (2 μm), or 6 nT/√Hz per point at its duty cycle. Compared to point-by-point sensors, the long length of the BEC provides a naturally parallel measurement, allowing one to measure nearly one hundred points with an effective field sensitivity of 600 pT/√Hz per point during the same time as a point-by-point scanner would measure these points sequentially. Moreover, it has a noise floor of 300 pT and provides nearly two orders of magnitude improvement in magnetic flux sensitivity (down to 10^-6 Φ0/√Hz) over previous atomic probe magnetometers capable of scanning near samples. These capabilities are for the first time carefully benchmarked by imaging magnetic fields arising from microfabricated wire patterns and done so using samples that may be scanned, cryogenically cooled, and easily exchanged. We anticipate the SQCRAMscope will provide charge transport images at temperatures from room temperature to 4 K in unconventional superconductors and topologically nontrivial materials.

  7. Scanning Quantum Cryogenic Atom Microscope

    NASA Astrophysics Data System (ADS)

    Yang, Fan; Kollár, Alicia J.; Taylor, Stephen F.; Turner, Richard W.; Lev, Benjamin L.

    2017-03-01

    Microscopic imaging of local magnetic fields provides a window into the organizing principles of complex and technologically relevant condensed-matter materials. However, a wide variety of intriguing strongly correlated and topologically nontrivial materials exhibit poorly understood phenomena outside the detection capability of state-of-the-art high-sensitivity high-resolution scanning probe magnetometers. We introduce a quantum-noise-limited scanning probe magnetometer that can operate from room-to-cryogenic temperatures with unprecedented dc-field sensitivity and micron-scale resolution. The Scanning Quantum Cryogenic Atom Microscope (SQCRAMscope) employs a magnetically levitated atomic Bose-Einstein condensate (BEC), thereby providing immunity to conductive and blackbody radiative heating. The SQCRAMscope has a field sensitivity of 1.4 nT per resolution-limited point (approximately 2 μm) or 6 nT/√Hz per point at its duty cycle. Compared to point-by-point sensors, the long length of the BEC provides a naturally parallel measurement, allowing one to measure nearly 100 points with an effective field sensitivity of 600 pT/√Hz for each point during the same time as a point-by-point scanner measures these points sequentially. Moreover, it has a noise floor of 300 pT and provides nearly 2 orders of magnitude improvement in magnetic flux sensitivity (down to 10^-6 Φ0/√Hz) over previous atomic probe magnetometers capable of scanning near samples. These capabilities are carefully benchmarked by imaging magnetic fields arising from microfabricated wire patterns in a system where samples may be scanned, cryogenically cooled, and easily exchanged. We anticipate the SQCRAMscope will provide charge-transport images at temperatures from room temperature to 4 K in unconventional superconductors and topologically nontrivial materials.
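
    The quoted parallel-measurement gain follows from simple noise averaging: giving each of N points the full integration time, instead of 1/N of it, improves the per-point sensitivity by √N. A quick check using only numbers from the abstract:

    ```python
    # Consistency check of the sqrt(N) parallel-measurement advantage.
    import math

    per_point = 6e-9        # T/sqrt(Hz): 6 nT/sqrt(Hz) per point at duty cycle
    n_points = 100          # points measured simultaneously along the BEC
    effective = per_point / math.sqrt(n_points)
    print(f"effective sensitivity: {effective * 1e12:.0f} pT/sqrt(Hz)")  # 600
    ```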

  8. The Case for Space-Borne Far-Infrared Line Surveys

    NASA Technical Reports Server (NTRS)

    Bock, J. J.; Bradford, C. M.; Dragovan, M.; Earle, L.; Glenn, J.; Naylor, B.; Nguyen, H. T.; Zmuidzinas, J.

    2004-01-01

    The combination of sensitive direct detectors and a cooled aperture promises orders of magnitude improvement in the sensitivity and survey time for far-infrared and submillimeter spectroscopy compared to existing or planned capabilities. Continuing advances in direct detector technology enable spectroscopy that approaches the background limit available only from space at these wavelengths. Because the spectral confusion limit is significantly lower than the more familiar spatial confusion limit encountered in imaging applications, spectroscopy can be carried out to comparable depth with a significantly smaller aperture. We are developing a novel waveguide-coupled grating spectrometer that disperses radiation into a wide instantaneous bandwidth with moderate resolution (R ≈ 1000) in a compact 2-dimensional format. A line survey instrument coupled to a modest cooled single aperture provides an attractive scientific application for spectroscopy with direct detectors. Using a suite of waveguide spectrometers, we can obtain complete coverage over the entire far-infrared and submillimeter range. This concept requires no moving parts to modulate the optical signal. Such an instrument would be able to conduct a far-infrared line survey 10^6 times faster than planned capabilities, assuming existing detector technology. However, if historical improvements in bolometer sensitivity continue, so that photon-limited sensitivity is obtained, the integration time can be further reduced by 2 to 4 orders of magnitude, depending on wavelength. The line flux sensitivity would be comparable to ALMA, but at shorter wavelengths and with the continuous coverage needed to extract line fluxes for sources at unknown redshifts. For example, this capability would break the current spectroscopic bottleneck in the study of far-infrared galaxies, the recently discovered, rapidly evolving objects abundant at cosmological distances.

  9. Predictions of the electro-mechanical response of conductive CNT-polymer composites

    NASA Astrophysics Data System (ADS)

    Matos, Miguel A. S.; Tagarielli, Vito L.; Baiz-Villafranca, Pedro M.; Pinho, Silvestre T.

    2018-05-01

    We present finite element simulations to predict the conductivity, elastic response and strain-sensing capability of conductive composites comprising a polymeric matrix and carbon nanotubes. Realistic representative volume elements (RVE) of the microstructure are generated, and both constituents are modelled as linear elastic solids with resistivity independent of strain; the electrical contact between nanotubes is represented by a new element which accounts for quantum tunnelling effects and captures the sensitivity of conductivity to separation. Monte Carlo simulations are conducted and the sensitivity of the predictions to RVE size is explored. Predictions of modulus and conductivity are found to be in good agreement with published results. The strain-sensing capability of the material is explored for multiaxial strain states.
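
    The tunnelling-contact element can be sketched as follows; the exponential conductance law is the generic quantum-tunnelling form, and G0 and the decay length are illustrative assumptions rather than the paper's fitted values.

    ```python
    # Hedged sketch: tunnelling conductance vs nanotube separation. Small
    # strain-induced changes in separation d modulate G exponentially, which
    # is the physical basis of the composite's strain sensitivity.
    import numpy as np

    G0 = 1e-5          # S, contact conductance at zero separation (assumed)
    decay = 0.4e-9     # m, tunnelling decay length (assumed)

    def contact_conductance(d):
        return G0 * np.exp(-d / decay)

    for d in (0.5e-9, 1.0e-9, 1.5e-9):
        print(f"d = {d*1e9:.1f} nm -> G = {contact_conductance(d):.2e} S")
    ```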

  10. High-speed free-space based reconfigurable card-to-card optical interconnects with broadcast capability.

    PubMed

    Wang, Ke; Nirmalathas, Ampalavanapillai; Lim, Christina; Skafidas, Efstratios; Alameh, Kamal

    2013-07-01

    In this paper, we propose and experimentally demonstrate a free-space based high-speed reconfigurable card-to-card optical interconnect architecture with broadcast capability, which is required for control functionalities and efficient parallel computing applications. Experimental results show that 10 Gb/s data can be broadcast to all receiving channels for up to 30 cm with a worst-case receiver sensitivity better than -12.20 dBm. In addition, arbitrary multicasting with the same architecture is also investigated. 10 Gb/s reconfigurable point-to-point link and multicast channels are simultaneously demonstrated with a measured receiver sensitivity power penalty of ~1.3 dB due to crosstalk.

  11. General shape optimization capability

    NASA Technical Reports Server (NTRS)

    Chargin, Mladen K.; Raasch, Ingo; Bruns, Rudolf; Deuermeyer, Dawson

    1991-01-01

    A method is described for calculating shape sensitivities, within MSC/NASTRAN, in a simple manner without resort to external programs. The method uses natural design variables to define the shape changes in a given structure. Once the shape sensitivities are obtained, the shape optimization process is carried out in a manner similar to property optimization processes. The capability of this method is illustrated by two examples: the shape optimization of a cantilever beam with holes, loaded by a point load at the free end (with the shape of the holes and the thickness of the beam selected as the design variables), and the shape optimization of a connecting rod subjected to several different loading and boundary conditions.

  12. Constructing a Cyber Preparedness Framework (CPF): The Lockheed Martin Case Study

    ERIC Educational Resources Information Center

    Beyer, Dawn M.

    2014-01-01

    The protection of sensitive data and technologies is critical in preserving United States (U.S.) national security and minimizing economic losses. However, during a cyber attack, the operational capability to constrain the exfiltrations of sensitive data and technologies may not be available. A cyber preparedness methodology (CPM) can improve…

  13. Ultrasensitive Biosensors Using Enhanced Fano Resonances in Capped Gold Nanoslit Arrays

    PubMed Central

    Lee, Kuang-Li; Huang, Jhih-Bin; Chang, Jhih-Wei; Wu, Shu-Han; Wei, Pei-Kuen

    2015-01-01

    Nanostructure-based sensors are capable of sensitive and label-free detection for biomedical applications. However, plasmonic sensors capable of highly sensitive detection with high-throughput and low-cost fabrication techniques are desirable. We show that capped gold nanoslit arrays made by a thermal-embossing nanoimprint method on a polymer film can produce extremely sharp asymmetric resonances for a transverse magnetic-polarized wave. An ultrasmall linewidth is formed due to the enhanced Fano coupling between the cavity resonance mode in the nanoslits and the surface plasmon resonance mode on the periodic metallic surface. With an optimal slit length and width, the full width at half-maximum bandwidth of the Fano mode is only 3.68 nm. The wavelength sensitivity is 926 nm/RIU for 60-nm-width and 1,000-nm-period nanoslits. The figure of merit is up to 252. The obtained value is higher than the theoretically estimated upper limits of prism-coupling SPR sensors and the previously reported record-high figure of merit in array sensors. In addition, the structure has an ultrahigh intensity sensitivity up to 48,117%/RIU. PMID:25708955
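
    As a consistency check, the figure of merit for wavelength-interrogated resonance sensors is commonly defined as the refractive-index sensitivity divided by the resonance linewidth, which reproduces the quoted value from the two numbers above:

    ```python
    # FOM = sensitivity / FWHM for a wavelength-interrogated sensor.
    sensitivity = 926.0   # nm/RIU, from the abstract
    fwhm = 3.68           # nm, Fano-mode linewidth
    print(f"FOM = {sensitivity / fwhm:.0f} per RIU")   # ~252, as quoted
    ```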

  14. The assessment of vulnerability to natural disasters in China by using the DEA method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei Yiming; Fan Ying; Lu Cong

    2004-05-01

    China has been greatly affected by natural disasters, so it is of great importance to analyze the impact of natural disasters on the national economy. Usually, the frequency of disasters or the absolute loss they inflict is the first consideration, while the capability of regions to overcome disasters is ignored. The concept of vulnerability is used to measure the capability to overcome disasters in regions with distinctive economies. Traditional methods for vulnerability analysis calculate sub-indices based on disaster frequency, loss, economic impact, and population of each region, and then add the sub-indices to get a composite index for regional vulnerability. But those methods are sensitive to the weights selected for the sub-indices when multiple indices are added up to obtain an index of total vulnerability, and the analytic results are less convincing because of the subjectivity of different weighting methods. A data envelopment analysis (DEA)-based model for analysis of regional vulnerability to natural disasters is presented here to improve upon the traditional method. This paper systematically describes the DEA method for evaluating the relative severity of disasters in each region. A model for regional vulnerability analysis is developed, based on annual governmental statistics from 1989 to 2000. The regional vulnerabilities in China's mainland are illustrated as a case study, and a new method for the classification of regional vulnerability to natural disasters in China is proposed.
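
    For concreteness, the core DEA computation can be written as one small linear program per region (decision-making unit, DMU): maximize the weighted-output score of the region under evaluation, subject to no region scoring above 1 with the same weights. The sketch below implements a standard input-oriented CCR model; the input/output data are invented placeholders, not the provincial statistics used in the paper.

    ```python
    # Hedged sketch of input-oriented CCR DEA via linear programming.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs per DMU (invented)
    Y = np.array([[5.0], [4.0], [6.0]])                  # outputs per DMU (invented)
    n, m = X.shape
    s = Y.shape[1]

    for k in range(n):
        # variables: [u (s output weights), v (m input weights)]
        c = np.r_[-Y[k], np.zeros(m)]             # maximize u.y_k
        A_ub = np.c_[Y, -X]                       # u.y_j - v.x_j <= 0 for all j
        b_ub = np.zeros(n)
        A_eq = np.r_[np.zeros(s), X[k]][None, :]  # normalization v.x_k = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=(0, None))
        print(f"DMU {k}: efficiency = {-res.fun:.3f}")
    ```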

  15. Analytical determination of selenium in medical samples, staple food and dietary supplements by means of total reflection X-ray fluorescence spectroscopy

    NASA Astrophysics Data System (ADS)

    Stosnach, Hagen

    2010-09-01

    Selenium is essential for many aspects of human health and is thus the object of intensive medical research. This demands analytical techniques capable of determining selenium at low concentrations with high accuracy, in widespread matrices and sometimes in the smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis of staple foods like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively coupled plasma mass spectrometry (ICP-MS) can determine selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since in many cases less complicated and less expensive analytical techniques are required, TXRF has been tested regarding its suitability for selenium analysis in different medical, staple food, and dietary supplement samples, applying very simple sample preparation techniques. The reported results indicate that accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for staple foods and dietary supplements. Although this sensitivity is low compared to established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.

  16. Native Mass Spectrometry in Fragment-Based Drug Discovery.

    PubMed

    Pedro, Liliana; Quinn, Ronald J

    2016-07-28

    The advent of native mass spectrometry (MS) in 1990 led to the development of new mass spectrometry instrumentation and methodologies for the analysis of noncovalent protein-ligand complexes. Native MS has matured to become a fast, simple, highly sensitive and automatable technique with well-established utility for fragment-based drug discovery (FBDD). Native MS has the capability to directly detect weak ligand binding to proteins, to determine stoichiometry, relative or absolute binding affinities and specificities. Native MS can be used to delineate ligand-binding sites, to elucidate mechanisms of cooperativity and to study the thermodynamics of binding. This review highlights key attributes of native MS for FBDD campaigns.

  17. A comparative analysis of area navigation systems in general aviation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Dodge, S. M.

    1973-01-01

    Radio navigation systems which offer the capabilities of area navigation to general aviation operators are discussed. The systems considered are: (1) the VORTAC system, (2) the Loran-C system, and (3) the Differential Omega system. The initial analyses are directed toward a comparison of the systems with respect to their compliance with specified performance parameters and to the cost effectiveness of each system in relation to those specifications. Further analyses lead to the development of system cost sensitivity charts, and the employment of these charts allows conclusions to be drawn relative to the cost-effectiveness of the candidate navigation systems.

  18. Multidisciplinary optimization of a controlled space structure using 150 design variables

    NASA Technical Reports Server (NTRS)

    James, Benjamin B.

    1992-01-01

    A general optimization-based method for the design of large space platforms through integration of the disciplines of structural dynamics and control is presented. The method uses the global sensitivity equations approach and is especially appropriate for preliminary design problems in which the structural and control analyses are tightly coupled. The method is capable of coordinating general purpose structural analysis, multivariable control, and optimization codes, and thus, can be adapted to a variety of controls-structures integrated design projects. The method is used to minimize the total weight of a space platform while maintaining a specified vibration decay rate after slewing maneuvers.

  19. SPIDER beam dump as diagnostic of the particle beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaupa, M., E-mail: matteo.zaupa@igi.cnr.it; Sartori, E.; Consorzio RFX, Corso Stati Uniti 4, Padova 35127

    The beam power produced by the negative ion source for the production of ion of deuterium extracted from RF plasma is mainly absorbed by the beam dump component which has been designed also for measuring the temperatures on the dumping panels for beam diagnostics. A finite element code has been developed to characterize, by thermo-hydraulic analysis, the sensitivity of the beam dump to the different beam parameters. The results prove the capability of diagnosing the beam divergence and the horizontal misalignment, while the entity of the halo fraction appears hardly detectable without considering the other foreseen diagnostics like tomography and beam emission spectroscopy.

  20. Multidisciplinary Techniques and Novel Aircraft Control Systems

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Rogers, James L.; Raney, David L.

    2000-01-01

    The Aircraft Morphing Program at NASA Langley Research Center explores opportunities to improve airframe designs with smart technologies. Two elements of this basic research program are multidisciplinary design optimization (MDO) and advanced flow control. This paper describes examples where MDO techniques such as sensitivity analysis, automatic differentiation, and genetic algorithms contribute to the design of novel control systems. In the test case, the design and use of distributed shape-change devices to provide low-rate maneuvering capability for a tailless aircraft is considered. The ability of MDO to add value to control system development is illustrated using results from several years of research funded by the Aircraft Morphing Program.

  1. SPIDER beam dump as diagnostic of the particle beam

    NASA Astrophysics Data System (ADS)

    Zaupa, M.; Dalla Palma, M.; Sartori, E.; Brombin, M.; Pasqualotto, R.

    2016-11-01

    The beam power produced by the negative ion source for the production of ion of deuterium extracted from RF plasma is mainly absorbed by the beam dump component which has been designed also for measuring the temperatures on the dumping panels for beam diagnostics. A finite element code has been developed to characterize, by thermo-hydraulic analysis, the sensitivity of the beam dump to the different beam parameters. The results prove the capability of diagnosing the beam divergence and the horizontal misalignment, while the entity of the halo fraction appears hardly detectable without considering the other foreseen diagnostics like tomography and beam emission spectroscopy.

  2. Elastic cavitation and fracture via injection.

    PubMed

    Hutchens, Shelby B; Fakhouri, Sami; Crosby, Alfred J

    2016-03-07

    The cavitation rheology technique extracts soft materials mechanical properties through pressure-monitored fluid injection. Properties are calculated from the system's response at a critical pressure that is governed by either elasticity or fracture (or both); however previous elementary analysis has not been capable of accurately determining which mechanism is dominant. We combine analyses of both mechanisms in order to determine how the full system thermodynamics, including far-field compliance, dictate whether a bubble in an elastomeric solid will grow through either reversible or irreversible deformations. Applying these analyses to experimental data, we demonstrate the sensitivity of cavitation rheology to microstructural variation via a co-dependence between modulus and fracture energy.
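
    For orientation, the elastic branch of the critical pressure is usually written against the incompressible neo-Hookean idealization (a standard assumption in cavitation rheology, not necessarily the authors' exact constitutive model):

    ```latex
    % Pressure-stretch relation for bubble expansion in an incompressible
    % neo-Hookean solid of Young's modulus E; the pressure plateaus at P_c.
    P(\lambda) = E\left(\frac{5}{6} - \frac{2}{3\lambda} - \frac{1}{6\lambda^{4}}\right),
    \qquad
    P_c = \lim_{\lambda \to \infty} P(\lambda) = \frac{5E}{6}
    ```

    When the measured critical pressure sits near this plateau, the event is elasticity-dominated; a fracture-dominated event instead scales with the fracture energy relative to the defect (needle) size, which is what allows the combined analysis to discriminate the two mechanisms.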

  3. Simulated characteristics of the DEGAS γ-detector array

    NASA Astrophysics Data System (ADS)

    Li, G. S.; Lizarazo, C.; Gerl, J.; Kojouharov, I.; Schaffner, H.; Górska, M.; Pietralla, N.; Saha, S.; Liu, M. L.; Wang, J. G.

    2018-05-01

    The performance of the novel HPGe-Cluster array DEGAS, to be used at FAIR, has been studied through GEANT4 simulations using accurate geometries of most of the detector components. The simulation framework has been validated by comparison with experimental data from various detector setups. The study showed that the DEGAS system can provide a clear improvement in photo-peak efficiency compared to the previous RISING array. In addition, the active BGO back-catcher can greatly enhance the background suppression capability. The add-back analysis revealed that even at a γ multiplicity of six, the sensitivity is improved by adding back the energy depositions of the neighboring Ge crystals.

  4. The Advanced Gamma-ray Imaging System (AGIS): A Nanosecond Time Scale Stereoscopic Array Trigger System.

    NASA Astrophysics Data System (ADS)

    Krennrich, Frank; Buckley, J.; Byrum, K.; Dawson, J.; Drake, G.; Horan, D.; Krawzcynski, H.; Schroedter, M.

    2008-04-01

    Imaging atmospheric Cherenkov telescope arrays (VERITAS, HESS) have shown unprecedented background suppression capabilities for reducing cosmic-ray induced air showers, muons and night sky background fluctuations. Next-generation arrays with on the order of 100 telescopes offer larger collection areas, provide the possibility to see the air shower from more view points on the ground, have the potential to improve the sensitivity and give additional background suppression. Here we discuss the design of a fast array trigger system that has the potential to perform a real time image analysis allowing substantially improved background rate suppression at the trigger level.

  5. Multidisciplinary Techniques and Novel Aircraft Control Systems

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.; Rogers, James L.; Raney, David L.

    2000-01-01

    The Aircraft Morphing Program at NASA Langley Research Center explores opportunities to improve airframe designs with smart technologies. Two elements of this basic research program are multidisciplinary design optimization (MDO) and advanced flow control. This paper describes examples where MDO techniques such as sensitivity analysis, automatic differentiation, and genetic algorithms contribute to the design of novel control systems. In the test case, the design and use of distributed shape-change devices to provide low-rate maneuvering capability for a tailless aircraft is considered. The ability of MDO to add value to control system development is illustrated using results from several years of research funded by the Aircraft Morphing Program.

  6. Earth materials research: Report of a Workshop on Physics and Chemistry of Earth Materials

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The report concludes that an enhanced effort of earth materials research is necessary to advance the understanding of the processes that shape the planet. In support of such an effort, there are new classes of experiments, new levels of analytical sensitivity and precision, and new levels of theory that are now applicable in understanding the physical and chemical properties of geological materials. The application of these capabilities involves the need to upgrade and make greater use of existing facilities as well as the development of new techniques. A concomitant need is for a sample program involving their collection, synthesis, distribution, and analysis.

  7. Application of surface-enhanced Raman spectroscopy (SERS) for cleaning verification in pharmaceutical manufacture.

    PubMed

    Corrigan, Damion K; Cauchi, Michael; Piletsky, Sergey; Mccrossen, Sean

    2009-01-01

    Cleaning verification is the process by which pharmaceutical manufacturing equipment is determined as sufficiently clean to allow manufacture to continue. Surface-enhanced Raman spectroscopy (SERS) is a very sensitive spectroscopic technique capable of detection at levels appropriate for cleaning verification. In this paper, commercially available Klarite SERS substrates were employed in order to obtain the necessary enhancement of signal for the identification of chemical species at concentrations of 1 to 10 ng/cm2, which are relevant to cleaning verification. The SERS approach was combined with principal component analysis in the identification of drug compounds recovered from a contaminated steel surface.

  8. Laser machined plastic laminates. Towards portable diagnostic devices for use in low resource environments

    DOE PAGES

    Harper, Jason C.; Carson, Bryan D.; Bachand, George D.; ...

    2015-07-14

    Despite significant progress in the development of bioanalytical devices, cost, complexity, access to reagents, and lack of infrastructure have prevented the use of these technologies in resource-limited regions. To provide a sustainable tool in the global effort to combat infectious diseases, a diagnostic device must be low cost, simple to operate and read, robust, and have sensitivity and specificity comparable to laboratory analysis. In this mini-review we describe recent work using laser-machined plastic laminates to produce diagnostic devices that are capable of a wide variety of bioanalytical measurements and show great promise for future use in low-resource environments.

  9. HAWC+/SOFIA Instrumental Polarization Calibration

    NASA Astrophysics Data System (ADS)

    Michail, Joseph M.; Chuss, David; Dowell, Charles D.; Santos, Fabio; Siah, Javad; Vaillancourt, John; HAWC+ Instrument Team

    2018-01-01

    HAWC+ is a new far-infrared polarimeter for the NASA/DLR SOFIA (Stratospheric Observatory for Infrared Astronomy) telescope. HAWC+ has the capability to measure the polarization of astronomical sources with unprecedented sensitivity and angular resolution in four bands from 50-250 microns. Using data obtained during commissioning flights, we implemented a calibration strategy that separates the astronomical polarization signal from the induced instrumental polarization. The result of this analysis is a map of the instrumental polarization as a function of position in the instrument's focal plane in each band. The results show consistency between bands, as well as with other methods used to determine preliminary instrumental polarization values.
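
    A hedged sketch of the kind of correction such a focal-plane instrumental polarization (IP) map enables: subtracting the IP from the observed normalized Stokes parameters before computing the polarization fraction and angle. The numbers below are placeholders, not HAWC+ calibration values.

    import numpy as np

    def correct_polarization(q_obs, u_obs, q_ip, u_ip):
        # Remove instrumental polarization from normalized Stokes q = Q/I, u = U/I.
        q, u = q_obs - q_ip, u_obs - u_ip
        p = np.hypot(q, u)                          # polarization fraction (debiasing omitted)
        theta = 0.5 * np.degrees(np.arctan2(u, q))  # polarization angle, degrees
        return p, theta

    # One detector pixel: observed q,u and the IP-map entry at that focal-plane position.
    print(correct_polarization(q_obs=0.021, u_obs=-0.004, q_ip=0.012, u_ip=0.003))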

  10. Development and Analysis of Hybrid Thermoelectric Refrigerator Systems

    NASA Astrophysics Data System (ADS)

    Saifizi, M.; Zakaria, M. S.; Yaacob, Sazali; Wan, Khairunizam

    2018-03-01

    A thermoelectric module (TEM) is a solid-state device capable of maintaining tight control over small temperature variations. In this study, a hybrid thermoelectric refrigerator (HTER) is introduced that utilizes TEMs in direct and air-to-air thermoelectric heat pumps to cool and maintain a low temperature for vaccine storage. Two container materials, aluminum and stainless steel, are used in the HTER configurations to investigate each system's transient and steady-state response. A temperature sensor calibration technique is implemented to ensure that real-time data acquisition is not unduly affected by noise. Step response analysis indicates that HTER I (aluminum) settles from transient to steady state faster than HTER II (stainless steel), since aluminum has higher thermal conductivity. With the same input current, HTER I also shows the better cooling capability, whereas HTER II requires longer to reach steady state. Pseudo-random binary sequence (PRBS) response analysis of both systems shows that HTER I is more sensitive to the current input, as its sequence length is shorter than that of HTER II. Both systems, however, exhibit temperature variation in the range of 4 °C owing to the difference in container thermal conductivity.
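
    For readers unfamiliar with PRBS testing, the sketch below generates a pseudo-random binary sequence with a linear-feedback shift register and maps it to two drive-current levels; the register length, taps, and current values are example choices, not the study's actual excitation signal.

    def prbs(n_bits=7, length=127):
        # Maximal-length PRBS from a 7-bit LFSR (taps chosen for x^7 + x^6 + 1).
        state = [1] * n_bits
        seq = []
        for _ in range(length):
            seq.append(state[-1])
            feedback = state[-1] ^ state[-2]
            state = [feedback] + state[:-1]
        return seq

    # Map the binary sequence to two input-current levels (amperes, illustrative).
    currents = [3.0 if bit else 1.0 for bit in prbs()]
    print(currents[:16])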

  11. A Customizable Flow Injection System for Automated, High Throughput, and Time Sensitive Ion Mobility Spectrometry and Mass Spectrometry Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Orton, Daniel J.; Tfaily, Malak M.; Moore, Ronald J.

    To better understand disease conditions and environmental perturbations, multi-omic studies (i.e., proteomic, lipidomic, metabolomic, etc. analyses) are vastly increasing in popularity. In a multi-omic study, a single sample is typically extracted in multiple ways and numerous analyses are performed using different instruments. Thus, one sample becomes many analyses, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injection. While some FIA systems have been created to address these challenges, many have limitations such as high consumable costs, low pressure capabilities, limited pressure monitoring, and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at diverse flow rates (~50 nL/min to 500 µL/min) to accommodate low- and high-flow instrument sources. This system can also operate at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system. The results from these studies showed a highly robust platform, providing consistent performance over many days without carryover as long as washing buffers specific to each molecular analysis were utilized.

  12. Diagnostic utility of brain activity flow patterns analysis in attention deficit hyperactivity disorder.

    PubMed

    Biederman, J; Hammerness, P; Sadeh, B; Peremen, Z; Amit, A; Or-Ly, H; Stern, Y; Reches, A; Geva, A; Faraone, S V

    2017-05-01

    A previous small study suggested that Brain Network Activation (BNA), a novel ERP-based brain network analysis, may have diagnostic utility in attention deficit hyperactivity disorder (ADHD). In this study we examined the diagnostic capability of a new advanced version of the BNA methodology on a larger population of adults with and without ADHD. Subjects were unmedicated right-handed 18- to 55-year-old adults of both sexes with and without a DSM-IV diagnosis of ADHD. We collected EEG while the subjects were performing a response inhibition task (Go/NoGo) and then applied a spatio-temporal Brain Network Activation (BNA) analysis of the EEG data. This analysis produced a display of qualitative measures of brain states (BNA scores) providing information on cortical connectivity. This complex set of scores was then fed into a machine learning algorithm. The BNA analysis of the EEG data recorded during the Go/NoGo task demonstrated a high discriminative capacity between ADHD patients and controls (AUC = 0.92, specificity = 0.95, sensitivity = 0.86 for the Go condition; AUC = 0.84, specificity = 0.91, sensitivity = 0.76 for the NoGo condition). BNA methodology can help differentiate between ADHD and healthy controls based on functional brain connectivity. The data support the utility of the tool to augment clinical examinations by objective evaluation of electrophysiological changes associated with ADHD. Results also support a network-based approach to the study of ADHD.
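
    The reported discrimination metrics (AUC, sensitivity, specificity) follow from classifier scores and a decision threshold; the sketch below shows the computation on synthetic scores standing in for the machine-learning output over BNA scores. Labels, scores, and the threshold are all invented for illustration.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    y = np.r_[np.ones(50), np.zeros(50)]                       # ADHD vs control
    scores = np.r_[rng.normal(1.2, 1, 50), rng.normal(0, 1, 50)]

    auc = roc_auc_score(y, scores)
    pred = scores > 0.6                                        # example threshold
    sensitivity = (pred & (y == 1)).sum() / (y == 1).sum()
    specificity = (~pred & (y == 0)).sum() / (y == 0).sum()
    print(f"AUC={auc:.2f} sensitivity={sensitivity:.2f} specificity={specificity:.2f}")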

  13. Optical Microresonators for Sensing and Transduction: A Materials Perspective.

    PubMed

    Heylman, Kevin D; Knapper, Kassandra A; Horak, Erik H; Rea, Morgan T; Vanga, Sudheer K; Goldsmith, Randall H

    2017-08-01

    Optical microresonators confine light to a particular microscale trajectory, are exquisitely sensitive to their microenvironment, and offer convenient readout of their optical properties. Taken together, this is an immensely attractive combination that makes optical microresonators highly effective as sensors and transducers. Meanwhile, advances in material science, fabrication techniques, and photonic sensing strategies endow optical microresonators with new functionalities, unique transduction mechanisms, and in some cases, unparalleled sensitivities. In this progress report, the operating principles of these sensors are reviewed, and different methods of signal transduction are evaluated. Examples are shown of how choice of materials must be suited to the analyte, and how innovations in fabrication and sensing are coupled together in a mutually reinforcing cycle. A tremendously broad range of capabilities of microresonator sensors is described, from electric and magnetic field sensing to mechanical sensing, from single-molecule detection to imaging and spectroscopy, from operation at high vacuum to in live cells. Emerging sensing capabilities are highlighted and put into context in the field. Future directions are imagined, where the diverse capabilities laid out are combined and advances in scalability and integration are implemented, leading to the creation of a sensor unparalleled in sensitivity and information content. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Fatigue Crack Length Sizing Using a Novel Flexible Eddy Current Sensor Array.

    PubMed

    Xie, Ruifang; Chen, Dixiang; Pan, Mengchun; Tian, Wugang; Wu, Xuezhong; Zhou, Weihong; Tang, Ying

    2015-12-21

    An eddy current probe that is flexible, arrayed, highly sensitive, and capable of quantitative inspection is a practical requirement in nondestructive testing and a research hotspot. A novel flexible planar eddy current sensor array for the inspection of microcracks in critical parts of airplanes is developed in this paper. Both the exciting and sensing coils are etched on polyimide films using a flexible printed circuit board technique, allowing the sensor to conform to complex geometric structures. In order to serve the needs of condition-based maintenance (CBM), the proposed sensor array comprises 64 elements. Its spatial resolution is only 0.8 mm, and it is not only sensitive to shallow microcracks, but also capable of sizing the length of fatigue cracks. The details and advantages of our sensor design are introduced. The working principle and the crack responses are analyzed by finite element simulation, from which a crack length sizing algorithm is proposed. Experiments based on standard specimens are implemented to verify the validity of our simulation and the efficiency of the crack length sizing algorithm. Experimental results show that the sensor array is sensitive to microcracks, and is capable of crack length sizing with an accuracy within ±0.2 mm.
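
    The paper's sizing algorithm is derived from finite element simulation and is not reproduced here; as a loose illustration of the idea that a 64-element array with 0.8 mm pitch can localize a crack, the sketch below sizes a crack from the span of elements whose (invented) response exceeds a noise threshold.

    import numpy as np

    PITCH_MM = 0.8
    response = np.array([0.1, 0.1, 0.2, 1.4, 2.1, 2.3, 1.9, 0.3, 0.1])  # per element
    hot = np.flatnonzero(response > 1.0)          # elements lying over the crack
    crack_length_mm = (hot[-1] - hot[0] + 1) * PITCH_MM
    print(f"estimated crack length: {crack_length_mm:.1f} mm")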

  15. Fatigue Crack Length Sizing Using a Novel Flexible Eddy Current Sensor Array

    PubMed Central

    Xie, Ruifang; Chen, Dixiang; Pan, Mengchun; Tian, Wugang; Wu, Xuezhong; Zhou, Weihong; Tang, Ying

    2015-01-01

    An eddy current probe that is flexible, arrayed, highly sensitive, and capable of quantitative inspection is a practical requirement in nondestructive testing and a research hotspot. A novel flexible planar eddy current sensor array for the inspection of microcracks in critical parts of airplanes is developed in this paper. Both the exciting and sensing coils are etched on polyimide films using a flexible printed circuit board technique, allowing the sensor to conform to complex geometric structures. In order to serve the needs of condition-based maintenance (CBM), the proposed sensor array comprises 64 elements. Its spatial resolution is only 0.8 mm, and it is not only sensitive to shallow microcracks, but also capable of sizing the length of fatigue cracks. The details and advantages of our sensor design are introduced. The working principle and the crack responses are analyzed by finite element simulation, from which a crack length sizing algorithm is proposed. Experiments based on standard specimens are implemented to verify the validity of our simulation and the efficiency of the crack length sizing algorithm. Experimental results show that the sensor array is sensitive to microcracks, and is capable of crack length sizing with an accuracy within ±0.2 mm. PMID:26703608

  16. Electrochemical Branched-DNA Assay for Polymerase Chain Reaction-Free Detection and Quantification of Oncogenes in Messenger RNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Ai Cheng; Dai, Ziyu; Chen, Baowei

    2008-12-01

    We describe a novel electrochemical branched-DNA (bDNA) assay for polymerase chain reaction (PCR)-free detection and quantification of the p185 BCR-ABL leukemia fusion transcript in the population of messenger RNA (mRNA) extracted from cell lines. A bDNA amplifier carrying a high loading of alkaline phosphatase (ALP) tracers was used to amplify the target signal. The targets were captured on microplate well surfaces through cooperative sandwich hybridization prior to the labeling of bDNA. The activity of captured ALP was monitored by square-wave voltammetric (SWV) analysis of the electroactive enzymatic product in the presence of 1-naphthyl phosphate. The specificity and sensitivity of the assay enabled direct detection of the target transcript in as little as 4.6 ng mRNA without PCR amplification. In combination with the use of a well-quantified standard, the electrochemical bDNA assay was capable of direct use for PCR-free quantitative analysis of the target transcript in the total mRNA population. The approach thus provides a simple, sensitive, accurate, and quantitative alternative to RQ-PCR for early disease diagnosis.

  17. Fast assessment of planar chromatographic layers quality using pulse thermovision method.

    PubMed

    Suszyński, Zbigniew; Świta, Robert; Loś, Joanna; Zarzycka, Magdalena B; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2014-12-19

    The main goal of this paper is to demonstrate the capability of pulse thermovision (thermal-wave) methodology for sensitive detection of photothermal non-uniformities within light-scattering and semi-transparent planar stationary phases. Successful visualization of stationary-phase defects required signal processing protocols based on wavelet filtration, correlation analysis, and k-means 3D segmentation. This post-processing approach allows extremely sensitive detection of thickness and structural changes within commercially available planar chromatographic layers. In particular, a number of TLC and HPTLC stationary phases including silica, cellulose, aluminum oxide, polyamide, and octadecylsilane, coated with adsorbent layers ranging from 100 to 250 μm, were investigated. The presented detection protocol can be used as an efficient tool for fast screening of the overall heterogeneity of any layered material. Moreover, the described procedure is very fast (a few seconds including acquisition and data processing) and may be applied to online control of fabrication processes. Beyond planar chromatographic plates, this protocol can be used to assess other planar separation tools, such as paper-based analytical devices or micro total analysis systems consisting of organic and inorganic layers. Copyright © 2014 Elsevier B.V. All rights reserved.
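
    One stage of the described pipeline, k-means segmentation of processed thermal-response data, can be sketched as below. The synthetic "image" and the choice of three clusters are illustrative assumptions, not the paper's wavelet-filtered thermal-wave data.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    image = rng.normal(1.0, 0.05, size=(64, 64))   # uniform layer + noise
    image[20:30, 40:52] += 0.4                     # simulated thickness defect

    km = KMeans(n_clusters=3, n_init=10, random_state=0)
    labels = km.fit_predict(image.reshape(-1, 1)).reshape(image.shape)
    print("pixels per segment:", np.bincount(labels.ravel()))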

  18. Techno-economic evaluation of simultaneous production of extra-cellular polymeric substance (EPS) and lipids by Cloacibacterium normanense NK6 using crude glycerol and sludge as substrate.

    PubMed

    Ram, S K; Kumar, L R; Tyagi, R D; Drogui, P

    2018-05-01

    This study used the techno-economic analysis tool SuperPro Designer to evaluate a novel technology for simultaneous production of extracellular polymeric substance (EPS) and biodiesel using crude glycerol and secondary sludge. As non-renewable energy sources are being depleted, the process utilizes municipal sewage sludge for production of EPS and biodiesel along with crude glycerol, a waste byproduct of the biodiesel industry, providing an alternative route for disposal of both municipal sludge and crude glycerol. The newly isolated Cloacibacterium normanense NK6 is used as the micro-organism in the study because it is capable of producing high EPS concentrations using activated sludge and crude glycerol as the sole carbon source. The technology has environmental and economic advantages, notably the simultaneous production of two major products: EPS and lipids. Sensitivity analysis of the process revealed that biomass lipid content is the most significant factor: the unit production cost of biodiesel is highly sensitive to the lipid content achieved during the bioreaction. The B7 biodiesel unit production cost can be lowered from $1 to $0.60 if the lipid content of the biomass is improved through various process parameter modifications.
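
    The reported sensitivity result lends itself to a minimal one-way sweep: unit production cost versus biomass lipid content. The cost model below (a fixed batch cost spread over lipid yield) is an invented placeholder for the SuperPro Designer process model, tuned only to echo the $1 to $0.60 range quoted above.

    def unit_cost(lipid_fraction, batch_cost=150.0, biomass_kg=1000.0):
        # Approximate $ per kg biodiesel: fixed batch cost over recoverable lipids.
        lipid_kg = biomass_kg * lipid_fraction
        return batch_cost / lipid_kg

    for lf in (0.15, 0.20, 0.25, 0.30, 0.35):
        print(f"lipid content {lf:.0%}: ${unit_cost(lf):.2f}/kg")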

  19. The miniaturized Raman system and detection of traces of life in halite from the Atacama Desert: some considerations for the search for life signatures on Mars.

    PubMed

    Vítek, Petr; Jehlička, Jan; Edwards, Howell G M; Hutchinson, Ian; Ascaso, Carmen; Wierzchos, Jacek

    2012-12-01

    Raman spectroscopy is being adopted as a nondestructive technique in the robotic exploration of Mars to search for traces of life in the geological record. Here, miniaturized Raman spectrometers of two different types, equipped with 532 and 785 nm excitation lasers respectively, were compared for the detection of microbial biomarkers in natural halite from the hyperarid region of the Atacama Desert. Measurements were performed directly on the rock as well as on homogenized, powdered samples prepared from this material; the effects of this sample preparation and of the excitation wavelength employed in the analysis are compared and discussed. From these results, 532 nm excitation was found to be superior for the analysis of powdered specimens due to its high sensitivity toward carotenoids and hence a higher capability for their detection at relatively low concentration in bulk powdered specimens. For the same reason, this wavelength was the better choice for the detection of carotenoids in direct measurements made on the rock samples. The 785 nm excitation wavelength, in contrast, proved to be more sensitive toward the detection of scytonemin.

  20. Ecosystem Services and Climate Change Considerations for ...

    EPA Pesticide Factsheets

    Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework “iemWatersheds” has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM), which automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES), which manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE), which provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water

  1. Chemical Visualization of Phosphoproteomes on Membrane*

    PubMed Central

    Iliuk, Anton; Liu, X. Shawn; Xue, Liang; Liu, Xiaoqi; Tao, W. Andy

    2012-01-01

    With new discoveries of important roles of phosphorylation on a daily basis, phospho-specific antibodies, as the primary tool for on-membrane detection of phosphoproteins, face enormous challenges. To address an urgent need for convenient and reliable analysis of phosphorylation events, we report a novel strategy for sensitive phosphorylation analysis in the Western blotting format. The chemical reagent, which we termed pIMAGO, is based on a multifunctionalized soluble nanopolymer and is capable of selectively binding to phosphorylated residues independent of amino acid microenvironment, thus offering great promise as a universal tool in biological analyses where the site of phosphorylation is not known or its specific antibody is not available. The specificity and sensitivity of the approach was first examined using a mixture of standard proteins. The method was then applied to monitor phosphorylation changes in in vitro kinase and phosphatase assays. Finally, to demonstrate the unique ability of pIMAGO to measure endogenous phosphorylation, we used it to visualize and determine the differences in phosphorylated proteins that interact with wild-type and kinase dead mutant of Polo-like kinase 1 during mitosis, the results of which were further confirmed by a quantitative phosphoproteomics experiment. PMID:22593177

  2. Refining and validating a two-stage and web-based cancer risk assessment tool for village doctors in China.

    PubMed

    Shen, Xing-Rong; Chai, Jing; Feng, Rui; Liu, Tong-Zhu; Tong, Gui-Xian; Cheng, Jing; Li, Kai-Chun; Xie, Shao-Yu; Shi, Yong; Wang, De-Bin

    2014-01-01

    The big gap between the efficacy of population-level prevention and expectations, due to the heterogeneity and complexity of cancer etiologic factors, calls for selective yet personalized interventions based on effective risk assessment. This paper documents our research protocol aimed at refining and validating a two-stage, web-based cancer risk assessment tool, from a tentative one in use by an ongoing project, capable of identifying individuals at elevated risk for one or more of the leading cancer types (together accounting for 80% of cancers) in rural China with adequate sensitivity and specificity, and featuring low cost, easy application, and cultural and technical sensitivity for farmers and village doctors. The protocol adopted a modified population-based case-control design using 72,000 non-patients as controls, 2,200 cancer patients as cases, and another 600 patients as cases for external validation. Factors taken into account comprised 8 domains including diet and nutrition, risk behaviors, family history, precancerous diseases, related medical procedures, exposure to environmental hazards, mood and feelings, physical activities, and anthropologic and biologic factors. Modeling explored various methodologies, including empirical analysis, logistic regression, neural-network analysis, and decision theory, with both internal and external validation using concordance statistics, predictive values, etc.
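
    Among the methodologies listed, logistic regression validated with a concordance statistic is the most common pairing; the sketch below shows that step on simulated data, with eight synthetic predictors standing in for the tool's eight risk-factor domains. All coefficients and sample sizes are invented.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    X = rng.standard_normal((2000, 8))               # 8 risk-factor domains
    logit = X @ np.array([0.8, 0.5, 0.4, 0.3, 0.2, 0.1, 0.1, 0.05]) - 2.0
    y = rng.random(2000) < 1 / (1 + np.exp(-logit))  # simulated cancer outcomes

    model = LogisticRegression(max_iter=1000).fit(X[:1500], y[:1500])
    risk = model.predict_proba(X[1500:])[:, 1]       # held-out risk scores
    print("concordance (c-statistic):", round(roc_auc_score(y[1500:], risk), 3))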

  3. Position-sensitive, fast ionization chambers

    NASA Astrophysics Data System (ADS)

    Lai, J.; Afanasieva, L.; Blackmon, J. C.; Deibel, C. M.; Gardiner, H. E.; Lauer, A.; Linhardt, L. E.; Macon, K. T.; Rasco, B. C.; Williams, C.; Santiago-Gonzalez, D.; Kuvin, S. A.; Almaraz-Calderon, S.; Baby, L. T.; Baker, J.; Belarge, J.; Wiedenhöver, I.; Need, E.; Avila, M. L.; Back, B. B.; DiGiovine, B.; Hoffman, C. R.

    2018-05-01

    A high-count-rate ionization chamber design with position-sensitivity has been developed and deployed at several accelerator facilities. Counting rates of ≥ 500 kHz with good Z-separation (up to 5% energy resolution) for particle identification have been demonstrated in a series of commissioning experiments. A position-sensitive capability, with a resolution of 3 mm, has been implemented for the first time to record position information and suppress pileup. The design and performance of the detectors are described.

  4. Characterization and Selection of Polymer Materials for Binary Munitions Storage. Part 3. Branch Content Determination.

    DTIC Science & Technology

    1987-09-01

    ... accuracy. The data acquisition system combines a position-sensitive X-ray detector with a 65-kilobyte microcomputer capable of operating as a ... The rapid X-ray diffraction system measures intensity versus 2θ patterns by placing the detector with its sensitivity axis positioned parallel to the ... plane of the diffractometer (see Figure 2). As shown in Figure 2, the detector sensitivity axis z is coplanar with both the incident beam and the ...

  5. Fusogenic activity of PEGylated pH-sensitive liposomes.

    PubMed

    Vanić, Zeljka; Barnert, Sabine; Süss, Regine; Schubert, Rolf

    2012-06-01

    The aim of this study was to investigate the fusogenic properties of poly(ethylene glycol) (PEG)ylated dioleoylphosphatidylethanolamine/cholesteryl hemisuccinate (DOPE/CHEMS) liposomes. These pH-sensitive liposomes were prepared by incorporating two different PEG lipids: distearoylphosphatidylethanolamine (DSPE)-PEG₂₀₀₀ was mixed with the liposomal lipids using the conventional method, whereas sterol-PEG₁₁₀₀ was inserted into the outer monolayer of preformed vesicles. Both types of PEGylated liposomes were characterized and compared for their entrapment efficiency, zeta potential and size, and were tested in vitro for pH sensitivity by means of proton-induced leakage and membrane fusion activity. To mimic the routes of intracellular delivery, fusion between pH-sensitive liposomes and liposomes designed to simulate the endosomal membrane was studied. Our investigations confirmed that DOPE/CHEMS liposomes were capable of rapidly releasing calcein and of fusing upon acidification. However, after incorporation of DSPE-PEG₂₀₀₀ or sterol-PEG₁₁₀₀ into the membrane, pH sensitivity was significantly reduced; as the mol ratio of PEG-lipid was increased, the ability to fuse was decreased. Comparison between two different PEGylated pH-sensitive liposomes showed that only vesicles containing 0.6 mol% sterol-PEG₁₁₀₀ in the outer monolayer were still capable of fusing with the endosome-like liposomes and showing leakage of calcein at pH 5.5.

  6. The current role of high-resolution mass spectrometry in food analysis.

    PubMed

    Kaufmann, Anton

    2012-05-01

    High-resolution mass spectrometry (HRMS), which is used for residue analysis in food, has gained wider acceptance in the last few years. This development is due to the availability of more rugged, sensitive, and selective instrumentation. The benefits provided by HRMS over classical unit-mass-resolution tandem mass spectrometry are considerable. These benefits include the collection of full-scan spectra, which provides greater insight into the composition of a sample. Consequently, the analyst has the freedom to measure compounds without previous compound-specific tuning, the possibility of retrospective data analysis, and the capability of performing structural elucidations of unknown or suspected compounds. HRMS strongly competes with classical tandem mass spectrometry in the field of quantitative multiresidue methods (e.g., pesticides and veterinary drugs). It is one of the most promising tools when moving towards nontargeted approaches. Certain hardware and software issues still have to be addressed by the instrument manufacturers for it to dislodge tandem mass spectrometry from its position as the standard trace analysis tool.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, Norman A.; /SLAC

    Maximizing the physics performance of detectors being designed for the International Linear Collider, while remaining sensitive to cost constraints, requires a powerful, efficient, and flexible simulation, reconstruction and analysis environment to study the capabilities of a large number of different detector designs. The preparation of Letters of Intent for the International Linear Collider involved the detailed study of dozens of detector options, layouts and readout technologies; the final physics benchmarking studies required the reconstruction and analysis of hundreds of millions of events. We describe the Java-based software toolkit (org.lcsim) which was used for full event reconstruction and analysis. The components are fully modular and are available for tasks from digitization of tracking detector signals through to cluster finding, pattern recognition, track-fitting, calorimeter clustering, individual particle reconstruction, jet-finding, and analysis. The detector is defined by the same XML input files used for the detector response simulation, ensuring the simulation and reconstruction geometries are always commensurate by construction. We discuss the architecture as well as the performance.

  8. Building model analysis applications with the Joint Universal Parameter IdenTification and Evaluation of Reliability (JUPITER) API

    USGS Publications Warehouse

    Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.

    2008-01-01

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.
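
    As a generic illustration of the kind of analysis a JUPITER-based application performs (shown here in Python, although the API itself provides Fortran-90 modules), the sketch below computes forward-difference scaled sensitivities of a toy process model's outputs to its parameters. The model and parameter values are invented.

    import numpy as np

    def process_model(params):
        # Toy stand-in for a physical/chemical/biological process model.
        k, s = params
        return np.array([k * 2.0 + s, k - 0.5 * s, k * s])

    def scaled_sensitivities(params, h=1e-6):
        # Forward-difference derivative of each output, scaled by the parameter
        # value so sensitivities are comparable across parameters.
        base = process_model(params)
        rows = []
        for i, p in enumerate(params):
            bumped = params.copy()
            bumped[i] = p * (1 + h)
            rows.append((process_model(bumped) - base) / (p * h) * p)
        return np.array(rows)

    print(scaled_sensitivities(np.array([2.0, 0.5])))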

  9. Distributed generation capabilities of the national energy modeling system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaCommare, Kristina Hamachi; Edwards, Jennifer L.; Marnay, Chris

    2003-01-01

    This report describes Berkeley Lab's exploration of how the National Energy Modeling System (NEMS) models distributed generation (DG) and presents possible approaches for improving how DG is modeled. The on-site electric generation capability has been available since the AEO2000 version of NEMS. Berkeley Lab has previously completed research on distributed energy resources (DER) adoption at individual sites and has developed a DER Customer Adoption Model called DER-CAM. Given interest in this area, Berkeley Lab set out to understand how NEMS models small-scale on-site generation, to assess how adequately DG is treated in NEMS, and to propose improvements or alternatives. The goal is to determine how well NEMS models the factors influencing DG adoption and to consider alternatives to the current approach. Most small-scale DG adoption takes place in the residential and commercial modules of NEMS. Investment in DG ultimately offsets purchases of electricity, which also eliminates the losses associated with transmission and distribution (T&D). If the DG technology that is chosen is photovoltaics (PV), NEMS assumes renewable energy consumption replaces the energy input to electric generators. If the DG technology is fuel consuming, consumption of fuel in the electric utility sector is replaced by residential or commercial fuel consumption. The waste heat generated from thermal technologies can be used to offset the water heating and space heating energy uses, but there is no thermally activated cooling capability. This study consists of a review of model documentation and a paper by EIA staff, a series of sensitivity runs performed by Berkeley Lab that exercise selected DG parameters in the AEO2002 version of NEMS, and a scoping effort of possible enhancements and alternatives to NEMS's current DG capabilities. In general, the treatment of DG in NEMS is rudimentary. The penetration of DG is determined by an economic cash-flow analysis that determines adoption based on the number of years to a positive cash flow. Some important technologies, e.g., thermally activated cooling, are absent, and ceilings on DG adoption are set by somewhat arbitrary caps on the number of buildings that can adopt DG. These caps are particularly severe for existing buildings, where the maximum penetration for any one technology is 0.25 percent. On the other hand, competition among technologies is not fully considered, and this may result in double-counting for certain applications. A series of sensitivity runs shows greater penetration with net metering enhancements and aggressive tax credits, and a more limited response to lowered DG technology costs. Discussion of alternatives to the current code is presented in Section 4. Alternatives or improvements to how DG is modeled in NEMS cover three basic areas: expanding the existing total market for DG, both by changing existing parameters in NEMS and by adding new capabilities, such as for missing technologies; enhancing the cash flow analysis by incorporating aspects of DG economics that are not currently represented, e.g., complex tariffs; and using an external geographic information system (GIS) driven analysis that can better and more intuitively identify niche markets.
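
    The adoption criterion described, years until cumulative cash flow turns positive, reduces to a short calculation. The capital cost, avoided purchases, and O&M figures below are invented examples, not NEMS inputs.

    def years_to_positive_cash_flow(capital, annual_savings, annual_om):
        # Count the years until the cumulative cash flow of a DG investment
        # turns positive; None if it never pays back within 30 years.
        cumulative = -capital
        for year in range(1, 31):
            cumulative += annual_savings - annual_om
            if cumulative >= 0:
                return year
        return None

    # e.g., a $60k PV system, $7k/yr avoided purchases, $500/yr maintenance
    print(years_to_positive_cash_flow(60_000, 7_000, 500))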

  10. Integrated multidisciplinary design optimization using discrete sensitivity analysis for geometrically complex aeroelastic configurations

    NASA Astrophysics Data System (ADS)

    Newman, James Charles, III

    1997-10-01

    The first two steps in the development of an integrated multidisciplinary design optimization procedure capable of analyzing the nonlinear fluid flow about geometrically complex aeroelastic configurations have been accomplished in the present work. For the first step, a three-dimensional unstructured grid approach to aerodynamic shape sensitivity analysis and design optimization has been developed. The advantage of unstructured grids, when compared with a structured-grid approach, is their inherent ability to discretize irregularly shaped domains with greater efficiency and less effort. Hence, this approach is ideally suited for geometrically complex configurations of practical interest. In this work the time-dependent, nonlinear Euler equations are solved using an upwind, cell-centered, finite-volume scheme. The discrete, linearized systems which result from this scheme are solved iteratively by a preconditioned conjugate-gradient-like algorithm known as GMRES for the two-dimensional cases and a Gauss-Seidel algorithm for the three-dimensional cases; at steady state, similar procedures are used to solve the accompanying linear aerodynamic sensitivity equations in incremental iterative form. As shown, this particular form of the sensitivity equation makes large-scale gradient-based aerodynamic optimization possible by taking advantage of memory-efficient methods to construct exact Jacobian matrix-vector products. Various surface parameterization techniques have been employed in the current study to control the shape of the design surface. Once this surface has been deformed, the interior volume of the unstructured grid is adapted by considering the mesh as a system of interconnected tension springs. Grid sensitivities are obtained by differentiating the surface parameterization and the grid adaptation algorithms with ADIFOR, an advanced automatic-differentiation software tool. To demonstrate the ability of this procedure to analyze and design complex configurations of practical interest, sensitivity analysis and shape optimization have been performed for several two- and three-dimensional cases. In two dimensions, an initially symmetric NACA-0012 airfoil and a high-lift multielement airfoil were examined. For the three-dimensional configurations, an initially rectangular wing with uniform NACA-0012 cross-sections was optimized; in addition, a complete Boeing 747-200 aircraft was studied. Furthermore, the current study also examines the effect of inconsistency in the order of spatial accuracy between the nonlinear fluid and linear shape sensitivity equations. The second step was to develop a computationally efficient, high-fidelity, integrated static aeroelastic analysis procedure. To accomplish this, a structural analysis code was coupled with the aforementioned unstructured grid aerodynamic analysis solver. The use of an unstructured grid scheme for the aerodynamic analysis enhances the interaction compatibility with the wing structure. The structural analysis utilizes finite elements to model the wing so that accurate structural deflections may be obtained. In the current work, parameters have been introduced to control the interaction of the computational fluid dynamics and structural analyses; these control parameters permit extremely efficient static aeroelastic computations.
To demonstrate and evaluate this procedure, static aeroelastic analysis results for a flexible wing in low subsonic, high subsonic (subcritical), transonic (supercritical), and supersonic flow conditions are presented.
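
    The tension-spring mesh adaptation described above can be sketched in one dimension: interior nodes of a chain of equal-stiffness springs relax toward equilibrium after the design surface moves. A real implementation relaxes the same system over the unstructured grid's edge connectivity in three dimensions; the node count and displacement below are arbitrary.

    import numpy as np

    x = np.linspace(0.0, 1.0, 11)            # grid nodes between two boundaries
    x[-1] += 0.05                            # design surface displaced outward
    for _ in range(200):                     # Jacobi relaxation of the springs
        x[1:-1] = 0.5 * (x[:-2] + x[2:])     # each node moves to neighbor average
    print(np.round(x, 3))                    # displacement diffuses into the mesh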

  11. A Comparison of the Capability of Sensitivity Level 3 and Sensitivity Level 4 Fluorescent Penetrants to Detect Fatigue Cracks in Aluminum

    NASA Technical Reports Server (NTRS)

    Parker, Bradford, H.

    2009-01-01

    Historically, both sensitivity level 3 and sensitivity level 4 fluorescent penetrants have been used to perform NASA Standard Level inspections of aerospace hardware. In April 2008, NASA-STD-5009 established a requirement that only sensitivity level 4 penetrants were acceptable for inspections of NASA hardware. Having NASA contractors change existing processes or perform demonstration tests to certify sensitivity level 3 penetrants posed a potentially huge cost to the Agency. This study was conducted to directly compare the probability of detection of sensitivity level 3 and level 4 penetrants using both Method A and Method D inspection processes. The study results strongly support the conclusion that sensitivity level 3 penetrants are acceptable for NASA Standard Level inspections.
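
    Studies of this kind typically summarize a comparison with a probability-of-detection (POD) curve fit to hit/miss data, often quoting the crack size detected with 90% probability (a90). The sketch below fits such a logistic POD model against log crack size on fabricated data; it illustrates the standard method, not this study's actual analysis.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)
    size_mm = rng.uniform(0.5, 6.0, 200)                       # crack lengths
    p_true = 1 / (1 + np.exp(-(size_mm - 2.0) * 2.5))          # hidden true POD
    hit = (rng.random(200) < p_true).astype(int)               # hit/miss outcomes

    pod = LogisticRegression().fit(np.log(size_mm)[:, None], hit)
    b0, b1 = pod.intercept_[0], pod.coef_[0][0]
    a90 = np.exp((np.log(9) - b0) / b1)                        # size at 90% POD
    print(f"a90 ≈ {a90:.2f} mm")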

  12. Ligand-Controlled Integration of Zn and Tb by Photoactive Terpyridyl-Functionalized Tricarboxylates as Highly Selective and Sensitive Sensors for Nitrofurans.

    PubMed

    Zhou, Zhi-Hang; Dong, Wen-Wen; Wu, Ya-Pan; Zhao, Jun; Li, Dong-Sheng; Wu, Tao; Bu, Xian-Hui

    2018-04-02

    The integration of terpyridyl and tricarboxylate functionality in a novel ligand allows the concerted 3:1 stoichiometric assembly of size- and charge-complementary Zn²⁺/Tb³⁺ ions into a water-stable 3D luminescent framework (CTGU-8) capable of highly selective, sensitive, and recyclable detection of nitrofurans.

  13. Methods for increasing the sensitivity of gamma-ray imagers

    DOEpatents

    Mihailescu, Lucian [Pleasanton, CA; Vetter, Kai M [Alameda, CA; Chivers, Daniel H [Fremont, CA

    2012-02-07

    Methods are presented that increase the position resolution and granularity of double sided segmented semiconductor detectors. These methods increase the imaging resolution capability of such detectors, either used as Compton cameras, or as position sensitive radiation detectors in imagers such as SPECT, PET, coded apertures, multi-pinhole imagers, or other spatial or temporal modulated imagers.

  14. Systems for increasing the sensitivity of gamma-ray imagers

    DOEpatents

    Mihailescu, Lucian; Vetter, Kai M.; Chivers, Daniel H.

    2012-12-11

    Systems that increase the position resolution and granularity of double sided segmented semiconductor detectors are provided. These systems increase the imaging resolution capability of such detectors, either used as Compton cameras, or as position sensitive radiation detectors in imagers such as SPECT, PET, coded apertures, multi-pinhole imagers, or other spatial or temporal modulated imagers.

  15. How Are Task Reflexivity and Intercultural Sensitivity Related to the Academic Performance of MBA Students?

    ERIC Educational Resources Information Center

    Lyubovnikova, Joanne; Napiersky, Uwe; Vlachopoulos, Panos

    2015-01-01

    Higher education in business school environments is increasingly focused on how to best equip students with the skills necessary for leadership in the global workplace. This paper examines the impact of two particularly important cognitive capabilities--task reflexivity and intercultural sensitivity, on academic performance in an MBA programme. It…

  16. Enhanced performance of PbS-sensitized solar cells via controlled successive ionic-layer adsorption and reaction.

    PubMed

    Abbas, Muhammad A; Basit, Muhammad A; Park, Tae Joo; Bang, Jin Ho

    2015-04-21

    Despite the potential of PbS quantum dots (QDs) as sensitizers for quantum-dot-sensitized solar cells (QDSSCs), achieving a high photocurrent density over 30 mA cm⁻² remains a challenging task in PbS-sensitized solar cells. In contrast to previous attempts, where Hg²⁺ doping or multi-step post-treatment was necessary, we achieve a high photocurrent exceeding 30 mA cm⁻² simply by manipulating the successive ionic layer adsorption and reaction (SILAR) method. We show that controlling the temperature at which SILAR is performed is critical to obtaining a higher and more uniform coverage of PbS QDs over a mesoporous TiO2 film. The deposition of a CdS inter-layer between TiO2 and PbS is found to be an effective means of ensuring high photocurrent and stability. Not only does this modification improve the light absorption capability of the photoanode, but it also has a significant effect on charge recombination and electron injection efficiency at the PbS/TiO2 interface, according to our in-depth study using electrochemical impedance spectroscopy (EIS). The implication of subtle changes in the interfacial events via modified SILAR conditions for PbS-sensitized solar cells is discussed.

  17. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    NASA Astrophysics Data System (ADS)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

    The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.

  18. Solvent jet desorption capillary photoionization-mass spectrometry.

    PubMed

    Haapala, Markus; Teppo, Jaakko; Ollikainen, Elisa; Kiiski, Iiro; Vaikkinen, Anu; Kauppila, Tiina J; Kostiainen, Risto

    2015-03-17

    A new ambient mass spectrometry method, solvent jet desorption capillary photoionization (DCPI), is described. The method uses a solvent jet generated by a coaxial nebulizer operated at ambient conditions with nitrogen as nebulizer gas. The solvent jet is directed onto a sample surface, from which analytes are extracted into the solvent and ejected from the surface in secondary droplets formed in collisions between the jet and the sample surface. The secondary droplets are directed into the heated capillary photoionization (CPI) device, where the droplets are vaporized and the gaseous analytes are ionized by 10 eV photons generated by a vacuum ultraviolet (VUV) krypton discharge lamp. As the CPI device is directly connected to the extended capillary inlet of the MS, high ion transfer efficiency to the vacuum of MS is achieved. The solvent jet DCPI provides several advantages: high sensitivity for nonpolar and polar compounds with limit of detection down to low fmol levels, capability of analyzing small and large molecules, and good spatial resolution (250 μm). Two ionization mechanisms are involved in DCPI: atmospheric pressure photoionization, capable of ionizing polar and nonpolar compounds, and solvent assisted inlet ionization capable of ionizing larger molecules like peptides. The feasibility of DCPI was successfully tested in the analysis of polar and nonpolar compounds in sage leaves and chili pepper.

  19. System technology analysis of aeroassisted orbital transfer vehicles: Moderate lift/drag (0.75-1.5),volume 1B, part 1, study results

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Significant performance benefits can be realized via aerodynamic braking and/or aerodynamic maneuvering on return from higher-altitude orbits to low Earth orbit. This approach substantially reduces the mission propellant requirements by using the aerodynamic drag, D, to brake the vehicle to near-circular velocity and the aerodynamic lift, L, to null out accumulated errors as well as change the orbital inclination to that required for rendezvous with the Space Shuttle Orbiter. A study was completed in which broad concept evaluations were performed and the technology requirements and sensitivities for aeroassisted Orbital Transfer Vehicles (AOTVs) over a range of vehicle hypersonic L/D from 0.75 to 1.5 were systematically identified and assessed. The AOTV is capable of evolving from an initial delivery-only system to one eventually capable of supporting manned round-trip missions to geosynchronous orbit. Concept screenings were conducted on numerous configurations spanning the L/D = 0.75 to 1.5 range, and several with attractive features were identified. Initial payload capability was evaluated for a baseline of delivery to GEO, six-hour polar, and Molniya orbits with return and recovery of the AOTV at LEO. Evolutionary payload requirements that were assessed include a GEO servicing mission and a manned GEO mission.

  20. A ride in the time machine: information management capabilities health departments will need.

    PubMed

    Foldy, Seth; Grannis, Shaun; Ross, David; Smith, Torney

    2014-09-01

    We have proposed needed information management capabilities for future US health departments predicated on trends in health care reform and health information technology. Regardless of whether health departments provide direct clinical services (and many will), they will manage unprecedented quantities of sensitive information for the public health core functions of assurance and assessment, including population-level health surveillance and metrics. Absent improved capabilities, health departments risk vestigial status, with consequences for vulnerable populations. Developments in electronic health records, interoperability and information exchange, public information sharing, decision support, and cloud technologies can support information management if health departments have appropriate capabilities. The need for national engagement in and consensus on these capabilities and their importance to health department sustainability make them appropriate for consideration in the context of accreditation.
