Sample records for sensitivity analysis technique

  1. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
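The verification pattern described above, checking an analytically derived sensitivity against a finite difference reference, can be sketched in a few lines. This is a minimal illustration on a stand-in scalar response, not the report's discretized flow equations:

```python
import numpy as np

# Stand-in "analysis": a drag-like response as a function of one design variable.
# In the report this would be the discretized parabolized Navier-Stokes solution;
# a closed-form surrogate keeps the check self-contained.
def response(x):
    return x**3 + 2.0 * np.sin(x)

def analytic_sensitivity(x):
    # Direct differentiation of the governing "equation" above.
    return 3.0 * x**2 + 2.0 * np.cos(x)

def finite_difference_sensitivity(x, h=1e-6):
    # Central difference used as the accuracy reference.
    return (response(x + h) - response(x - h)) / (2.0 * h)

x0 = 1.3
print(f"analytic: {analytic_sensitivity(x0):.8f}  "
      f"finite difference: {finite_difference_sensitivity(x0):.8f}")
```

The computational saving of the semi-analytical route comes from evaluating the derivative once rather than re-running the full analysis at perturbed designs.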

  2. Application of a sensitivity analysis technique to high-order digital flight control systems

    NASA Technical Reports Server (NTRS)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show that linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to implement the singular-value sensitivity analysis technique, so computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. SVA is fully in the public domain, running on the NASA/Dryden Elxsi computer.
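The two quantities the technique is built on can be sketched numerically: the minimum singular value of the return difference matrix I + L, and its gradient with respect to a controller parameter. The 2x2 loop-gain matrix below is a made-up stand-in (the paper assembles L from the aircraft and controller state-space matrices at each frequency), and the gradient here is a finite difference rather than the paper's analytical gradient equations:

```python
import numpy as np

def return_difference_min_sv(k):
    """Minimum singular value of I + L for a toy loop-gain matrix L(k)."""
    L = np.array([[2.0 * k, 0.5],
                  [0.3,     k]])
    return np.linalg.svd(np.eye(2) + L, compute_uv=False)[-1]

def sv_gradient(k, h=1e-6):
    # Singular-value gradient with respect to the controller gain k,
    # approximated by a central difference for illustration.
    return (return_difference_min_sv(k + h)
            - return_difference_min_sv(k - h)) / (2 * h)

k0 = 1.0
print(f"sigma_min(I+L) = {return_difference_min_sv(k0):.6f}, "
      f"d sigma_min / dk = {sv_gradient(k0):.6f}")
```

A small minimum singular value flags low relative stability; a large gradient magnitude flags a parameter to which that stability margin is sensitive.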

  3. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. This analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system with respect to variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, relative-stability criteria based on the concept of singular values were explored.

  4. Computational aspects of sensitivity calculations in linear transient structural analysis. Ph.D. Thesis - Virginia Polytechnic Inst. and State Univ.

    NASA Technical Reports Server (NTRS)

    Greene, William H.

    1990-01-01

    A study was performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal of the study was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of the number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first type of technique is an overall finite difference method where the analysis is repeated for perturbed designs. The second type of technique is termed semi-analytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models. In several cases this fixed mode approach resulted in very poor approximations of the stress sensitivities. Almost all of the original modes were required for accurate sensitivities; for small numbers of modes, the accuracy was extremely poor. To overcome this poor accuracy, two semi-analytical techniques were developed. The first technique accounts for the change in eigenvectors through approximate eigenvector derivatives. The second technique applies the mode acceleration method of transient analysis to the sensitivity calculations. Both result in accurate values of the stress sensitivities with a small number of modes and much lower computational costs than if the vibration modes were recalculated and then used in an overall finite difference method.

  5. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
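The Monte Carlo weight-perturbation step described above can be sketched on a hypothetical weighted-sum (AHP-style) suitability score. All numbers here are made up (three criteria, assumed weights, a +/-20% perturbation range); the point is only the mechanics of producing a per-cell, spatially explicit uncertainty surface:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical standardized criteria values for 1000 map cells
# (columns stand in for criteria such as slope, lithology, land cover).
criteria = rng.uniform(size=(1000, 3))
base_weights = np.array([0.5, 0.3, 0.2])   # assumed AHP-derived weights

def susceptibility(weights):
    return criteria @ weights

# Monte Carlo: perturb the weights, renormalize, record the output per cell.
n_runs = 2000
scores = np.empty((n_runs, criteria.shape[0]))
for i in range(n_runs):
    w = base_weights * rng.uniform(0.8, 1.2, size=3)  # +/-20% perturbation
    scores[i] = susceptibility(w / w.sum())

cell_uncertainty = scores.std(axis=0)   # spatially explicit uncertainty surface
print(f"mean per-cell std: {cell_uncertainty.mean():.4f}")
```

Mapping `cell_uncertainty` back onto the raster grid yields the kind of uncertainty map the paper validates against the landslide inventory.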

  6. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  7. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.

  8. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987

  9. A Sensitivity Analysis of Circular Error Probable Approximation Techniques

    DTIC Science & Technology

    1992-03-01

    SENSITIVITY ANALYSIS OF CIRCULAR ERROR PROBABLE APPROXIMATION TECHNIQUES. Thesis presented to the Faculty of the School of Engineering of the Air Force...programming skills. Major Paul Auclair patiently advised me in this endeavor, and Major Andy Howell added numerous insightful contributions. I thank my...techniques. The two most accurate techniques require numerical integration and can take several hours to run on a personal computer [2:1-2,4-6]. Some
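For context on what is being approximated: in the circular case (equal, uncorrelated standard deviations), the circular error probable has the closed form CEP = sigma * sqrt(2 ln 2), roughly 1.1774 sigma, which a Monte Carlo median miss distance can verify without any numerical integration. This sketch covers only that special case, not the elliptical cases that make approximation techniques necessary:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 10.0

# Monte Carlo reference: median miss distance of circular bivariate
# normal impact points about the aim point.
x = rng.normal(0.0, sigma, 200_000)
y = rng.normal(0.0, sigma, 200_000)
cep_mc = np.median(np.hypot(x, y))

# Closed-form result for the circular case: CEP = sigma * sqrt(2 ln 2).
cep_exact = sigma * np.sqrt(2.0 * np.log(2.0))

print(f"Monte Carlo CEP: {cep_mc:.3f}, closed form: {cep_exact:.3f}")
```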

  10. Eigenvalue Contribution Estimator for Sensitivity Calculations with TSUNAMI-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Williams, Mark L

    2007-01-01

    Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
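The mathematical core of adjoint-based eigenvalue sensitivity is first-order perturbation theory: for A(t) = A0 + t*dA with simple eigenvalue lambda, d(lambda)/dt = y(dA)x / (yx), where x and y are the right and left (adjoint) eigenvectors. The sketch below demonstrates this on a generic positive matrix (so the dominant eigenvalue is real and simple), not the transport k-eigenvalue problem TSUNAMI actually solves:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A0 = rng.uniform(0.1, 1.0, size=(n, n))   # positive matrix: real dominant eigenvalue
dA = rng.normal(size=(n, n))              # direction of the parameter perturbation

def dominant_eig(A):
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    return vals[i], vecs[:, i]

# Forward (right) and adjoint (left) eigenvectors of the unperturbed matrix.
lam, x = dominant_eig(A0)
_, y = dominant_eig(A0.T)                 # left eigenvector of A0

# Adjoint-weighted first-order sensitivity: d(lambda)/dt for A(t) = A0 + t*dA.
sens_adjoint = (y @ dA @ x) / (y @ x)

# Reference: central finite difference on the dominant eigenvalue.
h = 1e-6
sens_fd = (dominant_eig(A0 + h * dA)[0] - dominant_eig(A0 - h * dA)[0]) / (2 * h)

print(f"adjoint: {sens_adjoint.real:.8f}  finite difference: {sens_fd.real:.8f}")
```

The appeal for routine use is that one forward and one adjoint solve yield sensitivities to every entry of A at once, where finite differencing would need a re-solve per parameter.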

  11. Estimating Sobol Sensitivity Indices Using Correlations

    EPA Science Inventory

    Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on...
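A variance-based Sobol first-order index can be estimated with the standard pick-freeze (Sobol/Saltelli) sampling scheme. The sketch below uses a linear test model chosen so the exact answer is known (Y = 2*X1 + X2 with uniform inputs gives S1 = 0.8); it illustrates the estimator, not the EPA model from the record:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

def model(x1, x2):
    # Test model with a known answer: Var(Y) = (4 + 1)/12, so S1 = 4/5 = 0.8.
    return 2.0 * x1 + x2

# Two independent sample matrices ("pick-freeze" scheme).
A = rng.uniform(size=(N, 2))
B = rng.uniform(size=(N, 2))
AB1 = B.copy()
AB1[:, 0] = A[:, 0]              # B with the X1 column frozen from A

yA = model(A[:, 0], A[:, 1])
yAB1 = model(AB1[:, 0], AB1[:, 1])

# First-order index of X1: S1 = Cov(yA, yAB1) / Var(yA).
S1 = np.cov(yA, yAB1)[0, 1] / yA.var(ddof=1)
print(f"estimated S1 = {S1:.3f} (exact 0.8)")
```

The same two-matrix scheme extends to total-order indices, which is how main and total contributions are separated in practice.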

  12. Characterization of emission microscopy and liquid crystal thermography in IC fault localization

    NASA Astrophysics Data System (ADS)

    Lau, C. K.; Sim, K. S.

    2013-05-01

    This paper characterizes two fault localization techniques - Emission Microscopy (EMMI) and Liquid Crystal Thermography (LCT) - using integrated circuit (IC) leakage failures. The majority of today's semiconductor failures do not reveal a clear visual defect on the die surface and therefore require fault localization tools to identify the fault location. Among the various fault localization tools, liquid crystal thermography and frontside emission microscopy are commonly used in most semiconductor failure analysis laboratories. A common misconception is that the two techniques are the same and that both detect hot spots in chips failing with shorts or leakage. As a result, analysts tend to use only LCT, since this technique involves a very simple test setup compared to EMMI. The omission of EMMI as the alternative technique in fault localization leads to incomplete analysis when LCT fails to localize any hot spot on a failing chip. Therefore, this research was established to characterize and compare both techniques in terms of their sensitivity in detecting the fault location in common semiconductor failures. A new method, the backside LCT technique, was also proposed as an alternative. The research observed that both techniques successfully detected the defect locations resulting from the leakage failures. LCT was observed to be more sensitive than EMMI in the frontside analysis approach. On the other hand, EMMI performed better in the backside analysis approach. LCT was more sensitive in localizing ESD defect locations and EMMI was more sensitive in detecting non-ESD defect locations. Backside LCT was proven to work as effectively as frontside LCT and is ready to serve as an alternative to backside EMMI. The research confirmed that LCT detects heat generation and EMMI detects photon emission (recombination radiation). The analysis results also suggest that the two techniques complement each other in IC fault localization. It is necessary for a failure analyst to use both techniques when one of them produces no result.

  13. New Uses for Sensitivity Analysis: How Different Movement Tasks Affect Limb Model Parameter Sensitivity

    NASA Technical Reports Server (NTRS)

    Winters, J. M.; Stark, L.

    1984-01-01

    Original results for a newly developed eighth-order nonlinear limb antagonistic muscle model of elbow flexion and extension are presented. A wide variety of sensitivity analysis techniques are used and a systematic protocol is established that shows how the different methods can be used efficiently to complement one another for maximum insight into model sensitivity. It is explicitly shown how the sensitivity of output behaviors to model parameters is a function of the controller input sequence, i.e., of the movement task. When the task is changed (for instance, from an input sequence that results in the usual fast movement task to a slower movement that may also involve external loading, etc.) the set of parameters with high sensitivity will in general also change. Such task-specific use of sensitivity analysis techniques identifies the set of parameters most important for a given task, and even suggests task-specific model reduction possibilities.
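The task-dependence idea can be demonstrated on a much simpler system than the muscle model: the steady-state amplitude of a forced mass-spring-damper. Normalized sensitivities to m, c, and k, computed by finite differences, rank differently depending on the forcing frequency (the "task"); the parameter values below are arbitrary:

```python
import numpy as np

def amplitude(m, c, k, w):
    # Steady-state amplitude of m*x'' + c*x' + k*x = cos(w*t).
    return 1.0 / np.sqrt((k - m * w**2) ** 2 + (c * w) ** 2)

def normalized_sensitivity(p_name, params, w, h=1e-6):
    # |dA/dp * p/A| via central differences: relative output change
    # per relative parameter change.
    up, dn = dict(params), dict(params)
    up[p_name] *= 1 + h
    dn[p_name] *= 1 - h
    A0 = amplitude(w=w, **params)
    dA = (amplitude(w=w, **up) - amplitude(w=w, **dn)) / (2 * h * params[p_name])
    return abs(dA * params[p_name] / A0)

params = {"m": 1.0, "c": 0.2, "k": 1.0}
for w, task in [(0.1, "slow, quasi-static task"), (1.0, "task at resonance")]:
    sens = {p: normalized_sensitivity(p, params, w) for p in params}
    print(task, {p: round(s, 3) for p, s in sens.items()})
```

Stiffness dominates the slow task while damping dominates at resonance, so a model reduction justified for one task can be invalid for another, which is the record's point.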

  14. Computational Aspects of Sensitivity Calculations in Linear Transient Structural Analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Greene, William H.

    1989-01-01

    A study has been performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of the number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first type of technique is an overall finite difference method where the analysis is repeated for perturbed designs. The second type of technique is termed semianalytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models.

  15. A Survey of Shape Parameterization Techniques

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.

  16. Sensitivity analysis for parametric generalized implicit quasi-variational-like inclusions involving P-η-accretive mappings

    NASA Astrophysics Data System (ADS)

    Kazmi, K. R.; Khan, F. A.

    2008-01-01

    In this paper, using the proximal-point mapping technique of P-η-accretive mappings and the property of the fixed-point set of set-valued contractive mappings, we study the behavior and sensitivity analysis of the solution set of a parametric generalized implicit quasi-variational-like inclusion involving a P-η-accretive mapping in a real uniformly smooth Banach space. Further, under suitable conditions, we discuss the Lipschitz continuity of the solution set with respect to the parameter. The technique and results presented in this paper can be viewed as extensions of the techniques and corresponding results given in [R.P. Agarwal, Y.-J. Cho, N.-J. Huang, Sensitivity analysis for strongly nonlinear quasi-variational inclusions, Appl. Math. Lett. 13 (2002) 19-24; S. Dafermos, Sensitivity analysis in variational inequalities, Math. Oper. Res. 13 (1988) 421-434; X.-P. Ding, Sensitivity analysis for generalized nonlinear implicit quasi-variational inclusions, Appl. Math. Lett. 17 (2) (2004) 225-235; X.-P. Ding, Parametric completely generalized mixed implicit quasi-variational inclusions involving h-maximal monotone mappings, J. Comput. Appl. Math. 182 (2) (2005) 252-269; X.-P. Ding, C.L. Luo, On parametric generalized quasi-variational inequalities, J. Optim. Theory Appl. 100 (1999) 195-205; Z. Liu, L. Debnath, S.M. Kang, J.S. Ume, Sensitivity analysis for parametric completely generalized nonlinear implicit quasi-variational inclusions, J. Math. Anal. Appl. 277 (1) (2003) 142-154; R.N. Mukherjee, H.L. Verma, Sensitivity analysis of generalized variational inequalities, J. Math. Anal. Appl. 167 (1992) 299-304; M.A. Noor, Sensitivity analysis framework for general quasi-variational inclusions, Comput. Math. Appl. 44 (2002) 1175-1181; M.A. Noor, Sensitivity analysis for quasivariational inclusions, J. Math. Anal. Appl. 236 (1999) 290-299; J.Y. Park, J.U. Jeong, Parametric generalized mixed variational inequalities, Appl. Math. Lett. 17 (2004) 43-48].

  17. Error analysis applied to several inversion techniques used for the retrieval of middle atmospheric constituents from limb-scanning MM-wave spectroscopic measurements

    NASA Technical Reports Server (NTRS)

    Puliafito, E.; Bevilacqua, R.; Olivero, J.; Degenhardt, W.

    1992-01-01

    The formal retrieval error analysis of Rodgers (1990) allows the quantitative determination of such retrieval properties as measurement error sensitivity, resolution, and inversion bias. This technique was applied to five numerical inversion techniques and two nonlinear iterative techniques used for the retrieval of middle atmospheric constituent concentrations from limb-scanning millimeter-wave spectroscopic measurements. It is found that the iterative methods have better vertical resolution, but are slightly more sensitive to measurement error than constrained matrix methods. The iterative methods converge to the exact solution, whereas two of the matrix methods under consideration have an explicit constraint, which introduces sensitivity of the solution to the a priori profile. Tradeoffs of these retrieval characteristics are presented.
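In the Rodgers-style linear framework, a retrieval with gain matrix G has averaging kernel A = GK; the rows of A characterize vertical resolution, I - A the sensitivity to the a priori profile, and G the mapping of measurement noise. The sketch below uses a made-up smooth Jacobian and a simple Tikhonov constraint G = (K^T K + lambda*I)^(-1) K^T as a stand-in for the constrained matrix methods examined in the paper:

```python
import numpy as np

n_levels, n_meas = 20, 12

# Hypothetical forward-model Jacobian K mapping a constituent profile
# (n_levels) to limb-scan radiances (n_meas): overlapping, smooth
# weighting functions peaking at different altitudes.
z = np.linspace(0.0, 1.0, n_levels)
peaks = np.linspace(0.0, 1.0, n_meas)
K = np.exp(-((z[None, :] - peaks[:, None]) / 0.15) ** 2)

lam = 0.1                                                    # constraint strength
G = np.linalg.solve(K.T @ K + lam * np.eye(n_levels), K.T)   # gain matrix
A = G @ K                                                    # averaging kernel

# Diagnostics in the sense of Rodgers: rows of A ~ vertical resolution,
# (I - A) ~ a priori sensitivity, trace(A) ~ degrees of freedom for signal.
dof = np.trace(A)
apriori_sensitivity = np.eye(n_levels) - A
print(f"degrees of freedom for signal: {dof:.2f}")
```

Tightening the constraint (larger lambda) lowers trace(A), i.e. it trades resolution for noise suppression and pushes the retrieval toward the a priori, which is exactly the tradeoff the record discusses.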

  18. Sensitivity analysis for large-scale problems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  19. Evaluation and recommendation of sensitivity analysis methods for application to Stochastic Human Exposure and Dose Simulation models.

    PubMed

    Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu

    2006-11-01

    Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include Pearson and Spearman correlation, sample and rank regression, analysis of variance, the Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that the sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to the sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
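The screening step can be sketched with the two cheapest methods in the list, Pearson and Spearman correlation, on a toy exposure-like model that is strongly nonlinear but monotonic in its dominant input (the model and coefficients are invented; Spearman is computed numpy-only via rank transformation):

```python
import numpy as np

rng = np.random.default_rng(5)
N = 10_000

# Toy exposure-like model: strongly nonlinear in x1, weakly linear in x2.
x1 = rng.uniform(size=N)
x2 = rng.uniform(size=N)
y = np.exp(4.0 * x1) + 0.5 * x2

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

def spearman(a, b):
    # Rank-transform, then Pearson on the ranks (no SciPy dependency).
    rank = lambda v: np.argsort(np.argsort(v))
    return pearson(rank(a), rank(b))

for name, f in [("Pearson", pearson), ("Spearman", spearman)]:
    print(name, {"x1": round(f(x1, y), 3), "x2": round(f(x2, y), 3)})
```

Both measures correctly screen x2 as unimportant, while the rank-based Spearman coefficient recovers the x1 relationship more fully than Pearson does under the nonlinearity, which is why rank methods are preferred for monotonic nonlinear models.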

  20. Diagnostic features of Alzheimer's disease extracted from PET sinograms

    NASA Astrophysics Data System (ADS)

    Sayeed, A.; Petrou, M.; Spyrou, N.; Kadyrov, A.; Spinks, T.

    2002-01-01

    Texture analysis of positron emission tomography (PET) images of the brain is a very difficult task, due to the poor signal to noise ratio. As a consequence, very few techniques can be implemented successfully. We use a new global analysis technique known as the Trace transform triple features. This technique can be applied directly to the raw sinograms to distinguish patients with Alzheimer's disease (AD) from normal volunteers. FDG-PET images of 18 AD patients and 10 normal controls obtained from the same CTI ECAT-953 scanner were used in this study. The Trace transform triple feature technique was used to extract features that were invariant to scaling, translation and rotation, referred to as invariant features, as well as features that were sensitive to rotation but invariant to scaling and translation, referred to as sensitive features in this study. The features were used to classify the groups using discriminant function analysis. Cross-validation tests using stepwise discriminant function analysis showed that combining both sensitive and invariant features produced the best results, when compared with the clinical diagnosis. Selecting the five best features produces an overall accuracy of 93% with sensitivity of 94% and specificity of 90%. This is comparable with the classification accuracy achieved by Kippenhan et al. (1992), using regional metabolic activity.

  1. Sensitive and inexpensive digital DNA analysis by microfluidic enrichment of rolling circle amplified single-molecules

    PubMed Central

    Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A.M.

    2017-01-01

    Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, e.g. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. PMID:28077562

  2. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of the effects of thrust subsystem hardware tolerances on mission performance is included, together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and the graphical techniques used in generating the data for Part 1, including a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  3. Evaluation of bone marrow specimens with acute myelogenous leukemia for CD34, CD15, CD117, and myeloperoxidase.

    PubMed

    Dunphy, C H; Polski, J M; Evans, H L; Gardner, L J

    2001-08-01

    Immunophenotyping of bone marrow (BM) specimens with acute myelogenous leukemia (AML) may be performed by flow cytometric (FC) or immunohistochemical (IH) techniques. Some markers (CD34, CD15, and CD117) are available for both techniques. Myeloperoxidase (MPO) analysis may be performed by enzyme cytochemical (EC) or IH techniques. To determine the reliability of these markers and MPO by these techniques, we designed a study to compare the results of analyses of these markers and MPO by FC (CD34, CD15, and CD117), EC (MPO), and IH (CD34, CD15, CD117, and MPO) techniques. Twenty-nine AMLs formed the basis of the study. These AMLs all had been immunophenotyped previously by FC analysis; 27 also had had EC analysis performed. Of the AMLs, 29 had BM core biopsies and 26 had BM clots that could be evaluated. The paraffin blocks of the 29 BM core biopsies and 26 BM clots were stained for CD34, CD117, MPO, and CD15. These results were compared with results by FC analysis (CD34, CD15, and CD117) and EC analysis (MPO). Immunodetection of CD34 expression in AML had a similar sensitivity by FC and IH techniques. Immunodetection of CD15 and CD117 had a higher sensitivity by FC analysis than by IH analysis. Detection of MPO by IH analysis was more sensitive than by EC analysis. There was no correlation of French-American-British (FAB) subtype of AML with CD34 or CD117 expression. Expression of CD15 was associated with AMLs with a monocytic component. Myeloperoxidase reactivity by IH analysis was observed in AMLs originally FAB subtyped as M0. CD34 can be detected equally well by FC and IH techniques. CD15 and CD117 are better detected by FC analysis, and MPO is better detected by IH analysis.

  4. Sensitivity analysis of a wing aeroelastic response

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.; Eldred, Lloyd B.; Barthelemy, Jean-Francois M.

    1991-01-01

    A variation of Sobieski's Global Sensitivity Equations (GSE) approach is implemented to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model. The formulation is quite general and accepts any aerodynamics and structural analysis capability. An interface code is written to convert one analysis's output to the other's input, and vice versa. Local sensitivity derivatives are calculated by either analytic methods or finite difference techniques. A program to combine the local sensitivities, such as the sensitivity of the stiffness matrix or the aerodynamic kernel matrix, into global sensitivity derivatives is developed. The aerodynamic analysis package FAST, using a lifting surface theory, and a structural package, ELAPS, implementing Giles' equivalent plate model, are used.

  5. Chemical fingerprinting of Arabidopsis using Fourier transform infrared (FT-IR) spectroscopic approaches.

    PubMed

    Gorzsás, András; Sundberg, Björn

    2014-01-01

    Fourier transform infrared (FT-IR) spectroscopy is a fast, sensitive, inexpensive, and nondestructive technique for chemical profiling of plant materials. In this chapter we discuss the instrumental setup, the basic principles of analysis, and the possibilities for and limitations of obtaining qualitative and semiquantitative information by FT-IR spectroscopy. We provide detailed protocols for four fully customizable techniques: (1) Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS): a sensitive and high-throughput technique for powders; (2) attenuated total reflectance (ATR) spectroscopy: a technique that requires no sample preparation and can be used for solid samples as well as for cell cultures; (3) microspectroscopy using a single element (SE) detector: a technique used for analyzing sections at low spatial resolution; and (4) microspectroscopy using a focal plane array (FPA) detector: a technique for rapid chemical profiling of plant sections at cellular resolution. Sample preparation, measurement, and data analysis steps are listed for each of the techniques to help the user collect the best quality spectra and prepare them for subsequent multivariate analysis.

  6. The art of spacecraft design: A multidisciplinary challenge

    NASA Technical Reports Server (NTRS)

    Abdi, F.; Ide, H.; Levine, M.; Austel, L.

    1989-01-01

    Actual design turn-around time has become shorter due to the use of optimization techniques introduced into the design process. What, how, and when to use these optimization techniques may be the key factor for future aircraft engineering operations. Another important aspect of this approach is that complex physical phenomena can be modeled by a simple mathematical equation. The new powerful multilevel methodology reduces time-consuming analysis significantly while maintaining the coupling effects. This simultaneous analysis method stems from the implicit function theorem and system sensitivity derivatives of input variables. Use of Taylor series expansion and a finite-differencing technique for sensitivity derivatives in each discipline makes this approach unique for screening dominant variables from nondominant variables. In this study, current Computational Fluid Dynamics (CFD) aerodynamic and sensitivity derivative/optimization techniques are applied to a simple cone-type forebody of a high-speed vehicle configuration to understand basic aerodynamic/structure interaction in a hypersonic flight condition.

  7. Single-molecule detection: applications to ultrasensitive biochemical analysis

    NASA Astrophysics Data System (ADS)

    Castro, Alonso; Shera, E. Brooks

    1995-06-01

    Recent developments in laser-based detection of fluorescent molecules have made possible the implementation of very sensitive techniques for biochemical analysis. We present and discuss our experiments on the applications of our recently developed technique of single-molecule detection to the analysis of molecules of biological interest. These newly developed methods are capable of detecting and identifying biomolecules at the single-molecule level of sensitivity. In one case, identification is based on measuring fluorescence brightness from single molecules. In another, molecules are classified by determining their electrophoretic velocities.

  8. Parameter sensitivity analysis for pesticide impacts on honeybee colonies

    EPA Science Inventory

    We employ Monte Carlo simulation and linear sensitivity analysis techniques to describe the dynamics of a bee exposure model, VarroaPop. Daily simulations are performed that simulate hive population trajectories, taking into account queen strength, foraging success, weather, colo...
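
    The Monte Carlo plus linear sensitivity workflow described above can be sketched in a few lines. VarroaPop itself is not reproduced here, so a toy hive-population function stands in for the model; the input names, ranges, and coefficients are purely illustrative. Linear sensitivity is expressed as standardized regression coefficients (SRCs) fitted to the Monte Carlo sample:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_hive_model(queen_strength, foraging_success, mite_load):
    # Hypothetical stand-in for a daily hive-population simulator:
    # population grows with queen strength and foraging, shrinks with mites.
    return 1000.0 * queen_strength * (0.5 + foraging_success) - 800.0 * mite_load

# Monte Carlo sampling of the uncertain inputs
n = 10_000
X = np.column_stack([
    rng.uniform(0.5, 1.5, n),   # queen strength
    rng.uniform(0.0, 1.0, n),   # foraging success
    rng.uniform(0.0, 0.5, n),   # mite load
])
y = toy_hive_model(X[:, 0], X[:, 1], X[:, 2])

# Linear sensitivity: standardized regression coefficients (SRCs)
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, s in zip(["queen", "foraging", "mites"], src):
    print(f"{name}: SRC = {s:+.2f}")
```

    With independent inputs, each SRC approximates the correlation between an input and the output, so their relative magnitudes rank the drivers of hive-population variability.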

  9. Sobol’ sensitivity analysis for stressor impacts on honeybee colonies

    EPA Science Inventory

    We employ Monte Carlo simulation and nonlinear sensitivity analysis techniques to describe the dynamics of a bee exposure model, VarroaPop. Daily simulations are performed of hive population trajectories, taking into account queen strength, foraging success, mite impacts, weather...
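
    A minimal sketch of the variance-based (Sobol') analysis referenced above, using the pick-freeze estimator of Saltelli (2010) on a toy two-input model in place of VarroaPop; the model and sample size are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Toy stand-in for the bee exposure model: one weak and one strong input.
    return x[:, 0] + 2.0 * x[:, 1]

n, k = 20_000, 2
A = rng.uniform(size=(n, k))
B = rng.uniform(size=(n, k))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

S1 = []
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # "pick-freeze": replace column i only
    yABi = model(ABi)
    # Saltelli (2010) estimator of the first-order Sobol' index
    S1.append(np.mean(yB * (yABi - yA)) / var_y)

print("first-order Sobol' indices:", np.round(S1, 2))
```

    For this linear test model the indices are known analytically (0.2 and 0.8), which makes the estimator easy to check before applying it to a real simulator.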

  10. Improving the analysis of slug tests

    USGS Publications Warehouse

    McElwee, C.D.

    2002-01-01

    This paper examines several techniques that have the potential to improve the quality of slug test analysis. These techniques are applicable in the range from low hydraulic conductivities with overdamped responses to high hydraulic conductivities with nonlinear oscillatory responses. Four techniques for improving slug test analysis will be discussed: use of an extended-capability nonlinear model, sensitivity analysis, correction for acceleration and velocity effects, and use of multiple slug tests. The four-parameter nonlinear slug test model used in this work is shown to allow accurate analysis of slug tests with widely differing character. The parameter β represents a correction to the water column length caused primarily by radius variations in the wellbore and is most useful in matching the oscillation frequency and amplitude. The water column velocity at slug initiation (V0) is an additional model parameter, which would ideally be zero but may not be due to the initiation mechanism. The remaining two model parameters are A (parameter for nonlinear effects) and K (hydraulic conductivity). Sensitivity analysis shows that in general β and V0 have the lowest sensitivity and K usually has the highest. However, for very high K values the sensitivity to A may surpass the sensitivity to K. Oscillatory slug tests involve higher accelerations and velocities of the water column; thus, the pressure transducer responses are affected by these factors and the model response must be corrected to allow maximum accuracy for the analysis. The performance of multiple slug tests will allow some statistical measure of the experimental accuracy and of the reliability of the resulting aquifer parameters. © 2002 Elsevier Science B.V. All rights reserved.

  11. Analysis of DNA Cytosine Methylation Patterns Using Methylation-Sensitive Amplification Polymorphism (MSAP).

    PubMed

    Guevara, María Ángeles; de María, Nuria; Sáez-Laguna, Enrique; Vélez, María Dolores; Cervera, María Teresa; Cabezas, José Antonio

    2017-01-01

    Different molecular techniques have been developed to study either the global level of methylated cytosines or methylation at specific gene sequences. One of them is the methylation-sensitive amplified polymorphism (MSAP) technique, a modification of amplified fragment length polymorphism (AFLP). It has been used to study methylation of anonymous CCGG sequences in different fungi, plants, and animal species. The main variation of this technique resides in the use of isoschizomers with different methylation sensitivity (such as HpaII and MspI) as the frequent-cutter restriction enzyme. For each sample, MSAP analysis is performed using both EcoRI/HpaII- and EcoRI/MspI-digested samples. A comparative analysis between EcoRI/HpaII and EcoRI/MspI fragment patterns allows the identification of two types of polymorphisms: (1) methylation-insensitive polymorphisms, which show common EcoRI/HpaII and EcoRI/MspI patterns but are detected as polymorphic amplified fragments among samples, and (2) methylation-sensitive polymorphisms, which are associated with amplified fragments that differ in their presence, absence, or intensity between the EcoRI/HpaII and EcoRI/MspI patterns. This chapter describes a detailed protocol of this technique and discusses the modifications that can be applied to adjust the technology to different species of interest.
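
    The comparative scoring logic of MSAP can be illustrated with a short sketch. The fragment names and presence/absence calls below are hypothetical; the classification rules follow the two polymorphism types defined above:

```python
# Hypothetical band-scoring data: for each fragment, one
# (EcoRI/HpaII, EcoRI/MspI) presence/absence pair per sample.
scores = {
    "frag_A": [(1, 1), (0, 0), (1, 1)],   # same in both digests, varies by sample
    "frag_B": [(1, 0), (1, 0), (1, 0)],   # HpaII and MspI patterns differ
    "frag_C": [(1, 1), (1, 1), (1, 1)],   # identical everywhere
}

def classify(bands):
    # Any disagreement between the two digests implicates methylation.
    if any(h != m for h, m in bands):
        return "methylation-sensitive polymorphism"
    # Same pattern in both digests, but variable across samples:
    if len({h for h, _ in bands}) > 1:
        return "methylation-insensitive polymorphism"
    return "monomorphic"

for frag, bands in scores.items():
    print(frag, "->", classify(bands))
```

    Real MSAP gels also score band intensity differences; extending the classifier to three-level calls would follow the same comparison between the two digest patterns.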

  12. Recent development of electrochemiluminescence sensors for food analysis.

    PubMed

    Hao, Nan; Wang, Kun

    2016-10-01

    Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited by the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassay, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, drug residues, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection of various food contaminants in complex matrices. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.

  13. Single-tube analysis of DNA methylation with silica superparamagnetic beads.

    PubMed

    Bailey, Vasudev J; Zhang, Yi; Keeley, Brian P; Yin, Chao; Pelosky, Kristen L; Brock, Malcolm; Baylin, Stephen B; Herman, James G; Wang, Tza-Huei

    2010-06-01

    DNA promoter methylation is a signature for the silencing of tumor suppressor genes. Most widely used methods to detect DNA methylation involve 3 separate, independent processes: DNA extraction, bisulfite conversion, and methylation detection via a PCR method, such as methylation-specific PCR (MSP). This method includes many disconnected steps with associated losses of material, potentially reducing the analytical sensitivity required for analysis of challenging clinical samples. Methylation on beads (MOB) is a new technique that integrates DNA extraction, bisulfite conversion, and PCR in a single tube via the use of silica superparamagnetic beads (SSBs) as a common DNA carrier for facilitating cell debris removal and buffer exchange throughout the entire process. In addition, PCR buffer is used to directly elute bisulfite-treated DNA from SSBs for subsequent target amplifications. The diagnostic sensitivity of MOB was evaluated by methylation analysis of the CDKN2A [cyclin-dependent kinase inhibitor 2A (melanoma, p16, inhibits CDK4); also known as p16(INK4a)] promoter in serum DNA of lung cancer patients and compared with that of conventional methods. Methylation analysis consisting of DNA extraction followed by bisulfite conversion and MSP was successfully carried out within 9 h in a single tube. The median pre-PCR DNA yield was 6.61-fold higher with the MOB technique than with conventional techniques. Furthermore, MOB increased the diagnostic sensitivity in our analysis of the CDKN2A promoter in patient serum by successfully detecting methylation in 74% of cancer patients, vs the 45% detection rate obtained with conventional techniques. The MOB technique successfully combined 3 processes into a single tube, thereby allowing ease in handling and an increased detection throughput. The increased pre-PCR yield in MOB allowed efficient, diagnostically sensitive methylation detection.

  14. Sensitive and inexpensive digital DNA analysis by microfluidic enrichment of rolling circle amplified single-molecules.

    PubMed

    Kühnemund, Malte; Hernández-Neuta, Iván; Sharif, Mohd Istiaq; Cornaglia, Matteo; Gijs, Martin A M; Nilsson, Mats

    2017-05-05

    Single molecule quantification assays provide the ultimate sensitivity and precision for molecular analysis. However, most digital analysis techniques, e.g. droplet PCR, require sophisticated and expensive instrumentation for molecule compartmentalization, amplification and analysis. Rolling circle amplification (RCA) provides a simpler means for digital analysis. Nevertheless, the sensitivity of RCA assays has until now been limited by inefficient detection methods. We have developed a simple microfluidic strategy for enrichment of RCA products into a single field of view of a low magnification fluorescent sensor, enabling ultra-sensitive digital quantification of nucleic acids over a dynamic range from 1.2 aM to 190 fM. We prove the broad applicability of our analysis platform by demonstrating 5-plex detection of as little as ∼1 pg (∼300 genome copies) of pathogenic DNA with simultaneous antibiotic resistance marker detection, and the analysis of rare oncogene mutations. Our method is simpler, more cost-effective and faster than other digital analysis techniques and provides the means to implement digital analysis in any laboratory equipped with a standard fluorescent microscope. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Comparison of the sensitivity of mass spectrometry atmospheric pressure ionization techniques in the analysis of porphyrinoids.

    PubMed

    Swider, Paweł; Lewtak, Jan P; Gryko, Daniel T; Danikiewicz, Witold

    2013-10-01

    Porphyrinoid chemistry is greatly dependent on the data obtained by mass spectrometry. For this reason, it is essential to determine the range of applicability of mass spectrometry ionization methods. In this study, the sensitivity of three different atmospheric pressure ionization techniques, electrospray ionization, atmospheric pressure chemical ionization and atmospheric pressure photoionization, was tested for several porphyrinoids and their metallocomplexes. The electrospray ionization method was shown to be the best ionization technique because of its high sensitivity for derivatives of cyanocobalamin, free-base corroles and porphyrins. In the case of metallocorroles and metalloporphyrins, atmospheric pressure photoionization with dopant proved to be the most sensitive ionization method. It was also shown that for relatively acidic compounds, particularly for corroles, the negative ion mode provides better sensitivity than the positive ion mode. The results supply much relevant information on the methodology of porphyrinoid analysis carried out by mass spectrometry, which can be useful in designing future MS or liquid chromatography-MS experiments. Copyright © 2013 John Wiley & Sons, Ltd.

  16. Boundary formulations for sensitivity analysis without matrix derivatives

    NASA Technical Reports Server (NTRS)

    Kane, J. H.; Guru Prasad, K.

    1993-01-01

    A new hybrid approach to continuum structural shape sensitivity analysis employing boundary element analysis (BEA) is presented. The approach uses iterative reanalysis to obviate the need to factor perturbed matrices in the determination of surface displacement and traction sensitivities via a univariate perturbation/finite difference (UPFD) step. The UPFD approach makes it possible to immediately reuse existing subroutines for computation of BEA matrix coefficients in the design sensitivity analysis process. The reanalysis technique economically computes the response of univariately perturbed models without factoring perturbed matrices. The approach provides substantial computational economy without the burden of a large-scale reprogramming effort.

  17. The effect of uncertainties in distance-based ranking methods for multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Jaini, Nor I.; Utyuzhnikov, Sergei V.

    2017-08-01

    Data in multi-criteria decision making are often imprecise and changeable, so it is important to carry out a sensitivity analysis for the multi-criteria decision making problem. The paper aims to present a sensitivity analysis for some ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainties are considered for the sensitivity analysis test. The first uncertainty is related to the input data, while the second concerns the Decision Maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance and trade-off ranking methods. TOPSIS and the relative distance method measure a distance from an alternative to the ideal and anti-ideal solutions. In turn, the trade-off ranking calculates a distance of an alternative to the extreme solutions and other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainties.
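
    As a concrete reference point for one of the ranking techniques studied, a minimal TOPSIS implementation (vector normalization, weighted ideal/anti-ideal distances, closeness coefficient) might look as follows; the decision matrix and weights are invented for illustration:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Score alternatives by relative closeness to the ideal solution.

    matrix  : (m alternatives) x (n criteria)
    weights : criterion weights, summing to 1
    benefit : True for criteria to maximize, False for cost criteria
    """
    X = np.asarray(matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    V = X / np.linalg.norm(X, axis=0) * np.asarray(weights)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)   # distance to ideal
    d_neg = np.linalg.norm(V - anti, axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)              # closeness coefficient in [0, 1]

# three alternatives, two benefit criteria and one cost criterion
scores = topsis([[7, 9, 9], [8, 7, 8], [9, 6, 8]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])
print(np.argsort(scores)[::-1])  # indices of alternatives, best first
```

    An input-data sensitivity test of the kind the paper performs amounts to re-running `topsis` with perturbed `matrix` or `weights` and checking whether the rank order changes.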

  18. Testing of stack-unit/aquifer sensitivity analysis using contaminant plume distribution in the subsurface of Savannah River Site, South Carolina, USA

    USGS Publications Warehouse

    Rine, J.M.; Shafer, J.M.; Covington, E.; Berg, R.C.

    2006-01-01

    Published information on the correlation and field-testing of the technique of stack-unit/aquifer sensitivity mapping with documented subsurface contaminant plumes is rare. The inherent characteristic of stack-unit mapping, which makes it a superior technique to other analyses that amalgamate data, is the ability to deconstruct the sensitivity analysis on a unit-by-unit basis. An aquifer sensitivity map, delineating the relative sensitivity of the Crouch Branch aquifer of the Administrative/Manufacturing Area (A/M) at the Savannah River Site (SRS) in South Carolina, USA, incorporates six hydrostratigraphic units, surface soil units, and relevant hydrologic data. When this sensitivity map is compared with the distribution of the contaminant tetrachloroethylene (PCE), PCE is present within the Crouch Branch aquifer within an area classified as highly sensitive, even though the PCE was primarily released on the ground surface within areas classified with low aquifer sensitivity. This phenomenon is explained through analysis of the aquifer sensitivity map, the groundwater potentiometric surface maps, and the plume distributions within the area on a unit-by-unit basis. The results of this correlation show how the paths of the PCE plume are influenced by both the geology and the groundwater flow. © Springer-Verlag 2006.

  19. Separation and Analysis of Citral Isomers.

    ERIC Educational Resources Information Center

    Sacks, Jeff; And Others

    1983-01-01

    Provides background information, procedures, and results of an experiment designed to introduce undergraduates to the technique of steam distillation as a means of isolating thermally sensitive compounds. Chromatographic (HPLC) and mass spectrometric analyses are used in the experiment, which requires three laboratory periods. (JN)

  20. Phase sensitive spectral domain interferometry for label free biomolecular interaction analysis and biosensing applications

    NASA Astrophysics Data System (ADS)

    Chirvi, Sajal

    Biomolecular interaction analysis (BIA) plays a vital role in a wide variety of fields, including biomedical research, the pharmaceutical industry, medical diagnostics, and the biotechnology industry. Study and quantification of interactions between natural biomolecules (proteins, enzymes, DNA) and artificially synthesized molecules (drugs) are routinely done using various labeled and label-free BIA techniques. Labeled BIA techniques (chemiluminescence, fluorescence, radioactivity) suffer from steric hindrance of labels at the interaction site, the difficulty of attaching labels to molecules, and the higher cost and time of assay development. Label-free techniques with real-time detection capabilities have demonstrated advantages over traditional labeled techniques. The gold standard for label-free BIA is surface plasmon resonance (SPR), which detects and quantifies changes in the refractive index of the ligand-analyte complex with high sensitivity. Although SPR is a highly sensitive BIA technique, it requires custom-made sensor chips and is not well suited for the highly multiplexed BIA required in high-throughput applications. Moreover, implementation of SPR on various biosensing platforms is limited. In this research work, spectral domain phase sensitive interferometry (SD-PSI) has been developed for label-free BIA and biosensing applications to address the limitations of SPR and other label-free techniques. One distinct advantage of SD-PSI compared with other label-free techniques is that it does not require custom-fabricated biosensor substrates. Laboratory-grade, off-the-shelf glass or plastic substrates of suitable thickness with proper surface functionalization are used as biosensor chips. SD-PSI is tested on four separate BIA and biosensing platforms: multi-well plate, flow cell, fiber probe with integrated optics, and fiber-tip biosensor. A sensitivity of 33 ng/ml for anti-IgG is achieved using the multi-well platform. 
Principle of coherence multiplexing for multi-channel label-free biosensing applications is introduced. Simultaneous interrogation of multiple biosensors is achievable with a single spectral domain phase sensitive interferometer by coding the individual sensograms in coherence-multiplexed channels. Experimental results demonstrating multiplexed quantitative biomolecular interaction analysis of antibodies binding to antigen coated functionalized biosensor chip surfaces on different platforms are presented.

  1. Sensitivity analysis and multidisciplinary optimization for aircraft design: Recent advances and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    Optimization by decomposition, complex system sensitivity analysis, and a rapid growth of disciplinary sensitivity analysis are some of the recent developments that hold promise of a quantum jump in the support engineers receive from computers in the quantitative aspects of design. Review of the salient points of these techniques is given and illustrated by examples from aircraft design as a process that combines the best of human intellect and computer power to manipulate data.

  2. The application of sensitivity analysis to models of large scale physiological systems

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1974-01-01

    A survey of the literature of sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method for reducing highly complex, nonlinear models to simple linear algebraic models that could be useful for making rapid, first-order calculations of system behavior is presented.
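
    The kind of parameter sensitivity analysis surveyed above can be illustrated on a simple population model. The sketch below computes normalized (logarithmic) sensitivity coefficients of logistic growth with respect to its parameters by central finite differences; the parameter values are arbitrary:

```python
import numpy as np

def logistic(t, r, K, N0=10.0):
    # Closed-form logistic growth: a simple population model.
    return K / (1 + (K / N0 - 1) * np.exp(-r * t))

def normalized_sensitivity(f, t, params, name, h=1e-4):
    """Central-difference relative sensitivity d(ln N)/d(ln p)."""
    p = params[name]
    up = dict(params, **{name: p * (1 + h)})
    dn = dict(params, **{name: p * (1 - h)})
    dNdp = (f(t, **up) - f(t, **dn)) / (2 * h * p)
    return dNdp * p / f(t, **params)

params = {"r": 0.5, "K": 1000.0}
for t in (1.0, 5.0, 20.0):
    s_r = normalized_sensitivity(logistic, t, params, "r")
    s_K = normalized_sensitivity(logistic, t, params, "K")
    print(f"t={t:5.1f}  S_r={s_r:+.3f}  S_K={s_K:+.3f}")
```

    The coefficients show the expected shift in parameter influence: early growth is governed by r, while near the carrying capacity the population becomes sensitive almost exclusively to K, the kind of conclusion such an analysis delivers before committing resources to data collection.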

  3. Univariate and multivariate analysis of tannin-impregnated wood species using vibrational spectroscopy.

    PubMed

    Schnabel, Thomas; Musso, Maurizio; Tondi, Gianluca

    2014-01-01

    Vibrational spectroscopy is one of the most powerful tools in polymer science. Three main techniques--Fourier transform infrared spectroscopy (FT-IR), FT-Raman spectroscopy, and FT near-infrared (NIR) spectroscopy--can also be applied to wood science. Here, these three techniques were used to investigate the chemical modification occurring in wood after impregnation with tannin-hexamine preservatives. These spectroscopic techniques have the capacity to detect the externally added tannin. FT-IR has very strong sensitivity to the aromatic peak at around 1610 cm(-1) in the tannin-treated samples, whereas FT-Raman reflects the peak at around 1600 cm(-1) for the externally added tannin. This high efficacy in distinguishing chemical features was demonstrated in univariate analysis and confirmed via cluster analysis. Conversely, the results of the NIR measurements show noticeable sensitivity for small differences. For this technique, multivariate analysis is required and with this chemometric tool, it is also possible to predict the concentration of tannin on the surface.

  4. Diagnostic accuracy of magnetic resonance imaging techniques for treatment response evaluation in patients with high-grade glioma, a systematic review and meta-analysis.

    PubMed

    van Dijken, Bart R J; van Laar, Peter Jan; Holtman, Gea A; van der Hoorn, Anouk

    2017-10-01

    Treatment response assessment in high-grade gliomas uses contrast enhanced T1-weighted MRI, but is unreliable. Novel advanced MRI techniques have been studied, but the accuracy is not well known. Therefore, we performed a systematic meta-analysis to assess the diagnostic accuracy of anatomical and advanced MRI for treatment response in high-grade gliomas. Databases were searched systematically. Study selection and data extraction were done by two authors independently. Meta-analysis was performed using a bivariate random effects model when ≥5 studies were included. Anatomical MRI (five studies, 166 patients) showed a pooled sensitivity and specificity of 68% (95%CI 51-81) and 77% (45-93), respectively. Pooled apparent diffusion coefficients (seven studies, 204 patients) demonstrated a sensitivity of 71% (60-80) and specificity of 87% (77-93). DSC-perfusion (18 studies, 708 patients) sensitivity was 87% (82-91) with a specificity of 86% (77-91). DCE-perfusion (five studies, 207 patients) sensitivity was 92% (73-98) and specificity was 85% (76-92). The sensitivity of spectroscopy (nine studies, 203 patients) was 91% (79-97) and specificity was 95% (65-99). Advanced techniques showed higher diagnostic accuracy than anatomical MRI, the highest for spectroscopy, supporting the use in treatment response assessment in high-grade gliomas. • Treatment response assessment in high-grade gliomas with anatomical MRI is unreliable • Novel advanced MRI techniques have been studied, but diagnostic accuracy is unknown • Meta-analysis demonstrates that advanced MRI showed higher diagnostic accuracy than anatomical MRI • Highest diagnostic accuracy for spectroscopy and perfusion MRI • Supports the incorporation of advanced MRI in high-grade glioma treatment response assessment.
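
    The per-study inputs that such a meta-analysis pools are ordinary sensitivity/specificity estimates from 2x2 tables (the bivariate random-effects pooling itself requires specialized statistical software). A sketch with invented counts, using Wilson score intervals:

```python
import math

def sens_spec(tp, fn, tn, fp):
    """Per-study sensitivity and specificity with Wilson 95% intervals,
    the raw inputs that a bivariate random-effects meta-analysis pools."""
    def wilson(k, n, z=1.96):
        p = k / n
        centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
        half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
        return p, centre - half, centre + half
    return wilson(tp, tp + fn), wilson(tn, tn + fp)

# hypothetical counts for a single imaging study
(sens, s_lo, s_hi), (spec, p_lo, p_hi) = sens_spec(tp=40, fn=6, tn=30, fp=5)
print(f"sensitivity {sens:.2f} ({s_lo:.2f}-{s_hi:.2f}), "
      f"specificity {spec:.2f} ({p_lo:.2f}-{p_hi:.2f})")
```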

  5. Wide-Field Imaging of Single-Nanoparticle Extinction with Sub-nm2 Sensitivity

    NASA Astrophysics Data System (ADS)

    Payne, Lukas M.; Langbein, Wolfgang; Borri, Paola

    2018-03-01

    We report on a highly sensitive wide-field imaging technique for quantitative measurement of the optical extinction cross section σext of single nanoparticles. The technique is simple and high speed, and it enables the simultaneous acquisition of hundreds of nanoparticles for statistical analysis. Using rapid referencing, fast acquisition, and a deconvolution analysis, a shot-noise-limited sensitivity down to 0.4 nm2 is achieved. Measurements on a set of individual gold nanoparticles of 5 nm diameter using this method yield σext = (10.0 ± 3.1) nm2, which is consistent with theoretical expectations and well above the background fluctuations of 0.9 nm2.

  6. Singularity-sensitive gauge-based radar rainfall adjustment methods for urban hydrological applications

    NASA Astrophysics Data System (ADS)

    Wang, L.-P.; Ochoa-Rodríguez, S.; Onof, C.; Willems, P.

    2015-09-01

    Gauge-based radar rainfall adjustment techniques have been widely used to improve the applicability of radar rainfall estimates to large-scale hydrological modelling. However, their use for urban hydrological applications is limited as they were mostly developed based upon Gaussian approximations and therefore tend to smooth off so-called "singularities" (features of a non-Gaussian field) that can be observed in the fine-scale rainfall structure. Overlooking the singularities could be critical, given that their distribution is highly consistent with that of local extreme magnitudes. This deficiency may cause large errors in the subsequent urban hydrological modelling. To address this limitation and improve the applicability of adjustment techniques at urban scales, a method is proposed herein which incorporates a local singularity analysis into existing adjustment techniques and allows the preservation of the singularity structures throughout the adjustment process. In this paper the proposed singularity analysis is incorporated into the Bayesian merging technique and the performance of the resulting singularity-sensitive method is compared with that of the original Bayesian (non singularity-sensitive) technique and the commonly used mean field bias adjustment. This test is conducted using as case study four storm events observed in the Portobello catchment (53 km2) (Edinburgh, UK) during 2011 and for which radar estimates, dense rain gauge and sewer flow records, as well as a recently calibrated urban drainage model were available. The results suggest that, in general, the proposed singularity-sensitive method can effectively preserve the non-normality in local rainfall structure, while retaining the ability of the original adjustment techniques to generate nearly unbiased estimates. 
Moreover, the ability of the singularity-sensitive technique to preserve the non-normality in rainfall estimates often leads to better reproduction of the urban drainage system's dynamics, particularly of peak runoff flows.
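
As a rough illustration of the mean field bias adjustment used as a comparison baseline in this record, the sketch below applies a single gauge/radar ratio uniformly to a radar field. All rainfall values and field sizes are invented; this is not the authors' singularity-sensitive method.

```python
def mean_field_bias(gauge_vals, radar_at_gauges):
    """Single multiplicative bias: ratio of total gauge rainfall to total
    collocated radar rainfall at the gauge locations."""
    return sum(gauge_vals) / sum(radar_at_gauges)

def adjust_field(radar_field, bias):
    """Scale every radar pixel by the same bias factor."""
    return [[pixel * bias for pixel in row] for row in radar_field]

gauges = [4.0, 6.0, 5.0]           # rain gauge accumulations (mm), invented
radar = [3.2, 4.8, 4.0]            # radar estimates at the gauge pixels (mm)
bias = mean_field_bias(gauges, radar)      # ~1.25: radar underestimates
field = [[2.0, 3.0], [1.0, 4.0]]   # toy 2x2 radar field (mm)
adjusted = adjust_field(field, 1.25)
```

Because the correction is one field-wide scalar, it preserves the radar field's spatial pattern but cannot reproduce fine-scale singularities, which is precisely the limitation the abstract addresses.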

  7. Near-surface compressional and shear wave speeds constrained by body-wave polarization analysis

    NASA Astrophysics Data System (ADS)

    Park, Sunyoung; Ishii, Miaki

    2018-06-01

    A new technique to constrain near-surface seismic structure that relates body-wave polarization direction to the wave speed immediately beneath a seismic station is presented. The P-wave polarization direction is only sensitive to shear wave speed but not to compressional wave speed, while the S-wave polarization direction is sensitive to both wave speeds. The technique is applied to data from the High-Sensitivity Seismograph Network in Japan, and the results show that the wave speed estimates obtained from polarization analysis are compatible with those from borehole measurements. The lateral variations in wave speeds correlate with geological and physical features such as topography and volcanoes. The technique requires minimal computational resources, and can be used on any number of three-component teleseismic recordings, opening opportunities for non-invasive and inexpensive study of the shallowest (~100 m) crustal structures.

  8. Ecological Sensitivity Evaluation of Tourist Region Based on Remote Sensing Image - Taking Chaohu Lake Area as a Case Study

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Li, W. J.; Yu, J.; Wu, C. Z.

    2018-04-01

    Remote sensing technology offers significant advantages for monitoring and analysing the ecological environment. By using automatic extraction algorithms, various environmental resource information for a tourist region can be obtained from remote sensing imagery. Combined with GIS spatial analysis and landscape pattern analysis, the relevant environmental information can be quantitatively analysed and interpreted. In this study, taking the Chaohu Lake Basin as an example, a Landsat-8 multi-spectral satellite image from October 2015 was used. Integrating the automatic ELM (Extreme Learning Machine) classification results with digital elevation model and slope data, the factors of human disturbance degree, land use degree, primary productivity, landscape evenness, vegetation coverage, DEM, slope and normalized water body index were used as the evaluation factors to construct an eco-sensitivity evaluation index based on AHP and overlay analysis. According to the value of the eco-sensitivity evaluation index, using equal-interval reclassification in GIS, the Chaohu Lake area was divided into four grades: very sensitive areas, sensitive areas, sub-sensitive areas and insensitive areas. The results of the eco-sensitivity analysis show that the very sensitive area covered 4577.4378 km², accounting for about 33.12 %; the sensitive area covered 5130.0522 km², accounting for about 37.12 %; the sub-sensitive area covered 3729.9312 km², accounting for about 26.99 %; and the insensitive area covered 382.4399 km², accounting for about 2.77 %. At the same time, spatial differences in the ecological sensitivity of the Chaohu Lake basin were found. The most sensitive areas were mainly located in areas with high elevation and large terrain gradients. Insensitive areas were mainly distributed in gently sloping platform areas; the sensitive and sub-sensitive areas were mainly agricultural land and woodland. 
Through the eco-sensitivity analysis of the study area, the automatic recognition and analysis techniques for remote sensing imagery are integrated into the ecological analysis and ecological regional planning, which can provide a reliable scientific basis for rational planning and regional sustainable development of the Chaohu Lake tourist area.
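
The weighted-overlay and equal-interval reclassification steps described above can be sketched as follows. The factor values, weights and the grade mapping below are invented for illustration; the study's actual weights come from its AHP analysis.

```python
def eco_sensitivity_index(factors, weights):
    """Overlay analysis: weighted sum of normalized factor values,
    with weights assumed to come from an AHP pairwise comparison."""
    return sum(f * w for f, w in zip(factors, weights))

def reclassify(value, vmin, vmax, n_classes=4):
    """Equal-interval reclassification into grades 1..n_classes
    (here 4 = very sensitive, 1 = insensitive; mapping invented)."""
    width = (vmax - vmin) / n_classes
    grade = int((value - vmin) / width) + 1
    return min(grade, n_classes)

# one pixel with three normalized factor values and AHP weights (invented)
index = eco_sensitivity_index([0.8, 0.6, 0.9], [0.5, 0.3, 0.2])
grade = reclassify(index, 0.0, 1.0)   # index ~0.76 falls in the top interval
```

Applying `reclassify` to every pixel of the index raster yields the four-grade sensitivity map described in the abstract.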

  9. Automatic differentiation evaluated as a tool for rotorcraft design and optimization

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Young, Katherine C.

    1995-01-01

    This paper investigates the use of automatic differentiation (AD) as a means for generating sensitivity analyses in rotorcraft design and optimization. This technique transforms an existing computer program into a new program that performs sensitivity analysis in addition to the original analysis. The original FORTRAN program calculates a set of dependent (output) variables from a set of independent (input) variables; the new FORTRAN program calculates the partial derivatives of the dependent variables with respect to the independent variables. The AD technique is a systematic implementation of the chain rule of differentiation; this method produces derivatives to machine accuracy at a cost that is comparable with that of finite-differencing methods. For this study, an analysis code that consists of the Langley-developed hover analysis HOVT, the comprehensive rotor analysis CAMRAD/JA, and associated preprocessors is processed through the AD preprocessor ADIFOR 2.0. The resulting derivatives are compared with derivatives obtained from finite-differencing techniques. The derivatives obtained with ADIFOR 2.0 are exact within machine accuracy and, unlike the derivatives obtained with finite-differencing techniques, do not depend on the selection of a step size.
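
The chain-rule mechanics behind AD, and the step-size dependence of finite differencing, can be illustrated with a minimal forward-mode sketch using dual numbers. This is not ADIFOR's source transformation, just the same underlying idea on an invented function.

```python
class Dual:
    """Dual number a + b*eps with eps**2 = 0; b carries the derivative."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule, applied automatically at every multiplication
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # analytic derivative: 6x + 2

x = Dual(2.0, 1.0)   # seed: d(x)/d(x) = 1
y = f(x)             # y.val = f(2) = 17, y.der = f'(2) = 14, exact

# finite-difference comparison: accuracy depends on the chosen step h
h = 1e-6
fd = (f(2.0 + h) - f(2.0)) / h   # ~14, with an O(h) truncation error
```

The dual-number derivative is exact to machine precision regardless of any step size, which mirrors the ADIFOR result reported in the abstract.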

  10. Sensitivity of control-augmented structure obtained by a system decomposition method

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Bloebaum, Christina L.; Hajela, Prabhat

    1988-01-01

    The verification of a method for computing sensitivity derivatives of a coupled system is presented. The method deals with a system whose analysis can be partitioned into subsets that correspond to disciplines and/or physical subsystems that exchange input-output data with each other. The method uses the partial sensitivity derivatives of the output with respect to input obtained for each subset separately to assemble a set of linear, simultaneous, algebraic equations that are solved for the derivatives of the coupled system response. This sensitivity analysis is verified using an example of a cantilever beam augmented with an active control system to limit the beam's dynamic displacements under an excitation force. The verification shows good agreement of the method with reference data obtained by a finite difference technique involving entire system analysis. The usefulness of a system sensitivity method in optimization applications by employing a piecewise-linear approach to the same numerical example is demonstrated. The method's principal merits are its intrinsically superior accuracy in comparison with the finite difference technique, and its compatibility with the traditional division of work in complex engineering tasks among specialty groups.
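
The assembly of subset partial derivatives into a linear system for the coupled-system derivatives can be sketched on a toy two-subsystem example. The coefficients a, b, c, d below are invented stand-ins for the partial sensitivities of each subset with respect to its inputs.

```python
def coupled_system_sensitivity(a, b, c, d):
    """Toy coupled system: y1 = a*x + b*y2 and y2 = c*x + d*y1.
    The subset partials (a, b, c, d) are assembled into the linear system
    (I - J) dy/dx = df/dx, here a 2x2 system solved in closed form."""
    det = 1.0 - b * d              # determinant of [[1, -b], [-d, 1]]
    dy1_dx = (a + b * c) / det     # total derivative of y1 w.r.t. x
    dy2_dx = (c + a * d) / det     # total derivative of y2 w.r.t. x
    return dy1_dx, dy2_dx

# invented partials; feedback coupling b*d must satisfy b*d != 1
dy1, dy2 = coupled_system_sensitivity(1.0, 0.5, 2.0, 0.2)
```

Substituting y2 into y1 confirms the closed form: y1(1 - b*d) = (a + b*c)x, so the solved derivatives are the exact coupled sensitivities, with no finite-difference step involved.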

  11. The dream of a one-stop-shop: Meta-analysis on myocardial perfusion CT.

    PubMed

    Pelgrim, Gert Jan; Dorrius, Monique; Xie, Xueqian; den Dekker, Martijn A M; Schoepf, U Joseph; Henzler, Thomas; Oudkerk, Matthijs; Vliegenthart, Rozemarijn

    2015-12-01

    To determine the diagnostic performance of computed tomography (CT) perfusion techniques for the detection of functionally relevant coronary artery disease (CAD) in comparison to reference standards, including invasive coronary angiography (ICA), single photon emission computed tomography (SPECT), and magnetic resonance imaging (MRI). PubMed, Web of Knowledge and Embase were searched from January 1, 1998 until July 1, 2014. The search yielded 9475 articles. After duplicate removal, 6041 were screened on title and abstract. The resulting 276 articles were independently analyzed in full-text by two reviewers, and included if the inclusion criteria were met. The articles reporting diagnostic parameters including true positive, true negative, false positive and false negative were subsequently evaluated for the meta-analysis. Results were pooled according to CT perfusion technique, namely snapshot techniques: single-phase rest, single-phase stress, single-phase dual-energy stress and combined coronary CT angiography [rest] and single-phase stress, as well as the dynamic technique: dynamic stress CT perfusion. Twenty-two articles were included in the meta-analysis (1507 subjects). Pooled per-patient sensitivity and specificity of single-phase rest CT compared to rest SPECT were 89% (95% confidence interval [CI], 82-94%) and 88% (95% CI, 78-94%), respectively. Vessel-based sensitivity and specificity of single-phase stress CT compared to ICA-based >70% stenosis were 82% (95% CI, 64-92%) and 78% (95% CI, 61-89%). Segment-based sensitivity and specificity of single-phase dual-energy stress CT in comparison to stress MRI were 75% (95% CI, 60-85%) and 95% (95% CI, 80-99%). Segment-based sensitivity and specificity of dynamic stress CT perfusion compared to stress SPECT were 77% (95% CI, 67-85%) and 89% (95% CI, 78-95%). 
For combined coronary CT angiography and single-phase stress CT, vessel-based sensitivity and specificity in comparison to ICA-based >50% stenosis were 84% (95% CI, 67-93%) and 93% (95% CI, 89-96%). This meta-analysis shows considerable variation in techniques and reference standards for CT of myocardial blood supply. While CT seems sensitive and specific for evaluation of hemodynamically relevant CAD, studies so far are limited in size. Standardization of myocardial perfusion CT technique is essential. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
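
As a crude sketch of how per-study 2x2 counts combine into pooled sensitivity and specificity: meta-analyses like this one typically fit bivariate random-effects models, so the simple count aggregation below, with invented study counts, is for illustration only.

```python
def pooled_sens_spec(studies):
    """Naive pooling of per-study 2x2 counts (tp, fp, fn, tn).
    Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(s[0] for s in studies)
    fp = sum(s[1] for s in studies)
    fn = sum(s[2] for s in studies)
    tn = sum(s[3] for s in studies)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# two invented studies: (tp, fp, fn, tn)
studies = [(45, 5, 5, 45), (38, 8, 2, 52)]
sens, spec = pooled_sens_spec(studies)   # fractions in [0, 1]
```

A full analysis would also weight studies and model between-study heterogeneity, which this naive pooling ignores.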

  12. Factor weighting in DRASTIC modeling.

    PubMed

    Pacheco, F A L; Pires, L M G R; Santos, R M B; Sanches Fernandes, L F

    2015-02-01

    Evaluation of aquifer vulnerability involves the integration of very diverse data, including soil characteristics (texture), hydrologic settings (recharge), aquifer properties (hydraulic conductivity), environmental parameters (relief), and ground water quality (nitrate contamination). It is therefore a multi-geosphere problem to be handled by a multidisciplinary team. The DRASTIC model remains the most popular technique in use for aquifer vulnerability assessments. The algorithm calculates an intrinsic vulnerability index based on a weighted addition of seven factors. In many studies, the method is subject to adjustments, especially in the factor weights, to meet the particularities of the studied regions. However, adjustments made by different techniques may lead to markedly different vulnerabilities and hence to uncertainty in the selection of an appropriate technique. This paper reports the comparison of five weighting techniques, an enterprise not attempted before. The studied area comprises 26 aquifer systems located in Portugal. The tested approaches include: the Delphi consensus (original DRASTIC, used as reference), Sensitivity Analysis, Spearman correlations, Logistic Regression and Correspondence Analysis (used as adjustment techniques). In all cases but Sensitivity Analysis, adjustment techniques have privileged the factors representing soil characteristics, hydrologic settings, aquifer properties and environmental parameters, by leveling their weights to ≈4.4, and have subordinated the factors describing the aquifer media by downgrading their weights to ≈1.5. Logistic Regression predicts the highest and Sensitivity Analysis the lowest vulnerabilities. Overall, the vulnerability indices may be separated by a maximum value of 51 points. This represents an uncertainty of 2.5 vulnerability classes, because the classes are 20 points wide. 
Given this ambiguity, the selection of a weighting technique to integrate a vulnerability index may require additional expertise to be set up satisfactorily. Following a general criterion that weights must be proportional to the range of the ratings, Correspondence Analysis may be recommended as the best adjustment technique. Copyright © 2014 Elsevier B.V. All rights reserved.
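
The DRASTIC index itself is a weighted addition of seven factor ratings. A minimal sketch using the original Delphi weights follows; the per-site ratings below are invented.

```python
# Original DRASTIC Delphi weights for the seven factors:
# Depth to water, net Recharge, Aquifer media, Soil media,
# Topography, Impact of the vadose zone, hydraulic Conductivity.
DRASTIC_WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings, weights=DRASTIC_WEIGHTS):
    """Intrinsic vulnerability index: weighted sum of factor ratings (1-10)."""
    return sum(weights[k] * ratings[k] for k in weights)

# invented ratings for one aquifer cell
ratings = {"D": 7, "R": 6, "A": 8, "S": 6, "T": 9, "I": 8, "C": 6}
index = drastic_index(ratings)   # 5*7 + 4*6 + 3*8 + 2*6 + 1*9 + 5*8 + 3*6 = 162
```

The adjustment techniques compared in the abstract replace `DRASTIC_WEIGHTS` with re-estimated values, which is why different techniques can shift the index by tens of points.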

  13. Quantitative mass spectrometry methods for pharmaceutical analysis

    PubMed Central

    Loos, Glenn; Van Schepdael, Ann

    2016-01-01

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally pursued to enable automated multi-residue analysis, (less sensitive) miniaturized set-ups hold great potential because of their suitability for in-field use. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982

  14. Analysis techniques for multivariate root loci. [a tool in linear control systems

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

    Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
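
Root-locus sensitivities of the kind discussed here can also be approximated numerically. The sketch below differentiates the closed-loop roots of an invented second-order characteristic polynomial with respect to a feedback gain; the analytic derivative is known, so the finite-difference result can be checked against it.

```python
import cmath

def closed_loop_roots(k):
    """Roots of s^2 + s + k = 0, the characteristic polynomial of an
    invented closed-loop system with feedback gain k."""
    disc = cmath.sqrt(1 - 4 * k)
    return (-1 + disc) / 2, (-1 - disc) / 2

def root_sensitivity(k, h=1e-7):
    """Finite-difference sensitivity of each closed-loop root to k."""
    base = closed_loop_roots(k)
    pert = closed_loop_roots(k + h)
    return tuple((b - a) / h for a, b in zip(base, pert))

# analytic check: d(lambda)/dk = -/+ 1/sqrt(1 - 4k); at k = 0.5 this is +/- i
s1, s2 = root_sensitivity(0.5)
```

A generalized-eigenvalue formulation, as in the paper, delivers the same derivatives analytically and avoids the step-size choice inherent in this numerical stand-in.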

  15. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.

  16. Cavity-enhanced resonant photoacoustic spectroscopy with optical feedback cw diode lasers: A novel technique for ultratrace gas analysis and high-resolution spectroscopy.

    PubMed

    Hippler, Michael; Mohr, Christian; Keen, Katherine A; McNaghten, Edward D

    2010-07-28

    Cavity-enhanced resonant photoacoustic spectroscopy with optical feedback cw diode lasers (OF-CERPAS) is introduced as a novel technique for ultratrace gas analysis and high-resolution spectroscopy. In the scheme, a single-mode cw diode laser (3 mW, 635 nm) is coupled into a high-finesse linear cavity and stabilized to the cavity by optical feedback. Inside the cavity, a build-up of laser power to at least 2.5 W occurs. Absorbing gas phase species inside the cavity are detected with high sensitivity by the photoacoustic effect using a microphone embedded in the cavity. To increase sensitivity further, coupling into the cavity is modulated at a frequency corresponding to a longitudinal resonance of an organ pipe acoustic resonator (f = 1.35 kHz and Q ≈ 10). The technique has been characterized by measuring very weak water overtone transitions near 635 nm. Normalized noise-equivalent absorption coefficients are determined as α ≈ 4.4×10^-9 cm^-1 s^(1/2) (1 s integration time) and 2.6×10^-11 cm^-1 s^(1/2) W (1 s integration time and 1 W laser power). These sensitivities compare favorably with existing state-of-the-art techniques. As an advantage, OF-CERPAS is a "zero-background" method which increases selectivity and sensitivity, and its sensitivity scales with laser power.

  17. A special protection scheme utilizing trajectory sensitivity analysis in power transmission

    NASA Astrophysics Data System (ADS)

    Suriyamongkol, Dan

    In recent years, new measurement techniques have provided opportunities to improve the North American Power System observability, control and protection. This dissertation discusses the formulation and design of a special protection scheme based on a novel utilization of trajectory sensitivity techniques with inputs consisting of system state variables and parameters. Trajectory sensitivity analysis (TSA) has been used in previous publications as a method for power system security and stability assessment, and the mathematical formulation of TSA lends itself well to some of the time domain power system simulation techniques. Existing special protection schemes often have limited sets of goals and control actions. The proposed scheme aims to maintain stability while using as many control actions as possible. The approach here will use the TSA in a novel way by using the sensitivities of system state variables with respect to state parameter variations to determine the state parameter controls required to achieve the desired state variable movements. The initial application will operate based on the assumption that the modeled power system has full system observability, and practical considerations will be discussed.
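
As a numerical stand-in for trajectory sensitivity analysis: perturb a parameter, re-integrate, and difference the trajectories. The damped-pendulum model and all values below are invented; TSA proper integrates the variational (sensitivity) equations alongside the system rather than differencing two runs.

```python
import math

def simulate(damping, steps=1000, dt=0.01):
    """Forward-Euler trajectory of a damped pendulum, a toy stand-in
    for a power-system swing equation."""
    theta, omega = 0.5, 0.0          # invented initial state
    traj = []
    for _ in range(steps):
        dtheta = omega
        domega = -damping * omega - math.sin(theta)
        theta += dt * dtheta
        omega += dt * domega
        traj.append(theta)
    return traj

def trajectory_sensitivity(damping, h=1e-6):
    """Finite-difference sensitivity of the state trajectory with respect
    to the damping parameter: d theta(t) / d damping at every time step."""
    base = simulate(damping)
    pert = simulate(damping + h)
    return [(b - a) / h for a, b in zip(base, pert)]

sens = trajectory_sensitivity(0.2)   # one sensitivity value per time step
```

In a special protection scheme, such sensitivities of state variables with respect to controllable parameters indicate which control actions move the trajectory most effectively.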

  18. On Sensitivity Analysis within the 4DVAR Framework

    DTIC Science & Technology

    2014-02-01

    sitivity’’ (AS) approach, Lee et al. (2001) estimated the sensitivity of the Indonesian Throughflow to remote wind forcing, Losch and Heimbach ( 2007 ...of massive paral- lelization. The ensemble sensitivity (ES) analysis (e.g., Ancell and Hakim 2007 ; Torn and Hakim 2008) follows the basic principle of...variational assimila- tion techniques (e.g., Cao et al. 2007 ; Liu et al. 2008; Yaremchuk et al. 2009; Clayton et al. 2013). In particular, Yaremchuk

  19. Overview of Sensitivity Analysis and Shape Optimization for Complex Aerodynamic Configurations

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Newman, James C., III; Barnwell, Richard W.; Taylor, Arthur C., III; Hou, Gene J.-W.

    1998-01-01

    This paper presents a brief overview of some of the more recent advances in steady aerodynamic shape-design sensitivity analysis and optimization, based on advanced computational fluid dynamics. The focus here is on those methods particularly well-suited to the study of geometrically complex configurations and their potentially complex associated flow physics. When nonlinear state equations are considered in the optimization process, difficulties are found in the application of sensitivity analysis. Some techniques for circumventing such difficulties are currently being explored and are included here. Attention is directed to methods that utilize automatic differentiation to obtain aerodynamic sensitivity derivatives for both complex configurations and complex flow physics. Various examples of shape-design sensitivity analysis for unstructured-grid computational fluid dynamics algorithms are demonstrated for different formulations of the sensitivity equations. Finally, the use of advanced, unstructured-grid computational fluid dynamics in multidisciplinary analyses and multidisciplinary sensitivity analyses within future optimization processes is recommended and encouraged.

  20. Time-Distance Analysis of Deep Solar Convection

    NASA Technical Reports Server (NTRS)

    Duvall, T. L., Jr.; Hanasoge, S. M.

    2011-01-01

    Recently it was shown by Hanasoge, Duvall, and DeRosa (2010) that the upper limit to convective flows for spherical harmonic degrees l

  1. Recent approaches in sensitive enantioseparations by CE.

    PubMed

    Sánchez-Hernández, Laura; Castro-Puyana, María; Marina, María Luisa; Crego, Antonio L

    2012-01-01

    The latest strategies and instrumental improvements for enhancing the detection sensitivity in chiral analysis by CE are reviewed in this work. Following the previous reviews by García-Ruiz et al. (Electrophoresis 2006, 27, 195-212) and Sánchez-Hernández et al. (Electrophoresis 2008, 29, 237-251; Electrophoresis 2010, 31, 28-43), this review includes those papers that were published during the period from June 2009 to May 2011. These works describe the use of offline and online sample treatment techniques, online sample preconcentration techniques based on electrophoretic principles, and alternative detection systems to UV-Vis to increase the detection sensitivity. The application of the above-mentioned strategies, either alone or combined, to improve the sensitivity in the enantiomeric analysis of a broad range of samples, such as pharmaceutical, biological, food and environmental samples, enables limits of detection as low as 10⁻¹² M. The use of microchips to achieve sensitive chiral separations is also discussed. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. An empirical comparison of a dynamic software testability metric to static cyclomatic complexity

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.

    1993-01-01

    This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
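
For reference, the static metric used in this comparison, cyclomatic complexity, is computed from a program's control-flow graph as M = E - N + 2P (edges, nodes, connected components). A minimal sketch on an invented toy graph:

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity: M = E - N + 2P."""
    return len(edges) - len(nodes) + 2 * components

# control-flow graph of a function with one if/else and one loop (invented)
nodes = ["entry", "if", "then", "else", "loop", "exit"]
edges = [("entry", "if"), ("if", "then"), ("if", "else"),
         ("then", "loop"), ("else", "loop"),
         ("loop", "loop"),   # back edge of the loop
         ("loop", "exit")]
m = cyclomatic_complexity(edges, nodes)   # 7 - 6 + 2 = 3
```

Unlike the dynamic sensitivity analysis compared against it, this metric depends only on the code's branching structure, not on how likely faults are to propagate to outputs.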

  3. Sensitivity analysis of hybrid thermoelastic techniques

    Treesearch

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  4. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to k-eff sensitivity data, cross-section uncertainty data, how k-eff sensitivity data and k-eff uncertainty data are generated and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in development of upper subcritical limits.
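
The way sensitivity data and cross-section covariance data combine into a k-eff uncertainty is the first-order "sandwich rule", var(k) = Sᵀ C S. A minimal sketch with invented numbers:

```python
def keff_variance(sensitivities, covariance):
    """First-order uncertainty propagation ('sandwich rule'):
    var(k-eff) = S^T C S, where S holds relative sensitivities of k-eff
    to nuclear data and C is their relative covariance matrix."""
    n = len(sensitivities)
    return sum(sensitivities[i] * covariance[i][j] * sensitivities[j]
               for i in range(n) for j in range(n))

# two invented nuclear-data parameters
S = [0.3, -0.1]                 # relative k-eff sensitivities
C = [[4e-4, 1e-4],              # relative covariance matrix
     [1e-4, 9e-4]]
var = keff_variance(S, C)
rel_std = var ** 0.5            # relative standard deviation of k-eff
```

In practice S comes from codes such as MCNP's sensitivity capability and C from evaluated covariance libraries; the toy numbers above only show the arithmetic.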

  5. Analysis and design of optical systems by use of sensitivity analysis of skew ray tracing

    NASA Astrophysics Data System (ADS)

    Lin, Psang Dain; Lu, Chia-Hung

    2004-02-01

    Optical systems are conventionally evaluated by ray-tracing techniques that extract performance quantities such as aberration and spot size. Current optical analysis software does not provide satisfactory analytical evaluation functions for the sensitivity of an optical system. Furthermore, when functions oscillate strongly, the results are of low accuracy. Thus this work extends our earlier research on an advanced treatment of reflected or refracted rays, referred to as sensitivity analysis, in which differential changes of reflected or refracted rays are expressed in terms of differential changes of incident rays. The proposed sensitivity analysis methodology for skew ray tracing of reflected or refracted rays that cross spherical or flat boundaries is demonstrated and validated by the application of a cat's eye retroreflector to the design and by the image orientation of a system with noncoplanar optical axes. The proposed sensitivity analysis is projected as the nucleus of other geometrical optical computations.
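
The idea of expressing changes of refracted rays in terms of changes of incident rays can be approximated numerically. The sketch below uses the vector form of Snell's law with a finite-difference perturbation; the paper derives these differentials analytically, and the geometry here is invented.

```python
import math

def refract(i, n, n1, n2):
    """Vector form of Snell's law. i: unit incident direction, n: unit
    surface normal pointing toward the incident side (n . i < 0)."""
    eta = n1 / n2
    c1 = -(i[0] * n[0] + i[1] * n[1] + i[2] * n[2])
    c2 = math.sqrt(1 - eta * eta * (1 - c1 * c1))
    return tuple(eta * ii + (eta * c1 - c2) * nn for ii, nn in zip(i, n))

def ray_sensitivity(i, n, n1, n2, h=1e-7):
    """Finite-difference change of the refracted ray per unit change in the
    incident direction's x component (the perturbed ray is not
    re-normalized; small-h sketch only)."""
    base = refract(i, n, n1, n2)
    pert = refract((i[0] + h, i[1], i[2]), n, n1, n2)
    return tuple((b - a) / h for a, b in zip(base, pert))

# 30 degree incidence from air (n1 = 1.0) into glass (n2 = 1.5), flat boundary
i = (0.5, 0.0, -math.cos(math.radians(30)))
t = refract(i, (0.0, 0.0, 1.0), 1.0, 1.5)   # t[0] = sin(30°)/1.5 = 1/3
```

The analytic sensitivity analysis in the paper delivers these differential relationships in closed form for skew rays crossing spherical as well as flat boundaries, without the step-size dependence of this numerical stand-in.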

  7. Approaches to answering critical CER questions.

    PubMed

    Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y

    2015-01-01

    While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.

  8. Shape design sensitivity analysis and optimization of three dimensional elastic solids using geometric modeling and automatic regridding. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Yao, Tse-Min; Choi, Kyung K.

    1987-01-01

    An automatic regridding method and a three dimensional shape design parameterization technique were constructed and integrated into a unified theory of shape design sensitivity analysis. An algorithm was developed for general shape design sensitivity analysis of three dimensional elastic solids. Numerical implementation of this shape design sensitivity analysis method was carried out using the finite element code ANSYS. The unified theory of shape design sensitivity analysis uses the material derivative of continuum mechanics with a design velocity field that represents shape change effects over the structural design. Automatic regridding methods were developed by generating a domain velocity field with the boundary displacement method. Shape design parameterization for three dimensional surface design problems was illustrated using a Bezier surface with boundary perturbations that depend linearly on the perturbation of design parameters. A linearization method of optimization, LINRM, was used to obtain optimum shapes. Three examples from different engineering disciplines were investigated to demonstrate the accuracy and versatility of this shape design sensitivity analysis method.

  9. CAFNA®, coded aperture fast neutron analysis for contraband detection: Preliminary results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L.; Lanza, R.C.

    1999-12-01

    The authors have developed a near field coded aperture imaging system for use with fast neutron techniques as a tool for the detection of contraband and hidden explosives through nuclear elemental analysis. The technique relies on the prompt gamma rays produced by fast neutron interactions with the object being examined. The position of the nuclear elements is determined by the location of the gamma emitters. For existing fast neutron techniques, in Pulsed Fast Neutron Analysis (PFNA), neutrons are used with very low efficiency; in Fast Neutron Analysis (FNA), the sensitivity for detection of the signature gamma rays is very low. For the Coded Aperture Fast Neutron Analysis (CAFNA®) the authors have developed, the efficiency for both using the probing fast neutrons and detecting the prompt gamma rays is high. For a probed volume of n³ volume elements (voxels) in a cube of n resolution elements on a side, they can compare the sensitivity with other neutron probing techniques. As compared to PFNA, the improvement in neutron utilization is n², where the total number of voxels in the object being examined is n³. Compared to FNA, the improvement in gamma-ray imaging is proportional to the total open area of the coded aperture plane; a typical value is n²/2, where n² is the number of total detector resolution elements or the number of pixels in an object layer. It should be noted that the actual signal-to-noise ratio of a system depends also on the nature and distribution of background events, and this comparison may somewhat reduce the effective sensitivity of CAFNA. They have performed analysis, Monte Carlo simulations, and preliminary experiments using low and high energy gamma-ray sources. The results show that a high sensitivity 3-D contraband imaging and detection system can be realized by using CAFNA.

  10. Selective parathyroid venous sampling in primary hyperparathyroidism: A systematic review and meta-analysis.

    PubMed

    Ibraheem, Kareem; Toraih, Eman A; Haddad, Antoine B; Farag, Mahmoud; Randolph, Gregory W; Kandil, Emad

    2018-05-14

    Minimally invasive parathyroidectomy requires accurate preoperative localization techniques. There is considerable controversy about the effectiveness of selective parathyroid venous sampling (sPVS) in primary hyperparathyroidism (PHPT) patients. The aim of this meta-analysis is to examine the diagnostic accuracy of sPVS as a preoperative localization modality in PHPT. Studies evaluating the diagnostic accuracy of sPVS for PHPT were electronically searched in the PubMed, EMBASE, Web of Science, and Cochrane Controlled Trials Register databases. Two independent authors reviewed the studies, and revised quality assessment of diagnostic accuracy study tool was used for the quality assessment. Study heterogeneity and pooled estimates were calculated. Two hundred and two unique studies were identified. Of those, 12 studies were included in the meta-analysis. Pooled sensitivity, specificity, and positive likelihood ratio (PLR) of sPVS were 74%, 41%, and 1.55, respectively. The area-under-the-receiver operating characteristic curve was 0.684, indicating an average discriminatory ability of sPVS. On comparison between sPVS and noninvasive imaging modalities, sensitivity, PLR, and positive posttest probability were significantly higher in sPVS compared to noninvasive imaging modalities. Interestingly, super-selective venous sampling had the highest sensitivity, accuracy, and positive posttest probability compared to other parathyroid venous sampling techniques. This is the first meta-analysis to examine the accuracy of sPVS in PHPT. sPVS had higher pooled sensitivity when compared to noninvasive modalities in revision parathyroid surgery. However, the invasiveness of this technique does not favor its routine use for preoperative localization. Super-selective venous sampling was the most accurate among all other parathyroid venous sampling techniques. Laryngoscope, 2018. © 2018 The American Laryngological, Rhinological and Otological Society, Inc.

  11. The Review of Nuclear Microscopy Techniques: An Approach for Nondestructive Trace Elemental Analysis and Mapping of Biological Materials.

    PubMed

    Mulware, Stephen Juma

    2015-01-01

    The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various techniques, including classical physical and chemical analytical methods, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed, along with the details of sample preparation, detection, and data collection and analysis. Finally, an application of the techniques to the analysis of corn roots for elemental distribution and concentration is presented.

  12. First- and Second-Order Sensitivity Analysis of a P-Version Finite Element Equation Via Automatic Differentiation

    NASA Technical Reports Server (NTRS)

    Hou, Gene

    1998-01-01

    Sensitivity analysis is a technique for determining derivatives of system responses with respect to design parameters. Among the many methods available for sensitivity analysis, automatic differentiation has been proven through many applications in fluid dynamics and structural mechanics to be an accurate and easy method for obtaining derivatives. Nevertheless, the method can be computationally expensive and can require substantial memory. This project will apply an automatic differentiation tool, ADIFOR, to a p-version finite element code to obtain first- and second-order thermal derivatives. The focus of the study is on the implementation process and the performance of the ADIFOR-enhanced codes for sensitivity analysis in terms of memory requirement, computational efficiency, and accuracy.
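    ADIFOR transforms Fortran source code; as a language-agnostic illustration of the underlying idea, the sketch below implements minimal forward-mode automatic differentiation with dual numbers in Python. The function `f` is an invented example response, not the finite element equation of the report.

```python
# Minimal forward-mode automatic differentiation with dual numbers.
# Illustrative sketch only (not ADIFOR): it shows how exact derivatives
# propagate through arithmetic, the principle that AD tools automate.
import math

class Dual:
    """Number of the form a + b*eps with eps**2 == 0."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule through the sine function
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def f(x):
    # hypothetical response: f(x) = x**2 * sin(x)
    return x * x * sin(x)

# seed the input derivative with 1.0 to obtain df/dx alongside f
x = Dual(1.5, 1.0)
y = f(x)
print(y.val, y.dot)   # f(1.5) and f'(1.5) = 2x*sin(x) + x**2*cos(x)
```

    Second-order derivatives, as in the report, follow the same pattern with truncated Taylor series carrying one more term.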

  13. Accurate evaluation of sensitivity for calibration between a LiDAR and a panoramic camera used for remote sensing

    NASA Astrophysics Data System (ADS)

    García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier

    2016-04-01

    Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the usage of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor in the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. Sensitivity analysis for each variable involved in the calibration was computed by the Sobol method, which is based on the analysis of the variance breakdown, and by the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
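    As a sketch of the variance-based approach named above, the following estimates first-order Sobol indices with the Saltelli A/B-matrix Monte Carlo estimator. The linear `model` is a hypothetical stand-in, not the camera-LiDAR calibration model.

```python
# First-order Sobol sensitivity indices via the Saltelli estimator:
# S_i = E[ f(B) * (f(A with column i from B) - f(A)) ] / Var(f).
import numpy as np

def model(x):
    # hypothetical response, dominated by the first input
    return 2.0 * x[:, 0] + 1.0 * x[:, 1]

rng = np.random.default_rng(0)
n, d = 100_000, 2
A = rng.random((n, d))          # two independent sample matrices
B = rng.random((n, d))

fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

S1 = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]         # matrix A with column i taken from B
    S1.append(np.mean(fB * (model(ABi) - fA)) / var)

print(S1)   # analytically S1 = [0.8, 0.2] for this linear model
```

    For uniform inputs on [0, 1], Var(2x1 + x2) = 4/12 + 1/12, so the first input carries 80% of the output variance, which the estimator recovers.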

  14. Application of neural networks and sensitivity analysis to improved prediction of trauma survival.

    PubMed

    Hunter, A; Kennedy, L; Henry, J; Ferguson, I

    2000-05-01

    The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
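    The paper's specific sensitivity technique is not reproduced here, but the general idea of ranking input variables by their effect on predictions can be sketched with generic permutation-based sensitivity on simulated data; the model weights, inputs, and outcomes below are invented for illustration, not the TRISS variables.

```python
# Permutation-based sensitivity: shuffle one input at a time and measure
# the drop in predictive accuracy. Irrelevant inputs produce no drop.
import numpy as np

rng = np.random.default_rng(42)
n = 5000
X = rng.normal(size=(n, 3))              # three candidate predictors
true_w = np.array([2.0, 0.5, 0.0])       # third input is irrelevant
p = 1.0 / (1.0 + np.exp(-X @ true_w))
y = (rng.random(n) < p).astype(int)      # simulated survival outcomes

def accuracy(X):
    pred = (X @ true_w > 0).astype(int)  # the model's decision rule
    return np.mean(pred == y)

base = accuracy(X)
drops = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])  # break input j's link to y
    drops.append(base - accuracy(Xp))
print(drops)   # large drop for input 0, small for 1, exactly 0 for 2
```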

  15. Installation Restoration General Environmental Technology Development. Task 6. Materials Handling of Explosive Contaminated Soil and Sediment.

    DTIC Science & Technology

    1985-06-01

    of chemical analysis and sensitivity testing on material samples. At this time, these samples must be packaged and ... preparation at a rate of three samples per hour. One analyst doing both sample preparation and the HPLC analysis can run 16 samples in an 8-hour day. ... study, sensitivity testing was reviewed to enable recommendations for complete analysis of contaminated soils. Materials handling techniques, ...

  16. Chromatographic-ICPMS methods for trace element and isotope analysis of water and biogenic calcite

    NASA Astrophysics Data System (ADS)

    Klinkhammer, G. P.; Haley, B. A.; McManus, J.; Palmer, M. R.

    2003-04-01

    ICP-MS is a powerful technique because of its sensitivity and speed of analysis. This is especially true for refractory elements that are notoriously difficult to analyze using TIMS and less energetic techniques. However, as ICP-MS instruments become more sensitive to elements of interest, they also become more sensitive to interferences. This becomes a pressing issue when analyzing samples with high total dissolved solids. This paper describes two trace element methods that overcome these problems by using chromatographic techniques to precondition samples prior to analysis by ICP-MS: separation of rare earth elements (REEs) from seawater using HPLC-ICPMS, and flow-through dissolution of foraminiferal calcite. Using HPLC in combination with ICP-MS, it is possible to isolate the REEs from the matrix, from other transition elements, and from each other. This method has been developed for small-volume samples (5 ml), making it possible to analyze sediment pore waters. As another example, subjecting foram shells to flow-through reagent addition followed by time-resolved analysis in the ICP-MS allows for systematic cleaning and dissolution of the shells. This method provides information about the relationship between dissolution tendency and elemental composition. Flow-through is also amenable to automation, thus yielding the high sample throughput required for paleoceanography, and produces a highly resolved elemental matrix that can be statistically analyzed.

  17. Combining vibrational biomolecular spectroscopy with chemometric techniques for the study of response and sensitivity of molecular structures/functional groups mainly related to lipid biopolymer to various processing applications.

    PubMed

    Yu, Gloria Qingyu; Yu, Peiqiang

    2015-09-01

    The objectives of this project were to (1) combine vibrational spectroscopy with chemometric multivariate techniques to determine the effect of processing applications on molecular structural changes of lipid biopolymer, mainly related to functional groups, in green- and yellow-type Crop Development Centre (CDC) pea varieties [CDC Strike (green-type) vs. CDC Meadow (yellow-type)] during various processing applications; (2) relatively quantify the effect of processing applications on the antisymmetric CH3 ("CH3as") and CH2 ("CH2as") (ca. 2960 and 2923 cm(-1), respectively) and symmetric CH3 ("CH3s") and CH2 ("CH2s") (ca. 2873 and 2854 cm(-1), respectively) functional groups and the carbonyl C=O ester (ca. 1745 cm(-1)) spectral intensities, as well as the ratios of antisymmetric CH3 to antisymmetric CH2 (CH3as to CH2as), of symmetric CH3 to symmetric CH2 (CH3s to CH2s), and of carbonyl C=O ester peak area to total CH peak area (C=O ester to CH); and (3) illustrate non-invasive techniques to detect the sensitivity of individual molecular functional groups to the various processing applications in the recently developed pea varieties. The hypothesis of this research was that processing applications modify the molecular structure profiles in the processed products as opposed to the original unprocessed pea seeds. The results showed that the different processing methods had different impacts on lipid molecular functional groups, and that different lipid functional groups had different sensitivity to the various heat processing applications. These changes were detected by advanced molecular spectroscopy with chemometric techniques and may be highly related to lipid utilization and availability.
    The multivariate molecular spectral analyses, cluster analysis (CLA) and principal component analysis (PCA), of the original spectra (without spectral parameterization) were unable to fully distinguish the structural differences in the antisymmetric and symmetric CH3 and CH2 spectral region (ca. 3001-2799 cm(-1)) and the carbonyl C=O ester band region (ca. 1771-1714 cm(-1)). This result indicated that the sensitivity of CLA and PCA for detecting treatment differences might be lower than that of univariate molecular spectral analysis. In the future, more sensitive techniques such as discriminant analysis could be considered for discriminating and classifying structural differences. Molecular spectroscopy can be used as a non-invasive technique to study processing-induced structural changes related to lipid compounds in legume seeds.

  18. Examination of the Relation between the Values of Adolescents and Virtual Sensitiveness

    ERIC Educational Resources Information Center

    Yilmaz, Hasan

    2013-01-01

    The aim of this study is to examine the relation between the values adolescents hold and virtual sensitiveness. The study was carried out on 447 adolescents, 160 of whom were female and 287 male. The Humanistic Values Scale and the Virtual Sensitiveness Scale were used. Pearson product-moment correlation and multiple regression analysis techniques were…

  19. A comparison of solute-transport solution techniques and their effect on sensitivity analysis and inverse modeling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2001-01-01

    Five common numerical techniques for solving the advection-dispersion equation (finite difference, predictor corrector, total variation diminishing, method of characteristics, and modified method of characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using discrete, randomly distributed, homogeneous blocks of five sand types. This experimental model provides an opportunity to compare the solution techniques: the heterogeneous hydraulic-conductivity distribution of known structure can be accurately represented by a numerical model, and detailed measurements can be compared with simulated concentrations and total flow through the tank. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation given the different methods of simulating solute transport. The breakthrough curves show that simulated peak concentrations, even at very fine grid spacings, varied between the techniques because of different amounts of numerical dispersion. Sensitivity-analysis results revealed: (1) a high correlation between hydraulic conductivity and porosity given the concentration and flow observations used, so that both could not be estimated; and (2) that the breakthrough curve data did not provide enough information to estimate individual values of dispersivity for the five sands. This study demonstrates that the choice of assigned dispersivity and the amount of numerical dispersion present in the solution technique influence estimated hydraulic conductivity values to a surprising degree.
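    The numerical dispersion contrasted above can be reproduced in a few lines: a first-order upwind finite-difference scheme for pure advection artificially smears a tracer pulse and lowers the simulated peak, the same effect that differed between the five solvers. This is a minimal one-dimensional sketch, not the sand-tank model.

```python
# Numerical dispersion in a first-order upwind finite-difference solution
# of dc/dt + v*dc/dx = 0. The Gaussian pulse should translate unchanged;
# the upwind scheme instead smears it, eroding the peak concentration,
# while (with periodic boundaries) total mass is conserved exactly.
import numpy as np

nx, v, dx, dt = 200, 1.0, 1.0, 0.5          # Courant number v*dt/dx = 0.5
x = np.arange(nx) * dx
c = np.exp(-0.5 * ((x - 30.0) / 5.0) ** 2)  # initial Gaussian pulse
peak0, mass0 = c.max(), c.sum()

for _ in range(200):                        # advect a distance of 100
    c = c - v * dt / dx * (c - np.roll(c, 1))   # upwind difference

print(c.max() / peak0)   # peak visibly eroded by numerical dispersion
```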

  20. Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques

    NASA Astrophysics Data System (ADS)

    Elliott, Louie C.

    This dissertation reports on the application of numerical optimization techniques as applied to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
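    The "numerical derivatives with complex variables" mentioned above refers to the complex-step method, f'(x) ≈ Im(f(x + ih))/h, which has no subtractive cancellation, so the step h can be made arbitrarily small. A minimal sketch on a generic smooth function (not the SOFC model):

```python
# Complex-step derivative versus central finite difference.
import cmath
import math

def f(z):
    return cmath.exp(z) * cmath.sin(z)   # analytic test function

x0 = 1.5
h = 1e-200
cs = f(x0 + 1j * h).imag / h             # complex-step derivative
fd = (f(x0 + 1e-8).real - f(x0 - 1e-8).real) / 2e-8   # central difference
exact = math.exp(x0) * (math.sin(x0) + math.cos(x0))
print(cs - exact, fd - exact)  # complex step is exact to machine precision
```

    The finite-difference error is limited by round-off near h = 1e-8, whereas the complex step tolerates h = 1e-200, which is why it appears alongside the more sophisticated direct-differentiation and adjoint techniques.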

  1. Stability, performance and sensitivity analysis of I.I.D. jump linear systems

    NASA Astrophysics Data System (ADS)

    Chávez Fuentes, Jorge R.; González, Oscar R.; Gray, W. Steven

    2018-06-01

    This paper presents a symmetric Kronecker product analysis of independent and identically distributed jump linear systems to develop new, lower-dimensional equations for the stability and performance analysis of this type of system than are currently available. In addition, new closed-form expressions characterising multi-parameter relative sensitivity functions for performance metrics are introduced. The analysis technique is illustrated with a distributed fault-tolerant flight control example in which the communication links are allowed to fail randomly.

  2. To what degree does the missing-data technique influence the estimated growth in learning strategies over time? A tutorial example of sensitivity analysis for longitudinal data.

    PubMed

    Coertjens, Liesje; Donche, Vincent; De Maeyer, Sven; Vanthournout, Gert; Van Petegem, Peter

    2017-01-01

    Longitudinal data are almost always burdened with missing data. However, in educational and psychological research, there is a large discrepancy between methodological suggestions and research practice. The former suggests applying sensitivity analysis in order to assess the robustness of the results under varying assumptions regarding the mechanism generating the missing data. In research practice, however, participants with missing data are usually discarded by relying on listwise deletion. To help bridge the gap between methodological recommendations and applied research in the educational and psychological domain, this study provides a tutorial example of sensitivity analysis for latent growth analysis. The example data concern students' changes in learning strategies during higher education. One cohort of students in a Belgian university college was asked to complete the Inventory of Learning Styles-Short Version in three measurement waves. A substantial number of students did not participate on each occasion. Change over time in student learning strategies was assessed using eight missing-data techniques, which assume different mechanisms for missingness. The results indicated that, for some learning strategy subscales, growth estimates differed between the models. Guidelines for reporting the results of sensitivity analysis are synthesised and applied to the results from the tutorial example.
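    A toy numeric sketch of why the choice of missing-data technique matters (simulated scores and a hypothetical dropout mechanism, not the ILS data or the eight techniques compared in the study): when wave-2 scores are missing more often for students with low wave-1 scores, listwise deletion biases the estimated wave-2 mean upward.

```python
# Listwise deletion under missingness that depends on an observed
# earlier wave: the complete-case mean over-estimates the true mean.
import numpy as np

rng = np.random.default_rng(7)
n = 20_000
wave1 = rng.normal(3.0, 1.0, n)
wave2 = 0.8 * wave1 + rng.normal(0.6, 0.5, n)   # correlated follow-up

# missing at random given wave1: low scorers drop out more often
p_miss = 1.0 / (1.0 + np.exp(wave1 - 3.0))
observed = rng.random(n) > p_miss

full_mean = wave2.mean()                # known only because data are simulated
listwise_mean = wave2[observed].mean()  # what listwise deletion reports
print(full_mean, listwise_mean)         # listwise estimate is biased upward
```

    Techniques that condition on wave 1 (e.g. full-information maximum likelihood or multiple imputation) would recover the unbiased estimate here, which is exactly the kind of contrast a sensitivity analysis makes visible.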

  3. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  4. Sensitivity analyses for simulating pesticide impacts on honey bee colonies

    EPA Science Inventory

    We employ Monte Carlo simulation and sensitivity analysis techniques to describe the population dynamics of pesticide exposure to a honey bee colony using the VarroaPop + Pesticide model. Simulations are performed of hive population trajectories with and without pesti...

  5. Progress in multidisciplinary design optimization at NASA Langley

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.

    1993-01-01

    Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.

  6. Classification of damage in structural systems using time series analysis and supervised and unsupervised pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; de Lautour, Oliver R.

    2010-04-01

    Developed for studying long, periodic records of various measured quantities, time series analysis methods are inherently suited to, and offer interesting possibilities for, Structural Health Monitoring (SHM) applications. However, their use in SHM can still be regarded as an emerging application and deserves more study. In this research, Autoregressive (AR) models were used to fit experimental acceleration time histories from two experimental structural systems, a 3-storey bookshelf-type laboratory structure and the ASCE Phase II SHM Benchmark Structure, in healthy and several damaged states. The coefficients of the AR models were chosen as damage-sensitive features. Preliminary visual inspection of the large, multidimensional sets of AR coefficients to check for the presence of clusters corresponding to different damage severities was achieved using Sammon mapping, an efficient nonlinear data compression technique. Systematic classification of damage into states based on the analysis of the AR coefficients was achieved using two supervised classification techniques, Nearest Neighbor Classification (NNC) and Learning Vector Quantization (LVQ), and one unsupervised technique, Self-organizing Maps (SOM). This paper discusses the performance of AR coefficients as damage-sensitive features and compares the efficiency of the three classification techniques using experimental data.
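    The feature-extraction idea can be sketched as follows: fit AR coefficients to each record by least squares and classify with a nearest-neighbour rule. Synthetic AR(2) signals stand in for the laboratory and benchmark data, and the coefficient values are invented for illustration.

```python
# AR coefficients as damage-sensitive features with 1-NN classification.
import numpy as np

def simulate(a1, a2, n=2000, seed=0):
    # synthetic AR(2) "acceleration record": x[t] = a1*x[t-1] + a2*x[t-2] + noise
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    e = rng.normal(size=n)
    for t in range(2, n):
        x[t] = a1 * x[t - 1] + a2 * x[t - 2] + e[t]
    return x

def ar_coeffs(sig, p=2):
    # least-squares fit of sig[t] on its p previous values
    X = np.column_stack([sig[p - j - 1 : len(sig) - j - 1] for j in range(p)])
    return np.linalg.lstsq(X, sig[p:], rcond=None)[0]

# training library: AR-coefficient features labelled by structural state
train = ([(ar_coeffs(simulate(1.2, -0.5, seed=s)), "healthy") for s in range(3)]
         + [(ar_coeffs(simulate(0.8, -0.4, seed=s)), "damaged") for s in range(3, 6)])

def classify(sig):
    feat = ar_coeffs(sig)
    return min(train, key=lambda pair: np.linalg.norm(pair[0] - feat))[1]

print(classify(simulate(1.2, -0.5, seed=99)))   # -> healthy
```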

  7. Application of dermoscopy image analysis technique in diagnosing urethral condylomata acuminata.

    PubMed

    Zhang, Yunjie; Jiang, Shuang; Lin, Hui; Guo, Xiaojuan; Zou, Xianbiao

    2018-01-01

    In this study, cases with suspected urethral condylomata acuminata were examined by dermoscopy, in order to explore an effective method for clinical diagnosis. To study the application of the dermoscopy image analysis technique in the clinical diagnosis of urethral condylomata acuminata, a total of 220 suspected urethral condylomata acuminata were first diagnosed clinically with the naked eye, and then by using the dermoscopy image analysis technique. Afterwards, a comparative analysis was made of the two diagnostic methods. Among the 220 suspected urethral condylomata acuminata, there was a higher positive rate by dermoscopy examination than by visual observation. The dermoscopy examination technique is still restricted by its inapplicability in the deep urethral orifice and in skin wrinkles, and concordance between different clinicians may also vary. The dermoscopy image analysis technique features high sensitivity and quick, accurate, non-invasive diagnosis, and we recommend its use.

  8. Application of positron annihilation lineshape analysis to fatigue damage and thermal embrittlement for nuclear plant materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uchida, M.; Ohta, Y.; Nakamura, N.

    1995-08-01

    Positron annihilation (PA) lineshape analysis is sensitive enough to detect microstructural defects such as vacancies and dislocations. The authors are developing a portable system and applying this technique to nuclear power plant material evaluations: fatigue damage in type 316 stainless steel and SA508 low alloy steel, and thermal embrittlement in duplex stainless steel. The PA technique was found to be sensitive in the early fatigue life (up to 10%), but showed little sensitivity for later stages of the fatigue life in both type 316 stainless steel and SA508 ferritic steel. Type 316 steel showed a higher PA sensitivity than SA508, since the initial SA508 microstructure already contained a high dislocation density in the as-received state. The PA parameter increased as a function of aging time in CF8M samples aged at 350 C and 400 C, but did not change much in CF8 samples.

  9. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  10. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. PMID:28109440

  11. Sensitivity Analysis of an ENteric Immunity SImulator (ENISI)-Based Model of Immune Responses to Helicobacter pylori Infection

    PubMed Central

    Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav

    2015-01-01

    Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close “neighborhood” of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa. PMID:26327290

  12. Sensitivity Analysis of an ENteric Immunity SImulator (ENISI)-Based Model of Immune Responses to Helicobacter pylori Infection.

    PubMed

    Alam, Maksudul; Deng, Xinwei; Philipson, Casandra; Bassaganya-Riera, Josep; Bisset, Keith; Carbo, Adria; Eubank, Stephen; Hontecillas, Raquel; Hoops, Stefan; Mei, Yongguo; Abedi, Vida; Marathe, Madhav

    2015-01-01

    Agent-based models (ABM) are widely used to study immune systems, providing a procedural and interactive view of the underlying system. The interaction of components and the behavior of individual objects is described procedurally as a function of the internal states and the local interactions, which are often stochastic in nature. Such models typically have complex structures and consist of a large number of modeling parameters. Determining the key modeling parameters which govern the outcomes of the system is very challenging. Sensitivity analysis plays a vital role in quantifying the impact of modeling parameters in massively interacting systems, including large complex ABM. The high computational cost of executing simulations impedes running experiments with exhaustive parameter settings. Existing techniques of analyzing such a complex system typically focus on local sensitivity analysis, i.e. one parameter at a time, or a close "neighborhood" of particular parameter settings. However, such methods are not adequate to measure the uncertainty and sensitivity of parameters accurately because they overlook the global impacts of parameters on the system. In this article, we develop novel experimental design and analysis techniques to perform both global and local sensitivity analysis of large-scale ABMs. The proposed method can efficiently identify the most significant parameters and quantify their contributions to outcomes of the system. We demonstrate the proposed methodology for ENteric Immune SImulator (ENISI), a large-scale ABM environment, using a computational model of immune responses to Helicobacter pylori colonization of the gastric mucosa.

  13. UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E

    EPA Science Inventory

    A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...

  14. Acceleration and sensitivity analysis of lattice kinetic Monte Carlo simulations using parallel processing and rate constant rescaling

    NASA Astrophysics Data System (ADS)

    Núñez, M.; Robie, T.; Vlachos, D. G.

    2017-10-01

    Kinetic Monte Carlo (KMC) simulation provides insights into catalytic reactions unobtainable with either experiments or mean-field microkinetic models. Sensitivity analysis of KMC models assesses the robustness of the predictions to parametric perturbations and identifies rate determining steps in a chemical reaction network. Stiffness in the chemical reaction network, a ubiquitous feature, demands lengthy run times for KMC models and renders efficient sensitivity analysis based on the likelihood ratio method unusable. We address the challenge of efficiently conducting KMC simulations and performing accurate sensitivity analysis in systems with unknown time scales by employing two acceleration techniques: rate constant rescaling and parallel processing. We develop statistical criteria that ensure sufficient sampling of non-equilibrium steady state conditions. Our approach provides the twofold benefit of accelerating the simulation itself and enabling likelihood ratio sensitivity analysis, which provides further speedup relative to finite difference sensitivity analysis. As a result, the likelihood ratio method can be applied to real chemistry. We apply our methodology to the water-gas shift reaction on Pt(111).
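    The likelihood ratio estimator named above can be illustrated on a far simpler stochastic simulation than lattice KMC: a constant-rate Poisson process on [0, T], for which the sensitivity dE[N]/dk = T is known analytically. This is a sketch of the estimator's form under those simplifying assumptions, not of the catalysis model.

```python
# Likelihood-ratio sensitivity for a stochastic simulation:
#   dE[f]/dk = E[ f * d(log L)/dk ],
# where, for exponential waiting times at constant rate k observed on
# [0, T], the score is d(log L)/dk = N/k - T with N the event count.
import numpy as np

rng = np.random.default_rng(1)
k, T, M = 2.0, 5.0, 20_000       # rate, horizon, number of trajectories

est = 0.0
for _ in range(M):
    t, N = 0.0, 0
    while True:
        t += rng.exponential(1.0 / k)   # draw the next waiting time
        if t > T:
            break
        N += 1
    score = N / k - T                   # d(log-likelihood)/dk
    est += N * score                    # observable of interest: f = N
est /= M

print(est)   # analytically dE[N]/dk = T = 5.0
```

    No perturbed or re-run simulations are needed, which is the speedup over finite differences noted in the abstract; the rescaling and sampling criteria of the paper address the variance of exactly this kind of estimator.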

  15. Sensitivity analysis for direct and indirect effects in the presence of exposure-induced mediator-outcome confounders

    PubMed Central

    Chiba, Yasutaka

    2014-01-01

    Questions of mediation are often of interest in reasoning about mechanisms, and methods have been developed to address these questions. However, these methods make strong assumptions about the absence of confounding. Even if exposure is randomized, there may be mediator-outcome confounding variables. Inference about direct and indirect effects is particularly challenging if these mediator-outcome confounders are affected by the exposure because in this case these effects are not identified irrespective of whether data is available on these exposure-induced mediator-outcome confounders. In this paper, we provide a sensitivity analysis technique for natural direct and indirect effects that is applicable even if there are mediator-outcome confounders affected by the exposure. We give techniques for both the difference and risk ratio scales and compare the technique to other possible approaches. PMID:25580387

  16. GC/HRSIR as a Complementary Technique to GC/ECNIMS

    EPA Science Inventory

    Gas chromatography/electron capture negative ion mass spectrometry (GC/ECNIMS) is a highly selective and sensitive technique for the analysis of appropriate analytes in complex matrices. Its major drawback is often the lack of fragmentation indicative of structure that can be use...

  17. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.

  18. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    PubMed Central

    Wang, Chuji; Sahay, Peeyush

    2009-01-01

Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only offer sensitivity and selectivity comparable to those of the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC capability. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions, and the detection limits achieved by the laser techniques range from parts-per-million to parts-per-billion levels. Sensors using laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide and nitric oxide, are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503

  19. Aptamer-based microspheres for highly sensitive protein detection using fluorescently-labeled DNA nanostructures.

    PubMed

    Han, Daehoon; Hong, Jinkee; Kim, Hyun Cheol; Sung, Jong Hwan; Lee, Jong Bum

    2013-11-01

Many highly sensitive protein detection techniques have been developed and have played an important role in the analysis of proteins. Herein, we report a novel technique that can detect proteins sensitively and effectively using aptamer-based DNA nanostructures. Thrombin was used as the target protein, and an aptamer was used to capture either fluorescent dye-labeled DNA nanobarcodes or thrombin on a microsphere. The captured DNA nanobarcodes were then displaced by the thrombin-aptamer interaction. The detection capability of this approach was confirmed by flow cytometry at different concentrations of thrombin. Our detection method has great potential for rapid and simple protein detection with a variety of aptamers.

  20. Sensitivity analyses for simulating pesticide impacts on honey bee colonies

    USDA-ARS?s Scientific Manuscript database

    We employ Monte Carlo simulation and sensitivity analysis techniques to describe the population dynamics of pesticide exposure to a honey bee colony using the VarroaPop+Pesticide model. Simulations are performed of hive population trajectories with and without pesticide exposure to determine the eff...

  1. Visualization of Global Sensitivity Analysis Results Based on a Combination of Linearly Dependent and Independent Directions

    NASA Technical Reports Server (NTRS)

    Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    A useful technique for the validation and verification of complex flight systems is Monte Carlo Filtering -- a global sensitivity analysis that tries to find the inputs and ranges that are most likely to lead to a subset of the outputs. A thorough exploration of the parameter space for complex integrated systems may require thousands of experiments and hundreds of controlled and measured variables. Tools for analyzing this space often have limitations caused by the numerical problems associated with high dimensionality and caused by the assumption of independence of all of the dimensions. To combat both of these limitations, we propose a technique that uses a combination of the original variables with the derived variables obtained during a principal component analysis.
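A minimal sketch of the idea, with a made-up three-input linear model standing in for a real flight system: Monte Carlo Filtering splits the runs into behavioral and non-behavioral sets, and the input matrix is augmented with its principal-component scores so that influential linear combinations of inputs can be ranked alongside the originals:

```python
import numpy as np

def ks_stat(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: max ECDF distance."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return np.max(np.abs(cdf_a - cdf_b))

rng = np.random.default_rng(1)
n = 2000
X = rng.uniform(-1.0, 1.0, size=(n, 3))   # three toy input parameters
y = 3.0 * X[:, 0] + 0.2 * X[:, 1] + 0.05 * rng.normal(size=n)

# Monte Carlo Filtering: "behavioral" runs meet the output requirement.
behavioral = y < 0.0

# Augment the original inputs with principal-component scores so that
# influential *combinations* of inputs can be ranked alongside them.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
augmented = np.hstack([X, Xc @ Vt.T])

# A large KS distance between the behavioral and non-behavioral marginals
# of a column flags that direction as driving the output split.
ks = [ks_stat(col[behavioral], col[~behavioral]) for col in augmented.T]
```

Here the dominant input X[:, 0] should show by far the largest KS distance; in a real high-dimensional system the derived principal-component columns can surface sensitivities that no single original variable exhibits.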

  2. A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model

    DOE PAGES

    Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...

    2014-01-01

Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, the minimum number of samples needed is 1050 to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
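The Morris One-At-a-Time screening compared above can be sketched in a few lines; the three-parameter test function is a made-up stand-in for SAC-SMA, not the actual hydrological model:

```python
import numpy as np

def morris_screen(model, dim, trajectories, delta=1.0 / 3.0, rng=None):
    """Morris One-At-a-Time screening: walk random trajectories through
    [0, 1]^dim, perturbing one input per step, and collect the elementary
    effects (y2 - y) / delta for each parameter."""
    if rng is None:
        rng = np.random.default_rng()
    effects = [[] for _ in range(dim)]
    for _ in range(trajectories):
        x = rng.uniform(0.0, 1.0 - delta, size=dim)
        y = model(x)
        for j in rng.permutation(dim):      # one-at-a-time moves
            x2 = x.copy()
            x2[j] += delta
            y2 = model(x2)
            effects[j].append((y2 - y) / delta)
            x, y = x2, y2
    # mu* (mean |EE|) ranks importance; sigma flags nonlinearity/interaction.
    mu_star = np.array([np.mean(np.abs(e)) for e in effects])
    sigma = np.array([np.std(e) for e in effects])
    return mu_star, sigma

model = lambda x: 5.0 * x[0] + 0.5 * x[1] * x[2] + 0.1 * x[2]
mu_star, sigma = morris_screen(model, dim=3, trajectories=20,
                               rng=np.random.default_rng(2))
```

Each trajectory of dim + 1 model runs yields one elementary effect per parameter, which is why MOAT gets by on a few hundred samples where variance-based methods need thousands.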

  3. High order statistical signatures from source-driven measurements of subcritical fissile systems

    NASA Astrophysics Data System (ADS)

    Mattingly, John Kelly

    1998-11-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements.

  4. Computer program for analysis of imperfection sensitivity of ring stiffened shells of revolution

    NASA Technical Reports Server (NTRS)

    Cohen, G. A.

    1971-01-01

A FORTRAN IV digital computer program is presented for the initial postbuckling and imperfection sensitivity analysis of bifurcation buckling modes for ring-stiffened orthotropic multilayered shells of revolution. The boundary value problem for the second-order contribution to the buckled state is solved by the forward integration technique using the Runge-Kutta method. The effects of nonlinear prebuckling states and live pressure loadings are included.

  5. Application of sensitivity-analysis techniques to the calculation of topological quantities

    NASA Astrophysics Data System (ADS)

    Gilchrist, Stuart

    2017-08-01

Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient of the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of "sensitivity analysis". The method is based on the calculation of a three-dimensional field-line mapping Jacobian from which all of the topological quantities of interest presented here can be derived. We will present the algorithm and the details of a publicly available set of libraries that implement it.

  6. Optical skin friction measurement technique in hypersonic wind tunnel

    NASA Astrophysics Data System (ADS)

    Chen, Xing; Yao, Dapeng; Wen, Shuai; Pan, Junjie

    2016-10-01

Shear-sensitive liquid-crystal coatings (SSLCCs) have an optical response that depends on the applied shear stress. Based on this, a novel technique is developed to measure both the magnitude and the direction of the shear stress on a model surface in hypersonic flow. An optical skin friction measurement system was built at the China Academy of Aerospace Aerodynamics (CAAA), and a series of experiments on a hypersonic vehicle model was performed in a CAAA wind tunnel. The global skin friction distribution of the model, which reveals complicated flow structures, is discussed, and a brief mechanism analysis and an evaluation of the optical measurement technique are given.

  7. Analysis strategies for high-resolution UHF-fMRI data.

    PubMed

    Polimeni, Jonathan R; Renvall, Ville; Zaretskaya, Natalia; Fischl, Bruce

    2018-03-01

    Functional MRI (fMRI) benefits from both increased sensitivity and specificity with increasing magnetic field strength, making it a key application for Ultra-High Field (UHF) MRI scanners. Most UHF-fMRI studies utilize the dramatic increases in sensitivity and specificity to acquire high-resolution data reaching sub-millimeter scales, which enable new classes of experiments to probe the functional organization of the human brain. This review article surveys advanced data analysis strategies developed for high-resolution fMRI at UHF. These include strategies designed to mitigate distortion and artifacts associated with higher fields in ways that attempt to preserve spatial resolution of the fMRI data, as well as recently introduced analysis techniques that are enabled by these extremely high-resolution data. Particular focus is placed on anatomically-informed analyses, including cortical surface-based analysis, which are powerful techniques that can guide each step of the analysis from preprocessing to statistical analysis to interpretation and visualization. New intracortical analysis techniques for laminar and columnar fMRI are also reviewed and discussed. Prospects for single-subject individualized analyses are also presented and discussed. Altogether, there are both specific challenges and opportunities presented by UHF-fMRI, and the use of proper analysis strategies can help these valuable data reach their full potential. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, J.; Tolson, B.

    2017-12-01

The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters, or the estimation of all of their uncertainty, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If it is checked at all, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indices. Bootstrapping, however, can itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We, therefore, present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indices without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indices. To demonstrate the method's independence of the convergence testing method, we applied it to two widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) for which the true indices of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA.
The results show that the new frugal method is able to test the convergence, and therefore the reliability, of SA results in an efficient way. The appealing feature of this new technique is that it requires no further model evaluations, which enables the checking of already processed sensitivity results. This is one step towards reliable, transferable, published sensitivity results.
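The bootstrapping baseline that the abstract contrasts against can be sketched as follows; the "elementary effects" below are synthetic stand-ins for real model output, and the point is that resampling reuses existing runs rather than triggering new model evaluations:

```python
import numpy as np

rng = np.random.default_rng(3)

# Pretend these are elementary effects already collected for one parameter
# (synthetic data; a real SA would supply them from model runs):
ee = rng.normal(loc=2.0, scale=0.5, size=200)

point = np.mean(np.abs(ee))        # mu* point estimate for this parameter

# Bootstrap: resample the existing effects with replacement (no extra
# model runs), recompute the index each time, and read a confidence
# interval for the index off the quantiles of the bootstrap distribution.
B = 2000
boot = np.array([np.mean(np.abs(rng.choice(ee, size=ee.size, replace=True)))
                 for _ in range(B)])
lo, hi = np.quantile(boot, [0.025, 0.975])
```

A wide interval signals that the SA budget has not converged; the cost here is B resampling passes over the stored effects, which is exactly the overhead the MVA approach avoids.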

  9. Analysis of painted arts by energy sensitive radiographic techniques with the Pixel Detector Timepix

    NASA Astrophysics Data System (ADS)

    Zemlicka, J.; Jakubek, J.; Kroupa, M.; Hradil, D.; Hradilova, J.; Mislerova, H.

    2011-01-01

Non-invasive techniques utilizing X-ray radiation offer a significant advantage in the scientific investigation of painted artworks and other cultural artefacts such as statues. In addition, there is great demand for a mobile analytical and real-time imaging device, given that many fine artworks cannot be transported. The highly sensitive hybrid semiconductor pixel detector Timepix is capable of detecting and resolving subtle and low-contrast differences in the inner composition of a wide variety of objects. Moreover, it is able to map the surface distribution of the contained elements. Several transmission and emission techniques are presented which have been proposed and tested for the analysis of painted artworks. This study focuses on the novel techniques of X-ray transmission radiography (conventional and energy sensitive) and X-ray induced fluorescence imaging (XRF), which can be realised at the table-top scale with the state-of-the-art pixel detector Timepix. Transmission radiography analyses the changes in X-ray beam intensity caused by the specific attenuation of different components in the sample. The conventional approach uses all energies from the source spectrum for the creation of the image, while the energy-sensitive alternative creates images in given energy intervals, which enables identification and separation of materials. The XRF setup is based on the detection of characteristic radiation induced by X-ray photons through a pinhole geometry collimator. The XRF method is extremely sensitive to the material composition, but it creates only surface maps of the elemental distribution. For the purpose of the analysis, several sets of painted layers have been prepared in a restoration laboratory. The composition of these layers corresponds to that of real historical paintings from the 19th century.
An overview of the current status of our methods will be given with respect to the instrumentation and the application in the field of cultural heritage.

  10. Use of a Smartphone as a Colorimetric Analyzer in Paper-based Devices for Sensitive and Selective Determination of Mercury in Water Samples.

    PubMed

    Jarujamrus, Purim; Meelapsom, Rattapol; Pencharee, Somkid; Obma, Apinya; Amatatongchai, Maliwan; Ditcharoen, Nadh; Chairam, Sanoe; Tamuang, Suparb

    2018-01-01

A smartphone application, called CAnal, was developed as a colorimetric analyzer in paper-based devices for sensitive and selective determination of mercury(II) in water samples. Measurement on the double layer of a microfluidic paper-based analytical device (μPAD), fabricated by an alkyl ketene dimer (AKD) inkjet-printing technique with a special design and doped with unmodified silver nanoparticles (AgNPs) on the detection zones, was performed by monitoring the gray intensity in the blue channel of the AgNPs, which disintegrate when exposed to mercury(II) on the μPAD. Under the optimized conditions, the developed approach showed high sensitivity, a low limit of detection (0.003 mg L⁻¹; 3SD of the blank/slope of the calibration curve), small sample volume uptake (2 × 2 μL), and short analysis time. The linear range of this technique was 0.01 to 10 mg L⁻¹ (r² = 0.993). Furthermore, practical analysis of various water samples was also demonstrated to have acceptable performance that was in agreement with the data from cold vapor atomic absorption spectrophotometry (CV-AAS), a conventional method. The proposed technique allows for rapid, simple (instant report of the final mercury(II) concentration in water samples via the smartphone display), sensitive, selective, and on-site analysis with high sample throughput (48 samples h⁻¹, n = 3) of trace mercury(II) in water samples, which is suitable for end users who are unskilled in analyzing mercury(II) in water samples.
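The calibration and detection-limit arithmetic (LOD = 3 × SD of the blank / slope of the calibration curve) can be sketched as follows; all intensity and blank values are invented for illustration, not data from the paper:

```python
import numpy as np

# Hypothetical calibration data: gray-intensity change in the blue channel
# vs. Hg(II) concentration in mg/L (values are illustrative only).
conc = np.array([0.01, 0.1, 0.5, 1.0, 5.0, 10.0])
signal = np.array([0.9, 1.8, 5.7, 10.4, 49.8, 101.2])

# Linear calibration fit: signal = slope * conc + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

# Limit of detection as 3 * SD(blank) / slope, per the abstract's formula.
blank = np.array([0.52, 0.47, 0.55, 0.49, 0.51, 0.50, 0.48])
lod = 3.0 * np.std(blank, ddof=1) / slope
```

The same few lines are what a smartphone app effectively runs after extracting the blue-channel intensity: fit once against standards, then invert the line to report concentration.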

  11. Shot noise-limited Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry

    NASA Astrophysics Data System (ADS)

    Chen, Shichao; Zhu, Yizheng

    2017-02-01

Sensitivity is a critical index that measures the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
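A numerical sketch of a shot-noise-limited bound: for Poisson-distributed counts the Fisher information is Σ(∂μᵢ/∂φ)²/μᵢ, and its inverse square root lower-bounds the standard deviation of any unbiased phase estimate. The N-step cosine fringe model and photon budgets below are generic assumptions, not the paper's exact wavelength-shifting derivation:

```python
import numpy as np

def shot_noise_crb(A, B, phi, steps):
    """Cramer-Rao bound on the phase for an N-step interferogram
    mu_i = A + B*cos(phi + delta_i) under Poisson (shot-noise) statistics.
    Fisher information for independent Poisson data: I = sum (dmu/dphi)^2 / mu."""
    delta = 2.0 * np.pi * np.arange(steps) / steps   # phase-shift steps
    mu = A + B * np.cos(phi + delta)                 # expected photon counts
    dmu = -B * np.sin(phi + delta)                   # derivative w.r.t. phi
    fisher = np.sum(dmu ** 2 / mu)
    return 1.0 / np.sqrt(fisher)                     # lower bound on std(phi)

# Collecting 100x more photons tightens the bound by a factor of 10,
# the characteristic 1/sqrt(flux) shot-noise scaling:
bound_lo = shot_noise_crb(A=1e4, B=8e3, phi=0.7, steps=4)
bound_hi = shot_noise_crb(A=1e6, B=8e5, phi=0.7, steps=4)
```

Because the bound depends only on the measured mean counts, it can be evaluated directly from acquired data, which is the practical appeal the abstract points to.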

  12. Digression and Value Concatenation to Enable Privacy-Preserving Regression.

    PubMed

    Li, Xiao-Bai; Sarkar, Sumit

    2014-09-01

Regression techniques can be used not only for legitimate data analysis, but also to infer private information about individuals. In this paper, we demonstrate that regression trees, a popular data-analysis and data-mining technique, can be used to effectively reveal individuals' sensitive data. This problem, which we call a "regression attack," has not been addressed in the data privacy literature, and existing privacy-preserving techniques are not appropriate in coping with this problem. We propose a new approach to counter regression attacks. To protect against privacy disclosure, our approach introduces a novel measure, called digression, which assesses the sensitive value disclosure risk in the process of building a regression tree model. Specifically, we develop an algorithm that uses the measure for pruning the tree to limit disclosure of sensitive data. We also propose a dynamic value-concatenation method for anonymizing data, which better preserves data utility than the user-defined generalization schemes commonly used in existing approaches. Our approach can be used for anonymizing both numeric and categorical data. An experimental study is conducted using real-world financial, economic, and healthcare data. The results of the experiments demonstrate that the proposed approach is very effective in protecting data privacy while preserving data quality for research and analysis.

  13. Cavity-Enhanced Absorption Spectroscopy and Photoacoustic Spectroscopy for Human Breath Analysis

    NASA Astrophysics Data System (ADS)

    Wojtas, J.; Tittel, F. K.; Stacewicz, T.; Bielecki, Z.; Lewicki, R.; Mikolajczyk, J.; Nowakowski, M.; Szabra, D.; Stefanski, P.; Tarka, J.

    2014-12-01

    This paper describes two different optoelectronic detection techniques: cavity-enhanced absorption spectroscopy and photoacoustic spectroscopy. These techniques are designed to perform a sensitive analysis of trace gas species in exhaled human breath for medical applications. With such systems, the detection of pathogenic changes at the molecular level can be achieved. The presence of certain gases (biomarkers), at increased concentration levels, indicates numerous human diseases. Diagnosis of a disease in its early stage would significantly increase chances for effective therapy. Non-invasive, real-time measurements, and high sensitivity and selectivity, capable of minimum discomfort for patients, are the main advantages of human breath analysis. At present, monitoring of volatile biomarkers in breath is commonly useful for diagnostic screening, treatment for specific conditions, therapy monitoring, control of exogenous gases (such as bacterial and poisonous emissions), as well as for analysis of metabolic gases.

  14. A case study of the sensitivity of forecast skill to data and data analysis techniques

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Atlas, R.; Halem, M.; Susskind, J.

    1983-01-01

    A series of experiments have been conducted to examine the sensitivity of forecast skill to various data and data analysis techniques for the 0000 GMT case of January 21, 1979. These include the individual components of the FGGE observing system, the temperatures obtained with different satellite retrieval methods, and the method of vertical interpolation between the mandatory pressure analysis levels and the model sigma levels. It is found that NESS TIROS-N infrared retrievals seriously degrade a rawinsonde-only analysis over land, resulting in a poorer forecast over North America. Less degradation in the 72-hr forecast skill at sea level and some improvement at 500 mb is noted, relative to the control with TIROS-N retrievals produced with a physical inversion method which utilizes a 6-hr forecast first guess. NESS VTPR oceanic retrievals lead to an improved forecast over North America when added to the control.

  15. Sensitivity analysis of a pulse nutrient addition technique for estimating nutrient uptake in large streams

    Treesearch

    Laurence Lin; J.R. Webster

    2012-01-01

    The constant nutrient addition technique has been used extensively to measure nutrient uptake in streams. However, this technique is impractical for large streams, and the pulse nutrient addition (PNA) has been suggested as an alternative. We developed a computer model to simulate Monod kinetics nutrient uptake in large rivers and used this model to evaluate the...

  16. Potential of far-ultraviolet absorption spectroscopy as a highly sensitive qualitative and quantitative analysis method for polymer films, part I: classification of commercial food wrap films.

    PubMed

    Sato, Harumi; Higashi, Noboru; Ikehata, Akifumi; Koide, Noriko; Ozaki, Yukihiro

    2007-07-01

The aim of the present study is to propose a totally new technique for the utilization of far-ultraviolet (UV) spectroscopy in polymer thin film analysis. Far-UV spectra in the 120-300 nm region have been measured in situ for six kinds of commercial polymer wrap films by use of a novel type of far-UV spectrometer that does not need vacuum evaporation. These films can be straightforwardly classified into three groups, polyethylene (PE) films, polyvinyl chloride (PVC) films, and polyvinylidene chloride (PVDC) films, by using the raw spectra. The differences in the wavelength of the absorption band due to the σ-σ* transition of the C-C bond have been used for the classification of the six kinds of films. Using this method, it was easy to distinguish the three kinds of PE films and to separate the two kinds of PVDC films. Compared with other spectroscopic methods, the advantages of this technique include nondestructive analysis, easy spectral measurement, high sensitivity, and simple spectral analysis. The present study has demonstrated that far-UV spectroscopy is a very promising technique for polymer film analysis.

  17. Sensitivity analysis of add-on price estimate for select silicon wafering technologies

    NASA Technical Reports Server (NTRS)

    Mokashi, A. R.

    1982-01-01

The cost of producing wafers from silicon ingots is a major component of the add-on price of silicon sheet. Economic analyses of the add-on price estimates, and of their sensitivity, for internal-diameter (ID) sawing, multiblade slurry (MBS) sawing, and the fixed-abrasive slicing technique (FAST) are presented. Interim Price Estimation Guidelines (IPEG) are used for estimating a process add-on price. Sensitivity analysis of price is performed with respect to cost parameters such as equipment, space, direct labor, materials (blade life), and utilities, and to production parameters such as slicing rate, slices per centimeter, and process yield, using a computer program developed specifically for sensitivity analysis with IPEG. The results aid in identifying the important cost parameters and in deciding the direction of technology development efforts.
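The one-at-a-time price sensitivity described above can be sketched as follows; the cost model, parameter names, and coefficients are invented placeholders, not the actual IPEG formula:

```python
import numpy as np

# Illustrative add-on price model in the spirit of IPEG: cost terms add up,
# and process yield divides the price (a lost wafer raises the cost of the
# survivors).  All baseline values are assumptions for demonstration.
base = {"equipment": 1.0, "labor": 0.8, "materials": 0.6,
        "utilities": 0.2, "yield_": 0.95}     # yield_ avoids the keyword

def add_on_price(p):
    cost = p["equipment"] + p["labor"] + p["materials"] + p["utilities"]
    return cost / p["yield_"]

p0 = add_on_price(base)

# One-at-a-time sensitivity: +10% in each parameter, % change in price.
sens = {}
for name in base:
    perturbed = dict(base)
    perturbed[name] *= 1.10
    sens[name] = 100.0 * (add_on_price(perturbed) - p0) / p0
```

Ranking the entries of `sens` is exactly the output such a program produces: here the yield term dominates (and pushes the price down when improved), which is the kind of finding that directs technology development effort.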

  18. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  19. Analysis of the stochastic excitability in the flow chemical reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bashkirtseva, Irina

    2015-11-30

    A dynamic model of the thermochemical process in the flow reactor is considered. We study an influence of the random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on the stochastic sensitivity functions and confidence domains is applied. It is shown how elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  20. Analysis of the stochastic excitability in the flow chemical reactor

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina

    2015-11-01

    A dynamic model of the thermochemical process in the flow reactor is considered. We study an influence of the random disturbances on the stationary regime of this model. A phenomenon of noise-induced excitability is demonstrated. For the analysis of this phenomenon, a constructive technique based on the stochastic sensitivity functions and confidence domains is applied. It is shown how elaborated technique can be used for the probabilistic analysis of the generation of mixed-mode stochastic oscillations in the flow chemical reactor.

  1. Clinical Comparison of At-Home and In-Office Dental Bleaching Procedures: A Randomized Trial of a Split-Mouth Design.

    PubMed

    Machado, Lucas Silveira; Anchieta, Rodolfo Bruniera; dos Santos, Paulo Henrique; Briso, André Luiz; Tovar, Nick; Janal, Malvin N; Coelho, Paulo Guilherme; Sundfeld, Renato Herman

    2016-01-01

    The objective of this split-mouth clinical study was to compare a combination of in-office and at-home dental bleaching with at-home bleaching alone. Two applications of in-office bleaching were performed, with one appointment per week, using 38% hydrogen peroxide. At-home bleaching was performed with or without in-office bleaching using 10% carbamide peroxide in a custom-made tray every night for 2 weeks. The factor studied was the bleaching technique on two levels: Technique 1 (in-office bleaching combined with home bleaching) and Technique 2 (home bleaching only). The response variables were color change, dental sensitivity, morphology, and surface roughness. The maxillary right and left hemiarches of the participants were submitted to in-office placebo treatment and in-office bleaching, respectively (Phase 1), and at-home bleaching (Phase 2) treatment was performed on both hemiarches, characterizing a split-mouth design. Enamel surface changes and roughness were analyzed with scanning electron microscopy and optical interferometry using epoxy replicas. No statistically significant differences were observed between the bleaching techniques for either the visual or the digital analyses. There was a significant difference in dental sensitivity when both dental bleaching techniques were used, with in-office bleaching producing the highest levels of dental sensitivity after the baseline. Microscopic analysis of the morphology and roughness of the enamel surface showed no significant changes between the bleaching techniques. The two techniques produced similar results in color change, and the combination technique produced the highest levels of sensitivity. Neither technique promoted changes in morphology or surface roughness of enamel.

  2. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
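The PCG iteration described above can be sketched in a few lines. The sketch below is a generic preconditioned conjugate gradient solver in NumPy; the matrix `K`, right-hand side `f`, and the diagonal ("Jacobi") preconditioner are illustrative stand-ins for the global stiffness matrix and the orthotropic preconditioning matrix of the paper.

```python
import numpy as np

def pcg(K, f, M_solve, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for K u = f.

    M_solve(r) applies the inverse of the preconditioner M to a residual.
    In the paper's scheme M would be the orthotropic part of K (all
    nonorthotropic coupling terms set to zero); any SPD M works here.
    """
    u = np.zeros_like(f)
    r = f - K @ u
    z = M_solve(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Kp = K @ p
        alpha = rz / (p @ Kp)
        u += alpha * p
        r -= alpha * Kp
        if np.linalg.norm(r) < tol * np.linalg.norm(f):
            break
        z = M_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return u

# Illustrative SPD "stiffness" matrix with off-diagonal coupling terms.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
K = A @ A.T + 50 * np.eye(50)
f = rng.standard_normal(50)

# Preconditioner: the diagonal of K, a stand-in for the orthotropic part.
d = np.diag(K)
u = pcg(K, f, lambda r: r / d)
print(np.allclose(K @ u, f, atol=1e-6))  # True
```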

  3. Evaluation of radioisotope tracer and activation analysis techniques for contamination monitoring in space environment simulation chambers

    NASA Technical Reports Server (NTRS)

    Smathers, J. B.; Kuykendall, W. E., Jr.; Wright, R. E., Jr.; Marshall, J. R.

    1973-01-01

Radioisotope measurement techniques and neutron activation analysis are evaluated for use in identifying and locating contamination sources in space environment simulation chambers. The alpha range method allows the determination of total contaminant concentration in both the vapor and condensate states. A Cf-252 neutron activation analysis system for detecting oils and greases tagged with stable elements is described. While neutron activation analysis of tagged contaminants offers specificity, an on-site system is extremely costly to implement and provides only marginal detection sensitivity under even the most favorable conditions.

  4. Direct Analysis of Samples of Various Origin and Composition Using Specific Types of Mass Spectrometry.

    PubMed

    Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek

    2017-07-04

One of the major sources of error in chemical analysis with conventional, established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this stage is usually required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass sample preparation; in this review, we comment on some of the new direct techniques that are based on mass spectrometry. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis, without sample preparation or with only limited pre-concentration steps. The MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These techniques have found numerous applications across many fields owing to their favorable properties. Their advantages and disadvantages are presented, along with trends in the development of direct analysis using them.

  5. Negative electrospray ionization on porous supporting tips for mass spectrometric analysis: electrostatic charging effect on detection sensitivity and its application to explosive detection.

    PubMed

    Wong, Melody Yee-Man; Man, Sin-Heng; Che, Chi-Ming; Lau, Kai-Chung; Ng, Kwan-Ming

    2014-03-21

    The simplicity and easy manipulation of a porous substrate-based ESI-MS technique have been widely applied to the direct analysis of different types of samples in positive ion mode. However, the study and application of this technique in negative ion mode are sparse. A key challenge could be due to the ease of electrical discharge on supporting tips upon the application of negative voltage. The aim of this study is to investigate the effect of supporting materials, including polyester, polyethylene and wood, on the detection sensitivity of a porous substrate-based negative ESI-MS technique. By using nitrobenzene derivatives and nitrophenol derivatives as the target analytes, it was found that the hydrophobic materials (i.e., polyethylene and polyester) with a higher tendency to accumulate negative charge could enhance the detection sensitivity towards nitrobenzene derivatives via electron-capture ionization; whereas, compounds with electron affinities lower than the cut-off value (1.13 eV) were not detected. Nitrophenol derivatives with pKa smaller than 9.0 could be detected in the form of deprotonated ions; whereas polar materials (i.e., wood), which might undergo competitive deprotonation with the analytes, could suppress the detection sensitivity. With the investigation of the material effects on the detection sensitivity, the porous substrate-based negative ESI-MS method was developed and applied to the direct detection of two commonly encountered explosives in complex samples.

  6. Recent Advances in the Measurement of Arsenic, Cadmium, and Mercury in Rice and Other Foods

    PubMed Central

    Punshon, Tracy

    2015-01-01

Trace element analysis of foods is of increasing importance because of raised consumer awareness and the need to evaluate and establish regulatory guidelines for toxic trace metals and metalloids. This paper reviews recent advances in the analysis of trace elements in food, including challenges, state-of-the-art methods, and the use of spatially resolved techniques for localizing the distribution of As and Hg within rice grains. Total elemental analysis of foods is relatively well established, but the push for ever lower detection limits requires that methods be robust to potential matrix interferences, which can be particularly severe for food. Inductively coupled plasma mass spectrometry (ICP-MS) is the method of choice, allowing multi-element and highly sensitive analyses. For arsenic, speciation analysis is necessary because the inorganic forms are more likely to be subject to regulatory limits. Chromatographic techniques coupled to ICP-MS are most often used for arsenic speciation, and a range of methods now exists for a variety of arsenic species in different food matrices. Speciation and spatial analysis of foods, especially rice, can also be achieved with synchrotron techniques. Sensitive analytical techniques and methodological advances provide robust methods for the assessment of several metals in animal- and plant-based foods, in particular arsenic, cadmium, and mercury in rice, and arsenic speciation in foodstuffs. PMID:25938012

  7. Sensitivity analysis of automatic flight control systems using singular value concepts

    NASA Technical Reports Server (NTRS)

    Herrera-Vaillard, A.; Paduano, J.; Downing, D.

    1985-01-01

    A sensitivity analysis is presented that can be used to judge the impact of vehicle dynamic model variations on the relative stability of multivariable continuous closed-loop control systems. The sensitivity analysis uses and extends the singular-value concept by developing expressions for the gradients of the singular value with respect to variations in the vehicle dynamic model and the controller design. Combined with a priori estimates of the accuracy of the model, the gradients are used to identify the elements in the vehicle dynamic model and controller that could severely impact the system's relative stability. The technique is demonstrated for a yaw/roll damper stability augmentation designed for a business jet.
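The central quantity in this analysis, the gradient of a singular value with respect to a model parameter, follows the standard first-order perturbation result dσ_i/dp = Re(u_iᴴ (dA/dp) v_i), valid for distinct singular values. A minimal NumPy sketch, using an invented 2x2 "return difference" matrix I + L(p), verifies the analytic gradient against a finite-difference estimate, mirroring the accuracy checks in such studies:

```python
import numpy as np

def sv_gradient(A, dA):
    """Gradient of each singular value of A w.r.t. a scalar parameter p,
    given dA = dA/dp, via dsigma_i/dp = Re(u_i^H dA v_i) (distinct
    singular values assumed)."""
    U, s, Vh = np.linalg.svd(A)
    V = Vh.conj().T
    return np.real(np.einsum('ji,jk,ki->i', U.conj(), dA, V))

# Hypothetical return-difference matrix I + L(p), with L depending on p.
def L(p):
    return np.array([[1.0 + p, 0.5],
                     [0.2, 2.0 - 0.5 * p]])

p0, h = 0.3, 1e-6
A = np.eye(2) + L(p0)
dA = np.array([[1.0, 0.0], [0.0, -0.5]])   # dL/dp

grad = sv_gradient(A, dA)

# Finite-difference cross-check on the singular values.
s_plus = np.linalg.svd(np.eye(2) + L(p0 + h), compute_uv=False)
s_minus = np.linalg.svd(np.eye(2) + L(p0 - h), compute_uv=False)
fd = (s_plus - s_minus) / (2 * h)
print(np.allclose(grad, fd, atol=1e-5))  # True
```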

  8. Detection of cervical lesions by multivariate analysis of diffuse reflectance spectra: a clinical study.

    PubMed

    Prabitha, Vasumathi Gopala; Suchetha, Sambasivan; Jayanthi, Jayaraj Lalitha; Baiju, Kamalasanan Vijayakumary; Rema, Prabhakaran; Anuraj, Koyippurath; Mathews, Anita; Sebastian, Paul; Subhash, Narayanan

    2016-01-01

    Diffuse reflectance (DR) spectroscopy is a non-invasive, real-time, and cost-effective tool for early detection of malignant changes in squamous epithelial tissues. The present study aims to evaluate the diagnostic power of diffuse reflectance spectroscopy for non-invasive discrimination of cervical lesions in vivo. A clinical trial was carried out on 48 sites in 34 patients by recording DR spectra using a point-monitoring device with white light illumination. The acquired data were analyzed and classified using multivariate statistical analysis based on principal component analysis (PCA) and linear discriminant analysis (LDA). Diagnostic accuracies were validated using random number generators. The receiver operating characteristic (ROC) curves were plotted for evaluating the discriminating power of the proposed statistical technique. An algorithm was developed and used to classify non-diseased (normal) from diseased sites (abnormal) with a sensitivity of 72 % and specificity of 87 %. While low-grade squamous intraepithelial lesion (LSIL) could be discriminated from normal with a sensitivity of 56 % and specificity of 80 %, and high-grade squamous intraepithelial lesion (HSIL) from normal with a sensitivity of 89 % and specificity of 97 %, LSIL could be discriminated from HSIL with 100 % sensitivity and specificity. The areas under the ROC curves were 0.993 (95 % confidence interval (CI) 0.0 to 1) and 1 (95 % CI 1) for the discrimination of HSIL from normal and HSIL from LSIL, respectively. The results of the study show that DR spectroscopy could be used along with multivariate analytical techniques as a non-invasive technique to monitor cervical disease status in real time.
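A minimal sketch of the PCA-plus-LDA classification pipeline described above, using synthetic two-class "spectra" rather than the study's diffuse reflectance data; the class shift, number of retained components, and thresholding rule are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for normal vs. abnormal diffuse reflectance spectra.
n, d = 60, 40
normal = rng.standard_normal((n, d))
abnormal = rng.standard_normal((n, d)) + 0.9   # mean shift between classes
X = np.vstack([normal, abnormal])
y = np.array([0] * n + [1] * n)                # 1 = diseased

# PCA: project onto the three leading principal components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T

# Two-class LDA (Fisher discriminant) on the PCA scores.
m0, m1 = scores[y == 0].mean(0), scores[y == 1].mean(0)
Sw = np.cov(scores[y == 0].T) + np.cov(scores[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
threshold = w @ (m0 + m1) / 2
pred = (scores @ w > threshold).astype(int)

tp = np.sum((pred == 1) & (y == 1))
tn = np.sum((pred == 0) & (y == 0))
sensitivity = tp / np.sum(y == 1)
specificity = tn / np.sum(y == 0)
print(round(sensitivity, 2), round(specificity, 2))
```

In a clinical setting the threshold would be tuned on the ROC curve, trading sensitivity against specificity, rather than fixed at the class-mean midpoint as here.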

  9. Performance and sensitivity analysis of the generalized likelihood ratio method for failure detection. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Bueno, R. A.

    1977-01-01

Results of the generalized likelihood ratio (GLR) technique for the detection of failures in aircraft applications are presented, and its relationship to the properties of the Kalman-Bucy filter is examined. Under the assumption that the system is perfectly modeled, the detectability and distinguishability of four failure types are investigated by means of analysis and simulations. Detection of failures is found to be satisfactory, but problems may arise in correctly identifying the mode of a failure. These issues, as well as the sensitivity of GLR to modeling errors, are closely examined. The advantages and disadvantages of this technique are discussed, and various modifications are suggested to reduce its limitations in performance and computational complexity.
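A scalar special case of the GLR idea, detecting a step bias of unknown magnitude in white Gaussian innovations, can be sketched as follows. The fault time, bias size, and threshold are invented for illustration; the full Kalman-filter GLR of the thesis matches filter residuals against failure signatures rather than this simple cumulative-sum form.

```python
import numpy as np

def glr_bias(residuals, sigma):
    """GLR statistic for a step bias of unknown magnitude starting at
    time k in zero-mean white Gaussian innovations:
        l(k) = (sum_{t >= k} r_t)^2 / (sigma^2 * (N - k)).
    Maximizing over k gives the detection statistic and the estimated
    failure onset time."""
    N = len(residuals)
    best_k, best_l = 0, -np.inf
    for k in range(N):
        s = residuals[k:].sum()
        l = s * s / (sigma ** 2 * (N - k))
        if l > best_l:
            best_k, best_l = k, l
    return best_k, best_l

rng = np.random.default_rng(2)
clean = rng.standard_normal(200)
faulty = clean.copy()
faulty[120:] += 1.5                     # bias failure injected at t = 120

k_hat, l_fault = glr_bias(faulty, 1.0)
_, l_clean = glr_bias(clean, 1.0)
print(k_hat, round(l_fault, 1), round(l_clean, 1))
```

The statistic is large on the faulty sequence and small on the clean one, so a threshold between the two separates "failure" from "no failure" while `k_hat` localizes the onset.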

  10. A novel pulse height analysis technique for nuclear spectroscopic and imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, H. H.; Wang, C. Y.; Chou, H. P.

    2005-08-01

The proposed pulse height analysis technique is based on the constant and linear relationship between the pulse width and pulse height generated by the front-end electronics of nuclear spectroscopic and imaging systems. The technique has been successfully implemented in the sump water radiation monitoring system of a nuclear power plant. The radiation monitoring system uses a NaI(Tl) scintillator to detect radioactive nuclides of radon daughters brought down by rain. The technique is also used in a nuclear medical imaging system, which uses a position-sensitive photomultiplier tube coupled with a scintillator. The proposed technique greatly simplifies the electronic design and makes the systems feasible for portable applications.

  11. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.

    1972-01-01

The search for advanced measurement techniques for determining the long-term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques judged able to meet, or to be adaptable to meet, the requirements; refinements or changes were recommended to improve others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of a new technique, volatile metal chelate analysis. Rivaling neutron activation analysis in sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.

  12. Detection of proteolytic activity by covalent tethering of fluorogenic substrates in zymogram gels.

    PubMed

    Deshmukh, Ameya A; Weist, Jessica L; Leight, Jennifer L

    2018-05-01

    Current zymographic techniques detect only a subset of known proteases due to the limited number of native proteins that have been optimized for incorporation into polyacrylamide gels. To address this limitation, we have developed a technique to covalently incorporate fluorescently labeled, protease-sensitive peptides using an azido-PEG3-maleimide crosslinker. Peptides incorporated into gels enabled measurement of MMP-2, -9, -14, and bacterial collagenase. Sensitivity analysis demonstrated that use of peptide functionalized gels could surpass detection limits of current techniques. Finally, electrophoresis of conditioned media from cultured cells resulted in the appearance of several proteolytic bands, some of which were undetectable by gelatin zymography. Taken together, these results demonstrate that covalent incorporation of fluorescent substrates can greatly expand the library of detectable proteases using zymographic techniques.

  13. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of ¹⁴C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of ¹⁴C in submicrogram samples. Based on the specificity of narrow laser resonances coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 10⁻¹⁵ ¹⁴C/¹²C ratios are obtained. Using a 15-W ¹⁴CO₂ laser, a linear calibration with samples from 10⁻¹⁵ to >1.5 × 10⁻¹² in ¹⁴C/¹²C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating, and real-time monitoring of atmospheric radiocarbon. The method can also be applied to the detection of other trace entities.

  14. Sensitivity of Forecast Skill to Different Objective Analysis Schemes

    NASA Technical Reports Server (NTRS)

    Baker, W. E.

    1979-01-01

    Numerical weather forecasts are characterized by rapidly declining skill in the first 48 to 72 h. Recent estimates of the sources of forecast error indicate that the inaccurate specification of the initial conditions contributes substantially to this error. The sensitivity of the forecast skill to the initial conditions is examined by comparing a set of real-data experiments whose initial data were obtained with two different analysis schemes. Results are presented to emphasize the importance of the objective analysis techniques used in the assimilation of observational data.

  15. Design and characterization of planar capacitive imaging probe based on the measurement sensitivity distribution

    NASA Astrophysics Data System (ADS)

    Yin, X.; Chen, G.; Li, W.; Huthchins, D. A.

    2013-01-01

Previous work indicated that the capacitive imaging (CI) technique is a useful NDE tool that can be used on a wide range of materials, including metals, glass/carbon fibre composites and concrete. The imaging performance of the CI technique for a given application is determined by the design parameters and characteristics of the CI probe. In this paper, a rapid method for calculating the whole-probe sensitivity distribution based on a finite element model (FEM) is presented to provide a direct view of the imaging capabilities of the planar CI probe. Sensitivity distributions of CI probes with different geometries were obtained, and factors influencing the sensitivity distribution were studied. Comparisons between CI probes with point-to-point and back-to-back triangular electrode pairs were made based on analysis of the corresponding sensitivity distributions. The results indicated that the sensitivity distribution could be useful for optimising the probe design parameters and predicting the imaging performance.

  16. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.; Stebbins, J. P.; Smith, A. W.; Pullen, K. E.

    1973-01-01

    A method for the prediction of propellant-material compatibility for periods of time up to ten years is presented. Advanced sensitive measurement techniques used in the prediction method are described. These include: neutron activation analysis, radioactive tracer technique, and atomic absorption spectroscopy with a graphite tube furnace sampler. The results of laboratory tests performed to verify the prediction method are presented.

  17. Applying Recursive Sensitivity Analysis to Multi-Criteria Decision Models to Reduce Bias in Defense Cyber Engineering Analysis

    DTIC Science & Technology

    2015-10-28

    techniques such as regression analysis, correlation, and multicollinearity assessment to identify the change and error on the input to the model...between many of the independent or predictor variables, the issue of multicollinearity may arise [18]. VII. SUMMARY Accurate decisions concerning

  18. A comparison of solute-transport solution techniques based on inverse modelling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2000-01-01

Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results (simulated breakthrough curves, sensitivity analysis, and calibrated parameter values) change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.
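The numerical dispersion that drives the differences in simulated peak concentrations can be demonstrated with a one-dimensional toy problem: advecting a tracer pulse with a first-order upwind scheme, whose truncation error acts like added dispersion and smears the peak. The grid spacing, Courant number, and pulse width below are arbitrary illustrative values.

```python
import numpy as np

# Pure advection c_t + v c_x = 0 with first-order upwind differencing.
# The exact solution just translates the pulse unchanged; the scheme's
# truncation error behaves like an artificial dispersion coefficient
# D_num ~ v*dx*(1 - Cr)/2, which lowers and widens the peak.
nx, v, dx, dt, nsteps = 400, 1.0, 1.0, 0.5, 400   # Courant number Cr = 0.5
x = np.arange(nx) * dx
c = np.exp(-((x - 50.0) / 5.0) ** 2)              # initial tracer pulse
peak0 = c.max()

cr = v * dt / dx
for _ in range(nsteps):
    c[1:] = c[1:] - cr * (c[1:] - c[:-1])         # upwind update
    c[0] = 0.0                                    # clean inflow boundary

print(round(peak0, 3), round(c.max(), 3))   # peak is smeared well below 1.0
```

The pulse arrives centered near x = 250 as it should, but its peak concentration drops substantially, which is exactly the scheme-dependent behavior the study reports.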

  19. Evaluation of mobile digital light-emitting diode fluorescence microscopy in Hanoi, Viet Nam.

    PubMed

    Chaisson, L H; Reber, C; Phan, H; Switz, N; Nilsson, L M; Myers, F; Nhung, N V; Luu, L; Pham, T; Vu, C; Nguyen, H; Nguyen, A; Dinh, T; Nahid, P; Fletcher, D A; Cattamanchi, A

    2015-09-01

Hanoi Lung Hospital, Hanoi, Viet Nam. To compare the accuracy of CellScopeTB, a manually operated mobile digital fluorescence microscope, with conventional microscopy techniques. Patients referred for sputum smear microscopy to the Hanoi Lung Hospital from May to September 2013 were included. Ziehl-Neelsen (ZN) smear microscopy, conventional light-emitting diode (LED) fluorescence microscopy (FM), CellScopeTB-based LED FM and Xpert® MTB/RIF were performed on sputum samples. The sensitivity and specificity of microscopy techniques were determined in reference to Xpert results, and differences were compared using McNemar's paired test of proportions. Of 326 patients enrolled, 93 (28.5%) were Xpert-positive for TB. The sensitivity of ZN microscopy, conventional LED FM, and CellScopeTB-based LED FM was respectively 37.6% (95%CI 27.8-48.3), 41.9% (95%CI 31.8-52.6), and 35.5% (95%CI 25.8-46.1). The sensitivity of CellScopeTB was similar to that of conventional LED FM (difference -6.5%, 95%CI -18.2 to 5.3, P = 0.33) and ZN microscopy (difference -2.2%, 95%CI -9.2 to 4.9, P = 0.73). The specificity was >99% for all three techniques. CellScopeTB performed similarly to conventional microscopy techniques in the hands of experienced TB microscopists. However, the sensitivity of all sputum microscopy techniques was low. Options enabled by digital microscopy, such as automated imaging with real-time computerized analysis, should be explored to increase sensitivity.
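Sensitivity and specificity against the Xpert reference, with confidence intervals, reduce to simple proportion estimates. The sketch below uses the normal-approximation (Wald) interval, so its upper bound differs slightly from the interval reported in the study, which presumably used a different interval method; the 232/1 true-negative/false-positive split is an assumption consistent with the reported specificity of >99% among the 233 Xpert-negative patients.

```python
import math

def sens_spec_ci(tp, fn, tn, fp, z=1.96):
    """Sensitivity and specificity against a reference standard, each
    with a normal-approximation (Wald) 95% confidence interval."""
    def prop_ci(k, n):
        p = k / n
        half = z * math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half), min(1.0, p + half)
    return prop_ci(tp, tp + fn), prop_ci(tn, tn + fp)

# ZN microscopy vs. Xpert: 35 of 93 Xpert-positives detected (37.6%).
# The tn/fp split below is assumed, consistent with specificity >99%.
(sens, lo, hi), (spec, _, _) = sens_spec_ci(tp=35, fn=58, tn=232, fp=1)
print(f"sensitivity {sens:.1%} (95% CI {lo:.1%}-{hi:.1%}), "
      f"specificity {spec:.1%}")
```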

  20. Instrument performance of a radon measuring system with the alpha-track detection technique.

    PubMed

    Tokonami, S; Zhuo, W; Ryuo, H; Yonehara, H; Yamada, Y; Shimo, M

    2003-01-01

An instrument performance test has been carried out for a radon measuring system made in Hungary. The system measures radon using the alpha-track detection technique. It consists of three parts: the passive detector, the etching unit and the evaluation unit. A CR-39 detector is used as the radiation detector. Alpha-track reading and data analysis are carried out after chemical etching. The following subjects were examined in the present study: (1) radon sensitivity, (2) performance of etching and evaluation processes and (3) thoron sensitivity. The radon sensitivity of 6.9 × 10⁻⁴ mm⁻² (Bq m⁻³ d)⁻¹ was acceptable for practical application. The thoron sensitivity was estimated to be as low as 3.3 × 10⁻⁵ mm⁻² (Bq m⁻³ d)⁻¹ from the experimental study.
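Converting a measured track density back to a time-integrated radon concentration is a one-line calculation with the reported sensitivity; the exposure time and track density below are invented for illustration.

```python
# Track-density to concentration conversion using the study's reported
# sensitivities; exposure time and track density are illustrative values.
RADON_SENS = 6.9e-4    # tracks mm^-2 per (Bq m^-3 day), from the study
THORON_SENS = 3.3e-5   # same units; ~21x lower, giving radon selectivity

def radon_concentration(track_density, exposure_days, sens=RADON_SENS):
    """Mean radon concentration (Bq m^-3) over the exposure period,
    given the net track density in tracks mm^-2."""
    return track_density / (sens * exposure_days)

# A hypothetical 90-day exposure producing 6.2 tracks/mm^2:
c = radon_concentration(track_density=6.2, exposure_days=90)
print(round(c, 1), round(RADON_SENS / THORON_SENS, 1))  # 99.8 20.9
```

The sensitivity ratio of about 21 is what makes the detector effectively radon-selective: an equal thoron exposure contributes only a small fraction of the track density.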

  1. Evaluation of ALK gene rearrangement in central nervous system metastases of non-small-cell lung cancer using two-step RT-PCR technique.

    PubMed

    Nicoś, M; Krawczyk, P; Wojas-Krawczyk, K; Bożyk, A; Jarosz, B; Sawicki, M; Trojanowski, T; Milanowski, J

    2017-12-01

The RT-PCR technique has shown promise as a pre-screening method for detection of mRNA containing abnormal ALK sequences, but its sensitivity and specificity are still debatable. Previously, we determined the incidence of ALK rearrangement in CNS metastases of NSCLC using IHC and FISH methods. We evaluated ALK gene rearrangement using a two-step RT-PCR method with the EML4-ALK Fusion Gene Detection Kit (Entrogen, USA). The studied group included 145 patients (45 females, 100 males) with CNS metastases of NSCLC and was heterogeneous in terms of histology and smoking status. 21% of CNS metastases of NSCLC (30/145) showed the presence of mRNA containing abnormal ALK sequences. FISH and IHC tests confirmed the presence of ALK gene rearrangement and expression of abnormal ALK protein in seven patients with a positive RT-PCR result (4.8% of all patients, 20% of RT-PCR-positive patients). Compared to FISH analysis, the RT-PCR method achieved 100% sensitivity but only 82.7% specificity. The IHC method compared to FISH indicated 100% sensitivity and 97.8% specificity. In comparison to IHC, RT-PCR showed identical sensitivity with a high number of false-positive results. The utility of the RT-PCR technique in screening for ALK abnormalities and in qualifying patients for molecularly targeted therapies needs further validation.

  2. Bioimaging of cells and tissues using accelerator-based sources.

    PubMed

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotron) particle accelerators have been constructed worldwide to provide the scientific community with unprecedented analytical performance. These facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single-cell (<1 μm) or tissue (<1 mm) analyses, (2) a temporal resolution to follow molecular dynamics, and (3) a sensitivity in the micromolar to nanomolar range, thus allowing true investigations of biological dynamics. Third-generation synchrotrons now offer the opportunity of bioanalytical measurements at nanometer resolution with remarkable sensitivity. Linear accelerators are more specialized in their physical features but may exceed synchrotron performance. All these techniques have become irreplaceable tools for developing knowledge in biology. This review highlights the pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens.

  3. Sobol' sensitivity analysis for stressor impacts on honeybee ...

    EPA Pesticide Factsheets

We employ Monte Carlo simulation and nonlinear sensitivity analysis techniques to describe the dynamics of a bee exposure model, VarroaPop. Daily simulations are performed of hive population trajectories, taking into account queen strength, foraging success, mite impacts, weather, colony resources, population structure, and other important variables. This allows us to test the effects of defined pesticide exposure scenarios versus controlled simulations that lack pesticide exposure. The daily resolution of the model also allows us to conditionally identify sensitivity metrics. We use the variance-based global decomposition sensitivity analysis method, Sobol’, to assess first- and second-order parameter sensitivities within VarroaPop, allowing us to determine how variance in the output is attributed to each of the input variables across different exposure scenarios. Simulations with VarroaPop indicate queen strength, forager life span and pesticide toxicity parameters are consistent, critical inputs for colony dynamics. Further analysis also reveals that the relative importance of these parameters fluctuates throughout the simulation period according to the status of other inputs. Our preliminary results show that model variability is conditional and can be attributed to different parameters depending on different timescales. By using sensitivity analysis to assess model output and variability, calibrations of simulation models can be better informed to yield more
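The variance-based first-order Sobol' indices described above can be estimated with plain Monte Carlo. The sketch below uses Jansen's estimator on a toy three-input stand-in for VarroaPop; the input names and weights are invented, and the real model is a dynamic colony simulation rather than a weighted sum.

```python
import numpy as np

rng = np.random.default_rng(3)

def colony_model(x):
    # Toy stand-in for VarroaPop: colony response as a weighted sum of
    # three uniform inputs (weights are invented for illustration).
    queen, forager_life, toxicity = x[:, 0], x[:, 1], x[:, 2]
    return 4.0 * queen + 2.0 * forager_life - 1.0 * toxicity

def sobol_first_order(f, d, n=100_000):
    """First-order Sobol' indices via Monte Carlo (Jansen's estimator):
    S_i = 1 - E[(f(B) - f(AB_i))^2] / (2 Var(Y)), where AB_i is sample
    matrix A with column i taken from the independent matrix B."""
    A, B = rng.random((n, d)), rng.random((n, d))
    yA, yB = f(A), f(B)
    var = np.var(np.concatenate([yA, yB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        S[i] = 1.0 - np.mean((yB - f(ABi)) ** 2) / (2.0 * var)
    return S

S = sobol_first_order(colony_model, d=3)
print(np.round(S, 2))  # analytic values: (16, 4, 1)/21 ~ [0.76, 0.19, 0.05]
```

For this additive model the indices sum to one; in a model with interactions, the gap between the sum of first-order indices and one measures how much variance the higher-order terms carry.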

  4. Development of techniques for the analysis of isoflavones in soy foods and nutraceuticals.

    PubMed

    Dentith, Susan; Lockwood, Brian

    2008-05-01

For over 20 years, soy isoflavones have been investigated for their ability to prevent a wide range of cancers and cardiovascular problems, and numerous other disease states. This research is underpinned by the ability of researchers to analyse isoflavones in various forms in a range of raw materials and biological fluids. This review summarizes the techniques recently used in their analysis. The speed of high-performance liquid chromatography analysis has been improved, allowing analysis of more samples, and the increasing sensitivity of detection techniques allows quantification of isoflavones down to nanomoles-per-litre levels in biological fluids. The combination of high-performance liquid chromatography with immunoassay has allowed identification and estimation of low-level soy isoflavones. The use of soy isoflavone supplements has shown an increase in their circulating levels in plasma and urine, aiding investigation of their biological effects. The significance of the metabolite equol has spurred research into new areas, and recently the specific enantiomers have been studied. High-performance liquid chromatography, capillary electrophoresis and gas chromatography are widely used with a range of detection systems. Increasingly, immunoassay is being used because of its high sensitivity and low cost.

  5. Screening for trace explosives by AccuTOF™-DART®: an in-depth validation study.

    PubMed

    Sisco, Edward; Dake, Jeffrey; Bridge, Candice

    2013-10-10

Ambient ionization mass spectrometry is finding increasing utility as a rapid analysis technique in a number of fields. In forensic science specifically, analysis of many types of samples, including drugs, explosives, inks, bank dye, and lotions, has been shown to be possible using these techniques [1]. This paper focuses on one type of ambient ionization mass spectrometry, Direct Analysis in Real Time Mass Spectrometry (DART-MS or DART), and its viability as a screening tool for trace explosives analysis. In order to assess viability, a validation study was completed which focused on the analysis of trace amounts of nitro and peroxide based explosives. Topics which were studied, and are discussed, include method optimization, reproducibility, sensitivity, development of a search library, discrimination of mixtures, and blind sampling. Advantages and disadvantages of this technique over other similar screening techniques are also discussed.

  6. Real-time In vivo Diagnosis of Nasopharyngeal Carcinoma Using Rapid Fiber-Optic Raman Spectroscopy.

    PubMed

    Lin, Kan; Zheng, Wei; Lim, Chwee Ming; Huang, Zhiwei

    2017-01-01

    We report the utility of a simultaneous fingerprint (FP) (i.e., 800-1800 cm⁻¹) and high-wavenumber (HW) (i.e., 2800-3600 cm⁻¹) fiber-optic Raman spectroscopy technique developed for real-time in vivo diagnosis of nasopharyngeal carcinoma (NPC) at endoscopy. A total of 3731 high-quality in vivo FP/HW Raman spectra (normal=1765; cancer=1966) were acquired in real-time from 204 tissue sites (normal=95; cancer=109) of 95 subjects (normal=57; cancer=38) undergoing endoscopic examination. FP/HW Raman spectra differ significantly between normal and cancerous nasopharyngeal tissues, which could be attributed to changes of proteins, lipids, nucleic acids, and the bound water content in NPC. Principal components analysis (PCA) and linear discriminant analysis (LDA) together with leave-one-subject-out cross-validation (LOO-CV) were implemented to develop robust Raman diagnostic models. The simultaneous FP/HW Raman spectroscopy technique together with PCA-LDA and LOO-CV modeling provides a diagnostic accuracy of 93.1% (sensitivity of 93.6%; specificity of 92.6%) for nasopharyngeal cancer identification, which is superior to using either the FP (accuracy of 89.2%; sensitivity of 89.9%; specificity of 88.4%) or HW (accuracy of 89.7%; sensitivity of 89.0%; specificity of 90.5%) Raman technique alone. Further receiver operating characteristic (ROC) analysis reconfirms the best performance of the simultaneous FP/HW Raman technique for in vivo diagnosis of NPC. This work demonstrates for the first time that the simultaneous FP/HW fiber-optic Raman spectroscopy technique has great promise for enhancing real-time in vivo cancer diagnosis in the nasopharynx during endoscopic examination.

  7. Sensitive Multi-Species Emissions Monitoring: Infrared Laser-Based Detection of Trace-Level Contaminants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steill, Jeffrey D.; Huang, Haifeng; Hoops, Alexandra A.

    This report summarizes our development of spectroscopic chemical analysis techniques and spectral modeling for trace-gas measurements of highly-regulated, low-concentration species, such as HCl, present in flue gas emissions from utility coal boilers under conditions of high humidity. Detailed spectral modeling of HCl and other important combustion and atmospheric species such as H2O, CO2, N2O, NO2, SO2, and CH4 demonstrates that IR-laser spectroscopy is a sensitive multi-component analysis strategy. Experimental measurements from techniques based on IR laser spectroscopy are presented that demonstrate sub-ppm sensitivity levels to these species. Photoacoustic infrared spectroscopy is used to detect and quantify HCl at ppm levels with extremely high signal-to-noise even under conditions of high relative humidity. Additionally, cavity ring-down IR spectroscopy is used to achieve extremely high sensitivity to combustion trace gases in this spectral region; ppm-level CH4 is one demonstrated example. The importance of spectral resolution in the sensitivity of a trace-gas measurement is examined by spectral modeling in the mid- and near-IR, and efforts to improve measurement resolution through novel instrument development are described. While previous project reports focused on the benefits and complexities of the dual-etalon cavity ring-down infrared spectrometer, here the steps taken to implement this unique and potentially revolutionary instrument are described. This report also illustrates and critiques the general strategy of IR-laser photodetection of trace gases, leading to the conclusion that mid-IR laser spectroscopy techniques provide a promising basis for further instrument development and implementation that will enable cost-effective, sensitive detection of multiple key contaminant species simultaneously.

  8. Sensitivity Analysis for Coupled Aero-structural Systems

    NASA Technical Reports Server (NTRS)

    Giunta, Anthony A.

    1999-01-01

    A novel method has been developed for calculating gradients of aerodynamic force and moment coefficients for an aeroelastic aircraft model. This method uses the Global Sensitivity Equations (GSE) to account for the aero-structural coupling, and a reduced-order modal analysis approach to condense the coupling bandwidth between the aerodynamic and structural models. Parallel computing is applied to reduce the computational expense of the numerous high fidelity aerodynamic analyses needed for the coupled aero-structural system. Good agreement is obtained between aerodynamic force and moment gradients computed with the GSE/modal analysis approach and the same quantities computed using brute-force, computationally expensive, finite difference approximations. A comparison between the computational expense of the GSE/modal analysis method and a pure finite difference approach is presented. These results show that the GSE/modal analysis approach is the more computationally efficient technique if sensitivity analysis is to be performed for two or more aircraft design parameters.
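The GSE step described above can be sketched on a toy two-discipline system. The coupling coefficients below are invented for illustration, not taken from the paper's aeroelastic model: the total derivatives of the coupled outputs solve a small linear system assembled from the disciplines' partial derivatives, and can be checked against the brute-force finite differences the abstract mentions.

```python
import numpy as np

# Toy two-discipline coupled system (coefficients invented for illustration):
#   y1 = f1(x, y2) = x + 0.5*y2    ("aerodynamics" surrogate)
#   y2 = f2(x, y1) = 2*x - 0.3*y1  ("structures" surrogate)
def solve_coupled(x, tol=1e-12):
    y1, y2 = 0.0, 0.0
    for _ in range(200):  # Gauss-Seidel fixed-point iteration
        y1_new = x + 0.5 * y2
        y2_new = 2.0 * x - 0.3 * y1_new
        if abs(y1_new - y1) + abs(y2_new - y2) < tol:
            y1, y2 = y1_new, y2_new
            break
        y1, y2 = y1_new, y2_new
    return y1, y2

def gse_gradients(x):
    # Discipline-level partial derivatives
    df1_dy2, df1_dx = 0.5, 1.0
    df2_dy1, df2_dx = -0.3, 2.0
    # Global Sensitivity Equations: (I - J) dY/dx = dF/dx,
    # with J the inter-disciplinary Jacobian
    A = np.array([[1.0, -df1_dy2],
                  [-df2_dy1, 1.0]])
    b = np.array([df1_dx, df2_dx])
    return np.linalg.solve(A, b)  # total derivatives [dy1/dx, dy2/dx]

# Check against brute-force central finite differences
x0, h = 1.0, 1e-6
g = gse_gradients(x0)
y_p = np.array(solve_coupled(x0 + h))
y_m = np.array(solve_coupled(x0 - h))
fd = (y_p - y_m) / (2 * h)
print(g, fd)  # the two gradient estimates agree
```

The finite-difference check needs two full coupled solves per design variable, while the GSE solve reuses one set of discipline partials, which mirrors the cost argument made in the abstract.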

  9. Efficient sensitivity analysis method for chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Liao, Haitao

    2016-05-01

    The direct differentiation and improved least squares shadowing methods are both developed for accurately and efficiently calculating the sensitivity coefficients of time averaged quantities for chaotic dynamical systems. The key idea is to recast the time averaged integration term in the form of differential equation before applying the sensitivity analysis method. An additional constraint-based equation which forms the augmented equations of motion is proposed to calculate the time averaged integration variable and the sensitivity coefficients are obtained as a result of solving the augmented differential equations. The application of the least squares shadowing formulation to the augmented equations results in an explicit expression for the sensitivity coefficient which is dependent on the final state of the Lagrange multipliers. The LU factorization technique to calculate the Lagrange multipliers leads to a better performance for the convergence problem and the computational expense. Numerical experiments on a set of problems selected from the literature are presented to illustrate the developed methods. The numerical results demonstrate the correctness and effectiveness of the present approaches and some short impulsive sensitivity coefficients are observed by using the direct differentiation sensitivity analysis method.
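The recasting idea can be illustrated on a deliberately non-chaotic toy system, where plain direct differentiation is valid (the paper's shadowing machinery exists precisely because this naive tangent approach fails for chaotic dynamics). The time-averaged quantity is carried as an extra ODE state, alongside its tangent; all values below are illustrative:

```python
import numpy as np

# Toy system dx/dt = s - x: the long-time average of x tends to s,
# so d<x>/ds should approach 1. The running integral J and the tangent
# states v = dx/ds, W = dJ/ds are integrated alongside x.
def rhs(z, s):
    x, J, v, W = z
    return np.array([s - x, x, 1.0 - v, v])

def time_avg_sensitivity(s=2.0, T=100.0, dt=0.01):
    z = np.zeros(4)
    for _ in range(int(T / dt)):  # classical RK4
        k1 = rhs(z, s)
        k2 = rhs(z + 0.5 * dt * k1, s)
        k3 = rhs(z + 0.5 * dt * k2, s)
        k4 = rhs(z + dt * k3, s)
        z = z + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    avg = z[1] / T    # time-averaged x
    sens = z[3] / T   # d<x>/ds from the augmented tangent equations
    return avg, sens

avg, sens = time_avg_sensitivity()
print(avg, sens)  # avg approaches s = 2.0, sens approaches 1.0 as T grows
```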

  10. Development of a sensitive GC-C-IRMS method for the analysis of androgens.

    PubMed

    Polet, Michael; Van Gansbeke, Wim; Deventer, Koen; Van Eenoo, Peter

    2013-02-01

    The administration of anabolic steroids is one of the most important issues in doping control and is detectable through a change in the carbon isotopic composition of testosterone and/or its metabolites. Gas chromatography-combustion-isotope ratio mass spectrometry (GC-C-IRMS), however, remains a very laborious and expensive technique, and substantial amounts of urine are needed to meet the sensitivity requirements of the IRMS. This can be problematic because only a limited amount of urine is available for anti-doping analysis on a broad spectrum of substances. In this work we introduce a new type of injection that increases the sensitivity of GC-C-IRMS by a factor of 13 and reduces the limit of detection, simply by using solvent vent injections instead of splitless injection. This drastically reduces the amount of urine required. Moreover, because only the injection technique is changed, the detection parameters of the IRMS are not affected and there is no loss in linearity. Copyright © 2012 John Wiley & Sons, Ltd.

  11. Sensitivity analysis of hydrodynamic stability operators

    NASA Technical Reports Server (NTRS)

    Schmid, Peter J.; Henningson, Dan S.; Khorrami, Mehdi R.; Malik, Mujeeb R.

    1992-01-01

    The eigenvalue sensitivity for hydrodynamic stability operators is investigated. Classical matrix perturbation techniques as well as the concept of epsilon-pseudoeigenvalues are applied to show that parts of the spectrum are highly sensitive to small perturbations. Applications are drawn from incompressible plane Couette, trailing line vortex flow and compressible Blasius boundary layer flow. Parametric studies indicate a monotonically increasing effect of the Reynolds number on the sensitivity. The phenomenon of eigenvalue sensitivity is due to the non-normality of the operators and their discrete matrix analogs and may be associated with large transient growth of the corresponding initial value problem.
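A minimal numerical illustration of this phenomenon: the condition number 1/|yᴴx| of an eigenvalue (with unit right eigenvector x and left eigenvector y) is near 1 for normal matrices and can be arbitrarily large for non-normal ones. The matrices below are invented stand-ins, not actual discretized stability operators:

```python
import numpy as np
from scipy.linalg import eig

# Eigenvalue condition numbers 1/|y^H x| with unit right/left eigenvectors:
# large values flag eigenvalues that move strongly under small perturbations,
# the hallmark of the non-normal operators discussed above.
def eig_condition_numbers(A):
    lam, VL, VR = eig(A, left=True, right=True)
    kappa = []
    for k in range(len(lam)):
        x = VR[:, k] / np.linalg.norm(VR[:, k])
        y = VL[:, k] / np.linalg.norm(VL[:, k])
        kappa.append(1.0 / abs(np.vdot(y, x)))  # vdot conjugates y -> y^H x
    return lam, np.array(kappa)

# A strongly non-normal matrix (a crude stand-in for a stability operator)
A = np.array([[1.0, 100.0],
              [0.0, 2.0]])
lam, kappa = eig_condition_numbers(A)
print(kappa)    # both eigenvalues are highly sensitive (kappa >> 1)

# A normal (symmetric) matrix has perfectly conditioned eigenvalues
B = np.array([[1.0, 0.5],
              [0.5, 2.0]])
_, kappa_B = eig_condition_numbers(B)
print(kappa_B)  # close to [1, 1]
```

These condition numbers are the local counterpart of the epsilon-pseudospectrum view: eigenvalues with large kappa sit inside pseudospectral regions that balloon far from the spectrum.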

  12. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Tolson, Bryan

    2017-04-01

    The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters or the estimation of all of their uncertainty is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If convergence is checked at all, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We, therefore, present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method independency of the convergence testing method, we applied it to three widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991, Campolongo et al., 2000), the variance-based Sobol' method (Sobol' 1993, Saltelli et al. 2010) and a derivative-based method known as the Parameter Importance index (Goehler et al. 2013). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned three methods are known. 
This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. Subsequently, we focus on the model-independency by testing the frugal method using the hydrologic model mHM (www.ufz.de/mhm) with about 50 model parameters. The results show that the new frugal method is able to test the convergence and therefore the reliability of SA results in an efficient way. The appealing feature of this new technique is that it requires no further model evaluations, and it therefore enables the checking of already processed (and published) sensitivity results. This is one step towards reliable and transferable published sensitivity results.
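As a sketch of the kind of screening method named above, here is a simplified one-at-a-time variant of the Morris elementary-effects estimate (not the full trajectory design of Morris 1991, and the test function is invented) applied to a function whose parameter importance is known in advance:

```python
import numpy as np

# Simplified Morris-style screening on an analytic test function with known
# importance: f depends strongly on x0, moderately on x1, weakly on x2.
rng = np.random.default_rng(0)

def f(x):
    return 10.0 * x[0] + 2.0 * x[1] ** 2 + 0.1 * x[2]

def morris_mu_star(func, dim, n_traj=50, delta=0.1):
    effects = [[] for _ in range(dim)]
    for _ in range(n_traj):
        x = rng.uniform(0.0, 1.0 - delta, size=dim)  # random base point
        base = func(x)
        for i in range(dim):  # one-at-a-time perturbations
            xp = x.copy()
            xp[i] += delta
            effects[i].append(abs((func(xp) - base) / delta))
    # mu*: mean absolute elementary effect per parameter
    return np.array([np.mean(e) for e in effects])

mu_star = morris_mu_star(f, dim=3)
print(mu_star)  # x0 ranked most influential, x2 least
```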

  13. Novel Multidimensional Cross-Correlation Data Comparison Techniques for Spectroscopic Discernment in a Volumetrically Sensitive, Moderating Type Neutron Spectrometer

    NASA Astrophysics Data System (ADS)

    Hoshor, Cory; Young, Stephan; Rogers, Brent; Currie, James; Oakes, Thomas; Scott, Paul; Miller, William; Caruso, Anthony

    2014-03-01

    A novel application of the Pearson Cross-Correlation to neutron spectral discernment in a moderating type neutron spectrometer is introduced. This cross-correlation analysis will be applied to spectral response data collected through both MCNP simulation and empirical measurement by the volumetrically sensitive spectrometer for comparison in 1, 2, and 3 spatial dimensions. The spectroscopic analysis methods discussed will be demonstrated to discern various common spectral and monoenergetic neutron sources.
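The core matching step can be sketched as follows; the library entries and response vectors below are invented placeholders, not the spectrometer's actual MCNP or measured response data:

```python
import numpy as np

# Pearson cross-correlation matching of a measured detector response
# against a library of template responses (all data here hypothetical).
def pearson_r(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

library = {
    "Cf-252":  np.array([0.1, 0.3, 0.6, 0.8, 0.5, 0.2]),
    "AmBe":    np.array([0.2, 0.2, 0.4, 0.7, 0.9, 0.6]),
    "thermal": np.array([0.9, 0.7, 0.4, 0.2, 0.1, 0.05]),
}

measured = np.array([0.12, 0.28, 0.58, 0.82, 0.52, 0.22])  # noisy Cf-252-like
scores = {name: pearson_r(measured, tmpl) for name, tmpl in library.items()}
best = max(scores, key=scores.get)
print(best, scores[best])  # highest-correlation library entry wins
```

Extending this from a 1-D response vector to the 2-D and 3-D comparisons described above amounts to flattening the multidimensional response arrays before correlating them.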

  14. Photocleavable DNA barcode-antibody conjugates allow sensitive and multiplexed protein analysis in single cells.

    PubMed

    Agasti, Sarit S; Liong, Monty; Peterson, Vanessa M; Lee, Hakho; Weissleder, Ralph

    2012-11-14

    DNA barcoding is an attractive technology, as it allows sensitive and multiplexed target analysis. However, DNA barcoding of cellular proteins remains challenging, primarily because barcode amplification and readout techniques are often incompatible with the cellular microenvironment. Here we describe the development and validation of a photocleavable DNA barcode-antibody conjugate method for rapid, quantitative, and multiplexed detection of proteins in single live cells. Following target binding, this method allows DNA barcodes to be photoreleased in solution, enabling easy isolation, amplification, and readout. As a proof of principle, we demonstrate sensitive and multiplexed detection of protein biomarkers in a variety of cancer cells.

  15. A Bayesian technique for improving the sensitivity of the atmospheric neutrino L/E analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blake, A. S. T.; Chapman, J. D.; Thomson, M. A.

    This paper outlines a method for improving the precision of atmospheric neutrino oscillation measurements. One experimental signature for these oscillations is an observed deficit in the rate of νμ charged-current interactions with an oscillatory dependence on Lν/Eν, where Lν is the neutrino propagation distance and Eν is the neutrino energy. For contained-vertex atmospheric neutrino interactions, the Lν/Eν resolution varies significantly from event to event. The precision of the oscillation measurement can be improved by incorporating information on Lν/Eν resolution into the oscillation analysis. In the analysis presented here, a Bayesian technique is used to estimate the Lν/Eν resolution of observed atmospheric neutrinos on an event-by-event basis. By separating the events into bins of Lν/Eν resolution in the oscillation analysis, a significant improvement in oscillation sensitivity can be achieved.
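A quick sketch of why the event-by-event resolution matters: in the two-flavour approximation, the muon-neutrino survival probability oscillates in L/E, and smearing L/E with a poor resolution washes out the dip that carries the oscillation signal. The parameter values and the crude smearing model below are illustrative only:

```python
import numpy as np

# Two-flavour nu_mu survival probability:
#   P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L/E)
# with L in km, E in GeV, dm2 in eV^2 (illustrative parameter values).
def p_survival(l_over_e, dm2=2.4e-3, sin2_2theta=1.0):
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2 * l_over_e) ** 2

loe = np.linspace(1.0, 2000.0, 4000)  # km/GeV
p_true = p_survival(loe)

# Crude finite-resolution model: average P over a +/-30% window in L/E.
def smear(loe, frac=0.3, n=41):
    factors = np.linspace(1.0 - frac, 1.0 + frac, n)
    return np.mean([p_survival(loe * f) for f in factors], axis=0)

p_smeared = smear(loe)
print(p_true.min(), p_smeared.min())  # smearing makes the dip much shallower
```

Events with good L/E resolution retain the deep dip while poorly resolved events look like the smeared curve, which is why binning by resolution, as in the analysis above, recovers sensitivity.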

  16. Comparison of two preparatory techniques for urine cytology.

    PubMed Central

    Dhundee, J; Rigby, H S

    1990-01-01

    Two methods of preparation of urine for cytology were compared retrospectively. In method 1 cells in the urine were fixed after the preparation of the smear; in method 2 the cells were fixed before smear preparation. Urine cytology reports were correlated with subsequent histological analysis. The specificities of urine cytology using both methods were high (99%). The sensitivity using method 1 was 87%; using method 2 it was 65%. This difference was significant. The cell preparation technique therefore significantly changes the sensitivity of urine cytology. Cellular fixation after smear preparation is preferable to smear preparation after fixation. PMID:2266176

  17. Quantitative polarized light microscopy using spectral multiplexing interferometry.

    PubMed

    Li, Chengshuai; Zhu, Yizheng

    2015-06-01

    We propose an interferometric spectral multiplexing method for measuring birefringent specimens with a simple configuration and high sensitivity. The retardation and orientation of sample birefringence are simultaneously encoded onto two spectral carrier waves, generated interferometrically by a birefringent crystal through polarization mixing. A single interference spectrum hence contains sufficient information for birefringence determination, eliminating the need for mechanical rotation or electrical modulation. The technique is analyzed theoretically and validated experimentally on cellulose film. System simplicity permits the possibility of mitigating the system birefringence background. Further analysis demonstrates the technique's exquisite sensitivity, as high as ∼20 pm, for retardation measurement.

  18. Pressurized capillary electrochromatographic analysis of water-soluble vitamins by combining with on-line concentration technique.

    PubMed

    Jia, Li; Liu, Yaling; Du, Yanyan; Xing, Da

    2007-06-22

    A pressurized capillary electrochromatography (pCEC) system was developed for the separation of water-soluble vitamins, in which UV absorbance was used as the detection method and a monolithic silica-ODS column as the separation column. The parameters affecting the separation resolution (type and content of organic solvent in the mobile phase, type and concentration of electrolyte, pH of the electrolyte buffer, applied voltage and flow rate) were evaluated. The combination of two on-line concentration techniques, namely the solvent gradient zone-sharpening effect and field-enhanced sample stacking, was utilized to improve detection sensitivity by enabling the injection of large sample volumes. Coupling electrokinetic injection with the on-line concentration techniques was much more beneficial for the concentration of positively charged vitamins. Compared with the conventional injection mode, the enhancement in the detection sensitivities of water-soluble vitamins using the on-line concentration technique is in the range of 3- to 35-fold. The developed pCEC method was applied to evaluate water-soluble vitamins in corn.

  19. Application of advanced multidisciplinary analysis and optimization methods to vehicle design synthesis

    NASA Technical Reports Server (NTRS)

    Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw

    1990-01-01

    Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.

  20. Flight testing techniques for the evaluation of light aircraft stability derivatives: A review and analysis

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.; Summery, D. C.; Johnson, W. D.

    1972-01-01

    Techniques reported in the literature for the extraction of stability derivative information from flight test records are reviewed. A recent technique developed at NASA's Langley Research Center is regarded as the most productive yet developed. Results of tests of the sensitivity of this procedure to various types of data noise and to the accuracy of the initial estimates of the derivatives are reported. Computer programs for providing these initial estimates are given. The literature review also includes a discussion of flight test measuring techniques, instrumentation, and piloting techniques.

  1. Electron spin resonance as a high sensitivity technique for environmental magnetism: determination of contamination in carbonate sediments

    NASA Astrophysics Data System (ADS)

    Crook, Nigel P.; Hoon, Stephen R.; Taylor, Kevin G.; Perry, Chris T.

    2002-05-01

    This study investigates the application of high sensitivity electron spin resonance (ESR) to environmental magnetism in conjunction with the more conventional techniques of magnetic susceptibility, vibrating sample magnetometry (VSM) and chemical compositional analysis. Using these techniques we have studied carbonate sediment samples from Discovery Bay, Jamaica, which has been impacted to varying degrees by a bauxite loading facility. The carbonate sediment samples contain magnetic minerals ranging from moderate to low concentrations. The ESR spectra for all sites essentially contain three components. First, a six-line spectrum centred on g = 2 resulting from Mn2+ ions within a carbonate matrix; second, a g = 4.3 signal from isolated Fe3+ ions incorporated as impurities within minerals such as gibbsite, kaolinite or quartz; third, a ferrimagnetic resonance with a maximum at 230 mT resulting from the ferrimagnetic minerals present within the bauxite contamination. Depending upon the location of the sites within the embayment, these signals vary in their relative amplitude in a systematic manner related to the degree of bauxite input. Analysis of the ESR spectral components reveals linear relationships between the amplitude of the Mn2+ and ferrimagnetic signals and total Mn and Fe concentrations. To assist in determining the origin of the ESR signals, coral and bauxite reference samples were employed. Coral representative of the matrix of the sediment was taken remote from the bauxite loading facility, whilst pure bauxite was collected from nearby mining facilities. We find ESR to be a very sensitive technique particularly appropriate to magnetic analysis of ferri- and para-magnetic components within environmental samples otherwise dominated by diamagnetic (carbonate) minerals. 
When employing typical sample masses of 200 mg, the practical detection limit of ESR to ferri- and para-magnetic minerals within a diamagnetic carbonate matrix is of the order of 1 ppm and 1 ppb respectively, approximately 10² and 10⁵ times the sensitivity achievable employing the VSM in our laboratory.

  2. Topics in Chemical Instrumentation.

    ERIC Educational Resources Information Center

    Settle, Frank A. Jr., Ed.

    1989-01-01

    Using Fourier transformation methods in nuclear magnetic resonance has made possible increased sensitivity in chemical analysis. This article describes data acquisition, data processing, and the frequency spectrum as they relate to this technique. (CW)

  3. Accelerated Sensitivity Analysis in High-Dimensional Stochastic Reaction Networks

    PubMed Central

    Arampatzis, Georgios; Katsoulakis, Markos A.; Pantazis, Yannis

    2015-01-01

    Existing sensitivity analysis approaches are unable to efficiently handle stochastic reaction networks with a large number of parameters and species, which are typical in the modeling and simulation of complex biochemical phenomena. In this paper, a two-step strategy for parametric sensitivity analysis for such systems is proposed, exploiting advantages and synergies between two recently proposed sensitivity analysis methodologies for stochastic dynamics. The first method performs sensitivity analysis of the stochastic dynamics by means of the Fisher Information Matrix on the underlying distribution of the trajectories; the second method is a reduced-variance, finite-difference, gradient-type sensitivity approach relying on stochastic coupling techniques for variance reduction. Here we demonstrate that these two methods can be combined and deployed together by means of a new sensitivity bound which incorporates the variance of the quantity of interest as well as the Fisher Information Matrix estimated from the first method. The first step of the proposed strategy ranks sensitivities using the bound and screens out the insensitive parameters in a controlled manner. In the second step, a finite-difference method is applied only for the sensitivity estimation of the (potentially) sensitive parameters that have not been screened out in the first step. Results on an epidermal growth factor network with fifty parameters and on a protein homeostasis model with eighty parameters demonstrate that the proposed strategy is able to quickly discover and discard the insensitive parameters and to accurately estimate the sensitivities of the remaining, potentially sensitive ones. The new sensitivity strategy can be several times faster than current state-of-the-art approaches that test all parameters, especially in “sloppy” systems. 
In particular, the computational acceleration is quantified by the ratio of the total number of parameters to the number of sensitive parameters. PMID:26161544
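The screen-then-refine flow can be sketched on a deterministic "sloppy" toy function. The paper's actual screening bound uses the Fisher Information Matrix of stochastic trajectories; the cheap forward-difference screen below is only a stand-in for that first step:

```python
import numpy as np

# Two-step strategy on a deterministic "sloppy" function: step 1 screens
# parameters with a cheap, coarse derivative estimate; step 2 spends
# accurate central differences only on the survivors.
def model(theta):
    # strongly sensitive to theta[0] and theta[1]; nearly flat in the rest
    return np.sin(theta[0]) + theta[1] ** 2 + 1e-6 * np.sum(theta[2:])

def screen_then_refine(theta0, coarse_h=0.5, fine_h=1e-6, tol=1e-3):
    n = len(theta0)
    base = model(theta0)
    coarse = np.empty(n)
    for i in range(n):              # step 1: cheap forward differences
        tp = theta0.copy()
        tp[i] += coarse_h
        coarse[i] = abs(model(tp) - base) / coarse_h
    survivors = np.where(coarse > tol)[0]
    grads = {}
    for i in survivors:             # step 2: accurate central differences
        tp, tm = theta0.copy(), theta0.copy()
        tp[i] += fine_h
        tm[i] -= fine_h
        grads[i] = (model(tp) - model(tm)) / (2 * fine_h)
    return survivors, grads

theta0 = np.full(50, 0.3)
survivors, grads = screen_then_refine(theta0)
print(survivors)  # only the first two parameters survive screening
```

Of the 50 parameters, 48 are discarded after one cheap evaluation each, so the expensive second step runs on just 2 parameters, which is the source of the acceleration ratio described above.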

  4. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  5. An advanced dual labeled gold nanoparticles probe to detect Cryptosporidium parvum using rapid immuno-dot blot assay.

    PubMed

    Thiruppathiraja, Chinnasamy; Kamatchiammal, Senthilkumar; Adaikkappan, Periyakaruppan; Alagar, Muthukaruppan

    2011-07-15

    The zoonotic protozoan parasite Cryptosporidium parvum poses a significant risk to public health. Because of the low infectious dose of C. parvum, remarkably sensitive detection methods are required for water- and food-industry analysis. Although PCR-based detection of the causative nucleic acid has numerous advantages, its routine use is limited by the demand for simple techniques and expert knowledge. In contrast, protein-based immunodetection techniques are simpler for a non-specialist to perform, but lack sensitivity owing to inadequate signal amplification. In this paper, the development of a more sensitive immunodetection method for C. parvum, achieved by coupling anti-cyst antibody and alkaline phosphatase on gold nanoparticles, is described. The sensitivity of the resulting immuno-dot blot assay is enhanced 500-fold over the conventional method, and as few as 10 oocysts/mL can be detected visually with minimal processing time. The technique reported in this paper substantiates the convenience of the immuno-dot blot assay for routine screening of C. parvum in water and environmental samples and, most importantly, demonstrates its potential as a prototype for a simple and inexpensive diagnostic technique. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Application of differential pulse voltammetry to determine the efficiency of stripping tocopherols from commercial fish oil

    USDA-ARS?s Scientific Manuscript database

    There has been an increase in the use of electrochemical methods for monitoring antioxidant levels in a variety of disciplines due to the sensitivity, low detection limits, ease of use, low cost and rapid analysis time offered by these techniques. One technique that has received specific attention i...

  7. Raman imaging from microscopy to macroscopy: Quality and safety control of biological materials

    USDA-ARS?s Scientific Manuscript database

    Raman imaging can analyze biological materials by generating detailed chemical images. Over the last decade, tremendous advancements in Raman imaging and data analysis techniques have overcome problems such as long data acquisition and analysis times and poor sensitivity. This review article introdu...

  8. Assessing direct analysis in real-time-mass spectrometry (DART-MS) for the rapid identification of additives in food packaging.

    PubMed

    Ackerman, L K; Noonan, G O; Begley, T H

    2009-12-01

    The ambient ionization technique direct analysis in real time (DART) was characterized and evaluated for the screening of food packaging for the presence of packaging additives using a benchtop mass spectrometer (MS). Approximate optimum conditions were determined for 13 common food-packaging additives, including plasticizers, anti-oxidants, colorants, grease-proofers, and ultraviolet light stabilizers. Method sensitivity and linearity were evaluated using solutions and characterized polymer samples. Additionally, the response of a model additive (di-ethyl-hexyl-phthalate) was examined across a range of sample positions, DART, and MS conditions (temperature, voltage and helium flow). Under optimal conditions, the protonated molecule (M+H)+ was the major ion for most additives. Additive responses were highly sensitive to sample and DART source orientation, as well as to DART flow rates, temperatures, and MS inlet voltages. DART-MS response was neither consistently linear nor quantitative in this setting, and sensitivity varied by additive. All additives studied were rapidly identified in multiple food-packaging materials by DART-MS/MS, suggesting this technique can be used to screen food packaging rapidly. However, method sensitivity and quantitation require further study and improvement.

  9. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via the incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for the characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  10. Addressing Curse of Dimensionality in Sensitivity Analysis: How Can We Handle High-Dimensional Problems?

    NASA Astrophysics Data System (ADS)

    Safaei, S.; Haghnegahdar, A.; Razavi, S.

    2016-12-01

    Complex environmental models are now the primary tool to inform decision makers for the current or future management of environmental resources under climate and environmental changes. These complex models often contain a large number of parameters that need to be determined by a computationally intensive calibration procedure. Sensitivity analysis (SA) is a very useful tool that not only allows for understanding the model behavior, but also helps in reducing the number of calibration parameters by identifying unimportant ones. The issue is that most global sensitivity techniques are themselves highly computationally demanding when generating robust and stable sensitivity metrics over the entire model response surface. Recently, a novel global sensitivity analysis (GSA) method, Variogram Analysis of Response Surfaces (VARS), was introduced that can efficiently provide a comprehensive assessment of global sensitivity using the variogram concept. In this work, we aim to evaluate the effectiveness of this highly efficient GSA method in reducing the computational burden when applied to systems with an extra-large number of input factors ( 100). We use a test function and a hydrological modelling case study to demonstrate the capability of the VARS method in reducing problem dimensionality by identifying important vs. unimportant input factors.
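
    The variogram concept at the heart of VARS can be illustrated with a minimal sketch: a directional variogram of the model response along each input factor, larger values flagging more sensitive factors. This is illustrative only, not the published VARS algorithm; the toy model, step size, and sample counts are hypothetical.

```python
import numpy as np

def directional_variogram(f, dim, h, n_dims, n_points=500, seed=0):
    """Estimate gamma(h) = 0.5 * E[(f(x + h*e_dim) - f(x))^2] over random
    base points x in the unit hypercube, for one input dimension `dim`."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(0.0, 1.0 - h, size=(n_points, n_dims))  # keep x + h in bounds
    Xh = X.copy()
    Xh[:, dim] += h                      # perturb only the chosen dimension
    d = f(Xh) - f(X)
    return 0.5 * np.mean(d ** 2)

# Toy model: output depends strongly on x0 and only weakly on x1.
def model(X):
    return 10.0 * X[:, 0] + 0.1 * X[:, 1]

g0 = directional_variogram(model, dim=0, h=0.1, n_dims=2)
g1 = directional_variogram(model, dim=1, h=0.1, n_dims=2)
# A much larger gamma for x0 identifies it as the important factor.
```

    For this linear toy model the variogram is exact: gamma along x0 is 0.5*(10*0.1)^2 = 0.5, four orders of magnitude above x1, mirroring how VARS ranks important versus unimportant factors from directional variance of response differences.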

  11. Optical modeling of waveguide coupled TES detectors towards the SAFARI instrument for SPICA

    NASA Astrophysics Data System (ADS)

    Trappe, N.; Bracken, C.; Doherty, S.; Gao, J. R.; Glowacka, D.; Goldie, D.; Griffin, D.; Hijmering, R.; Jackson, B.; Khosropanah, P.; Mauskopf, P.; Morozov, D.; Murphy, A.; O'Sullivan, C.; Ridder, M.; Withington, S.

    2012-09-01

    The next generation of space missions targeting far-infrared wavelengths will require large-format arrays of extremely sensitive detectors. Transition Edge Sensor (TES) array technology is being developed for future Far-Infrared (FIR) space applications such as the SAFARI instrument for SPICA, where low noise and high sensitivity are required to achieve ambitious science goals. In this paper we describe a modal analysis of multi-moded horn antennas feeding integrating cavities that house TES detectors with superconducting film absorbers. In high-sensitivity TES detector technology, the ability to control the electromagnetic and thermo-mechanical environment of the detector is critical. Simulating and understanding the optical behaviour of such detectors at far-IR wavelengths is difficult and requires development of existing analysis tools. The proposed modal approach offers a computationally efficient technique to describe the partially coherent response of the full pixel in terms of optical efficiency and power leakage between pixels. Initial work carried out as part of an ESA technical research project on optical analysis is described, and a prototype SAFARI pixel design is analyzed in which the optical coupling between the incoming field and the pixel, containing the horn, a cavity with an air gap, and a thin absorber layer, is included in the model to allow a comprehensive optical characterization. The modal approach described is based on the mode-matching technique, where the horn and cavity are described in the traditional way while a technique to include the absorber was developed. Radiation leakage between pixels is also included, making this a powerful analysis tool.

  12. Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector

    NASA Astrophysics Data System (ADS)

    Lenel, U. R.; Davies, D. G. S.; Moore, M. A.

    An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is being used to examine the sensitivity of the outcome to uncertainties in input quantities, in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply and constraints). Some examples are given of probabilistic estimates of input quantities.

  13. Global DNA methylation analysis using methyl-sensitive amplification polymorphism (MSAP).

    PubMed

    Yaish, Mahmoud W; Peng, Mingsheng; Rothstein, Steven J

    2014-01-01

    DNA methylation is a crucial epigenetic process which helps control gene transcription activity in eukaryotes. Information regarding the methylation status of a regulatory sequence of a particular gene provides important knowledge of this transcriptional control. DNA methylation can be detected using several methods, including sodium bisulfite sequencing and restriction digestion using methylation-sensitive endonucleases. Methyl-Sensitive Amplification Polymorphism (MSAP) is a technique used to study the global DNA methylation status of an organism and hence to distinguish between two individuals based on the DNA methylation status determined by the differential digestion pattern. Therefore, this technique is a useful method for DNA methylation mapping and positional cloning of differentially methylated genes. In this technique, genomic DNA is first digested with a methylation-sensitive restriction enzyme such as HpaII, and then the DNA fragments are ligated to adaptors in order to facilitate their amplification. Digestion using MspI, a methylation-insensitive isoschizomer of HpaII, is carried out in a parallel reaction as a loading control in the experiment. Subsequently, these fragments are selectively amplified by fluorescently labeled primers. PCR products from different individuals are compared, and once an interesting polymorphic locus is recognized, the desired DNA fragment can be isolated from a denaturing polyacrylamide gel, sequenced and identified based on DNA sequence similarity to other sequences available in the database. We will use analysis of met1, ddm1, and atmbd9 mutants and wild-type plants treated with a cytidine analogue, 5-azaC, or zebularine to demonstrate how to assess the genetic modulation of DNA methylation in Arabidopsis.
It should be noted that despite the fact that MSAP is a reliable technique used to fish for polymorphic methylated loci, its power is limited to the restriction recognition sites of the enzymes used in the genomic DNA digestion.

  14. A Resampling Analysis of Federal Family Assistance Program Quality Control Data: An Application of the Bootstrap.

    ERIC Educational Resources Information Center

    Hand, Michael L.

    1990-01-01

    Use of the bootstrap resampling technique (BRT) is assessed in its application to resampling analysis associated with measurement of payment allocation errors by federally funded Family Assistance Programs. The BRT is applied to a food stamp quality control database in Oregon. This analysis highlights the outlier-sensitivity of the…
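
    The bootstrap resampling idea behind such an analysis can be sketched briefly. The payment-error amounts below are invented for illustration (they are not the Oregon food stamp data); the sketch shows percentile bootstrap confidence intervals and why the mean is outlier-sensitive while the median is not.

```python
import numpy as np

def bootstrap_ci(data, stat=np.mean, n_boot=5000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    reps = np.array([stat(rng.choice(data, size=data.size, replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Hypothetical payment-error amounts (dollars) with one large outlier.
errors = np.array([12, 5, 0, 8, 3, 7, 950, 4, 6, 2], dtype=float)
lo_mean, hi_mean = bootstrap_ci(errors, np.mean)
lo_med, hi_med = bootstrap_ci(errors, np.median)
# The interval for the mean is far wider than for the median, because
# resamples that repeat the outlier drag the mean but not the median.
```

    Comparing the two interval widths makes the outlier-sensitivity concrete: a single extreme payment error dominates the uncertainty in the mean allocation error.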

  15. Sensitive Carbohydrate Detection using Surface Enhanced Raman Tagging

    PubMed Central

    Vangala, Karthikeshwar; Yanney, Michael; Hsiao, Cheng-Te; Wu, Wells W.; Shen, Rong-Fong; Zou, Sige; Sygula, Andrzej; Zhang, Dongmao

    2010-01-01

    Glycomic analysis is an increasingly important field in biological and biomedical research, as glycosylation is one of the most important protein post-translational modifications. We have developed a new technique to detect carbohydrates using surface enhanced Raman spectroscopy (SERS) by designing and applying a Rhodamine B derivative as the SERS tag. Using a reductive amination reaction, the Rhodamine-based tag (RT) was successfully conjugated to three model carbohydrates (glucose, lactose and glucuronic acid). SERS detection limits obtained with a 632 nm HeNe laser were ~1 nM in concentration for all the RT-carbohydrate conjugates and ~10 fmol in total sample consumption. The dynamic range of the SERS method is about 4 orders of magnitude, spanning from 1 nM to 5 µM. Ratiometric SERS quantification using isotope-substituted SERS internal references also allows comparative quantification of carbohydrates labeled with RT and deuterium/hydrogen-substituted RT tags, respectively. In addition to enhancing the SERS detection of the tagged carbohydrates, the Rhodamine tagging facilitates fluorescence and mass spectrometric detection of carbohydrates. The current fluorescence sensitivity for RT-carbohydrates is ~3 nM in concentration, while the mass spectrometry (MS) sensitivity is about 1 fmol, achieved with a linear ion trap electrospray ionization (ESI) MS instrument. Potential applications that take advantage of the high SERS, fluorescence and MS sensitivity of this SERS tagging strategy are discussed for practical glycomic analysis, where carbohydrates may be quantified with fluorescence and SERS techniques and then identified with ESI-MS techniques. PMID:21082777

  16. Importance analysis for Hudson River PCB transport and fate model parameters using robust sensitivity studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, S.; Toll, J.; Cothern, K.

    1995-12-31

    The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977-1991, in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. They considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed them to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of Monte Carlo simulations and, more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
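
    One ingredient of such a study, rank correlation screening of Monte Carlo samples, can be sketched in a few lines. The three-parameter toy model below is a hypothetical stand-in for PCHEPM, not the actual model; it simply shows how Spearman rank correlation separates dominant from negligible parameters.

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation (no tie correction; fine for continuous samples)."""
    rx = np.argsort(np.argsort(x)).astype(float)   # ranks 0..n-1
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

rng = np.random.default_rng(1)
n = 1000
# Hypothetical uncertain parameters: two matter, one is essentially inert.
settling, partition, noise = rng.uniform(0.0, 1.0, (3, n))
pcb_conc = 5.0 * settling - 2.0 * partition**2 + 0.01 * noise

scores = {name: abs(spearman(p, pcb_conc))
          for name, p in [("settling", settling),
                          ("partition", partition),
                          ("noise", noise)]}
# High |rho| marks parameters worth treating probabilistically;
# low |rho| marks candidates to hold fixed, as in the 16-to-5 reduction above.
```

    Rank correlation (rather than plain Pearson correlation) is used in such screenings because it remains meaningful for monotone but nonlinear parameter-output relations, like the quadratic partition term here.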

  17. Surface Modification Enhanced Reflection Intensity of Quartz Crystal Microbalance Sensors upon Molecular Adsorption.

    PubMed

    Kojima, Taisuke

    2018-01-01

    Molecular adsorption on a sensing surface involves molecule-substrate and molecule-molecule interactions. Combining optical systems and a quartz crystal microbalance (QCM) on the same sensing surface allows the quantification of such interactions and reveals the physicochemical properties of the adsorbed molecules. However, low sensitivity of the current reflection-based techniques compared to the QCM technique hinders the quantitative analysis of the adsorption events. Here, a layer-by-layer surface modification of a QCM sensor is studied to increase the optical sensitivity. The intermediate layers of organic-inorganic molecules and metal-metal oxide were explored on a gold (Au) surface of a QCM sensor. First, polyhedral oligomeric silsesquioxane-derivatives that served as the organic-inorganic intermediate layer were synthesized and modified on the Au-QCM surface. Meanwhile, titanium oxide, fabricated by anodic oxidation of titanium, was used as a metal-metal oxide intermediate layer on a titanium-coated QCM surface. The developed technique enabled interrogation of the molecular adsorption owing to the enhanced optical sensitivity.

  18. Characterization of Signal Quality Monitoring Techniques for Multipath Detection in GNSS Applications.

    PubMed

    Pirsiavash, Ali; Broumandan, Ali; Lachapelle, Gérard

    2017-07-05

    The performance of Signal Quality Monitoring (SQM) techniques under different multipath scenarios is analyzed. First, SQM variation profiles are investigated as critical requirements in evaluating the theoretical performance of SQM metrics. The sensitivity and effectiveness of SQM approaches for multipath detection and mitigation are then defined and analyzed by comparing SQM profiles and multipath error envelopes for different discriminators. Analytical discussions include two discriminator strategies, namely narrow and high-resolution correlator techniques, for BPSK(1) and BOC(1,1) signaling schemes. Data analysis is also carried out for static and kinematic scenarios to validate the SQM profiles and examine SQM performance in actual multipath environments. Results show that although SQM is sensitive to medium- and long-delay multipath, its effectiveness in mitigating these ranges of multipath errors varies based on tracking strategy and signaling scheme. For short-delay multipath scenarios, the multipath effect on pseudorange measurements remains mostly undetected due to the low sensitivity of SQM metrics.

  19. Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A

    2011-01-01

    The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software, developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system, provides unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of the IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.

  20. Ultra-sensitive high performance liquid chromatography-laser-induced fluorescence based proteomics for clinical applications.

    PubMed

    Patil, Ajeetkumar; Bhat, Sujatha; Pai, Keerthilatha M; Rai, Lavanya; Kartha, V B; Chidangil, Santhosh

    2015-09-08

    An ultra-sensitive high performance liquid chromatography-laser induced fluorescence (HPLC-LIF) based technique has been developed by our group at Manipal for screening, early detection, and staging of various cancers, using protein profiling of clinical samples such as body fluids, cellular specimens, and biopsy tissue. More than 300 protein profiles of different clinical samples (serum, saliva, cellular samples and tissue homogenates) from volunteers (normal, and different pre-malignant/malignant conditions) were recorded using this set-up. The protein profiles were analyzed using principal component analysis (PCA) to achieve objective detection and classification of malignant, premalignant and healthy conditions with high sensitivity and specificity. In recent years, proteomics techniques have advanced tremendously in the life and medical sciences for the detection and identification of proteins in body fluids, tissue homogenates and cellular samples, in order to understand the biochemical mechanisms leading to different diseases. These methods include high performance liquid chromatography, 2D-gel electrophoresis, MALDI-TOF-MS, SELDI-TOF-MS, CE-MS and LC-MS techniques. 
    The method is extremely sensitive, with limits of detection of the order of femtomoles. The HPLC-LIF protein profiling combined with PCA, as a routine method for screening, diagnosis, and staging of cervical cancer and oral cancer, is discussed in this paper. This article is part of a Special Issue entitled: Proteomics in India. Copyright © 2015 Elsevier B.V. All rights reserved.
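
    The PCA step used to classify such profiles can be sketched with a small synthetic example. The "protein profiles" below are random vectors invented for illustration (they are not the clinical data); the sketch shows PCA via SVD of the mean-centered data matrix and how the first component separates two groups that differ in a few peaks.

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via SVD of the mean-centered data matrix (rows = samples)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]   # projected samples
    explained = s**2 / np.sum(s**2)                   # variance ratios
    return scores, Vt[:n_components], explained[:n_components]

# Synthetic "chromatogram" profiles: two groups of 10 samples x 50 channels
# that differ in the intensity of a few peaks (purely illustrative).
rng = np.random.default_rng(0)
base = rng.uniform(0.0, 1.0, 50)
group_a = base + rng.normal(0.0, 0.05, (10, 50))
group_b = base + rng.normal(0.0, 0.05, (10, 50))
group_b[:, 10:15] += 2.0          # elevated peaks in the second group
X = np.vstack([group_a, group_b])
scores, components, explained = pca(X, n_components=2)
# The first principal component captures the group difference, so the two
# classes separate along scores[:, 0] -- the basis of objective classification.
```

    In the clinical setting described above, the rows would be recorded protein profiles and the well-separated clusters in PC space would correspond to healthy, premalignant, and malignant conditions.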

  1. The influence of surface finishing methods on touch-sensitive reactions

    NASA Astrophysics Data System (ADS)

    Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.

    2017-02-01

    This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of non-traditional materials and finishing techniques are appearing. Today's information-oriented society heightens the visual aesthetics of new jewelry forms, decoration techniques (depth and surface), and the synthesis of different materials, revealing a bias towards the visual effects of design. The jewelry industry now includes not only traditional techniques but also such improved techniques as computer-assisted design, 3D prototyping and other alternatives that raise the level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface) and a comparative analysis of different approaches to surface finishing. The appearance and effect of a piece of jewelry are assessed against proposed evaluation criteria, on the premise that advanced visual aesthetics rests on touch-sensitive responses.

  2. Counterflow Dielectrophoresis for Trypanosome Enrichment and Detection in Blood

    NASA Astrophysics Data System (ADS)

    Menachery, Anoop; Kremer, Clemens; Wong, Pui E.; Carlsson, Allan; Neale, Steven L.; Barrett, Michael P.; Cooper, Jonathan M.

    2012-10-01

    Human African trypanosomiasis or sleeping sickness is a deadly disease endemic in sub-Saharan Africa, caused by single-celled protozoan parasites. Although it has been targeted for elimination by 2020, this will only be realized if diagnosis can be improved to enable identification and treatment of afflicted patients. Existing techniques of detection are restricted by their limited field-applicability, sensitivity and capacity for automation. Microfluidic-based technologies offer the potential for highly sensitive automated devices that could achieve detection at the lowest levels of parasitemia and consequently help in the elimination programme. In this work we implement an electrokinetic technique for the separation of trypanosomes from both mouse and human blood. This technique utilises differences in polarisability between the blood cells and trypanosomes to achieve separation through opposed bi-directional movement (cell counterflow). We combine this enrichment technique with an automated image analysis detection algorithm, negating the need for a human operator.

  3. Biotechnical use of polymerase chain reaction for microbiological analysis of biological samples.

    PubMed

    Lantz, P G; Abu al-Soud, W; Knutsson, R; Hahn-Hägerdal, B; Rådström, P

    2000-01-01

    Since its introduction in the mid-80s, polymerase chain reaction (PCR) technology has been recognised as a rapid, sensitive and specific molecular diagnostic tool for the analysis of micro-organisms in clinical, environmental and food samples. Although this technique can be extremely effective with pure solutions of nucleic acids, its sensitivity may be reduced dramatically when applied directly to biological samples. This review describes PCR technology as a microbial detection method, PCR inhibitors in biological samples, and various sample preparation techniques that can be used to facilitate PCR detection, either by separating the micro-organisms from PCR inhibitors and/or by concentrating the micro-organisms to detectable concentrations. Parts of this review are updated and based on a doctoral thesis by Lantz [1] and on a review discussing methods to overcome PCR inhibition in foods [2].

  4. Pulsed quantum cascade laser-based cavity ring-down spectroscopy for ammonia detection in breath.

    PubMed

    Manne, Jagadeeshwari; Sukhorukov, Oleksandr; Jäger, Wolfgang; Tulip, John

    2006-12-20

    Breath analysis can be a valuable, noninvasive tool for the clinical diagnosis of a number of pathological conditions. The detection of ammonia in exhaled breath is of particular interest for it has been linked to kidney malfunction and peptic ulcers. Pulsed cavity ringdown spectroscopy in the mid-IR region has developed into a sensitive analytical technique for trace gas analysis. A gas analyzer based on a pulsed mid-IR quantum cascade laser operating near 970 cm(-1) has been developed for the detection of ammonia levels in breath. We report a sensitivity of approximately 50 parts per billion with a 20 s time resolution for ammonia detection in breath with this system. The challenges and possible solutions for the quantification of ammonia in human breath by the described technique are discussed.

  5. Sensitivity analysis for oblique incidence reflectometry using Monte Carlo simulations.

    PubMed

    Kamran, Faisal; Andersen, Peter E

    2015-08-10

    Oblique incidence reflectometry has developed into an effective, noncontact, and noninvasive measurement technology for the quantification of both the reduced scattering and absorption coefficients of a sample. The optical properties are deduced by analyzing only the shape of the reflectance profiles. This article presents a sensitivity analysis of the technique in turbid media. Monte Carlo simulations are used to investigate the technique and its potential to distinguish small changes between different levels of scattering. We identify regions of the dynamic range of optical properties in which the demands on the system vary for it to detect subtle changes in the structure of the medium, translated as measured optical properties. Effects of variation in anisotropy are discussed and results presented. Finally, experimental data from milk products with different fat content are considered as examples for comparison.

  6. Metabolomic Strategies Involving Mass Spectrometry Combined with Liquid and Gas Chromatography.

    PubMed

    Lopes, Aline Soriano; Cruz, Elisa Castañeda Santa; Sussulini, Alessandra; Klassen, Aline

    2017-01-01

    Amongst all omics sciences, there is no doubt that metabolomics has undergone the most significant growth in the last decade. The advances in analytical techniques and data analysis tools are the main factors that have made possible the development and establishment of metabolomics as a significant research field in systems biology. As metabolomic analysis demands high sensitivity for detecting metabolites present in low concentrations in biological samples, high resolving power for identifying the metabolites, and a wide dynamic range to detect metabolites with variable concentrations in complex matrices, mass spectrometry is the most extensively used analytical technique for fulfilling these requirements. Mass spectrometry alone can be used in a metabolomic analysis; however, issues such as ion suppression may hamper the quantification or identification of metabolites present at lower concentrations, or of metabolite classes that do not ionise as well as others. The best choice is coupling separation techniques, such as gas or liquid chromatography, to mass spectrometry, in order to improve the sensitivity and resolving power of the analysis, besides obtaining extra information (retention time) that facilitates the identification of the metabolites, especially in untargeted metabolomic strategies. In this chapter, the main aspects of mass spectrometry (MS), liquid chromatography (LC) and gas chromatography (GC) are discussed, and recent clinical applications of LC-MS and GC-MS are also presented.

  7. Low energy analysis techniques for CUORE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alduino, C.; Alfonso, K.; Artusa, D. R.

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  8. Low energy analysis techniques for CUORE

    DOE PAGES

    Alduino, C.; Alfonso, K.; Artusa, D. R.; ...

    2017-12-12

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires improving the energy threshold to 10 keV. In this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  9. Measurement Consistency from Magnetic Resonance Images

    PubMed Central

    Chung, Dongjun; Chung, Moo K.; Durtschi, Reid B.; Lindell, R. Gentry; Vorperian, Houri K.

    2010-01-01

    Rationale and Objectives In quantifying medical images, length-based measurements are still obtained manually. Due to possible human error, a measurement protocol is required to guarantee the consistency of measurements. In this paper, we review various statistical techniques that can be used to determine measurement consistency. The focus is on detecting a possible measurement bias and determining the robustness of the procedures to outliers. Materials and Methods We review correlation analysis, linear regression, the Bland-Altman method, the paired t-test, and analysis of variance (ANOVA). These techniques were applied to measurements, obtained by two raters, of head and neck structures from magnetic resonance images (MRI). Results The correlation analysis and the linear regression were shown to be insufficient for detecting measurement inconsistency. They are also very sensitive to outliers. The widely used Bland-Altman method is a visualization technique, so it lacks numerical quantification. The paired t-test tends to be sensitive to small measurement bias. On the other hand, ANOVA performs well even under small measurement bias. Conclusion In almost all cases, using only one method is insufficient, and it is recommended to use several methods simultaneously. In general, ANOVA performs the best. PMID:18790405
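
    The Bland-Altman quantities compared in such a study reduce to simple statistics of the paired differences: the mean difference (bias) and the 95% limits of agreement (bias ± 1.96 SD). A minimal sketch, using hypothetical rater measurements rather than the MRI data:

```python
import numpy as np

def bland_altman(m1, m2):
    """Bland-Altman statistics: mean bias and 95% limits of agreement."""
    m1, m2 = np.asarray(m1, float), np.asarray(m2, float)
    diff = m1 - m2
    bias = diff.mean()
    sd = diff.std(ddof=1)           # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired length measurements (mm) from two raters.
rater1 = np.array([52.1, 48.3, 50.7, 49.9, 51.5, 47.8, 50.2, 49.4])
rater2 = np.array([51.8, 48.9, 50.2, 50.1, 51.0, 48.2, 50.5, 49.1])
bias, lo, hi = bland_altman(rater1, rater2)
# A bias near zero with narrow limits of agreement indicates consistent raters;
# the plot itself (diff vs. mean) supplies the visual check the abstract mentions.
```

    As the abstract notes, these numbers on their own do not test significance; they complement, rather than replace, the paired t-test and ANOVA.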

  10. Sample preparation techniques for the determination of trace residues and contaminants in foods.

    PubMed

    Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M

    2007-06-15

    The determination of trace residues and contaminants in complex matrices, such as food, often requires extensive sample extraction and preparation prior to instrumental analysis. Sample preparation is often the bottleneck in analysis and there is a need to minimise the number of steps to reduce both time and sources of error. There is also a move towards more environmentally friendly techniques, which use less solvent and smaller sample sizes. Smaller sample size becomes important when dealing with real life problems, such as consumer complaints and alleged chemical contamination. Optimal sample preparation can reduce analysis time, sources of error, enhance sensitivity and enable unequivocal identification, confirmation and quantification. This review considers all aspects of sample preparation, covering general extraction techniques, such as Soxhlet and pressurised liquid extraction, microextraction techniques such as liquid phase microextraction (LPME) and more selective techniques, such as solid phase extraction (SPE), solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The applicability of each technique in food analysis, particularly for the determination of trace organic contaminants in foods is discussed.

  11. Discrete sensitivity derivatives of the Navier-Stokes equations with a parallel Krylov solver

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Taylor, Arthur C., III

    1994-01-01

    This paper solves an 'incremental' form of the sensitivity equations derived by differentiating the discretized thin-layer Navier Stokes equations with respect to certain design variables of interest. The equations are solved with a parallel, preconditioned Generalized Minimal RESidual (GMRES) solver on a distributed-memory architecture. The 'serial' sensitivity analysis code is parallelized by using the Single Program Multiple Data (SPMD) programming model, domain decomposition techniques, and message-passing tools. Sensitivity derivatives are computed for low and high Reynolds number flows over a NACA 1406 airfoil on a 32-processor Intel Hypercube, and found to be identical to those computed on a single-processor Cray Y-MP. It is estimated that the parallel sensitivity analysis code has to be run on 40-50 processors of the Intel Hypercube in order to match the single-processor processing time of a Cray Y-MP.
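
    The GMRES solver at the core of this approach can be sketched in serial form (the parallel SPMD and preconditioning machinery of the paper is omitted; the test matrix is an arbitrary well-conditioned example, not a Navier-Stokes Jacobian):

```python
import numpy as np

def gmres(A, b, x0=None, tol=1e-10, max_iter=50):
    """Minimal unrestarted, unpreconditioned GMRES: build an Arnoldi basis of the
    Krylov space, then minimize the residual via a small least-squares problem."""
    n = b.size
    x0 = np.zeros(n) if x0 is None else x0
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta < tol:
        return x0
    Q = np.zeros((n, max_iter + 1))
    H = np.zeros((max_iter + 1, max_iter))
    Q[:, 0] = r0 / beta
    for k in range(max_iter):
        v = A @ Q[:, k]
        for j in range(k + 1):               # modified Gram-Schmidt
            H[j, k] = Q[:, j] @ v
            v -= H[j, k] * Q[:, j]
        H[k + 1, k] = np.linalg.norm(v)
        e1 = np.zeros(k + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:k + 2, :k + 1], e1, rcond=None)
        res = np.linalg.norm(H[:k + 2, :k + 1] @ y - e1)
        if H[k + 1, k] < 1e-14 or res < tol:
            return x0 + Q[:, :k + 1] @ y     # converged (or lucky breakdown)
        Q[:, k + 1] = v / H[k + 1, k]
    return x0 + Q[:, :max_iter] @ y

rng = np.random.default_rng(0)
n = 30
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # well-conditioned test matrix
b = rng.standard_normal(n)
x = gmres(A, b)
```

    In the incremental sensitivity setting, each design variable yields one such linear system with the same matrix but a different right-hand side, which is what makes a Krylov solver (and its parallelization) attractive.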

  12. Sensitivity and specificity of indocyanine green near-infrared fluorescence imaging in detection of metastatic lymph nodes in colorectal cancer: Systematic review and meta-analysis.

    PubMed

    Emile, Sameh H; Elfeki, Hossam; Shalaby, Mostafa; Sakr, Ahmad; Sileri, Pierpaolo; Laurberg, Søren; Wexner, Steven D

    2017-11-01

    This review aimed to determine the overall sensitivity and specificity of indocyanine green (ICG) near-infrared (NIR) fluorescence in sentinel lymph node (SLN) detection in colorectal cancer (CRC). A systematic search of electronic databases was conducted. Twelve studies including 248 patients were reviewed. The median sensitivity, specificity, and accuracy rates were 73.7%, 100%, and 75.7%, respectively. The pooled sensitivity and specificity rates were 71% and 84.6%, respectively. In conclusion, ICG-NIR fluorescence is a promising technique for detecting SLNs in CRC. © 2017 Wiley Periodicals, Inc.
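    Pooled rates like those above combine per-study counts; a minimal sketch of one simple pooling approach (summing true/false positives and negatives across studies; the counts below are invented for illustration, not taken from the review):

```python
# Pool diagnostic sensitivity and specificity by summing per-study counts.
# Each study is a (tp, fn, tn, fp) tuple; the numbers are illustrative only.
def pooled_rates(studies):
    tp = sum(s[0] for s in studies)
    fn = sum(s[1] for s in studies)
    tn = sum(s[2] for s in studies)
    fp = sum(s[3] for s in studies)
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = pooled_rates([(8, 2, 9, 1), (7, 3, 9, 1)])  # → (0.75, 0.9)
```

Published meta-analyses typically use random-effects or bivariate models rather than raw count pooling, so this is only the arithmetic skeleton.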

  13. Reliable screening of various foodstuffs with respect to their irradiation status: A comparative study of different analytical techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho

    2013-10-01

    Cost-effective and time-efficient analytical techniques are required to screen large food lots in accordance with their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), the direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed by thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red pepper. Black pepper gave intermediate results (700-5000 PCs), while paprika showed low sensitivity (negative results) upon irradiation. The DEFT/APC technique also gave clear screening results through changes in microbial profiles, with the best results for paprika, followed by red pepper and cinnamon. E-nose analysis showed dose-dependent discrimination of volatile profiles upon irradiation through principal component analysis. These methods show potential for application in the screening analysis of irradiated foods.
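    The PSL thresholds quoted above amount to a three-way screening rule, which can be written directly (cutoffs as given in the abstract):

```python
# PSL screening rule from the photon-count (PCs) thresholds quoted above:
# <700 negative, 700-5000 intermediate, >5000 positive.
def psl_screen(photon_counts):
    if photon_counts < 700:
        return "negative"        # likely non-irradiated
    if photon_counts <= 5000:
        return "intermediate"    # needs confirmatory analysis (e.g. thermoluminescence)
    return "positive"            # likely irradiated
```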

  14. Spatially resolved δ13C analysis using laser ablation isotope ratio mass spectrometry

    NASA Astrophysics Data System (ADS)

    Moran, J.; Riha, K. M.; Nims, M. K.; Linley, T. J.; Hess, N. J.; Nico, P. S.

    2014-12-01

    Inherent geochemical, organic matter, and microbial heterogeneity over small spatial scales can complicate studies of carbon dynamics through soils. Stable isotope analysis has a strong history of helping to track substrate turnover, delineate rhizosphere activity zones, and identify transitions in vegetation cover, but most traditional isotope approaches are limited in spatial resolution by a combination of physical separation techniques (manual dissection) and IRMS instrument sensitivity. We coupled laser ablation sampling with isotope measurement via IRMS to enable spatially resolved analysis over solid surfaces. Once a targeted sample region is ablated, the resulting particulates are entrained in a helium carrier gas and passed through a combustion reactor where carbon is converted to CO2. Cryotrapping of the resulting CO2 enables a reduction in carrier gas flow, which improves overall measurement sensitivity versus traditional, high-flow sample introduction. Currently we perform sample analysis at 50 μm resolution, require 65 ng C per analysis, and achieve measurement precision consistent with other continuous flow techniques. We will discuss applications of the laser ablation IRMS (LA-IRMS) system to microbial communities and fish ecology studies to demonstrate the merits of this technique and how similar analytical approaches can be transitioned to soil systems. Preliminary efforts at analyzing soil samples will be used to highlight strengths and limitations of the LA-IRMS approach, paying particular attention to sample preparation requirements, spatial resolution, sample analysis time, and the types of questions most conducive to analysis via LA-IRMS.

  15. Automatic Target Recognition Classification System Evaluation Methodology

    DTIC Science & Technology

    2002-09-01

    …ROC curve meta-analysis, which is the estimation of the true ROC curve of a given diagnostic system through ROC analysis across many studies… This technique can be very effective in sensitivity analysis: trying to determine which data points have the most effect on the solution…

  16. Overview: MURI Center on spectroscopic and time domain detection of trace explosives in condensed and vapor phases

    NASA Astrophysics Data System (ADS)

    Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay

    2003-09-01

    The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing of explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include Terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring-Down Spectroscopy (CRDS), and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to detection of explosives in condensed phases, such as species adsorbed in soil, or used for vapor-phase detection above the source. Some techniques allow for remote detection, while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental technical issues associated with trace detection of explosive-related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed-phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface-adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques, including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore simultaneous application of two or more techniques to assess the benefits of sensor fusion.

  17. Measurement techniques for trace metals in coal-plant effluents: A brief review

    NASA Technical Reports Server (NTRS)

    Singh, J. J.

    1979-01-01

    The strong features and limitations of techniques for determining trace elements in aerosols emitted from coal plants are discussed. Techniques reviewed include atomic absorption spectroscopy, charged particle scattering and activation, instrumental neutron activation analysis, gas/liquid chromatography, gas chromatographic/mass spectrometric methods, X-ray fluorescence, and charged-particle-induced X-ray emission. The latter two methods are emphasized. They provide simultaneous, sensitive multielement analyses and lend themselves readily to depth profiling. It is recommended that whenever feasible, two or more complementary techniques should be used for analyzing environmental samples.

  18. Relative performance of academic departments using DEA with sensitivity analysis.

    PubMed

    Tyagi, Preeti; Yadav, Shiv Prasad; Singh, S P

    2009-05-01

    The process of liberalization and globalization of the Indian economy has brought new opportunities and challenges in all areas of human endeavor, including education. Educational institutions have to adopt new strategies to make the best use of the opportunities and counter the challenges. One of these challenges is how to assess the performance of academic programs based on multiple criteria. Keeping this in view, this paper evaluates the performance efficiencies of 19 academic departments of IIT Roorkee (India) through the data envelopment analysis (DEA) technique. The technique has been used to assess the performance of academic institutions in a number of countries, such as the USA, UK, and Australia; to the best of our knowledge, however, this is its first application in the Indian context. Applying DEA models, we calculate technical, pure technical, and scale efficiencies and identify the reference sets for inefficient departments. Input and output projections are also suggested for inefficient departments to reach the frontier. Overall performance, research performance, and teaching performance are assessed separately using sensitivity analysis.
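    As a sketch of the machinery behind such efficiency scores, the input-oriented CCR model of DEA can be posed as an envelopment-form linear program and handed to a generic LP solver; the two-unit data below are invented, not the IIT Roorkee departments.

```python
# Input-oriented CCR DEA (envelopment form): min theta subject to
#   sum_j lambda_j x_j <= theta * x_k,  sum_j lambda_j y_j >= y_k,  lambda >= 0.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Technical efficiency of unit k. X: (units, inputs), Y: (units, outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lambda_1..lambda_n]
    A_in = np.c_[-X[k].reshape(m, 1), X.T]      # inputs:  sum lam*x - theta*x_k <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]       # outputs: -sum lam*y <= -y_k
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun

X = np.array([[2.0], [4.0]])   # one input per unit (illustrative)
Y = np.array([[2.0], [2.0]])   # one output per unit
# Unit 0 is efficient (score 1.0); unit 1 uses twice the input for the same output (0.5).
```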

  19. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    NASA Technical Reports Server (NTRS)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adaptor stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adaptor parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling based techniques. For contrast, some MPP based approaches are also examined.

  20. Biochemical component identification by plasmonic improved whispering gallery mode optical resonance based sensor

    NASA Astrophysics Data System (ADS)

    Saetchnikov, Vladimir A.; Tcherniavskaia, Elina A.; Saetchnikov, Anton V.; Schweiger, Gustav; Ostendorf, Andreas

    2014-05-01

    Experimental data are presented on the detection and identification of a variety of biochemical agents, such as proteins, microelements and antibiotics of different generations, in both single- and multi-component solutions over a wide concentration range, based on the light-scattering parameters of a whispering gallery mode optical resonance sensor. Multiplexing over parameters and components was realized using a developed fluidic sensor cell, with dielectric microspheres fixed in an adhesive layer, together with data processing. Biochemical component identification was performed by the developed network analysis techniques. The approach is shown to be applicable to both single-agent and multi-component biochemical analysis. A novel technique based on optical resonance in microring structures, plasmon resonance and identification tools has been developed. To improve the sensitivity of the microring structures, the microspheres fixed by adhesive were pretreated with a gold nanoparticle solution; another variant used thin gold films deposited on the substrate below the adhesive layer. Both biomolecule and nanoparticle injections caused considerable changes in the optical resonance spectra, and plasmonic gold layers of optimized thickness also improved the parameters of the spectra. Thus, the advantages of plasmon-enhanced optical microcavity resonance combined with multiparameter identification tools are used to develop a new platform for an ultra-sensitive, label-free biomedical sensor.

  1. Electron microprobe analysis and histochemical examination of the calcium distribution in human bone trabeculae: a methodological study using biopsy specimens from post-traumatic osteopenia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Obrant, K.J.; Odselius, R.

    1984-01-01

    Energy dispersive X-ray microanalysis (EDX) (or electron microprobe analysis) of the relative intensity for calcium in different bone trabeculae from the tibia epiphysis, and in different parts of one and the same trabecula, was performed on 3 patients who had earlier had a fracture of the ipsilateral tibia-diaphysis. The variation in intensity was compared with the histochemical patterns obtained with both the Goldner and the von Kossa staining techniques for detecting calcium in tissues. Previously reported calcium distribution features, found to be typical for posttraumatic osteopenia, such as striated mineralization patterns in individual trabeculae and large differences in mineralization level between different trabeculae, could be verified both by means of the two histochemical procedures and from the electron microprobe analysis. A pronounced difference was observed, however, between the two histochemical staining techniques as regards their sensitivity to detect calcium. To judge from the values obtained from the EDX measurements, the sensitivity of the Goldner technique should be more than ten times higher than that of von Kossa. The EDX measurements gave more detailed information than either of the two histochemical techniques: great variations in the intensity of the calcium peak were found in trabeculae stained as unmineralized as well as mineralized.

  2. Nanoscale deformation analysis with high-resolution transmission electron microscopy and digital image correlation

    DOE PAGES

    Wang, Xueju; Pan, Zhipeng; Fan, Feifei; ...

    2015-09-10

    We present an application of the digital image correlation (DIC) method to high-resolution transmission electron microscopy (HRTEM) images for nanoscale deformation analysis. The combination of DIC and HRTEM offers both the ultrahigh spatial resolution and high displacement detection sensitivity that are not possible with other microscope-based DIC techniques. We demonstrate the accuracy and utility of the HRTEM-DIC technique through displacement and strain analysis on amorphous silicon. Two types of error sources resulting from the transmission electron microscopy (TEM) image noise and electromagnetic-lens distortions are quantitatively investigated via rigid-body translation experiments. The local and global DIC approaches are applied for the analysis of diffusion- and reaction-induced deformation fields in electrochemically lithiated amorphous silicon. As a result, the DIC technique coupled with HRTEM provides a new avenue for the deformation analysis of materials at the nanometer length scales.
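    The core DIC step, estimating the displacement that best registers a reference and a deformed image, can be sketched for the rigid-body, integer-pixel case with FFT cross-correlation; practical DIC adds subset-wise matching and subpixel interpolation on top of this.

```python
# Recover a rigid-body shift between two images from the cross-correlation peak.
# Integer-pixel only; real DIC refines this to subpixel accuracy.
import numpy as np

def estimate_shift(ref, cur):
    """Return (dy, dx) such that cur is ref circularly shifted by (dy, dx)."""
    corr = np.fft.ifft2(np.fft.fft2(cur) * np.conj(np.fft.fft2(ref))).real
    peak = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    shape = np.array(corr.shape)
    peak = np.where(peak > shape / 2, peak - shape, peak)  # wrap to signed shifts
    return tuple(peak)

rng = np.random.default_rng(1)
ref = rng.random((64, 64))                 # stand-in for an HRTEM image patch
cur = np.roll(ref, (3, -5), axis=(0, 1))   # known displacement: dy=3, dx=-5
```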

  3. Rapid, quantitative and sensitive immunochromatographic assay based on stripping voltammetric detection of a metal ion label

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Fang; Wang, Kaihua; Lin, Yuehe

    2005-10-10

    A novel, sensitive immunochromatographic electrochemical biosensor (IEB) which combines an immunochromatographic strip technique with an electrochemical detection technique is demonstrated. The IEB takes advantage of the speed and low cost of conventional immunochromatographic test kits and the high sensitivity of stripping voltammetry. Bismuth ions (Bi3+) were coupled with the antibody through the bifunctional chelating agent diethylenetriamine pentaacetic acid (DTPA). After the immunoreactions, Bi3+ was released and quantified by anodic stripping voltammetry at a built-in single-use screen-printed electrode. As an example of the applications of this novel device, the detection of human chorionic gonadotropin (HCG) in a specimen was performed. This biosensor provides a more user-friendly, rapid, clinically accurate, and less expensive immunoassay for such analysis in specimens than currently available test kits.

  4. Biomedical application of MALDI mass spectrometry for small-molecule analysis.

    PubMed

    van Kampen, Jeroen J A; Burgers, Peter C; de Groot, Ronald; Gruters, Rob A; Luider, Theo M

    2011-01-01

    Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) is an emerging analytical tool for the analysis of molecules with molar masses below 1,000 Da; that is, small molecules. This technique offers rapid analysis, high sensitivity, low sample consumption, a relative high tolerance towards salts and buffers, and the possibility to store sample on the target plate. The successful application of the technique is, however, hampered by low molecular weight (LMW) matrix-derived interference signals and by poor reproducibility of signal intensities during quantitative analyses. In this review, we focus on the biomedical application of MALDI-MS for the analysis of small molecules and discuss its favorable properties and its challenges as well as strategies to improve the performance of the technique. Furthermore, practical aspects and applications are presented. © 2010 Wiley Periodicals, Inc.

  5. Publication Bias in Research Synthesis: Sensitivity Analysis Using A Priori Weight Functions

    ERIC Educational Resources Information Center

    Vevea, Jack L.; Woods, Carol M.

    2005-01-01

    Publication bias, sometimes known as the "file-drawer problem" or "funnel-plot asymmetry," is common in empirical research. The authors review the implications of publication bias for quantitative research synthesis (meta-analysis) and describe existing techniques for detecting and correcting it. A new approach is proposed that is suitable for…
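    The weight-function idea can be caricatured in a few lines: assume each study's probability of publication is a step function of its p-value, and fold the inverse of that probability into an inverse-variance mean. This is only the weighting intuition, not the authors' likelihood-based estimator; the data and the step function below are invented.

```python
# Toy illustration of a priori weight functions for publication bias:
# weight each study by inverse variance divided by its assumed publication
# probability (a step function of the two-sided p-value). Illustrative only.
import math

def p_two_sided(effect, se):
    return math.erfc(abs(effect / se) / math.sqrt(2))

def weighted_mean(effects, ses, publish_prob):
    w = [1 / se**2 / publish_prob(p_two_sided(d, se)) for d, se in zip(effects, ses)]
    return sum(wi * d for wi, d in zip(w, effects)) / sum(w)

step = lambda p: 1.0 if p < 0.05 else 0.25     # assumed selection probabilities
effects, ses = [0.5, 0.6, 0.1], [0.1, 0.1, 0.2]
naive = weighted_mean(effects, ses, lambda p: 1.0)  # plain inverse-variance mean
adjusted = weighted_mean(effects, ses, step)        # pulled toward the nonsignificant study
```

Up-weighting the under-represented nonsignificant study pulls the adjusted mean below the naive one, which is the direction of correction one expects when significant results are over-published.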

  6. Trace analysis in the food and beverage industry by capillary gas chromatography: system performance and maintenance.

    PubMed

    Hayes, M A

    1988-04-01

    Gas chromatography (GC) is the most widely used analytical technique in the food and beverage industry. This paper addresses the problems of sample preparation and system maintenance to ensure the most sensitive, durable, and efficient results for trace analysis by GC in this industry.

  7. Quantitative sensory testing response patterns to capsaicin- and ultraviolet-B–induced local skin hypersensitization in healthy subjects: a machine-learned analysis

    PubMed Central

    Lötsch, Jörn; Geisslinger, Gerd; Heinemann, Sarah; Lerch, Florian; Oertel, Bruno G.; Ultsch, Alfred

    2018-01-01

    Abstract The comprehensive assessment of pain-related human phenotypes requires combinations of nociceptive measures that produce complex high-dimensional data, posing challenges to bioinformatic analysis. In this study, we assessed established experimental models of heat hyperalgesia of the skin, consisting of local ultraviolet-B (UV-B) irradiation or capsaicin application, in 82 healthy subjects using a variety of noxious stimuli. We extended the original heat stimulation by applying cold and mechanical stimuli and assessing the hypersensitization effects with a clinically established quantitative sensory testing (QST) battery (German Research Network on Neuropathic Pain). This study provided a 246 × 10-sized data matrix (82 subjects assessed at baseline, following UV-B application, and following capsaicin application) with respect to 10 QST parameters, which we analyzed using machine-learning techniques. We observed statistically significant effects of the hypersensitization treatments in 9 different QST parameters. Supervised machine-learned analysis implemented as random forests followed by ABC analysis pointed to heat pain thresholds as the most relevantly affected QST parameter. However, decision tree analysis indicated that UV-B additionally modulated sensitivity to cold. Unsupervised machine-learning techniques, implemented as emergent self-organizing maps, hinted at subgroups responding to topical application of capsaicin. The distinction among subgroups was based on sensitivity to pressure pain, which could be attributed to sex differences, with women being more sensitive than men. Thus, while UV-B and capsaicin share a major component of heat pain sensitization, they differ in their effects on QST parameter patterns in healthy subjects, suggesting a lack of redundancy between these models. PMID:28700537

  8. Evaluation of a New Immunochromatography Technology Test (LDBio Diagnostics) To Detect Toxoplasma IgG and IgM: Comparison with the Routine Architect Technique

    PubMed Central

    Flori, Pierre; Delaunay, Edouard; Guillerme, Cécile; Charaoui, Sana; Raberin, Hélène; Hafid, Jamal; L'Ollivier, Coralie

    2017-01-01

    A study comparing the ICT (immunochromatography technology) Toxoplasma IgG and IgM rapid diagnostic test (LDBio Diagnostics, France) with a fully automated system, Architect, was performed on samples from university hospitals of Marseille and Saint-Etienne. A total of 767 prospective sera and 235 selected sera were collected. The panels were selected to test various IgG and IgM parameters. The reference technique, Toxoplasma IgGII Western blot analysis (LDBio Diagnostics), was used to confirm the IgG results, and commercial kits Platelia Toxo IgM (Bio-Rad) and Toxo-ISAgA (bioMérieux) were used in Saint-Etienne and Marseille, respectively, as the IgM reference techniques. Sensitivity and specificity of the ICT and the Architect IgG assays were compared using a prospective panel. Sensitivity was 100% for the ICT test and 92.1% for Architect (cutoff at 1.6 IU/ml). The low-IgG-titer serum results confirmed that ICT sensitivity was superior to that of Architect. Specificity was 98.7% (ICT) and 99.8% (Architect IgG). The ICT test is also useful for detecting IgM without IgG and is both sensitive (100%) and specific (100%), as it can distinguish nonspecific IgM from specific Toxoplasma IgM. In comparison, IgM sensitivity and specificity on Architect are 96.1% and 99.6%, respectively (cutoff at 0.5 arbitrary units [AU]/ml). To conclude, this new test overcomes the limitations of automated screening techniques, which are not sensitive enough for IgG and lack specificity for IgM (rare IgM false-positive cases). PMID:28954897

  9. Principles of ESCA and applications to metal corrosion, coating and lubrication. [Electron Spectroscopy for Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Wheeler, D. R.

    1978-01-01

    The principles of ESCA (electron spectroscopy for chemical analysis) are described by comparison with other spectroscopic techniques. The advantages and disadvantages of ESCA as compared to other surface sensitive analytical techniques are evaluated. The use of ESCA is illustrated by actual applications to oxidation of steel and Rene 41, the chemistry of lubricant additives on steel, and the composition of sputter deposited hard coatings. Finally, a bibliography of material that is useful for further study of ESCA is presented and commented upon.

  10. In Situ Analysis of DNA Methylation in Plants.

    PubMed

    Kathiria, Palak; Kovalchuk, Igor

    2017-01-01

    Epigenetic regulation in the plant genome is associated with the determination of expression patterns of various genes. Methylation of DNA at cytosine residues is one of the mechanisms of epigenetic regulation and has been a subject of various studies. Various techniques have been developed to analyze DNA methylation, most of which involve isolation of chromatin from cells and further in vitro studies. Limited techniques are available for in situ study of DNA methylation in plants. Here, we present such an in situ method for DNA methylation analysis which has high sensitivity and good reproducibility.

  11. Characterization of Colloidal Quantum Dot Ligand Exchange by X-ray Photoelectron Spectroscopy

    NASA Astrophysics Data System (ADS)

    Atewologun, Ayomide; Ge, Wangyao; Stiff-Roberts, Adrienne D.

    2013-05-01

    Colloidal quantum dots (CQDs) are chemically synthesized semiconductor nanoparticles with size-dependent wavelength tunability. Chemical synthesis of CQDs involves the attachment of long organic surface ligands to prevent aggregation; however, these ligands also impede charge transport. Therefore, it is beneficial to exchange longer surface ligands for shorter ones for optoelectronic devices. Typical characterization techniques used to analyze surface ligand exchange include Fourier-transform infrared spectroscopy, x-ray diffraction, transmission electron microscopy, and nuclear magnetic resonance spectroscopy, yet these techniques do not provide a simultaneously direct, quantitative, and sensitive method for evaluating surface ligands on CQDs. In contrast, x-ray photoelectron spectroscopy (XPS) can provide nanoscale sensitivity for quantitative analysis of CQD surface ligand exchange. A unique aspect of this work is that a fingerprint is identified for shorter surface ligands by resolving the regional XPS spectrum corresponding to different types of carbon bonds. In addition, a deposition technique known as resonant infrared matrix-assisted pulsed laser evaporation is used to improve the CQD film uniformity such that stronger XPS signals are obtained, enabling more accurate analysis of the ligand exchange process.

  12. Genetic diversity analysis of Jatropha curcas L. (Euphorbiaceae) based on methylation-sensitive amplification polymorphism.

    PubMed

    Kanchanaketu, T; Sangduen, N; Toojinda, T; Hongtrakul, V

    2012-04-13

    Genetic analysis of 56 samples of Jatropha curcas L. collected from Thailand and other countries was performed using the methylation-sensitive amplification polymorphism (MSAP) technique. Nine primer combinations were used to generate MSAP fingerprints. When the data were interpreted as amplified fragment length polymorphism (AFLP) markers, 471 markers were scored. All 56 samples were classified into three major groups: γ-irradiated, non-toxic and toxic accessions. Genetic similarity among the samples was extremely high, ranging from 0.95 to 1.00, which indicated very low genetic diversity in this species. The MSAP fingerprint was further analyzed for DNA methylation polymorphisms. The results revealed differences in the DNA methylation level among the samples. However, the samples collected from saline areas and some species hybrids showed specific DNA methylation patterns. AFLP data were used, together with methylation-sensitive AFLP (MS-AFLP) data, to construct a phylogenetic tree, resulting in higher efficiency to distinguish the samples. This combined analysis separated samples previously grouped in the AFLP analysis. This analysis also distinguished some hybrids. Principal component analysis was also performed; the results confirmed the separation in the phylogenetic tree. Some polymorphic bands, involving both nucleotide and DNA methylation polymorphism, that differed between toxic and non-toxic samples were identified, cloned and sequenced. BLAST analysis of these fragments revealed differences in DNA methylation in some known genes and nucleotide polymorphism in chloroplast DNA. We conclude that MSAP is a powerful technique for the study of genetic diversity for organisms that have a narrow genetic base.
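    Similarity values in the 0.95-1.00 range reported above are typically band-sharing coefficients computed over the 0/1 marker matrix; a minimal sketch using the Dice coefficient (the marker vectors are invented):

```python
# Dice band-sharing similarity between two binary marker profiles
# (1 = band present, 0 = absent); the vectors are illustrative only.
def dice_similarity(a, b):
    shared = sum(x and y for x, y in zip(a, b))
    return 2.0 * shared / (sum(a) + sum(b))

s1 = [1, 1, 0, 1, 1, 0]
s2 = [1, 1, 0, 1, 0, 0]
# dice_similarity(s1, s2) → 6/7 ≈ 0.857; identical profiles give 1.0
```

A full analysis would compute this for every sample pair and feed the resulting similarity matrix to a clustering or tree-building method (e.g. UPGMA).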

  13. Civil and mechanical engineering applications of sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Komkov, V.

    1985-07-01

    In this largely tutorial presentation, the historical development of optimization theories as applied to mechanical and civil engineering design is outlined, and the development of modern sensitivity techniques over the last 20 years is traced. Some of the difficulties encountered, and the progress made in overcoming them, are described. Several recently developed theoretical methods are stressed to indicate their importance to computer-aided design technology.

  14. Sensitive Spectroscopic Analysis of Biomarkers in Exhaled Breath

    NASA Astrophysics Data System (ADS)

    Bicer, A.; Bounds, J.; Zhu, F.; Kolomenskii, A. A.; Kaya, N.; Aluauee, E.; Amani, M.; Schuessler, H. A.

    2018-06-01

    We have developed a novel optical setup based on a high-finesse cavity and laser absorption spectroscopy in the near-IR spectral region. In pilot experiments, spectrally resolved absorption measurements of biomarkers in exhaled breath, such as methane and acetone, were carried out using cavity ring-down spectroscopy (CRDS). With a 172-cm-long cavity, an effective optical path of 132 km was achieved. The CRDS technique is well suited for such measurements due to its high sensitivity and good spectral resolution. Detection limits of 8 ppbv for methane and 2.1 ppbv for acetone were achieved with spectral sampling of 0.005 cm-1, which made it possible to analyze multicomponent gas mixtures and to observe absorption peaks of 12CH4 and 13CH4. Further improvements of the technique have the potential to enable diagnostics of health conditions based on a multicomponent analysis of breath samples.
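    The retrieval behind such detection limits rests on the standard CRDS relation between ring-down times measured with and without the absorber, alpha = (1/c)(1/tau - 1/tau0). The 440 µs empty-cavity time below is a back-of-envelope figure from the quoted 132 km effective path; the shortened time is invented.

```python
# Standard CRDS relation: absorption coefficient from ring-down times.
# alpha = (1/c) * (1/tau - 1/tau0); the example times are illustrative.
C_CM_PER_S = 2.9979e10   # speed of light in cm/s

def absorption_coeff(tau, tau0):
    """alpha in cm^-1 from ring-down times (s) with and without the absorber."""
    return (1.0 / tau - 1.0 / tau0) / C_CM_PER_S

tau0 = 440e-6            # ~132 km effective path divided by c
tau = 400e-6             # ring-down shortened by the absorber (invented value)
alpha = absorption_coeff(tau, tau0)   # ~7.6e-9 cm^-1
```

The species concentration then follows from alpha = sigma * N, given the absorption cross-section sigma of the probed line.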

  15. Reproducibility of EEG-fMRI results in a patient with fixation-off sensitivity.

    PubMed

    Formaggio, Emanuela; Storti, Silvia Francesca; Galazzo, Ilaria Boscolo; Bongiovanni, Luigi Giuseppe; Cerini, Roberto; Fiaschi, Antonio; Manganotti, Paolo

    2014-07-01

    Blood oxygenation level-dependent (BOLD) activation associated with interictal epileptiform discharges in a patient with fixation-off sensitivity (FOS) was studied using a combined electroencephalography-functional magnetic resonance imaging (EEG-fMRI) technique. An automatic approach for combined EEG-fMRI analysis and a subject-specific hemodynamic response function was used to improve general linear model analysis of the fMRI data. The EEG showed the typical features of FOS, with continuous epileptiform discharges during elimination of central vision by eye opening and closing and fixation; modification of this pattern was clearly visible and recognizable. During all 3 recording sessions EEG-fMRI activations indicated a BOLD signal decrease related to epileptiform activity in the parietal areas. This study can further our understanding of this EEG phenomenon and can provide some insight into the reliability of the EEG-fMRI technique in localizing the irritative zone.

  16. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  17. Actinic Flux Calculations: A Model Sensitivity Study

    NASA Technical Reports Server (NTRS)

    Krotkov, Nickolay A.; Flittner, D.; Ahmad, Z.; Herman, J. R.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    calculate direct and diffuse surface irradiance and actinic flux (downwelling (2π) and total (4π)) for the reference model. Sensitivity analysis has shown that the accuracy of the radiative transfer flux calculations for a unit ETS (i.e., atmospheric transmittance), together with a numerical interpolation technique for the constituents' vertical profiles, is better than 1% for SZA less than 70° and wavelengths longer than 310 nm. The differences increase for shorter wavelengths and larger SZA, due to the differences in pseudo-spherical correction techniques and vertical discretization among the codes. Our sensitivity study includes variation of ozone cross-sections, ETS spectra and the effects of wavelength shifts between vacuum and air scales. We also investigate the effects of aerosols on the spectral flux components in the UV and visible spectral regions. The "aerosol correction factors" (ACFs) were calculated at discrete wavelengths and different SZAs for each flux component (direct, diffuse, reflected) and prescribed IPMMI aerosol parameters. Finally, the sensitivity study was extended to the calculation of selected photolysis rate coefficients.

  18. An Innovative Technique to Assess Spontaneous Baroreflex Sensitivity with Short Data Segments: Multiple Trigonometric Regressive Spectral Analysis.

    PubMed

    Li, Kai; Rüdiger, Heinz; Haase, Rocco; Ziemssen, Tjalf

    2018-01-01

    Objective: As the multiple trigonometric regressive spectral (MTRS) analysis is extraordinary in its ability to analyze short local data segments down to 12 s, we wanted to evaluate the impact of the data segment settings by applying the technique of MTRS analysis for baroreflex sensitivity (BRS) estimation using a standardized data pool. Methods: Spectral and baroreflex analyses were performed on the EuroBaVar dataset (42 recordings, including lying and standing positions). For this analysis, the technique of MTRS was used. We used different global and local data segment lengths, and chose the global data segments from different positions. Three global data segments of 1 and 2 min and three local data segments of 12, 20, and 30 s were used in MTRS analysis for BRS. Results: All the BRS-values calculated on the three global data segments were highly correlated, both in the supine and standing positions; the different global data segments provided similar BRS estimations. When using different local data segments, all the BRS-values were also highly correlated. However, in the supine position, using short local data segments of 12 s overestimated BRS compared with those using 20 and 30 s. In the standing position, the BRS estimations using different local data segments were comparable. There was no proportional bias for the comparisons between different BRS estimations. Conclusion: We demonstrate that BRS estimation by the MTRS technique is stable when using different global data segments, and MTRS is extraordinary in its ability to evaluate BRS in even short local data segments (20 and 30 s). Because of the non-stationary character of most biosignals, the MTRS technique would be preferable for BRS analysis especially in conditions when only short stationary data segments are available or when dynamic changes of BRS should be monitored.
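    The MTRS algorithm itself is not reproduced here, but the spectral idea behind BRS estimation can be sketched with the classical alpha index: the square root of the ratio of RR-interval power to systolic-pressure power in the low-frequency band. The synthetic beat series below, with an assumed true gain of 10 ms/mmHg, is a stand-in for real recordings.

```python
# Simplified spectral ("alpha-index") baroreflex sensitivity estimate,
# sketched as a stand-in for the MTRS method. The synthetic series and the
# true gain of 10 ms/mmHg are assumptions.
import numpy as np

rng = np.random.default_rng(0)
fs = 1.0                      # beat series resampled at 1 Hz (assumption)
n = 300                       # 5-minute segment
t = np.arange(n) / fs

# Systolic blood pressure with a 0.1 Hz (LF) oscillation, and RR intervals
# responding with a gain of 10 ms/mmHg, plus small measurement noise.
sbp = 120 + 3.0 * np.sin(2 * np.pi * 0.1 * t) + 0.3 * rng.standard_normal(n)
rr = 900 + 10.0 * 3.0 * np.sin(2 * np.pi * 0.1 * t) + 3.0 * rng.standard_normal(n)

def band_power(x, fs, f_lo=0.04, f_hi=0.15):
    """Periodogram power of a demeaned series within [f_lo, f_hi] Hz."""
    x = x - x.mean()
    freqs = np.fft.rfftfreq(len(x), d=1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].sum()

alpha_brs = np.sqrt(band_power(rr, fs) / band_power(sbp, fs))
print(f"alpha-index BRS = {alpha_brs:.1f} ms/mmHg")  # close to the true 10
```

    On short segments such as those the study evaluates, the frequency resolution of this naive periodogram degrades, which is precisely the limitation that motivates local-segment methods like MTRS.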

  19. Detection of monoclonal immunoglobulin heavy chain gene rearrangement (FR3) in Thai malignant lymphoma by High Resolution Melting curve analysis.

    PubMed

    Kummalue, Tanawan; Chuphrom, Anchalee; Sukpanichanant, Sanya; Pongpruttipan, Tawatchai; Sukpanichanant, Sathien

    2010-05-19

    Malignant lymphoma, especially non-Hodgkin lymphoma, is one of the most common hematologic malignancies in Thailand. The diagnosis of malignant lymphoma is often problematic, especially in the early stages of the disease. Detection of antigen receptor gene rearrangement, including T cell receptor (TCR) and immunoglobulin heavy chain (IgH), by polymerase chain reaction followed by heteroduplex analysis has currently become standard, whereas fluorescent fragment analysis (GeneScan) has been used as a confirmatory test. In this study, three techniques were compared: thermocycler polymerase chain reaction (PCR) followed by heteroduplex analysis and polyacrylamide gel electrophoresis, GeneScan analysis, and real-time PCR with High Resolution Melting curve analysis (HRM). The comparison was carried out with DNA extracted from paraffin-embedded tissues diagnosed as B-cell non-Hodgkin lymphoma. Specific PCR primer sequences for IgH gene variable region 3, including fluorescence-labeled IgH primers, were used and the results were compared with HRM. In conclusion, the detection of IgH gene rearrangement by HRM in the LightCycler System showed potential for distinguishing monoclonality from polyclonality in B-cell non-Hodgkin lymphoma. Malignant lymphoma, especially non-Hodgkin lymphoma, is one of the most common hematologic malignancies in Thailand. The incidence rate as reported by the Ministry of Public Health is 3.1 per 100,000 population in females, whereas the rate in males is 4.5 per 100,000 population [1]. At Siriraj Hospital, new cases diagnosed as malignant lymphoma averaged 214.6 cases/year [2]. The diagnosis of malignant lymphoma is often problematic, especially in the early stages of the disease. Therefore, detection of antigen receptor gene rearrangement, including T cell receptor (TCR) and immunoglobulin heavy chain (IgH), by polymerase chain reaction (PCR) assay has recently become a standard laboratory test for the discrimination of reactive from malignant clonal lymphoproliferation [3,4].
Analyzing DNA extracted from formalin-fixed, paraffin-embedded tissues by multiplex PCR techniques is more rapid, accurate and highly sensitive. Measuring the size of the amplicon from PCR analysis can be used to diagnose malignant lymphoma, with a monoclonal pattern showing specific and distinct bands on acrylamide gel electrophoresis. However, this technique has some limitations, and some patients might require a further confirmation test such as GeneScan or fragment analysis [5,6]. The GeneScan technique, or fragment analysis, reflects the size and peak of DNA by using capillary gel electrophoresis. This technique is highly sensitive and can detect 0.5-1% of clonal lymphoid cells. It measures the amplicons by using various fluorescently labeled primers at the forward or reverse side and a specific size standard. Using a Genetic Analyzer machine and GeneMapper software (Applied Bioscience, USA), the monoclonal pattern revealed a single, sharp, high peak at the specific size corresponding to the acrylamide gel pattern, whereas the polyclonal pattern showed multiple small peaks condensed at the same size standard. This is the most sensitive and accurate technique; however, it usually requires considerable technical experience and is also costly [7]. Therefore, rapid and more cost-effective techniques are being sought. LightCycler PCR performs diagnostic detection of the amplicon via melting curve analysis within 2 hours with the use of a specific dye [8,9]. Two types of dye are used: SYBR Green I, which is nonspecific, and the dye used for High Resolution Melting analysis (HRM), which is highly sensitive, more accurate and stable. Several reports have demonstrated that this new instrument, combined with DNA intercalating dyes, can be used to discriminate sequence changes in a PCR amplicon without manual handling of the PCR product [10,11].
Therefore, current investigations using melting curve analysis are being developed [12,13]. In this study, three different techniques were compared to evaluate the suitability of LightCycler PCR with HRM as the clonal diagnostic tool for IgH gene rearrangement in B-cell non-Hodgkin lymphoma: thermocycler PCR followed by heteroduplex analysis and PAGE, GeneScan analysis, and LightCycler PCR with HRM.
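    The numerical core of melting-curve analysis can be sketched briefly: fluorescence is recorded as temperature rises, and the melting temperature Tm is read off as the peak of -dF/dT. The sigmoidal curve and the Tm of 84 °C below are synthetic assumptions, not data from this study.

```python
# Minimal melting-curve analysis sketch: locate Tm as the peak of -dF/dT.
# The melt curve is synthetic (logistic drop at an assumed Tm of 84 °C).
import math

def fluorescence(temp_c, tm=84.0, width=1.5):
    """Synthetic melt curve: fluorescence drops as the duplex melts."""
    return 1.0 / (1.0 + math.exp((temp_c - tm) / width))

temps = [70.0 + 0.1 * i for i in range(251)]        # 70-95 °C in 0.1 °C steps
fluo = [fluorescence(t) for t in temps]

# Central-difference derivative -dF/dT; its maximum marks Tm.
neg_dfdt = [-(fluo[i + 1] - fluo[i - 1]) / (temps[i + 1] - temps[i - 1])
            for i in range(1, len(temps) - 1)]
tm_est = temps[1 + neg_dfdt.index(max(neg_dfdt))]
print(f"Estimated Tm = {tm_est:.1f} °C")
```

    HRM instruments work on the same principle but at much finer temperature resolution, which is what lets small sequence differences shift the curve detectably.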

  20. Electrochemical biosensors for hormone analyses.

    PubMed

    Bahadır, Elif Burcu; Sezgintürk, Mustafa Kemal

    2015-06-15

    Electrochemical biosensors have a unique place in the determination of hormones due to their simplicity, sensitivity, portability and ease of operation. Unlike chromatographic techniques, the electrochemical techniques used do not require pre-treatment. Electrochemical biosensors are based on amperometric, potentiometric, impedimetric, and conductometric principles, of which the amperometric technique is the most commonly used. Although electrochemical biosensors offer great selectivity and sensitivity for early clinical analysis, poorly reproducible results and difficult regeneration steps remain the primary challenges to the commercialization of these biosensors. This review summarizes electrochemical (amperometric, potentiometric, impedimetric and conductometric) biosensors for hormone detection for the first time in the literature. After a brief description of the hormones, the immobilization steps and analytical performance of these biosensors are summarized. The linear ranges, LODs, reproducibility and regeneration of the developed biosensors are compared. Future outlooks in this area are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Determination of fluorine concentrations using wavelength dispersive X-ray fluorescence (WDXRF) spectrometry to analyze fluoride precipitates.

    NASA Astrophysics Data System (ADS)

    Lee, H. A.; Lee, J.; Kwon, E.; Kim, D.; Yoon, H. O.

    2015-12-01

    In recent times, fluorine has been receiving increasing attention due to the possibility of chemical (HF) leakage accidents and its high toxicity to humans and the environment. In this respect, a novel approach for the determination of fluorine concentrations in water samples using wavelength dispersive X-ray fluorescence (WDXRF) spectrometry was investigated in this study. The main disadvantage of the WDXRF technique for fluorine analysis is its low analytical sensitivity for light elements with atomic number (Z) less than 15. To overcome this problem, we employed a precipitation reaction in which fluoride reacts with a cation such as Al3+ and/or Ca2+ prior to WDXRF analysis, because these cations have high analytical sensitivity. The cation was added to fluoride solutions to form a precipitate (AlF3 and/or CaF2), and the solution was then filtered through a Whatman filter. After drying at 60 °C for 5 min, the filter was coated with X-ray film and directly analyzed using WDXRF spectrometry. Consequently, we analyzed the cation on the filter, and the fluorine concentration was subsequently back-calculated from the chemical form of the precipitate. This method can improve the analytical sensitivity of the WDXRF technique for fluorine analysis and is applicable to various elements that can form precipitates.
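    The back-calculation step reduces to precipitate stoichiometry: measure the cation on the filter and scale by the fluorine-to-cation mass ratio of CaF2 or AlF3. A minimal sketch, with standard molar masses and an illustrative measured cation mass:

```python
# Stoichiometric back-calculation of fluorine mass from a measured cation
# mass, given the chemical form of the precipitate (CaF2 or AlF3).
# Molar masses are standard values; the 1.0 mg inputs are illustrative.
M_F, M_CA, M_AL = 18.998, 40.078, 26.982  # g/mol

def fluorine_from_cation(cation_mass_mg, precipitate):
    """Infer fluorine mass (mg) from measured cation mass and precipitate form."""
    if precipitate == "CaF2":
        return cation_mass_mg * (2 * M_F / M_CA)   # 2 F per Ca
    if precipitate == "AlF3":
        return cation_mass_mg * (3 * M_F / M_AL)   # 3 F per Al
    raise ValueError(f"unknown precipitate: {precipitate}")

print(f"{fluorine_from_cation(1.0, 'CaF2'):.3f} mg F per 1.0 mg Ca")  # 0.948
print(f"{fluorine_from_cation(1.0, 'AlF3'):.3f} mg F per 1.0 mg Al")  # 2.112
```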

  2. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses' linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
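    The nonlinear block Gauss-Seidel fixed point mentioned above can be sketched on a toy system: alternately solve the "fluid" for loads given the current shape, then the "structure" for the shape given those loads, until the displacement stops changing. The scalar load and stiffness models below are assumptions standing in for the Navier-Stokes and structural solvers.

```python
# Block Gauss-Seidel fixed-point iteration for a toy quasi-static
# fluid-structure coupling. The scalar "solvers" are illustrative stand-ins.
def fluid_solve(displacement):
    """Toy aerodynamic load: decreases as the structure deflects away."""
    return 100.0 / (1.0 + displacement)

def structure_solve(load, stiffness=50.0):
    """Toy linear structure: displacement proportional to applied load."""
    return load / stiffness

def gauss_seidel_aeroelastic(tol=1e-10, max_iter=100):
    u = 0.0                                  # initial displacement
    for it in range(max_iter):
        load = fluid_solve(u)                # fluid block with latest shape
        u_new = structure_solve(load)        # structure block with latest load
        if abs(u_new - u) < tol:
            return u_new, it + 1
        u = u_new
    raise RuntimeError("fixed point did not converge")

u_star, iters = gauss_seidel_aeroelastic()
print(f"converged displacement {u_star:.6f} after {iters} iterations")
```

    For this toy problem the fixed point satisfies u = 2/(1 + u), i.e. u = 1; the iteration contracts because the map's slope at the fixed point is 0.5 in magnitude, illustrating why staggered schemes slow down (or fail) as the coupling strengthens.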

  3. Nano-LC/MALDI-MS using a column-integrated spotting probe for analysis of complex biomolecule samples.

    PubMed

    Hioki, Yusaku; Tanimura, Ritsuko; Iwamoto, Shinichi; Tanaka, Koichi

    2014-03-04

    Nanoflow liquid chromatography (nano-LC) is an essential technique for highly sensitive analysis of complex biological samples, and matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is advantageous for rapid identification of proteins and in-depth analysis of post-translational modifications (PTMs). A combination of nano-LC and MALDI-MS (nano-LC/MALDI-MS) is useful for highly sensitive and detailed analysis in life sciences. However, the existing system does not fully utilize the advantages of each technique, especially in the interface of eluate transfer from nano-LC to a MALDI plate. To effectively combine nano-LC with MALDI-MS, we integrated a nano-LC column and a deposition probe for the first time (column probe) and incorporated it into a nano-LC/MALDI-MS system. Spotting nanoliter eluate droplets directly from the column onto the MALDI plate prevents postcolumn diffusion and preserves the chromatographic resolution. A DHB prespotted plate was prepared to suit the fabricated column probe to concentrate the droplets of nano-LC eluate. The performance of the advanced nano-LC/MALDI-MS system was substantiated by analyzing protein digests. When the system was coupled with multidimensional liquid chromatography (MDLC), trace amounts of glycopeptides that spiked into complex samples were successfully detected. Thus, a nano-LC/MALDI-MS direct-spotting system that eliminates postcolumn diffusion was constructed, and the efficacy of the system was demonstrated through highly sensitive analysis of the protein digests or spiked glycopeptides.

  4. Extracranial glioblastoma diagnosed by examination of pleural effusion using the cell block technique: case report.

    PubMed

    Hori, Yusuke S; Fukuhara, Toru; Aoi, Mizuho; Oda, Kazunori; Shinno, Yoko

    2018-06-01

    Metastatic glioblastoma is a rare condition, and several studies have reported the involvement of multiple organs including the lymph nodes, liver, and lung. The lung and pleura are reportedly the most frequent sites of metastasis, and diagnosis using less invasive tools such as cytological analysis with fine needle aspiration biopsy is challenging. Cytological analysis of fluid specimens tends to be negative because of the small number of cells obtained, whereas the cell block technique reportedly has higher sensitivity because of a decrease in cellular dispersion. Herein, the authors describe a patient with a history of diffuse astrocytoma who developed intractable, progressive accumulation of pleural fluid. Initial cytological analysis of the pleural effusion obtained by thoracocentesis was negative, but reanalysis using the cell block technique revealed the presence of glioblastoma cells. This is the first report to suggest the effectiveness of the cell block technique in the diagnosis of extracranial glioblastoma using pleural effusion. In patients with a history of glioma, the presence of extremely intractable pleural effusion warrants cytological analysis of the fluid using this technique in order to initiate appropriate chemotherapy.

  5. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkó, Zoltán, E-mail: Z.Perko@tudelft.nl; Gilli, Luca, E-mail: Gilli@nrg.eu; Lathouwers, Danny, E-mail: D.Lathouwers@tudelft.nl

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy as the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems.
These tests show consistently good performance, both in terms of the accuracy of the resulting PC representation of quantities and the computational costs associated with constructing the sparse PCE. Basis adaptivity also seems to make the employment of PC techniques possible for problems with a higher number of input parameters (15–20), alleviating a well-known limitation of the traditional approach. The prospect of larger-scale applicability and the simplicity of implementation make such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.
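    The NISP idea can be sketched in one dimension: the PCE coefficients of a response f(X), X ~ N(0,1), are obtained by Gauss-Hermite quadrature against probabilists' Hermite polynomials, and the response variance falls out of the coefficients. The test function is an illustrative assumption; none of the paper's adaptivity machinery is shown.

```python
# One-dimensional Non-Intrusive Spectral Projection (NISP) sketch:
# project a toy response onto probabilists' Hermite polynomials via
# Gauss-Hermite quadrature, then recover the variance from the coefficients.
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial

def f(x):
    return x**2 + x            # toy response; exact PCE is He0 + He1 + He2

order = 4
pts, wts = He.hermegauss(order + 1)   # nodes/weights for weight exp(-x^2/2)
wts = wts / wts.sum()                 # normalize to a probability measure

# Spectral projection: c_k = E[f(X) He_k(X)] / E[He_k(X)^2], E[He_k^2] = k!
coeffs = []
for k in range(order + 1):
    he_k = He.hermeval(pts, [0] * k + [1])
    coeffs.append(np.sum(wts * f(pts) * he_k) / factorial(k))

# Variance of the response from the non-constant PCE terms.
variance = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)
print("PCE coefficients:", np.round(coeffs, 6))
print("PCE variance:", round(variance, 6))  # exact Var(x^2 + x) = 3
```

    In a sparse, basis-adaptive scheme such as FANISP, the quadrature grid and the retained basis terms are both grown adaptively instead of fixed in advance, which is what keeps the cost manageable at 15-20 input dimensions.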

  6. The application of supported liquid extraction in the analysis of benzodiazepines using surface enhanced Raman spectroscopy.

    PubMed

    Doctor, Erika L; McCord, Bruce

    2015-11-01

    Benzodiazepines are among the most frequently prescribed medicines for anxiety disorders and are present in many toxicological screens. These drugs are often administered in the commission of drug-facilitated sexual assaults due to their effects on the central nervous system. Due to the potency of the drugs, only small amounts are usually given to victims; therefore, the target detection limit for these compounds in biological samples has been set at 50 ng/mL. Currently the standard screening method for detection of this class of drug is the immunoassay; however, screening methods that are more sensitive and selective than immunoassays are needed to encompass the wide range of structural variants of this class of compounds. Surface enhanced Raman spectroscopy (SERS) can be highly sensitive and has been shown to permit analysis of various benzodiazepines with limits of detection as low as 6 ng/mL. This technique permits analytical results in less than 2 min when used on pure drug samples. For biological samples, a key issue for analysis by SERS is the removal of exogenous salts and matrix components. In this paper we examine supported liquid extraction as a useful preparation technique for SERS detection. Supported liquid extraction has many of the benefits of liquid-liquid extraction along with the ability to be automated. This technique provides a fast and clean extraction for benzodiazepines from urine at a pH of 5.0, and does not produce large quantities of solvent waste. To validate this procedure we have determined figures of merit and examined simulated urine samples prepared with commonly appearing interferences. It was shown that at pH 5.0 many drugs that are prevalent in urine samples can be removed, permitting selective detection of the benzodiazepine of interest.
This technique has been shown to provide rapid (less than 20 min), sensitive, and specific detection of benzodiazepines with limits of detection between 32 and 600 ng/mL and dynamic range of 32-25,000 ng/mL. It provides the forensic community with a sensitive and specific screening technique for the detection of benzodiazepines in drug facilitated assault cases. Copyright © 2015 Elsevier B.V. All rights reserved.
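    Limits of detection like those quoted above are commonly derived from a calibration curve as LOD = 3.3 s/b, where b is the fitted slope and s the residual standard deviation of the fit (the ICH-style convention). A minimal sketch; the calibration points are synthetic assumptions, not the paper's SERS data.

```python
# Limit-of-detection calculation from a linear calibration fit,
# using LOD = 3.3 * s_residual / slope. Data points are synthetic.
import math

conc   = [0.0, 10.0, 25.0, 50.0, 100.0]   # ng/mL (synthetic standards)
signal = [5.1, 24.8, 55.3, 104.6, 205.2]  # arbitrary intensity units

n = len(conc)
mx, my = sum(conc) / n, sum(signal) / n
sxx = sum((x - mx) ** 2 for x in conc)
sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
slope = sxy / sxx                          # least-squares slope
intercept = my - slope * mx

# Residual standard deviation of the fit (n - 2 degrees of freedom).
sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(conc, signal))
s_res = math.sqrt(sse / (n - 2))

lod = 3.3 * s_res / slope
print(f"slope = {slope:.3f}, LOD = {lod:.2f} ng/mL")
```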

  7. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostics aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of the practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. With this second part, a more applied flavor is taken, and its intended goal is summarizing the current state-of-the-art of analytical LIBS, providing a contemporary snapshot of LIBS applications, and highlighting new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve the sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done for consolidating the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  8. Protein purification and analysis: next generation Western blotting techniques.

    PubMed

    Mishra, Manish; Tiwari, Shuchita; Gomes, Aldrin V

    2017-11-01

    Western blotting is one of the most commonly used techniques in molecular biology and proteomics. Since western blotting is a multistep protocol, variations and errors can occur at any step, reducing the reliability and reproducibility of this technique. Recent reports suggest that a few key steps, such as the sample preparation method, the amount and source of primary antibody used, as well as the normalization method utilized, are critical for reproducible western blot results. Areas covered: In this review, improvements in different areas of western blotting, including protein transfer and antibody validation, are summarized. The review discusses the most advanced western blotting techniques available and highlights the relationship between next generation western blotting techniques and their clinical relevance. Expert commentary: Over the last decade significant improvements have been made in creating more sensitive, automated, and advanced techniques by optimizing various aspects of the western blot protocol. New methods such as single cell-resolution western blot, capillary electrophoresis, DigiWest, automated microfluidic western blotting and microchip electrophoresis have all been developed to reduce potential problems associated with the western blotting technique. Innovative developments in instrumentation and increased sensitivity for western blots offer novel possibilities for increasing the clinical implications of western blot.

  9. Elemental Analysis in Biological Matrices Using ICP-MS.

    PubMed

    Hansen, Matthew N; Clogston, Jeffrey D

    2018-01-01

    The increasing exploration of metallic nanoparticles for use as cancer therapeutic agents necessitates a sensitive technique to track the clearance and distribution of the material once introduced into a living system. Inductively coupled plasma mass spectrometry (ICP-MS) provides a sensitive and selective tool for tracking the distribution of metal components from these nanotherapeutics. This chapter presents a standardized method for processing biological matrices, ensuring complete homogenization of tissues, and outlines the preparation of appropriate standards and controls. The method described herein utilized gold nanoparticle-treated samples; however, the method can easily be applied to the analysis of other metals.

  10. Integrated planar terahertz resonators for femtomolar sensitivity label-free detection of DNA hybridization.

    PubMed

    Nagel, Michael; Bolivar, Peter Haring; Brucherseifer, Martin; Kurz, Heinrich; Bosserhoff, Anja; Büttner, Reinhard

    2002-04-01

    A promising label-free approach for the analysis of genetic material by means of detecting the hybridization of polynucleotides with electromagnetic waves at terahertz (THz) frequencies is presented. Using an integrated waveguide approach, incorporating resonant THz structures as sample carriers and transducers for the analysis of the DNA molecules, we achieve a sensitivity down to femtomolar levels. The approach is demonstrated with time-domain ultrafast techniques based on femtosecond laser pulses for generating and electro-optically detecting broadband THz signals, although the principle can certainly be transferred to other THz technologies.

  11. Biomagnetic separation of Salmonella Typhimurium with high affine and specific ligand peptides isolated by phage display technique

    NASA Astrophysics Data System (ADS)

    Steingroewer, Juliane; Bley, Thomas; Bergemann, Christian; Boschke, Elke

    2007-04-01

    Analyses of food-borne pathogens are of great importance in order to minimize the health risk for consumers. Thus, very sensitive and rapid detection methods are required. Current conventional culture techniques are very time consuming. Modern immunoassays and biochemical analysis also require pre-enrichment steps, resulting in a turnaround time of at least 24 h. Biomagnetic separation (BMS) is a promising, more rapid method. In this study we describe the isolation of highly affine and specific peptides from a phage-peptide library, which, combined with BMS, allows the detection of Salmonella spp. with a sensitivity similar to that of immunomagnetic separation using antibodies.

  12. Comparing Laser Interferometry and Atom Interferometry Approaches to Space-Based Gravitational-Wave Measurement

    NASA Technical Reports Server (NTRS)

    Baker, John; Thorpe, Ira

    2012-01-01

    Thoroughly studied classic space-based gravitational-wave mission concepts such as the Laser Interferometer Space Antenna (LISA) are based on laser-interferometry techniques. Ongoing developments in atom-interferometry techniques have spurred recently proposed alternative mission concepts. These different approaches can be understood on a common footing. We present a comparative analysis of how each type of instrument responds to some of the noise sources which may limit gravitational-wave mission concepts. Sensitivity to laser frequency instability is essentially the same for either approach. Spacecraft acceleration reference stability sensitivities are different, allowing smaller spacecraft separations in the atom interferometry approach, but acceleration noise requirements are nonetheless similar. Each approach has distinct additional measurement noise issues.

  13. Assessment of Sentinel Node Biopsies With Full-Field Optical Coherence Tomography.

    PubMed

    Grieve, Kate; Mouslim, Karima; Assayag, Osnath; Dalimier, Eugénie; Harms, Fabrice; Bruhat, Alexis; Boccara, Claude; Antoine, Martine

    2016-04-01

    Current techniques for the intraoperative analysis of sentinel lymph nodes during breast cancer surgery present drawbacks such as time and tissue consumption. Full-field optical coherence tomography is a novel noninvasive, high-resolution, fast imaging technique. This study investigated the use of full-field optical coherence tomography as an alternative technique for the intraoperative analysis of sentinel lymph nodes. Seventy-one axillary lymph nodes from 38 patients at Tenon Hospital were imaged minutes after excision with full-field optical coherence tomography in the pathology laboratory, before being handled for histological analysis. A pathologist performed a blind diagnosis (benign/malignant), based on the full-field optical coherence tomography images alone, which resulted in a sensitivity of 92% and a specificity of 83% (n = 65 samples). Regular feedback was given during the blind diagnosis, with thorough analysis of the images, such that features of normal and suspect nodes were identified in the images and compared with histology. A nonmedically trained imaging expert also performed a blind diagnosis aided by the reading criteria defined by the pathologist, which resulted in 85% sensitivity and 90% specificity (n = 71 samples). The number of false positives of the pathologist was reduced by 3 in a second blind reading a few months later. These results indicate that following adequate training, full-field optical coherence tomography can be an effective noninvasive diagnostic tool for extemporaneous sentinel node biopsy qualification. © The Author(s) 2015.
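    The quoted figures come from a standard confusion-matrix calculation: sensitivity = TP/(TP + FN) and specificity = TN/(TN + FP). The tally below is an illustrative assumption chosen to reproduce roughly the pathologist's 92%/83% over 65 samples, not the study's actual counts.

```python
# Sensitivity and specificity from confusion-matrix counts. The counts are
# hypothetical (13 malignant, 52 benign nodes), chosen only to illustrate
# how figures like 92% / 83% arise.
def sens_spec(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from raw counts."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec(tp=12, fn=1, tn=43, fp=9)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```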

  14. It's Nolan Ryan: A Historiography Teaching Technique.

    ERIC Educational Resources Information Center

    Mackey, Thomas

    1991-01-01

    Presents a plan for teaching historiography through analysis of baseball cards. Explains that students can learn about society, culture, discrimination, and inference. Reports that the lesson increased student interest, motivation, and sensitivity to the importance of historical sources. (DK)

  15. Is High Resolution Melting Analysis (HRMA) Accurate for Detection of Human Disease-Associated Mutations? A Meta Analysis

    PubMed Central

    Ma, Feng-Li; Jiang, Bo; Song, Xiao-Xiao; Xu, An-Gao

    2011-01-01

    Background High Resolution Melting Analysis (HRMA) is becoming the preferred method for mutation detection. However, its accuracy in the individual clinical diagnostic setting is variable. To assess the diagnostic accuracy of HRMA for human mutations in comparison to DNA sequencing in different routine clinical settings, we have conducted a meta-analysis of published reports. Methodology/Principal Findings Out of 195 publications obtained from the initial search criteria, thirty-four studies assessing the accuracy of HRMA were included in the meta-analysis. We found that HRMA was a highly sensitive test for detecting disease-associated mutations in humans. Overall, the summary sensitivity was 97.5% (95% confidence interval (CI): 96.8–98.5; I2 = 27.0%). Subgroup analysis showed even higher sensitivity for non-HR-1 instruments (sensitivity 98.7% (95%CI: 97.7–99.3; I2 = 0.0%)) and an eligible sample size subgroup (sensitivity 99.3% (95%CI: 98.1–99.8; I2 = 0.0%)). HRMA specificity showed considerable heterogeneity between studies. Sensitivity of the techniques was influenced by sample size and instrument type but not by sample source or dye type. Conclusions/Significance These findings show that HRMA is a highly sensitive, simple and low-cost test to detect human disease-associated mutations, especially for samples with mutations of low incidence. The burden on DNA sequencing could be significantly reduced by the implementation of HRMA, but it should be recognized that its sensitivity varies according to the number of samples with/without mutations, and positive results require DNA sequencing for confirmation. PMID:22194806
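    Summary statistics like the pooled sensitivity and I² reported above can be sketched with fixed-effect inverse-variance pooling on the logit scale, with Cochran's Q giving I² = max(0, (Q - df)/Q). The per-study TP/FN counts below are synthetic assumptions, not those of the 34 included studies.

```python
# Fixed-effect meta-analysis sketch: pool per-study sensitivities on the
# logit scale with inverse-variance weights, then compute Cochran's Q and I².
# Study counts are synthetic.
import math

studies = [(48, 2), (95, 3), (60, 1), (120, 4)]   # (true pos, false neg)

logits, weights = [], []
for tp, fn in studies:
    p = (tp + 0.5) / (tp + fn + 1.0)              # continuity-corrected sensitivity
    var = 1.0 / (tp + 0.5) + 1.0 / (fn + 0.5)     # variance of the logit
    logits.append(math.log(p / (1 - p)))
    weights.append(1.0 / var)

pooled_logit = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
pooled_sens = 1.0 / (1.0 + math.exp(-pooled_logit))

q = sum(w * (l - pooled_logit) ** 2 for w, l in zip(weights, logits))
i2 = max(0.0, (q - (len(studies) - 1)) / q) if q > 0 else 0.0
print(f"pooled sensitivity = {pooled_sens:.1%}, I2 = {i2:.0%}")
```

    A random-effects model (as is typical when, as here, specificity shows substantial heterogeneity) would additionally widen the weights by a between-study variance term.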

  16. Detection of Genetically Altered Copper Levels in Drosophila Tissues by Synchrotron X-Ray Fluorescence Microscopy

    PubMed Central

    Lye, Jessica C.; Hwang, Joab E. C.; Paterson, David; de Jonge, Martin D.; Howard, Daryl L.; Burke, Richard

    2011-01-01

    Tissue-specific manipulation of known copper transport genes in Drosophila tissues results in phenotypes that are presumably due to an alteration in copper levels in the targeted cells. However, direct confirmation of this has to date been technically challenging. Measures of cellular copper content such as expression levels of copper-responsive genes or cuproenzyme activity levels, while useful, are indirect. First-generation copper-sensitive fluorophores show promise but currently lack the sensitivity required to detect subtle changes in copper levels. Moreover, such techniques do not provide information regarding other relevant biometals such as zinc or iron. Traditional techniques for measuring elemental composition such as inductively coupled plasma mass spectroscopy are not sensitive enough for use with the small tissue amounts available in Drosophila research. Here we present synchrotron x-ray fluorescence microscopy analysis of two different Drosophila tissues, the larval wing imaginal disc and sectioned adult fly heads, and show that this technique can be used to detect changes in tissue copper levels caused by targeted manipulation of known copper homeostasis genes. PMID:22053217

  17. Validation of diffuse correlation spectroscopy sensitivity to nicotinamide-induced blood flow elevation in the murine hindlimb using the fluorescent microsphere technique

    NASA Astrophysics Data System (ADS)

    Proctor, Ashley R.; Ramirez, Gabriel A.; Han, Songfeng; Liu, Ziping; Bubel, Tracy M.; Choe, Regine

    2018-03-01

    Nicotinamide has been shown to affect blood flow in both tumor and normal tissues, including skeletal muscle. Intraperitoneal injection of nicotinamide was used as a simple intervention to test the sensitivity of noninvasive diffuse correlation spectroscopy (DCS) to changes in blood flow in the murine left quadriceps femoris skeletal muscle. DCS was then compared with the gold-standard fluorescent microsphere (FM) technique for validation. The nicotinamide dose-response experiment showed that relative blood flow measured by DCS increased following treatment with 500- and 1000-mg/kg nicotinamide. The DCS and FM technique comparison showed that blood flow index measured by DCS was correlated with FM counts quantified by image analysis. The results of this study show that DCS is sensitive to nicotinamide-induced blood flow elevation in the murine left quadriceps femoris. Additionally, the results of the comparison were consistent with similar studies in higher-order animal models, suggesting that mouse models can be effectively employed to investigate the utility of DCS for various blood flow measurement applications.

  18. Approximate analysis for repeated eigenvalue problems with applications to controls-structure integrated design

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Hou, Gene J. W.

    1994-01-01

    A method for eigenvalue and eigenvector approximate analysis for the case of repeated eigenvalues with distinct first derivatives is presented. The approximate analysis method developed involves a reparameterization of the multivariable structural eigenvalue problem in terms of a single positive-valued parameter. The resulting equations yield first-order approximations to changes in the eigenvalues and the eigenvectors associated with the repeated eigenvalue problem. This work also presents a numerical technique that facilitates the definition of an eigenvector derivative for the case of repeated eigenvalues with repeated eigenvalue derivatives (of all orders). Examples are given which demonstrate the application of such equations for sensitivity and approximate analysis. Emphasis is placed on the application of sensitivity analysis to large-scale structural and controls-structures optimization problems.
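    As background for the repeated-eigenvalue method above, the familiar distinct-eigenvalue result can be sketched numerically: for a symmetric matrix A(p), the first-order change in eigenvalue lambda_i is v_i^T (dA/dp) v_i with v_i the unit eigenvector. The matrices below are synthetic, and the repeated case, which requires the paper's single-parameter reparameterization, is not reproduced here:

```python
import numpy as np

# First-order eigenvalue sensitivity check for the simple (distinct) case:
# d(lambda_i)/dp = v_i^T (dA/dp) v_i for symmetric A. Synthetic matrices.
rng = np.random.default_rng(7)
S = rng.standard_normal((5, 5))
A = S + S.T                              # symmetric base matrix
D = np.diag([1.0, 0.0, 0.0, 0.0, 0.0])  # dA/dp: chosen perturbation direction

w, V = np.linalg.eigh(A)
dw_pred = np.array([V[:, i] @ D @ V[:, i] for i in range(5)])

eps = 1e-6                               # small step, finite-difference check
w_fd = (np.linalg.eigvalsh(A + eps * D) - w) / eps
err = np.max(np.abs(w_fd - dw_pred))
print(err)                               # small: first-order prediction holds
```

    The repeated case breaks this recipe because the eigenvectors of a repeated eigenvalue are not unique, which is precisely the gap the reparameterization in the abstract addresses.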

  19. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis), and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
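    The Modal Assurance Criterion step of such mode tracking can be sketched as follows, using synthetic mode-shape matrices; the tool's adaptive tracking algorithm and energy-distribution indicators are not reproduced here:

```python
import numpy as np

# Modal Assurance Criterion (MAC) pairing of mode shapes between two
# models. The mode-shape matrices below are synthetic.
def mac(phi_a, phi_b):
    """MAC matrix: entry (i, j) compares mode i of model A with mode j of B."""
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0),
                   np.sum(phi_b * phi_b, axis=0))
    return num / den

rng = np.random.default_rng(0)
phi_a = rng.standard_normal((50, 4))   # 4 modes on 50 DOFs (model A)
# Model B: same modes slightly perturbed, with modes 0 and 1 swapped
phi_b = phi_a[:, [1, 0, 2, 3]] + 0.05 * rng.standard_normal((50, 4))

m = mac(phi_a, phi_b)
pairing = np.argmax(m, axis=1)         # tracked correspondence A -> B
print(pairing)                         # detects the swap of modes 0 and 1
```

    Cross-orthogonality works similarly but weights the inner product by a mass matrix, which is why it needs both models reduced to a common set of degrees of freedom.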

  20. Coherent Doppler Lidar for Boundary Layer Studies and Wind Energy

    NASA Astrophysics Data System (ADS)

    Choukulkar, Aditya

    This thesis outlines the development of a vector retrieval technique, based on data assimilation, for a coherent Doppler LIDAR (Light Detection and Ranging). A detailed analysis of the Optimal Interpolation (OI) technique for vector retrieval is presented. Through several modifications to the OI technique, it is shown that the modified technique results in significant improvement in velocity retrieval accuracy. These modifications include changes to innovation covariance partitioning, covariance binning, and analysis increment calculation. It is observed that the modified technique is able to make retrievals with better accuracy, preserves local information better, and compares well with tower measurements. In order to study the error of representativeness and vector retrieval error, a lidar simulator was constructed. Using the lidar simulator, a thorough sensitivity analysis of the lidar measurement process and vector retrieval is carried out. The error of representativeness as a function of scales of motion and the sensitivity of vector retrieval to look angle are quantified. Using the modified OI technique, a study of nocturnal flow in Owens Valley, CA, was carried out to identify and understand uncharacteristic events on the night of March 27, 2006. Observations from 1030 UTC to 1230 UTC (0230 to 0430 local time) on March 27, 2006, are presented. Lidar observations show complex and uncharacteristic flows such as sudden bursts of westerly cross-valley wind mixing with the dominant up-valley wind. Model results from the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS RTM) and other in-situ instruments are used to corroborate and complement these observations. The modified OI technique is used to identify uncharacteristic and extreme flow events at a wind development site. Estimates of turbulence and shear from this technique are compared to tower measurements.
A formulation for equivalent wind speed in the presence of variations in wind speed and direction, combined with shear is developed and used to determine wind energy content in presence of turbulence.
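    A generic form of the Optimal Interpolation analysis step that the thesis modifies can be sketched as x_a = x_b + K(y - H x_b), with gain K = B H^T (H B H^T + R)^(-1). The covariances and observations below are toy values, not the lidar retrieval's actual statistics:

```python
import numpy as np

# Generic Optimal Interpolation (OI) analysis step with toy values.
n, m = 6, 3                        # state size, number of observations
x_b = np.zeros(n)                  # background wind field (toy)
B = np.eye(n)                      # background error covariance
R = 0.1 * np.eye(m)                # observation error covariance
H = np.zeros((m, n))               # observation operator: picks 3 states
H[0, 0] = H[1, 2] = H[2, 4] = 1.0
y = np.array([1.0, -0.5, 2.0])     # radial-velocity observations (toy)

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # OI (Kalman) gain
x_a = x_b + K @ (y - H @ x_b)                  # add the analysis increment
print(x_a)
```

    The modifications described in the abstract act on the ingredients of this step, i.e. how the innovation covariance H B H^T + R is partitioned and binned and how the increment K(y - H x_b) is computed, not on the form of the equation itself.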

  1. Breath analysis using external cavity diode lasers: a review

    NASA Astrophysics Data System (ADS)

    Bayrakli, Ismail

    2017-04-01

    Most techniques that are used for diagnosis and therapy of diseases are invasive. Reliable noninvasive methods are always needed for the comfort of patients. Owing to its noninvasiveness, ease of use, and easy repeatability, exhaled breath analysis is a very good candidate for this purpose. Breath analysis can be performed using different techniques, such as gas chromatography mass spectrometry (MS), proton transfer reaction-MS, and selected ion flow tube-MS. However, these devices are bulky and require complicated procedures for sample collection and preconcentration. Therefore, they are not practical for routine applications in hospitals. Laser-based techniques, with their small size, robustness, low cost, short response time, accuracy, precision, high sensitivity, selectivity, low detection limit, and capacity for real-time, point-of-care detection, have great potential for routine use in hospitals. In this review paper, the recent advances in the fields of external cavity lasers and breath analysis for detection of diseases are presented.

  2. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose output precision is quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  3. Diagnosis of 25 genotypes of human papillomaviruses for their physical statuses in cervical precancerous/cancerous lesions: a comparison of E2/E6E7 ratio-based vs. multiple E1-L1/E6E7 ratio-based detection techniques.

    PubMed

    Zhang, Rong; He, Yi-feng; Chen, Mo; Chen, Chun-mei; Zhu, Qiu-jing; Lu, Huan; Wei, Zhen-hong; Li, Fang; Zhang, Xiao-xin; Xu, Cong-jian; Yu, Long

    2014-10-02

    Cervical lesions caused by integrated human papillomavirus (HPV) infection are highly dangerous because they can quickly develop into invasive cancers. However, clinicians are currently hampered by the lack of a quick, convenient and precise technique to detect integrated/mixed infections of various genotypes of HPVs in the cervix. This study aimed to develop a practical tool to determine the physical status of different HPVs and evaluate its clinical significance. The target population comprised 1162 women with an HPV infection history of more than six months and an abnormal cervical cytological finding. The multiple E1-L1/E6E7 ratio analysis, a novel technique, was developed based on determining the ratios of E1/E6E7, E2/E6E7, E4E5/E6E7, L2/E6E7 and L1/E6E7 within the viral genome. Any imbalanced ratios indicate integration. Its diagnostic and predictive performances were compared with those of E2/E6E7 ratio analysis. The detection accuracy of both techniques was evaluated using the gold-standard technique "detection of integrated papillomavirus sequences" (DIPS). To realize a multigenotypic detection goal, a primer and probe library was established. The integration rate of a particular genotype of HPV was correlated with its tumorigenic potential and women with higher lesion grades often carried lower viral loads. The E1-L1/E6E7 ratio analysis achieved 92.7% sensitivity and 99.0% specificity in detecting HPV integration, while the E2/E6E7 ratio analysis showed a much lower sensitivity (75.6%) and a similar specificity (99.3%). Interference due to episomal copies was observed in both techniques, leading to false-negative results. However, some positive results of E1-L1/E6E7 ratio analysis were missed by DIPS due to its stochastic detection nature. 
The E1-L1/E6E7 ratio analysis is more efficient than E2/E6E7 ratio analysis and DIPS in predicting precancerous/cancerous lesions, in which both positive predictive values (36.7%-82.3%) and negative predictive values (75.9%-100%) were highest (based on the results of three rounds of biopsies). The multiple E1-L1/E6E7 ratio analysis is more sensitive and predictive than E2/E6E7 ratio analysis as a triage test for detecting HPV integration. It can effectively narrow the range of candidates for colposcopic examination and cervical biopsy, thereby lowering the expense of cervical cancer prevention.
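    The sensitivity, specificity and predictive values quoted above all derive from a 2x2 confusion table. A minimal sketch, with hypothetical counts chosen only to land near the reported E1-L1/E6E7 figures, not the study's actual data:

```python
# Diagnostic performance measures computed from a 2x2 confusion table.
# Counts are hypothetical, chosen only to land near the reported
# sensitivity/specificity, not the study's data.
def diagnostics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # fraction of integrations detected
        "specificity": tn / (tn + fp),   # fraction of negatives cleared
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

res = diagnostics(tp=76, fp=3, fn=6, tn=290)
print({k: round(v, 3) for k, v in res.items()})
```

    The triage argument in the abstract follows directly from the NPV: a high negative predictive value is what lets a negative ratio-analysis result safely exclude a woman from colposcopy and biopsy.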

  4. Molecular wake shield gas analyzer

    NASA Technical Reports Server (NTRS)

    Hoffman, J. H.

    1980-01-01

    Techniques for measuring and characterizing the ultrahigh vacuum in the wake of an orbiting spacecraft are studied. A high sensitivity mass spectrometer that contains a double mass analyzer consisting of an open source miniature magnetic sector field neutral gas analyzer and an identical ion analyzer is proposed. These are configured to detect and identify gas and ion species of hydrogen, helium, nitrogen, oxygen, nitric oxide, and carbon dioxide and any other gas or ion species in the 1 to 46 amu mass range. This range covers the normal atmospheric constituents. The sensitivity of the instrument is sufficient to measure ambient gases and ions with a particle density of the order of one per cc. A chemical pump, or getter, mounted near the entrance aperture of the neutral gas analyzer integrates the absorption of ambient gases for a selectable period of time for subsequent release and analysis. The sensitivity is realizable for all but the rare gases using this technique.

  5. Determination of complex formation constants by phase sensitive alternating current polarography: Cadmium-polymethacrylic acid and cadmium-polygalacturonic acid.

    PubMed

    Garrigosa, Anna Maria; Gusmão, Rui; Ariño, Cristina; Díaz-Cruz, José Manuel; Esteban, Miquel

    2007-10-15

    The use of phase sensitive alternating current polarography (ACP) for the evaluation of complex formation constants of systems where electrodic adsorption is present has been proposed. The applicability of the technique implies the previous selection of the phase angle where the contribution of capacitive current is minimized. This is done using Multivariate Curve Resolution by Alternating Least Squares (MCR-ALS) in the analysis of ACP measurements at different phase angles. The method is checked by the study of the complexation of Cd by polymethacrylic (PMA) and polygalacturonic (PGA) acids; the optimal phase angles were found to be ca. -10 degrees for the Cd-PMA and ca. -15 degrees for the Cd-PGA systems. The goodness of phase sensitive ACP has been demonstrated by comparing the determined complex formation constants with those obtained by reverse pulse polarography, a technique that minimizes the electrode adsorption effects on the measured currents.

  6. Multi-modal approach using Raman spectroscopy and optical coherence tomography for the discrimination of colonic adenocarcinoma from normal colon

    PubMed Central

    Ashok, Praveen C.; Praveen, Bavishna B.; Bellini, Nicola; Riches, Andrew; Dholakia, Kishan; Herrington, C. Simon

    2013-01-01

    We report a multimodal optical approach using both Raman spectroscopy and optical coherence tomography (OCT) in tandem to discriminate between colonic adenocarcinoma and normal colon. Although both of these non-invasive techniques are capable of discriminating between normal and tumour tissues, they are unable individually to provide both the high specificity and high sensitivity required for disease diagnosis. We combine the chemical information derived from Raman spectroscopy with the texture parameters extracted from OCT images. The sensitivity obtained using Raman spectroscopy and OCT individually was 89% and 78% respectively and the specificity was 77% and 74% respectively. Combining the information derived using the two techniques increased both sensitivity and specificity to 94% demonstrating that combining complementary optical information enhances diagnostic accuracy. These data demonstrate that multimodal optical analysis has the potential to achieve accurate non-invasive cancer diagnosis. PMID:24156073

  7. A comparison between DART-MS and DSA-MS in the forensic analysis of writing inks.

    PubMed

    Drury, Nicholas; Ramotowski, Robert; Moini, Mehdi

    2018-05-23

    Ambient ionization mass spectrometry is gaining momentum in forensic science laboratories because of its high speed of analysis, minimal sample preparation, and information-rich results. One such application of ambient ionization methodology is the analysis of writing inks from questioned documents, where colorants of interest may not be soluble in common solvents, rendering thin layer chromatography (TLC) and separation-mass spectrometry methods such as LC/MS(-MS) impractical. Ambient ionization mass spectrometry uses a variety of ionization techniques, such as Penning ionization in Direct Analysis in Real Time (DART), atmospheric pressure chemical ionization in Direct Sample Analysis (DSA), and electrospray ionization in Desorption Electrospray Ionization (DESI). In this manuscript, two of the commonly used ambient ionization techniques are compared: Perkin Elmer DSA-MS and IonSense DART in conjunction with a JEOL AccuTOF MS. Both technologies were equally successful in analyzing writing inks and produced similar spectra. DSA-MS produced less background signal, likely because of its closed source configuration; however, the open source configuration of DART-MS provided more flexibility in sample positioning for optimum sensitivity, thereby allowing smaller pieces of paper containing writing ink to be analyzed. Under these conditions, the minimum sample required for DART-MS was 1 mm strokes of ink on paper, whereas DSA-MS required a minimum of 3 mm. Moreover, both techniques showed comparable repeatability. Evaluation of the analytical figures of merit, including sensitivity, linear dynamic range, and repeatability, for DSA-MS and DART-MS analysis is provided. To place the technique in its forensic context, DART-MS was applied to the analysis of United States Secret Service ink samples directly on a sampling mesh, and the results were compared with DSA-MS of the same inks on paper. 
Unlike analysis using separation mass spectrometry, which requires sample preparation, both DART-MS and DSA-MS successfully analyzed writing inks with minimal sample preparation. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Biosensors and their applications in detection of organophosphorus pesticides in the environment.

    PubMed

    Hassani, Shokoufeh; Momtaz, Saeideh; Vakhshiteh, Faezeh; Maghsoudi, Armin Salek; Ganjali, Mohammad Reza; Norouzi, Parviz; Abdollahi, Mohammad

    2017-01-01

    This review discusses past and recent advancements of biosensors, focusing on the detection of organophosphorus pesticides (OPs) owing to their extensive use during the last decades. Apart from agricultural benefits, OPs also impose adverse toxicological effects on animal and human populations. Conventional approaches such as chromatographic techniques used for pesticide detection are associated with several limitations. Biosensor technology is unique in its detection sensitivity, selectivity, remarkable performance capabilities, simplicity, on-site operation, and ease of fabrication and incorporation of nanomaterials. This study also provides specifications of most OP biosensors reported to date, grouped by their transducer system. In addition, we highlight the application of advanced complementary materials and analysis techniques in OP detection systems. The availability of these new materials, together with new sensing techniques, has led to the introduction of easy-to-use analytical tools of high sensitivity and specificity in the design and construction of OP biosensors. In this review, we elaborate on the achievements in sensing systems concerning innovative nanomaterials and analytical techniques, with emphasis on OPs.

  9. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
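    The probabilistic idea described above can be sketched with a plain Monte Carlo propagation: sample the uncertain inputs, run the capability model on each sample, and report a distribution rather than a single deterministic number. The toy power model and distributions below are illustrative; they are not the SPACE model or ISS values:

```python
import numpy as np

# Plain Monte Carlo propagation of input uncertainty through a toy power
# capability model. Model and distributions are illustrative only.
rng = np.random.default_rng(42)
n = 100_000

solar_flux = rng.normal(1367.0, 10.0, n)   # W/m^2, uncertain
array_eff = rng.normal(0.14, 0.005, n)     # photovoltaic efficiency
area = 375.0                               # m^2, taken as fixed here
line_loss = rng.uniform(0.02, 0.05, n)     # distribution losses

power_kw = solar_flux * array_eff * area * (1.0 - line_loss) / 1000.0

# Report a distribution instead of one deterministic number
print(f"mean = {power_kw.mean():.1f} kW, "
      f"5th percentile = {np.percentile(power_kw, 5):.1f} kW")
```

    The fast probabilistic techniques referenced in the abstract replace this brute-force sampling with far cheaper approximations, but the kind of output, a distribution of power capability rather than a point value, is the same.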

  10. Isoschizomers and amplified fragment length polymorphism for the detection of specific cytosine methylation changes.

    PubMed

    Ruiz-García, Leonor; Cabezas, Jose Antonio; de María, Nuria; Cervera, María-Teresa

    2010-01-01

    Different molecular techniques have been developed to study either the global level of methylated cytosines or methylation at specific gene sequences. One of them is a modification of the Amplified Fragment Length Polymorphism (AFLP) technique that has been used to study methylation of anonymous CCGG sequences in different fungi, plant and animal species. The main variation of this technique is based on the use of isoschizomers with different methylation sensitivity (such as HpaII and MspI) as a frequent cutter restriction enzyme. For each sample, AFLP analysis is performed using both EcoRI/HpaII and EcoRI/MspI digested samples. Comparative analysis between EcoRI/HpaII and EcoRI/MspI fragment patterns allows the identification of two types of polymorphisms: (1) "Methylation-insensitive polymorphisms" that show common EcoRI/HpaII and EcoRI/MspI patterns but are detected as polymorphic amplified fragments among samples; and (2) "Methylation-sensitive polymorphisms" that are associated with amplified fragments differing in their presence or absence or in their intensity between EcoRI/HpaII and EcoRI/MspI patterns. This chapter describes a detailed protocol of this technique and discusses modifications that can be applied to adjust the technology to different species of interest.
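    The comparative scoring described above can be sketched as a set comparison per fragment: a fragment whose presence/absence pattern agrees between the EcoRI/HpaII and EcoRI/MspI digests but varies across samples is a methylation-insensitive polymorphism, while a pattern that differs between the two digests flags methylation sensitivity. The band data below are invented for illustration:

```python
# Comparative scoring of AFLP band patterns between EcoRI/HpaII and
# EcoRI/MspI digests, per the two polymorphism classes described above.
# Band data are invented for illustration.
def classify(hpaii, mspi):
    """hpaii, mspi: dicts mapping sample -> set of scored fragments."""
    calls = {}
    for frag in set().union(*hpaii.values(), *mspi.values()):
        h = {s for s in hpaii if frag in hpaii[s]}   # presence in HpaII digest
        m = {s for s in mspi if frag in mspi[s]}     # presence in MspI digest
        if h == m and 0 < len(h) < len(hpaii):
            calls[frag] = "methylation-insensitive polymorphism"
        elif h != m:
            calls[frag] = "methylation-sensitive polymorphism"
    return calls                                     # monomorphic bands omitted

hpaii = {"s1": {"f1", "f2"}, "s2": {"f2"}}
mspi = {"s1": {"f1", "f2"}, "s2": {"f2", "f3"}}
calls = classify(hpaii, mspi)
print(calls)
```

    Real scoring also compares band intensities, which this presence/absence sketch ignores.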

  11. Fundamental and assessment of concrete structure monitoring by using acoustic emission technique testing: A review

    NASA Astrophysics Data System (ADS)

    Desa, M. S. M.; Ibrahim, M. H. W.; Shahidan, S.; Ghadzali, N. S.; Misri, Z.

    2018-04-01

    Acoustic emission (AE) testing is a non-destructive testing (NDT) technique that can be used to detect damage in concrete structures, such as cracking and corrosion, to assess their stability, and to monitor the energy released as cracks open and grow. This article gives a comprehensive review of the AE technique and its application to concrete structures for structural health monitoring (SHM). Assessments of the AE technique are reviewed to give a perspective on its use in structural engineering applications such as dams, bridges and buildings, drawing on previous research into AE applications. The assessment focuses on the fundamentals of parametric and signal waveform analysis and on the technique's capability in structural monitoring. Moreover, the assessments and applications of AE are summarized and highlighted for future reference.

  12. Comparison of DWI and 18F-FDG PET/CT for assessing preoperative N-staging in gastric cancer: evidence from a meta-analysis.

    PubMed

    Luo, Mingxu; Song, Hongmei; Liu, Gang; Lin, Yikai; Luo, Lintao; Zhou, Xin; Chen, Bo

    2017-10-13

    The diagnostic values of diffusion weighted imaging (DWI) and 18F-fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) for N-staging of gastric cancer (GC) were identified and compared. After a systematic search to identify relevant articles, meta-analysis was used to summarize the sensitivities, specificities, and areas under curves (AUCs) for DWI and PET/CT. To better understand the diagnostic utility of DWI and PET/CT for N-staging, the performance of multi-detector computed tomography (MDCT) was used as a reference. Fifteen studies were analyzed. The pooled sensitivity, specificity, and AUC with 95% confidence intervals of DWI were 0.79 (0.73-0.85), 0.69 (0.61-0.77), and 0.81 (0.77-0.84), respectively. For PET/CT, the corresponding values were 0.52 (0.39-0.64), 0.88 (0.61-0.97), and 0.66 (0.62-0.70), respectively. Comparison of the two techniques revealed DWI had higher sensitivity and AUC, but no difference in specificity. DWI exhibited higher sensitivity but lower specificity than MDCT, and 18F-FDG PET/CT had lower sensitivity and equivalent specificity. Overall, DWI performed better than 18F-FDG PET/CT for preoperative N-staging in GC. When the efficacy of MDCT was taken as a reference, DWI represented a complementary imaging technique, while 18F-FDG PET/CT had limited utility for preoperative N-staging.

  13. A comprehensive approach to identify dominant controls of the behavior of a land surface-hydrology model across various hydroclimatic conditions

    NASA Astrophysics Data System (ADS)

    Haghnegahdar, Amin; Elshamy, Mohamed; Yassin, Fuad; Razavi, Saman; Wheater, Howard; Pietroniro, Al

    2017-04-01

    Complex physically-based environmental models are being increasingly used as the primary tool for watershed planning and management due to advances in computation power and data acquisition. Model sensitivity analysis plays a crucial role in understanding the behavior of these complex models and improving their performance. Due to the non-linearity and interactions within these complex models, global sensitivity analysis (GSA) techniques should be adopted to provide a comprehensive understanding of model behavior and identify its dominant controls. In this study we adopt a multi-basin, multi-criteria GSA approach to systematically assess the behavior of the Modélisation Environmentale-Surface et Hydrologie (MESH) model across various hydroclimatic conditions in Canada, including areas in the Great Lakes Basin, Mackenzie River Basin, and South Saskatchewan River Basin. MESH is a semi-distributed physically-based coupled land surface-hydrology modelling system developed by Environment and Climate Change Canada (ECCC) for various water resources management purposes in Canada. We use a novel method, called Variogram Analysis of Response Surfaces (VARS), to perform sensitivity analysis. VARS is a variogram-based GSA technique that can efficiently provide a spectrum of sensitivity information across a range of scales within the parameter space. We use multiple metrics to identify dominant controls of model response (e.g. streamflow) to model parameters under various conditions such as high flows, low flows, and flow volume. We also investigate the influence of initial conditions on model behavior as part of this study. Our preliminary results suggest that this type of GSA can significantly help with estimating model parameters, decreasing calibration computational burden, and reducing prediction uncertainty.
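    The variogram idea behind VARS can be illustrated on a toy function (not the MESH model): the directional variogram gamma(h) of the model response along one parameter axis, estimated from sample pairs a distance h apart, is large for influential parameters and small for unimportant ones:

```python
import numpy as np

# Toy illustration of a directional variogram of model response along each
# parameter axis, the core quantity in variogram-based GSA. The test
# function is ours, not the MESH model.
def model(x1, x2):
    return np.sin(3 * x1) + 0.1 * x2   # x1 dominates the response

rng = np.random.default_rng(1)
base = rng.uniform(0.0, 1.0, (5000, 2))   # random base points in [0, 1]^2

def gamma(axis, h):
    """Directional variogram 0.5 * E[(y(x + h*e_axis) - y(x))^2]."""
    shifted = base.copy()
    shifted[:, axis] += h
    d = model(*shifted.T) - model(*base.T)
    return 0.5 * np.mean(d ** 2)

h = 0.1
g1, g2 = gamma(0, h), gamma(1, h)
print(f"gamma_x1 = {g1:.4f}, gamma_x2 = {g2:.6f}")
# The much larger variogram along x1 flags it as the dominant control.
```

    VARS evaluates such variograms across a range of h, which is what yields the "spectrum of sensitivity information across a range of scales" mentioned in the abstract.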

  14. Online immunoaffinity LC/MS/MS. A general method to increase sensitivity and specificity: How do you do it and what do you need?

    PubMed

    Dufield, Dawn R; Radabaugh, Melissa R

    2012-02-01

    There is an increased emphasis on hyphenated techniques such as immunoaffinity LC/MS/MS (IA-LC/MS/MS) or IA-LC/MRM. These techniques offer competitive advantages with respect to sensitivity and selectivity over traditional LC/MS and are complementary to ligand binding assays (LBAs) or ELISAs. However, these techniques are not entirely straightforward, and there are several tips and tricks to routine sample analysis. We describe here our methods and procedures for how to perform online IA-LC/MS/MS, including a detailed protocol for the preparation of antibody (Ab) enrichment columns. We have included sample trapping and Ab methods. Furthermore, we highlight tips, tricks, and minimal and optimal approaches. This technology has been shown to be viable for several applications, species and fluids, from small molecules to proteins, and from biomarkers to PK assays. Copyright © 2011 Elsevier Inc. All rights reserved.

  15. Fast and sensitive trace analysis of malachite green using a surface-enhanced Raman microfluidic sensor.

    PubMed

    Lee, Sangyeop; Choi, Junghyun; Chen, Lingxin; Park, Byungchoon; Kyong, Jin Burm; Seong, Gi Hun; Choo, Jaebum; Lee, Yeonjung; Shin, Kyung-Hoon; Lee, Eun Kyu; Joo, Sang-Woo; Lee, Kyeong-Hee

    2007-05-08

    A rapid and highly sensitive trace analysis technique for determining malachite green (MG) in a polydimethylsiloxane (PDMS) microfluidic sensor was investigated using surface-enhanced Raman spectroscopy (SERS). A zigzag-shaped PDMS microfluidic channel was fabricated for efficient mixing between MG analytes and aggregated silver colloids. Under the optimal condition of flow velocity, MG molecules were effectively adsorbed onto silver nanoparticles while flowing along the upper and lower zigzag-shaped PDMS channel. A quantitative analysis of MG was performed based on the measured peak height at 1615 cm^-1 in its SERS spectrum. The limit of detection, using the SERS microfluidic sensor, was found to be below the 1-2 ppb level, and this low detection limit is comparable to the result of the LC-MS detection method. In the present study, we introduce a new conceptual detection technology, using a SERS microfluidic sensor, for the highly sensitive trace analysis of MG in water.
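    The quantitative step described above amounts to a calibration curve: peak height at 1615 cm^-1 against known MG concentrations, fit by least squares and inverted for unknowns. The intensities below are synthetic, not measured values:

```python
import numpy as np

# Calibration-curve sketch: SERS peak height at 1615 cm^-1 vs known MG
# concentration, fit by least squares. Intensities are synthetic.
conc_ppb = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
peak_height = np.array([120.0, 230.0, 560.0, 1150.0, 2240.0, 5600.0])

slope, intercept = np.polyfit(conc_ppb, peak_height, 1)   # linear fit

def estimate_ppb(height):
    """Invert the calibration line for an unknown sample's peak height."""
    return (height - intercept) / slope

print(f"estimated concentration: {estimate_ppb(850.0):.1f} ppb")
```

    The detection limit quoted in the abstract is, in this picture, the concentration at which the peak height can no longer be distinguished from the blank's noise.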

  16. Shape reanalysis and sensitivities utilizing preconditioned iterative boundary solvers

    NASA Technical Reports Server (NTRS)

    Guru Prasad, K.; Kane, J. H.

    1992-01-01

    The computational advantages associated with the utilization of preconditioned iterative equation solvers are quantified for the reanalysis of perturbed shapes using continuum structural boundary element analysis (BEA). Both single- and multi-zone three-dimensional problems are examined. Significant reductions in computer time are obtained by making use of previously computed solution vectors and preconditioners in subsequent analyses. The effectiveness of this technique is demonstrated for the computation of shape response sensitivities required in shape optimization. Computer times and accuracies achieved using the preconditioned iterative solvers are compared with those obtained via direct solvers and implicit differentiation of the boundary integral equations. It is concluded that this approach, employing preconditioned iterative equation solvers in reanalysis and sensitivity analysis, can be competitive with, if not superior to, those involving direct solvers.
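    The reuse idea can be sketched with a warm-started, Jacobi-preconditioned conjugate gradient solve: after solving the baseline system, the perturbed-shape system is solved starting from the previous solution, cutting the iteration count. The synthetic matrices below are SPD stand-ins; actual BEA matrices are dense and nonsymmetric, so the paper's solvers differ, but the warm-start effect is the same in kind:

```python
import numpy as np

# Warm-started, Jacobi-preconditioned conjugate gradient reanalysis.
# Synthetic SPD matrices stand in for BEA system matrices; only the
# warm-start effect is illustrated.
def pcg(A, b, x0, tol=1e-10, max_iter=500):
    """Preconditioned CG; returns (solution, iterations used)."""
    m_inv = 1.0 / np.diag(A)              # Jacobi preconditioner
    x = x0.copy()
    r = b - A @ x
    z = m_inv * r
    p = z.copy()
    for k in range(max_iter):
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, k
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        z_new = m_inv * r_new
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x, max_iter

rng = np.random.default_rng(3)
Q = rng.standard_normal((200, 200))
A = Q @ Q.T + 200.0 * np.eye(200)         # well-conditioned SPD test matrix
b = rng.standard_normal(200)

u, cold_iters = pcg(A, b, np.zeros(200))  # baseline analysis from scratch
A_pert = A + 0.01 * np.eye(200)           # slightly perturbed-shape system
_, warm_iters = pcg(A_pert, b, u)         # reuse previous solution vector
print(cold_iters, warm_iters)
```

    Reusing the preconditioner built for the baseline system buys a further saving on top of the warm start, since for a small shape perturbation it remains a good approximation to the perturbed matrix.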

  17. Fundamental study of flow field generated by rotorcraft blades using wide-field shadowgraph

    NASA Technical Reports Server (NTRS)

    Parthasarathy, S. P.; Cho, Y. I.; Back, L. H.

    1985-01-01

    The vortex trajectory and vortex wake generated by helicopter rotors are visualized using a wide-field shadowgraph technique. Use of a retro-reflective Scotchlite screen makes it possible to investigate the flow field generated by full-scale rotors. Tip vortex trajectories are visible in shadowgraphs for tip Mach numbers ranging from 0.38 to 0.60. The effect of the angle of attack is substantial: at angles of attack greater than 8 degrees, the visibility of the vortex core is significant even at relatively low tip Mach numbers. A theoretical analysis of the sensitivity is carried out for a rotating blade. This analysis demonstrates that the sensitivity decreases with increasing dimensionless core radius and increases with increasing tip Mach number. The threshold value of the sensitivity is found to be 0.0015, below which the vortex core is not visible and above which it is visible. The effect of the optical path length is also discussed. Based on this investigation, it is concluded that the application of this wide-field shadowgraph technique to a large wind tunnel test should be feasible. In addition, two simultaneous shadowgraph views would allow three-dimensional reconstruction of vortex trajectories.

  18. Ultra-high resolution, polarization sensitive transversal optical coherence tomography for structural analysis and strain mapping

    NASA Astrophysics Data System (ADS)

    Wiesauer, Karin; Pircher, Michael; Goetzinger, Erich; Hitzenberger, Christoph K.; Engelke, Rainer; Ahrens, Gisela; Pfeiffer, Karl; Ostrzinski, Ute; Gruetzner, Gabi; Oster, Reinhold; Stifter, David

    2006-02-01

    Optical coherence tomography (OCT) is a contactless and non-invasive technique applied almost exclusively to bio-medical imaging of tissues. In addition to the internal structure, strains within the sample can be mapped when OCT is performed in a polarization sensitive (PS) way. In this work, we demonstrate the benefits of PS-OCT imaging for non-biological applications. We have developed the OCT technique beyond the state of the art: based on transversal ultra-high resolution (UHR-)OCT, where an axial resolution below 2 μm within materials is obtained using a femtosecond laser as light source, we have modified the setup for polarization sensitive measurements (transversal UHR-PS-OCT). We perform structural analysis and strain mapping for different types of samples: for a highly strained elastomer specimen we demonstrate the necessity of UHR imaging. Furthermore, we investigate epoxy waveguide structures, photoresist moulds for the fabrication of micro-electromechanical systems (MEMS), and the glass-fibre composite outer shell of helicopter rotor blades in which cracks are present. For these examples, transversal scanning UHR-PS-OCT is shown to provide important information about the structural properties and the strain distribution within the samples.

  19. Analysis of DNA methylation in Arabidopsis thaliana based on methylation-sensitive AFLP markers.

    PubMed

    Cervera, M T; Ruiz-García, L; Martínez-Zapater, J M

    2002-12-01

    AFLP analysis using restriction enzyme isoschizomers that differ in their sensitivity to methylation of their recognition sites has been used to analyse the methylation state of anonymous CCGG sequences in Arabidopsis thaliana. The technique was modified to improve the quality of fingerprints and to visualise larger numbers of scorable fragments. Sequencing of amplified fragments indicated that detection was generally associated with non-methylation of the cytosine to which the isoschizomer is sensitive. Comparison of EcoRI/HpaII and EcoRI/MspI patterns in different ecotypes revealed that 35-43% of CCGG sites were differentially digested by the isoschizomers. Interestingly, the pattern of digestion among different plants belonging to the same ecotype is highly conserved, with the rate of intra-ecotype methylation-sensitive polymorphisms being less than 1%. However, pairwise comparisons of methylation patterns between samples belonging to different ecotypes revealed differences in up to 34% of the methylation-sensitive polymorphisms. The lack of correlation between inter-ecotype similarity matrices based on methylation-insensitive or methylation-sensitive polymorphisms suggests that whatever the mechanisms regulating methylation may be, they are not related to nucleotide sequence variation.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Haitao, E-mail: liaoht@cae.ac.cn

    The direct differentiation and improved least squares shadowing methods are both developed for accurately and efficiently calculating the sensitivity coefficients of time averaged quantities for chaotic dynamical systems. The key idea is to recast the time averaged integration term in the form of a differential equation before applying the sensitivity analysis method. An additional constraint-based equation, which forms the augmented equations of motion, is proposed to calculate the time averaged integration variable, and the sensitivity coefficients are obtained by solving the augmented differential equations. The application of the least squares shadowing formulation to the augmented equations results in an explicit expression for the sensitivity coefficient which depends on the final state of the Lagrange multipliers. The LU factorization technique used to calculate the Lagrange multipliers leads to better convergence behaviour and lower computational expense. Numerical experiments on a set of problems selected from the literature are presented to illustrate the developed methods. The numerical results demonstrate the correctness and effectiveness of the present approaches, and some short impulsive sensitivity coefficients are observed when using the direct differentiation sensitivity analysis method.
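    The core idea of recasting a time-averaged quantity as an augmented differential equation and differentiating it directly can be illustrated on a simple non-chaotic ODE, where the sensitivity is known analytically. This toy sketch is not the paper's least-squares-shadowing formulation; it only shows the tangent (direct differentiation) machinery on an illustrative model.

```python
# Sketch of direct (tangent) differentiation for the sensitivity of a
# time-averaged quantity, applied to a simple non-chaotic ODE so the
# answer can be checked analytically:
#   dx/dt = -s*x + 1,  J = (1/T) * integral of x dt
# For large T, J -> 1/s and dJ/ds -> -1/s**2.

def time_avg_and_sensitivity(s, x0=0.0, T=100.0, dt=1e-3):
    x, v = x0, 0.0        # v = dx/ds, the tangent (sensitivity) variable
    Jx, Jv = 0.0, 0.0     # running integrals of x and v (augmented state)
    for _ in range(int(T / dt)):
        x_dot = -s * x + 1.0
        v_dot = -s * v - x       # d/ds of the right-hand side
        Jx += x * dt             # integrate the time-averaged quantity
        Jv += v * dt             # ... and its sensitivity
        x += x_dot * dt          # forward Euler step
        v += v_dot * dt
    return Jx / T, Jv / T        # time average and its sensitivity dJ/ds

J, dJds = time_avg_and_sensitivity(s=2.0)
print(J, dJds)   # close to 0.5 and -0.25 for s = 2
```

For a chaotic system this naive tangent approach diverges, which is exactly why shadowing-type reformulations such as the one in this record are needed.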

  1. Prostate lesion detection and localization based on locality alignment discriminant analysis

    NASA Astrophysics Data System (ADS)

    Lin, Mingquan; Chen, Weifu; Zhao, Mingbo; Gibson, Eli; Bastian-Jordan, Matthew; Cool, Derek W.; Kassam, Zahra; Chow, Tommy W. S.; Ward, Aaron; Chiu, Bernard

    2017-03-01

    Prostatic adenocarcinoma is one of the most commonly occurring cancers among men in the world, and it is also among the most curable cancers when detected early. Multiparametric MRI (mpMRI) combines anatomic and functional prostate imaging techniques, which have been shown to produce high sensitivity and specificity in cancer localization, which is important in planning biopsies and focal therapies. However, in previous investigations, lesion localization was achieved mainly by manual segmentation, which is time-consuming and prone to observer variability. Here, we developed an algorithm based on the locality alignment discriminant analysis (LADA) technique, which can be considered a version of linear discriminant analysis (LDA) localized to patches in the feature space. The sensitivity, specificity and accuracy generated by the proposed LADA algorithm in five prostates were 52.2%, 89.1% and 85.1%, respectively, compared to 31.3%, 85.3% and 80.9% generated by LDA. The delineation accuracy attainable by this tool has the potential to increase the cancer detection rate in biopsies and to minimize collateral damage to surrounding tissues in focal therapies.
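    The reported figures are standard confusion-matrix quantities. A minimal sketch of how they are computed from binary lesion labels (the arrays below are illustrative, not the study's data):

```python
import numpy as np

# Toy binary labels: 1 = lesion, 0 = background
truth = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
pred  = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0])

tp = np.sum((pred == 1) & (truth == 1))   # true positives
tn = np.sum((pred == 0) & (truth == 0))   # true negatives
fp = np.sum((pred == 1) & (truth == 0))   # false positives
fn = np.sum((pred == 0) & (truth == 1))   # false negatives

sensitivity = tp / (tp + fn)     # fraction of true lesions detected
specificity = tn / (tn + fp)     # fraction of background kept clean
accuracy = (tp + tn) / len(truth)
print(sensitivity, specificity, accuracy)   # 0.75, ~0.833, 0.8
```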

  2. Simulation studies of wide and medium field of view earth radiation data analysis

    NASA Technical Reports Server (NTRS)

    Green, R. N.

    1978-01-01

    A parameter estimation technique is presented to estimate the radiative flux distribution over the earth from radiometer measurements at satellite altitude. The technique analyzes measurements from a wide field of view (WFOV), horizon to horizon, nadir pointing sensor with a mathematical technique to derive the radiative flux estimates at the top of the atmosphere for resolution elements smaller than the sensor field of view. A computer simulation of the data analysis technique is presented for both earth-emitted and reflected radiation. Zonal resolutions are considered as well as the global integration of plane flux. An estimate of the equator-to-pole gradient is obtained from the zonal estimates. Sensitivity studies of the derived flux distribution to directional model errors are also presented. In addition to the WFOV results, medium field of view results are presented.
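    The estimation idea, recovering fluxes for resolution elements smaller than the sensor footprint from many overlapping measurements, reduces to a linear least-squares problem m = A f. The footprint weights, flux values and noise level below are toy assumptions, not the study's measurement geometry.

```python
import numpy as np

# Toy 1-D analogue: each wide-FOV measurement is a weighted average of
# fluxes from several small resolution elements, plus sensor noise.
rng = np.random.default_rng(3)
n_elem, n_meas = 6, 40
f_true = rng.uniform(150.0, 350.0, n_elem)      # element fluxes, W/m^2

# Overlapping Gaussian footprints centred at random sub-satellite points
centers = rng.uniform(0, n_elem - 1, n_meas)
idx = np.arange(n_elem)
A = np.exp(-0.5 * ((idx[None, :] - centers[:, None]) / 0.8) ** 2)
A /= A.sum(axis=1, keepdims=True)               # rows are weightings

m = A @ f_true + rng.normal(0.0, 0.5, n_meas)   # noisy measurements
f_est, *_ = np.linalg.lstsq(A, m, rcond=None)   # least-squares estimate

err = np.max(np.abs(f_est - f_true))
print(err)   # recovery error, W/m^2 (small relative to the 150-350 range)
```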

  3. One way Doppler extractor. Volume 1: Vernier technique

    NASA Technical Reports Server (NTRS)

    Blasco, R. W.; Klein, S.; Nossen, E. J.; Starner, E. R.; Yanosov, J. A.

    1974-01-01

    A feasibility analysis, trade-offs, and implementation for a One Way Doppler Extraction system are discussed. A Doppler error analysis shows that quantization error is a primary source of Doppler measurement error. Several competing extraction techniques are compared and a Vernier technique is developed which obtains high Doppler resolution with low speed logic. Parameter trade-offs and sensitivities for the Vernier technique are analyzed, leading to a hardware design configuration. A detailed design, operation, and performance evaluation of the resulting breadboard model is presented which verifies the theoretical performance predictions. Performance tests have verified that the breadboard is capable of extracting Doppler, on an S-band signal, to an accuracy of less than 0.02 Hertz for a one second averaging period. This corresponds to a range rate error of no more than 3 millimeters per second.
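    The closing numbers can be checked directly: a Doppler accuracy of 0.02 Hz on an S-band carrier corresponds to a range rate of v = c·Δf/f₀. The 2.1 GHz carrier frequency below is an assumption, since the exact S-band frequency is not given in the abstract.

```python
# Quick consistency check of the abstract's numbers: 0.02 Hz of one-way
# Doppler on an assumed 2.1 GHz S-band carrier.
c = 2.998e8          # speed of light, m/s
f0 = 2.1e9           # assumed S-band carrier frequency, Hz
df = 0.02            # Doppler measurement accuracy, Hz

v = c * df / f0      # corresponding range-rate error, m/s
print(f"{v * 1000:.2f} mm/s")   # a few mm/s, consistent with the stated 3 mm/s bound
```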

  4. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. 
Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.

  5. The use of laser-induced fluorescence or ultraviolet detectors for sensitive and selective analysis of tobramycin or erythropoietin in complex samples

    NASA Astrophysics Data System (ADS)

    Ahmed, Hytham M.; Ebeid, Wael B.

    2015-05-01

    Complex sample analysis is a challenge in pharmaceutical and biopharmaceutical analysis. In this work, tobramycin (TOB) analysis in human urine samples and recombinant human erythropoietin (rhEPO) analysis in the presence of a similar protein were selected as representative examples of such analyses. Assays of TOB in urine samples are difficult because of poor detectability. Therefore, a laser-induced fluorescence (LIF) detector was combined with a separation technique, micellar electrokinetic chromatography (MEKC), to determine TOB through derivatization with fluorescein isothiocyanate (FITC). Borate was used as the background electrolyte (BGE) with negatively charged mixed micelles as an additive. The method was successfully applied to urine samples. The LOD and LOQ for tobramycin in urine were 90 and 200 ng/ml, respectively, and recovery was >98% (n = 5). All urine samples were analyzed by direct injection without sample pre-treatment. In another use of a hyphenated analytical technique, capillary zone electrophoresis (CZE) coupled to an ultraviolet (UV) detector was used for sensitive analysis of rhEPO at low levels (2000 IU) in the presence of a large amount of human serum albumin (HSA). Analysis of rhEPO was achieved by the use of electrokinetic injection (EI) with discontinuous buffers. Phosphate buffer was used as the BGE with metal ions as an additive. The proposed method can be used for the estimation of a large number of quality control rhEPO samples in a short period.

  6. Sensitivity analysis of dynamic biological systems with time-delays.

    PubMed

    Wu, Wu Hsiung; Wang, Feng Sheng; Chang, Maw Shang

    2010-10-15

    Mathematical modeling has long been applied to the study and analysis of complex biological systems. Some processes in biological systems, such as gene expression and feedback control in signal transduction networks, involve a time delay. These systems are represented as delay differential equation (DDE) models. Numerical sensitivity analysis of a DDE model by the direct method requires the solutions of the model and sensitivity equations with time-delays. The major effort is the computation of the Jacobian matrix when computing the solution of the sensitivity equations. The computation of partial derivatives of complex equations, whether analytically or by symbolic manipulation, is time consuming, inconvenient, and prone to human error. To address this problem, an automatic approach to obtain the derivatives of complex functions efficiently and accurately is necessary. We previously proposed an efficient algorithm with adaptive step size control to compute the solution and dynamic sensitivities of biological systems described by ordinary differential equations (ODEs). Here, the adaptive direct-decoupled algorithm is extended to compute the solution and dynamic sensitivities of time-delay systems described by DDEs. To save human effort and avoid errors in the computation of partial derivatives, an automatic differentiation technique is embedded in the extended algorithm to evaluate the Jacobian matrix. The extended algorithm is implemented and applied to two realistic models with time-delays: the cardiovascular control system and the TNF-α signal transduction network. The results show that the extended algorithm is a good tool for dynamic sensitivity analysis of DDE models with little user intervention. A theoretical comparison with direct-coupled methods shows that the extended algorithm is efficient, accurate, and easy to use for end users without a programming background to perform dynamic sensitivity analysis on complex biological systems with time-delays.
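    The automatic differentiation step, evaluating the Jacobian without symbolic manipulation or hand-coded partials, can be illustrated with a minimal forward-mode (dual-number) sketch. This is not the authors' implementation, only the underlying idea:

```python
# Minimal forward-mode automatic differentiation via dual numbers:
# each variable carries a value and a derivative, and arithmetic
# propagates both exactly (no finite-difference truncation error).

class Dual:
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.eps + o.eps)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule for the derivative part
        return Dual(self.val * o.val, self.val * o.eps + self.eps * o.val)
    __rmul__ = __mul__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.eps - o.eps)
    def __rsub__(self, o):
        return Dual(o) - self   # o is a plain number here

def jacobian(f, x):
    """Jacobian of f: R^n -> R^m via n forward-mode passes."""
    n = len(x)
    cols = []
    for j in range(n):
        # seed the j-th input with derivative 1, all others with 0
        duals = [Dual(x[k], 1.0 if k == j else 0.0) for k in range(n)]
        cols.append([y.eps for y in f(duals)])
    return [list(row) for row in zip(*cols)]   # columns -> rows

# Example right-hand side f(x) = [x0*x1, x0 - 3*x1]
f = lambda x: [x[0] * x[1], x[0] - 3 * x[1]]
J = jacobian(f, [2.0, 5.0])
print(J)   # [[5.0, 2.0], [1.0, -3.0]]
```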

  7. Recent approaches for enhancing sensitivity in enantioseparations by CE.

    PubMed

    Sánchez-Hernández, Laura; García-Ruiz, Carmen; Luisa Marina, María; Luis Crego, Antonio

    2010-01-01

    This article reviews the latest methodological and instrumental improvements for enhancing sensitivity in chiral analysis by CE. The review covers literature from March 2007 until May 2009, that is, the works published after the appearance of the latest review article on the same topic by Sánchez-Hernández et al. [Electrophoresis 2008, 29, 237-251]. Off-line and on-line sample treatment techniques, on-line sample preconcentration strategies based on electrophoretic and chromatographic principles, and alternative detection systems to the widely employed UV/Vis detection in CE are the most relevant approaches discussed for improving sensitivity. Microchip technologies are also included since they can open up great possibilities to achieve sensitive and fast enantiomeric separations.

  8. Denaturing high-performance liquid chromatography for mutation detection and genotyping.

    PubMed

    Fackenthal, Donna Lee; Chen, Pei Xian; Howe, Ted; Das, Soma

    2013-01-01

    Denaturing high-performance liquid chromatography (DHPLC) is an accurate and efficient screening technique used for detecting DNA sequence changes by heteroduplex analysis. It can also be used for genotyping of single nucleotide polymorphisms (SNPs). The high sensitivity of DHPLC has made this technique one of the most reliable approaches to mutation analysis and, therefore, used in various areas of genetics, both in the research and clinical arena. This chapter describes the methods used for mutation detection analysis and the genotyping of SNPs by DHPLC on the WAVE™ system from Transgenomic Inc. ("WAVE" and "DNASep" are registered trademarks, and "Navigator" is a trademark, of Transgenomic, used with permission. All other trademarks are property of the respective owners).

  9. Advances in Mid-Infrared Spectroscopy for Chemical Analysis

    NASA Astrophysics Data System (ADS)

    Haas, Julian; Mizaikoff, Boris

    2016-06-01

    Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.

  10. Chromatographic Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved through the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  11. A sensitivity analysis of process design parameters, commodity prices and robustness on the economics of odour abatement technologies.

    PubMed

    Estrada, José M; Kraakman, N J R Bart; Lebrero, Raquel; Muñoz, Raúl

    2012-01-01

    The sensitivity of the economics of the five most commonly applied odour abatement technologies (biofiltration, biotrickling filtration, activated carbon adsorption, chemical scrubbing and a hybrid technology consisting of a biotrickling filter coupled with carbon adsorption) towards design parameters and commodity prices was evaluated. In addition, the influence of the geographical location on the Net Present Value calculated over a 20-year lifespan (NPV20) of each technology and its robustness towards typical process fluctuations and operational upsets were also assessed. This comparative analysis showed that biological techniques present lower operating costs (up to 6 times) and lower sensitivity than their physical/chemical counterparts, with the packing material being the key parameter affecting their operating costs (40-50% of the total operating costs). The use of recycled or partially treated water (e.g. secondary effluent in wastewater treatment plants) offers an opportunity to significantly reduce costs in biological techniques. Physical/chemical technologies present a high sensitivity towards H2S concentration, which is an important drawback due to the fluctuating nature of malodorous emissions. The geographical analysis evidenced high NPV20 variations around the world for all the technologies evaluated, but despite the differences in wage and price levels, biofiltration and biotrickling filtration are always the most cost-efficient alternatives (NPV20). When robustness is weighted as heavily as overall cost (NPV20) in an economic evaluation, the hybrid technology moves up next to biotrickling filtration as the most preferred technology. Copyright © 2012 Elsevier Inc. All rights reserved.
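    The NPV20 figure of merit is a standard discounted-cost calculation. A minimal sketch, with purely illustrative investment and operating costs and an assumed 5% discount rate (the paper's actual cash flows and rate are not given in the abstract):

```python
# Net present value of costs over a 20-year lifespan (NPV20-style
# figure of merit). All cash-flow numbers below are illustrative.

def npv(investment, annual_cost, rate=0.05, years=20):
    """Up-front investment plus discounted yearly operating costs."""
    return investment + sum(annual_cost / (1 + rate) ** t
                            for t in range(1, years + 1))

# Toy comparison: high capital / low operating cost vs. the reverse
biofilter = npv(investment=100_000, annual_cost=10_000)
scrubber  = npv(investment=60_000,  annual_cost=25_000)
print(round(biofilter), round(scrubber))
```

With these toy numbers the low-operating-cost option wins over the 20-year horizon, mirroring the abstract's finding that biological techniques are the most cost-efficient.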

  12. Electrochemical Quartz Crystal Nanobalance (EQCN) Based Biosensor for Sensitive Detection of Antibiotic Residues in Milk.

    PubMed

    Bhand, Sunil; Mishra, Geetesh K

    2017-01-01

    An electrochemical quartz crystal nanobalance (EQCN), which provides real-time analysis of dynamic surface events, is a valuable tool for analyzing biomolecular interactions. EQCN biosensors are based on mass-sensitive measurements that can detect small mass changes caused by chemical binding to small piezoelectric crystals. Among the various biosensors, the piezoelectric biosensor is considered one of the most sensitive analytical techniques, capable of detecting antigens at picogram levels. EQCN is an effective technique for monitoring antibiotics below the maximum residue limit (MRL). The analysis of antibiotic residues requires high sensitivity, rapidity, reliability and cost effectiveness. For analytical purposes the general approach is to take advantage of the piezoelectric effect by immobilizing a biosensing layer on top of the piezoelectric crystal. The sensing layer usually comprises a biological material such as an antibody, enzyme, or aptamer having high specificity and selectivity for the target molecule to be detected. The biosensing layer is usually functionalized using surface chemistry modifications. When these bio-functionalized quartz crystals are exposed to a particular substance of interest (e.g., a substrate, inhibitor, antigen or protein), a binding interaction occurs. This causes a frequency or mass change that can be used to determine the amount of material interacted or bound. EQCN biosensors can easily be automated by using a flow injection analysis (FIA) setup coupled through automated pumps and injection valves. Such FIA-EQCN biosensors have great potential for the detection of different analytes, such as antibiotic residues, in various matrices such as water, wastewater, and milk.
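    The mass-sensitive measurement underlying a QCM/EQCN is commonly modeled by the Sauerbrey relation, which converts a resonance-frequency shift into adsorbed mass per unit area. A sketch, assuming a 5 MHz AT-cut quartz crystal (the abstract does not specify the crystal used):

```python
import math

# Sauerbrey relation for a quartz crystal microbalance:
#   delta_f = -2 * f0^2 * delta_m / (A * sqrt(rho_q * mu_q))
# rearranged here to give mass per unit area from a frequency drop.
f0 = 5.0e6        # fundamental resonance frequency, Hz (assumed)
rho_q = 2.648     # density of quartz, g/cm^3
mu_q = 2.947e11   # shear modulus of AT-cut quartz, g/(cm*s^2)

# mass sensitivity constant, Hz per (g/cm^2)
C_f = 2.0 * f0**2 / math.sqrt(rho_q * mu_q)

def mass_per_area(delta_f):
    """Adsorbed mass per unit area (g/cm^2) from a frequency shift (Hz)."""
    return -delta_f / C_f

# a 10 Hz frequency drop corresponds to sub-microgram/cm^2 binding
dm = mass_per_area(delta_f=-10.0)
print(f"{dm * 1e6:.3f} ug/cm^2")
```

This linear relation holds for thin, rigid films; viscoelastic biolayers in liquid generally require corrections beyond the Sauerbrey model.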

  13. Nonlinear Acoustic and Ultrasonic NDT of Aeronautical Components

    NASA Astrophysics Data System (ADS)

    Van Den Abeele, Koen; Katkowski, Tomasz; Mattei, Christophe

    2006-05-01

    In response to the demand for innovative microdamage inspection systems, with high sensitivity and undoubted accuracy, we are currently investigating the use and robustness of several acoustic and ultrasonic NDT techniques based on Nonlinear Elastic Wave Spectroscopy (NEWS) for the characterization of microdamage in aeronautical components. In this report, we illustrate the results of an amplitude dependent analysis of the resonance behaviour, both in time (signal reverberation) and in frequency (sweep) domain. The technique is applied to intact and damaged samples of Carbon Fiber Reinforced Plastics (CFRP) composites after thermal loading or mechanical fatigue. The method shows a considerable gain in sensitivity and an incontestable interpretation of the results for nonlinear signatures in comparison with the linear characteristics. For highly fatigued samples, slow dynamical effects are observed.

  14. A technique for using radio jets as extended gravitational lensing probes

    NASA Technical Reports Server (NTRS)

    Kronberg, Philipp P.; Dyer, Charles C.; Burbidge, E. Margaret; Junkkarinen, Vesa T.

    1991-01-01

    A new and potentially powerful method of measuring the mass of a galaxy (or dark matter concentration) which lies close in position to a background polarized radio jet is proposed. Using the fact that the polarization angle is not changed by lensing, an 'alignment-breaking parameter' is defined which is a sensitive indicator of gravitational distortion. The method remains sensitive over a wide redshift range of the gravitational lens. This technique is applied to the analysis of polarimetric observations of the jet of 3C 9 at z = 2.012, combined with a newly discovered 20.3 mag foreground galaxy at z = 0.2538 to 'weigh' the galaxy and obtain an approximate upper limit to the mass-to-light ratio.

  15. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
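    The cost argument, sensitivities with respect to all parameters from a single extra solve, can be demonstrated on a tiny discrete analogue. The linear system below is illustrative, not a CFD model:

```python
import numpy as np

# Adjoint idea on a toy discrete problem: solve A(p) u = b with output
# J = g^T u. One adjoint solve A^T lam = g yields, for every parameter,
#   dJ/dp_i = -lam^T (dA/dp_i) u
# regardless of how many parameters there are.
rng = np.random.default_rng(0)
n, m = 5, 3                       # state size, number of design params
A0 = np.eye(n) * 4.0              # well-conditioned base operator
dA = [rng.standard_normal((n, n)) * 0.1 for _ in range(m)]
b = rng.standard_normal(n)
g = rng.standard_normal(n)
p = np.array([0.3, -0.2, 0.5])

def solve_state(p):
    A = A0 + sum(pi * dAi for pi, dAi in zip(p, dA))
    return A, np.linalg.solve(A, b)

A, u = solve_state(p)
lam = np.linalg.solve(A.T, g)                    # the single adjoint solve
grad_adj = np.array([-lam @ dAi @ u for dAi in dA])

# finite-difference check (needs m extra solves; adjoint needed just one)
eps = 1e-6
grad_fd = np.empty(m)
for i in range(m):
    dp = p.copy(); dp[i] += eps
    grad_fd[i] = (g @ solve_state(dp)[1] - g @ u) / eps

print(np.allclose(grad_adj, grad_fd, atol=1e-4))   # True
```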

  16. Sensitivity analysis of discrete structural systems: A survey

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.

    1984-01-01

    Methods for calculating sensitivity derivatives for discrete structural systems are surveyed, primarily covering literature published during the past two decades. Methods are described for calculating derivatives of static displacements and stresses, eigenvalues and eigenvectors, transient structural response, and derivatives of optimum structural designs with respect to problem parameters. The survey is focused on publications addressed to structural analysis, but also includes a number of methods developed in nonstructural fields such as electronics, controls, and physical chemistry which are directly applicable to structural problems. Most notable among the nonstructural-based methods are the adjoint variable technique from control theory, and the Green's function and FAST methods from physical chemistry.
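    One classic result in this area, the derivative of a structural eigenvalue with respect to a design parameter, has a compact closed form for symmetric systems: dλ/dp = φᵀ(∂K/∂p)φ for a unit-norm eigenvector φ. A minimal illustrative sketch (the 2-DOF spring-chain stiffness matrix below is a toy example, not from the survey):

```python
import numpy as np

# Eigenvalue sensitivity for a symmetric eigenproblem K(p) phi = lam*phi:
# with a unit-norm eigenvector, d(lam)/dp = phi^T (dK/dp) phi.
def K(p):
    # two springs in a chain; p is the stiffness of the grounded spring
    return np.array([[p + 2.0, -2.0],
                     [-2.0,     2.0]])

dK_dp = np.array([[1.0, 0.0],    # derivative of K with respect to p
                  [0.0, 0.0]])

p0 = 3.0
lam, phi = np.linalg.eigh(K(p0))          # eigh: symmetric solver
i = 0                                     # track the lowest eigenvalue
dlam = phi[:, i] @ dK_dp @ phi[:, i]      # analytic sensitivity

# finite-difference check
eps = 1e-6
lam_eps, _ = np.linalg.eigh(K(p0 + eps))
print(abs(dlam - (lam_eps[i] - lam[i]) / eps) < 1e-4)   # True
```

The mass-matrix generalization, dλ/dp = φᵀ(∂K/∂p − λ ∂M/∂p)φ for mass-normalized modes, follows the same pattern.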

  17. Non-volatile analysis in fruits by laser resonant ionization spectrometry: application to resveratrol (3,5,4'-trihydroxystilbene) in grapes

    NASA Astrophysics Data System (ADS)

    Montero, C.; Orea, J. M.; Soledad Muñoz, M.; Lobo, R. F. M.; González Ureña, A.

    A laser desorption (LD) coupled with resonance-enhanced multiphoton ionisation (REMPI) and time-of-flight mass spectrometry (TOFMS) technique for non-volatile trace analysis compounds is presented. Essential features are: (a) an enhanced desorption yield due to the mixing of metal powder with the analyte in the sample preparation, (b) a high resolution, great sensitivity and low detection limit due to laser resonant ionisation and mass spectrometry detection. Application to resveratrol content in grapes demonstrated the capability of the analytical method with a sensitivity of 0.2 pg per single laser shot and a detection limit of 5 ppb.

  18. Modified GMDH-NN algorithm and its application for global sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Song, Shufang; Wang, Lu

    2017-11-01

    Global sensitivity analysis (GSA) is a very useful tool for evaluating the influence of input variables over their whole distribution range. The Sobol' method is the most commonly used of the variance-based methods, which are efficient and popular GSA techniques. High dimensional model representation (HDMR) is a popular way to compute Sobol' indices; however, its drawbacks cannot be ignored. We show that a modified GMDH-NN algorithm can calculate the coefficients of the metamodel efficiently, and this paper therefore combines it with HDMR to propose the GMDH-HDMR method. The new method shows higher precision and a faster convergence rate. Several numerical and engineering examples are used to confirm its advantages.
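    The quantity such metamodels are built to deliver cheaply, the first-order Sobol' index, can also be estimated by direct sampling. A sketch using the standard Saltelli pick-freeze scheme on the Ishigami function, a common GSA benchmark (not necessarily one of the paper's examples):

```python
import numpy as np

# Sampling-based first-order Sobol' indices via the Saltelli
# pick-freeze estimator: S_i = E[fB * (f(AB_i) - fA)] / Var(f).
def ishigami(x, a=7.0, b=0.1):
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

rng = np.random.default_rng(1)
N, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (N, d))   # two independent sample blocks
B = rng.uniform(-np.pi, np.pi, (N, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))   # total output variance

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                  # freeze all inputs except i
    S.append(np.mean(fB * (ishigami(ABi) - fA)) / var)

print(np.round(S, 2))   # analytic values are about [0.31, 0.44, 0.00]
```

An HDMR-type metamodel trades these many model evaluations for a one-off fit, which is the efficiency argument the abstract makes.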

  19. Real-time in vivo diagnosis of laryngeal carcinoma with rapid fiber-optic Raman spectroscopy

    PubMed Central

    Lin, Kan; Zheng, Wei; Lim, Chwee Ming; Huang, Zhiwei

    2016-01-01

    We assess the clinical utility of a unique simultaneous fingerprint (FP) (i.e., 800-1800 cm−1) and high-wavenumber (HW) (i.e., 2800-3600 cm−1) fiber-optic Raman spectroscopy for in vivo diagnosis of laryngeal cancer at endoscopy. A total of 2124 high-quality in vivo FP/HW Raman spectra (normal = 1321; cancer = 581) were acquired from 101 tissue sites (normal = 71; cancer = 30) of 60 patients (normal = 44; cancer = 16) undergoing routine endoscopic examination. FP/HW Raman spectra differ significantly between normal and cancerous laryngeal tissue that could be attributed to changes of proteins, lipids, nucleic acids, and the bound water content in the larynx. Partial least squares-discriminant analysis and leave-one-tissue-site-out cross-validation were employed on the in vivo FP/HW tissue Raman spectra acquired, yielding a diagnostic accuracy of 91.1% (sensitivity: 93.3% (28/30); specificity: 90.1% (64/71)) for laryngeal cancer identification, which is superior to using either FP (accuracy: 86.1%; sensitivity: 86.7% (26/30); specificity: 85.9% (61/71)) or HW (accuracy: 84.2%; sensitivity: 76.7% (23/30); specificity: 87.3% (62/71)) Raman technique alone. Further receiver operating characteristic analysis reconfirms the best performance of the simultaneous FP/HW Raman technique for laryngeal cancer diagnosis. We demonstrate for the first time that the simultaneous FP/HW Raman spectroscopy technique can be used for improving real-time in vivo diagnosis of laryngeal carcinoma during endoscopic examination. PMID:27699131

  20. Development of MRM-based assays for the absolute quantitation of plasma proteins.

    PubMed

    Kuzyk, Michael A; Parker, Carol E; Domanski, Dominik; Borchers, Christoph H

    2013-01-01

    Multiple reaction monitoring (MRM), sometimes called selected reaction monitoring (SRM), is a directed tandem mass spectrometric technique performed on triple quadrupole mass spectrometers. MRM assays can be used to sensitively and specifically quantify proteins based on peptides that are specific to the target protein. Stable-isotope-labeled standard peptide analogues (SIS peptides) of target peptides are added to enzymatic digests of samples, and quantified along with the native peptides during MRM analysis. Monitoring of the intact peptide and a collision-induced fragment of this peptide (an ion pair) can be used to provide information on the absolute concentration of the peptide in the sample and, by inference, the concentration of the intact protein. This technique provides high specificity by selecting for biophysical parameters that are unique to the target peptides: (1) the molecular weight of the peptide, (2) the generation of a specific fragment from the peptide, and (3) the HPLC retention time during LC/MRM-MS analysis. MRM is a highly sensitive technique that has been shown to be capable of detecting attomole levels of target peptides in complex samples such as tryptic digests of human plasma. This chapter provides a detailed description of how to develop and use an MRM protein assay. It includes sections on the critical "first step" of selecting the target peptides, as well as optimization of MRM acquisition parameters for maximum sensitivity of the ion pairs that will be used in the final method, and characterization of the final MRM assay.
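
    An MRM ion pair is defined by a precursor m/z and a fragment m/z, both computable from the peptide sequence. A toy sketch of the underlying mass arithmetic, using a small table of standard monoisotopic residue masses (the helper names and the example peptide are ours, not from the chapter):

```python
# Monoisotopic residue masses (Da) for a handful of amino acids
RESIDUE = {'G': 57.02146, 'A': 71.03711, 'P': 97.05276, 'V': 99.06841,
           'T': 101.04768, 'I': 113.08406, 'L': 113.08406, 'D': 115.02694,
           'E': 129.04259, 'K': 128.09496, 'R': 156.10111}
WATER, PROTON = 18.010565, 1.007276

def precursor_mz(seq, z):
    """m/z of the intact (z-fold protonated) peptide."""
    return (sum(RESIDUE[a] for a in seq) + WATER + z * PROTON) / z

def y_ion_mz(seq, k, z=1):
    """m/z of the C-terminal y_k fragment ion."""
    return (sum(RESIDUE[a] for a in seq[-k:]) + WATER + z * PROTON) / z

# One MRM "transition" for the peptide PEPTIDE: doubly charged precursor -> y4 fragment
q1 = precursor_mz('PEPTIDE', 2)
q3 = y_ion_mz('PEPTIDE', 4)
```

    In a real assay the instrument's Q1 and Q3 quadrupoles are set to these two values, and several such transitions per peptide are monitored.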

  1. FDTD based model of ISOCT imaging for validation of nanoscale sensitivity (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Eid, Aya; Zhang, Di; Yi, Ji; Backman, Vadim

    2017-02-01

    Many of the earliest structural changes associated with neoplasia occur on the micro- and nanometer scale, and thus appear histologically normal. Our group has established Inverse Spectroscopic OCT (ISOCT), a spectrally based technique to extract nanoscale-sensitive metrics derived from the OCT signal. Thus, there is a need to model light transport through relatively large volumes (up to 50 μm^3) of media with nanoscale-level resolution. Finite Difference Time Domain (FDTD) is an iterative approach which directly solves Maxwell's equations to robustly estimate the electric and magnetic fields propagating through a sample. The sample's refractive index for every spatial voxel and wavelength is specified on a grid with voxel sizes on the order of λ/20, making it an ideal modelling technique for nanoscale structure analysis. Here, we utilize the FDTD technique to validate the nanoscale sensing ability of ISOCT. The use of FDTD for OCT modelling requires three components: calculating the source beam as it propagates through the optical system, computing the sample's scattered field using FDTD, and finally propagating the scattered field back through the optical system. The principles of Fourier optics are employed to focus the resulting field through a 4f optical system and onto the detector. Three-dimensional numerical samples are generated from a given refractive index correlation function with known parameters, and subsequent OCT images and mass density correlation function metrics are computed. We show that while the resolvability of the OCT image remains diffraction limited, spectral analysis allows nanoscale-sensitive metrics to be extracted.
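
    At its core, FDTD is a leapfrog update of the electric and magnetic fields on a staggered (Yee) grid. A minimal 1-D sketch in normalized units, with an invented soft Gaussian source; this is only the update-loop idea, nothing like the full 3-D vectorial solver the abstract describes:

```python
from math import exp

def fdtd_1d(nz=200, steps=200, c=0.5):
    """1-D Yee-grid FDTD update loop in normalized units (Courant factor c)."""
    ez = [0.0] * nz   # electric field at integer grid points
    hy = [0.0] * nz   # magnetic field, staggered half a cell
    for t in range(steps):
        for k in range(nz - 1):
            hy[k] += c * (ez[k + 1] - ez[k])
        ez[100] += exp(-((t - 30) / 10.0) ** 2)   # soft Gaussian source at cell 100
        for k in range(1, nz):
            ez[k] += c * (hy[k] - hy[k - 1])
    return ez

ez = fdtd_1d()
```

    With the Courant factor at 0.5 the scheme is stable, and the injected pulse propagates away from the source at half a cell per step.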

  2. Analysis of Information Content in High-Spectral Resolution Sounders using Subset Selection Analysis

    NASA Technical Reports Server (NTRS)

    Velez-Reyes, Miguel; Joiner, Joanna

    1998-01-01

    In this paper, we summarize the results of the sensitivity analysis and data reduction carried out to determine the information content of AIRS and IASI channels. The analysis and data reduction were based on the use of subset selection techniques developed in the linear algebra and statistical community to study linear dependencies in high dimensional data sets. We applied the subset selection method to study dependency among channels by studying the dependency among their weighting functions. Also, we applied the technique to study the information provided by the different levels in which the atmosphere is discretized for retrievals and analysis. Results from the method correlate well with intuition in many respects and point to possible modifications for band selection in sensor design and for the number and location of levels in the analysis process.
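
    Subset selection for near-dependent columns can be illustrated with a greedy pivoting scheme: repeatedly pick the column with the largest norm orthogonal to those already chosen. This is a simplified stand-in for the linear-algebra subset selection methods the paper references, assuming numpy is available:

```python
import numpy as np

def greedy_subset(A, m):
    """Pick m columns of A greedily, each maximizing the norm of its component
    orthogonal to the columns already selected (a simple pivoting scheme)."""
    R = np.array(A, dtype=float)
    picked = []
    for _ in range(m):
        j = int(np.argmax(np.linalg.norm(R, axis=0)))
        picked.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(q, q @ R)   # deflate: remove the chosen direction
    return picked

# Column 1 is twice column 0 (linearly dependent); column 2 is independent of both
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 2.0, 0.0]])
picked = greedy_subset(A, 2)
```

    The dependent column contributes nothing once its partner is chosen, which mirrors how redundant sounder channels would be skipped.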

  3. Sensitivity analysis of infectious disease models: methods, advances and their application

    PubMed Central

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol’ methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
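
    One of the surveyed techniques, LHS-PRCC, rests on the partial rank correlation coefficient: rank-transform inputs and output, regress out the ranks of the other parameters, and correlate the residuals. A compact sketch assuming numpy (the toy model is ours, not one of the paper's transmission models):

```python
import numpy as np

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with output y."""
    rank = lambda a: np.argsort(np.argsort(a)).astype(float)
    R = np.column_stack([rank(X[:, j]) for j in range(X.shape[1])])
    ry = rank(y)
    out = []
    for i in range(X.shape[1]):
        # regress out the ranks of all *other* parameters, then correlate residuals
        others = [j for j in range(X.shape[1]) if j != i]
        Z = np.column_stack([np.ones(len(y))] + [R[:, j] for j in others])
        rx_res = R[:, i] - Z @ np.linalg.lstsq(Z, R[:, i], rcond=None)[0]
        ry_res = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
        out.append(float(np.corrcoef(rx_res, ry_res)[0, 1]))
    return out

rng = np.random.default_rng(0)
X = rng.random((200, 2))
r = prcc(X, 2 * X[:, 0] + X[:, 1])   # deterministic monotone toy model
```

    For a deterministic monotone model both PRCC values come out close to one; with noisy epidemic output they would separate according to each parameter's influence.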

  4. A Bio Medical Waste Identification and Classification Algorithm Using Mltrp and Rvm.

    PubMed

    Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi

    2016-10-01

    We aimed to extract histogram features for texture analysis and to classify the types of Bio Medical Waste (BMW) for garbage disposal and management. The given BMW image was preprocessed using a median filtering technique that efficiently reduced the noise in the image. After that, the histogram features of the filtered image were extracted with the help of the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, a Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton and liquids. The BMW images were collected from a garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate and accuracy with the help of MATLAB. Compared to the existing techniques, the proposed techniques provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal. It can be used in many real-time applications such as hospital and healthcare management systems for proper BMW disposal.
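
    The median-filtering preprocessing step can be sketched in a few lines; a toy 3×3 filter on a 2-D intensity grid (borders are left untouched here, unlike a production implementation):

```python
def median_filter3(img):
    """3x3 median filter on a 2-D list of intensities; borders left unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = sorted(img[i + di][j + dj]
                            for di in (-1, 0, 1) for dj in (-1, 0, 1))
            out[i][j] = window[4]   # median of the 9 neighbours
    return out

# A single impulse-noise pixel is removed entirely by the median
noisy = [[0] * 5 for _ in range(5)]
noisy[2][2] = 255
clean = median_filter3(noisy)
```

    Median filtering suppresses impulse noise without blurring edges as strongly as a mean filter would, which is why it is a common first step before texture-feature extraction.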

  5. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  6. Capillary electrophoresis with laser-induced fluorescence detection: a sensitive method for monitoring extracellular concentrations of amino acids in the periaqueductal grey matter.

    PubMed

    Bergquist, J; Vona, M J; Stiller, C O; O'Connor, W T; Falkenberg, T; Ekman, R

    1996-03-01

    The use of capillary electrophoresis with laser-induced fluorescence detection (CE-LIF) for the analysis of microdialysate samples from the periaqueductal grey matter (PAG) of freely moving rats is described. By employing 3-(4-carboxybenzoyl)-2-quinoline-carboxaldehyde (CBQCA) as a derivatization agent, we simultaneously monitored the concentrations of 8 amino acids (arginine, glutamine, valine, gamma-amino-n-butyric acid (GABA), alanine, glycine, glutamate, and aspartate), with nanomolar and subnanomolar detection limits. Two of the amino acids (GABA and glutamate) were analysed in parallel by conventional high-performance liquid chromatography (HPLC) in order to directly compare the two analytical methods. Other CE methods for analysis of microdialysate have been previously described, and this improved method offers greater sensitivity, ease of use, and the possibility to monitor several amino acids simultaneously. By using this technique together with an optimised microdialysis protocol, the small sample consumption and the improved detection limits permit the detection of fast and transient transmitter changes.

  7. A Fast and Sensitive New Satellite SO2 Retrieval Algorithm based on Principal Component Analysis: Application to the Ozone Monitoring Instrument

    NASA Technical Reports Server (NTRS)

    Li, Can; Joiner, Joanna; Krotkov, A.; Bhartia, Pawan K.

    2013-01-01

    We describe a new algorithm to retrieve SO2 from satellite-measured hyperspectral radiances. We employ the principal component analysis technique in regions with no significant SO2 to capture radiance variability caused by both physical processes (e.g., Rayleigh and Raman scattering and ozone absorption) and measurement artifacts. We use the resulting principal components and SO2 Jacobians calculated with a radiative transfer model to directly estimate SO2 vertical column density in one step. Application to the Ozone Monitoring Instrument (OMI) radiance spectra in 310.5-340 nm demonstrates that this approach can greatly reduce biases in the operational OMI product and decrease the noise by a factor of 2, providing greater sensitivity to anthropogenic emissions. The new algorithm is fast, eliminates the need for instrument-specific radiance correction schemes, and can be easily adapted to other sensors. These attributes make it a promising technique for producing longterm, consistent SO2 records for air quality and climate research.
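
    The one-step idea, fitting principal components of SO2-free radiances jointly with an SO2 Jacobian, can be sketched on synthetic data. Everything below (the grid, the noise level, and especially the Jacobian shape) is invented for illustration and assumes numpy; it is not the operational OMI algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(310.5, 340.0, 60)                   # wavelength grid (nm)
base = np.exp(-0.01 * (wl - 320.0) ** 2)             # smooth background radiance shape
bg = base + 0.02 * rng.standard_normal((500, 60))    # SO2-free "training" radiances
jac = 0.05 * np.sin(0.8 * wl)                        # stand-in SO2 Jacobian (invented)

# Principal components of the SO2-free radiances capture background variability
mean = bg.mean(axis=0)
_, _, Vt = np.linalg.svd(bg - mean, full_matrices=False)
pcs = Vt[:5]                                          # leading 5 PCs

# Synthetic measurement: background plus a column amount of 3.0 (arbitrary units)
y = base + 3.0 * jac + 0.02 * rng.standard_normal(60)

# One-step estimate: fit PCs and the Jacobian jointly, read off the SO2 coefficient
A = np.column_stack([pcs.T, jac])
coef, *_ = np.linalg.lstsq(A, y - mean, rcond=None)
so2 = coef[-1]
```

    Because the PCs absorb the background variability, the least-squares coefficient on the Jacobian recovers the injected column amount in a single step.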

  8. Assessment of insulin sensitivity by the hyperinsulinemic euglycemic clamp: Comparison with the spectral analysis of photoplethysmography.

    PubMed

    De Souza, Aglecio Luiz; Batista, Gisele Almeida; Alegre, Sarah Monte

    2017-01-01

    We compare spectral analysis of photoplethysmography (PTG) with insulin resistance (IR) measured by the hyperinsulinemic euglycemic clamp (HEC) technique. A total of 100 nondiabetic subjects, 43 men and 57 women aged 20-63 years, 30 lean, 42 overweight and 28 obese, were enrolled in the study. These patients underwent an examination with HEC, and an examination with PTG spectral analysis and calculation of the PTG Total Power (PTG-TP). Receiver-operating characteristic (ROC) curves were constructed to determine the specificity and sensitivity of PTG-TP in the assessment of insulin resistance. There is a moderate correlation between insulin sensitivity (M-value) and PTG-TP (r = -0.64, p < 0.0001). The ROC curves showed that the most relevant cutoff for the whole study group was PTG-TP > 406.2. This cutoff had a sensitivity of 95.7%, a specificity of 84.4%, and an area under the ROC curve (AUC) of 0.929 for identifying insulin resistance. All ROC AUC analyses were significant (p < 0.0001). The PTG-TP marker measured from PTG spectral analysis is a useful tool in the screening and follow-up of IR, especially in large-scale studies. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
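
    Cutoff selection from a ROC curve is often done by maximizing Youden's J = sensitivity + specificity − 1. The paper does not state its cutoff criterion, so Youden's index is an assumption here, and the data below are a toy example:

```python
def best_cutoff(values, labels):
    """Scan candidate cutoffs and return (cutoff, sensitivity, specificity)
    maximizing Youden's J = sensitivity + specificity - 1."""
    best = None
    for c in sorted(set(values)):
        tp = sum(1 for v, l in zip(values, labels) if v > c and l)
        fn = sum(1 for v, l in zip(values, labels) if v <= c and l)
        tn = sum(1 for v, l in zip(values, labels) if v <= c and not l)
        fp = sum(1 for v, l in zip(values, labels) if v > c and not l)
        sens, spec = tp / (tp + fn), tn / (tn + fp)
        if best is None or sens + spec - 1 > best[3]:
            best = (c, sens, spec, sens + spec - 1)
    return best[:3]

# Toy marker values for 4 insulin-sensitive (0) and 4 insulin-resistant (1) subjects
cutoff, sens, spec = best_cutoff([1, 2, 3, 4, 5, 6, 7, 8], [0, 0, 0, 0, 1, 1, 1, 1])
```

    On this perfectly separable toy data the scan finds the cutoff between the two groups with sensitivity and specificity both equal to one.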

  9. Recent trends in atomic fluorescence spectrometry towards miniaturized instrumentation-A review.

    PubMed

    Zou, Zhirong; Deng, Yujia; Hu, Jing; Jiang, Xiaoming; Hou, Xiandeng

    2018-08-17

    Atomic fluorescence spectrometry (AFS), as one of the common atomic spectrometric techniques with high sensitivity, simple instrumentation, and low acquisition and running cost, has been widely used in various fields for trace elemental analysis, notably the determination of hydride-forming elements by hydride generation atomic fluorescence spectrometry (HG-AFS). In recent years, the soaring demand of field analysis has significantly promoted the miniaturization of analytical atomic spectrometers or at least instrumental components. Various techniques have also been developed to approach the goal of portable/miniaturized AFS instrumentation for field analysis. In this review, potentially portable/miniaturized AFS techniques, primarily involving advanced instrumental components and whole instrumentation with references since 2000, are summarized and discussed. The discussion mainly includes five aspects: radiation source, atomizer, detector, sample introduction, and miniaturized atomic fluorescence spectrometer/system. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Combined spectral-domain optical coherence tomography and hyperspectral imaging applied for tissue analysis: Preliminary results

    NASA Astrophysics Data System (ADS)

    Dontu, S.; Miclos, S.; Savastru, D.; Tautan, M.

    2017-09-01

    In recent years many optoelectronic techniques have been developed to improve devices for tissue analysis. Spectral-Domain Optical Coherence Tomography (SD-OCT) is a medical interferometric imaging modality that provides depth-resolved tissue structure information with resolution in the μm range. However, SD-OCT has its own limitations and cannot offer biochemical information about the tissue. These data can be obtained with Hyperspectral Imaging (HSI), a non-invasive, sensitive and real-time technique. In the present study we have therefore combined SD-OCT with HSI for tissue analysis. Preliminary results using different tissues have highlighted the capabilities of this combined technique.

  11. Detection of Gunshot Residues Using Mass Spectrometry

    PubMed Central

    Blanes, Lucas; Cole, Nerida; Doble, Philip; Roux, Claude

    2014-01-01

    In recent years, forensic scientists have become increasingly interested in the detection and interpretation of organic gunshot residues (OGSR) due to the increasing use of lead- and heavy metal-free ammunition. This has also been prompted by the identification of gunshot residue- (GSR-) like particles in environmental and occupational samples. Various techniques have been investigated for their ability to detect OGSR. Mass spectrometry (MS) coupled to a chromatographic system is a powerful tool due to its high selectivity and sensitivity. Further, modern MS instruments can detect and identify a number of explosives and additives which may require different ionization techniques. Finally, MS has been applied to the analysis of both OGSR and inorganic gunshot residue (IGSR), although the “gold standard” for analysis is scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM-EDX). This review presents an overview of the technical attributes of currently available MS and ionization techniques and their reported applications to GSR analysis. PMID:24977168

  12. A New Heuristic Anonymization Technique for Privacy Preserved Datasets Publication on Cloud Computing

    NASA Astrophysics Data System (ADS)

    Aldeen Yousra, S.; Mazleena, Salleh

    2018-05-01

    Recent advancement in Information and Communication Technologies (ICT) has greatly increased the demand for cloud services that share users’ private data. Data from various organizations are a vital information source for analysis and research. Generally, this sensitive or private information involves medical, census, voter registration, social network, and customer service data. A primary concern of cloud service providers in data publishing is to hide the sensitive information of individuals. One of the cloud services that fulfills these confidentiality concerns is Privacy Preserving Data Mining (PPDM). The PPDM service in Cloud Computing (CC) enables data publishing with minimized distortion and absolute privacy. In this method, datasets are anonymized via generalization to accomplish the privacy requirements. However, the well-known privacy preserving data mining technique called K-anonymity suffers from several limitations. To surmount those shortcomings, we propose a new heuristic anonymization framework for preserving the privacy of sensitive datasets when publishing on the cloud. The advantages of the K-anonymity, L-diversity and (α, k)-anonymity methods for efficient information utilization and privacy protection are emphasized. Experimental results revealed that the developed technique outperforms K-anonymity, L-diversity, and (α, k)-anonymity.
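
    The k-anonymity property the framework builds on has a simple operational statement: every combination of quasi-identifier values must occur at least k times in the published table. A minimal checker (the record fields below are invented for illustration):

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """True iff every quasi-identifier combination occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return min(groups.values()) >= k

# Toy generalized records: age ranges and masked zip codes as quasi-identifiers
records = [
    {'age': '20-30', 'zip': '537**', 'disease': 'flu'},
    {'age': '20-30', 'zip': '537**', 'disease': 'cold'},
    {'age': '30-40', 'zip': '537**', 'disease': 'flu'},
    {'age': '30-40', 'zip': '537**', 'disease': 'flu'},
]
```

    Here each (age, zip) group has two members, so the table is 2-anonymous but not 3-anonymous; L-diversity would additionally require varied sensitive values within each group.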

  13. Optical power-based interrogation of plasmonic tilted fiber Bragg grating biosensors

    NASA Astrophysics Data System (ADS)

    González-Vila, Á.; Lopez-Aldaba, A.; Kinet, D.; Mégret, P.; Lopez-Amo, M.; Caucheteur, C.

    2017-04-01

    Two interrogation techniques for plasmonic tilted fiber Bragg grating sensors are reported and experimentally tested. Typical interrogation methods are usually based on tracking the wavelength shift of the most sensitive cladding mode, but for biosensing applications, spectrometer-based methods can be replaced by more efficient solutions. The proposed techniques thus rely on the measurement of the induced changes in optical power. The first one consists of a properly polarized tunable laser source set to emit at the wavelength of the sensor's most sensitive mode and an optical power meter to measure the transmitted response. For the second method, a uniform fiber Bragg grating is photo-inscribed beyond the sensor in such a way that its central wavelength matches the sensor's most sensitive mode, acting as an optical filter. Using a LED source, light reflected backwards by this grating is partially attenuated when passing through the sensor due to plasmon wave excitation, and the power changes are quantified once again with an optical power meter. A performance analysis of the techniques is carried out, and both prove to be competitive interrogation solutions. The work thus focuses on the development of cost-effective alternatives for monitoring this kind of biosensor in practical situations.

  14. Detection of malignant lesions in vivo in the upper gastrointestinal tract using image-guided Raman endoscopy

    NASA Astrophysics Data System (ADS)

    Bergholt, Mads Sylvest; Zheng, Wei; Lin, Kan; Ho, Khek Yu; Yeoh, Khay Guan; Teh, Ming; So, Jimmy Bok Yan; Huang, Zhiwei

    2012-01-01

    Raman spectroscopy is a vibrational analytic technique sensitive to the changes in biomolecular composition and conformations occurring in tissue. With our most recent development of near-infrared (NIR) Raman endoscopy integrated with diagnostic algorithms, in vivo real-time Raman diagnostics has been realized under multimodal wide-field imaging (i.e., white-light reflectance (WLR), narrow-band imaging (NBI), autofluorescence imaging (AFI)) modalities. A selection of 177 patients who previously underwent Raman endoscopy (n=2510 spectra) was used to render two robust models based on partial least squares-discriminant analysis (PLS-DA) for esophageal and gastric cancer diagnosis. The Raman endoscopy technique was validated prospectively on 4 new gastric and esophageal patients for in vivo tissue diagnosis. The Raman endoscopic technique could identify esophageal cancer in vivo with a sensitivity of 88.9% (8/9) and specificity of 100.0% (11/11) and gastric cancers with a sensitivity of 77.8% (14/18) and specificity of 100.0% (13/13). This study realizes for the first time image-guided Raman endoscopy for real-time in vivo diagnosis of malignancies in the esophagus and stomach at the biomolecular level.

  15. Quantitative sensory testing response patterns to capsaicin- and ultraviolet-B-induced local skin hypersensitization in healthy subjects: a machine-learned analysis.

    PubMed

    Lötsch, Jörn; Geisslinger, Gerd; Heinemann, Sarah; Lerch, Florian; Oertel, Bruno G; Ultsch, Alfred

    2017-08-16

    The comprehensive assessment of pain-related human phenotypes requires combinations of nociceptive measures that produce complex high-dimensional data, posing challenges to bioinformatic analysis. In this study, we assessed established experimental models of heat hyperalgesia of the skin, consisting of local ultraviolet-B (UV-B) irradiation or capsaicin application, in 82 healthy subjects using a variety of noxious stimuli. We extended the original heat stimulation by applying cold and mechanical stimuli and assessing the hypersensitization effects with a clinically established quantitative sensory testing (QST) battery (German Research Network on Neuropathic Pain). This study provided a 246 × 10-sized data matrix (82 subjects assessed at baseline, following UV-B application, and following capsaicin application) with respect to 10 QST parameters, which we analyzed using machine-learning techniques. We observed statistically significant effects of the hypersensitization treatments in 9 different QST parameters. Supervised machine-learned analysis implemented as random forests followed by ABC analysis pointed to heat pain thresholds as the most relevantly affected QST parameter. However, decision tree analysis indicated that UV-B additionally modulated sensitivity to cold. Unsupervised machine-learning techniques, implemented as emergent self-organizing maps, hinted at subgroups responding to topical application of capsaicin. The distinction among subgroups was based on sensitivity to pressure pain, which could be attributed to sex differences, with women being more sensitive than men. 
Thus, while UV-B and capsaicin share a major component of heat pain sensitization, they differ in their effects on QST parameter patterns in healthy subjects, suggesting a lack of redundancy between these models.

  16. Sensitivity analysis of consumption cycles

    NASA Astrophysics Data System (ADS)

    Jungeilges, Jochen; Ryazanova, Tatyana; Mitrofanova, Anastasia; Popova, Irina

    2018-05-01

    We study the special case of a nonlinear stochastic consumption model taking the form of a 2-dimensional, non-invertible map with an additive stochastic component. Applying the concept of the stochastic sensitivity function and the related technique of confidence domains, we establish the conditions under which the system's complex consumption attractor is likely to become observable. It is shown that the level of noise intensities beyond which the complex consumption attractor is likely to be observed depends on the weight given to past consumption in an individual's preference adjustment.

  17. Sensitivity of double centrifugation sugar fecal flotation for detecting intestinal helminths in coyotes (Canis latrans).

    PubMed

    Liccioli, Stefano; Catalano, Stefano; Kutz, Susan J; Lejeune, Manigandan; Verocai, Guilherme G; Duignan, Padraig J; Fuentealba, Carmen; Ruckstuhl, Kathreen E; Massolo, Alessandro

    2012-07-01

    Fecal analysis is commonly used to estimate prevalence and intensity of intestinal helminths in wild carnivores, but few studies have assessed the reliability of fecal flotation compared to analysis of intestinal tracts. We investigated sensitivity of the double centrifugation sugar fecal flotation and kappa agreement between fecal flotation and postmortem examination of intestines for helminths of coyotes (Canis latrans). We analyzed 57 coyote carcasses that were collected between October 2010 and March 2011 in the metropolitan area of Calgary and Edmonton, Alberta, Canada. Before analyses, intestines and feces were frozen at -80 C for 72 hr to inactivate Echinococcus eggs, protecting operators from potential exposure. Five species of helminths were found by postmortem examination, including Toxascaris leonina, Uncinaria stenocephala, Ancylostoma caninum, Taenia sp., and Echinococcus multilocularis. Sensitivity of fecal flotation was high (0.84) for detection of T. leonina but low for Taenia sp. (0.27), E. multilocularis (0.46), and U. stenocephala (0.00). Good kappa agreement between techniques was observed only for T. leonina (0.64), for which we also detected a significant correlation between adult female parasite intensity and fecal egg counts (R(s)=0.53, P=0.01). Differences in sensitivity may be related to parasite characteristics that affect recovery of eggs on flotation. Fecal parasitologic analyses are highly applicable to study the disease ecology of urban carnivores, and they often provide important information on environmental contamination and potential zoonotic risks. However, fecal-based parasitologic surveys should first assess the sensitivity of the techniques to understand their biases and limitations.
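
    The kappa agreement statistic used here is computable from a 2×2 table of flotation vs. postmortem findings; a sketch of Cohen's kappa (the table entries in the checks are illustrative, not the study's data):

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table:
    a = both tests positive, b = first only, c = second only, d = both negative."""
    n = a + b + c + d
    p_obs = (a + d) / n                                      # observed agreement
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)
```

    Kappa corrects raw agreement for chance: perfect agreement gives 1, and agreement no better than chance gives 0, which is why a value of 0.64 is read as "good" agreement.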

  18. Comparison of DWI and 18F-FDG PET/CT for assessing preoperative N-staging in gastric cancer: evidence from a meta-analysis

    PubMed Central

    Luo, Mingxu; Song, Hongmei; Liu, Gang; Lin, Yikai; Luo, Lintao; Zhou, Xin; Chen, Bo

    2017-01-01

    The diagnostic values of diffusion weighted imaging (DWI) and 18F-fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) for N-staging of gastric cancer (GC) were identified and compared. After a systematic search to identify relevant articles, meta-analysis was used to summarize the sensitivities, specificities, and areas under curves (AUCs) for DWI and PET/CT. To better understand the diagnostic utility of DWI and PET/CT for N-staging, the performance of multi-detector computed tomography (MDCT) was used as a reference. Fifteen studies were analyzed. The pooled sensitivity, specificity, and AUC with 95% confidence intervals of DWI were 0.79 (0.73–0.85), 0.69 (0.61–0.77), and 0.81 (0.77–0.84), respectively. For PET/CT, the corresponding values were 0.52 (0.39–0.64), 0.88 (0.61–0.97), and 0.66 (0.62–0.70), respectively. Comparison of the two techniques revealed DWI had higher sensitivity and AUC, but no difference in specificity. DWI exhibited higher sensitivity but lower specificity than MDCT, and 18F-FDG PET/CT had lower sensitivity and equivalent specificity. Overall, DWI performed better than 18F-FDG PET/CT for preoperative N-staging in GC. When the efficacy of MDCT was taken as a reference, DWI represented a complementary imaging technique, while 18F-FDG PET/CT had limited utility for preoperative N-staging. PMID:29137440
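
    Pooling per-study sensitivities is commonly done by inverse-variance weighting on the logit scale. The simplified fixed-effect sketch below uses invented study numbers; the meta-analysis itself most likely used a bivariate random-effects model, so this only illustrates the weighting idea:

```python
from math import log, exp

def pooled_sensitivity(sens, n_pos):
    """Fixed-effect pooled sensitivity: inverse-variance weighting on the logit scale."""
    logits, weights = [], []
    for p, n in zip(sens, n_pos):
        tp = p * n                      # true positives implied by the proportion
        var = 1 / tp + 1 / (n - tp)     # approximate variance of the logit
        logits.append(log(p / (1 - p)))
        weights.append(1 / var)
    m = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
    return exp(m) / (1 + exp(m))        # back-transform to a proportion

# Three hypothetical DWI studies: per-study sensitivity and number of node-positive cases
p = pooled_sensitivity([0.80, 0.75, 0.82], [50, 100, 80])
```

    The pooled value necessarily lies between the smallest and largest study sensitivities, with larger studies pulling it more strongly.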

  19. Development of a noise annoyance sensitivity scale

    NASA Technical Reports Server (NTRS)

    Bregman, H. L.; Pearson, R. G.

    1972-01-01

    Examining the problem of noise pollution from the psychological rather than the engineering view, a test of human sensitivity to noise was developed against the criterion of noise annoyance. Test development evolved from a previous study in which biographical, attitudinal, and personality data were collected on a sample of 166 subjects drawn from the adult community of Raleigh. Analysis revealed that only a small subset of the data collected was predictive of noise annoyance. Item analysis yielded 74 predictive items that composed the preliminary noise sensitivity test. This was administered to a sample of 80 adults who later rated the annoyance value of six sounds (equated in terms of peak sound pressure level) presented in a simulated home living-room environment. A predictive model involving 20 test items was developed using multiple regression techniques, and an item weighting scheme was evaluated.

  20. The propagation of wind errors through ocean wave hindcasts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holthuijsen, L.H.; Booij, N.; Bertotti, L.

    1996-08-01

    To estimate uncertainties in wave forecast and hindcasts, computations have been carried out for a location in the Mediterranean Sea using three different analyses of one historic wind field. These computations involve a systematic sensitivity analysis and estimated wind field errors. This technique enables a wave modeler to estimate such uncertainties in other forecasts and hindcasts if only one wind analysis is available.

  1. Effects of foveal information processing

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.

    1984-01-01

    The scanning behavior of pilots must be understood so that cockpit displays can be assembled which will provide the most information accurately and quickly to the pilot. The results of seven years of collecting and analyzing pilot scanning data are summarized. The data indicate that pilot scanning behavior is: (1) subconscious; (2) situation dependent; and (3) subject to disruption if pilots are forced to make conscious decisions. Testing techniques and scanning analysis techniques have been developed that are sensitive to pilot workload.

  2. LITHO1.0: An Updated Crust and Lithosphere Model of the Earth

    DTIC Science & Technology

    2010-09-01

    we are uncertain what causes the remainder of the discrepancy. The measurement discrepancies are much smaller than the signal in the data, and the...short-period group velocity data measured with a new technique which are sensitive to lid properties as well as crustal thickness and average...most progress was made on surface-wave measurements. We use a cluster analysis technique to measure surface-wave group velocity from 10 mHz to 40 mHz

  3. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  4. Steam generator tubing NDE performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, G.; Welty, C.S. Jr.

    1997-02-01

    Steam generator (SG) non-destructive examination (NDE) is a fundamental element in the broader SG in-service inspection (ISI) process, a cornerstone in the management of PWR steam generators. Based on objective performance measures (tube leak forced outages and SG-related capacity factor loss), ISI performance has shown a continually improving trend over the years. Performance of the NDE element is a function of the fundamental capability of the technique, and the ability of the analysis portion of the process in field implementation of the technique. The technology continues to improve in several areas, e.g. system sensitivity, data collection rates, probe/coil design, and data analysis software. With these improvements comes the attendant requirement for qualification of the technique on the damage form(s) to which it will be applied, and for training and qualification of the data analysis element of the ISI process on the field implementation of the technique. The introduction of data transfer via fiber optic line allows for remote data acquisition and analysis, thus improving the efficiency of analysis for a limited pool of data analysts. This paper provides an overview of the current status of SG NDE, and identifies several important issues to be addressed.

  5. Novel CE-MS technique for detection of high explosives using perfluorooctanoic acid as a MEKC and mass spectrometric complexation reagent.

    PubMed

    Brensinger, Karen; Rollman, Christopher; Copper, Christine; Genzman, Ashton; Rine, Jacqueline; Lurie, Ira; Moini, Mehdi

    2016-01-01

    To address the need for the forensic analysis of high explosives, a novel capillary electrophoresis-mass spectrometry (CE-MS) technique has been developed for detection of these compounds with high resolution, sensitivity, and mass accuracy. The technique uses perfluorooctanoic acid (PFOA) as both a micellar electrokinetic chromatography (MEKC) reagent for separation of neutral explosives and as the complexation reagent for mass spectrometric detection of PFOA-explosive complexes in the negative ion mode. High explosives that formed complexes with PFOA included RDX, HMX, tetryl, and PETN. Some nitroaromatics were detected as molecular ions. Detection limits in the high parts per billion range and linear calibration responses over two orders of magnitude were obtained. For proof of concept, the technique was applied to the quantitative analysis of high explosives in sand samples. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Spectroscopic vector analysis for fast pattern quality monitoring

    NASA Astrophysics Data System (ADS)

    Sohn, Younghoon; Ryu, Sungyoon; Lee, Chihoon; Yang, Yusin

    2018-03-01

    In the semiconductor industry, fast and effective measurement of pattern variation has been a key challenge for assuring mass-production quality. Pattern measurement techniques such as conventional CD-SEM or optical CD have been used extensively, but these techniques are increasingly limited in terms of measurement throughput and time spent in modeling. In this paper we propose a time-effective pattern monitoring method based on a direct spectrum-based approach. In this technique, a wavelength band sensitive to a specific pattern change is selected from the spectroscopic ellipsometry signal scattered by the pattern to be measured, and the amplitude and phase variations in that band are analyzed as a measurement index of the pattern change. This measurement technique was applied to several process steps to verify its applicability. Owing to its fast and simple analysis, the method can be adopted for large-scale process-variation monitoring, maximizing measurement throughput.

  7. Elemental depth profiling in transparent conducting oxide thin film by X-ray reflectivity and grazing incidence X-ray fluorescence combined analysis

    NASA Astrophysics Data System (ADS)

    Rotella, H.; Caby, B.; Ménesguen, Y.; Mazel, Y.; Valla, A.; Ingerle, D.; Detlefs, B.; Lépy, M.-C.; Novikova, A.; Rodriguez, G.; Streli, C.; Nolot, E.

    2017-09-01

    The optical and electrical properties of transparent conducting oxide (TCO) thin films are strongly linked with structural and chemical properties such as the elemental depth profile. In R&D environments, the development of non-destructive characterization techniques to probe the composition over the depth of deposited films is thus necessary. The combination of grazing-incidence X-ray fluorescence (GIXRF) and X-ray reflectometry (XRR) is emerging as a fab-compatible solution for the measurement of thickness, density and elemental profile in complex stacks. Based on the same formalism, both techniques can be implemented on the same experimental set-up and the analyses can be combined in a single software package in order to refine the sample model. While XRR is sensitive to the electronic density profile, GIXRF is sensitive to the atomic density (i.e. the elemental depth profile). Combining the two techniques yields simultaneous information about structural properties (thickness and roughness) as well as chemical properties. In this study, we performed an XRR-GIXRF combined analysis on indium-free TCO thin films (Ga-doped ZnO) in order to correlate the optical properties of the films with the elemental distribution of the Ga dopant over the thickness. The variations in optical properties due to the annealing process were probed by spectroscopic ellipsometry measurements. We studied the evolution of the atomic profiles before and after annealing. We show that the blue shift of the band gap in the optical absorption edge is linked to a homogenization of the atomic profiles of Ga and Zn over the layer after annealing. This work demonstrates that the combination of the techniques gives insight into the material composition and makes XRR-GIXRF combined analysis a promising technique for elemental depth profiling.

  8. Seeded amplification of chronic wasting disease prions in nasal brushings and recto-anal mucosal associated lymphoid tissues from elk by real time quaking-induced conversion

    USGS Publications Warehouse

    Haley, Nicholas J.; Siepker, Chris; Hoon-Hanks, Laura L.; Mitchell, Gordon; Walter, W. David; Manca, Matteo; Monello, Ryan J.; Powers, Jenny G.; Wild, Margaret A.; Hoover, Edward A.; Caughey, Byron; Richt, Jürgen A.; Fenwick, B.W.

    2016-01-01

    Chronic wasting disease (CWD), a transmissible spongiform encephalopathy of cervids, was first documented nearly 50 years ago in Colorado and Wyoming and has since been detected across North America and the Republic of Korea. The expansion of this disease makes the development of sensitive diagnostic assays and antemortem sampling techniques crucial for the mitigation of its spread; this is especially true in cases of relocation/reintroduction or prevalence studies of large or protected herds, where depopulation may be contraindicated. This study evaluated the sensitivity of the real-time quaking-induced conversion (RT-QuIC) assay of recto-anal mucosa-associated lymphoid tissue (RAMALT) biopsy specimens and nasal brushings collected antemortem. These findings were compared to results of immunohistochemistry (IHC) analysis of ante- and postmortem samples. RAMALT samples were collected from populations of farmed and free-ranging Rocky Mountain elk (Cervus elaphus nelsoni; n = 323), and nasal brush samples were collected from a subpopulation of these animals (n = 205). We hypothesized that the sensitivity of RT-QuIC would be comparable to that of IHC analysis of RAMALT and would correspond to that of IHC analysis of postmortem tissues. We found RAMALT sensitivity (77.3%) to be highly concordant between RT-QuIC and IHC analysis. Sensitivity was lower when testing nasal brushings (34%), though both RAMALT and nasal brush test sensitivities were dependent on both the PRNP genotype and disease progression determined by the obex score. These data suggest that RT-QuIC, like IHC analysis, is a relatively sensitive assay for detection of CWD prions in RAMALT biopsy specimens and, with further investigation, has potential for large-scale and rapid automated testing of antemortem samples for CWD.

  9. Using foreground/background analysis to determine leaf and canopy chemistry

    NASA Technical Reports Server (NTRS)

    Pinzon, J. E.; Ustin, S. L.; Hart, Q. J.; Jacquemoud, S.; Smith, M. O.

    1995-01-01

    Spectral Mixture Analysis (SMA) has become a well-established procedure for analyzing imaging spectrometry data; however, the technique is relatively insensitive to minor sources of spectral variation (e.g., discriminating stressed from unstressed vegetation or variations in canopy chemistry). Other statistical approaches have been tried, e.g., stepwise multiple linear regression (SMLR) analysis, to predict canopy chemistry. Grossman et al. reported that SMLR is sensitive to measurement error and that the prediction of minor chemical components is not independent of patterns observed in more dominant spectral components like water. Further, they observed that the relationships were strongly dependent on the mode of expressing reflectance (R, -log R) and whether chemistry was expressed on a weight (g/g) or area basis (g/sq m). Thus, alternative multivariate techniques need to be examined. Smith et al. reported a revised SMA that they termed Foreground/Background Analysis (FBA), which permits directing the analysis along any axis of variance by identifying vectors through the n-dimensional spectral volume that are orthonormal to each other. Here, we report an application of the FBA technique for the detection of canopy chemistry using a modified form of the analysis.
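
The directed-axis idea behind FBA can be sketched numerically: choose an axis separating foreground (vegetation-like) from background spectra, orthogonalize it against the dominant background direction, and score spectra by projection onto the resulting unit vector. The data, dimensions, and names below are synthetic illustrations, not the study's imagery.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spectra (rows = samples, cols = bands); values are illustrative only.
background = rng.normal(0.5, 0.05, size=(20, 50))       # background-like spectra
foreground = background + 0.1 * np.linspace(0, 1, 50)   # foreground offset signal

# Candidate axis of variance separating foreground from background
axis = foreground.mean(axis=0) - background.mean(axis=0)

# Orthogonalize the axis against the dominant background spectral direction
Xb = background - background.mean(axis=0)
_, _, vt = np.linalg.svd(Xb, full_matrices=False)
b1 = vt[0]                        # first background principal direction
axis -= (axis @ b1) * b1          # Gram-Schmidt step
axis /= np.linalg.norm(axis)

# Score each spectrum along the foreground axis
scores_fg = foreground @ axis
scores_bg = background @ axis
print(scores_fg.mean() > scores_bg.mean())  # foreground scores higher
```

In a full FBA, the orthogonalization would be repeated against all retained background vectors rather than a single principal direction.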

  10. Electrical bioimpedance and other techniques for gastric emptying and motility evaluation

    PubMed Central

    Huerta-Franco, María Raquel; Vargas-Luna, Miguel; Montes-Frausto, Juana Berenice; Flores-Hernández, Corina; Morales-Mata, Ismael

    2012-01-01

    The aim of this article is to identify non-invasive, inexpensive, highly sensitive and accurate techniques for evaluating and diagnosing gastric diseases. In the case of the stomach, there are highly sensitive and specific methods for assessing gastric motility and emptying (GME). However, these methods are invasive, expensive and/or not technically feasible for all clinicians and patients. We present a summary of the most relevant international information on non-invasive methods and techniques for clinically evaluating GME. We particularly emphasize the potential of gastric electrical bioimpedance (EBI). EBI was initially used mainly in gastric emptying studies and was essentially abandoned in favor of techniques such as electrogastrography and the gold standard, scintigraphy. The current research evaluating the utility of gastric EBI either combines this technique with other frequently used techniques or uses new methods for gastric EBI signal analysis. In this context, we discuss our results and those of other researchers who have worked with gastric EBI. In this review article, we present the following topics: (1) a description of the oldest methods and procedures for evaluating GME; (2) an explanation of the methods currently used to evaluate gastric activity; and (3) a perspective on the newest trends and techniques in clinical and research GME methods. We conclude that gastric EBI is a highly effective non-invasive, easy to use and inexpensive technique for assessing GME. PMID:22368782

  11. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and they are highly dependent on comprehensive analysis of the chemical components in the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of the analysis of medicinal plants. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications for the comprehensive analysis of medicinal plants. Applications of various MS and hyphenated techniques for the analysis of medicinal plants, including but not limited to one-dimensional and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, are reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a central role as the major platform for further research, in order to obtain insight into both empirical therapeutic efficacy and quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  12. Sensitivity analysis for linear structural equation models, longitudinal mediation with latent growth models and blended learning in biostatistics education

    NASA Astrophysics Data System (ADS)

    Sullivan, Adam John

    In chapter 1, we consider the biases that may arise when an unmeasured confounder is omitted from a structural equation model (SEM) and sensitivity analysis techniques to correct for such biases. We give an analysis of which effects in an SEM are and are not biased by an unmeasured confounder. It is shown that a single unmeasured confounder will bias not just one but numerous effects in an SEM. We present sensitivity analysis techniques to correct for biases in total, direct, and indirect effects when using SEM analyses, and illustrate these techniques with a study of aging and cognitive function. In chapter 2, we consider longitudinal mediation with latent growth curves. We define the direct and indirect effects using counterfactuals and consider the assumptions needed for identifiability of those effects. We develop models with a binary treatment/exposure, followed by a model where treatment/exposure changes with time, allowing for treatment/exposure-mediator interaction. We thus formalize mediation analysis with latent growth curve models using counterfactuals, make clear the assumptions, and extend these methods to allow for exposure-mediator interactions. We present and illustrate the techniques with a study on multiple sclerosis (MS) and depression. In chapter 3, we report on a pilot study in blended learning that took place during the Fall 2013 and Summer 2014 semesters at Harvard. We blended the traditional BIO 200: Principles of Biostatistics and created ID 200: Principles of Biostatistics and Epidemiology. We used materials from the edX course PH207x: Health in Numbers: Quantitative Methods in Clinical & Public Health Research as a video textbook, in which students would watch a given number of these videos prior to class.
Using surveys as well as exam data, we informally assess these blended classes from the students' perspective and compare these students with students in another course, BIO 201: Introduction to Statistical Methods, in Fall 2013, as well as with students from BIO 200 in the Fall semesters of 1992 and 1993. We then suggest improvements upon our original course designs and follow up with an informal look at how these implemented changes affected the second offering of the newly blended ID 200 in Summer 2014.

  13. Approaching the Limit in Atomic Spectrochemical Analysis.

    ERIC Educational Resources Information Center

    Hieftje, Gary M.

    1982-01-01

    To assess the ability of current analytical methods to approach the single-atom detection level, theoretical and experimentally determined detection levels are presented for several chemical elements. A comparison of these methods shows that the most sensitive atomic spectrochemical technique currently available is based on emission from…

  14. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    PubMed

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. 
The semi-quantitative culture method showed higher sensitivity and specificity for the diagnosis of CR-BSIs in newborns when compared to the quantitative technique. In addition, this method is easier to perform and shows better agreement with the gold standard, and should therefore be recommended for routine clinical laboratory use. PFGE may contribute to the control of CR-BSIs by identifying clusters of microorganisms in neonatal ICUs, providing a means of determining potential cross-infection between patients.
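
The sensitivity and specificity figures above follow the standard confusion-matrix definitions. A minimal sketch of the computation; the counts are illustrative, chosen only to reproduce rates close to the reported semi-quantitative values, since the abstract does not give the underlying counts:

```python
# Hypothetical confusion counts for a diagnostic technique vs. the gold standard.
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) as fractions."""
    sensitivity = tp / (tp + fn)   # true-positive rate among confirmed cases
    specificity = tn / (tn + fp)   # true-negative rate among non-cases
    return sensitivity, specificity

# Illustrative counts (not from the study) matching the semi-quantitative rates.
sens, spec = sensitivity_specificity(tp=16, fn=6, tn=539, fp=24)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
# → sensitivity=72.7%, specificity=95.7%
```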

  15. Directed Incremental Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz

    2011-01-01

    The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences, and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiencies of static analysis techniques to compute program difference information with the precision of symbolic execution to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case-study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.

  16. Acoustic emission and nondestructive evaluation of biomaterials and tissues.

    PubMed

    Kohn, D H

    1995-01-01

    Acoustic emission (AE) is an acoustic wave generated by the release of energy from localized sources in a material subjected to an externally applied stimulus. This technique may be used nondestructively to analyze tissues, materials, and biomaterial/tissue interfaces. Applications of AE include use as an early warning tool for detecting tissue and material defects and incipient failure, monitoring damage progression, predicting failure, characterizing failure mechanisms, and serving as a tool to aid in understanding material properties and structure-function relations. All these applications may be performed in real time. This review discusses general principles of AE monitoring and the use of the technique in 3 areas of importance to biomedical engineering: (1) analysis of biomaterials, (2) analysis of tissues, and (3) analysis of tissue/biomaterial interfaces. Focus in these areas is on detection sensitivity, methods of signal analysis in both the time and frequency domains, the relationship between acoustic signals and microstructural phenomena, and the uses of the technique in establishing a relationship between signals and failure mechanisms.

  17. Approaches for the analysis of low molecular weight compounds with laser desorption/ionization techniques and mass spectrometry.

    PubMed

    Bergman, Nina; Shevchenko, Denys; Bergquist, Jonas

    2014-01-01

    This review summarizes various approaches for the analysis of low molecular weight (LMW) compounds by different laser desorption/ionization mass spectrometry techniques (LDI-MS). It is common to use an agent to assist the ionization, and small molecules are normally difficult to analyze by, e.g., matrix assisted laser desorption/ionization mass spectrometry (MALDI-MS) using the common matrices available today, because the latter are generally small organic compounds themselves. This often results in severe suppression of analyte peaks, or interference of the matrix and analyte signals in the low mass region. However, intrinsic properties of several LDI techniques such as high sensitivity, low sample consumption, high tolerance towards salts and solid particles, and rapid analysis have stimulated scientists to develop methods to circumvent matrix-related issues in the analysis of LMW molecules. Recent developments within this field as well as historical considerations and future prospects are presented in this review.

  18. Differential die-away analysis system response modeling and detector design

    NASA Astrophysics Data System (ADS)

    Jordan, K. A.; Gozani, T.; Vujic, J.

    2008-05-01

    Differential die-away analysis (DDAA) is a sensitive technique for detecting the presence of fissile materials such as 235U and 239Pu. DDAA uses a high-energy (14 MeV) pulsed neutron generator to interrogate a shipping container. The signature is a fast neutron signal hundreds of microseconds after the cessation of the neutron pulse. This fast neutron signal has a decay time identical to the thermal neutron diffusion decay time of the inspected cargo. The theoretical aspects of a cargo inspection system based on the differential die-away technique are explored. A detailed mathematical model of the system is developed, and experimental results validating this model are presented.
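
Because the die-away signature is an exponential decay, the diffusion decay time can be recovered from gated counts by a log-linear least-squares fit. A minimal sketch with synthetic, noise-free data; the decay constant and gate times are illustrative, not values from the paper:

```python
import numpy as np

# Synthetic die-away signal: counts ~ A * exp(-t/tau) after the pulse.
tau_true = 400.0                      # thermal diffusion decay time, µs (illustrative)
t = np.linspace(50, 1000, 40)         # gate times after pulse cessation, µs
counts = 1e4 * np.exp(-t / tau_true)

# Log-linear least squares: ln(counts) = ln(A) - t/tau
slope, intercept = np.polyfit(t, np.log(counts), 1)
tau_fit = -1.0 / slope
print(round(tau_fit, 1))  # → 400.0
```

With real counting data one would weight the fit by counting statistics; the noise-free case here just demonstrates the extraction of the decay time.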

  19. Fibre optic technique for simultaneous measurement of strain and temperature variations in composite materials

    NASA Astrophysics Data System (ADS)

    Michie, W. C.; Culshaw, Brian; Roberts, Scott S. J.; Davidson, Roger

    1991-12-01

    A technique based upon the differential sensitivities of dual-mode and polarimetric sensing schemes is shown to be capable of simultaneously resolving temperature and strain variations to within 20 micro-epsilon and 1 K over a strain and temperature excursion of 2 micro-epsilon and 45 K. The technique is evaluated experimentally over an 80 cm sensing length of unembedded optical fiber and in an 8-ply unidirectional carbon/epoxide laminate subject to temperature and strain cycling. A comparative analysis of the performance of the embedded and unembedded fiber sensors is presented.
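
Simultaneous recovery of strain and temperature from two sensing schemes reduces to inverting a 2×2 sensitivity matrix, which is well conditioned only when the two schemes respond to strain and temperature in sufficiently different proportions. The coefficients below are hypothetical, not the paper's calibration values:

```python
import numpy as np

# Illustrative sensitivity coefficients: each row is one sensing scheme's
# linear response [d(phase)/d(strain) in rad/µε, d(phase)/d(temp) in rad/K].
K = np.array([[0.012, 0.80],    # dual-mode scheme (hypothetical)
              [0.004, 1.90]])   # polarimetric scheme (hypothetical)

true_state = np.array([150.0, 12.0])  # 150 µε of strain, 12 K of heating
phases = K @ true_state               # simulated phase readings

# Recover strain and temperature by inverting the 2x2 sensitivity matrix
strain, temp = np.linalg.solve(K, phases)
print(round(strain, 6), round(temp, 6))  # → 150.0 12.0
```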

  20. Estimation of Plutonium-240 Mass in Waste Tanks Using Ultra-Sensitive Detection of Radioactive Xenon Isotopes from Spontaneous Fission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowyer, Theodore W.; Gesh, Christopher J.; Haas, Daniel A.

    This report details efforts to develop a technique able to detect and quantify the mass of 240Pu in waste storage tanks and other enclosed spaces. If the isotopic ratios of the plutonium contained in the enclosed space are also known, then this technique is capable of estimating the total mass of the plutonium without physical sample retrieval and radiochemical analysis of hazardous material. Results utilizing this technique are reported for a Hanford Site waste tank (TX-118) and a well-characterized plutonium sample in a laboratory environment.

  1. On determining important aspects of mathematical models: Application to problems in physics and chemistry

    NASA Technical Reports Server (NTRS)

    Rabitz, Herschel

    1987-01-01

    The use of parametric and functional gradient sensitivity analysis techniques is considered for models described by partial differential equations. By interchanging appropriate dependent and independent variables, questions of inverse sensitivity may be addressed to gain insight into the inversion of observational data for parameter and function identification in mathematical models. It may be argued that the presence of a subset of dominant, strongly coupled dependent variables will result in the overall system sensitivity behavior collapsing into a simple set of scaling and self-similarity relations amongst elements of the entire matrix of sensitivity coefficients. These general tools are generic in nature, but herein their application to problems arising in selected areas of physics and chemistry is presented.
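
Parametric sensitivity coefficients of the kind discussed here are often approximated numerically by central finite differences. A minimal sketch on a toy observable; the model and parameter values are illustrative only:

```python
import numpy as np

# Toy model observable y(t; k, A) = A * exp(-k t); names are illustrative.
def model(k, A, t):
    return A * np.exp(-k * t)

t = np.linspace(0.0, 5.0, 11)
p = {"k": 0.8, "A": 2.0}

# Central-difference sensitivity coefficients dy/dp at the nominal parameters
def sensitivity(name, h=1e-5):
    up, dn = dict(p), dict(p)
    up[name] += h
    dn[name] -= h
    return (model(**up, t=t) - model(**dn, t=t)) / (2 * h)

s_k = sensitivity("k")   # analytically: dy/dk = -A t exp(-k t)
s_A = sensitivity("A")   # analytically: dy/dA = exp(-k t)
print(np.allclose(s_A, np.exp(-p["k"] * t), atol=1e-6))  # → True
```

The full matrix of such coefficients (one row per observation time, one column per parameter) is the object whose scaling and self-similarity structure the abstract describes.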

  2. A novel universal real-time PCR system using the attached universal duplex probes for quantitative analysis of nucleic acids.

    PubMed

    Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing

    2008-06-04

    Real-time PCR techniques are widely used for nucleic acid analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair carries an attached universal template (UT), and the FP is identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays with sensitivity, reproducibility, and repeatability comparable to those of other real-time PCR methods. The results from gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification using AUDP real-time PCR assays indicate that the technique has been successfully applied to nucleic acid analysis, and that it offers an alternative for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost.

  3. Parameterization and Sensitivity Analysis of a Complex Simulation Model for Mosquito Population Dynamics, Dengue Transmission, and Their Control

    PubMed Central

    Ellis, Alicia M.; Garcia, Andres J.; Focks, Dana A.; Morrison, Amy C.; Scott, Thomas W.

    2011-01-01

    Models can be useful tools for understanding the dynamics and control of mosquito-borne disease. More detailed models may be more realistic and better suited for understanding local disease dynamics; however, evaluating model suitability, accuracy, and performance becomes increasingly difficult with greater model complexity. Sensitivity analysis is a technique that permits exploration of complex models by evaluating the sensitivity of the model to changes in parameters. Here, we present results of sensitivity analyses of two interrelated complex simulation models of mosquito population dynamics and dengue transmission. We found that dengue transmission may be influenced most by survival in each life stage of the mosquito, mosquito biting behavior, and duration of the infectious period in humans. The importance of these biological processes for vector-borne disease models and the overwhelming lack of knowledge about them make acquisition of relevant field data on these biological processes a top research priority. PMID:21813844
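
One-at-a-time perturbation, the simplest form of the sensitivity analysis described above, can be sketched on a classical Ross-Macdonald-style reproduction number rather than the authors' full simulation model. All parameter values below are illustrative, not fitted:

```python
import math

# Simplified Ross-Macdonald-style reproduction number (illustrative only):
# R0 = m * a**2 * b * c * exp(-mu * n) / (mu * r)
def r0(p):
    return (p["m"] * p["a"] ** 2 * p["b"] * p["c"]
            * math.exp(-p["mu"] * p["n"]) / (p["mu"] * p["r"]))

# m: mosquitoes per human, a: biting rate, b/c: transmission probabilities,
# mu: mosquito mortality, n: extrinsic incubation period, r: human recovery rate
nominal = {"m": 2.0, "a": 0.3, "b": 0.5, "c": 0.5, "mu": 0.1, "n": 10.0, "r": 0.1}

# One-at-a-time elasticities: % change in R0 per % change in each parameter
base = r0(nominal)
elasticity = {}
for name in nominal:
    bumped = dict(nominal, **{name: nominal[name] * 1.01})
    elasticity[name] = (r0(bumped) / base - 1) / 0.01

ranked = sorted(elasticity, key=lambda k: abs(elasticity[k]), reverse=True)
print(ranked[0])  # → a  (biting rate enters squared, echoing the abstract)
```

Ranking parameters by elasticity magnitude is exactly the kind of screening that flags biting behavior and mosquito survival as priorities for field data collection.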

  4. Comparative evaluation of workload estimation techniques in piloting tasks

    NASA Technical Reports Server (NTRS)

    Wierwille, W. W.

    1983-01-01

    Techniques to measure operator workload in a wide range of situations and tasks were examined. The sensitivity and intrusion of a wide variety of workload assessment techniques in simulated piloting tasks were investigated. Four different piloting tasks, emphasizing the psychomotor, perceptual, mediational, and communication aspects of piloting behavior, were selected. Techniques to determine relative sensitivity and intrusion were applied. Sensitivity is the relative ability of a workload estimation technique to discriminate statistically significant differences in operator loading. High sensitivity requires discriminable changes in score means as a function of load level and low variation of the scores about the means. Intrusion is an undesirable change in the task for which workload is measured, resulting from the introduction of the workload estimation technique or apparatus.

  5. Anomaly metrics to differentiate threat sources from benign sources in primary vehicle screening.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, Israel Dov; Mengesha, Wondwosen

    2011-09-01

    Discrimination of benign sources from threat sources at ports of entry (POE) is of great importance for efficient screening of cargo and vehicles using radiation portal monitors (RPM). Currently, the ability of RPMs to distinguish these radiological sources is seriously hampered by the energy resolution of the deployed RPMs. As naturally occurring radioactive materials (NORM) are ubiquitous in commerce, false alarms are problematic: they require additional resources in secondary inspection in addition to impacting commerce. To increase the sensitivity of such detection systems without increasing false alarm rates, alarm metrics need to incorporate the ability to distinguish benign and threat sources. Principal component analysis (PCA) and clustering techniques were implemented in the present study and investigated for their potential to lower false alarm rates and/or increase sensitivity to weaker threat sources without loss of specificity. Results of the investigation demonstrated improved sensitivity and specificity in discriminating benign sources from threat sources.
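
A PCA-based anomaly metric of this kind can be sketched as a reconstruction-residual score: learn a low-rank basis from benign (NORM-dominated) spectra, then flag spectra the basis cannot reconstruct. Everything below is a synthetic illustration, not the deployed RPM data or the study's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy gamma "spectra" (rows = vehicle passes, cols = energy bins), illustrative.
n_bins = 32
norm_shape = np.exp(-np.linspace(0, 4, n_bins))            # NORM-like continuum
benign = norm_shape * rng.uniform(0.8, 1.2, (200, 1)) \
         + rng.normal(0, 0.01, (200, n_bins))

# PCA basis of benign traffic via SVD on mean-centered data
mean = benign.mean(axis=0)
_, _, vt = np.linalg.svd(benign - mean, full_matrices=False)
basis = vt[:3]                                             # top 3 components

def anomaly_score(spectrum):
    """Residual norm after projecting onto the benign PCA subspace."""
    d = spectrum - mean
    return np.linalg.norm(d - basis.T @ (basis @ d))

# A "threat-like" spectrum adds a localized peak the benign basis cannot explain
threat = norm_shape.copy()
threat[20] += 0.5
print(anomaly_score(threat) > anomaly_score(benign[0]))    # → True
```

Thresholding this residual (or clustering the PCA scores) gives an alarm metric that responds to spectral shape rather than gross count rate alone.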

  6. Analyzing Single Giant Unilamellar Vesicles With a Slotline-Based RF Nanometer Sensor

    DOE PAGES

    Cui, Yan; Kenworthy, Anne K.; Edidin, Michael; ...

    2016-03-11

    Novel techniques that enable reagent-free detection and analysis of single cells are of great interest for the development of the biological and medical sciences, as well as point-of-care health service technologies. Highly sensitive and broadband RF sensors are promising candidates for such a technique. In this paper, we present a highly sensitive and tunable RF sensor, based on interference processes and built with a 100-nm slotline structure. The highly concentrated RF fields, up to ~1.76×10⁷ V/m, enable strong interactions between giant unilamellar vesicles (GUVs) and the fields for high-sensitivity operation. We also provide two modeling approaches to extract cell dielectric properties from measured scattering parameters. GUVs of different molecular compositions are synthesized and analyzed with the RF sensor at ~2, ~2.5, and ~2.8 GHz with an initial minimum |S₂₁| of ~-100 dB. Corresponding GUV dielectric properties are obtained. Finally, one-dimensional scanning of a single GUV is also demonstrated.

  7. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample, so that near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation, preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  8. Sensitive molecular diagnostics using surface-enhanced resonance Raman scattering (SERRS)

    NASA Astrophysics Data System (ADS)

    Faulds, Karen; Graham, Duncan; McKenzie, Fiona; MacRae, Douglas; Ricketts, Alastair; Dougan, Jennifer

    2009-02-01

    Surface-enhanced resonance Raman scattering (SERRS) is an analytical technique with several advantages over competing techniques in terms of improved sensitivity and multiplexing. We have made great progress in the development of SERRS as a quantitative analytical method, in particular for the detection of DNA. SERRS is an extremely sensitive and selective technique which, when applied to the detection of labelled DNA sequences, yields detection limits that rival, and in most cases better, those of fluorescence. Here we explore the conditions that enable the successful detection of DNA using SERRS. The enhancing surface is crucial; in this case suspensions of nanoparticles were used, as they allow quantitative behaviour to be achieved and permit systems analogous to current fluorescence-based systems. The aggregation conditions required to obtain SERRS of DNA are also crucial, and herein we describe the use of spermine as an aggregating agent. The nature of the label used, be it fluorescent, positively or negatively charged, also affects the SERRS response, and these conditions are again explored here. We have clearly demonstrated the ability to identify the components of a mixture of 5 analytes in solution by using two different excitation wavelengths, and of a 6-plex using data analysis techniques. These conditions will allow the use of SERRS for the detection of target DNA in a meaningful diagnostic assay.

  9. A sensitive continuum analysis method for gamma ray spectra

    NASA Technical Reports Server (NTRS)

    Thakur, Alakh N.; Arnold, James R.

    1993-01-01

    In this work we examine ways to improve the sensitivity of the analysis procedure for gamma ray spectra with respect to small differences in the continuum (Compton) spectra. The method developed is applied to analyze gamma ray spectra obtained from planetary mapping by the Mars Observer spacecraft launched in September 1992. Calculated Mars simulation spectra and actual thick target bombardment spectra have been taken as test cases. The principle of the method rests on the extraction of continuum information from Fourier transforms of the spectra. We study how a better estimate of the spectrum from larger regions of the Mars surface will improve the analysis for smaller regions with poorer statistics. Estimation of signal within the continuum is done in the frequency domain which enables efficient and sensitive discrimination of subtle differences between two spectra. The process is compared to other methods for the extraction of information from the continuum. Finally we explore briefly the possible uses of this technique in other applications of continuum spectra.
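    The core idea above, estimating the smooth continuum in the frequency domain, can be sketched as a low-pass filter on the spectrum's Fourier transform; the cutoff, spectrum shape, and peak positions below are invented for illustration and are not the authors' actual procedure:

    ```python
    import numpy as np

    def continuum_estimate(spectrum, keep_fraction=0.05):
        """Estimate the smooth (Compton) continuum by keeping only the
        lowest-frequency Fourier components of the spectrum."""
        coeffs = np.fft.rfft(spectrum)
        cutoff = max(1, int(len(coeffs) * keep_fraction))
        coeffs[cutoff:] = 0.0          # drop high frequencies: narrow peaks, noise
        return np.fft.irfft(coeffs, n=len(spectrum))

    # Synthetic gamma-ray spectrum: decaying continuum plus two narrow peaks.
    channels = np.arange(1024)
    continuum = 500.0 * np.exp(-channels / 400.0)
    peaks = (300.0 * np.exp(-0.5 * ((channels - 300) / 4.0) ** 2)
             + 200.0 * np.exp(-0.5 * ((channels - 700) / 4.0) ** 2))
    spectrum = continuum + peaks

    est = continuum_estimate(spectrum)
    rms = np.sqrt(np.mean((est - continuum) ** 2))
    print(f"RMS deviation of the estimate from the true continuum: {rms:.1f} counts")
    ```

    Subtracting the low-pass estimate from the raw spectrum leaves the narrow peaks, while subtle continuum differences between two spectra can be compared directly on the retained low-frequency coefficients.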

  10. System analysis in rotorcraft design: The past decade

    NASA Technical Reports Server (NTRS)

    Galloway, Thomas L.

    1988-01-01

    Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness for drawing conclusions. In rotorcraft design, this means combining design requirements, technology assessment, and sensitivity analysis with review of the techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analyses on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.

  11. A study of the stress wave factor technique for nondestructive evaluation of composite materials

    NASA Technical Reports Server (NTRS)

    Sarrafzadeh-Khoee, A.; Kiernan, M. T.; Duke, J. C., Jr.; Henneke, E. G., II

    1986-01-01

    The acousto-ultrasonic method of nondestructive evaluation is an extremely sensitive means of assessing material response. Efforts continue to complete the understanding of this method. In order to achieve the full sensitivity of the technique, extreme care must be taken in its performance. This report provides an update of the efforts to advance the understanding of this method and to increase its application to the nondestructive evaluation of composite materials. Included are descriptions of a novel optical system that is capable of measuring in-plane and out-of-plane displacements, an IBM PC-based data acquisition system, an extensive data analysis software package, the azimuthal variation of acousto-ultrasonic behavior in graphite/epoxy laminates, and preliminary examination of processing variation in graphite-aluminum tubes.

  12. The use of laser-induced fluorescence or ultraviolet detectors for sensitive and selective analysis of tobramycin or erythropoietin in complex samples.

    PubMed

    Ahmed, Hytham M; Ebeid, Wael B

    2015-05-15

    The analysis of complex samples is a challenge in pharmaceutical and biopharmaceutical analysis. In this work, tobramycin (TOB) analysis in human urine samples and recombinant human erythropoietin (rhEPO) analysis in the presence of a similar protein were selected as representative examples of such analyses. Assays of TOB in urine samples are difficult because of poor detectability. Therefore, a laser-induced fluorescence (LIF) detector was combined with a separation technique, micellar electrokinetic chromatography (MEKC), to determine TOB through derivatization with fluorescein isothiocyanate (FITC). Borate was used as the background electrolyte (BGE) with negatively charged mixed micelles as an additive. The method was successfully applied to urine samples. The LOD and LOQ for tobramycin in urine were 90 and 200 ng/ml, respectively, and recovery was >98% (n=5). All urine samples were analyzed by direct injection without sample pre-treatment. Another hyphenated analytical technique, capillary zone electrophoresis (CZE) coupled to an ultraviolet (UV) detector, was also used for sensitive analysis of rhEPO at low levels (2000 IU) in the presence of a large amount of human serum albumin (HSA). Analysis of rhEPO was achieved by the use of electrokinetic injection (EI) with discontinuous buffers. Phosphate buffer was used as the BGE with metal ions as an additive. The proposed method can be used for the estimation of a large number of quality control rhEPO samples in a short period. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. A review of optimization and quantification techniques for chemical exchange saturation transfer (CEST) MRI toward sensitive in vivo imaging

    PubMed Central

    Guo, Yingkun; Zheng, Hairong; Sun, Phillip Zhe

    2015-01-01

    Chemical exchange saturation transfer (CEST) MRI is a versatile imaging method that probes the chemical exchange between bulk water and exchangeable protons. CEST imaging indirectly detects dilute labile protons via bulk water signal changes following selective saturation of exchangeable protons, which offers substantial sensitivity enhancement and has sparked numerous biomedical applications. Over the past decade, CEST imaging techniques have rapidly evolved due to contributions from multiple domains, including the development of CEST mathematical models, innovative contrast agent designs, sensitive data acquisition schemes, efficient field inhomogeneity correction algorithms, and quantitative CEST (qCEST) analysis. The CEST system that underlies the apparent CEST-weighted effect, however, is complex. The experimentally measurable CEST effect depends not only on parameters such as CEST agent concentration, pH and temperature, but also on relaxation rate, magnetic field strength and more importantly, experimental parameters including repetition time, RF irradiation amplitude and scheme, and image readout. Thorough understanding of the underlying CEST system using qCEST analysis may augment the diagnostic capability of conventional imaging. In this review, we provide a concise explanation of CEST acquisition methods and processing algorithms, including their advantages and limitations, for optimization and quantification of CEST MRI experiments. PMID:25641791

  14. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, much effort has been devoted to improving MDLC-based strategies, including "top-down" and "bottom-up", to enable highly sensitive qualitative and quantitative analysis of proteins and to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations and identification have also been developed to achieve high-throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes the recent advances in such techniques and their applications in the qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. ASPASIA: A toolkit for evaluating the effects of biological interventions on SBML model behaviour.

    PubMed

    Evans, Stephanie; Alden, Kieran; Cucurull-Sanchez, Lourdes; Larminie, Christopher; Coles, Mark C; Kullberg, Marika C; Timmis, Jon

    2017-02-01

    A calibrated computational model reflects behaviours that are expected or observed in a complex system, providing a baseline upon which sensitivity analysis techniques can be used to analyse pathways that may impact model responses. However, calibration of a model where a behaviour depends on an intervention introduced after a defined time point is difficult, as model responses may be dependent on the conditions at the time the intervention is applied. We present ASPASIA (Automated Simulation Parameter Alteration and SensItivity Analysis), a cross-platform, open-source Java toolkit that addresses a key deficiency in software tools for understanding the impact an intervention has on system behaviour for models specified in Systems Biology Markup Language (SBML). ASPASIA can generate and modify models using SBML solver output as an initial parameter set, allowing interventions to be applied once a steady state has been reached. Additionally, multiple SBML models can be generated where a subset of parameter values are perturbed using local and global sensitivity analysis techniques, revealing the model's sensitivity to the intervention. To illustrate the capabilities of ASPASIA, we demonstrate how this tool has generated novel hypotheses regarding the mechanisms by which Th17-cell plasticity may be controlled in vivo. By using ASPASIA in conjunction with an SBML model of Th17-cell polarisation, we predict that promotion of the Th1-associated transcription factor T-bet, rather than inhibition of the Th17-associated transcription factor RORγt, is sufficient to drive switching of Th17 cells towards an IFN-γ-producing phenotype. Our approach can be applied to all SBML-encoded models to predict the effect that intervention strategies have on system behaviour. ASPASIA, released under the Artistic License (2.0), can be downloaded from http://www.york.ac.uk/ycil/software.
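    ASPASIA operates on real SBML models through an SBML solver; the underlying idea, running a model to steady state, applying a parameter intervention there, and scoring the response with local one-at-a-time sensitivity, can be sketched on a toy two-parameter model (the model dx/dt = a - b*x and the 5% perturbation are illustrative assumptions, not ASPASIA's API):

    ```python
    import numpy as np

    def simulate(params, x0=0.1, t_end=50.0, dt=0.01):
        """Euler-integrate a toy one-variable model dx/dt = a - b*x,
        a stand-in for an SBML model run to steady state (x -> a/b)."""
        a, b = params
        x = x0
        for _ in range(int(t_end / dt)):
            x += dt * (a - b * x)
        return x

    def local_sensitivity(params, delta=0.05):
        """One-at-a-time local sensitivity: relative change in the steady
        state per relative change in each parameter (the 'intervention')."""
        base = simulate(params)
        sens = []
        for i in range(len(params)):
            p = list(params)
            p[i] *= 1.0 + delta
            sens.append((simulate(p) - base) / (base * delta))
        return sens

    # A 5% intervention on the production rate a and on the decay rate b.
    s = local_sensitivity([2.0, 0.5])
    print([round(v, 2) for v in s])
    ```

    For this model the steady state is a/b, so the normalized sensitivities come out near +1 for a and about -0.95 for b; a global variant would instead sample both parameters over ranges, as ASPASIA does for SBML parameter subsets.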

  16. Supersonic molecular beam-hyperthermal surface ionisation coupled with time-of-flight mass spectrometry applied to trace level detection of polynuclear aromatic hydrocarbons in drinking water for reduced sample preparation and analysis time.

    PubMed

    Davis, S C; Makarov, A A; Hughes, J D

    1999-01-01

    Analysis of sub-ppb levels of polynuclear aromatic hydrocarbons (PAHs) in drinking water by high performance liquid chromatography (HPLC) fluorescence detection typically requires large water samples and lengthy extraction procedures. The detection itself, although selective, does not give compound identity confirmation. Benchtop gas chromatography/mass spectrometry (GC/MS) systems operating in the more sensitive selected ion monitoring (SIM) acquisition mode discard spectral information and, when operating in scanning mode, are less sensitive and scan too slowly. The selectivity of hyperthermal surface ionisation (HSI), the high column flow rate capacity of the supersonic molecular beam (SMB) GC/MS interface, and the high acquisition rate of time-of-flight (TOF) mass analysis, are combined here to facilitate a rapid, specific and sensitive technique for the analysis of trace levels of PAHs in water. This work reports the advantages gained by using the GC/HSI-TOF system over the HPLC fluorescence method, and discusses in some detail the nature of the instrumentation used.

  17. Surrogate-based Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Queipo, Nestor V.; Haftka, Raphael T.; Shyy, Wei; Goel, Tushar; Vaidyanathan, Raj; Tucker, P. Kevin

    2005-01-01

    A major challenge to the successful full-scale development of modern aerospace systems is to address competing objectives such as improved performance, reduced costs, and enhanced safety. Accurate, high-fidelity models are typically time consuming and computationally expensive. Furthermore, informed decisions should be made with an understanding of the impact (global sensitivity) of the design variables on the different objectives. In this context, the so-called surrogate-based approach for analysis and optimization can play a very valuable role. The surrogates are constructed using data drawn from high-fidelity models, and provide fast approximations of the objectives and constraints at new design points, thereby making sensitivity and optimization studies feasible. This paper provides a comprehensive discussion of the fundamental issues that arise in surrogate-based analysis and optimization (SBAO), highlighting concepts, methods, and techniques, as well as practical implications. The issues addressed include the selection of the loss function and regularization criteria for constructing the surrogates, design of experiments, surrogate selection and construction, sensitivity analysis, convergence, and optimization. The multi-objective optimal design of a liquid rocket injector is presented to highlight the state of the art and to help guide future efforts.

  18. A Study of Imputation Algorithms. Working Paper Series.

    ERIC Educational Resources Information Center

    Hu, Ming-xiu; Salvucci, Sameena

    Many imputation techniques and imputation software packages have been developed over the years to deal with missing data. Different methods may work well under different circumstances, and it is advisable to conduct a sensitivity analysis when choosing an imputation method for a particular survey. This study reviewed about 30 imputation methods…
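    A minimal illustration of such a sensitivity analysis, assessing how far the estimate of interest moves when the imputation method changes (the data, missingness rate, and the two methods are invented for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    income = rng.lognormal(mean=10.0, sigma=1.0, size=500)  # skewed survey variable
    observed = income.copy()
    observed[rng.random(500) < 0.2] = np.nan                # ~20% missing at random

    def impute(values, method):
        """Fill missing entries with a single statistic of the observed data."""
        stat = {"mean": np.nanmean, "median": np.nanmedian}[method](values)
        filled = values.copy()
        filled[np.isnan(filled)] = stat
        return filled

    # Sensitivity check: how far does the estimate of interest (here the
    # variable's mean) move when the imputation method changes?
    estimates = {m: impute(observed, m).mean() for m in ("mean", "median")}
    spread = abs(estimates["mean"] - estimates["median"])
    print({m: round(v) for m, v in estimates.items()}, "spread:", round(spread))
    ```

    For a right-skewed variable, mean and median imputation pull the estimate in different directions; a large spread signals that the survey estimate is sensitive to the imputation choice.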

  19. Sensitivity analysis of periodic errors in heterodyne interferometry

    NASA Astrophysics Data System (ADS)

    Ganguly, Vasishta; Kim, Nam Ho; Kim, Hyo Soo; Schmitz, Tony

    2011-03-01

    Periodic errors in heterodyne displacement measuring interferometry occur due to frequency mixing in the interferometer. These nonlinearities are typically characterized as first- and second-order periodic errors which cause a cyclical (non-cumulative) variation in the reported displacement about the true value. This study implements an existing analytical periodic error model in order to identify sensitivities of the first- and second-order periodic errors to the input parameters, including rotational misalignments of the polarizing beam splitter and mixing polarizer, non-orthogonality of the two laser frequencies, ellipticity in the polarizations of the two laser beams, and different transmission coefficients in the polarizing beam splitter. A local sensitivity analysis is first conducted to examine the sensitivities of the periodic errors with respect to each input parameter about the nominal input values. Next, a variance-based approach is used to study the global sensitivities of the periodic errors by calculating the Sobol' sensitivity indices using Monte Carlo simulation. The effect of variation in the input uncertainty on the computed sensitivity indices is examined. It is seen that the first-order periodic error is highly sensitive to non-orthogonality of the two linearly polarized laser frequencies, while the second-order error is most sensitive to the rotational misalignment between the laser beams and the polarizing beam splitter. A particle swarm optimization technique is finally used to predict the possible setup imperfections based on experimentally generated values for periodic errors.
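    The global, variance-based step can be sketched with the standard Saltelli-type Monte Carlo estimator for first-order Sobol' indices; the additive test function below (with known indices 0.2 and 0.8) is illustrative, not the interferometer error model:

    ```python
    import numpy as np

    def sobol_first_order(f, d, n=100_000, seed=0):
        """Saltelli-style Monte Carlo estimate of the first-order Sobol'
        indices of f on the unit hypercube [0, 1]^d."""
        rng = np.random.default_rng(seed)
        A, B = rng.random((n, d)), rng.random((n, d))
        fA, fB = f(A), f(B)
        var = np.var(np.concatenate([fA, fB]))
        S = np.empty(d)
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]          # A with column i taken from B
            S[i] = np.mean(fB * (f(ABi) - fA)) / var
        return S

    # Additive test function: analytic first-order indices are 0.2 and 0.8,
    # since Var = (1 + 4)/12 and the variance shares are 1/5 and 4/5.
    f = lambda x: 1.0 * x[:, 0] + 2.0 * x[:, 1]
    S = sobol_first_order(f, d=2)
    print(np.round(S, 2))
    ```

    The same estimator applied to the periodic-error model, with inputs sampled over their uncertainty ranges, yields the global sensitivity ranking the abstract describes.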

  20. Breathing dynamics based parameter sensitivity analysis of hetero-polymeric DNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talukder, Srijeeta; Sen, Shrabani; Chaudhury, Pinaki, E-mail: pinakc@rediffmail.com

    We study the parameter sensitivity of hetero-polymeric DNA within the purview of DNA breathing dynamics. The degree of correlation between the mean bubble size and the model parameters is estimated for this purpose for three different DNA sequences. The analysis leads us to a better understanding of the sequence-dependent nature of the breathing dynamics of hetero-polymeric DNA. Out of the 14 model parameters for DNA stability in the statistical Poland-Scheraga approach, the hydrogen bond interaction ε_hb(AT) for an AT base pair and the ring factor ξ turn out to be the most sensitive parameters. In addition, the stacking interaction ε_st(TA-TA) for a TA-TA nearest-neighbor pair of base pairs is found to be the most sensitive one among all stacking interactions. Moreover, we also establish that the nature of a stacking interaction has a deciding effect on the DNA breathing dynamics, not the number of times a particular stacking interaction appears in a sequence. We show that the sensitivity analysis can be used as an effective measure to guide a stochastic optimization technique to find the kinetic rate constants related to the dynamics, as opposed to the case where the rate constants are measured using the conventional unbiased way of optimization.

  1. Study of advanced techniques for determining the long term performance of components

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed. The long-term performance of metal-to-metal valve seats in a liquid propellant fuel system is stressed. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques, and (2) improving the understanding of the physical mechanisms of degradation.

  2. Radioimmunoassays and 2-site immunoradiometric "sandwich" assays: basic principles.

    PubMed

    Rodbard, D

    1988-10-01

    The "sandwich" or noncompetitive reagent-excess, 2-site immunoradiometric assay (2-site IRMA), ELISA, USERIA, and related techniques have several advantages over traditional, competitive radioimmunoassays. IRMAs can provide improved sensitivity and specificity. However, IRMAs present some practical problems: nonspecific binding, increased consumption of antibody, and a biphasic dose-response curve (the high-dose hook effect), and they may require special techniques for dose-response curve analysis. We anticipate considerable growth in the popularity and importance of the 2-site IRMA.

  3. Principles of ESCA and application to metal corrosion, coating and lubrication

    NASA Technical Reports Server (NTRS)

    Wheeler, D. R.

    1978-01-01

    The principles of ESCA (electron spectroscopy for chemical analysis) were described by comparison with other spectroscopic techniques. The advantages and disadvantages of ESCA as compared to other surface sensitive analytical techniques were evaluated. The use of ESCA was illustrated by actual applications to oxidation of steel and Rene 41, the chemistry of lubricant additives on steel, and the composition of sputter deposited hard coatings. A bibliography of material that was useful for further study of ESCA was presented and commented upon.

  4. Error analysis of Dobson spectrophotometer measurements of the total ozone content

    NASA Technical Reports Server (NTRS)

    Holland, A. C.; Thomas, R. W. L.

    1975-01-01

    A study of techniques for measuring atmospheric ozone is reported. This study represents the second phase of a program designed to improve techniques for the measurement of atmospheric ozone. This phase of the program studied the sensitivity of Dobson direct-sun measurements, and of the ozone amounts inferred from those measurements, to variations in the atmospheric temperature profile. The study used the plane-parallel Monte Carlo model developed and tested in the initial phase of this program, and a series of standard model atmospheres.

  5. Gamma Spectroscopy by Artificial Neural Network Coupled with MCNP

    NASA Astrophysics Data System (ADS)

    Sahiner, Huseyin

    While neutron activation analysis is widely used in many areas, the sensitivity of the analysis depends on how it is conducted. Although the technique carries error, its sensitivity relative to chemical analysis is in the parts-per-million or sometimes parts-per-billion range. Because of this sensitivity, neutron activation analysis is important for analyzing bio-samples. An artificial neural network is an attractive technique for complex systems. Although there are neural network applications in spectral analysis, training on simulated data to analyze experimental data had not previously been done. This study offers an improvement in spectral analysis and an optimization of the neural network for this purpose. The work considers five elements that are regarded as trace elements in bio-samples; however, the system is not limited to five elements. The only limitation of the study comes from data library availability in MCNP. A perceptron network was employed to identify five elements from gamma spectra. In the quantitative analysis, better results were obtained when the neural fitting tool in MATLAB was used. The Levenberg-Marquardt algorithm was used as the training function, with 23 neurons in the hidden layer and 259 gamma spectra in the input. Because the study deals with five elements, five input neurons representing the peak counts of five isotopes were used. Five output neurons revealed the mass information of these elements from irradiated kidney stones. Results showing a maximum error of 17.9% in APA, 24.9% in UA, 28.2% in COM, and 27.9% in STRU type stones demonstrated the success of the neural network approach in analyzing gamma spectra. The high error was attributed to Zn, which has a very long decay half-life compared to the other elements. The simulations and experiments were made under a specific experimental setup (3 hours irradiation, 96 hours decay time, 8 hours counting time). Nevertheless, the approach can be generalized for different setups.
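    A minimal stand-in for the described network, one hidden layer of 23 tanh neurons mapping 259 five-isotope peak-count vectors to element masses, can be sketched with NumPy; the synthetic data, detection efficiencies, and plain gradient descent (in place of MATLAB's Levenberg-Marquardt training) are all illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Invented training data: net peak counts for 5 isotopes, modeled as
    # mass * detection efficiency + noise; the targets are the masses.
    n, eff = 259, np.array([3.0, 1.5, 4.0, 2.0, 0.8])
    masses = rng.uniform(0.1, 1.0, (n, 5))
    counts = masses * eff + rng.normal(0.0, 0.02, (n, 5))
    X = (counts - counts.mean(0)) / counts.std(0)   # standardize inputs

    # One hidden layer of 23 tanh neurons, trained by full-batch gradient
    # descent on mean squared error.
    W1 = rng.normal(0, 0.3, (5, 23)); b1 = np.zeros(23)
    W2 = rng.normal(0, 0.3, (23, 5)); b2 = np.zeros(5)
    lr, losses = 0.05, []
    for _ in range(2000):
        h = np.tanh(X @ W1 + b1)              # forward pass
        err = h @ W2 + b2 - masses
        losses.append((err ** 2).mean())
        g2 = 2 * err / n                      # backpropagation
        gh = g2 @ W2.T * (1 - h ** 2)
        W2 -= lr * (h.T @ g2); b2 -= lr * g2.sum(0)
        W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(0)

    print(f"MSE: {losses[0]:.3f} -> {losses[-1]:.4f}")
    ```

    Even this crude fit shows the approach: the network learns to invert the simulated counts-to-mass relation, and a network trained purely on simulated spectra can then be applied to measured ones.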

  6. [RATIONAL ASPECTS OF BACTERIOPHAGES USE].

    PubMed

    Vakarina, A A; Kataeva, L V; Karpukhina, N F

    2015-01-01

    To analyze existing aspects of bacteriophage use and to study features of their lytic activity using various techniques. The effects of monophages and associated bacteriophages (staphylococcal, piopolyvalent and piocombined, intestiphage, Klebsiella pneumoniae and polyvalent Klebsiella phages produced by "Microgen") were studied on 380 strains of Staphylococcus aureus and 279 cultures of Klebsiella pneumoniae in liquid and solid nutrient media. From patients with intestinal disorders, the sensitivity of 184 strains of Salmonella genus bacteria (18 serological variants) to salmonella bacteriophages was analyzed, as was that of 137 strains of Escherichia coli (lactose-negative, hemolytic), including some members of OKA groups (21 serovars), to coli-proteic and piopolyvalent bacteriophages. The lytic ability of the piobacteriophage against Klebsiella and Proteus genus bacteria was determined. Staphylococcus aureus was sensitive to staphylococcal bacteriophage in 71.6% of cases and to piobacteriophage in 86.15% of cases. A 100% lytic ability of salmonella bacteriophage against Salmonella spp. was established. The sensitivity of E. coli of various serogroups to coli-proteic and piobacteriophage was 66 - 100%. Klebsiella and Proteus genus bacteria were sensitive to piobacteriophage in only 35% and 43.15% of cases, respectively. More rational use of bacteriophages is necessary: development of a technique, evaluation of the sensitivity of bacteria to bacteriophages, and introduction of corrections into their production (expansion of bacteriophage spectra, and determination and indication of their concentration in accompanying documents).

  7. Design principles of water sensitive in settlement area on the river banks

    NASA Astrophysics Data System (ADS)

    Ryanti, E.; Hasriyanti, N.; Utami, W. D.

    2018-03-01

    This research formulates principles for designing the settlement area along the Kapuas River in Pontianak using the water sensitive urban design (WSUD) approach for densely populated settlements. A case study approach is used: a dense settlement area located on the river banks, with literature study techniques to formulate the aspects to be considered and the components to be set in the design, and descriptive analysis within a rationalistic paradigm to identify the characteristics of riverbank settlements in light of WSUD elements and to formulate principles for designing water-sensitive settlement areas. This research is important because water management in the existing riverside settlements in Pontianak has not yet been adequately addressed. The research therefore has several objectives: to identify the characteristics of the riverside settlement area based on water-sensitive design aspects and design principles, so that the existing problems can be related to the community's need for infrastructure in the settlement environment; and to formulate and develop appropriate technology guidelines for integrated water management systems in riverside settlement areas, together with design techniques for water-sensitive settlements (WSUD).

  8. Translations on Eastern Europe, Scientific Affairs, No. 562

    DTIC Science & Technology

    1977-10-28

    remodeling and modernization of the institute's facilities resulted in an increase in the reactor's neutron flux and power output capacity and...research technique involving the use of the experimental reactor is neutron activation analysis. Using this method it is possible to produce...artificial radioactivity through the bombardment of non-active substances with neutrons. This is one of the most sensitive methods of chemical analysis

  9. Some aspects of analytical chemistry as applied to water quality assurance techniques for reclaimed water: The potential use of X-ray fluorescence spectrometry for automated on-line fast real-time simultaneous multi-component analysis of inorganic pollutants in reclaimed water

    NASA Technical Reports Server (NTRS)

    Ling, A. C.; Macpherson, L. H.; Rey, M.

    1981-01-01

    The potential use of isotopically excited energy-dispersive X-ray fluorescence (XRF) spectrometry for automated on-line fast real-time (5 to 15 minutes) simultaneous multicomponent (up to 20) trace (1 to 10 parts per billion) analysis of inorganic pollutants in reclaimed water was examined. Three anionic elements (chromium(VI), arsenic and selenium) were studied. The inherent lack of sensitivity of XRF spectrometry for these elements mandates the use of a preconcentration technique, and various methods were examined, including several direct and indirect evaporation methods, ion exchange membranes, selective and nonselective precipitation, and complexation processes. It is shown that XRF spectrometry itself is well suited for automated on-line quality assurance, and can provide a nondestructive (and thus sample storage and repeat analysis capabilities) and particularly convenient analytical method. Further, the use of an isotopically excited energy-dispersive unit (50 mCi Cd-109 source) coupled with a suitable preconcentration process can provide sufficient sensitivity to achieve the currently mandated minimum levels of detection without the need for high-power X-ray generating tubes.

  10. Understanding Organics in Meteorites and the Pre-Biotic Environment

    NASA Technical Reports Server (NTRS)

    Zare, Richard N.

    2003-01-01

    (1) Refinement of the analytic capabilities of our experiment via characterization of molecule-specific response and the effects upon analysis of the type of sample under investigation; (2) measurement of polycyclic aromatic hydrocarbons (PAHs) with high sensitivity and spatial resolution within extraterrestrial samples; (3) investigation of the interstellar reactions of PAHs via the analysis of species formed in systems modeling dust grains and ices; (4) investigations into the potential role of PAHs in prebiotic and early biotic chemistry via photoreactions of PAHs under simulated prebiotic Earth conditions. To meet these objectives, we use microprobe laser-desorption, laser-ionization mass spectrometry (μL²MS), which is a sensitive, selective, and spatially resolved technique for the detection of aromatic compounds. Appendix A presents a description of the μL²MS technique. The initial grant proposal was for a three-year funding period, while the award was given for a one-year interim period. Because of this change in time period, emphasis was shifted away from the first research goal, which was more development-oriented, in order to focus on the other, analysis-oriented goals. The progress made on each of the four research areas is given below.

  11. In situ analysis of soybeans and nuts by probe electrospray ionization mass spectrometry.

    PubMed

    Petroselli, Gabriela; Mandal, Mridul K; Chen, Lee C; Hiraoka, Kenzo; Nonami, Hiroshi; Erra-Balsells, Rosa

    2015-04-01

    Probe electrospray ionization (PESI) is an ESI-based ionization technique that generates an electrospray from the tip of a solid metal needle. In the present work, we describe the PESI mass spectra obtained by in situ measurement of soybeans and several nuts (peanuts, walnuts, cashew nuts, macadamia nuts and almonds) using different solid needles as sampling probes. PESI-MS was found to be a valuable approach for in situ lipid analysis of these seeds. The phospholipid and triacylglycerol PESI spectra of the different nuts and soybeans were compared by principal component analysis (PCA), which showed significant differences among the data for each family of seeds. Methanolic extracts of nuts and soybeans were exposed to air and sunlight for several days, and PESI mass spectra were recorded before and after the treatment. As the oil aged (rancidification), the formation of oxidized species with a variable number of hydroperoxide groups could be observed in the PESI spectra, and the relative intensity of the oxidized triacylglycerol signals increased with days of exposure. The monitoring sensitivity of PESI-MS was high. This method provides a fast, simple and sensitive technique for the analysis (detection and characterization) of lipids in seed tissue and of the degree of oxidation of the oil samples. Copyright © 2015 John Wiley & Sons, Ltd.
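
    A PCA comparison of spectra like the one described above can be sketched numerically; the "spectra" below are synthetic stand-ins (two made-up seed "families"), not PESI data:

```python
import numpy as np

# Toy "spectra": 6 samples (rows) x 5 m/z channels (columns); intensities are synthetic.
rng = np.random.default_rng(0)
group_a = rng.normal(loc=[10, 2, 5, 1, 0], scale=0.1, size=(3, 5))  # one seed "family"
group_b = rng.normal(loc=[1, 8, 0, 6, 3], scale=0.1, size=(3, 5))   # another "family"
X = np.vstack([group_a, group_b])

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                       # sample coordinates in PC space
explained = s**2 / np.sum(s**2)      # fraction of variance per component

# The first PC separates the two groups, mirroring the group separation in the study.
print(scores[:, 0], explained[0])
```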

  12. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the model cost function is most sensitive to the plant production-related parameters (e.g., PPDF1 and PRDX). Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of the target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
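
    A local (one-at-a-time) sensitivity analysis of a model cost function, of the kind reported above, can be sketched with finite differences. The toy model, parameters, and "observations" below are hypothetical stand-ins, not EDCM:

```python
import numpy as np

# Hypothetical stand-in for an ecosystem model: maps a parameter vector
# to a simulated output series (e.g., a flux time series). Not EDCM itself.
def model(params, t=np.linspace(0, 1, 50)):
    a, b = params
    return a * np.sin(2 * np.pi * t) + b * t

obs = model(np.array([1.2, 0.8]))   # synthetic "observations"

def cost(params):
    """Sum-of-squares misfit between simulation and observations."""
    return np.sum((model(params) - obs) ** 2)

def local_sensitivity(cost, p0, rel_step=1e-4):
    """One-at-a-time finite-difference sensitivity of the cost function."""
    base = cost(p0)
    sens = []
    for i in range(len(p0)):
        p = p0.copy()
        h = rel_step * max(abs(p[i]), 1e-12)
        p[i] += h
        sens.append((cost(p) - base) / h)
    return np.array(sens)

print(local_sensitivity(cost, np.array([1.0, 1.0])))
```

    Here the first parameter is below its true value and the second above it, so the two sensitivities come out with opposite signs.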

  13. Clearance of the cervical spine in clinically unevaluable trauma patients.

    PubMed

    Halpern, Casey H; Milby, Andrew H; Guo, Wensheng; Schuster, James M; Gracias, Vicente H; Stein, Sherman C

    2010-08-15

    Meta-analytic cost-effectiveness analysis. Our goal was to compare the results of different management strategies for trauma patients in whom the cervical spine was not clinically evaluable due to impaired consciousness, endotracheal intubation, or painful distracting injuries. We performed a structured literature review related to cervical spine trauma, radiographic clearance techniques (plain radiography, flexion/extension, CT, and MRI), and complications associated with semirigid collar use. Meta-analytic techniques were used to pool data from multiple sources to calculate pooled mean estimates of the sensitivities and specificities of imaging techniques for cervical spinal clearance, and of the rates of complications from various clearance strategies and from empirical use of semirigid collars. A decision analysis model was used to compare outcomes and costs among these strategies. Slightly more than 7.5% of patients who are clinically unevaluable have cervical spine injuries, and 42% of these injuries are associated with spinal instability. The sensitivity of plain radiography or fluoroscopy for spinal clearance was 57% (95% CI: 57%-60%). Sensitivities for CT and MRI alone were 83% (82%-84%) and 87% (84%-89%), respectively. Complications associated with collar use ranged from 1.3% (2 days) to 7.1% (10 days) but were usually minor and short-lived. Quadriplegia resulting from spinal instability missed by a clearance test had enormous impacts on longevity, quality of life, and costs. These impacts overshadowed the effects of prolonged collar application, even when the incidence of quadriplegia was extremely low. As currently used, neuroimaging studies for cervical spinal clearance in clinically unevaluable patients are not cost-effective compared with empirical immobilization in a semirigid collar.
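
    Pooling study-level sensitivities, as done above, is commonly carried out by fixed-effect inverse-variance weighting on the logit scale; a minimal sketch with hypothetical per-study counts (not the paper's data):

```python
import math

# Hypothetical study-level data: (true positives, total patients with injury).
studies = [(45, 60), (80, 95), (30, 40)]

def pooled_sensitivity(studies):
    """Fixed-effect inverse-variance pooling of sensitivities on the logit scale."""
    num = den = 0.0
    for tp, n in studies:
        fn = n - tp
        logit = math.log(tp / fn)
        var = 1.0 / tp + 1.0 / fn          # approximate variance of the logit
        w = 1.0 / var
        num += w * logit
        den += w
    pooled_logit = num / den
    se = math.sqrt(1.0 / den)
    lo, hi = pooled_logit - 1.96 * se, pooled_logit + 1.96 * se
    inv = lambda x: 1.0 / (1.0 + math.exp(-x))   # back-transform to a proportion
    return inv(pooled_logit), (inv(lo), inv(hi))

est, (lo, hi) = pooled_sensitivity(studies)
print(f"pooled sensitivity = {est:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```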

  14. Moment-based metrics for global sensitivity analysis of hydrological systems

    NASA Astrophysics Data System (ADS)

    Dell'Oca, Aronne; Riva, Monica; Guadagnini, Alberto

    2017-12-01

    We propose new metrics to assist global sensitivity analysis, GSA, of hydrological and Earth systems. Our approach allows assessing the impact of uncertain parameters on main features of the probability density function, pdf, of a target model output, y. These include the expected value of y, the spread around the mean and the degree of symmetry and tailedness of the pdf of y. Since reliable assessment of higher-order statistical moments can be computationally demanding, we couple our GSA approach with a surrogate model, approximating the full model response at a reduced computational cost. Here, we consider the generalized polynomial chaos expansion (gPCE), other model reduction techniques being fully compatible with our theoretical framework. We demonstrate our approach through three test cases, including an analytical benchmark, a simplified scenario mimicking pumping in a coastal aquifer and a laboratory-scale conservative transport experiment. Our results allow ascertaining which parameters can impact some moments of the model output pdf while being uninfluential to others. We also investigate the error associated with the evaluation of our sensitivity metrics by replacing the original system model through a gPCE. Our results indicate that the construction of a surrogate model with increasing level of accuracy might be required depending on the statistical moment considered in the GSA. The approach is fully compatible with (and can assist the development of) analysis techniques employed in the context of reduction of model complexity, model calibration, design of experiment, uncertainty quantification and risk assessment.
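
    A mean-based sensitivity metric of the kind described above (the impact of an uncertain parameter on the expected value of the output pdf) can be approximated by Monte Carlo sampling and conditioning on parameter bins. The two-parameter model below is a toy stand-in, not one of the paper's test cases: the first parameter shifts the mean of y, the second only its spread.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Toy model: x1 drives the mean of y, x2 only the spread (hypothetical, not hydrological).
x1 = rng.uniform(0, 1, N)
x2 = rng.uniform(0, 1, N)
y = 3.0 * x1 + (x2 - 0.5) * rng.normal(0, 1, N)

def mean_based_index(x, y, nbins=20):
    """Average |shift of E[y] when conditioning on x|, normalized by |E[y]|."""
    mu = y.mean()
    edges = np.quantile(x, np.linspace(0, 1, nbins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, nbins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(nbins)])
    return np.mean(np.abs(mu - cond_means)) / abs(mu)

# x1 moves the mean strongly; x2 leaves it essentially unchanged.
print(mean_based_index(x1, y), mean_based_index(x2, y))
```

    The same binning scheme extends to conditional variances or higher moments, which is the sense in which a parameter can matter for one moment of the output pdf while being uninfluential to another.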

  15. Enzymatic signal amplification for sensitive detection of intracellular antigens by flow cytometry.

    PubMed

    Karkmann, U; Radbruch, A; Hölzel, V; Scheffold, A

    1999-11-19

    Flow cytometry is the method of choice for the analysis of single cells with respect to the expression of specific antigens. Antigens can be detected with specific antibodies either on the cell surface or within the cells, after fixation and permeabilization of the cell membrane. Using conventional fluorochrome-labeled antibodies, several thousand antigens are required for a clear-cut separation of positive and negative cells. More sensitive reagents, e.g., magnetofluorescent liposomes conjugated to specific antibodies, permit the detection of fewer than 200 molecules per cell but cannot be used for the detection of intracellular antigens. Here, we describe an enzymatic amplification technique (intracellular tyramine-based signal amplification, ITSA) for the sensitive cytometric analysis of intracellular cytokines by immunofluorescence. This approach results in a 10- to 15-fold improvement of the signal-to-noise ratio compared to conventional fluorochrome-labeled antibodies and permits the detection of as few as 300-400 intracellular antigens per cell.

  16. Characterization of image heterogeneity using 2D Minkowski functionals increases the sensitivity of detection of a targeted MRI contrast agent.

    PubMed

    Canuto, Holly C; McLachlan, Charles; Kettunen, Mikko I; Velic, Marko; Krishnan, Anant S; Neves, Andre' A; de Backer, Maaike; Hu, D-E; Hobson, Michael P; Brindle, Kevin M

    2009-05-01

    A targeted Gd(3+)-based contrast agent has been developed that detects tumor cell death by binding to the phosphatidylserine (PS) exposed on the plasma membrane of dying cells. Although this agent has been used to detect tumor cell death in vivo, the differences in signal intensity between treated and untreated tumors were relatively small. As cell death is often spatially heterogeneous within tumors, we investigated whether an image analysis technique that parameterizes heterogeneity could be used to increase the sensitivity of detection of this targeted contrast agent. Two-dimensional (2D) Minkowski functionals (MFs) provided an automated and reliable method for the parameterization of image heterogeneity, which does not require prior assumptions about the number of regions or features in the image, and were shown to increase the sensitivity of detection of the contrast agent compared to simple signal intensity analysis. © 2009 Wiley-Liss, Inc.
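
    The three 2D Minkowski functionals of a binary (thresholded) image, namely area, perimeter, and Euler characteristic, can be computed directly from pixel counts. A minimal sketch (not the authors' implementation), treating each foreground pixel as a closed unit square:

```python
import numpy as np

def minkowski_2d(img):
    """Area, perimeter, and Euler characteristic of a binary image."""
    img = img.astype(bool)
    F = int(img.sum())                               # faces = pixels (area)
    # Perimeter: foreground edges not shared with a foreground neighbor.
    padded = np.pad(img, 1)
    perim = sum(
        int((padded & ~np.roll(padded, shift, axis)).sum())
        for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]
    )
    # Vertices and edges of the union of unit squares, for Euler number V - E + F.
    verts, edges = set(), set()
    for r, c in zip(*np.nonzero(img)):
        verts.update([(r, c), (r, c + 1), (r + 1, c), (r + 1, c + 1)])
        edges.update([((r, c), (r, c + 1)), ((r, c), (r + 1, c)),
                      ((r + 1, c), (r + 1, c + 1)), ((r, c + 1), (r + 1, c + 1))])
    euler = len(verts) - len(edges) + F
    return F, perim, euler

# A 5x5 square with a one-pixel hole: area 24, perimeter 24, Euler characteristic 0.
img = np.ones((5, 5), dtype=int)
img[2, 2] = 0
print(minkowski_2d(img))
```

    Evaluating these functionals over a sweep of intensity thresholds yields heterogeneity signatures of the kind used to separate treated from untreated tumors.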

  17. Sensitivity analysis and approximation methods for general eigenvalue problems

    NASA Technical Reports Server (NTRS)

    Murthy, D. V.; Haftka, R. T.

    1986-01-01

    Optimization of dynamic systems involving complex non-hermitian matrices is often computationally expensive. Major contributors to the computational expense are the sensitivity analysis and the reanalysis of a modified design. The present work seeks to alleviate this computational burden by identifying efficient sensitivity analysis and approximate reanalysis methods. For the algebraic eigenvalue problem involving non-hermitian matrices, algorithms for sensitivity analysis and approximate reanalysis are classified, compared and evaluated for efficiency and accuracy. Proper eigenvector normalization is discussed. An improved method for calculating derivatives of eigenvectors is proposed, based on a more rational normalization condition and taking advantage of matrix sparsity. Important numerical aspects of this method are also discussed. To alleviate the cost of reanalysis, various approximation methods for eigenvalues are proposed and evaluated. Linear and quadratic approximations are based directly on the Taylor series. Several approximation methods are developed based on the generalized Rayleigh quotient for the eigenvalue problem. Approximation methods based on the trace theorem give high accuracy without needing any derivatives. Operation counts for the computation of the approximations are given. General recommendations are made for the selection of an appropriate approximation technique as a function of the matrix size, the number of design variables, the number of eigenvalues of interest and the number of design points at which an approximation is sought.
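
    The standard first-order eigenvalue sensitivity for non-hermitian matrices, dλ/dp = yᴴ(∂A/∂p)x / (yᴴx), with x and y the right and left eigenvectors, can be verified against finite differences. The matrices below are arbitrary illustrative examples with distinct eigenvalues, not from the paper:

```python
import numpy as np

def eig_derivatives(A, dA):
    """dlambda/dp for every eigenvalue of a (generally non-hermitian) A,
    given dA = dA/dp, via dlambda/dp = y^H dA x / (y^H x)."""
    lam, X = np.linalg.eig(A)
    mu, Y = np.linalg.eig(A.conj().T)        # columns of Y: left eigenvectors of A
    # Pair lam[i] with the left eigenvector whose conjugate eigenvalue matches it
    # (valid when eigenvalues are distinct).
    pair = [int(np.argmin(np.abs(np.conj(mu) - l))) for l in lam]
    Y = Y[:, pair]
    num = np.einsum("ji,jk,ki->i", Y.conj(), dA, X)   # y_i^H dA x_i
    den = np.einsum("ji,ji->i", Y.conj(), X)          # y_i^H x_i
    return num / den

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 5.0]])
dA = np.array([[0.5, 0.0, 1.0],
               [0.0, -1.0, 0.0],
               [2.0, 0.0, 0.3]])

analytic = eig_derivatives(A, dA)

# Finite-difference check (perturbed eigenvalues matched by proximity).
h = 1e-6
lam0 = np.linalg.eigvals(A)
lam1 = np.linalg.eigvals(A + h * dA)
fd = np.array([(lam1[np.argmin(np.abs(lam1 - l))] - l) / h for l in lam0])
print(np.max(np.abs(analytic - fd)))
```

    A useful sanity check is that the derivatives must sum to tr(∂A/∂p), since the trace equals the sum of the eigenvalues.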

  18. Quantification of ferritin bound iron in human serum using species-specific isotope dilution mass spectrometry.

    PubMed

    Ren, Yao; Walczyk, Thomas

    2014-09-01

    Ferritin is a hollow-sphere protein composed of 24 subunits that can store up to 4500 iron atoms in its inner cavity. It is mainly found in the liver and spleen but is also present in serum at trace levels. Serum ferritin is considered the best single indicator for assessing body iron stores apart from liver or bone marrow biopsy. However, it is confounded by other disease conditions. Ferritin bound iron (FBI) and ferritin saturation have been suggested as more robust biomarkers. Current techniques for FBI determination are limited by low antibody specificity, low instrument sensitivity and possible analyte losses during sample preparation. The need for a highly sensitive and reliable method is widely recognized. Here we describe a novel technique to detect serum FBI using species-specific isotope dilution mass spectrometry (SS-IDMS). [(57)Fe]-ferritin was produced by biosynthesis and in vitro labeling with the (57)Fe spike in the form of [(57)Fe]-citrate after cell lysis and heat treatment. [(57)Fe]-ferritin for sample spiking was further purified by fast protein liquid chromatography. Serum ferritin and added [(57)Fe]-ferritin were separated from other iron species by ultrafiltration, followed by isotopic analysis of FBI using negative thermal ionization mass spectrometry. The repeatability of our assay is 8%, with an absolute detection limit of 18 ng FBI in the sample. Compared to other speciation techniques, SS-IDMS offers maximum control over sample losses and species conversion during analysis. The described technique may therefore serve as a reference technique for clinical applications of FBI as a new biomarker for assessing body iron status.
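
    The core isotope-dilution calculation, recovering the amount of analyte from the measured isotope ratio of a spiked blend, can be sketched as follows. The spike composition and amounts are illustrative values, not the paper's data; only the natural Fe abundances are real:

```python
# Single isotope dilution for Fe using the 57Fe/56Fe ratio.
nat_56, nat_57 = 0.91754, 0.02119          # natural isotopic abundances of Fe
spk_56, spk_57 = 0.030, 0.950              # hypothetical 57Fe-enriched spike

def idms_amount(n_spike, r_mix):
    """Moles of Fe in the sample from the measured 57/56 ratio of the blend.

    n_spike: total moles of Fe added as spike
    r_mix:   measured 57Fe/56Fe ratio of the sample + spike mixture
    """
    r_sample = nat_57 / nat_56              # natural 57/56 ratio of the sample
    s56, s57 = n_spike * spk_56, n_spike * spk_57
    # Solve (x57 + s57) / (x56 + s56) = r_mix with x57 = r_sample * x56:
    x56 = (s57 - r_mix * s56) / (r_mix - r_sample)
    return x56 / nat_56                     # scale 56Fe back to total Fe

# Round-trip check: blend a known sample amount with spike, then recompute it.
n_sample_true = 2.0e-9                      # 2 nmol Fe, hypothetical
n_spike = 1.0e-9
x56 = n_sample_true * nat_56
x57 = n_sample_true * nat_57
r_mix = (x57 + n_spike * spk_57) / (x56 + n_spike * spk_56)
print(idms_amount(n_spike, r_mix))          # should recover ~2.0e-9
```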

  19. Mild extraction methods using aqueous glucose solution for the analysis of natural dyes in textile artefacts dyed with Dyer's madder (Rubia tinctorum L.).

    PubMed

    Ford, Lauren; Henderson, Robert L; Rayner, Christopher M; Blackburn, Richard S

    2017-03-03

    Madder (Rubia tinctorum L.) has been widely used as a red dye throughout history. Acid-sensitive colorants present in madder, such as the glycosides (lucidin primeveroside, ruberythric acid, galiosin) and sensitive aglycons (lucidin), are degraded in the textile back extraction process; in previous literature these sensitive molecules are either absent or present in only low concentrations due to the use of acid in typical textile back extraction processes. The anthraquinone aglycons alizarin and purpurin are usually identified in analysis following harsh back extraction methods, such as those using solvent mixtures with concentrated hydrochloric acid at high temperatures. Softer extraction techniques potentially allow the dye components present in madder to be extracted without degradation, which can provide more information about the original dye profile; this profile varies significantly between madder varieties, species and dyeing techniques. Herein, a softer extraction method involving aqueous glucose solution was developed and compared to other back extraction techniques on wool dyed with root extract from different varieties of Rubia tinctorum. The efficiencies of the extraction methods were analysed by HPLC coupled with diode array detection. Acidic literature methods were evaluated, and they generally caused hydrolysis and degradation of the dye components, with alizarin, lucidin, and purpurin being the main compounds extracted. In contrast, extraction in aqueous glucose solution provides a highly effective method for the extraction of madder dyed wool and is shown to efficiently extract lucidin primeveroside and ruberythric acid without causing hydrolysis, while also extracting the aglycons that are present due to hydrolysis during processing of the plant material. Glucose solution is a favourable extraction medium due to its ability to form extensive hydrogen bonds with the glycosides present in madder and displace them from the fibre.
This new glucose method offers an efficient process that preserves these sensitive molecules and is a step-change in analysis of madder dyed textiles as it can provide further information about historical dye preparation and dyeing processes that current methods cannot. The method also efficiently extracts glycosides in artificially aged samples, making it applicable for museum textile artefacts. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Measurement of tissue optical properties with optical coherence tomography: Implication for noninvasive blood glucose concentration monitoring

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.

    Approximately 14 million people in the USA and more than 140 million people worldwide suffer from diabetes mellitus. The current glucose sensing technique involves a finger puncture several times a day to obtain a droplet of blood for analysis. There have been enormous efforts by many scientific groups and companies to quantify glucose concentration noninvasively using different optical techniques. However, these techniques face limitations associated with low sensitivity, accuracy, and insufficient specificity of glucose concentration measurements over the physiological range. Optical coherence tomography (OCT) is a new technology being applied for noninvasive imaging in tissues with high resolution. OCT utilizes sensitive detection of photons coherently scattered from tissue. The high resolution of this technique allows for exceptionally accurate measurement of tissue scattering from a specific layer of skin compared with other optical techniques and, therefore, may provide noninvasive and continuous monitoring of blood glucose concentration with high accuracy. In this dissertation work I experimentally and theoretically investigate the feasibility of noninvasive, real-time, sensitive, and specific monitoring of blood glucose concentration using an OCT-based biosensor. The studies were performed in scattering media with stable optical properties (aqueous suspensions of polystyrene microspheres and milk), animals (New Zealand white rabbits and Yucatan micropigs), and normal subjects (during oral glucose tolerance tests).
    The results of these studies demonstrated: (1) the capability of the OCT technique to detect changes in the scattering coefficient with an accuracy of about 1.5%; (2) a sharp and linear decrease of the OCT signal slope in the dermis with increasing blood glucose concentration; (3) that the change in the OCT signal slope measured during bolus glucose injection experiments (characterized by a sharp increase of blood glucose concentration) is higher than that measured in glucose clamping experiments (characterized by a slow, controlled increase of blood glucose concentration); and (4) that the accuracy of glucose concentration monitoring may be substantially improved if optimal dimensions of the probed skin area are used. The results suggest that the high-resolution OCT technique has potential for noninvasive, accurate, and continuous glucose monitoring with high sensitivity.

  1. Integrated analytical techniques with high sensitivity for studying brain translocation and potential impairment induced by intranasally instilled copper nanoparticles.

    PubMed

    Bai, Ru; Zhang, Lili; Liu, Ying; Li, Bai; Wang, Liming; Wang, Peng; Autrup, Herman; Beer, Christiane; Chen, Chunying

    2014-04-07

    Health impacts of inhalation exposure to engineered nanomaterials have attracted increasing attention. In this paper, integrated analytical techniques with high sensitivity were used to study the brain translocation and potential impairment induced by intranasally instilled copper nanoparticles (CuNPs). Mice were exposed to CuNPs at three doses (1, 10, 40 mg/kg bw). The body weight of mice decreased significantly in the 10 and 40 mg/kg groups (p<0.05) but recovered slightly within the exposure duration. Inductively coupled plasma mass spectrometry (ICP-MS) analysis showed that CuNPs could enter the brain. Altered distribution of some important metal elements was observed by synchrotron radiation X-ray fluorescence (SRXRF). H&E staining and immunohistochemical analysis showed that CuNPs damaged nerve cells and that astrocytes might be one of the potential targets of CuNPs. The changes in neurotransmitter levels in different brain regions demonstrate that dysfunction occurred in the exposed groups. These data indicate that CuNPs can enter the brain after nasal inhalation and induce damage to the central nervous system (CNS). Integration of effective analytical techniques for systematic investigations is a promising direction to better understand the biological activities of nanomaterials. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. Contrast-enhanced magnetic resonance imaging of pulmonary lesions: description of a technique aiming clinical practice.

    PubMed

    Koenigkam-Santos, Marcel; Optazaite, Elzbieta; Sommer, Gregor; Safi, Seyer; Heussel, Claus Peter; Kauczor, Hans-Ulrich; Puderbach, Michael

    2015-01-01

    To propose a technique for the evaluation of pulmonary lesions using contrast-enhanced MRI, to assess morphological patterns of enhancement, and to correlate quantitative analysis with histopathology. Thirty-six patients were prospectively studied. Volumetric interpolated T1W images were obtained during consecutive breath holds after bolus-triggered contrast injection. Volume coverage of the first three acquisitions was limited (higher temporal resolution) and the last acquisition was obtained at the 4th minute. Two radiologists individually evaluated the patterns of enhancement. Region-of-interest-based signal intensity (SI)-time curves were created to assess quantitative parameters. Readers agreed moderately to substantially concerning the lesions' enhancement pattern. SI-time curves could be created for all lesions. In comparison to benign lesions, malignant lesions showed higher values of maximum enhancement, early peak, slope and 4th-minute enhancement. An early peak >15% showed 100% sensitivity for detecting malignancy; maximum enhancement >40% showed 100% specificity. The proposed technique is robust, simple to perform and can be applied in a clinical scenario. It allows visual evaluation of enhancement pattern/progression together with the creation of SI-time curves and assessment of derived quantitative parameters. Perfusion analysis was highly sensitive for detecting malignancy, in accordance with what is recommended by the most recent guidelines on imaging evaluation of pulmonary lesions. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
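
    The quantitative curve parameters named above (maximum enhancement, early peak, slope, 4th-minute enhancement) can be derived directly from an ROI signal-intensity-time curve; the sample times and values below are made up for illustration:

```python
import numpy as np

# Hypothetical SI-time curve for one ROI.
t = np.array([0, 15, 30, 45, 240.0])        # seconds after contrast arrival
si = np.array([100, 128, 141, 146, 152.0])  # mean ROI signal intensity

baseline = si[0]
enh = 100.0 * (si - baseline) / baseline    # percent enhancement over baseline
max_enh = enh.max()                         # maximum enhancement
early_peak = enh[t <= 60].max()             # peak within the early acquisitions
slope = np.max(np.diff(enh) / np.diff(t))   # steepest wash-in, %/s
enh_4min = enh[-1]                          # 4th-minute enhancement
print(max_enh, early_peak, slope, enh_4min)
```

    With the >15% early-peak threshold quoted above, this hypothetical curve would be classified as suspicious for malignancy.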

  3. Poisson and negative binomial item count techniques for surveys with sensitive question.

    PubMed

    Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin

    2017-04-01

    Although the item count technique is useful in surveys with sensitive questions, the privacy of those respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily, in the sense that they yield accurate parameter estimates and confidence intervals.
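
    The basic item count estimator can be illustrated by simulation: a control group reports only a count of innocuous items (here Poisson, following the proposed design), while the treatment group adds 1 if the respondent carries the sensitive trait. The design values below are hypothetical, and the variance estimate is a generic difference-of-means one, not the paper's closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, pi_true, n = 3.0, 0.20, 100_000        # hypothetical design values

control = rng.poisson(lam, n)               # innocuous Poisson count only
sensitive = rng.random(n) < pi_true         # true (hidden) sensitive trait
treatment = rng.poisson(lam, n) + sensitive # count + sensitive indicator

# Difference of group means estimates the sensitive proportion; no individual
# answer reveals whether a given respondent has the trait.
pi_hat = treatment.mean() - control.mean()
var_hat = (treatment.var(ddof=1) + control.var(ddof=1)) / n
print(pi_hat, var_hat ** 0.5)
```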

  4. Molecular diagnosis of bloodstream infections: planning to (physically) reach the bedside.

    PubMed

    Leggieri, N; Rida, A; François, P; Schrenzel, Jacques

    2010-08-01

    Faster identification of infecting microorganisms and treatment options is a first-ranking priority in the infectious disease area, in order to prevent inappropriate treatment and overuse of broad-spectrum antibiotics. Standard bacterial identification is intrinsically time-consuming, and very recently there has been a burst in the number of commercially available nonphenotype-based techniques and in the documentation of a possible clinical impact of these techniques. Matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) is now a standard diagnostic procedure on cultures and holds promise for spiked blood. Meanwhile, commercial PCR-based techniques have improved with the use of bacterial DNA enrichment methods and a diversity of amplicon analysis techniques (melting curve analysis, microarrays, gel electrophoresis, sequencing and analysis by mass spectrometry), leading to the ability to challenge bacterial culture as the gold standard by providing earlier diagnosis with better 'clinical' sensitivity and additional prognostic information. Laboratory practice has already changed with MALDI-TOF MS, but a change in clinical practice, driven by emergent nucleic acid-based techniques, will require the demonstration of real-life applicability as well as robust clinical-impact-oriented studies.

  5. Nondestructive surface analysis for material research using fiber optic vibrational spectroscopy

    NASA Astrophysics Data System (ADS)

    Afanasyeva, Natalia I.

    2001-11-01

    Advanced methods of fiber optic vibrational spectroscopy (FOVS) have been developed in conjunction with an interferometer and low-loss, flexible, nontoxic optical fibers, sensors, and probes. The combination of optical fibers and sensors with a Fourier transform (FT) spectrometer has been used in the range from 2.5 to 12 μm. This technique serves as an ideal diagnostic tool for surface analysis of numerous and diverse materials such as complex structured materials, fluids, coatings, implants, living cells, plants, and tissue. Such surfaces, as well as living tissue or plants, are very difficult to investigate in vivo by traditional FT infrared or Raman spectroscopy methods. The FOVS technique is nondestructive, noninvasive, fast (15 s) and capable of operating in a remote sampling regime (up to a fiber length of 3 m). Fourier transform infrared (FTIR) and Raman fiber optic spectroscopy operating with optical fibers have been suggested as a new powerful tool. These are highly sensitive techniques for structural studies in material research and for various applications during process analysis to determine molecular composition, chemical bonds, and molecular conformations. They could be developed into a new tool for quality control of numerous materials as well as for noninvasive biopsy.

  6. In situ mass analysis of particles by surface ionization mass spectrometry

    NASA Technical Reports Server (NTRS)

    Lassiter, W. S.; Moen, A. L.

    1974-01-01

    A qualitative study of the application of surface ionization and mass spectrometry to the in situ detection and constituent analysis of atmospheric particles was conducted. The technique consists of mass analysis of ions formed as a result of impingement of a stream of particles on a hot filament where, it is presumed, surface ionization takes place. Laboratory air particles containing K, Ca, and possibly hydrocarbons were detected. Other known particles such as Al2O3, Pb(NO3)2, and Cr2O3 were analyzed by detecting the respective metal atoms making up the particles. In some cases, mass numbers indicative of compounds making up the particles were detected showing surface ionization of particles sometimes leads to chemical analysis as well as to elemental analysis. Individual particles were detected, and it was shown that the technique is sensitive to Al2O3 particles with a mass of a few nanograms.

  7. The Design and Operation of Ultra-Sensitive and Tunable Radio-Frequency Interferometers.

    PubMed

    Cui, Yan; Wang, Pingshan

    2014-12-01

    Dielectric spectroscopy (DS) is an important technique for scientific and technological investigations in various areas. DS sensitivity and operating frequency range are critical for many applications, including lab-on-chip development, where sample volumes are small and there is a wide range of dynamic processes to probe. In this work, we present the design and operation considerations of radio-frequency (RF) interferometers that are based on power dividers (PDs) and quadrature hybrids (QHs). Such interferometers are proposed to address the sensitivity and frequency-tuning challenges of current DS techniques. Verified algorithms together with mathematical models are presented to quantify material properties from scattering parameters for three common transmission-line sensing structures, i.e., coplanar waveguides (CPWs), conductor-backed CPWs, and microstrip lines. A high-sensitivity and stable QH-based interferometer is demonstrated by measuring a glucose-water solution at a concentration level ten times lower than some recent RF sensors, while our sample volume is ~1 nL. Composition analysis of ternary mixture solutions is also demonstrated with a PD-based interferometer. Further work is needed to address issues such as system automation, model improvement at high frequencies, and interferometer scaling.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, J.M.; Callahan, C.A.; Cline, J.F.

    Bioassays were used in a three-phase research project to assess the comparative sensitivity of test organisms to known chemicals, to determine whether the chemical components in field soil and water samples containing unknown contaminants could be inferred from our laboratory studies using known chemicals, and to investigate kriging (a relatively new statistical mapping technique) and bioassays as methods to define the areal extent of chemical contamination. The algal assay generally was most sensitive to samples of pure chemicals, soil elutriates and water from eight sites with known chemical contamination. Bioassays of nine samples of unknown chemical composition from the Rocky Mountain Arsenal (RMA) site showed that a lettuce seed soil contact phytoassay was most sensitive. In general, our bioassays can be used to broadly identify toxic components of contaminated soil. Nearly pure compounds of insecticides and herbicides were less toxic in the sensitive bioassays than were the counterpart commercial formulations. This finding indicates that chemical analysis alone may fail to correctly rate the severity of environmental toxicity. Finally, we used the lettuce seed phytoassay and kriging techniques in a field study at RMA to demonstrate the feasibility of mapping contamination to aid in cleanup decisions. 25 references, 9 figures, 9 tables.
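
    Kriging interpolates a spatial field from scattered samples using a covariance model of spatial correlation, which is how point bioassay results can be turned into a contamination map. A minimal ordinary-kriging sketch; the covariance model (exponential), sill/range values, and sample data are illustrative, not the RMA study's:

```python
import numpy as np

def ordinary_kriging(X, z, X0, sill=1.0, corr_range=2.0):
    """Minimal ordinary kriging with an exponential covariance model.
    X: (n,2) sample locations, z: (n,) values, X0: (m,2) prediction points."""
    def cov(d):
        return sill * np.exp(-d / corr_range)
    n = len(X)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = cov(D)
    A[n, :n] = A[:n, n] = 1.0          # Lagrange row/column: weights sum to 1
    A[n, n] = 0.0
    d0 = np.linalg.norm(X[:, None, :] - X0[None, :, :], axis=-1)  # (n, m)
    b = np.vstack([cov(d0), np.ones((1, len(X0)))])
    w = np.linalg.solve(A, b)[:n]      # kriging weights, one column per point
    return w.T @ z

# Hypothetical contamination measurements at 5 sites.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
z = np.array([1.0, 2.0, 2.0, 3.0, 2.0])
pred = ordinary_kriging(X, z, np.array([[0.5, 0.5], [0.25, 0.25]]))
print(pred)
```

    Kriging is an exact interpolator, so the prediction at a sampled location reproduces the measured value there.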

  9. Efficient Analysis of Complex Structures

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.

    2000-01-01

    The various accomplishments achieved during this project are: (1) A survey of Neural Network (NN) applications using the MATLAB NN Toolbox in structural engineering, especially for equivalent continuum models (Appendix A). (2) Application of NNs and GAs to simulate and synthesize substructures: 1-D and 2-D beam problems (Appendix B). (3) Development of an equivalent plate-model analysis method (EPA) for static and vibration analysis of general trapezoidal built-up wing structures composed of skins, spars and ribs; calculation of a variety of test cases and comparison with measurements or FEA results (Appendix C). (4) Basic work on using second-order sensitivities to simulate wing modal response, discussion of sensitivity evaluation approaches, and some results (Appendix D). (5) Establishing a general methodology for simulating the modal responses by direct application of NNs and by sensitivity techniques, in a design space composed of a number of design points; comparison is made through examples using these two methods (Appendix E). (6) Establishing a general methodology for efficient analysis of complex wing structures by indirect application of NNs: the NN-aided Equivalent Plate Analysis; training of the Neural Networks for this purpose in several cases of design spaces, which can be applicable for actual design of complex wings (Appendix F).

  10. Rapid analysis of controlled substances using desorption electrospray ionization mass spectrometry.

    PubMed

    Rodriguez-Cruz, Sandra E

    2006-01-01

    The recently developed technique of desorption electrospray ionization (DESI) has been applied to the rapid analysis of controlled substances. Experiments have been performed using a commercial ThermoFinnigan LCQ Advantage MAX ion-trap mass spectrometer with limited modifications. Results from the ambient sampling of licit and illicit tablets demonstrate the ability of the DESI technique to detect the main active ingredient(s) or controlled substance(s), even in the presence of other higher-concentration components. Full-scan mass spectrometry data provide preliminary identification by molecular weight determination, while rapid analysis using the tandem mass spectrometry (MS/MS) mode provides fragmentation data which, when compared to the laboratory-generated ESI-MS/MS spectral library, provide structural information and final identification of the active ingredient(s). The consecutive analysis of tablets containing different active components indicates there is no cross-contamination or interference from tablet to tablet, demonstrating the reliability of the DESI technique for rapid sampling (one tablet/min or better). Active ingredients have been detected for tablets in which the active component represents less than 1% of the total tablet weight, demonstrating the sensitivity of the technique. The real-time sampling of cannabis plant material is also presented.

  11. A diagnostic analysis of the VVP single-doppler retrieval technique

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis J.

    1995-01-01

    A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n-space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
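    The variance inflation described above is a standard regression diagnostic; for two basis functions it reduces to VIF = 1/(1 - r²). The sketch below (with hypothetical azimuth sectors, not the paper's radar data) shows why narrow azimuthal sectors inflate variance: sine and cosine basis functions become nearly collinear over a small sector but are orthogonal over a full sweep.

```python
# Variance-inflation-factor sketch for diagnosing collinearity between
# two regression basis functions. Azimuth sectors are hypothetical.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def vif_two_predictors(x, y):
    """VIF = 1 / (1 - r^2); large values signal variance inflation."""
    r = pearson_r(x, y)
    return 1.0 / (1.0 - r * r)

# sin(az) and cos(az) over a narrow 40-60 degree sector vs a full sweep.
narrow = [math.radians(a) for a in range(40, 61, 2)]
full = [math.radians(a) for a in range(0, 360, 10)]
vif_narrow = vif_two_predictors([math.sin(a) for a in narrow],
                                [math.cos(a) for a in narrow])
vif_full = vif_two_predictors([math.sin(a) for a in full],
                              [math.cos(a) for a in full])
```

    Over the full circle the two basis functions are orthogonal (VIF ≈ 1), while over the narrow sector the VIF is orders of magnitude larger, mirroring the paper's conclusion that small azimuthal sectors defeat the retrieval.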

  12. Analytical Modelling of the Effects of Different Gas Turbine Cooling Techniques on Engine Performance =

    NASA Astrophysics Data System (ADS)

    Uysal, Selcuk Can

    In this research, MATLAB Simulink™ was used to develop a cooled engine model for industrial gas turbines and aero-engines. The model consists of uncooled on-design, mean-line turbomachinery design and a cooled off-design analysis in order to evaluate the engine performance parameters by using operating conditions, polytropic efficiencies, material information and cooling system details. The cooling analysis algorithm involves a second-law analysis to calculate losses from the cooling technique applied. The model is used in a sensitivity analysis that evaluates the impacts of variations in metal Biot number, thermal barrier coating Biot number, film cooling effectiveness, internal cooling effectiveness and maximum allowable blade temperature on main engine performance parameters of aero and industrial gas turbine engines. The model is subsequently used to analyze the relative performance impact of employing Anti-Vortex Film Cooling holes (AVH) by means of data obtained for these holes by Detached Eddy Simulation-CFD techniques that are valid for engine-like turbulence intensity conditions. Cooled blade configurations with AVH and other different external cooling techniques were used in a performance comparison study. (Abstract shortened by ProQuest.).

  13. Flow injection gas chromatography with sulfur chemiluminescence detection for the analysis of total sulfur in complex hydrocarbon matrixes.

    PubMed

    Hua, Yujuan; Hawryluk, Myron; Gras, Ronda; Shearer, Randall; Luong, Jim

    2018-01-01

    A fast and reliable analytical technique for the determination of total sulfur levels in complex hydrocarbon matrices is introduced. The method employed flow injection technique using a gas chromatograph as a sample introduction device and a gas phase dual-plasma sulfur chemiluminescence detector for sulfur quantification. Using the technique described, total sulfur measurement in challenging hydrocarbon matrices can be achieved in less than 10 s with sample-to-sample time <2 min. The high degree of selectivity and sensitivity toward sulfur compounds of the detector offers the ability to measure low sulfur levels with a detection limit in the range of 20 ppb w/w S. The equimolar response characteristic of the detector allows the quantitation of unknown sulfur compounds and simplifies the calibration process. Response is linear over a concentration range of five orders of magnitude, with a high degree of repeatability. The detector's lack of response to hydrocarbons enables direct analysis without the need for time-consuming sample preparation and chromatographic separation processes. This flow injection-based sulfur chemiluminescence detection technique is ideal for fast analysis or trace sulfur analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Effectiveness of Myocardial Contrast Echocardiography Quantitative Analysis during Adenosine Stress versus Visual Analysis before Percutaneous Therapy in Acute Coronary Pain: A Coronary Artery TIMI Grading Comparing Study

    PubMed Central

    Yang, Lixia; Mu, Yuming; Quaglia, Luiz Augusto; Tang, Qi; Guan, Lina; Wang, Chunmei; Shih, Ming Chi

    2012-01-01

    The study aim was to compare two different stress echocardiography interpretation techniques based on their correlation with thrombolysis in myocardial infarction (TIMI) flow grading from acute coronary syndrome (ACS) patients. Forty-one patients with suspected ACS were studied before diagnostic coronary angiography with myocardial contrast echocardiography (MCE) at rest and at stress. The correlation between visual interpretation of MCE and TIMI flow grade was significant, as was the correlation between quantitative analysis (myocardial perfusion parameters: A, β, and A × β) and TIMI flow grade. MCE visual interpretation and TIMI flow grade had a high degree of agreement in diagnosing myocardial perfusion abnormality. If one considers TIMI flow grade <3 as abnormal, MCE visual interpretation at rest had 73.1% accuracy with 58.2% sensitivity and 84.2% specificity, and at stress had 80.4% accuracy with 76.6% sensitivity and 83.3% specificity. The MCE quantitative analysis had better accuracy, with 100% agreement across the different levels of TIMI flow grading. MCE quantitative analysis at stress showed a direct correlation with TIMI flow grade, more significant than the visual interpretation technique. Further studies could measure the clinical relevance of this more objective approach to managing acute coronary syndrome patients before percutaneous coronary intervention (PCI). PMID:22778555

  15. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k{sub eff}, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for the gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
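    At first order, propagating cross-section covariances through sensitivity coefficients as described above is the "sandwich rule", σ²_R = S C Sᵀ. A minimal sketch with hypothetical sensitivities and covariances (not evaluated nuclear data):

```python
# "Sandwich rule" sketch: propagate a cross-section covariance matrix C
# through sensitivity coefficients S to a relative variance in a response
# such as k-eff. All numbers are hypothetical illustrations.

def sandwich(S, C):
    """Return S C S^T for a single response (S is a row vector)."""
    n = len(S)
    return sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))

# Relative sensitivities of the response to two cross sections
# (d k / k per d sigma / sigma), hypothetical values.
S = [0.30, -0.12]

# Relative covariance matrix: 5% and 8% standard deviations with a
# correlation of 0.2 between the two cross sections (hypothetical).
sd = [0.05, 0.08]
rho = 0.2
C = [[sd[0] ** 2, rho * sd[0] * sd[1]],
     [rho * sd[0] * sd[1], sd[1] ** 2]]

rel_var = sandwich(S, C)
rel_unc = rel_var ** 0.5     # relative standard deviation of the response
```

    The same S C Sᵀ form, evaluated between an application and a benchmark experiment, is what yields the correlation coefficients used for trending analysis.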

  16. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Undisturbed conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.

    2000-05-19

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment for the Waste Isolation Pilot Plant are presented for two-phase flow in the vicinity of the repository under undisturbed conditions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformation are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure is potentially the most important due to its influence on spallings and direct brine releases, with the uncertainty in its value being dominated by the extent to which the microbial degradation of cellulose takes place, the rate at which the corrosion of steel takes place, and the amount of brine that drains from the surrounding disturbed rock zone into the repository.
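    The Latin hypercube sampling and rank-transformation workflow named above can be sketched in a few lines. The toy two-parameter response below is a hypothetical stand-in for the repository-pressure model, not the WIPP code:

```python
# Latin hypercube sampling plus Spearman rank correlation, the
# sampling-based sensitivity workflow named in the abstract.
# The response model and parameter ranges are hypothetical.
import random

def latin_hypercube(n, bounds, seed=0):
    """One stratified sample per interval per parameter, randomly paired."""
    rng = random.Random(seed)
    cols = []
    for lo, hi in bounds:
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        cols.append([lo + u * (hi - lo) for u in strata])
    return list(zip(*cols))          # n samples, one per row

def ranks(x):
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0] * len(x)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(x, y):
    """Rank (Spearman) correlation, assuming no ties."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    m = (n - 1) / 2.0
    num = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    den = sum((a - m) ** 2 for a in rx)   # rank variance, same for rx and ry
    return num / den

# Toy response: pressure rises with gas generation, falls with brine
# outflow (coefficients are hypothetical).
samples = latin_hypercube(50, [(0.0, 1.0), (0.0, 1.0)])
pressure = [2.0 * g - 0.5 * b for g, b in samples]
rcc_gas = spearman([s[0] for s in samples], pressure)
rcc_brine = spearman([s[1] for s in samples], pressure)
```

    The rank correlation coefficients recover the qualitative picture: the gas-generation parameter dominates the toy pressure response, just as microbial gas generation dominates the uncertainty in the assessment.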

  17. Measurements of 55Fe activity in activated steel samples with GEMPix

    NASA Astrophysics Data System (ADS)

    Curioni, A.; Dinar, N.; La Torre, F. P.; Leidner, J.; Murtas, F.; Puddu, S.; Silari, M.

    2017-03-01

    In this paper we present a novel method, based on the recently developed GEMPix detector, to measure the 55Fe content in samples of metallic material activated during operation of CERN accelerators and experimental facilities. The GEMPix, a gas detector with highly pixelated read-out, has been obtained by coupling a triple Gas Electron Multiplier (GEM) to a quad Timepix ASIC. Sample preparation, measurements performed on 45 samples and data analysis are described. The calibration factor (counts per second per unit specific activity) has been obtained via measurements of the 55Fe activity determined by radiochemical analysis of the same samples. Detection limit and sensitivity to the current Swiss exemption limit are calculated. Comparison with radiochemical analysis shows an inconsistency in sensitivity for only two samples, most likely due to underestimated uncertainties of the GEMPix analysis. An operative test phase of this technique is already planned at CERN.

  18. Method of confidence domains in the analysis of noise-induced extinction for tritrophic population system

    NASA Astrophysics Data System (ADS)

    Bashkirtseva, Irina; Ryashko, Lev; Ryazanova, Tatyana

    2017-09-01

    A problem of the analysis of the noise-induced extinction in multidimensional population systems is considered. For the investigation of conditions of the extinction caused by random disturbances, a new approach based on the stochastic sensitivity function technique and confidence domains is suggested, and applied to a tritrophic population model of interacting prey, predator and top predator. This approach allows us to analyze constructively the probabilistic mechanisms of the transition to the noise-induced extinction from both equilibrium and oscillatory regimes of coexistence. In this analysis, a method of principal directions for reducing the dimension of confidence domains is suggested. In the dispersion of random states, the principal subspace is defined by the ratio of eigenvalues of the stochastic sensitivity matrix. A detailed analysis of two scenarios of the noise-induced extinction as a function of the parameters of the considered tritrophic system is carried out.

  19. Quantification of rapid environmental redox processes with quick-scanning x-ray absorption spectroscopy (Q-XAS)

    PubMed Central

    Ginder-Vogel, Matthew; Landrot, Gautier; Fischel, Jason S.; Sparks, Donald L.

    2009-01-01

    Quantification of the initial rates of environmental reactions at the mineral/water interface is a fundamental prerequisite to determining reaction mechanisms and contaminant transport modeling and predicting environmental risk. Until recently, experimental techniques with adequate time resolution and elemental sensitivity to measure initial rates of the wide variety of environmental reactions were quite limited. Techniques such as electron paramagnetic resonance and Fourier transform infrared spectroscopies suffer from limited elemental specificity and poor sensitivity to inorganic elements, respectively. Ex situ analysis of batch and stirred-flow systems provides high elemental sensitivity; however, their time resolution is inadequate to characterize rapid environmental reactions. Here we apply quick-scanning x-ray absorption spectroscopy (Q-XAS), at sub-second time-scales, to measure the initial oxidation rate of As(III) to As(V) by hydrous manganese(IV) oxide. Using Q-XAS, As(III) and As(V) concentrations were determined every 0.98 s in batch reactions. The initial apparent As(III) depletion rate constants (t < 30 s) measured with Q-XAS are nearly twice as large as rate constants measured with traditional analytical techniques. Our results demonstrate the importance of developing analytical techniques capable of analyzing environmental reactions on the same time scale as they occur. Given the high sensitivity, elemental specificity, and time resolution of Q-XAS, it has many potential applications. They could include measuring not only redox reactions but also dissolution/precipitation reactions, such as the formation and/or reductive dissolution of Fe(III) (hydr)oxides, solid-phase transformations (i.e., formation of layered-double hydroxide minerals), or almost any other reaction occurring in aqueous media that can be measured using x-ray absorption spectroscopy. PMID:19805269
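    Extracting an apparent first-order depletion rate constant from a concentration time series, as done for As(III) above, amounts to a log-linear least-squares fit of C(t) = C₀ e^(−kt). A sketch with synthetic data (the rate constant and concentrations below are illustrative, not the paper's measurements):

```python
# Log-linear least-squares fit of an apparent first-order rate constant
# k from C(t) = C0 * exp(-k t). The data are synthetic illustrations.
import math

def first_order_k(times, conc):
    """Return k as minus the slope of ln(C) versus t (least squares)."""
    y = [math.log(c) for c in conc]
    n = len(times)
    mt, my = sum(times) / n, sum(y) / n
    slope = (sum((t - mt) * (v - my) for t, v in zip(times, y))
             / sum((t - mt) ** 2 for t in times))
    return -slope

# Synthetic As(III) trace sampled every 0.98 s (the Q-XAS cadence in the
# abstract) with a hypothetical k = 0.05 1/s.
k_true = 0.05
times = [0.98 * i for i in range(30)]
conc = [100.0 * math.exp(-k_true * t) for t in times]
k_fit = first_order_k(times, conc)
```

    The paper's point is that the fitted k depends strongly on how early the time window starts, which is why sub-second sampling nearly doubled the apparent initial rate constants relative to slower techniques.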

  20. Sensitivity of high-frequency Rayleigh-wave data revisited

    USGS Publications Warehouse

    Xia, J.; Miller, R.D.; Ivanov, J.

    2007-01-01

    Rayleigh-wave phase velocity of a layered earth model is a function of frequency and four groups of earth properties: P-wave velocity, S-wave velocity (Vs), density, and thickness of layers. Analysis of the Jacobian matrix (or the difference method) provides a measure of dispersion curve sensitivity to earth properties. Vs is the dominant influence for the fundamental mode (Xia et al., 1999) and higher modes (Xia et al., 2003) of dispersion curves in a high frequency range (>2 Hz), followed by layer thickness. These characteristics are the foundation of determining S-wave velocities by inversion of Rayleigh-wave data. A growing number of surface-wave applications show that an anomalous velocity layer, such as a high-velocity layer (HVL) or a low-velocity layer (LVL), commonly exists in near-surface materials. Spatial location (depth) of an anomalous layer is usually the most important information that surface-wave techniques are asked to provide. Understanding and correctly defining the sensitivity of high-frequency Rayleigh-wave data due to depth of an anomalous velocity layer are crucial in applying surface-wave techniques to obtain a Vs profile and/or determine the depth of an anomalous layer. Because depth is not a direct earth property of a layered model, changes in depth will result in changes in other properties. Modeling results show that sensitivity at a given depth calculated by the difference method is dependent on the Vs difference (contrast) between an anomalous layer and surrounding layers. The larger the contrast is, the higher the sensitivity due to depth of the layer. Therefore, the Vs contrast is a dominant contributor to sensitivity of Rayleigh-wave data due to depth of an anomalous layer. Modeling results also suggest that the most sensitive depth for an HVL is at about the middle of the depth to the half-space, but for an LVL it is near the ground surface. © 2007 Society of Exploration Geophysicists.
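    The "difference method" referred to above is a finite-difference estimate of the derivative of the model output with respect to an earth property. The sketch below shows the central-difference form against a hypothetical stand-in for phase velocity; it is not a real dispersion solver, and the coefficients are invented for illustration.

```python
# Central-difference sensitivity of a model output to one parameter,
# the "difference method" of the abstract. The phase-velocity function
# is a hypothetical linear stand-in, not a dispersion-curve solver.

def sensitivity(model, params, key, rel_step=0.01):
    """d(model)/d(params[key]) by central finite difference."""
    h = params[key] * rel_step
    up, dn = dict(params), dict(params)
    up[key] += h
    dn[key] -= h
    return (model(up) - model(dn)) / (2.0 * h)

def toy_phase_velocity(p):
    # Stand-in: dominated by S-wave velocity, weakly dependent on the
    # P-wave velocity and density, mirroring the abstract's ranking.
    return 0.92 * p["vs"] + 0.05 * p["vp"] - 0.01 * p["rho"]

params = {"vs": 200.0, "vp": 600.0, "rho": 1800.0}   # m/s, m/s, kg/m^3
s_vs = sensitivity(toy_phase_velocity, params, "vs")
s_rho = sensitivity(toy_phase_velocity, params, "rho")
```

    With a real dispersion solver in place of the stand-in, the same routine evaluated over depth produces the sensitivity-versus-depth curves the paper analyzes.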

  1. Femoral graft-tunnel angles in posterior cruciate ligament reconstruction: analysis with 3-dimensional models and cadaveric experiments.

    PubMed

    Kim, Sung-Jae; Chun, Yong-Min; Kim, Sung-Hwan; Moon, Hong-Kyo; Jang, Jae-Won

    2013-07-01

    The purpose of this study was to compare four graft-tunnel angles (GTA), the femoral GTA formed by three different femoral tunneling techniques (the outside-in, a modified inside-out technique in the posterior sag position with knee hyperflexion, and the conventional inside-out technique) and the tibial GTA in 3-dimensional (3D) knee flexion models, as well as to examine the influence of femoral tunneling techniques on the contact pressure between the intra-articular aperture of the femoral tunnel and the graft. Twelve cadaveric knees were tested. Computed tomography scans were performed at different knee flexion angles (0°, 45°, 90°, and 120°). Femoral and tibial GTAs were measured at different knee flexion angles on the 3D knee models. Using pressure-sensitive films, stress on the graft due to the angulation of the femoral tunnel aperture was measured in posterior cruciate ligament reconstructed cadaveric knees. Between 45° and 120° of knee flexion, there were no significant differences between the outside-in and modified inside-out techniques. However, the femoral GTA for the conventional inside-out technique was significantly less than that for the other two techniques (p<0.001). In cadaveric experiments using pressure-sensitive film, the maximum contact pressure for the modified inside-out and outside-in techniques was significantly lower than that for the conventional inside-out technique (p=0.024 and p=0.017). The conventional inside-out technique results in a significantly smaller GTA and higher stress at the intra-articular aperture of the femoral tunnel than the outside-in technique. However, the results for the modified inside-out technique are similar to those for the outside-in technique.

  2. Sensitive microplate assay for the detection of proteolytic enzymes using radiolabeled gelatin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, B.D.; Kwan-Lim, G.E.; Maizels, R.M.

    1988-07-01

    A sensitive, microplate assay is described for the detection of a wide range of proteolytic enzymes, using radio-iodine-labeled gelatin as substrate. The technique uses the Bolton-Hunter reagent to label the substrate, which is then coated onto the wells of polyvinyl chloride microtiter plates. By measuring the radioactivity released the assay is able to detect elastase, trypsin, and collagenase in concentrations of 1 ng/ml or less, while the microtiter format permits multiple sample handling and minimizes sample volumes required for analysis.

  3. Critical comparison of diffuse reflectance spectroscopy and colorimetry as dermatological diagnostic tools for acanthosis nigricans: a chemometric approach.

    PubMed

    Devpura, Suneetha; Pattamadilok, Bensachee; Syed, Zain U; Vemulapalli, Pranita; Henderson, Marsha; Rehse, Steven J; Hamzavi, Iltefat; Lim, Henry W; Naik, Ratna

    2011-06-01

    Quantification of skin changes due to acanthosis nigricans (AN), a disorder common among insulin-resistant diabetic and obese individuals, was investigated using two optical techniques: diffuse reflectance spectroscopy (DRS) and colorimetry. Measurements were obtained from AN lesions on the neck and two control sites of eight AN patients. A principal component/discriminant function analysis successfully differentiated between AN lesion and normal skin with 87.7% sensitivity and 94.8% specificity in DRS measurements and 97.2% sensitivity and 96.4% specificity in colorimetry measurements.

  4. Improved numerical solutions for chaotic-cancer-model

    NASA Astrophysics Data System (ADS)

    Yasir, Muhammad; Ahmad, Salman; Ahmed, Faizan; Aqeel, Muhammad; Akbar, Muhammad Zubair

    2017-01-01

    In the biological sciences, the dynamical system of the cancer model is well known for its sensitivity and chaotic behavior. The present work provides a detailed computational study of the cancer model by counterbalancing its sensitive dependency on initial conditions and parameter values. The chaotic cancer model is discretized into a system of nonlinear equations that are solved using the well-known Successive-Over-Relaxation (SOR) method with a proven convergence. This technique makes it possible to solve large systems and provides a more accurate approximation, which is illustrated through tables, time history maps and phase portraits with detailed analysis.
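    The SOR iteration the authors apply can be sketched for a generic diagonally dominant linear system; the small system below is illustrative, not the paper's discretized cancer model.

```python
# Successive-Over-Relaxation (SOR) sketch for a diagonally dominant
# linear system A x = b. The system and relaxation factor are
# illustrative choices, not the paper's discretization.

def sor(A, b, omega=1.25, tol=1e-10, max_iter=10_000):
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        delta = 0.0
        for i in range(n):
            # Gauss-Seidel update blended with the old value by omega.
            sigma = sum(A[i][j] * x[j] for j in range(n) if j != i)
            new = (1.0 - omega) * x[i] + omega * (b[i] - sigma) / A[i][i]
            delta = max(delta, abs(new - x[i]))
            x[i] = new
        if delta < tol:
            break
    return x

# Symmetric positive-definite tridiagonal test system (exact solution
# is x = [1, 2, 3]).
A = [[4.0, -1.0, 0.0],
     [-1.0, 4.0, -1.0],
     [0.0, -1.0, 4.0]]
b = [2.0, 4.0, 10.0]
x = sor(A, b)
```

    For a nonlinear discretization like the paper's, the same sweep is applied to the linearized equations inside an outer iteration; 0 < ω < 2 is required for convergence on symmetric positive-definite systems.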

  5. First data from CUORE-0

    DOE PAGES

    Vignati, A. M.; Aguirre, C. P.; Artusa, D. R.; ...

    2015-03-24

    CUORE-0 is an experiment built to test and demonstrate the performance of the upcoming CUORE experiment. Composed of 52 TeO2 bolometers of 750 g each, it is expected to reach a sensitivity to the 0νββ half-life of 130Te around 3 · 10^24 y in one year of live time. We present the first data, corresponding to an exposure of 7.1 kg y. An analysis of the background indicates that the CUORE sensitivity goal is within reach, validating our techniques to reduce the α radioactivity of the detector.

  6. First data from CUORE-0

    NASA Astrophysics Data System (ADS)

    Vignati, A. M.; Aguirre, C. P.; Artusa, D. R.; Avignone, F. T., III; Azzolini, O.; Balata, M.; Banks, T. I.; Bari, G.; Beeman, J.; Bellini, F.; Bersani, A.; Biassoni, M.; Brofferio, C.; Bucci, C.; Cai, X. Z.; Camacho, A.; Canonica, L.; Cao, X.; Capelli, S.; Carbone, L.; Cardani, L.; Carrettoni, M.; Casali, N.; Chiesa, D.; Chott, N.; Clemenza, M.; Cosmelli, C.; Cremonesi, O.; Creswick, R. J.; Dafinei, I.; Dally, A.; Datskov, V.; De Biasi, A.; Deninno, M. M.; Di Domizio, S.; di Vacri, M. L.; Ejzak, L.; Fang, D. Q.; Farach, H. A.; Faverzani, M.; Fernandes, G.; Ferri, E.; Ferroni, F.; Fiorini, E.; Franceschi, M. A.; Freedman, S. J.; Fujikawa, B. K.; Giachero, A.; Gironi, L.; Giuliani, A.; Goett, J.; Gorla, P.; Gotti, C.; Gutierrez, T. D.; Haller, E. E.; Han, K.; Heeger, K. M.; Hennings-Yeomans, R.; Huang, H. Z.; Kadel, R.; Kazkaz, K.; Keppel, G.; Kolomensky, Yu. G.; Li, Y. L.; Ligi, C.; Lim, K. E.; Liu, X.; Ma, Y. G.; Maiano, C.; Maino, M.; Martinez, M.; Maruyama, R. H.; Mei, Y.; Moggi, N.; Morganti, S.; Napolitano, T.; Nisi, S.; Nones, C.; Norman, E. B.; Nucciotti, A.; O'Donnell, T.; Orio, F.; Orlandi, D.; Ouellet, J. L.; Pallavicini, M.; Palmieri, V.; Pattavina, L.; Pavan, M.; Pedretti; Pessina, G.; Piperno, G.; Pira, C.; Pirro, S.; Previtali, E.; Rampazzo, V.; Rosenfeld, C.; Rusconi, C.; Sala, E.; Sangiorgio, S.; Scielzo, N. D.; Sisti, M.; Smith, A. R.; Taffarello, L.; Tenconi, M.; Terranova, F.; Tian, W. D.; Tomei, C.; Trentalange, S.; Ventura, G.; Wang, B. S.; Wang, H. W.; Wielgus, L.; Wilson, J.; Winslow, L. A.; Wise, T.; Woodcraft, A.; Zanotti, L.; Zarra, C.; Zhu, B. X.; Zucchelli, S.

    CUORE-0 is an experiment built to test and demonstrate the performance of the upcoming CUORE experiment. Composed of 52 TeO2 bolometers of 750 g each, it is expected to reach a sensitivity to the 0νββ half-life of 130Te around 3 · 10^24 y in one year of live time. We present the first data, corresponding to an exposure of 7.1 kg y. An analysis of the background indicates that the CUORE sensitivity goal is within reach, validating our techniques to reduce the α radioactivity of the detector.

  7. Uncertainty Analysis of Decomposing Polyurethane Foam

    NASA Technical Reports Server (NTRS)

    Hobbs, Michael L.; Romero, Vicente J.

    2000-01-01

    Sensitivity/uncertainty analyses are necessary to determine where to allocate resources for improved predictions in support of our nation's nuclear safety mission. Yet, sensitivity/uncertainty analyses are not commonly performed on complex combustion models because the calculations are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, a variety of sensitivity/uncertainty analyses were used to determine the uncertainty associated with thermal decomposition of polyurethane foam exposed to high radiative flux boundary conditions. The polyurethane used in this study is a rigid closed-cell foam used as an encapsulant. Related polyurethane binders such as Estane are used in many energetic materials of interest to the JANNAF community. The complex, finite element foam decomposition model used in this study has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state decomposition front velocity calculated as the derivative of the decomposition front location versus time. An analytical mean value sensitivity/uncertainty (MV) analysis was used to determine the standard deviation by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation was essentially determined from a second derivative that was extremely sensitive to numerical noise. To minimize the numerical noise, 50-micrometer element dimensions and approximately 1-msec time steps were required to obtain stable uncertainty results. As an alternative method to determine the uncertainty and sensitivity in the decomposition front velocity, surrogate response surfaces were generated for use with a constrained Latin Hypercube Sampling (LHS) technique. 
Two surrogate response surfaces were investigated: 1) a linear surrogate response surface (LIN) and 2) a quadratic response surface (QUAD). The LHS techniques do not require derivatives of the response variable and are subsequently relatively insensitive to numerical noise. To compare the LIN and QUAD methods to the MV method, a direct LHS analysis (DLHS) was performed using the full grid and timestep resolved finite element model. The surrogate response models (LIN and QUAD) are shown to give acceptable values of the mean and standard deviation when compared to the fully converged DLHS model.
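    The mean value (MV) method above propagates input standard deviations through numerical derivatives of the response: σ_y² ≈ Σ (∂y/∂xᵢ · σᵢ)². A generic sketch follows; the toy two-parameter "front velocity" response is a hypothetical stand-in for the 25-parameter foam model.

```python
# First-order "mean value" uncertainty propagation: numerical
# derivatives of a response with respect to each input, combined in
# quadrature with the input standard deviations. The response function
# and its parameters are hypothetical stand-ins for the foam model.
import math

def mean_value_uncertainty(f, x, sigma, rel_step=1e-6):
    """Return (mean, std) of f under independent input uncertainties."""
    y0 = f(x)
    var = 0.0
    for i, (xi, si) in enumerate(zip(x, sigma)):
        h = max(abs(xi), 1.0) * rel_step
        xp = list(x); xp[i] = xi + h
        xm = list(x); xm[i] = xi - h
        dy = (f(xp) - f(xm)) / (2.0 * h)   # central difference
        var += (dy * si) ** 2
    return y0, math.sqrt(var)

# Toy "front velocity" as a function of incident flux q and an
# activation-energy-like parameter E (hypothetical units and values).
def front_velocity(p):
    q, E = p
    return 0.002 * q * math.exp(-E / 20.0)

mean, std = mean_value_uncertainty(front_velocity, [50.0, 40.0], [5.0, 2.0])
```

    The abstract's numerical-noise difficulty arises exactly here: when f itself is a derivative extracted from a finite element solution, the central differences above amount to second derivatives and demand very fine grid and timestep resolution.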

  8. Imaging free radicals in organelles, cells, tissue, and in vivo with immuno-spin trapping.

    PubMed

    Mason, Ronald Paul

    2016-08-01

    The accurate and sensitive detection of biological free radicals in a reliable manner is required to define the mechanistic roles of such species in biochemistry, medicine and toxicology. Most of the techniques currently available are either not appropriate to detect free radicals in cells and tissues due to sensitivity limitations (electron spin resonance, ESR) or subject to artifacts that make the validity of the results questionable (fluorescent probe-based analysis). The development of the immuno-spin trapping technique overcomes all these difficulties. This technique is based on the reaction of amino acid- and DNA base-derived radicals with the spin trap 5, 5-dimethyl-1-pyrroline N-oxide (DMPO) to form protein- and DNA-DMPO nitroxide radical adducts, respectively. These adducts have limited stability and decay to produce the very stable macromolecule-DMPO-nitrone product. This stable product can be detected by mass spectrometry, NMR or immunochemistry by the use of anti-DMPO nitrone antibodies. The formation of macromolecule-DMPO-nitrone adducts is based on the selective reaction of free radical addition to the spin trap and is thus not subject to artifacts frequently encountered with other methods for free radical detection. The selectivity of spin trapping for free radicals in biological systems has been proven by ESR. Immuno-spin trapping is proving to be a potent, sensitive (a million times higher sensitivity than ESR), and easy (not quantum mechanical) method to detect low levels of macromolecule-derived radicals produced in vitro and in vivo. Anti-DMPO antibodies have been used to determine the distribution of free radicals in cells and tissues and even in living animals. In summary, the invention of the immuno-spin trapping technique has had a major impact on the ability to accurately and sensitively detect biological free radicals and, subsequently, on our understanding of the role of free radicals in biochemistry, medicine and toxicology. 
Published by Elsevier B.V.

  9. A blinded international study on the reliability of genetic testing for GGGGCC-repeat expansions in C9orf72 reveals marked differences in results among 14 laboratories

    PubMed Central

    Akimoto, Chizuru; Volk, Alexander E; van Blitterswijk, Marka; Van den Broeck, Marleen; Leblond, Claire S; Lumbroso, Serge; Camu, William; Neitzel, Birgit; Onodera, Osamu; van Rheenen, Wouter; Pinto, Susana; Weber, Markus; Smith, Bradley; Proven, Melanie; Talbot, Kevin; Keagle, Pamela; Chesi, Alessandra; Ratti, Antonia; van der Zee, Julie; Alstermark, Helena; Birve, Anna; Calini, Daniela; Nordin, Angelica; Tradowsky, Daniela C; Just, Walter; Daoud, Hussein; Angerbauer, Sabrina; DeJesus-Hernandez, Mariely; Konno, Takuya; Lloyd-Jani, Anjali; de Carvalho, Mamede; Mouzat, Kevin; Landers, John E; Veldink, Jan H; Silani, Vincenzo; Gitler, Aaron D; Shaw, Christopher E; Rouleau, Guy A; van den Berg, Leonard H; Van Broeckhoven, Christine; Rademakers, Rosa; Andersen, Peter M; Kubisch, Christian

    2014-01-01

    Background The GGGGCC-repeat expansion in C9orf72 is the most frequent mutation found in patients with amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD). Most of the studies on C9orf72 have relied on repeat-primed PCR (RP-PCR) methods for detection of the expansions. To investigate the inherent limitations of this technique, we compared methods and results of 14 laboratories. Methods The 14 laboratories genotyped DNA from 78 individuals (diagnosed with ALS or FTD) in a blinded fashion. Eleven laboratories used a combination of amplicon-length analysis and RP-PCR, whereas three laboratories used RP-PCR alone; Southern blotting techniques were used as a reference. Results Using PCR-based techniques, 5 of the 14 laboratories obtained results in full accordance with the Southern blotting results. Only 50 of the 78 DNA samples were assigned the same genotype in all 14 laboratories. There were high rates of false positive and false negative results, and at least one sample could not be genotyped at all in 9 of the 14 laboratories. The mean sensitivity of a combination of amplicon-length analysis and RP-PCR was 95.0% (73.9–100%), and the mean specificity was 98.0% (87.5–100%). Overall, a sensitivity and specificity of more than 95% was observed in only seven laboratories. Conclusions Because of the wide range seen in genotyping results, we recommend using a combination of amplicon-length analysis and RP-PCR as a minimum in a research setting. We propose that Southern blotting techniques should be the gold standard, and be made obligatory in a clinical diagnostic setting. PMID:24706941

  10. Reliability Sensitivity Analysis and Design Optimization of Composite Structures Based on Response Surface Methodology

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2003-01-01

    This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.
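    The global second-order RS strategy above can be illustrated with a generic sketch: sample the design space, fit a full quadratic polynomial by least squares, and use it as a cheap surrogate for the expensive buckling analysis. Everything below (function names, the toy two-variable response) is illustrative, not code from the report.

```python
import numpy as np

def quadratic_design_matrix(X):
    """Design matrix for a full second-order (quadratic) response surface
    in d variables: intercept, linear, squared, and cross terms."""
    n, d = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(d)]                    # linear terms
    cols += [X[:, i] ** 2 for i in range(d)]               # pure quadratic terms
    cols += [X[:, i] * X[:, j]                             # interaction terms
             for i in range(d) for j in range(i + 1, d)]
    return np.column_stack(cols)

def fit_response_surface(X, y):
    """Least-squares estimate of the unknown RS coefficients."""
    A = quadratic_design_matrix(X)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

def predict(beta, X):
    return quadratic_design_matrix(X) @ beta

# Toy use: recover a known quadratic "buckling load" model from samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 2))
y_true = (3.0 + 1.5 * X[:, 0] - 2.0 * X[:, 1]
          + 0.5 * X[:, 0] ** 2 + 0.25 * X[:, 0] * X[:, 1])
beta = fit_response_surface(X, y_true)
```

    With d variables the full quadratic model carries 1 + 2d + d(d-1)/2 coefficients (300 for d = 23), which suggests why a sequential local strategy with first-order models over small subregions can be considerably cheaper than a single global second-order fit.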

  11. Tunable lasers and their application in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Steinfeld, J. I.

    1975-01-01

    The impact that laser techniques might have in chemical analysis is examined. Absorption, scattering, and heterodyne detection is considered. Particular emphasis is placed on the advantages of using frequency-tunable sources, and dye solution lasers are regarded as the outstanding example of this type of laser. Types of spectroscopy that can be carried out with lasers are discussed along with the ultimate sensitivity or minimum detectable concentration of molecules that can be achieved with each method. Analytical applications include laser microprobe analysis, remote sensing and instrumental methods such as laser-Raman spectroscopy, atomic absorption/fluorescence spectrometry, fluorescence assay techniques, optoacoustic spectroscopy, and polarization measurements. The application of lasers to spectroscopic methods of analysis would seem to be a rewarding field both for research in analytical chemistry and for investments in instrument manufacturing.

  12. First- and second-order sensitivity analysis of linear and nonlinear structures

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.; Mroz, Z.

    1986-01-01

    This paper employs the principle of virtual work to derive sensitivity derivatives of structural response with respect to stiffness parameters using both direct and adjoint approaches. The computations required are based on additional load conditions characterized by imposed initial strains, body forces, or surface tractions. As such, they are equally applicable to numerical or analytical solution techniques. The relative efficiency of various approaches for calculating first and second derivatives is assessed. It is shown that for the evaluation of second derivatives the most efficient approach is one that makes use of both the first-order sensitivities and adjoint vectors. Two example problems are used for demonstrating the various approaches.
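    The direct-versus-adjoint trade-off described above can be sketched for a discretized linear structure K(p)u = f with a scalar response g = c·u (a generic illustration, not the paper's virtual-work derivation): a single adjoint solve yields the sensitivity with respect to any number of stiffness parameters.

```python
import numpy as np

def adjoint_sensitivity(K, dK_dp, f, c):
    """dg/dp for g = c @ u with K u = f, via the adjoint identity
    dg/dp = -lam @ (dK/dp) @ u, where the adjoint vector solves K.T lam = c."""
    u = np.linalg.solve(K, f)        # primal (displacement) solution
    lam = np.linalg.solve(K.T, c)    # adjoint solution
    return -lam @ dK_dp @ u

# Verify against a finite-difference estimate on a toy 2-DOF "structure"
# whose stiffness depends linearly on a parameter p.
K = lambda p: np.array([[2.0 * p, -p], [-p, p + 1.0]])
dK_dp = np.array([[2.0, -1.0], [-1.0, 1.0]])
f = np.array([1.0, 0.5])
c = np.array([0.0, 1.0])

p = 2.0
g = lambda p: c @ np.linalg.solve(K(p), f)
h = 1e-6
fd = (g(p + h) - g(p - h)) / (2 * h)   # central finite difference
ad = adjoint_sensitivity(K(p), dK_dp, f, c)
```

    The adjoint and finite-difference estimates agree; for many parameters the adjoint route reuses the same lam for every dK/dp, which is the source of its efficiency.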

  13. Molecular sensing with magnetic nanoparticles using magnetic spectroscopy of nanoparticle Brownian motion.

    PubMed

    Zhang, Xiaojuan; Reeves, Daniel B; Perreard, Irina M; Kett, Warren C; Griswold, Karl E; Gimi, Barjor; Weaver, John B

    2013-12-15

    Functionalized magnetic nanoparticles (mNPs) have shown promise in biosensing and other biomedical applications. Here we use functionalized mNPs to develop a highly sensitive, versatile sensing strategy required in practical biological assays and potentially in vivo analysis. We demonstrate a new sensing scheme based on magnetic spectroscopy of nanoparticle Brownian motion (MSB) to quantitatively detect molecular targets. MSB uses the harmonics of oscillating mNPs as a metric for the freedom of rotational motion, thus reflecting the bound state of the mNP. The harmonics can be detected in vivo from nanogram quantities of iron within 5 s. Using a streptavidin-biotin binding system, we show that the detection limit of the current MSB technique is lower than 150 pM (0.075 pmole), which is much more sensitive than previously reported techniques based on mNP detection. Using mNPs conjugated with two anti-thrombin DNA aptamers, we show that thrombin can be detected with high sensitivity (4 nM or 2 pmole). A DNA-DNA interaction was also investigated. The results demonstrated that sequence-selective DNA detection can be achieved with 100 pM (0.05 pmole) sensitivity. The results of using MSB to sense these interactions show that the MSB-based sensing technique can achieve rapid measurement (within 10 s), and is suitable for detecting and quantifying a wide range of biomarkers or analytes. It has the potential to be applied in a variety of biomedical applications or diagnostic analyses. © 2013 Elsevier B.V. All rights reserved.
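    A minimal numerical sketch of the MSB idea (our toy nonlinearity, not the authors' instrument model): a saturating magnetization response to a sinusoidal drive generates odd harmonics, and restricted rotational freedom (mimicked here by a weaker effective nonlinearity) suppresses them.

```python
import numpy as np

def harmonic_amplitude(xi, n, samples=4096):
    """|n-th Fourier harmonic| of M(t) = tanh(xi * sin(t)) over one period.
    tanh stands in for a generic saturating magnetization curve."""
    t = np.linspace(0.0, 2.0 * np.pi, samples, endpoint=False)
    m = np.tanh(xi * np.sin(t))
    coeff = np.fft.rfft(m) / samples       # index n = n-th harmonic of the drive
    return 2.0 * abs(coeff[n])

free = harmonic_amplitude(xi=5.0, n=3)     # freely rotating particles: strong saturation
bound = harmonic_amplitude(xi=1.0, n=3)    # "bound" particles: weaker effective response
```

    The third harmonic shrinks as the effective nonlinearity is reduced, which is the qualitative signature MSB exploits to distinguish bound from free nanoparticles.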

  14. Unified Model Deformation and Flow Transition Measurements

    NASA Technical Reports Server (NTRS)

    Burner, Alpheus W.; Liu, Tianshu; Garg, Sanjay; Bell, James H.; Morgan, Daniel G.

    1999-01-01

    The number of optical techniques that may potentially be used during a given wind tunnel test is continually growing. These include parameter sensitive paints that are sensitive to temperature or pressure, several different types of off-body and on-body flow visualization techniques, optical angle-of-attack (AoA), optical measurement of model deformation, optical techniques for determining density or velocity, and spectroscopic techniques for determining various flow field parameters. Often in the past the various optical techniques were developed independently of each other, with little or no consideration for other techniques that might also be used during a given test. Recently two optical techniques have been increasingly requested for production measurements in NASA wind tunnels. These are the video photogrammetric (or videogrammetric) technique for measuring model deformation known as the video model deformation (VMD) technique, and the parameter sensitive paints for making global pressure and temperature measurements. Considerations for, and initial attempts at, simultaneous measurements with the pressure sensitive paint (PSP) and the videogrammetric techniques have been implemented. Temperature sensitive paint (TSP) has been found to be useful for boundary-layer transition detection since turbulent boundary layers convect heat at higher rates than laminar boundary layers of comparable thickness. Transition is marked by a characteristic surface temperature change wherever there is a difference between model and flow temperatures. Recently, additional capabilities have been implemented in the target-tracking videogrammetric measurement system. These capabilities have permitted practical simultaneous measurements using parameter sensitive paint and video model deformation measurements that led to the first successful unified test with TSP for transition detection in a large production wind tunnel.

  15. Imaging Analysis of the Hard X-Ray Telescope ProtoEXIST2 and New Techniques for High-Resolution Coded-Aperture Telescopes

    NASA Technical Reports Server (NTRS)

    Hong, Jaesub; Allen, Branden; Grindlay, Jonathan; Barthelmy, Scott D.

    2016-01-01

    Wide-field (greater than or approximately equal to 100 degrees squared) hard X-ray coded-aperture telescopes with high angular resolution (less than or approximately equal to 2 arcminutes) will enable a wide range of time domain astrophysics. For instance, transient sources such as gamma-ray bursts can be precisely localized without the assistance of secondary focusing X-ray telescopes to enable rapid followup studies. On the other hand, high angular resolution in coded-aperture imaging introduces a new challenge in handling the systematic uncertainty: the average photon count per pixel is often too small to establish a proper background pattern or model the systematic uncertainty in a timescale where the model remains invariant. We introduce two new techniques to improve detection sensitivity, which are designed for, but not limited to, a high-resolution coded-aperture system: a self-background modeling scheme which utilizes continuous scan or dithering operations, and a Poisson-statistics based probabilistic approach to evaluate the significance of source detection without background subtraction. We illustrate these new imaging analysis techniques for a high-resolution coded-aperture telescope using the data acquired by the wide-field hard X-ray telescope ProtoEXIST2 during a high-altitude balloon flight in fall 2012. We review the imaging sensitivity of ProtoEXIST2 during the flight, and demonstrate the performance of the new techniques using our balloon flight data in comparison with a simulated ideal Poisson background.
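    The Poisson-statistics approach to detection significance can be sketched generically (this is the standard Poisson upper-tail probability, not the ProtoEXIST2 pipeline itself): given an expected background of b counts in an aperture, the chance probability of seeing n or more counts follows directly from the Poisson distribution.

```python
import math

def poisson_tail(n, b):
    """P(N >= n) for N ~ Poisson(b), via the complementary partial sum.
    Small values indicate a significant excess over background."""
    return 1.0 - sum(math.exp(-b) * b**k / math.factorial(k) for k in range(n))

# e.g. 12 counts on an expected background of 3 is a highly significant excess
p_value = poisson_tail(12, 3.0)
```

    Evaluating significance this way avoids subtracting a noisy background estimate, which is exactly the regime (few counts per pixel) the abstract describes.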

  16. Loss-compensation technique using a split-spectrum approach for optical fiber air-gap intensity-based sensors

    NASA Astrophysics Data System (ADS)

    Wang, Anbo; Miller, Mark S.; Gunther, Michael F.; Murphy, Kent A.; Claus, Richard O.

    1993-03-01

    A self-referencing technique compensating for fiber losses and source fluctuations in air-gap intensity-based optical fiber sensors is described and demonstrated. A resolution of 0.007 micron has been obtained over a measurement range of 0-250 microns for an intensity-based displacement sensor using this referencing technique. The sensor is shown to have minimal sensitivity to fiber bending losses and variations in the LED input power. A theoretical model for evaluation of step-index multimode optical fiber splice is proposed. The performance of the sensor as a displacement sensor agrees well with the theoretical analysis.
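    The loss-compensation principle lends itself to a toy numerical sketch (the gap-response curve below is hypothetical, not the sensor's actual characteristic): because the signal band and the reference band travel the same fiber, both are scaled by the same unknown link transmission, and their ratio depends only on the air gap.

```python
def gap_response(x_um):
    """Hypothetical monotonic gap-modulation curve (x in microns)."""
    return 1.0 / (1.0 + 0.01 * x_um)

def compensated_ratio(T, x_um, reference_level=1.0):
    """Both bands see the same link transmission T (bending loss, source
    drift), so T cancels in the ratio."""
    signal = T * gap_response(x_um)      # gap-modulated band
    reference = T * reference_level      # gap-independent band
    return signal / reference

r_nominal = compensated_ratio(1.0, 100.0)   # healthy link
r_lossy = compensated_ratio(0.5, 100.0)     # 3 dB of extra bending loss
```

    The recovered ratio is unchanged when the link loss varies, while it still tracks displacement, which is the self-referencing behavior the abstract reports.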

  17. Seismic Constraints on Interior Solar Convection

    NASA Technical Reports Server (NTRS)

    Hanasoge, Shravan M.; Duvall, Thomas L.; DeRosa, Marc L.

    2010-01-01

    We constrain the velocity spectral distribution of global-scale solar convective cells at depth using techniques of local helioseismology. We calibrate the sensitivity of helioseismic waves to large-scale convective cells in the interior by analyzing simulations of waves propagating through a velocity snapshot of global solar convection via methods of time-distance helioseismology. Applying identical analysis techniques to observations of the Sun, we are able to bound from above the magnitudes of solar convective cells as a function of spatial convective scale. We find that convection at a depth of r/R(solar) = 0.95 with spatial extent l < 30, where l is the spherical harmonic degree, comprises weak flow systems, on the order of 15 m/s or less. Convective features deeper than r/R(solar) = 0.95 are more difficult to image due to the rapidly decreasing sensitivity of helioseismic waves.

  18. Imaging-based molecular barcoding with pixelated dielectric metasurfaces

    NASA Astrophysics Data System (ADS)

    Tittl, Andreas; Leitis, Aleksandrs; Liu, Mingkai; Yesilkoy, Filiz; Choi, Duk-Yong; Neshev, Dragomir N.; Kivshar, Yuri S.; Altug, Hatice

    2018-06-01

    Metasurfaces provide opportunities for wavefront control, flat optics, and subwavelength light focusing. We developed an imaging-based nanophotonic method for detecting mid-infrared molecular fingerprints and implemented it for the chemical identification and compositional analysis of surface-bound analytes. Our technique features a two-dimensional pixelated dielectric metasurface with a range of ultrasharp resonances, each tuned to a discrete frequency; this enables molecular absorption signatures to be read out at multiple spectral points, and the resulting information is then translated into a barcode-like spatial absorption map for imaging. The signatures of biological, polymer, and pesticide molecules can be detected with high sensitivity, covering applications such as biosensing and environmental monitoring. Our chemically specific technique can resolve absorption fingerprints without the need for spectrometry, frequency scanning, or moving mechanical parts, thereby paving the way toward sensitive and versatile miniaturized mid-infrared spectroscopy devices.

  19. Advanced Automation for Ion Trap Mass Spectrometry-New Opportunities for Real-Time Autonomous Analysis

    NASA Technical Reports Server (NTRS)

    Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)

    1994-01-01

    The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. A broader acceptance of this technique has not yet been realized as it requires large, complex, and costly instrumentation which has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Towards this end, we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan proxies, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications will be described.

  20. A novel universal real-time PCR system using the attached universal duplex probes for quantitative analysis of nucleic acids

    PubMed Central

    Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing

    2008-01-01

    Background Real-time PCR techniques are being widely used for nucleic acid analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. Results We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage of generating fluorescence by probe hydrolysis and strand displacement over current real-time PCR methods. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. Conclusion The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP real-time PCR technique has been successfully applied in nucleic acid analysis, and the developed AUDP real-time PCR technique will offer an alternative way for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost. PMID:18522756

  1. Differentiation of live and dead Salmonella cells using Fourier transform infrared (FTIR) spectroscopy and principal component analysis (PCA) technique

    USDA-ARS?s Scientific Manuscript database

    Various technologies have been developed for pathogen detection using optical, electrochemical, biochemical and physical properties. Conventional microbiological methods need time from days to week to get the result. Though this method is very sensitive and accurate, a rapid detection of pathogens i...

  2. Interactional Effects of Instructional Quality and Teacher Judgement Accuracy on Achievement.

    ERIC Educational Resources Information Center

    Helmke, Andreas; Schrader, Friedrich-Wilhelm

    1987-01-01

    Analysis of predictions of 32 teachers regarding 690 fifth-graders' scores on a mathematics achievement test found that the combination of high judgement accuracy with varied instructional techniques was particularly favorable to students in contrast to a combination of high diagnostic sensitivity with a low frequency of cues or individual…

  3. 75 FR 28616 - Agilent Technologies, Inc.; Analysis of the Agreement Containing Consent Order to Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... equipment used to test cell phones and communications equipment, machines that determine the contents of... employ various analytical techniques to test samples of many types, are used by academic researchers... require the sensitivity provided by ICP-MS, and because many customers perform tests pursuant to...

  4. Quantitative Analysis of Organophosphate and Pyrethroid Insecticides, PyrethroidTransformation Products, Polybrominated Diphenyl Ethers and Bisphenol A in Residential Surface Wipe Samples

    EPA Science Inventory

    Surface wipe sampling is a frequently used technique for measuring persistent pollutants in residential environments. One characteristic of this form of sampling is the need to extract the entire wipe sample to achieve adequate sensitivity and to ensure representativeness. Most s...

  5. Sensitivity of FIA Volume Estimates to Changes in Stratum Weights and Number of Strata

    Treesearch

    James A. Westfall; Michael Hoppus

    2005-01-01

    In the Northeast region, the USDA Forest Service Forest Inventory and Analysis (FIA) program utilizes stratified sampling techniques to improve the precision of population estimates. Recently, interpretation of aerial photographs was replaced with classified remotely sensed imagery to determine stratum weights and plot stratum assignments. However, stratum weights...
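    The stratified estimator whose sensitivity is being examined can be sketched generically (toy data, not FIA records): the population mean is the stratum-weight-weighted combination of per-stratum sample means, so errors in the weights propagate directly into the volume estimate.

```python
def stratified_mean(weights, stratum_samples):
    """Stratified estimate of the population mean: sum of W_h * ybar_h.
    weights W_h must sum to 1; stratum_samples holds the plot values per stratum."""
    assert abs(sum(weights) - 1.0) < 1e-9, "stratum weights must sum to 1"
    means = [sum(s) / len(s) for s in stratum_samples]
    return sum(w * m for w, m in zip(weights, means))

# Hypothetical volume data: two strata, e.g. forest vs non-forest plots.
est = stratified_mean([0.7, 0.3], [[120.0, 130.0, 110.0], [20.0, 40.0]])
```

    Shifting weight between strata with very different means changes the estimate proportionally, which is why misclassified imagery-derived stratum weights matter.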

  6. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  7. A comparison of sorptive extraction techniques coupled to a new quantitative, sensitive, high throughput GC-MS/MS method for methoxypyrazine analysis in wine.

    PubMed

    Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E

    2016-02-01

    Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng/L range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng/L in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP as well as IPMP, EMP, and SBMP from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng/L for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program.
Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Comparison of 2- and 3-dimensional shoulder ultrasound to magnetic resonance imaging in a community hospital for the detection of supraspinatus rotator cuff tears with improved workroom time efficiency.

    PubMed

    Co, Steven; Bhalla, Sonny; Rowan, Kevin; Aippersbach, Sven; Bicknell, Simon

    2012-08-01

    The purpose of this study was to evaluate whether 3-dimensional (3D) volumetric acquisition of shoulder ultrasound (US) data for supraspinatus rotator cuff tears is as sensitive as conventional 2-dimensional (2D) US and routine magnetic resonance imaging (MRI), and whether there is improved workroom time efficiency when using the 3D technique compared with the 2D technique. In this prospective study, 39 shoulders underwent US and MRI examination of their rotator cuff to confirm the accuracy of both the 2D and 3D techniques. The difference in sensitivities was compared by using confidence interval analysis. The mean times required to obtain the 2D and 3D US data and to review the scans were compared by using a 1-tailed Wilcoxon test. Sensitivity and specificity of 2D US in detecting supraspinatus full- and partial-thickness tears was 100% and 96%, and 80% and 100%, respectively, and similar values were obtained with 3D US at 100% and 100%, and 90% and 96.6%, respectively. Analysis of the confidence limits of the sensitivities showed no significant difference. The mean time (± SD) of the overall 2D examination of the shoulder, including interpretation, was 10.02 ± 3.28 minutes, whereas, for the 3D examination, it was 7.08 ± 0.35 minutes. Comparison between the 2 cohorts when using a 1-tailed Wilcoxon test showed a statistically significant difference (P < .05). 3D US of the shoulder is as accurate as 2D US when compared with MRI for the diagnosis of full- and partial-thickness supraspinatus rotator cuff tears, and 3D US examination significantly reduced the time between the initial scan and the radiologist interpretation, ultimately improving workplace efficiency. Copyright © 2012 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.
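    The accuracy metrics and confidence-interval comparison used in the study follow standard formulas, sketched here generically (the example counts are illustrative, not the study's raw data):

```python
import math

def sensitivity(tp, fn):
    """True positive rate: tears correctly detected / all true tears."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: intact cuffs correctly cleared / all intact cuffs."""
    return tn / (tn + fp)

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion; behaves sensibly
    even when the observed proportion is 0 or 1."""
    p = successes / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# e.g. 10 full-thickness tears, all detected: sensitivity 1.0, but with a
# wide interval because of the small sample.
sens = sensitivity(10, 0)
lo, hi = wilson_ci(10, 10)
```

    Overlapping intervals like these are what "no significant difference in sensitivities" means in practice for small cohorts.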

  9. Identifying and counting point defects in carbon nanotubes.

    PubMed

    Fan, Yuwei; Goldsmith, Brett R; Collins, Philip G

    2005-12-01

    The prevailing conception of carbon nanotubes and particularly single-walled carbon nanotubes (SWNTs) continues to be one of perfectly crystalline wires. Here, we demonstrate a selective electrochemical method that labels point defects and makes them easily visible for quantitative analysis. High-quality SWNTs are confirmed to contain one defect per 4 μm on average, with a distribution weighted towards areas of SWNT curvature. Although this defect density compares favourably to that of high-quality silicon single crystals, the presence of a single defect can have tremendous electronic effects in one-dimensional conductors such as SWNTs. We demonstrate a one-to-one correspondence between chemically active point defects and sites of local electronic sensitivity in SWNT circuits, confirming the expectation that individual defects may be critical to understanding and controlling variability, noise and chemical sensitivity in SWNT electronic devices. By varying the SWNT synthesis technique, we further show that the defect spacing can be varied over orders of magnitude. The ability to detect and analyse point defects, especially at very low concentrations, indicates the promise of this technique for quantitative process analysis, especially in nanoelectronics development.

  10. Microstructure and composition analysis of low-Z/low-Z multilayers by combining hard and resonant soft X-ray reflectivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, P. N., E-mail: pnrao@rrcat.gov.in; Rai, S. K.; Srivastava, A. K.

    2016-06-28

    Microstructure and composition analysis of a periodic multilayer structure consisting of a low electron density contrast (EDC) material combination by grazing incidence hard X-ray reflectivity (GIXR), resonant soft X-ray reflectivity (RSXR), and transmission electron microscopy (TEM) is presented. Measurements of reflectivity at different energies allow combining the sensitivity of GIXR data to microstructural parameters like layer thicknesses and interface roughness with the layer composition sensitivity of RSXR. These aspects are shown with the example of a 10-period C/B₄C multilayer. TEM observation reveals that the C-on-B₄C and B₄C-on-C interfaces are symmetric. Although GIXR provides limited structural information when the EDC between layers is low, combining a scattering technique like GIXR with a microscopic technique like TEM improves the microstructural information for a low-EDC combination. The optical constants of buried layers have been derived by RSXR; the derived optical constants from the measured RSXR data suggest the presence of excess carbon in the boron carbide layer.

  11. Analysis of airfoil leading edge separation bubbles

    NASA Technical Reports Server (NTRS)

    Carter, J. E.; Vatsa, V. N.

    1982-01-01

    A local inviscid-viscous interaction technique was developed for the analysis of low speed airfoil leading edge transitional separation bubbles. In this analysis an inverse boundary layer finite difference analysis is solved iteratively with a Cauchy integral representation of the inviscid flow which is assumed to be a linear perturbation to a known global viscous airfoil analysis. Favorable comparisons with data indicate the overall validity of the present localized interaction approach. In addition numerical tests were performed to test the sensitivity of the computed results to the mesh size, limits on the Cauchy integral, and the location of the transition region.

  12. Meta-analysis of diagnostic accuracy studies accounting for disease prevalence: alternative parameterizations and model selection.

    PubMed

    Chu, Haitao; Nie, Lei; Cole, Stephen R; Poole, Charles

    2009-08-15

    In a meta-analysis of diagnostic accuracy studies, the sensitivities and specificities of a diagnostic test may depend on the disease prevalence since the severity and definition of disease may differ from study to study due to the design and the population considered. In this paper, we extend the bivariate nonlinear random effects model on sensitivities and specificities to jointly model the disease prevalence, sensitivities and specificities using trivariate nonlinear random-effects models. Furthermore, as an alternative parameterization, we also propose jointly modeling the test prevalence and the predictive values, which reflect the clinical utility of a diagnostic test. These models allow investigators to study the complex relationship among the disease prevalence, sensitivities and specificities; or among test prevalence and the predictive values, which can reveal hidden information about test performance. We illustrate the proposed two approaches by reanalyzing the data from a meta-analysis of radiological evaluation of lymph node metastases in patients with cervical cancer and a simulation study. The latter illustrates the importance of carefully choosing an appropriate normality assumption for the disease prevalence, sensitivities and specificities, or the test prevalence and the predictive values. In practice, it is recommended to use model selection techniques to identify a best-fitting model for making statistical inference. In summary, the proposed trivariate random effects models are novel and can be very useful in practice for meta-analysis of diagnostic accuracy studies. Copyright 2009 John Wiley & Sons, Ltd.

  13. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    PubMed

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade, sensitivity analysis techniques have been shown to be very useful for analysing complex and high-dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It performs sensitivity analysis of models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting: the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models, including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix, it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters.
PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
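    A toy illustration of the basic operation such toolboxes automate, finite-difference sensitivity of an ODE solution to a parameter, sketched in Python rather than MATLAB and not using PeTTSy itself; the model and parameter values are hypothetical:

```python
# Finite-difference parameter sensitivity for a small ODE model --
# a minimal sketch of the kind of analysis PeTTSy automates.

def simulate(k, x0=1.0, dt=0.01, steps=500):
    """Euler integration of dx/dt = -k * x up to T = dt * steps."""
    x = x0
    for _ in range(steps):
        x += dt * (-k * x)
    return x

def sensitivity(k, h=1e-6):
    """Central-difference sensitivity d x(T) / d k."""
    return (simulate(k + h) - simulate(k - h)) / (2 * h)

k = 0.5
print(simulate(k))      # approx exp(-0.5 * 5) ~ 0.082
print(sensitivity(k))   # negative: a larger decay rate lowers x(T)
```

For the linear model above the analytic sensitivity is -T * x(T), which the finite difference reproduces closely.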

  14. Detection of target-probe oligonucleotide hybridization using synthetic nanopore resistive pulse sensing.

    PubMed

    Booth, Marsilea Adela; Vogel, Robert; Curran, James M; Harbison, SallyAnn; Travas-Sejdic, Jadranka

    2013-07-15

    Despite the plethora of DNA sensor platforms available, a portable, sensitive, selective and economic sensor able to rival current fluorescence-based techniques would find use in many applications. In this research, probe oligonucleotide-grafted particles are used to detect target DNA in solution through a resistive pulse nanopore detection technique. Using carbodiimide chemistry, functionalized probe DNA strands are attached to carboxylated dextran-based magnetic particles. Subsequent incubation with complementary target DNA yields a change in surface properties as the two DNA strands hybridize. Particle-by-particle analysis with resistive pulse sensing is performed to detect these changes. A variable pressure method allows identification of changes in the surface charge of particles. As proof-of-principle, we demonstrate that target hybridization is selectively detected at micromolar concentrations (nanomoles of target) using resistive pulse sensing, confirmed by fluorescence and phase analysis light scattering as complementary techniques. The advantages, feasibility and limitations of using resistive pulse sensing for sample analysis are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Analysis and optimal design of moisture sensor for rice grain moisture measurement

    NASA Astrophysics Data System (ADS)

    Jain, Sweety; Mishra, Pankaj Kumar; Thakare, Vandana Vikas

    2018-04-01

    This paper presents the analysis and design of a microstrip sensor for accurate determination of moisture content (MC) in rice grains, referenced to the oven-drying technique, which is easy, fast and less time-consuming than other techniques. The sensor is designed for low insertion loss; the reflection coefficient and maximum gain are -35 dB and 5.88 dB, respectively, at 2.68 GHz. All relevant parameters, such as axial ratio, maximum gain and the Smith chart, are discussed, which is helpful for analysing the moisture measurement. The variation of the percentage moisture with the magnitude and phase of the transmission coefficient is investigated at selected frequencies. The microstrip moisture sensor consists of a single layer of FR4 substrate, thickness 1.638, simulated in Computer Simulation Technology Microwave Studio (CST MWS). It is concluded that the proposed sensor is suitable for development as a complete sensor that estimates the optimum moisture content of rice grains accurately and sensitively, and that it is compact, versatile and suitable for determining the moisture content of other crops and agricultural products.
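    The oven-drying reference value against which such a sensor is calibrated reduces to a simple mass-loss calculation; a sketch with hypothetical sample masses:

```python
# Moisture content by the oven-drying reference method, as used to
# benchmark sensor readings (the sample masses below are invented).

def moisture_content_wet_basis(mass_wet_g, mass_dry_g):
    """Percent moisture on a wet basis: 100 * water mass / wet mass."""
    return 100.0 * (mass_wet_g - mass_dry_g) / mass_wet_g

mc = moisture_content_wet_basis(25.0, 21.5)
print(round(mc, 2))  # 14.0 percent
```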

  16. Low angle light scattering analysis: a novel quantitative method for functional characterization of human and murine platelet receptors.

    PubMed

    Mindukshev, Igor; Gambaryan, Stepan; Kehrer, Linda; Schuetz, Claudia; Kobsar, Anna; Rukoyatkina, Natalia; Nikolaev, Viacheslav O; Krivchenko, Alexander; Watson, Steve P; Walter, Ulrich; Geiger, Joerg

    2012-07-01

    Determinations of platelet receptor functions are indispensable diagnostic indicators of cardiovascular and hemostatic diseases including hereditary and acquired receptor defects and receptor responses to drugs. However, presently available techniques for assessing platelet function have some disadvantages, such as low sensitivity and the requirement of large sample sizes and unphysiologically high agonist concentrations. Our goal was to develop and initially characterize a new technique designed to quantitatively analyze platelet receptor activation and platelet function on the basis of measuring changes in low angle light scattering. We developed a novel technique based on low angle light scattering registering changes in light scattering at a range of different angles in platelet suspensions during activation. The method proved to be highly sensitive for simultaneous real time detection of changes in size and shape of platelets during activation. Unlike commonly-used methods, the light scattering method could detect platelet shape change and aggregation in response to nanomolar concentrations of extracellular nucleotides. Furthermore, our results demonstrate that the advantages of the light scattering method make it a choice method for platelet receptor monitoring and for investigation of both murine and human platelets in disease models. Our data demonstrate the suitability and superiority of this new low angle light scattering method for comprehensive analyses of platelet receptors and functions. This highly sensitive, quantitative, and online detection of essential physiological, pathophysiological and pharmacological-response properties of human and mouse platelets is a significant improvement over conventional techniques.

  17. Critical factors determining the quantification capability of matrix-assisted laser desorption/ionization–time-of-flight mass spectrometry

    PubMed Central

    Wang, Chia-Chen; Lai, Yin-Hung; Ou, Yu-Meng; Chang, Huan-Tsung; Wang, Yi-Sheng

    2016-01-01

    Quantitative analysis with mass spectrometry (MS) is important but challenging. Matrix-assisted laser desorption/ionization (MALDI) coupled with time-of-flight (TOF) MS offers superior sensitivity, resolution and speed, but such techniques have numerous disadvantages that hinder quantitative analyses. This review summarizes essential obstacles to analyte quantification with MALDI-TOF MS, including the complex ionization mechanism of MALDI, sensitive characteristics of the applied electric fields and the mass-dependent detection efficiency of ion detectors. General quantitative ionization and desorption interpretations of ion production are described. Important instrument parameters and available methods of MALDI-TOF MS used for quantitative analysis are also reviewed. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644968
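    One of the standard quantification workarounds discussed in such reviews, internal-standard ratio calibration, can be sketched as follows; the concentrations and intensity ratios below are invented for illustration:

```python
# Internal-standard calibration, a common mitigation for MALDI's
# shot-to-shot intensity variability (all values hypothetical).

conc = [1.0, 2.0, 5.0, 10.0]         # analyte concentration (a.u.)
ratio = [0.21, 0.39, 1.02, 1.98]     # analyte / internal-standard signal

n = len(conc)
mx = sum(conc) / n
my = sum(ratio) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, ratio)) / \
        sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

def quantify(measured_ratio):
    """Invert the least-squares calibration line to estimate concentration."""
    return (measured_ratio - intercept) / slope

print(round(quantify(0.80), 2))  # close to 4.0 for this synthetic line
```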

  18. A sub-sampled approach to extremely low-dose STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, A.; Luzi, L.; Yang, H.

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻/Å²) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).
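    A toy version of sub-sample-then-inpaint, using random sampling and a simple neighbor-mean fill on a smooth synthetic "specimen" (this is not the authors' reconstruction algorithm, just an illustration of the acquisition idea):

```python
import random

random.seed(1)
N = 16
# Smooth synthetic ground-truth image (a linear ramp).
truth = [[(i + j) / (2 * N - 2) for j in range(N)] for i in range(N)]

# Randomly acquire ~25% of the pixels (simulating a low-dose scan).
known = {(i, j) for i in range(N) for j in range(N) if random.random() < 0.25}
img = [[truth[i][j] if (i, j) in known else None for j in range(N)]
       for i in range(N)]

# Iteratively fill unknown pixels with the mean of already-known neighbors.
for _ in range(2 * N):
    for i in range(N):
        for j in range(N):
            if img[i][j] is None:
                vals = [img[x][y]
                        for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= x < N and 0 <= y < N and img[x][y] is not None]
                if vals:
                    img[i][j] = sum(vals) / len(vals)

err = max(abs(img[i][j] - truth[i][j]) for i in range(N) for j in range(N))
print(round(err, 3))  # reconstruction error stays small on a smooth image
```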

  19. Indel analysis by droplet digital PCR: a sensitive method for DNA mixture detection and chimerism analysis.

    PubMed

    Santurtún, Ana; Riancho, José A; Arozamena, Jana; López-Duarte, Mónica; Zarrabeitia, María T

    2017-01-01

    Several methods have been developed to determine genetic profiles from mixed samples and to analyze chimerism in transplanted patients. The aim of this study was to explore the effectiveness of using droplet digital PCR (ddPCR) for mixed chimerism detection (a mixture of genetic profiles resulting after allogeneic hematopoietic stem cell transplantation (HSCT)). We analyzed 25 DNA samples from patients who had undergone HSCT and compared the performance of ddPCR with that of two established methods for chimerism detection, based upon Indel and STR analysis, respectively. Additionally, eight artificial DNA mixtures were created to evaluate the sensitivity of ddPCR. Our results show that the chimerism percentages estimated by the analysis of a single Indel using ddPCR were very similar to those calculated by the amplification of 15 STRs (r² = 0.970) and to the results obtained by the amplification of 38 Indels (r² = 0.975). Moreover, the amplification of a single Indel by ddPCR was sensitive enough to detect a minor DNA contributor comprising as little as 0.5 % of the sample. We conclude that ddPCR can be a powerful tool for the determination of a genetic profile from forensic mixtures and for clinical chimerism analysis when traditional techniques are not sensitive enough.
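    The core ddPCR calculation, Poisson correction of droplet counts followed by a donor/recipient ratio, can be sketched as follows; the droplet counts below are hypothetical, not the study's data:

```python
import math

def copies_per_droplet(positive, total):
    """Poisson correction: lambda = -ln(fraction of negative droplets)."""
    return -math.log((total - positive) / total)

total = 15000
donor_pos = 120        # droplets positive for a donor-specific indel
recipient_pos = 11800  # droplets positive for a recipient-specific indel

lam_d = copies_per_droplet(donor_pos, total)
lam_r = copies_per_droplet(recipient_pos, total)
chimerism_pct = 100 * lam_d / (lam_d + lam_r)
print(round(chimerism_pct, 2))  # a minor contributor near the 0.5% level
```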

  20. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability

    PubMed Central

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P.; Kumar, G. Manoj

    2012-01-01

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g. quality assurance and process monitoring. Specifically, variability in sample, system and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a non-linear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), due to its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples, as well as in related areas of forensic and biological sample analysis. PMID:22292496

  1. Approach of technical decision-making by element flow analysis and Monte-Carlo simulation of municipal solid waste stream.

    PubMed

    Tian, Bao-Guo; Si, Ji-Tao; Zhao, Yan; Wang, Hong-Tao; Hao, Ji-Ming

    2007-01-01

    This paper deals with the procedure and methodology that can be used to select the optimal treatment and disposal technology for municipal solid waste (MSW), and to provide practical and effective technical support for policy-making, on the basis of a study of solid waste management status and development trends in China and abroad. Focusing on various treatment and disposal technologies and processes for MSW, this study established a Monte-Carlo mathematical model of cost minimization for MSW handling subject to environmental constraints. A new method of element stream (such as C, H, O, N, S) analysis in combination with economic stream analysis of MSW was developed. By following the streams of different treatment processes consisting of various techniques from generation, separation, transfer, transport, treatment, recycling and disposal of the wastes, the element constitution as well as its economic distribution in terms of possibility functions was identified. Every technique step was evaluated economically. The Monte-Carlo method was then applied for model calibration. Sensitivity analysis was also carried out to identify the most sensitive factors. Model calibration indicated that landfill with power generation from landfill gas was economically the optimal technology at the present stage, under the condition that more than 58% of the C, H, O, N and S goes to landfill. Whether or not to generate electricity was the most sensitive factor. If landfilling costs increase, MSW separation treatment is recommended: screening first, followed by partial incineration and partial composting, with residue landfilling. The possibility of incineration being selected as the optimal technology was affected by city scale. For big cities and metropolises with large MSW generation, the possibility of constructing large-scale incineration facilities increases, whereas for middle-sized and small cities the effectiveness of incinerating waste decreases.
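    A minimal Monte-Carlo sketch of the cost-comparison idea, with invented cost distributions for two handling options (this is not the paper's model or data, only an illustration of sampling uncertain unit costs and comparing options):

```python
import random

random.seed(42)

def landfill_cost():
    tipping = random.uniform(15, 25)      # $/t, uncertain unit cost
    gas_revenue = random.uniform(2, 8)    # $/t offset from power generation
    return tipping - gas_revenue

def incineration_cost():
    capital = random.uniform(30, 50)      # $/t annualised capital cost
    energy_revenue = random.uniform(10, 25)
    return capital - energy_revenue

# Sample both options many times and estimate the probability that
# landfill-with-power-generation is the cheaper choice.
n = 10000
landfill = [landfill_cost() for _ in range(n)]
incin = [incineration_cost() for _ in range(n)]
p_landfill_cheaper = sum(l < i for l, i in zip(landfill, incin)) / n
print(round(p_landfill_cheaper, 3))
```

In a fuller analysis one would also rank the input parameters by how strongly they drive the output spread, which is the sensitivity analysis step the abstract describes.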

  2. Incorporation of support vector machines in the LIBS toolbox for sensitive and robust classification amidst unexpected sample and system variability.

    PubMed

    Dingari, Narahara Chari; Barman, Ishan; Myakalwar, Ashwin Kumar; Tewari, Surya P; Kumar Gundawar, Manoj

    2012-03-20

    Despite the intrinsic elemental analysis capability and lack of sample preparation requirements, laser-induced breakdown spectroscopy (LIBS) has not been extensively used for real-world applications, e.g., quality assurance and process monitoring. Specifically, variability in sample, system, and experimental parameters in LIBS studies presents a substantive hurdle for robust classification, even when standard multivariate chemometric techniques are used for analysis. Considering pharmaceutical sample investigation as an example, we propose the use of support vector machines (SVM) as a nonlinear classification method over conventional linear techniques such as soft independent modeling of class analogy (SIMCA) and partial least-squares discriminant analysis (PLS-DA) for discrimination based on LIBS measurements. Using over-the-counter pharmaceutical samples, we demonstrate that the application of SVM enables statistically significant improvements in prospective classification accuracy (sensitivity), because of its ability to address variability in LIBS sample ablation and plasma self-absorption behavior. Furthermore, our results reveal that SVM provides nearly 10% improvement in correct allocation rate and a concomitant reduction in misclassification rates of 75% (cf. PLS-DA) and 80% (cf. SIMCA) when measurements from samples not included in the training set are incorporated in the test data, highlighting its robustness. While further studies on a wider matrix of sample types performed using different LIBS systems are needed to fully characterize the capability of SVM to provide superior predictions, we anticipate that the improved sensitivity and robustness observed here will facilitate application of the proposed LIBS-SVM toolbox for screening drugs and detecting counterfeit samples, as well as in related areas of forensic and biological sample analysis.

  3. [Enzymatic analysis of the quality of foodstuffs].

    PubMed

    Kolesnov, A Iu

    1997-01-01

    Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these contents cannot be determined by conventional methods, or if methods are available, they are determined only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, safety of reagents.

  4. Laboratory Workflow Analysis of Culture of Periprosthetic Tissues in Blood Culture Bottles.

    PubMed

    Peel, Trisha N; Sedarski, John A; Dylla, Brenda L; Shannon, Samantha K; Amirahmadi, Fazlollaah; Hughes, John G; Cheng, Allen C; Patel, Robin

    2017-09-01

    Culture of periprosthetic tissue specimens in blood culture bottles is more sensitive than conventional techniques, but the impact on laboratory workflow has yet to be addressed. Herein, we examined the impact of culture of periprosthetic tissues in blood culture bottles on laboratory workflow and cost. The workflow was process mapped, decision tree models were constructed using probabilities of positive and negative cultures drawn from our published study (T. N. Peel, B. L. Dylla, J. G. Hughes, D. T. Lynch, K. E. Greenwood-Quaintance, A. C. Cheng, J. N. Mandrekar, and R. Patel, mBio 7:e01776-15, 2016, https://doi.org/10.1128/mBio.01776-15), and the processing times and resource costs from the laboratory staff time viewpoint were used to compare periprosthetic tissues culture processes using conventional techniques with culture in blood culture bottles. Sensitivity analysis was performed using various rates of positive cultures. Annualized labor savings were estimated based on salary costs from the U.S. Labor Bureau for Laboratory staff. The model demonstrated a 60.1% reduction in mean total staff time with the adoption of tissue inoculation into blood culture bottles compared to conventional techniques (mean ± standard deviation, 30.7 ± 27.6 versus 77.0 ± 35.3 h per month, respectively; P < 0.001). The estimated annualized labor cost savings of culture using blood culture bottles was $10,876.83 (±$337.16). Sensitivity analysis was performed using various rates of culture positivity (5 to 50%). Culture in blood culture bottles was cost-effective, based on the estimated labor cost savings of $2,132.71 for each percent increase in test accuracy. In conclusion, culture of periprosthetic tissue in blood culture bottles is not only more accurate than but is also cost-saving compared to conventional culture methods. Copyright © 2017 American Society for Microbiology.
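    The decision-tree expected-time comparison described above reduces to a probability-weighted average over the positive/negative culture branches; a sketch with hypothetical per-specimen processing times, not the study's measured values:

```python
# Expected staff-time comparison from a simple two-branch decision tree.
# All probabilities and minute values below are invented for illustration.

def expected_minutes(p_positive, t_negative, t_positive):
    """Expected hands-on time per specimen over the two branches."""
    return p_positive * t_positive + (1 - p_positive) * t_negative

# Hypothetical per-specimen times (minutes) for each workflow.
conventional = expected_minutes(0.20, t_negative=25, t_positive=60)
bottles = expected_minutes(0.20, t_negative=8, t_positive=35)

saving_pct = 100 * (conventional - bottles) / conventional
print(round(saving_pct, 1))

# Sensitivity analysis over the culture-positivity rate, as in the study.
for p in (0.05, 0.25, 0.50):
    c, b = expected_minutes(p, 25, 60), expected_minutes(p, 8, 35)
    print(p, round(100 * (c - b) / c, 1))
```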

  5. Dual-echo ASL based assessment of motor networks: a feasibility study

    NASA Astrophysics Data System (ADS)

    Storti, Silvia Francesca; Boscolo Galazzo, Ilaria; Pizzini, Francesca B.; Menegaz, Gloria

    2018-04-01

    Objective. The dual-echo arterial spin labeling (DE-ASL) technique has recently been proposed for the simultaneous acquisition of ASL and blood-oxygenation-level-dependent (BOLD) functional magnetic resonance imaging (fMRI) data. The performance of this technique in detecting functional connectivity at rest or during motor and motor imagery tasks is still unexplored, both per se and in comparison with conventional methods. The purpose is to quantify the sensitivity of the DE-ASL sequence with respect to the conventional fMRI sequence (cvBOLD) in detecting brain activations, and to assess and compare the relevance of node features in decoding the network structure. Approach. Thirteen volunteers were scanned with a pseudo-continuous DE-ASL sequence, from which the concomitant BOLD (ccBOLD) can be extracted simultaneously with the ASL. The approach consists of two steps: (i) model-based analyses for assessing brain activations at individual and group levels, followed by statistical analysis comparing the activation elicited by the three sequences under two conditions (motor and motor imagery), respectively; (ii) brain-connectivity graph-theoretical analysis for assessing and comparing the properties of the network models. Main results. Our results suggest that cvBOLD and ccBOLD have comparable sensitivity in detecting the regions involved in the active task, whereas ASL offers a higher degree of co-localization with smaller activation volumes. The connectivity results and the comparative analysis of node features across sequences revealed that there are no strong changes between rest and tasks, and that the differences between the sequences are limited to a few connections. Significance. Considering the comparable sensitivity of the ccBOLD and cvBOLD sequences in detecting activated brain regions, the results demonstrate that DE-ASL can be successfully applied in functional studies, allowing both ASL and BOLD information to be obtained within a single sequence. Further, DE-ASL is a powerful technique for research and clinical applications, allowing quantitative comparisons to be performed and functional connectivity to be characterized.

  6. An alternative method for analysis of food taints using stir bar sorptive extraction.

    PubMed

    Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M

    2010-09-10

    The determination of taints in food products currently can involve the use of several sample extraction techniques, including direct headspace (DHS), steam distillation extraction (SDE) and more recently solid phase microextraction (SPME). Each of these techniques has disadvantages, such as the use of large volumes of solvents (SDE), or limitations in sensitivity (DHS), or have only been applied to date for determination of individual or specific groups of tainting compounds (SPME). The use of stir bar sorptive extraction (SBSE) has been evaluated as a quantitative screening method for unknown tainting compounds in foods. A range of commonly investigated problem compounds, with a range of physical and chemical properties, were examined. The method was optimised to give the best response for the majority of compounds and the performance was evaluated by examining the accuracy, precision, linearity, limits of detection and quantitation and uncertainties for each analyte. For most compounds SBSE gave the lowest limits of detection compared to steam distillation extraction or direct headspace analysis and in general was better than these established techniques. However, for methyl methacrylate and hexanal no response was observed following stir bar extraction under the optimised conditions. The assays were carried out using a single quadrupole GC-MS in scan mode. A comparison of acquisition modes and instrumentation was performed using standards to illustrate the increase in sensitivity possible using more targeted ion monitoring or a more sensitive high resolution mass spectrometer. This comparison illustrated the usefulness of this approach as an alternative to specialised glassware or expensive instrumentation. SBSE in particular offers a 'greener' extraction method by a large reduction in the use of organic solvents and also minimises the potential for contamination from external laboratory sources, which is of particular concern for taint analysis. 
Copyright © 2010 Elsevier B.V. All rights reserved.
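    Validation figures such as limits of detection in method evaluations like the one above are typically derived from the calibration line (e.g. LOD ≈ 3.3·s/slope); a sketch with invented calibration data, not values from this study:

```python
# Least-squares calibration line and detection-limit estimate
# (LOD ~ 3.3 * residual SD / slope). All data points are hypothetical.

conc = [0.0, 1.0, 2.0, 4.0, 8.0]        # spiked taint level (ug/kg)
resp = [0.02, 0.55, 1.04, 2.11, 4.02]   # instrument response

n = len(conc)
mx, my = sum(conc) / n, sum(resp) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, resp)) / \
        sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

# Residual standard deviation about the fitted line (n - 2 dof).
residuals = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
s = (sum(r * r for r in residuals) / (n - 2)) ** 0.5

lod = 3.3 * s / slope
print(round(slope, 3), round(lod, 3))
```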

  7. [Amanitine determination as an example of peptide analysis in the biological samples with HPLC-MS technique].

    PubMed

    Janus, Tomasz; Jasionowicz, Ewa; Potocka-Banaś, Barbara; Borowiak, Krzysztof

    Routine toxicological analysis is mostly focused on the identification of non-organic and organic, chemically diverse compounds, but generally those of low mass, usually not greater than 500–600 Da. Peptide compounds with molecular mass higher than 900 Da form a specific analytical group. Several dozen of them are highly toxic substances well known in toxicological practice, for example mushroom toxins and animal venoms. In this paper the authors use the example of alpha-amanitin to explain the analytical problems, and various original solutions, in identifying peptides in urine samples with the use of a universal LC-MS/MS procedure. The analyzed material consisted of urine samples collected from patients with suspected mushroom intoxication, routinely diagnosed by amanitin determination. Ultrafiltration with centrifuge filter tubes (mass cutoff limit 3 kDa) was used. The filtrate was directly injected onto the chromatographic column and analyzed with a mass detector (MS/MS). The separation of peptides, as organic, amphoteric compounds, from biological material with the use of the SPE technique is well known but requires dedicated, specific columns. The presented paper proved that with the fast and simple ultrafiltration technique amanitin can be effectively isolated from urine, and the procedure offers satisfactory sensitivity of detection and eliminates the influence of the biological matrix on analytical results. Another problem which had to be solved was the non-characteristic fragmentation of peptides in the MS/MS procedure, producing non-selective chromatograms. It is possible to use higher collision energies in the analytical procedure, which results in more characteristic mass spectra, although it offers lower sensitivity. The ultrafiltration technique as a sample preparation procedure is effective for the isolation of amanitin from the biological matrix.
The monitoring of selected mass corresponding to transition with the loss of water molecule offers satisfactory sensitivity of determination.

  8. Precision cleaning verification of fluid components by air/water impingement and total carbon analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1994-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 sq m. Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging/diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/sq ft of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 sq m.
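    Converting a TOC reading back to an NVR level via a contaminant-specific sensitivity factor (ppm carbon per mg/sq ft) is a one-line calculation; the factor below is invented for illustration, not one of the measured values:

```python
# NVR level inferred from a total-organic-carbon measurement using a
# contaminant-specific sensitivity factor (hypothetical value).

def nvr_level(toc_ppm_c, sensitivity_ppm_per_mg_ft2):
    """NVR (mg/sq ft) = TOC (ppm carbon) / sensitivity factor."""
    return toc_ppm_c / sensitivity_ppm_per_mg_ft2

# e.g. a hydrocarbon grease with an assumed factor of 1.6 ppm C per mg/sq ft
print(round(nvr_level(toc_ppm_c=2.4, sensitivity_ppm_per_mg_ft2=1.6), 2))  # 1.5
```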

  9. Precision Cleaning Verification of Fluid Components by Air/Water Impingement and Total Carbon Analysis

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.; Fogarty, Chris; Cantrell, Chris; Melton, Gregory S.

    1995-01-01

    NASA personnel at Kennedy Space Center's Material Science Laboratory have developed new environmentally sound precision cleaning and verification techniques for systems and components found at the center. This technology is required to replace existing methods traditionally employing CFC-113. The new patent-pending technique of precision cleaning verification is for large components of cryogenic fluid systems. These are stainless steel, sand cast valve bodies with internal surface areas ranging from 0.2 to 0.9 m(exp 2). Extrapolation of this technique to components of even larger sizes (by orders of magnitude) is planned. Currently, the verification process is completely manual. In the new technique, a high velocity, low volume water stream impacts the part to be verified. This process is referred to as Breathing Air/Water Impingement and forms the basis for the Impingement Verification System (IVS). The system is unique in that a gas stream is used to accelerate the water droplets to high speeds. Water is injected into the gas stream in a small, continuous amount. The air/water mixture is then passed through a converging-diverging nozzle where the gas is accelerated to supersonic velocities. These droplets impart sufficient energy to the precision cleaned surface to place non-volatile residue (NVR) contaminants into suspension in the water. The sample water is collected and its NVR level is determined by total organic carbon (TOC) analysis at 880 C. The TOC, in ppm carbon, is used to establish the NVR level. A correlation between the present gravimetric CFC-113 NVR and the IVS NVR is found from experimental sensitivity factors measured for various contaminants. The sensitivity has the units of ppm of carbon per mg/ft(exp 2) of contaminant. In this paper, the equipment is described and data are presented showing the development of the sensitivity factors from a test set including four NVRs impinged from witness plates of 0.05 to 0.75 m(exp 2).

  10. High-resolution high-sensitivity elemental imaging by secondary ion mass spectrometry: from traditional 2D and 3D imaging to correlative microscopy

    NASA Astrophysics Data System (ADS)

    Wirtz, T.; Philipp, P.; Audinot, J.-N.; Dowsett, D.; Eswara, S.

    2015-10-01

    Secondary ion mass spectrometry (SIMS) constitutes an extremely sensitive technique for imaging surfaces in 2D and 3D. Apart from its excellent sensitivity and high lateral resolution (50 nm on state-of-the-art SIMS instruments), advantages of SIMS include high dynamic range and the ability to differentiate between isotopes. This paper first reviews the underlying principles of SIMS as well as the performance and applications of 2D and 3D SIMS elemental imaging. The prospects for further improving the capabilities of SIMS imaging are discussed. The lateral resolution in SIMS imaging when using the microprobe mode is limited by (i) the ion probe size, which is dependent on the brightness of the primary ion source, the quality of the optics of the primary ion column and the electric fields in the near sample region used to extract secondary ions; (ii) the sensitivity of the analysis as a reasonable secondary ion signal, which must be detected from very tiny voxel sizes and thus from a very limited number of sputtered atoms; and (iii) the physical dimensions of the collision cascade determining the origin of the sputtered ions with respect to the impact site of the incident primary ion probe. One interesting prospect is the use of SIMS-based correlative microscopy. In this approach SIMS is combined with various high-resolution microscopy techniques, so that elemental/chemical information at the highest sensitivity can be obtained with SIMS, while excellent spatial resolution is provided by overlaying the SIMS images with high-resolution images obtained by these microscopy techniques. Examples of this approach are given by presenting in situ combinations of SIMS with transmission electron microscopy (TEM), helium ion microscopy (HIM) and scanning probe microscopy (SPM).

  11. Fetal Electrocardiogram Extraction and Analysis Using Adaptive Noise Cancellation and Wavelet Transformation Techniques.

    PubMed

    Sutha, P; Jayanthi, V E

    2017-12-08

    Birth defect-related demise is mainly due to congenital heart defects. Problems can be identified at an earlier stage of pregnancy by obtaining information about the fetus, helping to avoid stillbirths. The gold standard for monitoring the health status of the fetus, cardiotocography (CTG), cannot be used for long-duration and continuous monitoring. There is a need for continuous, long-duration monitoring of fetal ECG (fECG) signals with portable devices to study the progressive health status of the fetus. Non-invasive electrocardiogram recording is one of the best methods for diagnosing fetal cardiac problems, in preference to invasive methods. Monitoring the fECG requires the development of miniaturized hardware and efficient signal processing algorithms to extract the fECG embedded in the maternal ECG. This paper discusses a prototype hardware developed to monitor and record the raw maternal ECG signal containing the fECG, and signal processing algorithms to extract the fetal electrocardiogram. We propose two signal processing methods: the first is based on the Least Mean Square (LMS) adaptive noise cancellation technique and the second on the wavelet transform. A prototype hardware was designed and developed to acquire the raw ECG signal containing the maternal and fetal ECG; the signal processing techniques were used to eliminate noise and extract the fetal ECG, and the fetal heart rate variability was studied. Both methods were evaluated with signals acquired from a fetal ECG simulator, from the PhysioNet database, and from a subject. They are evaluated by finding the heart rate and its variability, the amplitude spectrum, and the mean value of the extracted fetal ECG; the accuracy, sensitivity, and positive predictive value of the fetal QRS detection technique are also determined. The adaptive filtering technique uses the sign-sign LMS algorithm, and the wavelet technique uses the Daubechies wavelet; both are employed along with denoising techniques for extraction of the fetal electrocardiogram. Both methods have good sensitivity and accuracy: the adaptive method achieves 96.83% sensitivity and 89.87% accuracy, and the wavelet method 95.97% sensitivity and 88.5% accuracy. Additionally, time-domain parameters from the heart rate variability plots of mother and fetus are analyzed.
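The sign-sign LMS noise canceller named above can be sketched as follows. This is a generic textbook formulation on synthetic signals, not the authors' implementation; the tap count, step size, and test waveforms are assumptions:

```python
import numpy as np

def sign_sign_lms(ref, primary, n_taps=8, mu=0.005):
    """Sign-sign LMS adaptive noise cancellation (sketch).

    ref     : reference input correlated with the interference (maternal ECG)
    primary : primary input = fetal component + interference
    Returns the error signal, which approximates the fetal component.
    """
    w = np.zeros(n_taps)
    err = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = ref[n - n_taps:n][::-1]          # most recent reference samples
        y = w @ x                            # filter's estimate of interference
        e = primary[n] - y                   # error = residual (fetal) component
        w = w + mu * np.sign(e) * np.sign(x) # sign-sign update (cheap in hardware)
        err[n] = e
    return err

# Demo with synthetic signals: the interference at the abdominal lead is a
# delayed, scaled copy of the reference; the canceller suppresses it.
n = np.arange(4000)
ref = np.sin(0.1 * n)
primary = 0.9 * np.sin(0.1 * (n - 1))
fecg_estimate = sign_sign_lms(ref, primary, n_taps=4, mu=0.01)
```

After convergence the residual power is far below the interference power, which is the property the extraction step relies on.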

  12. Sensitivity Analysis of the Static Aeroelastic Response of a Wing

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.

    1993-01-01

    A technique to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model is designed and implemented. The formulation is quite general and accepts any aerodynamic and structural analysis capability. A program to combine the discipline-level, or local, sensitivities into global sensitivity derivatives is developed. A variety of representations of the wing pressure field are developed and tested to determine the most accurate and efficient scheme for representing the field outside of the aerodynamic code. Chebyshev polynomials are used to globally fit the pressure field. This approach had some difficulties in representing local variations in the field, so a variety of local interpolation polynomial pressure representations are also implemented. These panel-based representations use a constant pressure value, a bilinearly interpolated value, or a biquadratically interpolated value. The interpolation polynomial approaches do an excellent job of reducing the numerical problems of the global approach for comparable computational effort. Regardless of the pressure representation used, sensitivity and response results with excellent accuracy have been produced for large integrated quantities such as wing tip deflection and trim angle of attack. The sensitivities of quantities such as individual generalized displacements have been found with fair accuracy. In general, accuracy is found to be proportional to the relative size of the derivative with respect to the quantity itself.
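The two representation strategies compared above can be sketched in one dimension. The pressure field below is invented for illustration (not the report's data), and the panel count and fit degree are assumptions:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def cp_field(x):
    """Made-up pressure coefficient with a suction peak near x = -0.5."""
    return -1.2 * np.exp(-8 * (x + 0.5) ** 2) + 0.3 * x

x_panels = np.linspace(-1, 1, 21)            # panel stations from the "CFD"
cp_panels = cp_field(x_panels)

coeffs = C.chebfit(x_panels, cp_panels, deg=12)   # global Chebyshev fit

x_fine = np.linspace(-1, 1, 201)
# Global fit vs. local (piecewise-linear) panel interpolation, evaluated
# between the panel stations where the representations actually differ:
err_global = np.max(np.abs(C.chebval(x_fine, coeffs) - cp_field(x_fine)))
err_local = np.max(np.abs(np.interp(x_fine, x_panels, cp_panels) - cp_field(x_fine)))
print(err_global, err_local)
```

A sharp local feature like the suction peak is exactly the situation where, as the abstract notes, a global fit struggles and local panel interpolation becomes attractive.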

  13. Rare cell isolation and analysis in microfluidics

    PubMed Central

    Chen, Yuchao; Li, Peng; Huang, Po-Hsun; Xie, Yuliang; Mai, John D.; Wang, Lin; Nguyen, Nam-Trung; Huang, Tony Jun

    2014-01-01

    Rare cells are low-abundance cells in a much larger population of background cells. Conventional benchtop techniques have limited capabilities to isolate and analyze rare cells because of their generally low selectivity and significant sample loss. Recent rapid advances in microfluidics have been providing robust solutions to the challenges in the isolation and analysis of rare cells. In addition to the apparent performance enhancements resulting in higher efficiencies and sensitivity levels, microfluidics provides other advanced features such as simpler handling of small sample volumes and multiplexing capabilities for high-throughput processing. All of these advantages make microfluidics an excellent platform to deal with the transport, isolation, and analysis of rare cells. Various cellular biomarkers, including physical properties, dielectric properties, as well as immunoaffinities, have been explored for isolating rare cells. In this Focus article, we discuss the design considerations of representative microfluidic devices for rare cell isolation and analysis. Examples from recently published works are discussed to highlight the advantages and limitations of the different techniques. Various applications of these techniques are then introduced. Finally, a perspective on the development trends and promising research directions in this field is proposed. PMID:24406985

  14. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study.

    PubMed

    MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M

    2016-01-01

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
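The single-model workflow described above (steady-state and sensitivity analysis of an ODE system) can be sketched on a toy model. The two-variable activation/degradation system below is a generic stand-in, not the Wnt pathway model:

```python
import numpy as np

def rhs(y, p):
    a, b = y
    k1, k2, k3 = p            # production, conversion, degradation rates
    return np.array([k1 - k2 * a, k2 * a - k3 * b])

def steady_state(p, y0=(1.0, 1.0), dt=0.01, steps=20000):
    y = np.array(y0, dtype=float)
    for _ in range(steps):    # forward Euler, run long enough to equilibrate
        y = y + dt * rhs(y, p)
    return y

def steady_state_sensitivity(p, i, h=1e-6):
    """d(steady state)/d(p_i) by central finite differences."""
    hi, lo = list(p), list(p)
    hi[i] += h
    lo[i] -= h
    return (steady_state(hi) - steady_state(lo)) / (2 * h)

p = (1.0, 0.5, 0.25)
print(steady_state(p))                  # analytically (k1/k2, k1/k3) = (2, 4)
print(steady_state_sensitivity(p, 0))   # d/dk1 = (1/k2, 1/k3) = (2, 4)
```

For this linear system the finite-difference sensitivities can be checked against the analytic steady state, which is the kind of validation step such analyses rely on.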

  15. Two-dimensional fuzzy fault tree analysis for chlorine release from a chlor-alkali industry using expert elicitation.

    PubMed

    Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B

    2010-11-15

    The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from the storage and filling facility of a chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
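The quantitative side of FTA combines basic-event probabilities through AND/OR gates. A minimal sketch, assuming independent events; the event names and probabilities are hypothetical, not taken from the chlorine-release study:

```python
# Combining independent basic-event probabilities through fault tree gates.

def p_and(*ps):                 # AND gate: all events occur
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):                  # OR gate: at least one event occurs
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical basic events for a release scenario:
valve_fails, gasket_fails, operator_error = 1e-3, 5e-4, 2e-3
relief_fails = 1e-2

# Top event: (any initiating fault) AND (relief system failure).
p_release = p_and(p_or(valve_fails, gasket_fails, operator_error), relief_fails)
print(f"{p_release:.2e}")
```

A basic event's percentage contribution (the sensitivity analysis mentioned above) can then be probed by perturbing each probability and recomputing the top event.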

  16. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. 
© The Author(s) 2016.
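The Morris elementary-effects screening step described above can be sketched on a stand-in model. The analytic test function below replaces the (expensive) System Dynamics model, and the trajectory count and step size are assumptions:

```python
import numpy as np

def model(x):                          # placeholder for the simulation model
    return x[0] + 2 * x[1] ** 2 + 0.01 * x[2]

def morris_ee(model, k, trajectories=50, delta=0.1, seed=0):
    """Mean absolute elementary effect (mu*) for each of k factors in [0, 1]."""
    rng = np.random.default_rng(seed)
    ee = [[] for _ in range(k)]
    for _ in range(trajectories):
        x = rng.uniform(0, 1 - delta, size=k)
        base = model(x)
        for i in rng.permutation(k):   # perturb one factor at a time
            x2 = x.copy()
            x2[i] += delta
            y2 = model(x2)
            ee[i].append((y2 - base) / delta)
            x, base = x2, y2           # walk along a trajectory
    return [np.mean(np.abs(e)) for e in ee]

mu_star = morris_ee(model, k=3)
print(mu_star)   # factor 2 dominant, factor 1 moderate, factor 3 negligible
```

Factors with large mu* are flagged as influential (candidates for calibration); those near zero can be fixed at best-guess values, mirroring the 29/7/24 split reported above.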

  17. EBUS-Guided Cautery-Assisted Transbronchial Forceps Biopsies: Safety and Sensitivity Relative to Transbronchial Needle Aspiration

    PubMed Central

    Bramley, Kyle; Pisani, Margaret A.; Murphy, Terrence E.; Araujo, Katy; Homer, Robert; Puchalski, Jonathan

    2016-01-01

    Background EBUS-guided transbronchial needle aspiration (TBNA) is important in the evaluation of thoracic lymphadenopathy. Reliably providing excellent diagnostic yield for malignancy, its diagnosis of sarcoidosis is inconsistent. Furthermore, when larger “core” biopsy samples of malignant tissue are required, TBNA may not suffice. The primary objective of this study was to determine if the sequential use of TBNA and a novel technique called cautery-assisted transbronchial forceps biopsies (ca-TBFB) was safe. Secondary outcomes included sensitivity and successful acquisition of tissue. Methods Fifty unselected patients undergoing convex probe EBUS were prospectively enrolled. Under EBUS guidance, all lymph nodes ≥ 1 cm were sequentially biopsied using TBNA and ca-TBFB. Safety and sensitivity were assessed at the nodal level for 111 nodes. Results of each technique were also reported on a per-patient basis. Results There were no significant adverse events. In nodes determined to be malignant, TBNA provided higher sensitivity (100%) than ca-TBFB (78%). However, among nodes with granulomatous inflammation, ca-TBFB exhibited higher sensitivity (90%) than TBNA (33%). For analysis based on patients rather than nodes, 6 of the 31 patients with malignancy would have been missed or understaged if the diagnosis was based on samples obtained by ca-TBFB. On the other hand, 3 of 8 patients with sarcoidosis would have been missed if analysis was based only on TBNA samples. In some cases only ca-TBFB acquired sufficient tissue for the core samples needed in clinical trials of malignancy. Conclusions The sequential use of TBNA and ca-TBFB appears to be safe. The larger samples obtained from ca-TBFB increased its sensitivity to detect granulomatous disease and provided specimens for clinical trials of malignancy when needle biopsies were insufficient. For thoracic surgeons and advanced bronchoscopists, we advocate ca-TBFB as an alternative to TBNA in select clinical scenarios. 
PMID:26912301

  18. Endobronchial Ultrasound-Guided Cautery-Assisted Transbronchial Forceps Biopsies: Safety and Sensitivity Relative to Transbronchial Needle Aspiration.

    PubMed

    Bramley, Kyle; Pisani, Margaret A; Murphy, Terrence E; Araujo, Katy L; Homer, Robert J; Puchalski, Jonathan T

    2016-05-01

    Endobronchial ultrasound (EBUS)-guided transbronchial needle aspiration (TBNA) is important in the evaluation of thoracic lymphadenopathy. Reliably providing excellent diagnostic yield for malignancy, its diagnosis of sarcoidosis is inconsistent. Furthermore, TBNA may not suffice when larger "core biopsy" samples of malignant tissue are required. The primary objective of this study was to determine if the sequential use of TBNA and a novel technique called cautery-assisted transbronchial forceps biopsy (ca-TBFB) was safe. Secondary outcomes included sensitivity and successful acquisition of tissue. The study prospectively enrolled 50 unselected patients undergoing convex-probe EBUS. All lymph nodes exceeding 1 cm were sequentially biopsied under EBUS guidance using TBNA and ca-TBFB. Safety and sensitivity were assessed at the nodal level for 111 nodes. Results of each technique were also reported for each patient. There were no significant adverse events. In nodes determined to be malignant, TBNA provided higher sensitivity (100%) than ca-TBFB (78%). However, among nodes with granulomatous inflammation, ca-TBFB exhibited higher sensitivity (90%) than TBNA (33%). On the one hand, for analysis based on patients rather than nodes, 6 of the 31 patients with malignancy would have been missed or understaged if the diagnosis were based on samples obtained by ca-TBFB. On the other hand, 3 of 8 patients with sarcoidosis would have been missed if analysis were based only on TBNA samples. In some patients, only ca-TBFB acquired sufficient tissue for the core samples needed in clinical trials of malignancy. The sequential use of TBNA and ca-TBFB appears to be safe. The larger samples obtained from ca-TBFB increased its sensitivity to detect granulomatous disease and provided adequate specimens for clinical trials of malignancy when specimens from needle biopsies were insufficient. 
For thoracic surgeons and advanced bronchoscopists, we advocate ca-TBFB as an alternative to TBNA in select clinical scenarios. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  19. Combined ultrasound-guided cutting-needle biopsy and standard pleural biopsy for diagnosis of malignant pleural effusions.

    PubMed

    Wang, Jinlin; Zhou, Xinghua; Xie, Xiaohong; Tang, Qing; Shen, Panxiao; Zeng, Yunxiang

    2016-11-17

    The most efficient approach for diagnosing malignant pleural effusions (MPEs) remains controversial and uncertain. This study aimed to evaluate the utility of a combined approach using ultrasound (US)-guided cutting-needle biopsy (CNB) and standard pleural biopsy (SPB) for diagnosing MPE. Pleural effusions were collected from 172 patients for biochemical and microbiological analyses. US-guided CNB and SPB were performed sequentially in the same operation to obtain specimens for histological analysis. The US-guided CNB and SPB procedures provided adequate material for histological analysis in 90.7 and 93.0% of cases, respectively, while the combination of the two techniques did so in 96.5% of cases. The sensitivity, specificity, positive-predictive value (PPV), negative-predictive value (NPV) and diagnostic accuracy of US-guided CNB versus SPB were: 51.2 vs 63.4%, 100 vs 100%, 100 vs 100%, 64.9 vs 72.2% and 74.4 vs 81.3%, respectively. When CNB was combined with SPB, the corresponding values were 88.6, 100, 100, 88.6 and 93.9%, respectively. Whereas sensitivity, NPV and diagnostic accuracy were not significantly different between CNB and SPB, the combination of CNB and SPB significantly improved the sensitivity, NPV and diagnostic accuracy versus each technique alone (p < 0.05). Significant pain (eight patients), moderate haemoptysis (two patients) and chest wall haematomas (two patients) were observed following CNB, while syncope (four patients) and a slight pneumothorax (four patients) were observed following SPB. The combination of US-guided CNB and SPB afforded a high sensitivity for diagnosing MPEs and is a convenient and safe approach.
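The accuracy figures above come from standard confusion-matrix arithmetic. A sketch with hypothetical counts (not the study's data), showing how sensitivity, specificity, PPV, NPV and accuracy are derived:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy measures from confusion-matrix counts."""
    sens = tp / (tp + fn)                  # sensitivity (true-positive rate)
    spec = tn / (tn + fp)                  # specificity
    ppv = tp / (tp + fp)                   # positive-predictive value
    npv = tn / (tn + fn)                   # negative-predictive value
    acc = (tp + tn) / (tp + fp + fn + tn)  # diagnostic accuracy
    return sens, spec, ppv, npv, acc

# Invented example: 70 malignant effusions of which a combined biopsy
# detects 62, and 30 benign effusions with no false positives, mirroring
# the perfect-specificity pattern reported above.
print(diagnostic_metrics(tp=62, fp=0, fn=8, tn=30))
```

With no false positives, specificity and PPV are both 100% while sensitivity and NPV track the missed cases, which is the pattern in the reported results.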

  20. Time-Resolved Fluorescent Immunochromatography of Aflatoxin B1 in Soybean Sauce: A Rapid and Sensitive Quantitative Analysis.

    PubMed

    Wang, Du; Zhang, Zhaowei; Li, Peiwu; Zhang, Qi; Zhang, Wen

    2016-07-14

    Rapid and quantitative sensing of aflatoxin B1 with high sensitivity and specificity has drawn increasing attention in studies of soybean sauce. A sensitive and rapid quantitative immunochromatographic sensing method was developed for the detection of aflatoxin B1 based on time-resolved fluorescence, combining the advantages of time-resolved fluorescent sensing and immunochromatography. The dynamic range of the competitive and portable immunoassay was 0.3-10.0 µg·kg(-1), with a limit of detection (LOD) of 0.1 µg·kg(-1) and recoveries of 87.2%-114.3%, within 10 min. The results showed good correlation (R² > 0.99) between the time-resolved fluorescent immunochromatographic strip test and high performance liquid chromatography (HPLC). Analysis of soybean sauce samples with the strip test revealed that 64.2% of samples contained aflatoxin B1 at levels ranging from 0.31 to 12.5 µg·kg(-1). The strip test is a rapid, sensitive, quantitative, and cost-effective on-site screening technique for food safety analysis.
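Competitive immunoassays like this are commonly quantified against a four-parameter logistic (4PL) standard curve, where signal falls as analyte concentration rises. A sketch with invented parameters (the abstract does not give the assay's calibration; the 4PL form here is a general convention, not confirmed for this strip):

```python
def fourpl(c, a, b, c50, d):
    """Signal at concentration c; a = zero-dose signal, d = excess-dose floor."""
    return d + (a - d) / (1 + (c / c50) ** b)

def invert_4pl(y, a, b, c50, d):
    """Recover concentration from a measured signal (inverse of fourpl)."""
    return c50 * ((a - d) / (y - d) - 1) ** (1 / b)

params = dict(a=1.0, b=1.3, c50=2.0, d=0.05)   # hypothetical calibration
y = fourpl(3.5, **params)                       # simulated strip reading
print(invert_4pl(y, **params))                  # round-trips to 3.5
```

The inverse function is what turns a strip's fluorescence ratio into a reported µg·kg⁻¹ value within the assay's dynamic range.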

  1. Analysis of Er{sup 3+} and Ho{sup 3+} codoped fluoroindate glasses as wide range temperature sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haro-Gonzalez, P., E-mail: patharo@ull.es; Leon-Luis, S.F.; Gonzalez-Perez, S.

    2011-07-15

    Graphical abstract: The sensor sensitivity as a function of temperature for erbium- and holmium-doped fluoroindate glasses. A wide temperature range from 20 K to 425 K is covered with a sensitivity larger than 0.0005 K{sup -1}. Highlights: {yields} The FIR technique has been carried out in a fluoroindate glass sample. {yields} The Er-doped fluoroindate sample has a maximum sensitivity of 0.0028 K{sup -1} at 425 K. {yields} The Ho-doped fluoroindate sample has a maximum sensitivity of 0.0036 K{sup -1} at 59 K. -- Abstract: The fluorescence intensity ratio technique has been carried out for two fluoroindate glass samples. The green emissions at 523 nm and at 545 nm in a 0.1 mol% Er{sup 3+}-doped fluoroindate glass were studied over a wide range of temperature from 125 K to 425 K, with a maximum sensitivity of 0.0028 K{sup -1} at 425 K. In a sample doped with 0.1 mol% of Ho{sup 3+}, the emissions at 545 nm and at 750 nm were analyzed as a function of temperature from 20 K to 300 K, obtaining a maximum sensitivity of 0.0036 K{sup -1} at 59 K. Using both fluoroindate glass samples, a wide temperature range from 20 K to 425 K is easily covered by pumping with two low-cost diode lasers at 406 nm and 473 nm.
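For thermally coupled levels, the fluorescence intensity ratio (FIR) follows a Boltzmann law, FIR(T) = A·exp(-ΔE/(kB·T)), with absolute sensitivity S(T) = d(FIR)/dT = FIR·ΔE/(kB·T²). A sketch with illustrative constants (A and ΔE below are generic, not the fitted parameters of these glasses):

```python
import numpy as np

kB = 0.695  # Boltzmann constant in cm^-1 / K

def fir(T, A, dE):
    """Boltzmann-governed intensity ratio of two thermally coupled levels."""
    return A * np.exp(-dE / (kB * T))

def sensitivity(T, A, dE):
    """Absolute sensitivity d(FIR)/dT = FIR * dE / (kB * T^2)."""
    return fir(T, A, dE) * dE / (kB * T ** 2)

T = np.linspace(100, 450, 351)
S = sensitivity(T, A=10.0, dE=800.0)   # dE ~ a typical green-level gap, cm^-1
# S peaks at T = dE / (2 kB) ~ 575 K, so over this measured range the
# sensitivity is still rising at the top end, consistent with the Er sample
# peaking at the upper limit of its range.
print(T[np.argmax(S)])
```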

  2. Temperature analysis of laser ignited metalized material using spectroscopic technique

    NASA Astrophysics Data System (ADS)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    Temperature measurement of laser-ignited aluminized nanoenergetic mixtures using spectroscopy has great scope in analysing material characteristics and combustion. Spectroscopic analysis enables an in-depth study of the combustion of materials that is difficult to achieve using standard pyrometric methods. Laser ignition was used because it consumes less energy compared to electric ignition, while the ignited material dissipates the same energy, with the same impact, as with electric ignition. The presented research is primarily focused on the temperature analysis of an energetic material comprising explosive material mixed with nanomaterial and ignited with the help of a laser. A spectroscopic technique is used to estimate the temperature during the ignition process. The nanoenergetic mixture used in this research does not contain any material that is sensitive to high impact.

  3. Sol-gel titania-coated needles for solid phase dynamic extraction-GC/MS analysis of desomorphine and desocodeine.

    PubMed

    Su, Chi-Ju; Srimurugan, Sankarewaran; Chen, Chinpiao; Shu, Hun-Chi

    2011-01-01

    Novel sol-gel titania film coated needles for solid-phase dynamic extraction (SPDE)-GC/MS analysis of desomorphine and desocodeine are described. The high thermal stability of the titania film permits efficient extraction and analysis of poorly volatile opiate drugs. The influences of sol-gel reaction time, coating layers, extraction and desorption time and temperature on the SPDE needle performance were investigated. The deuterium-labeled internal standard was introduced either during the extraction of the analyte or by direct injection to the GC after the extraction process. The latter method was shown to be more sensitive for the analysis of water and urine samples containing opiate drugs. The proposed conditions provided a wide linear range (5-5000 ppb), satisfactory linearity with R(2) values from 0.9958 to 0.9999, and high sensitivity, with LOQs of 1.0-5.0 ng/g. The sol-gel titania film coated needle with SPDE-GC/MS is a promising technique for desomorphine and desocodeine analysis in urine.

  4. Evaluation of erythrocyte dysmorphism by light microscopy with lowering of the condenser lens: A simple and efficient method.

    PubMed

    Barros Silva, Gyl Eanes; Costa, Roberto Silva; Ravinal, Roberto Cuan; Saraiva e Silva, Jucélia; Dantas, Marcio; Coimbra, Terezila Machado

    2010-03-01

    To demonstrate that the evaluation of erythrocyte dysmorphism by light microscopy with lowering of the condenser lens (LMLC) is useful to identify patients with haematuria of glomerular or non-glomerular origin. A comparative double-blind study between phase contrast microscopy (PCM) and LMLC is reported to evaluate the efficacy of these techniques. Urine samples of 39 patients followed up for 9 months were analyzed and classified as glomerular or non-glomerular haematuria. The two microscopic techniques were compared using receiver operating characteristic (ROC) curve analysis and the area under the curve (AUC). Reproducibility was assessed by the coefficient of variation (CV). Specific cut-offs were set for each method according to their best rates of specificity and sensitivity: 30% for phase contrast microscopy and 40% for standard LMLC, giving PCM a sensitivity of 95% and specificity of 100%, and LMLC a sensitivity of 90% and specificity of 100%. In ROC analysis, the AUC was 0.99 for PCM and 0.96 for LMLC. The CV was very similar in the glomerular haematuria group for PCM (35%) and LMLC (35.3%). LMLC proved effective in directing the investigation of haematuria toward a nephrological or urological origin. This method can substitute for PCM when that equipment is not available.
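The AUC used above can be computed directly from the per-patient scores by the rank (Mann-Whitney) formulation: the probability that a randomly chosen glomerular case scores higher than a non-glomerular one. A sketch with invented dysmorphic-cell percentages (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """P(score_pos > score_neg), counting ties as 1/2 (Mann-Whitney AUC)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(scores_pos) * len(scores_neg))

glomerular = [45, 60, 38, 70, 55, 42]      # % dysmorphic cells (invented)
non_glomerular = [5, 12, 20, 8, 30, 15]

print(auc(glomerular, non_glomerular))     # 1.0: perfect separation here
```

Sweeping a percentage cut-off (30% or 40% in the study) then trades sensitivity against specificity along that ROC curve.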

  5. Preparation of Ion Exchange Films for Solid-Phase Spectrophotometry and Solid-Phase Fluorometry

    NASA Technical Reports Server (NTRS)

    Hill, Carol M.; Street, Kenneth W.; Tanner, Stephen P.; Philipp, Warren H.

    2000-01-01

    Atomic spectroscopy has dominated the field of trace inorganic analysis because of its high sensitivity and selectivity. The advantages gained by the atomic spectroscopies come with the disadvantage of expensive and often complicated instrumentation. Solid-phase spectroscopy, in which the analyte is preconcentrated on a solid medium followed by conventional spectrophotometry or fluorometry, requires less expensive instrumentation and has considerable sensitivity and selectivity. The sensitivity gains come from preconcentration and the use of chromophore (or fluorophore) developers and the selectivity is achieved by use of ion exchange conditions that favor the analyte in combination with speciative chromophores. Little work has been done to optimize the ion exchange medium (IEM) associated with these techniques. In this report we present a method for making ion exchange polymer films, which considerably simplify the solid-phase spectroscopic techniques. The polymer consists of formaldehyde-crosslinked polyvinyl alcohol with polyacrylic acid entrapped therein. The films are a carboxylate weak cation exchanger in the calcium form. They are mechanically sturdy and optically transparent in the ultraviolet and visible portion of the spectrum, which makes them suitable for spectrophotometry and fluorometry.

  6. Quantitative evaluation of skeletal muscle defects in second harmonic generation images.

    PubMed

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.
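The texture step described above can be sketched in pure NumPy: build a gray-level co-occurrence matrix (GLCM) for a given offset and derive the Haralick correlation feature, whose behavior versus offset underlies the texture correlation plot that MARS scores. This is an illustrative reimplementation, not the authors' MATLAB code, and the toy image is invented:

```python
import numpy as np

def glcm(img, dx=1, levels=8):
    """Normalized symmetric GLCM for a horizontal offset dx."""
    m = np.zeros((levels, levels))
    a = img[:, :-dx].ravel()
    b = img[:, dx:].ravel()
    np.add.at(m, (a, b), 1.0)     # count co-occurring gray-level pairs
    m = m + m.T                   # make the matrix symmetric
    return m / m.sum()

def glcm_correlation(p):
    """Haralick correlation feature of a normalized GLCM."""
    levels = p.shape[0]
    i, j = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    si = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sj = np.sqrt(((j - mu_j) ** 2 * p).sum())
    return (((i - mu_i) * (j - mu_j) * p).sum()) / (si * sj)

# A perfectly periodic "myofibril band" image is maximally correlated at
# offsets matching its period, and anti-correlated at half-period offsets:
img = np.tile(np.array([0, 2, 5, 7, 5, 2]), (16, 8))  # period 6 along x
print(glcm_correlation(glcm(img, dx=6)))
print(glcm_correlation(glcm(img, dx=3)))
```

Disrupted periodicity flattens this correlation-versus-offset curve, which is the kind of change the FT and curve-fitting scores are designed to detect.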

  7. Quantitative evaluation of skeletal muscle defects in second harmonic generation images

    NASA Astrophysics Data System (ADS)

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.

  8. An electrooptic probe to determine internal electric fields in a piezoelectric transformer.

    PubMed

    Norgard, Peter; Kovaleski, Scott

    2012-02-01

    A technique using the electrooptic effect to determine the output voltage of an optically clear LiNbO(3) piezoelectric transformer was developed and explored. A brief mathematical description of the solution is provided, as well as experimental data demonstrating a linear response under ac resonant operating conditions. A technique to calibrate the diagnostic was developed and is described. Finally, a sensitivity analysis of the electrooptic response to variations in angular alignment between the LiNbO(3) transformer and the laser probe is discussed.

  9. Rapid regulation of nuclear proteins by rapamycin-induced translocation in fission yeast

    PubMed Central

    Ding, Lin; Laor, Dana; Weisman, Ronit; Forsburg, Susan L

    2014-01-01

    Genetic analysis of protein function requires a rapid means of inactivating the gene under study. Typically this exploits temperature-sensitive mutations or promoter shut-off techniques. We report the adaptation to Schizosaccharomyces pombe of the Anchor Away technique, originally designed in budding yeast (Haruki et al., 2008a). This method relies on a rapamycin-mediated interaction between the FRB and FKBP12 binding domains to relocalize nuclear proteins of interest to the cytoplasm. We demonstrate rapid nuclear depletion of abundant proteins as proof of principle. PMID:24733494

  10. Designing novel cellulase systems through agent-based modeling and global sensitivity analysis.

    PubMed

    Apte, Advait A; Senger, Ryan S; Fong, Stephen S

    2014-01-01

    Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement.
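The ranking step described above, estimating how much of the output variance each parameter explains, can be sketched with a simple variance-based estimator. The linear "efficiency" model and its weights below are invented stand-ins for the agent-based model, and the first-order index is estimated by binning (a basic Sobol-style estimator), so this is an illustration of the GSA idea rather than the study's method:

```python
import random
import statistics

# Variance-based global sensitivity sketch: estimate the first-order
# index S_i = Var(E[Y | X_i]) / Var(Y) by binning each parameter.
# The model and its weights are hypothetical, chosen so that the
# half-life dominates, exoglucanase activity matters less, and
# composition matters least.
def efficiency(half_life, exo_activity, composition):
    return 3.0 * half_life + 1.0 * exo_activity + 0.1 * composition

def first_order_index(xs, ys, bins=20):
    # xs holds one parameter's samples on [0, 1); ys the model outputs.
    groups = [[] for _ in range(bins)]
    for x, y in zip(xs, ys):
        groups[min(int(x * bins), bins - 1)].append(y)
    mean_y = statistics.fmean(ys)
    var_cond = sum(len(g) * (statistics.fmean(g) - mean_y) ** 2
                   for g in groups if g) / len(ys)
    return var_cond / statistics.pvariance(ys)

rng = random.Random(42)
samples = [(rng.random(), rng.random(), rng.random()) for _ in range(20000)]
ys = [efficiency(*s) for s in samples]
for name, i in [("half_life", 0), ("exo_activity", 1), ("composition", 2)]:
    print(name, round(first_order_index([s[i] for s in samples], ys), 2))
```

The printed indices recover the intended ranking (half-life ≫ exoglucanase activity ≫ composition), which is exactly the kind of prioritization the GSA delivers for experimental targets.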

  11. Designing novel cellulase systems through agent-based modeling and global sensitivity analysis

    PubMed Central

    Apte, Advait A; Senger, Ryan S; Fong, Stephen S

    2014-01-01

    Experimental techniques allow engineering of biological systems to modify functionality; however, there still remains a need to develop tools to prioritize targets for modification. In this study, agent-based modeling (ABM) was used to build stochastic models of complexed and non-complexed cellulose hydrolysis, including enzymatic mechanisms for endoglucanase, exoglucanase, and β-glucosidase activity. Modeling results were consistent with experimental observations of higher efficiency in complexed systems than non-complexed systems and established relationships between specific cellulolytic mechanisms and overall efficiency. Global sensitivity analysis (GSA) of model results identified key parameters for improving overall cellulose hydrolysis efficiency including: (1) the cellulase half-life, (2) the exoglucanase activity, and (3) the cellulase composition. Overall, the following parameters were found to significantly influence cellulose consumption in a consolidated bioprocess (CBP): (1) the glucose uptake rate of the culture, (2) the bacterial cell concentration, and (3) the nature of the cellulase enzyme system (complexed or non-complexed). Broadly, these results demonstrate the utility of combining modeling and sensitivity analysis to identify key parameters and/or targets for experimental improvement. PMID:24830736

  12. A Novel Tri-Enzyme System in Combination with Laser-Driven NMR Enables Efficient Nuclear Polarization of Biomolecules in Solution

    PubMed Central

    Lee, Jung Ho; Cavagnero, Silvia

    2013-01-01

    NMR is an extremely powerful, yet insensitive technique. Many available nuclear polarization methods that address sensitivity are not directly applicable to low-concentration biomolecules in liquids and are often too invasive. Photochemically induced dynamic nuclear polarization (photo-CIDNP) is no exception: it needs high-power laser irradiation, which often leads to sample degradation and photosensitizer reduction. Here, we introduce a novel tri-enzyme system that significantly overcomes the above challenges, rendering photo-CIDNP a practically applicable technique for NMR sensitivity enhancement in solution. The specificity of the nitrate reductase (NR) enzyme is exploited to selectively re-oxidize the reduced photo-CIDNP dye FMNH2 in situ. At the same time, the oxygen-scavenging ability of glucose oxidase (GO) and catalase (CAT) is synergistically employed to prevent sample photodegradation. The resulting tri-enzyme system (NR-GO-CAT) enables prolonged sensitivity-enhanced data collection in 1D and 2D heteronuclear NMR, leading to the highest photo-CIDNP sensitivity enhancement (48-fold relative to SE-HSQC) achieved to date for amino acids and polypeptides in solution. NR-GO-CAT extends the concentration limit of photo-CIDNP NMR down to the low micromolar range. In addition, sensitivity (relative to the reference SE-HSQC) is found to be inversely proportional to sample concentration, paving the way to the future analysis of even more dilute samples. PMID:23560683

  13. Characterization of Homopolymer and Polymer Blend Films by Phase Sensitive Acoustic Microscopy

    NASA Astrophysics Data System (ADS)

    Ngwa, Wilfred; Wannemacher, Reinhold; Grill, Wolfgang

    2003-03-01

    We have used phase sensitive acoustic microscopy (PSAM) to study homopolymer thin films of polystyrene (PS) and poly(methyl methacrylate) (PMMA), as well as PS/PMMA blend films. We show from our results that PSAM can be used as a complementary and highly valuable technique for elucidating the three-dimensional (3D) morphology and micromechanical properties of thin films. Three-dimensional image acquisition with vector contrast provides the basis for complex V(z) analysis (per image pixel), 3D image processing, height profiling, and subsurface image analysis of the polymer films. Results show good agreement with previous studies. In addition, important new information on the three-dimensional structure and properties of polymer films is obtained. Homopolymer film structure analysis reveals (pseudo-)dewetting by retraction of droplets, resulting in a morphology that can serve as a starting point for the analysis of polymer blend thin films. The outcomes of confocal laser scanning microscopy studies performed on the same samples are correlated with the acoustic results. Advantages and limitations of PSAM are discussed.

  14. Techniques for detecting effects of urban and rural land-use practices on stream-water chemistry in selected watersheds in Texas, Minnesota,and Illinois

    USGS Publications Warehouse

    Walker, J.F.

    1993-01-01

    Selected statistical techniques were applied to three urban watersheds in Texas and Minnesota and three rural watersheds in Illinois. For the urban watersheds, single- and paired-site data-collection strategies were considered. The paired-site strategy was much more effective than the single-site strategy for detecting changes. Analysis of storm load regression residuals demonstrated the potential utility of regressions for variability reduction. For the rural watersheds, none of the selected techniques was effective at identifying changes, primarily because of the limited degree of management-practice implementation, potential errors introduced through the estimation of storm load, and small sample sizes. A Monte Carlo sensitivity analysis was used to determine the percent change in water chemistry that could be detected for each watershed. In most instances, the use of regressions improved the ability to detect changes.

  15. Plasma properties of hot coronal loops utilizing coordinated SMM and solar research rocket observations

    NASA Technical Reports Server (NTRS)

    Moses, J. Daniel

    1989-01-01

    Three improvements in photographic x-ray imaging techniques for solar astronomy are presented. The testing and calibration of a new film processor was conducted; the resulting product will allow photometric development of sounding rocket flight film immediately upon recovery at the missile range. Two fine-grained photographic films were calibrated and flight tested to provide alternative detector choices when the need for high resolution is greater than the need for high sensitivity. An analysis technique used to obtain the characteristic curve directly from photographs of UV solar spectra was applied to the analysis of soft x-ray photographic images. The resulting procedure provides a more complete and straightforward determination of the parameters describing the x-ray characteristic curve than previous techniques. These improvements fall into the category of refinements instead of revolutions, indicating the fundamental suitability of the photographic process for x-ray imaging in solar astronomy.

  16. [THE COMPARATIVE ANALYSIS OF EFFECTIVENESS OF QUICK TESTS IN DIAGNOSTIC OF INFLUENZA AND RESPIRATORY SYNCYTIAL VIRAL INFECTION IN CHILDREN].

    PubMed

    Petrova, E R; Sukhovetskaia, V P; Pisareva, M M; Maiorova, V G; Sverlova, M V; Danilenko, D M; Petrova, P A; Krivitskaia, V Z; Sominina, A A

    2015-11-01

    Diagnostic parameters of commercial quick tests (the immunochromatographic tests BinaxNOW Influenza A&B and BinaxNOW RSV; Alere Scarborough Inc., USA) were analyzed for the detection of influenza A virus and respiratory syncytial virus antigens in clinical materials. Real-time polymerase chain reaction and virus isolation in cell cultures served as reference methods. Analysis of nasopharyngeal smears from 116 children demonstrated that the sensitivity and specificity of influenza A virus detection using the mariPOC device, in comparison with polymerase chain reaction, were 93.8% and 99.0% respectively, with a total concordance between the two techniques of 98.3%. For diagnosis of respiratory syncytial virus with the mariPOC device, the corresponding parameters were 77.3%, 98.9% and 86.2%. The sensitivity, specificity and total concordance of the BinaxNOW immunochromatographic tests relative to polymerase chain reaction were 86.7%, 100% and 96.2% respectively for detection of influenza A virus, and 80.9%, 97.4% and 91.6% respectively for detection of respiratory syncytial virus. Compared with isolation in cell cultures, the sensitivity of the mariPOC system and of the immunochromatographic tests proved to be 1.3-1.4 times higher for detection of influenza A virus and 1.7-2 times higher for respiratory syncytial virus. There were no statistically significant differences between the diagnostic parameters obtained for mariPOC and the immunochromatographic tests in diagnosing influenza A virus and respiratory syncytial viral infection.
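The three figures of merit quoted throughout this record follow from a 2x2 comparison against the reference method. The counts below are hypothetical, chosen only so that the rates resemble the mariPOC influenza A figures (n = 116); they are not the study's actual contingency table:

```python
# Sensitivity, specificity, and total concordance of a quick test
# against a reference method (here PCR), from 2x2 counts.
def diagnostics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)                   # positives found / all reference-positives
    specificity = tn / (tn + fp)                   # negatives found / all reference-negatives
    concordance = (tp + tn) / (tp + fp + fn + tn)  # total agreement of the two methods
    return sensitivity, specificity, concordance

# Hypothetical counts for 116 specimens.
sens, spec, conc = diagnostics(tp=15, fp=1, fn=1, tn=99)
print(f"{sens:.1%} {spec:.1%} {conc:.1%}")  # → 93.8% 99.0% 98.3%
```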

  17. Evaluation by latent class analysis of a magnetic capture based DNA extraction followed by real-time qPCR as a new diagnostic method for detection of Echinococcus multilocularis in definitive hosts.

    PubMed

    Maas, Miriam; van Roon, Annika; Dam-Deisz, Cecile; Opsteegh, Marieke; Massolo, Alessandro; Deksne, Gunita; Teunis, Peter; van der Giessen, Joke

    2016-10-30

    A new method, based on a magnetic capture based DNA extraction followed by qPCR, was developed for the detection of the zoonotic parasite Echinococcus multilocularis in definitive hosts. Latent class analysis was used to compare this new method with the currently used phenol-chloroform DNA extraction followed by single tube nested PCR. In total, 60 red foxes and coyotes from three different locations were tested with both molecular methods and the sedimentation and counting technique (SCT) or intestinal scraping technique (IST). Though the comparison was based on a limited number of samples, it could be established that the magnetic capture based DNA extraction followed by qPCR showed similar sensitivity and specificity to the currently used phenol-chloroform DNA extraction followed by single tube nested PCR. All methods have a high specificity, as shown by Bayesian latent class analysis. Both molecular assays have higher sensitivities than the combined SCT and IST, though the uncertainties in the sensitivity estimates were wide for all assays tested. The magnetic capture based DNA extraction followed by qPCR has the advantage of not requiring hazardous chemicals, unlike the phenol-chloroform DNA extraction followed by single tube nested PCR. This supports the replacement of the phenol-chloroform DNA extraction followed by single tube nested PCR by the magnetic capture based DNA extraction followed by qPCR for molecular detection of E. multilocularis in definitive hosts. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Performance of Panfungal- and Specific-PCR-Based Procedures for Etiological Diagnosis of Invasive Fungal Diseases on Tissue Biopsy Specimens with Proven Infection: a 7-Year Retrospective Analysis from a Reference Laboratory

    PubMed Central

    Bernal-Martinez, L.; Castelli, M. V.; Rodriguez-Tudela, J. L.; Cuenca-Estrella, M.

    2014-01-01

    A retrospective analysis of real-time PCR (RT-PCR) results for 151 biopsy samples obtained from 132 patients with proven invasive fungal diseases was performed. PCR-based techniques proved to be fast and sensitive and enabled definitive diagnosis in all cases studied, with detection of a total of 28 fungal species. PMID:24574295

  19. Behavior sensitivities for control augmented structures

    NASA Technical Reports Server (NTRS)

    Manning, R. A.; Lust, R. V.; Schmit, L. A.

    1987-01-01

    During the past few years it has been recognized that combining passive structural design methods with active control techniques offers the prospect of finding substantially improved designs. These developments have stimulated interest in augmenting structural synthesis by adding active control system design variables to those usually considered in structural optimization. An essential step in extending the approximation concepts approach to control augmented structural synthesis is the development of a behavior sensitivity analysis capability for determining rates of change of dynamic response quantities with respect to changes in structural and control system design variables. Behavior sensitivity information is also useful for man-machine interactive design as well as in the context of system identification studies. Behavior sensitivity formulations for both steady state and transient response are presented, and the quality of the resulting derivative information is evaluated.
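The core idea, a rate of change of a dynamic response quantity with respect to a design variable, can be illustrated on a toy structure. This is a hedged single-degree-of-freedom sketch (spring-mass natural frequency), not the paper's formulation: the analytic sensitivity is checked against a central finite difference, a standard way of validating such derivative information.

```python
import math

# Behavior sensitivity on a toy structure: for a single-DOF spring-mass
# system the natural frequency is w = sqrt(k/m), so the sensitivity of
# the response quantity w to the design variable k is
# dw/dk = 1 / (2 * sqrt(k * m)). We verify the analytic derivative with
# a central finite difference. All parameter values are invented.
def natural_frequency(k, m):
    return math.sqrt(k / m)

k, m, h = 4.0e4, 10.0, 1.0           # stiffness, mass, step size
analytic = 1.0 / (2.0 * math.sqrt(k * m))
fd = (natural_frequency(k + h, m) - natural_frequency(k - h, m)) / (2 * h)
print(abs(analytic - fd) < 1e-9)  # → True
```

For large models the finite-difference column is exactly what semi-analytical sensitivity methods aim to replace, since it requires a full reanalysis per design variable.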

  20. Thin layer chromatography coupled to paper spray ionization mass spectrometry for cocaine and its adulterants analysis.

    PubMed

    De Carvalho, Thays C; Tosato, Flavia; Souza, Lindamara M; Santos, Heloa; Merlo, Bianca B; Ortiz, Rafael S; Rodrigues, Rayza R T; Filgueiras, Paulo R; França, Hildegardo S; Augusti, Rodinei; Romão, Wanderson; Vaz, Boniek G

    2016-05-01

    Thin layer chromatography (TLC) is a simple and inexpensive type of chromatography that is extensively used in forensic laboratories for the analysis of drugs of abuse. In this work, TLC is optimized to analyze cocaine and its adulterants (caffeine, benzocaine, lidocaine and phenacetin), evaluating both the sensitivity (visual determination of LOD from 0.5 to 14 mg mL(-1)) and the selectivity (from the study of three different eluents: CHCl3:CH3OH:HCOOHglacial (75:20:5v%), (C2H5)2O:CHCl3 (50:50v%) and CH3OH:NH4OH (100:1.5v%)). Aiming to improve these figures of merit, the TLC spots were identified and quantified (linearity with R(2)>0.98) by paper spray ionization mass spectrometry (PS-MS), now reaching lower LOD values (>1.0 μg mL(-1)). The method developed in this work opens up the prospect of enhancing the reliability of the traditional and routine TLC analysis employed in criminal expertise units. Higher sensitivity, selectivity and rapidity can be provided in forensic reports, besides the possibility of quantitative analysis. Owing to its great simplicity, the PS(+)-MS technique can also be coupled directly to other separation techniques, such as paper chromatography, and can still be used in analyses of LSD blotters, documents and synthetic drugs. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. Capillary electrophoresis in two-dimensional separation systems: Techniques and applications.

    PubMed

    Kohl, Felix J; Sánchez-Hernández, Laura; Neusüß, Christian

    2015-01-01

    The analysis of complex samples requires powerful separation techniques. Here, 2D chromatographic separation techniques (e.g. LC-LC, GC-GC) are increasingly applied in many fields. Electrophoretic separation techniques show a different selectivity in comparison to LC and GC, together with very high separation efficiency. Thus, 2D separation systems containing at least one CE-based separation technique are an interesting alternative, potentially featuring a high degree of orthogonality. However, the generally small volumes and strong electrical fields in CE require special coupling techniques. These technical developments are reviewed in this work, discussing the benefits and drawbacks of offline and online systems. Emphasis is placed on the design of the systems, their coupling, and the detector used. Moreover, the employment of strategies to improve peak capacity, resolution, or sensitivity is highlighted. Various applications of 2D separations with CE are summarized. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Enhanced detectability of fluorinated derivatives of N,N-dialkylamino alcohols and precursors of nitrogen mustards by gas chromatography coupled to Fourier transform infrared spectroscopy analysis for verification of chemical weapons convention.

    PubMed

    Garg, Prabhat; Purohit, Ajay; Tak, Vijay K; Dubey, D K

    2009-11-06

    N,N-Dialkylamino alcohols, and N-methyldiethanolamine, N-ethyldiethanolamine and triethanolamine, are the precursors of VX-type nerve agents and of three different nitrogen mustards, respectively. Their detection and identification is of paramount importance for verification analysis under the chemical weapons convention. GC-FTIR is used as a complementary technique to GC-MS analysis for identification of these analytes. One constraint of GC-FTIR, its low sensitivity, was overcome by converting the analytes to their fluorinated derivatives. Owing to their high absorptivity in the IR region, these derivatives facilitated detection by GC-FTIR analysis. Derivatizing reagents bearing trimethylsilyl, trifluoroacyl and heptafluorobutyryl groups on an imidazole moiety were screened, and the derivatives formed were analyzed quantitatively by GC-FTIR. Of the reagents studied, heptafluorobutyrylimidazole (HFBI) produced the greatest increase in sensitivity on GC-FTIR detection; sensitivity enhancements of 60-125-fold were observed for the analytes after HFBI derivatization. Absorbances due to the various functional groups responsible for the enhanced sensitivity were compared by determining their corresponding relative molar extinction coefficients ( [Formula: see text] ), assuming a uniform optical path length. The RSDs for intraday repeatability and interday reproducibility of the various derivatives were 0.2-1.1% and 0.3-1.8%, respectively. A limit of detection (LOD) of 10-15 ng was achieved, and the applicability of the method was tested with unknown samples obtained in international proficiency tests.

  3. Development and optimization of SPECT gated blood pool cluster analysis for the prediction of CRT outcome.

    PubMed

    Lalonde, Michel; Wells, R Glenn; Birnie, David; Ruddy, Terrence D; Wassenaar, Richard

    2014-07-01

    Phase analysis of single photon emission computed tomography (SPECT) radionuclide angiography (RNA) has been investigated for its potential to predict the outcome of cardiac resynchronization therapy (CRT). However, phase analysis may be limited in its potential at predicting CRT outcome, as valuable information may be lost by assuming that time-activity curves (TAC) follow a simple sinusoidal shape. A new method, cluster analysis, is proposed which directly evaluates the TACs and may lead to a better understanding of dyssynchrony patterns and CRT outcome. Cluster analysis algorithms were developed and optimized to maximize their ability to predict CRT response. A total of 49 patients (N = 27 with ischemic etiology) received a SPECT RNA scan as well as positron emission tomography (PET) perfusion and viability scans prior to undergoing CRT. A semiautomated algorithm sampled the left ventricle wall to produce 568 TACs from SPECT RNA data. The TACs were then subjected to two different cluster analysis techniques, K-means and normal average, where several input metrics were also varied to determine the optimal settings for the prediction of CRT outcome. Each TAC was assigned to a cluster group based on the comparison criteria, and global and segmental cluster sizes and scores were used as measures of dyssynchrony and used to predict response to CRT. A repeated random twofold cross-validation technique was used to train and validate the cluster algorithm. Receiver operating characteristic (ROC) analysis was used to calculate the area under the curve (AUC) and compare results to those obtained for SPECT RNA phase analysis and PET scar size analysis methods. Using the normal average cluster analysis approach, the septal wall produced statistically significant results for predicting CRT results in the ischemic population (ROC AUC = 0.73; p < 0.05 vs. equal-chance ROC AUC = 0.50), with an optimal operating point of 71% sensitivity and 60% specificity. Cluster analysis results were similar to SPECT RNA phase analysis (ROC AUC = 0.78, p = 0.73 vs. cluster AUC; sensitivity/specificity = 59%/89%) and PET scar size analysis (ROC AUC = 0.73, p = 1.0 vs. cluster AUC; sensitivity/specificity = 76%/67%). A SPECT RNA cluster analysis algorithm was developed for the prediction of CRT outcome. Cluster analysis produced results equivalent to those obtained from Fourier and scar analysis.
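A minimal version of grouping time-activity curves by shape can be sketched as follows. This is a toy K-means (k = 2 only, farthest-point initialization for determinism), not the study's implementation, and the short 4-sample curves are invented stand-ins for the 568 sampled TACs:

```python
# Toy K-means (k = 2) over synthetic time-activity curves (TACs):
# each curve is a short vector of counts over the cardiac cycle, and
# curves are grouped by shape rather than fit to a sinusoid.
def kmeans2(curves, iters=20):
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    # Deterministic farthest-point initialization instead of random seeds.
    centers = [curves[0], max(curves, key=lambda c: dist(c, curves[0]))]
    groups = [[], []]
    for _ in range(iters):
        groups = [[], []]
        for c in curves:  # assign each curve to its nearest center
            groups[0 if dist(c, centers[0]) <= dist(c, centers[1]) else 1].append(c)
        # recompute centers as per-time-bin means (keep old center if empty)
        centers = [[sum(v) / len(g) for v in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return groups

# Two synthetic populations: curves peaking early ("synchronous") and
# curves peaking late ("dyssynchronous").
early = [[0, 9 + i, 1, 0] for i in range(5)]
late = [[0, 1, 9 + i, 0] for i in range(5)]
groups = kmeans2(early + late)
print([len(g) for g in groups])  # → [5, 5]
```

Cluster sizes like these are the raw material for the global and segmental dyssynchrony scores described above.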

  4. Development and optimization of SPECT gated blood pool cluster analysis for the prediction of CRT outcome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lalonde, Michel, E-mail: mlalonde15@rogers.com; Wassenaar, Richard; Wells, R. Glenn

    2014-07-15

    Purpose: Phase analysis of single photon emission computed tomography (SPECT) radionuclide angiography (RNA) has been investigated for its potential to predict the outcome of cardiac resynchronization therapy (CRT). However, phase analysis may be limited in its potential at predicting CRT outcome, as valuable information may be lost by assuming that time-activity curves (TAC) follow a simple sinusoidal shape. A new method, cluster analysis, is proposed which directly evaluates the TACs and may lead to a better understanding of dyssynchrony patterns and CRT outcome. Cluster analysis algorithms were developed and optimized to maximize their ability to predict CRT response. Methods: A total of 49 patients (N = 27 with ischemic etiology) received a SPECT RNA scan as well as positron emission tomography (PET) perfusion and viability scans prior to undergoing CRT. A semiautomated algorithm sampled the left ventricle wall to produce 568 TACs from SPECT RNA data. The TACs were then subjected to two different cluster analysis techniques, K-means and normal average, where several input metrics were also varied to determine the optimal settings for the prediction of CRT outcome. Each TAC was assigned to a cluster group based on the comparison criteria, and global and segmental cluster sizes and scores were used as measures of dyssynchrony and used to predict response to CRT. A repeated random twofold cross-validation technique was used to train and validate the cluster algorithm. Receiver operating characteristic (ROC) analysis was used to calculate the area under the curve (AUC) and compare results to those obtained for SPECT RNA phase analysis and PET scar size analysis methods. Results: Using the normal average cluster analysis approach, the septal wall produced statistically significant results for predicting CRT results in the ischemic population (ROC AUC = 0.73; p < 0.05 vs. equal-chance ROC AUC = 0.50), with an optimal operating point of 71% sensitivity and 60% specificity. Cluster analysis results were similar to SPECT RNA phase analysis (ROC AUC = 0.78, p = 0.73 vs. cluster AUC; sensitivity/specificity = 59%/89%) and PET scar size analysis (ROC AUC = 0.73, p = 1.0 vs. cluster AUC; sensitivity/specificity = 76%/67%). Conclusions: A SPECT RNA cluster analysis algorithm was developed for the prediction of CRT outcome. Cluster analysis produced results equivalent to those obtained from Fourier and scar analysis.

  5. Detection of Bladder CA by Microsatellite Analysis (MSA) — EDRN Public Portal

    Cancer.gov

    Goal 1: To determine sensitivity and specificity of microsatellite analysis (MSA) of urine sediment, using a panel of 15 microsatellite markers, in detecting bladder cancer in participants requiring cystoscopy. This technique will be compared to the diagnostic standard of cystoscopy, as well as to urine cytology. Goal 2: To determine the temporal performance characteristics of microsatellite analysis of urine sediment. Goal 3: To determine which of the 15 individual markers or combination of markers that make up the MSA test are most predictive of the presence of bladder cancer.

  6. Influences of geological parameters to probabilistic assessment of slope stability of embankment

    NASA Astrophysics Data System (ADS)

    Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr

    2018-04-01

    This article considers the influence of geological parameters on the slope stability of an embankment in a probabilistic analysis using the SLOPE/W computational system. Stability of a simple slope is evaluated, with and without pore-water pressure, on the basis of variation of the soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. A Monte Carlo simulation technique is employed to perform the analysis of the critical slip surface. Sensitivity analysis is performed to observe the variation of the geological parameters and their effects on the safety factor of the slope.
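As a hedged illustration of this workflow (not SLOPE/W's method-of-slices analysis), the sketch below samples normally distributed soil properties and propagates them through the simple infinite-slope factor of safety; all parameter values are invented:

```python
import math
import random

# Monte Carlo probabilistic slope stability using the infinite-slope
# factor of safety FS = [c + (gamma*h*cos^2(beta) - u) * tan(phi)] /
# [gamma*h*sin(beta)*cos(beta)]. Soil statistics and geometry below
# are hypothetical; u is the pore-water pressure (here zero).
def factor_of_safety(c, phi_deg, gamma, h=5.0, beta_deg=30.0, u=0.0):
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + (gamma * h * math.cos(beta) ** 2 - u) * math.tan(phi)
    driving = gamma * h * math.sin(beta) * math.cos(beta)
    return resisting / driving

rng = random.Random(1)
n, failures, total = 20000, 0, 0.0
for _ in range(n):
    c = rng.gauss(10.0, 2.0)      # cohesion, kPa
    phi = rng.gauss(30.0, 3.0)    # internal friction angle, degrees
    gamma = rng.gauss(18.0, 1.0)  # unit weight, kN/m^3
    fs = factor_of_safety(c, phi, gamma)
    failures += fs < 1.0          # count realizations that fail
    total += fs
print(f"mean FS = {total / n:.2f}, P(failure) = {failures / n:.3f}")
```

Repeating the run while widening one parameter's standard deviation at a time is a direct way to perform the sensitivity study described above: the parameter whose variability moves P(failure) most is the one that matters.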

  7. Non-resurfacing techniques in the management of the patella at total knee arthroplasty: A systematic review and meta-analysis.

    PubMed

    Findlay, I; Wong, F; Smith, C; Back, D; Davies, A; Ajuied, A

    2016-03-01

    Recent meta-analyses support not resurfacing the patella at the time of TKA. Several different modes of intervention are reported for non-resurfacing management of the patella at TKA. We have conducted a systematic review and meta-analysis of non-resurfacing interventions in TKA. The PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) study methodology and reporting system was adopted, utilising the PRISMA checklist and statement. Classes of patellar intervention were defined as: 0, no intervention; 1, osteophyte excision only; 2, osteophyte excision, denervation, with soft tissue debridement; 3, osteophyte excision, denervation, soft tissue debridement, and drilling or micro-fracture of eburnated bone; 4, patellar resurfacing. A meta-analysis was conducted on the pre- and post-operative Knee Society Scores (KSS) for each technique. Four hundred and twenty-three studies were identified; 12 studies met the inclusion criteria for the systematic review and eight for the meta-analysis. Two studies compared different non-resurfacing patellar techniques; the other studies used the non-resurfacing cohort as controls for their prospective RCTs comparing patellar resurfacing with non-resurfacing. The meta-analysis revealed no significant difference between the techniques. We conclude that there is no significant difference in KSS for differing non-resurfacing patellar techniques, but further trials using patellofemoral-specific scores may better demonstrate superior efficacy of specific classes of patellar intervention, by virtue of greater sensitivity for patellofemoral pain and dysfunction. Level of evidence: I. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  8. Use of an automated chromium reduction system for hydrogen isotope ratio analysis of physiological fluids applied to doubly labeled water analysis.

    PubMed

    Schoeller, D A; Colligan, A S; Shriver, T; Avak, H; Bartok-Olson, C

    2000-09-01

    The doubly labeled water method is commonly used to measure total energy expenditure in free-living subjects. The method, however, requires accurate and precise deuterium abundance determinations, which can be laborious. The aim of this study was to evaluate a fully automated, high-throughput chromium reduction technique for the measurement of deuterium abundances in physiological fluids. The chromium technique was compared with an off-line zinc bomb reduction technique and also subjected to test-retest analysis. Analysis of international water standards demonstrated that the chromium technique was accurate and had a within-day precision of <1 per thousand. Addition of organic matter to water samples demonstrated that the technique was sensitive to interference at levels between 2 and 5 g l(-1). Physiological samples could be analyzed without this interference: plasma by 10,000 Da exclusion filtration, saliva by sedimentation, and urine by decolorizing with carbon black. Chromium reduction of urine specimens from doubly labeled water studies indicated no bias relative to zinc reduction, with a mean difference in calculated energy expenditure of -0.2 +/- 3.9%. Blinded reanalysis of urine specimens from a second doubly labeled water study demonstrated a test-retest coefficient of variation of 4%. The chromium reduction method was found to be a rapid, accurate and precise method for the analysis of urine specimens from doubly labeled water studies. Copyright 2000 John Wiley & Sons, Ltd.

  9. Antibodies against toluene diisocyanate protein conjugates. Three methods of measurement.

    PubMed

    Patterson, R; Harris, K E; Zeiss, C R

    1983-12-01

    With the use of canine antisera against toluene diisocyanate (TDI)-dog serum albumin (DSA), techniques for measuring antibody against TDI-DSA were evaluated. An ammonium sulfate precipitation assay showed suggestive evidence of antibody binding, but high levels of TDI-DSA precipitation in the absence of antibody limited the usefulness of this technique. Double-antibody co-precipitation techniques can measure total antibody or Ig-class antibody against 125I-TDI-DSA; these techniques are quantitative. The polystyrene tube radioimmunoassay is a highly sensitive method for detecting and quantitatively estimating IgG antibody. The enzyme-linked immunosorbent assay is a rapidly adaptable method for the quantitative estimation of IgG, IgA, and IgM against TDI-homologous proteins. All of these techniques were compared, with results demonstrated using the same serum samples for analysis.

  10. Analysis of plant nucleotide sugars by hydrophilic interaction liquid chromatography and tandem mass spectrometry.

    PubMed

    Ito, Jun; Herter, Thomas; Baidoo, Edward E K; Lao, Jeemeng; Vega-Sánchez, Miguel E; Michelle Smith-Moritz, A; Adams, Paul D; Keasling, Jay D; Usadel, Björn; Petzold, Christopher J; Heazlewood, Joshua L

    2014-03-01

    Understanding the intricate metabolic processes involved in plant cell wall biosynthesis is limited by difficulties in performing sensitive quantification of many of the compounds involved. Hydrophilic interaction liquid chromatography is a useful technique for the analysis of hydrophilic metabolites from complex biological extracts and forms the basis of this method to quantify plant cell wall precursors. A zwitterionic silica-based stationary phase has been used to separate hydrophilic nucleotide sugars involved in cell wall biosynthesis from milligram amounts of leaf tissue. Tandem mass spectrometry operating in selected reaction monitoring mode was used to quantify the nucleotide sugars. The method was highly repeatable and quantified 12 nucleotide sugars at low femtomole quantities, with linear responses spanning up to four orders of magnitude, up to several hundred picomoles. The method was also successfully applied to the analysis of purified leaf extracts from two model plant species with variations in their cell wall sugar compositions and indicated significant differences in the levels of 6 out of 12 nucleotide sugars. The plant nucleotide sugar extraction procedure was demonstrated to have good recovery rates with minimal matrix effects. The approach results in a significant improvement in sensitivity over currently employed techniques when applied to plant samples. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. A method for the direct injection and analysis of small volume human blood spots and plasma extracts containing high concentrations of organic solvents using reversed-phase 2D UPLC/MS.

    PubMed

    Rainville, Paul D; Simeone, Jennifer L; Root, Dan S; Mallet, Claude R; Wilson, Ian D; Plumb, Robert S

    2015-03-21

    The emergence of micro sampling techniques holds great potential to improve pharmacokinetic data quality, reduce animal usage, and save costs in safety assessment studies. The analysis of these samples presents new challenges for bioanalytical scientists, both in terms of sample processing and analytical sensitivity. The use of two-dimensional LC/MS with at-column dilution for the direct analysis of highly organic extracts prepared from biological fluids such as dried blood spots and plasma is demonstrated. This technique negates the need to dry down and reconstitute, or to dilute samples with water/aqueous buffer solutions, prior to injection onto a reversed-phase LC system. A mixture of model drugs, including bromhexine, triprolidine, enrofloxacin, and procaine, was used to test the feasibility of the method. Finally, an LC/MS assay for the probe pharmaceutical rosuvastatin was developed from dried blood spots and protein-precipitated plasma. The assays showed acceptable recovery, accuracy, and precision according to US FDA guidelines. The resulting analytical method showed an increase in assay sensitivity of up to fortyfold compared with conventional methods, by maximizing both the amount loaded onto the system and the MS response for rosuvastatin from small volume samples.

  12. Advanced proteomic liquid chromatography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Fang; Smith, Richard D.; Shen, Yufeng

    2012-10-26

    Liquid chromatography coupled with mass spectrometry is the predominant platform used to analyze proteomics samples consisting of large numbers of proteins and their proteolytic products (e.g., truncated polypeptides) and spanning a wide range of relative concentrations. This review provides an overview of advanced capillary liquid chromatography techniques and methodologies that greatly improve separation resolving power and proteomics analysis coverage, sensitivity, and throughput.

  13. Technical Note: Asteroid Detection Demonstration from SkySat-3 - B612 Data Using Synthetic Tracking

    NASA Technical Reports Server (NTRS)

    Zhai, C.; Shao, M.; Lai, S.; Boerner, P.; Dyer, J.; Lu, E.; Reitsema, H.; Buie, M.

    2018-01-01

    We report results from analyzing data taken by the sCMOS cameras on board SkySat-3 using the synthetic tracking technique. The analysis demonstrates the expected improvement in the signal-to-noise ratio of faint asteroids obtained by properly stacking the short-exposure images in post-processing.
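    Synthetic tracking amounts to a shift-and-add along a trial velocity: each short exposure is shifted back by the assumed per-frame motion before co-adding, so a moving source stacks coherently on one pixel while uncorrelated noise averages down roughly as 1/sqrt(N). A minimal sketch with invented numbers (not the SkySat-3 pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, size = 16, 64
vx, vy = 1.0, 0.5   # trial per-frame motion in pixels (hypothetical)

# Short exposures: unit-variance noise plus a faint moving point source,
# too weak to stand out in any single frame
frames = rng.normal(0.0, 1.0, (n_frames, size, size))
for k in range(n_frames):
    frames[k, int(20 + k * vy), int(20 + k * vx)] += 2.0

# Shift each frame back along the trial velocity and co-add: the source
# adds coherently while the noise standard deviation falls ~ 1/sqrt(N)
stack = np.zeros((size, size))
for k in range(n_frames):
    stack += np.roll(frames[k], (-int(k * vy), -int(k * vx)), axis=(0, 1))
stack /= n_frames

peak = np.unravel_index(np.argmax(stack), stack.shape)  # recovered position
```

    In an actual search the stacking would be repeated over a grid of trial velocities, with the best peak over that grid giving both the detection and the asteroid's apparent motion.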

  14. Differentially Methylated Region-Representational Difference Analysis (DMR-RDA): A Powerful Method to Identify DMRs in Uncharacterized Genomes.

    PubMed

    Sasheva, Pavlina; Grossniklaus, Ueli

    2017-01-01

    Over the last few years, it has become increasingly clear that environmental influences can affect the epigenomic landscape and that some epigenetic variants can have heritable, phenotypic effects. While a variety of methods exist to perform genome-wide analyses of DNA methylation in model organisms, this remains a challenging task for non-model organisms without a reference genome. Differentially methylated region-representational difference analysis (DMR-RDA) is a sensitive and powerful PCR-based technique that isolates DNA fragments that are differentially methylated between two otherwise identical genomes. The technique does not require special equipment and is independent of prior knowledge about the genome. It is applicable even to large, highly complex genomes, making it the method of choice for the analysis of non-model plant systems.

  15. DEER Sensitivity between Iron Centers and Nitroxides in Heme-Containing Proteins Improves Dramatically Using Broadband, High-Field EPR

    PubMed Central

    2016-01-01

    This work demonstrates the feasibility of making sensitive nanometer-scale distance measurements between Fe(III) heme centers and nitroxide spin labels in proteins using the double electron–electron resonance (DEER) pulsed EPR technique at 94 GHz. Techniques to accurately measure long distances in many classes of heme proteins using DEER are currently strongly limited by sensitivity. In this paper we demonstrate sensitivity gains of more than 30 times compared with previous lower frequency (X-band) DEER measurements on both human neuroglobin and sperm whale myoglobin. This is achieved by taking advantage of recent instrumental advances, employing wideband excitation techniques based on composite pulses, and exploiting the more favorable relaxation properties of low-spin Fe(III) in high magnetic fields. This gain in sensitivity potentially allows the DEER technique to be routinely used as a sensitive probe of structure and conformation in the large number of heme proteins and in many other metalloproteins. PMID:27035368

  16. A Protein Nanopore-Based Approach for Bacteria Sensing

    NASA Astrophysics Data System (ADS)

    Apetrei, Aurelia; Ciuca, Andrei; Lee, Jong-kook; Seo, Chang Ho; Park, Yoonkyung; Luchian, Tudor

    2016-11-01

    We present herein a first proof of concept demonstrating the potential of a protein nanopore-based technique for real-time detection of selected Gram-negative bacteria (Pseudomonas aeruginosa or Escherichia coli) at a concentration of 1.2 × 10^8 cfu/mL. The anionic charge on the bacterial outer membrane promotes the electrophoretically driven migration of bacteria towards a single α-hemolysin nanopore embedded in a lipid bilayer clamped at a negative electric potential, followed by capture at the nanopore's mouth, a process we found to be well described by classical Kramers theory. By using a specific antimicrobial peptide as a putative molecular biorecognition element for the bacteria used herein, we suggest that the detection system can combine the natural sensitivity of nanopore-based sensing techniques with selective biological recognition in aqueous samples, and we highlight the feasibility of the nanopore-based platform for portable, sensitive analysis and monitoring of bacterial pathogens.

  17. Flow Visualization at Cryogenic Conditions Using a Modified Pressure Sensitive Paint Approach

    NASA Technical Reports Server (NTRS)

    Watkins, A. Neal; Goad, William K.; Obara, Clifford J.; Sprinkle, Danny R.; Campbell, Richard L.; Carter, Melissa B.; Pendergraft, Odis C., Jr.; Bell, James H.; Ingram, JoAnne L.; Oglesby, Donald M.

    2005-01-01

    A modification of the Pressure Sensitive Paint (PSP) method was used to visualize streamlines on a Blended Wing Body (BWB) model at full-scale flight Reynolds numbers. In order to achieve these conditions, the tests were carried out in the National Transonic Facility operating under cryogenic conditions in a nitrogen environment. Oxygen is required for conventional PSP measurements, and several tests have been successfully completed in nitrogen environments by injecting small amounts (typically < 3000 ppm) of oxygen into the flow. A similar technique was employed here, except that air was purged through pressure tap orifices already present on the model surface, producing a change in the PSP signal wherever oxygen was present. The results agree quite well with predictions obtained through computational fluid dynamics (CFD) analysis, showing this to be a viable technique for visualizing flows without resorting to more invasive procedures such as oil flow or minitufts.

  18. Application of gamma imaging techniques for the characterisation of position sensitive gamma detectors

    NASA Astrophysics Data System (ADS)

    Habermann, T.; Didierjean, F.; Duchêne, G.; Filliger, M.; Gerl, J.; Kojouharov, I.; Li, G.; Pietralla, N.; Schaffner, H.; Sigward, M.-H.

    2017-11-01

    A device to characterize position-sensitive germanium detectors has been implemented at GSI. The main component of this so-called scanning table is a gamma camera that is capable of producing online 2D images of the scanned detector by means of a PET technique. To calibrate the gamma camera, Compton imaging is employed. The 2D data can be processed further offline to obtain depth information. Of main interest is the response of the scanned detector in terms of the digitized pulse shapes from the preamplifier; this is an important input for the pulse-shape analysis algorithms used by gamma-ray tracking arrays in gamma spectroscopy. To validate the scanning table, a comparison of its results with those of a second scanning table implemented at the IPHC Strasbourg is envisaged. For this purpose a pixelated germanium detector has been scanned.

  19. Validity and consistency assessment of accident analysis methods in the petroleum industry.

    PubMed

    Ahmadi, Omran; Mortazavi, Seyed Bagher; Khavanin, Ali; Mokarami, Hamidreza

    2017-11-17

    Accident analysis is the main aspect of accident investigation; it involves connecting different causes in a procedural way. It is therefore important to use valid and reliable methods to investigate the different causal factors of accidents, especially the noteworthy ones. This study aimed to assess the accuracy (sensitivity index [SI]) and consistency of the six accident analysis methods most commonly used in the petroleum industry. To evaluate the methods, two real case studies (a process safety accident and a personal accident) from the petroleum industry were analyzed by 10 assessors, and the accuracy and consistency of the methods were then evaluated. The assessors were trained in a workshop on accident analysis methods. The systematic cause analysis technique and bowtie methods achieved the highest SI scores for personal and process safety accidents, respectively. The best average consistency for a single method (based on 10 independent assessors) was in the region of 70%. This study confirmed that the application of methods with pre-defined causes and a logic tree can enhance the sensitivity and consistency of accident analysis.

  20. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system is analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions; decision-makers therefore cannot suggest any specific and influential managerial strategy to prevent unexpected failures and, consequently, to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique, in which fuzzy set theory quantifies the uncertainties, a fault tree models the system, the lambda-tau method formulates mathematical expressions for the failure/repair rates of the system, and a genetic algorithm solves the resulting nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow an exponential distribution, i.e., their failure rates are constant. Sensitivity analysis is also performed, and the impact on the system mean time between failures (MTBF) of varying the other reliability parameters is addressed. Based on the analysis, some influential suggestions are given to improve the system performance.
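    The flavor of the fuzzy lambda-tau approach can be illustrated with triangular fuzzy numbers and alpha-cut interval arithmetic: component failure rates become triangular fuzzy numbers, the fault-tree combination operates on their alpha-cut intervals, and derived measures such as MTBF follow by interval arithmetic. The rates and the simple series (additive) combination below are invented for illustration and are not the paper's robotic-system model:

```python
def alpha_cut(l, m, r, alpha):
    """Alpha-cut interval [lo, hi] of a triangular fuzzy number (l, m, r)."""
    return (l + alpha * (m - l), r - alpha * (r - m))

# Hypothetical component failure rates (per hour) as triangular fuzzy numbers
rates = [(0.8e-3, 1.0e-3, 1.3e-3), (1.5e-3, 2.0e-3, 2.4e-3)]

def system_mtbf_interval(rates, alpha):
    """Series combination: fuzzy failure rates add at each alpha-cut;
    MTBF = 1 / rate, so the interval bounds swap under the reciprocal."""
    cuts = [alpha_cut(*tfn, alpha) for tfn in rates]
    lo = sum(c[0] for c in cuts)
    hi = sum(c[1] for c in cuts)
    return (1.0 / hi, 1.0 / lo)

mtbf_core = system_mtbf_interval(rates, 1.0)     # crisp value at alpha = 1
mtbf_support = system_mtbf_interval(rates, 0.0)  # widest interval at alpha = 0
```

    Repeating this at several alpha levels reconstructs the full fuzzy MTBF; the wide support interval at alpha = 0 is exactly the "wide range of predictions" the abstract refers to.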

  1. Computational simulation and aerodynamic sensitivity analysis of film-cooled turbines

    NASA Astrophysics Data System (ADS)

    Massa, Luca

    A computational tool is developed for the time-accurate sensitivity analysis of the stage performance of hot-gas, unsteady turbine components. An existing turbomachinery internal flow solver is adapted to the high temperature environment typical of the hot section of jet engines. A real gas model and film cooling capabilities are successfully incorporated in the software. The modifications to the existing algorithm are described; both the theoretical model and the numerical implementation are validated. The accuracy of the code in evaluating turbine stage performance is tested using a turbine geometry typical of the last stage of aeronautical jet engines. The results of the performance analysis show that the predictions differ from the experimental data by less than 3%. A reliable grid generator, applicable to the domain discretization of the internal flow field of axial-flow turbines, is developed. A sensitivity analysis capability is added to the flow solver, rendering it able to accurately evaluate the derivatives of the time-varying output functions. The complex Taylor series expansion (CTSE) technique is reviewed, and two formulations of it are used to demonstrate the accuracy and time dependency of the differentiation process. The results are compared with finite difference (FD) approximations. The CTSE is more accurate than the FD, but less efficient. A "black box" differentiation of the source code, resulting from the automated application of the CTSE, generates high-fidelity sensitivity algorithms, but with low computational efficiency and high memory requirements. New formulations of the CTSE are proposed and applied. Selective differentiation of the method for solving the nonlinear implicit residual equation leads to sensitivity algorithms with the same accuracy but improved run time. The time-dependent sensitivity derivatives are computed in run times comparable to those required by the FD approach.
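    The complex-step idea behind the CTSE is compact: since f(x + ih) ≈ f(x) + ih f'(x) for a real-analytic f, the estimate Im f(x + ih)/h involves no difference of nearby values and hence no subtractive cancellation, so h can be made extremely small. A generic sketch (not the turbine solver's implementation), using the classic test function from the complex-step literature:

```python
import numpy as np

def complex_step(f, x, h=1e-30):
    """Complex Taylor series expansion (CTSE) derivative estimate:
    f'(x) ~= Im(f(x + i*h)) / h. No subtraction of nearby values occurs,
    so round-off does not limit how small h can be."""
    return np.imag(f(x + 1j * h)) / h

def central_fd(f, x, h=1e-6):
    """Central finite difference, for comparison; its accuracy is limited
    by the competition between truncation and round-off error."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = lambda x: np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)
x0 = 1.5
d_cs = complex_step(f, x0)
d_fd = central_fd(f, x0)
```

    Applying the same substitution x → x + ih through a flow solver's residual evaluation is what the "black box" differentiation above automates, at the cost of carrying complex arithmetic through the entire code.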

  2. Fourier transform ion cyclotron resonance mass spectrometry

    NASA Astrophysics Data System (ADS)

    Marshall, Alan G.

    1998-06-01

    As for Fourier transform infrared (FT-IR) interferometry and nuclear magnetic resonance (NMR) spectroscopy, the introduction of pulsed Fourier transform techniques revolutionized ion cyclotron resonance mass spectrometry: increased speed (factor of 10,000), increased sensitivity (factor of 100), increased mass resolution (factor of 10,000, an improvement not shared by the introduction of FT techniques to IR or NMR spectroscopy), increased mass range (factor of 500), and automated operation. FT-ICR mass spectrometry is the most versatile technique for unscrambling and quantifying ion-molecule reaction kinetics and equilibria in the absence of solvent (i.e., in the gas phase). In addition, FT-ICR MS has the following analytically important features: speed (~1 second per spectrum); ultrahigh mass resolution and ultrahigh mass accuracy for analysis of mixtures and polymers; attomole sensitivity; MS^n with one spectrometer, including two-dimensional FT/FT-ICR/MS; positive and/or negative ions; multiple ion sources (especially MALDI and electrospray); biomolecular molecular weight determination and sequencing; LC/MS; and single-molecule detection up to 10^8 Dalton. Here, some basic features and recent developments of FT-ICR mass spectrometry are reviewed, with applications ranging from crude oil to molecular biology.

  3. Genomic profiling of plasma cell disorders in a clinical setting: integration of microarray and FISH, after CD138 selection of bone marrow

    PubMed Central

    Berry, Nadine Kaye; Bain, Nicole L; Enjeti, Anoop K; Rowlings, Philip

    2014-01-01

    Aim: To evaluate the role of whole genome comparative genomic hybridisation microarray (array-CGH) in detecting genomic imbalances as compared to conventional karyotype (GTG-analysis) or a myeloma-specific fluorescence in situ hybridisation (FISH) panel in a diagnostic setting for plasma cell dyscrasia (PCD). Methods: A myeloma-specific interphase FISH (i-FISH) panel was carried out on CD138 PC-enriched bone marrow (BM) from 20 patients having BM biopsies for evaluation of PCD. Whole genome array-CGH was performed on reference (control) and neoplastic (test patient) genomic DNA extracted from CD138 PC-enriched BM and analysed. Results: Comparison of techniques demonstrated a much higher detection rate of genomic imbalances using array-CGH. Genomic imbalances were detected in 1, 19 and 20 patients using GTG-analysis, i-FISH and array-CGH, respectively. Genomic rearrangements were detected in one patient using GTG-analysis and seven patients using i-FISH, while none were detected using array-CGH. I-FISH was the most sensitive method for detecting gene rearrangements and GTG-analysis was the least sensitive method overall. All copy number aberrations observed in GTG-analysis were detected using array-CGH and i-FISH. Conclusions: We show that array-CGH performed on CD138-enriched PCs significantly improves the detection of clinically relevant and possibly novel genomic abnormalities in PCD, and thus could be considered as a standard diagnostic technique in combination with IGH rearrangement i-FISH. PMID:23969274

  5. Chapter 5: Modulation Excitation Spectroscopy with Phase-Sensitive Detection for Surface Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shulda, Sarah; Richards, Ryan M.

    Advancements in in situ spectroscopic techniques have led to significant progress in elucidating heterogeneous reaction mechanisms. The potential of these methods is often limited only by the complexity of the system and noise in the data. Short-lived intermediates can be challenging, if not impossible, to identify by conventional spectral analysis, and it is often equally difficult to separate signals arising from active and inactive species. Modulation excitation spectroscopy combined with phase-sensitive detection analysis is a powerful tool for removing noise from the data while simultaneously revealing the underlying kinetics of the reaction. A stimulus is applied to the reaction system at a constant frequency, for example, a reactant cycled with an inert phase. Through mathematical manipulation of the data, any signal contributing to the overall spectra but not oscillating with the same frequency as the stimulus is dampened or removed. With phase-sensitive detection, signals oscillating with the stimulus frequency but with various lag times are amplified, providing valuable kinetic information. In this chapter, examples are provided from the literature that have successfully used modulation excitation spectroscopy with phase-sensitive detection to uncover previously unobserved reaction intermediates and kinetics. Examples from a broad range of spectroscopic methods are included to provide perspective to the reader.
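    The phase-sensitive detection step is essentially demodulation: the time-resolved signal is projected onto sin(kωt + φ) over a whole modulation period, so constant (spectator) contributions integrate to zero while species responding at the stimulus frequency give an amplitude that peaks when φ matches their phase lag. A toy single-channel sketch with synthetic data (all parameters invented):

```python
import numpy as np

# One modulation period sampled at T time points (hypothetical data)
T = 1000
t = np.linspace(0.0, 1.0, T, endpoint=False)
omega = 2 * np.pi          # stimulus frequency: one cycle per period
rng = np.random.default_rng(1)

active = 1.0 * np.sin(omega * t - 0.6)   # responding species, phase-lagged
static = 5.0                             # spectator species: constant signal
signal = active + static + rng.normal(0.0, 0.5, T)

def demodulate(sig, phi):
    """Phase-sensitive detection: A(phi) = (2/T) * sum(A(t) sin(omega t + phi)).
    Static contributions and uncorrelated noise average to ~zero."""
    return (2.0 / T) * np.sum(sig * np.sin(omega * t + phi))

phis = np.linspace(0, 2 * np.pi, 360, endpoint=False)
amps = [demodulate(signal, phi) for phi in phis]
```

    Scanning φ and locating the maximum of the demodulated amplitude recovers the phase lag of the active species, which is how the lag times mentioned above carry kinetic information.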

  6. Tau-independent Phase Analysis: A Novel Method for Accurately Determining Phase Shifts.

    PubMed

    Tackenberg, Michael C; Jones, Jeff R; Page, Terry L; Hughey, Jacob J

    2018-06-01

    Estimations of period and phase are essential in circadian biology. While many techniques exist for estimating period, comparatively few methods are available for estimating phase. Current approaches to analyzing phase often vary between studies and are sensitive to coincident changes in period and the stage of the circadian cycle at which the stimulus occurs. Here we propose a new technique, tau-independent phase analysis (TIPA), for quantifying phase shifts in multiple types of circadian time-course data. Through comprehensive simulations, we show that TIPA is both more accurate and more precise than the standard actogram approach. TIPA is computationally simple and therefore will enable accurate and reproducible quantification of phase shifts across multiple subfields of chronobiology.

  7. From thermometric to spectrophotometric kinetic-catalytic methods of analysis. A review.

    PubMed

    Cerdà, Víctor; González, Alba; Danchana, Kaewta

    2017-05-15

    Kinetic-catalytic analytical methods have proved to be very easy and highly sensitive strategies for chemical analysis that rely on simple instrumentation [1,2]. Molecular absorption spectrophotometry is commonly used as the detection technique. However, other detection systems, such as electrochemical or thermometric ones, offer interesting possibilities since they are not affected by the color or turbidity of the samples. This review traces our experience from early thermometric kinetic-catalytic methods to our current use of spectrophotometric flow techniques to automate this kind of reaction, including the use of integrated chips. Procedures for the determination of inorganic and organic species in organic and inorganic matrices are presented.

  8. Analysis of airfoil transitional separation bubbles

    NASA Technical Reports Server (NTRS)

    Davis, R. L.; Carter, J. E.

    1984-01-01

    A previously developed local inviscid-viscous interaction technique for the analysis of airfoil transitional separation bubbles, ALESEP (Airfoil Leading Edge Separation), has been modified to utilize a more accurate windward finite difference procedure in the reversed flow region, and a natural transition/turbulence model has been incorporated for the prediction of transition within the separation bubble. Numerous calculations and experimental comparisons are presented to demonstrate the effects of the windward differencing scheme and the natural transition/turbulence model. Grid sensitivity and the convergence capabilities of this inviscid-viscous interaction technique are briefly addressed. A major contribution of this report is that, with the use of windward differencing, a second, counter-rotating eddy has been found to exist in the wall layer of the primary separation bubble.

  9. Radar cross section fundamentals for the aircraft designer

    NASA Technical Reports Server (NTRS)

    Stadmore, H. A.

    1979-01-01

    Various aspects of radar cross-section (RCS) techniques are summarized, with emphasis placed on fundamental electromagnetic phenomena, such as plane and spherical wave formulations, and the definition of RCS is given in the far-field sense. The basic relationship between electronic countermeasures and a signature level is discussed in terms of the detectability range of a target vehicle. Fundamental radar-signature analysis techniques, such as the physical-optics and geometrical-optics approximations, are presented along with examples in terms of aircraft components. Methods of analysis based on the geometrical theory of diffraction are considered and various wave-propagation phenomena are related to local vehicle geometry. Typical vehicle components are also discussed, together with their contribution to total vehicle RCS and their individual signature sensitivities.

  10. Miniature Tunable Laser Spectrometers for Quantifying Atmospheric Trace Gases, Water Resources, Earth Back-Contamination, and In Situ Resource Utilization

    NASA Technical Reports Server (NTRS)

    Webster, Chris; Blacksberg, Jordana; Flesch, Greg; Keymeulen, Didier; Christensen, Lance; Forouhar, Siamak

    2012-01-01

    The tunable laser spectrometer (TLS) technique has seen wide applicability in gas measurement and analysis for atmospheric, industrial, commercial, and health monitoring, as well as space applications. Over two decades of Earth science work using balloons and aircraft, several groups (JPL, NASA Langley and Ames, NOAA, Harvard University, and others) have demonstrated the technique for ozone hole studies, laboratory kinetics measurements, cloud physics and transport, and climate change in the ice record. The recent availability of high-power (mW) room-temperature lasers (TDL, IC, QC) has enabled miniaturized, high-sensitivity spectrometers both for space missions (Mars, Titan, Venus, Saturn, the Moon) and for industry, where commercial isotope ratio laser spectrometers are replacing bulkier, more complex isotope ratio mass spectrometers.

  11. Combined Bisulfite Restriction Analysis for brain tissue identification.

    PubMed

    Samsuwan, Jarunya; Muangsub, Tachapol; Yanatatsaneejit, Pattamawadee; Mutirangura, Apiwat; Kitkumthorn, Nakarin

    2018-05-01

    According to the tissue-specific methylation database (doi: 10.1016/j.gene.2014.09.060), methylation at CpG locus cg03096975 in EML2 has been preliminarily shown to be specific to brain tissue. In this study, we enlarged the sample size and developed a technique for identifying brain tissue in aged samples. The Combined Bisulfite Restriction Analysis for EML2 (COBRA-EML2) technique was established and validated on samples from various organs obtained from 108 autopsies. The technique was also tested for its reliability, the minimal DNA concentration it can detect, and its use in aged samples and in samples obtained from specific brain compartments and the spinal cord. COBRA-EML2 displayed 100% sensitivity and specificity for distinguishing brain tissue from other tissues, showed high reliability, was capable of detecting a minimal DNA concentration of 0.015 ng/μl, and could be used for identifying brain tissue in aged samples. In summary, COBRA-EML2 is a technique for identifying brain tissue. This analysis is useful in criminal cases since it can identify vital-organ tissue from small samples acquired at crime scenes, and its results can serve as a medical and forensic marker supporting criminal investigations and as evidence in court rulings.

  12. Comprehensive Monte-Carlo simulator for optimization of imaging parameters for high sensitivity detection of skin cancer at the THz

    NASA Astrophysics Data System (ADS)

    Ney, Michael; Abdulhalim, Ibrahim

    2016-03-01

    Skin cancer detection at its early stages has been the focus of a large number of experimental and theoretical studies during the past decades. Among these studies, two prominent approaches presenting high potential are reflectometric sensing in the THz wavelength region and polarimetric imaging techniques at visible wavelengths. While the contrast agent and source of sensitivity of THz radiation to cancer-related tissue alterations has been considered to be mainly the elevated water content in the cancerous tissue, the polarimetric approach has been verified to enable cancerous tissue differentiation based on cancer-induced structural alterations to the tissue. Combining THz with the polarimetric approach is examined in this study in order to enable higher detection sensitivity than previous, purely reflectometric THz measurements. For this, a comprehensive MC simulation of radiative transfer in a complex skin tissue model fitted for the THz domain has been developed that considers the skin's stratified structure, tissue material optical dispersion modeling, surface roughness, scatterers, and substructure organelles. Additionally, a narrow-beam Mueller matrix differential analysis technique is suggested for assessing skin cancer induced changes in the polarimetric image, enabling the tissue model and MC simulation to be utilized for determining the imaging parameters resulting in maximal detection sensitivity.

  13. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry, and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas monitoring instrumentation.
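    The benefit of a cold trap can be roughed out from saturation vapor pressures: air leaving an ideal trap carries at most the saturation pressure at the trap temperature, so the surviving water fraction is about e_s(T_trap)/e_s(T_dew,inlet). A back-of-envelope sketch using the standard Magnus approximation (illustrative only, not taken from the patent):

```python
import math

def saturation_vapor_pressure(t_c):
    """Magnus approximation for saturation vapor pressure over water, in hPa,
    valid roughly from -45 C to +60 C."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def residual_water_fraction(t_trap_c, t_dew_in_c):
    """Fraction of incoming water vapor surviving an ideal trap held at
    t_trap_c, for inlet air saturated at dew point t_dew_in_c."""
    return saturation_vapor_pressure(t_trap_c) / saturation_vapor_pressure(t_dew_in_c)

# e.g. a thermoelectric trap at +2 C fed humid room air (dew point 20 C)
frac = residual_water_fraction(2.0, 20.0)
```

    Even a modest thermoelectric temperature drop removes most of the vapor burden, which is where the sensitivity gain for water-sensitive detectors comes from.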

  14. HPLC-PFD determination of priority pollutant PAHs in water, sediment, and semipermeable membrane devices

    USGS Publications Warehouse

    Williamson, K.S.; Petty, J.D.; Huckins, J.N.; Lebo, J.A.; Kaiser, E.M.

    2002-01-01

    High performance liquid chromatography coupled with programmable fluorescence detection was employed for the determination of 15 priority pollutant polycyclic aromatic hydrocarbons (PPPAHs) in water, sediment, and semipermeable membrane devices (SPMDs). Chromatographic separation using this analytical method provides selectivity and sensitivity (ppt levels) and can serve as a non-destructive technique allowing subsequent analysis by other chromatographic and spectroscopic techniques. Extraction and sample cleanup procedures were also developed for water, sediment, and SPMDs using various chromatographic and wet chemical methods. The focus of this publication is to examine the enrichment techniques and the analytical methodologies used in the isolation, characterization, and quantitation of the 15 PPPAHs in different sample matrices.

  15. Thermal Inspection of Composite Honeycomb Structures

    NASA Technical Reports Server (NTRS)

    Zalameda, Joseph N.; Parker, F. Raymond

    2014-01-01

    Composite honeycomb structures continue to be widely used in aerospace applications due to their low weight and high strength advantages. Developing nondestructive evaluation (NDE) inspection methods is essential for their safe performance. Pulsed thermography is a commonly used technique for composite honeycomb structure inspections due to its large-area and rapid inspection capability, and it is shown to be sensitive for the detection of face sheet impact damage and face sheet to core disbond. Data processing techniques, using principal component analysis to improve the defect contrast, are presented. In addition, limitations to the thermal detection of the core are investigated. Other NDE techniques, such as computed tomography X-ray and ultrasound, are used for comparison to the thermography results.
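    Principal component processing of a thermography sequence can be sketched as follows: treat the frame stack as a (time × pixels) matrix, remove the common cooling transient, and project onto the leading temporal mode; pixels over a defect, which cool at a different rate, then separate in the component score image. The cooling model and defect geometry below are invented for illustration:

```python
import numpy as np

# Synthetic pulsed-thermography sequence: frames of a cooling surface where
# a subsurface defect patch cools more slowly (a crude 1/sqrt(t) model)
n_t, h, w = 50, 32, 32
t = np.arange(1, n_t + 1, dtype=float)
rng = np.random.default_rng(3)

sound = 1.0 / np.sqrt(t)          # sound-material cooling curve
slow = 1.0 / np.sqrt(0.5 * t)     # slower apparent cooling over the defect
frames = np.tile(sound[:, None], (1, h * w))
mask = np.zeros((h, w), dtype=bool)
mask[10:16, 10:16] = True
frames[:, mask.ravel()] = slow[:, None]
frames += rng.normal(0.0, 0.01, frames.shape)  # camera noise

# Mean-center each frame across pixels to remove the shared transient,
# then take the SVD and form the score image of the leading temporal mode
X = frames - frames.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
score = (U[:, 0] @ X).reshape(h, w)   # first-component score image

contrast = abs(score[mask].mean() - score[~mask].mean())
```

    The defect-to-background contrast in the score image is far larger than in any single raw frame, which is the sense in which the component images "improve the defect contrast" above.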

  16. Simulating muscular thin films using thermal contraction capabilities in finite element analysis tools.

    PubMed

    Webster, Victoria A; Nieto, Santiago G; Grosberg, Anna; Akkus, Ozan; Chiel, Hillel J; Quinn, Roger D

    2016-10-01

    In this study, new techniques for approximating the contractile properties of cells in biohybrid devices using Finite Element Analysis (FEA) have been investigated. Many current techniques for modeling biohybrid devices use individual cell forces to simulate the cellular contraction. However, such techniques result in long simulation runtimes. In this study we investigated the effect of the use of thermal contraction on simulation runtime. The thermal contraction model was significantly faster than models using individual cell forces, making it beneficial for rapidly designing or optimizing devices. Three techniques, Stoney's Approximation, a Modified Stoney's Approximation, and a Thermostat Model, were explored for calibrating thermal expansion/contraction parameters (TECPs) needed to simulate cellular contraction using thermal contraction. The TECP values were calibrated by using published data on the deflections of muscular thin films (MTFs). Using these techniques, TECP values that suitably approximate experimental deflections can be determined by using experimental data obtained from cardiomyocyte MTFs. Furthermore, a sensitivity analysis was performed in order to investigate the contribution of individual variables, such as elastic modulus and layer thickness, to the final calibrated TECP for each calibration technique. Additionally, the TECP values are applicable to other types of biohybrid devices. Two non-MTF models were simulated based on devices reported in the existing literature.
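
    As a rough illustration of the Stoney-based calibration idea: measured film curvature gives a cell-layer stress via Stoney's formula, which can be converted to an equivalent mismatch strain and hence a TECP. The function name and all material values below are hypothetical, not the paper's exact scheme or data.

```python
def tecp_from_stoney(E_s, nu_s, t_s, t_f, E_f, nu_f, kappa, delta_T):
    """Illustrative TECP calibration from muscular-thin-film curvature
    using Stoney's approximation (a sketch, not the paper's method).

    E_s, nu_s, t_s : substrate modulus (Pa), Poisson ratio, thickness (m)
    t_f            : cell-layer thickness (m)
    E_f, nu_f      : cell-layer modulus (Pa) and Poisson ratio
    kappa          : measured film curvature (1/m)
    delta_T        : artificial temperature drop applied in the FEA model (K)
    """
    # Stoney's formula: film stress inferred from substrate curvature.
    sigma_f = E_s * t_s**2 * kappa / (6.0 * t_f * (1.0 - nu_s))
    # Equivalent biaxial mismatch strain in the cell layer.
    eps = sigma_f * (1.0 - nu_f) / E_f
    # TECP chosen so that alpha * delta_T reproduces that strain.
    return eps / delta_T

# Hypothetical PDMS-like substrate and soft cell-layer values (not the paper's):
alpha = tecp_from_stoney(E_s=1.5e6, nu_s=0.49, t_s=15e-6, t_f=5e-6,
                         E_f=10e3, nu_f=0.49, kappa=100.0, delta_T=1.0)
print(alpha)  # ~0.11 strain per unit temperature drop
```

    In an FEA tool, this alpha would be assigned as a thermal expansion coefficient to the cell layer, and a uniform temperature drop of delta_T would stand in for the contractile activation.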

  17. Critical comparison of diffuse reflectance spectroscopy and colorimetry as dermatological diagnostic tools for acanthosis nigricans: a chemometric approach

    PubMed Central

    Devpura, Suneetha; Pattamadilok, Bensachee; Syed, Zain U.; Vemulapalli, Pranita; Henderson, Marsha; Rehse, Steven J.; Hamzavi, Iltefat; Lim, Henry W.; Naik, Ratna

    2011-01-01

    Quantification of skin changes due to acanthosis nigricans (AN), a disorder common among insulin-resistant diabetic and obese individuals, was investigated using two optical techniques: diffuse reflectance spectroscopy (DRS) and colorimetry. Measurements were obtained from AN lesions on the neck and two control sites of eight AN patients. A principal component/discriminant function analysis successfully differentiated between AN lesions and normal skin with 87.7% sensitivity and 94.8% specificity in DRS measurements and 97.2% sensitivity and 96.4% specificity in colorimetry measurements. PMID:21698027
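
    The principal component/discriminant workflow behind such sensitivity and specificity figures can be sketched with synthetic data. This numpy-only sketch invents the "spectra", class separation, and dimensions purely for illustration; it is not the study's chemometric pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "reflectance spectra": the lesion class has a shifted mean
# (a stand-in for DRS measurements; the study's real data differ).
normal = rng.normal(0.0, 1.0, size=(40, 20))
lesion = rng.normal(1.5, 1.0, size=(40, 20))
X = np.vstack([normal, lesion])
y = np.array([0] * 40 + [1] * 40)  # 1 = lesion

# PCA: project onto the first two principal components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Fisher discriminant direction in PC space, thresholded at the midpoint.
m0, m1 = scores[y == 0].mean(axis=0), scores[y == 1].mean(axis=0)
Sw = np.cov(scores[y == 0].T) + np.cov(scores[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
proj = scores @ w
pred = (proj > (proj[y == 0].mean() + proj[y == 1].mean()) / 2).astype(int)

# Sensitivity/specificity from the confusion counts.
tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```

    Reported figures like 87.7% sensitivity and 94.8% specificity are these same ratios computed on held-out or cross-validated patient measurements.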

  18. IMRT QA: Selecting gamma criteria based on error detection sensitivity.

    PubMed

    Steers, Jennifer M; Fraass, Benedick A

    2016-04-01

    The gamma comparison is widely used to evaluate the agreement between measurements and treatment planning system calculations in patient-specific intensity modulated radiation therapy (IMRT) quality assurance (QA). However, recent publications have raised concerns about the lack of sensitivity when employing commonly used gamma criteria. Understanding the actual sensitivity of a wide range of different gamma criteria may allow the definition of more meaningful gamma criteria and tolerance limits in IMRT QA. We present a method that allows the quantitative determination of gamma criteria sensitivity to induced errors which can be applied to any unique combination of device, delivery technique, and software utilized in a specific clinic. A total of 21 DMLC IMRT QA measurements (ArcCHECK®, Sun Nuclear) were compared to QA plan calculations with induced errors. Three scenarios were studied: MU errors, multi-leaf collimator (MLC) errors, and the sensitivity of the gamma comparison to changes in penumbra width. Gamma comparisons were performed between measurements and error-induced calculations using a wide range of gamma criteria, resulting in a total of over 20 000 gamma comparisons. Gamma passing rates for each error class and case were graphed against error magnitude to create error curves in order to represent the range of missed errors in routine IMRT QA using 36 different gamma criteria. This study demonstrates that systematic errors and case-specific errors can be detected by the error curve analysis. Depending on the location of the error curve peak (e.g., not centered about zero), 3%/3 mm threshold = 10% at 90% pixels passing may miss errors as large as 15% MU errors and ±1 cm random MLC errors for some cases. As the dose threshold parameter was increased for a given %Diff/distance-to-agreement (DTA) setting, error sensitivity was increased by up to a factor of two for select cases. This increased sensitivity with increasing dose threshold was consistent across all studied combinations of %Diff/DTA. Criteria such as 2%/3 mm and 3%/2 mm with a 50% threshold at 90% pixels passing are shown to be more appropriately sensitive without being overly strict. However, a broadening of the penumbra by as much as 5 mm in the beam configuration was difficult to detect with commonly used criteria, as well as with the previously mentioned criteria utilizing a threshold of 50%. We have introduced the error curve method, an analysis technique which allows the quantitative determination of gamma criteria sensitivity to induced errors. The application of the error curve method using DMLC IMRT plans measured on the ArcCHECK® device demonstrated that large errors can potentially be missed in IMRT QA with commonly used gamma criteria (e.g., 3%/3 mm, threshold = 10%, 90% pixels passing). Additionally, increasing the dose threshold value can offer dramatic increases in error sensitivity. This approach may allow the selection of more meaningful gamma criteria for IMRT QA and is straightforward to apply to other combinations of devices and treatment techniques.
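
    The quantity underlying these comparisons, the gamma index, can be sketched in one dimension. This is a simplified global-gamma illustration with an invented Gaussian profile and a +2% dose error; the study itself used the ArcCHECK® device and its associated software, not this code.

```python
import numpy as np

def gamma_1d(ref, eval_, dx, dose_pct=3.0, dta_mm=3.0, threshold_pct=10.0):
    """1-D global gamma comparison (Low et al. style formulation).

    ref, eval_ : reference (measured) and evaluated (calculated) dose arrays
    dx         : grid spacing in mm
    Returns the passing rate (%) over reference points above the threshold.
    """
    dd = dose_pct / 100.0 * ref.max()        # global dose-difference criterion
    cutoff = threshold_pct / 100.0 * ref.max()
    x = np.arange(len(ref)) * dx
    passing = []
    for xi, di in zip(x, ref):
        if di < cutoff:                      # low-dose points are excluded
            continue
        g2 = ((x - xi) / dta_mm) ** 2 + ((eval_ - di) / dd) ** 2
        passing.append(np.sqrt(g2.min()) <= 1.0)
    return 100.0 * np.mean(passing)

# Example: a +2% systematic (MU-like) dose error on a smooth profile.
x = np.linspace(-50, 50, 201)                # mm, 0.5 mm spacing
ref = np.exp(-x**2 / (2 * 15.0**2))          # "measured" profile
calc = 1.02 * ref                            # calculation with +2% error
rate = gamma_1d(ref, calc, dx=0.5)
print(f"passing rate: {rate:.1f}%")          # a 2% error passes 3%/3 mm
```

    A 100% passing rate here despite a real 2% systematic error is exactly the kind of insensitivity the error curve method is designed to expose.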

  19. Fluorescence Fluctuation Approaches to the Study of Adhesion and Signaling

    PubMed Central

    Bachir, Alexia I.; Kubow, Kristopher E.; Horwitz, Alan R.

    2013-01-01

    Cell–matrix adhesions are large, multimolecular complexes through which cells sense and respond to their environment. They also mediate migration by serving as traction points and signaling centers and allow the cell to modify the surrounding tissue. Due to their fundamental role in cell behavior, adhesions are germane to nearly all major human health pathologies. However, adhesions are extremely complex and dynamic structures that include over 100 known interacting proteins and operate over multiple space (nm–µm) and time (ms–min) regimes. Fluorescence fluctuation techniques are well suited for studying adhesions. These methods are sensitive over a large spatiotemporal range and provide a wealth of information including molecular transport dynamics, interactions, and stoichiometry from a single time series. Earlier chapters in this volume have provided the theoretical background, instrumentation, and analysis algorithms for these techniques. In this chapter, we discuss their implementation in living cells to study adhesions in migrating cells. Although each technique and application has its own unique instrumentation and analysis requirements, we provide general guidelines for sample preparation, selection of imaging instrumentation, and optimization of data acquisition and analysis parameters. Finally, we review several recent studies that implement these techniques in the study of adhesions. PMID:23280111
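
    The single-time-series analysis at the heart of fluctuation methods can be illustrated with a minimal autocorrelation calculation. This is a generic sketch, not the chapter's protocol; real fluorescence correlation analysis additionally fits a diffusion model and corrects for detector effects, and the Poisson "photon counts" below are simulated.

```python
import numpy as np

def fluctuation_autocorrelation(intensity, max_lag):
    """Normalized fluctuation autocorrelation G(tau), the basic quantity
    behind fluorescence correlation/fluctuation analyses.

    G(tau) = <dI(t) dI(t+tau)> / <I>^2, with dI = I - <I>.
    The amplitude G(0) scales as 1/N (inverse mean molecule number),
    which is how fluctuation methods report concentration/stoichiometry.
    """
    I = np.asarray(intensity, dtype=float)
    dI = I - I.mean()
    denom = I.mean() ** 2
    return np.array([
        np.mean(dI[: len(I) - tau] * dI[tau:]) / denom
        for tau in range(max_lag)
    ])

# Example: Poisson "photon counts" from ~10 emitters in the focal volume.
rng = np.random.default_rng(1)
counts = rng.poisson(10.0, size=100_000)
G = fluctuation_autocorrelation(counts, max_lag=5)
print(round(1.0 / G[0]))  # → 10 (the amplitude recovers the emitter number)
```

    The decay of G(tau) with lag, absent here because the samples are independent, is what encodes transport dynamics such as diffusion and binding in real measurements.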

  20. Trace-fiber color discrimination by electrospray ionization mass spectrometry: a tool for the analysis of dyes extracted from submillimeter nylon fibers.

    PubMed

    Tuinman, Albert A; Lewis, Linda A; Lewis, Samuel A

    2003-06-01

    The application of electrospray ionization mass spectrometry (ESI-MS) to trace-fiber color analysis is explored using acidic dyes commonly employed to color nylon-based fibers, as well as extracts from dyed nylon fibers. Qualitative information about constituent dyes and quantitative information about the relative amounts of those dyes present on a single fiber become readily available using this technique. Sample requirements for establishing the color identity of different samples (i.e., comparative trace-fiber analysis) are shown to be submillimeter. Absolute verification of dye mixture identity (beyond the comparison of molecular weights derived from ESI-MS) can be obtained by expanding the technique to include tandem mass spectrometry (ESI-MS/MS). For dyes of unknown origin, the ESI-MS/MS analyses may offer insights into the chemical structure of the compound, information not available from chromatographic techniques alone. This research demonstrates that ESI-MS is viable as a sensitive technique for distinguishing dye constituents extracted from a minute amount of trace-fiber evidence. A protocol is suggested to establish/refute the proposition that two fibers (one of which is available in minute quantity only) are of the same origin.
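
    At its simplest, comparative analysis of two dye extracts reduces to comparing peak lists. The toy similarity measure below is purely illustrative: the m/z values, intensities, and tolerance are hypothetical, and the authors' suggested protocol also relies on MS/MS rather than a single similarity score.

```python
import numpy as np

def spectral_cosine(spec_a, spec_b, tol=0.5):
    """Cosine similarity between two ESI-MS peak lists.

    Each spectrum is a list of (m/z, intensity) pairs; peaks are binned
    to a common m/z grid of width `tol` before comparison.
    """
    bins = sorted({round(m / tol) for m, _ in spec_a} |
                  {round(m / tol) for m, _ in spec_b})
    index = {b: i for i, b in enumerate(bins)}
    va = np.zeros(len(bins)); vb = np.zeros(len(bins))
    for m, inten in spec_a:
        va[index[round(m / tol)]] += inten
    for m, inten in spec_b:
        vb[index[round(m / tol)]] += inten
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# Two hypothetical extracts sharing the same dye ions:
fiber1 = [(393.1, 100.0), (431.0, 40.0)]
fiber2 = [(393.1, 95.0), (431.1, 45.0)]
print(spectral_cosine(fiber1, fiber2) > 0.99)  # True: consistent with same origin
```

    A high similarity supports, but does not prove, common origin; structural confirmation of matched ions is what the MS/MS step adds.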
